Common class for Linear, Conv1d, Conv2d, …, LSTM

Is there a class that torch::nn::Linear, torch::nn::Conv1d, torch::nn::Conv2d, ..., torch::nn::GRU all inherit from? torch::nn::Module seems like a good option, but there is an intermediate class, torch::nn::Cloneable, so torch::nn::Module alone does not work. Also, torch::nn::Cloneable is itself a template, so it needs a concrete type in the declaration.
I want to create a general class model that holds a std::vector<the common class> layers, so that later I can fill layers with any type of layer I want, e.g., Linear, LSTM, etc. Does the current API support this? This is easy to do in Python, but C++ requires the element type to be declared, which takes away Python's flexibility.



Thanks,
Afshin










      c++ pytorch libtorch






      asked Mar 18 at 14:31









      Afshin Oroojlooy

          1 Answer
          I found that nn::Sequential can be used for this purpose, and it does not need a forward implementation of its own, which is both a positive and a negative point. nn::Sequential requires each contained module to have a forward implementation, and it calls those forward functions in the order in which the modules were added. So you cannot build an ad-hoc, non-standard forward pass (such as DenseNet's) with it, but it is good enough for general use.



          In addition, nn::Sequential appears to use a std::vector<nn::AnyModule> as its underlying module list, so a std::vector<nn::AnyModule> can also be used directly.






                answered Mar 27 at 20:01









                Afshin Oroojlooy
