Load (or combine) several pretrained checkpoints with tf.estimator.WarmStartSettings

































I want to use pretrained weights for two parts of my model. I have two checkpoints from different models, but since I'm using the Estimator architecture, I can load only one of them into my main model with tf.estimator.WarmStartSettings:

tf.estimator.WarmStartSettings(ckpt_to_initialize_from=X)

From the docs:


Either the directory or a specific checkpoint can be provided (in the case of the former, the latest checkpoint will be used).


I can't see how to add an additional checkpoint. Maybe there is a way to merge the weights from both checkpoints into a single checkpoint and load that one?
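For reference, a minimal sketch of the single-checkpoint warm start described above (TF 1.x Estimator API). The checkpoint path, variable regex, and model_fn are placeholders, and the tensorflow import is deferred so the helper can be defined without the library present:

```python
# Hedged sketch: warm-starting from ONE checkpoint with the Estimator API.
# The checkpoint path and variable regex are placeholders.
def make_warm_start(ckpt_path, var_regex='.*'):
    """Build WarmStartSettings for a single checkpoint (TF 1.x)."""
    import tensorflow as tf  # deferred import, keeps the sketch importable
    return tf.estimator.WarmStartSettings(
        ckpt_to_initialize_from=ckpt_path,  # a directory or a specific file
        vars_to_warm_start=var_regex,       # regex selecting variables to load
    )

# Usage (model_fn and the path are assumptions):
# estimator = tf.estimator.Estimator(
#     model_fn=model_fn,
#     warm_start_from=make_warm_start('/path/to/ckpt_a'),  # only ONE checkpoint fits here
# )
```

Note that warm_start_from accepts exactly one settings object, which is the limitation being asked about.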






















































































      python tensorflow tensorflow-estimator
















asked Mar 25 at 19:48

user2368505
186 · 1 silver badge · 13 bronze badges






















1 Answer
































You can use tf.train.init_from_checkpoint.

First, define the assignment map:

dir = 'path_to_checkpoint_files'
vars_to_load = [name for name, shape in tf.train.list_variables(dir)]

This creates a list of all variable names in the checkpoint.

assignment_map = {variable.op.name: variable
                  for variable in tf.global_variables()
                  if variable.op.name in vars_to_load}

And this builds a dict whose keys are the variable names found in the checkpoint and whose values are the matching variables in the current graph.

tf.train.init_from_checkpoint(dir, assignment_map)

Call this inside the estimator's model_fn; it overrides the standard variable initialization.
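The same mechanism extends to the two-checkpoint case from the question: call init_from_checkpoint once per checkpoint, each with an assignment map restricted to the variable scope the corresponding pretrained part lives under. A sketch, where the scope names ('encoder_a', 'encoder_b') and checkpoint paths are hypothetical; the map-building helper is plain Python, with the TF calls shown as comments:

```python
# Sketch: one assignment map per checkpoint, restricted by variable scope.
def build_assignment_map(graph_var_names, ckpt_var_names, scope=''):
    """Map checkpoint variable name -> graph variable name, for variables
    that exist in both and live under `scope` in the current graph."""
    in_ckpt = set(ckpt_var_names)
    return {name: name for name in graph_var_names
            if name in in_ckpt and name.startswith(scope)}

# Inside the estimator's model_fn (TF 1.x), one call per pretrained part:
# graph_vars = [v.op.name for v in tf.global_variables()]
# vars_a = [name for name, _ in tf.train.list_variables(ckpt_dir_a)]
# tf.train.init_from_checkpoint(
#     ckpt_dir_a, build_assignment_map(graph_vars, vars_a, scope='encoder_a'))
# vars_b = [name for name, _ in tf.train.list_variables(ckpt_dir_b)]
# tf.train.init_from_checkpoint(
#     ckpt_dir_b, build_assignment_map(graph_vars, vars_b, scope='encoder_b'))
```

Restricting each map by scope keeps the two checkpoints from fighting over the same variables when their name sets overlap.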




























































































                edited Mar 25 at 20:51

























                answered Mar 25 at 20:23









Sharky
2,634 · 2 gold badges · 9 silver badges · 19 bronze badges




































