
Why do automatic differentiation and gradient tape need to use a context manager?


Context managers combine two related operations into one. For example:



with open('some_file', 'w') as opened_file:
    opened_file.write('Hola!')


The above code is equivalent to:



file = open('some_file', 'w')
try:
    file.write('Hola!')
finally:
    file.close()
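Under the hood, the `with` statement just calls the object's `__enter__` method on the way in and its `__exit__` method on the way out (even if the body raises). A minimal sketch, using a hypothetical `Resource` class to make the calls visible:

```python
class Resource:
    """Minimal context manager showing what `with` calls under the hood."""

    def __init__(self):
        self.events = []

    def __enter__(self):
        self.events.append("enter")   # setup step (like open())
        return self                   # bound to the `as` target

    def __exit__(self, exc_type, exc, tb):
        self.events.append("exit")    # teardown step (like close()); runs even on error
        return False                  # don't suppress exceptions

r = Resource()
with r as res:
    res.events.append("body")

print(r.events)  # ['enter', 'body', 'exit']
```

So `with r as res: body` behaves like `res = r.__enter__()`, then the body inside a `try`, then `r.__exit__(...)` in the `finally`.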


But in the TensorFlow eager execution tutorial (https://www.tensorflow.org/tutorials/eager/custom_training_walkthrough#define_the_loss_and_gradient_function) I found:



def grad(model, inputs, targets):
    with tf.GradientTape() as tape:
        loss_value = loss(model, inputs, targets)
    return loss_value, tape.gradient(loss_value, model.trainable_variables)


What is this equivalent to?
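Since `tf.GradientTape` follows the standard context-manager protocol, the `with` block above should be roughly equivalent to `tape = tf.GradientTape()`, then `tape.__enter__()` (which starts recording operations), the body in a `try`, and `tape.__exit__(None, None, None)` (which stops recording) in the `finally`. Here the paired setup/teardown is "start recording" / "stop recording" rather than "open" / "close". A toy, pure-Python `ToyTape` (a hypothetical stand-in, not TensorFlow's internals) sketches that idea:

```python
class ToyTape:
    """Toy stand-in for tf.GradientTape: ops are recorded only inside the with-block."""

    _active = None  # the tape currently recording, if any

    def __init__(self):
        self.ops = []

    def __enter__(self):
        ToyTape._active = self          # start recording
        return self

    def __exit__(self, exc_type, exc, tb):
        ToyTape._active = None          # stop recording, even if the body raised
        return False

def record(op_name):
    """Log an operation onto the active tape, if one is recording."""
    if ToyTape._active is not None:
        ToyTape._active.ops.append(op_name)

with ToyTape() as tape:
    record("matmul")   # recorded: the tape is active here
record("relu")         # not recorded: __exit__ already stopped recording

print(tape.ops)  # ['matmul']
```

The context manager is what lets the tape know exactly which operations to trace for the later `tape.gradient(...)` call: everything inside the block is watched, everything outside is not.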










Tags: python tensorflow automatic-differentiation






      asked Mar 23 at 4:42









andy





















