How to freeze a Keras graph with BatchNorm layers


I'm trying to load a frozen Keras graph with BatchNorm layers, but I get the following error:



Message: TensorFlow.TFException : Input 0 of node
DenseNet/DenseBlock/ConvBlock/dense_0_0_bn/cond/ReadVariableOp/Switch was
passed float from DenseNet/DenseBlock/ConvBlock/dense_0_0_bn/gamma:0
incompatible with expected resource.


Normally the solution is to call keras.backend.set_learning_phase(0) before building the model. However, when the graph is loaded in another API (for instance TensorflowSharp or TF Lite), this isn't an option (as far as I can tell).
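
For reference, the usual Keras-side workaround looks like the sketch below. The call has to happen before the model is constructed; build_model() and the weights file name are placeholders, not part of my actual code.

import keras.backend as K

# Must run before any layer is created, so that BatchNorm is built in
# inference mode and no training-time cond/Switch ops end up in the graph.
K.set_learning_phase(0)

model = build_model()              # placeholder: construct the Keras model
model.load_weights('weights.h5')   # placeholder weights file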



Here's how I'm currently saving the graph:



def freeze_session(session, keep_var_names=None, output_names=None, clear_devices=True):
    from tensorflow.python.framework.graph_util import convert_variables_to_constants
    import tensorflow as tf

    graph = session.graph
    with graph.as_default():
        # Freeze every variable except those the caller explicitly wants to keep.
        freeze_var_names = list(set(v.op.name for v in tf.global_variables())
                                .difference(keep_var_names or []))
        output_names = output_names or []
        output_names += [v.op.name for v in tf.global_variables()]
        # Graph -> GraphDef ProtoBuf
        input_graph_def = graph.as_graph_def()
        if clear_devices:
            for node in input_graph_def.node:
                node.device = ""

        # Rewrite BatchNorm's training-time ops so the graph can be frozen:
        # point RefSwitch inputs for the moving statistics at the variable's
        # /read tensor, and turn the moving-average updates (AssignSub)
        # into plain Sub ops.
        for node in input_graph_def.node:
            if node.op == 'RefSwitch':
                for index in range(len(node.input)):
                    if 'moving_' in node.input[index]:
                        node.input[index] = node.input[index] + '/read'
            elif node.op == 'AssignSub':
                node.op = 'Sub'
                if 'use_locking' in node.attr:
                    del node.attr['use_locking']

        frozen_graph = convert_variables_to_constants(session, input_graph_def,
                                                      output_names, freeze_var_names)
        return frozen_graph
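
For context, here is roughly how the function gets called (a sketch assuming a compiled Keras model in the default session; the output file name is arbitrary):

from keras import backend as K
import tensorflow as tf

# Freeze the session graph, keeping the Keras model outputs as graph outputs.
frozen_graph = freeze_session(K.get_session(),
                              output_names=[out.op.name for out in model.outputs])

# Serialize the frozen GraphDef to disk for use outside Keras.
tf.train.write_graph(frozen_graph, './', 'frozen_model.pb', as_text=False)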


Is there any way I can programmatically remove the BatchNorm layers before saving, so that I can load the model in an environment outside Keras?










Tags: python, tensorflow, keras






asked Mar 25 at 1:36 by Lukeyb; edited Mar 25 at 9:23





















