Can I change class_weight during training?


I want to change my class_weight during training in Keras.



I used fit_generator and a Callback, like below.



model.fit_generator(
    decoder_generator(x_train, y_train),
    steps_per_epoch=len(x_train),
    epochs=args.epochs,
    validation_data=decoder_generator(x_valid, y_valid),
    validation_steps=len(x_valid),
    callbacks=callback_list,
    class_weight=class_weights,
    verbose=1)


And the callback:



class Valid_checker(keras.callbacks.Callback):
    def __init__(self, model_name, patience, val_data, x_length):
        super().__init__()
        self.best_score = 0
        self.patience = patience
        self.current_patience = 0
        self.model_name = model_name
        self.validation_data = val_data
        self.x_length = x_length

    def on_epoch_end(self, epoch, logs={}):
        X_val, y_val = self.validation_data
        # self.model is set by Keras when the callback is attached
        y_predict, x_predict = self.model.predict_generator(
            no_decoder_generator(X_val, y_val), steps=len(X_val))
        y_predict = np.asarray(y_predict)
        x_predict = np.asarray(x_predict)


decoder_generator and no_decoder_generator are just custom generators.



I want to change the class weights every time an epoch ends. Is that possible, and if so, how?



My data is imbalanced, and the model keeps overfitting to one class.



At the end of each epoch I want to compute the accuracy per class and increase the weights of the classes with low accuracy.



How can I do this?










python tensorflow keras deep-learning

asked Mar 26 at 10:42 by Jeonghwa Yoo
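(Context for what changing class_weight would do: Keras uses class_weight to scale each sample's loss contribution by the weight of that sample's true class. Below is a minimal numpy sketch of that weighting idea, with made-up numbers; it is not Keras internals.)

import numpy as np

# Toy one-hot targets and predicted probabilities: 3 samples, 3 classes
y_true = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
y_prob = np.array([[0.7, 0.2, 0.1], [0.3, 0.5, 0.2], [0.2, 0.2, 0.6]])

class_weight = {0: 1.0, 1: 2.0, 2: 1.0}  # up-weight a poorly learned class

per_sample_ce = -np.sum(y_true * np.log(y_prob), axis=1)  # cross-entropy per sample
sample_w = np.array([class_weight[c] for c in y_true.argmax(axis=1)])
print(np.mean(per_sample_ce * sample_w))                  # weighted mean loss

Raising the weight of a class therefore makes mistakes on that class more expensive for the optimizer.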






























1 Answer
How about a simple approach like looping over one epoch at a time?



for i in range(args.epochs):
    class_weights = calculate_weights()
    model.fit_generator(
        decoder_generator(x_train, y_train),
        steps_per_epoch=len(x_train),
        epochs=1,
        validation_data=decoder_generator(x_valid, y_valid),
        validation_steps=len(x_valid),
        callbacks=callback_list,
        class_weight=class_weights,
        verbose=1)


There is no straightforward way to use a different class weight for each epoch in fit_generator. You can still incorporate early stopping by checking the value of model.stop_training after each call.



Sample:



import numpy as np
from keras.layers import Input, Dense
from keras.models import Model
from keras.callbacks import Callback

class Valid_checker(Callback):
    def __init__(self):
        super().__init__()
        self.n_epoch = 0

    def on_epoch_end(self, epoch, logs={}):
        # self.model is set automatically when the callback is attached
        self.n_epoch += 1
        if self.n_epoch == 8:
            self.model.stop_training = True

def decoder_generator():
    while True:
        for i in range(10):
            yield np.random.rand(10, 5), np.random.randint(3, size=(10, 3))

inputs = Input(shape=(5,))
outputs = Dense(3, activation='relu')(inputs)
model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

for i in range(10):
    model.fit_generator(generator=decoder_generator(),
                        class_weight={0: 1/3, 1: 1/3, 2: 1/3},
                        steps_per_epoch=10,
                        epochs=1,
                        callbacks=[Valid_checker()])
    if model.stop_training:
        break
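The calculate_weights() in the loop above is left undefined. The sketch below is one hypothetical way to fill it in, weighting each class by the inverse of its per-class validation accuracy; the function name comes from the answer, everything else is an assumption:

import numpy as np

def calculate_weights(model, x_valid, y_valid, n_classes=3, eps=1e-3):
    # Hypothetical helper: predict on the validation set, measure
    # per-class accuracy, and give low-accuracy classes a higher weight.
    y_pred = np.argmax(model.predict(x_valid), axis=-1)
    y_true = np.argmax(y_valid, axis=-1)           # one-hot -> label indices
    weights = {}
    for c in range(n_classes):
        mask = y_true == c
        acc = (y_pred[mask] == c).mean() if mask.any() else 0.0
        weights[c] = 1.0 / (acc + eps)             # eps avoids division by zero
    total = sum(weights.values())
    return {c: w / total for c, w in weights.items()}  # normalize to sum to 1

Passing the current model and validation arrays in explicitly keeps the helper usable inside the per-epoch loop above.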





answered Mar 26 at 10:54 (edited Mar 26 at 17:25) by mujjiga
• I think I could try that. However, in the Valid_checker class I implemented custom early stopping and model saving, so I think I have to return the current accuracy from the on_epoch_end function. Do you have any other comments for me?

  – Jeonghwa Yoo
  Mar 26 at 12:46
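(A note on that: on_epoch_end cannot return a value to the caller, but the callback can store what it computes on itself, and the outer loop can read it after each one-epoch fit_generator call. A rough sketch of that pattern; the class and attribute names are hypothetical, not the asker's actual Valid_checker:)

from keras.callbacks import Callback

class ValidChecker(Callback):
    # Hypothetical callback: saves the best model, applies early
    # stopping, and exposes the last validation accuracy as an attribute.
    def __init__(self, model_name, patience):
        super().__init__()
        self.model_name = model_name
        self.patience = patience
        self.best_score = 0.0
        self.current_patience = 0
        self.last_val_acc = None                     # read this from the outer loop

    def on_epoch_end(self, epoch, logs={}):
        self.last_val_acc = logs.get('val_acc')     # key depends on Keras version/metrics
        if self.last_val_acc is None:
            return
        if self.last_val_acc > self.best_score:
            self.best_score = self.last_val_acc
            self.current_patience = 0
            self.model.save(self.model_name)        # checkpoint on improvement
        else:
            self.current_patience += 1
            if self.current_patience >= self.patience:
                self.model.stop_training = True     # outer loop breaks on this flag

After each fit_generator call, the loop can then check both checker.last_val_acc (or a per-class breakdown computed as in calculate_weights above) and model.stop_training before recomputing the weights.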










