Can I change class_weight during training?
I want to change my class_weight during training in Keras. I used fit_generator and a Callback, as below:
model.fit_generator(
    decoder_generator(x_train, y_train),
    steps_per_epoch=len(x_train),
    epochs=args.epochs,
    validation_data=decoder_generator(x_valid, y_valid),
    validation_steps=len(x_valid),
    callbacks=callback_list,
    class_weight=class_weights,
    verbose=1)
And my callback:
class Valid_checker(keras.callbacks.Callback):
    def __init__(self, model_name, patience, val_data, x_length):
        super().__init__()
        self.best_score = 0
        self.patience = patience
        self.current_patience = 0
        self.model_name = model_name
        self.validation_data = val_data
        self.x_length = x_length

    def on_epoch_end(self, epoch, logs=None):
        X_val, y_val = self.validation_data
        y_predict, x_predict = self.model.predict_generator(
            no_decoder_generator(X_val, y_val), steps=len(X_val))
        y_predict = np.asarray(y_predict)
        x_predict = np.asarray(x_predict)
decoder_generator and no_decoder_generator are just custom generators.
I want to change the class weights every time an epoch ends. Is that possible, and if so, how? My data is imbalanced, and the model keeps overfitting to one class. At the end of each epoch I want to compute the accuracy per class and increase the weights of the classes with low accuracy.
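To make concrete what I mean by "increase the weight for classes with low accuracy", here is a rough sketch of the update I have in mind (the function name and the inverse-accuracy scheme are just illustrative, not my actual code):

```python
import numpy as np

def calculate_class_weights(y_true, y_pred, smooth=1e-7):
    """Boost the weight of classes the model currently gets wrong.

    y_true, y_pred: 1-D integer class labels of equal length.
    Returns a dict {class_index: weight} usable as class_weight=.
    """
    classes = np.unique(y_true)
    # Per-class accuracy: fraction of samples of class c predicted as c.
    accuracies = np.array([np.mean(y_pred[y_true == c] == c) for c in classes])
    # Inverse-accuracy weighting: low accuracy -> high weight.
    weights = 1.0 / (accuracies + smooth)
    weights /= weights.sum()  # normalise so the weights sum to 1
    return {int(c): float(w) for c, w in zip(classes, weights)}

y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 0, 1, 0, 2, 2])  # class 1 is only 50% correct
print(calculate_class_weights(y_true, y_pred))  # class 1 gets the largest weight
```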
python tensorflow keras deep-learning
edited May 23 at 17:02
double-beep
asked Mar 26 at 10:42
Jeonghwa Yoo
1 Answer
How about a simple approach: loop over one epoch at a time?
for i in range(args.epochs):
    class_weights = calculate_weights()
    model.fit_generator(
        decoder_generator(x_train, y_train),
        steps_per_epoch=len(x_train),
        epochs=1,
        validation_data=decoder_generator(x_valid, y_valid),
        validation_steps=len(x_valid),
        callbacks=callback_list,
        class_weight=class_weights,
        verbose=1)
There is no straightforward way to use different class weights for each epoch in fit_generator. You can incorporate early stopping by checking the value of model.stop_training.
Sample:
import numpy as np
from keras.layers import Input, Dense
from keras.models import Model
from keras.callbacks import Callback

class Valid_checker(Callback):
    def __init__(self):
        super().__init__()
        self.n_epoch = 0

    def on_epoch_end(self, epoch, logs=None):
        self.n_epoch += 1
        if self.n_epoch == 8:
            # Keras sets self.model on the callback; no need to pass it in.
            self.model.stop_training = True

def decoder_generator():
    while True:
        for i in range(10):
            yield np.random.rand(10, 5), np.random.randint(3, size=(10, 3))

inputs = Input(shape=(5,))
outputs = Dense(3, activation='relu')(inputs)
model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

for i in range(10):
    model.fit_generator(generator=decoder_generator(),
                        class_weight={0: 1/3, 1: 1/3, 2: 1/3},
                        steps_per_epoch=10,
                        epochs=1,
                        callbacks=[Valid_checker()])
    if model.stop_training:
        break
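To show how the per-epoch loop and the weight recalculation fit together without pulling in Keras, here is a schematic driver; train_one_epoch and evaluate_per_class are stand-ins (not real Keras API) for model.fit_generator(..., epochs=1) and a predict pass over the validation set:

```python
# Schematic per-epoch training driver: recompute class weights each epoch.

def evaluate_per_class(epoch):
    # Stand-in for a validation pass; pretend class 1 lags behind early on.
    return {0: 0.9, 1: 0.5 + 0.1 * epoch, 2: 0.9}

def weights_from_accuracy(per_class_acc):
    # Inverse-accuracy weighting, normalised to sum to 1.
    inv = {c: 1.0 / max(a, 1e-7) for c, a in per_class_acc.items()}
    total = sum(inv.values())
    return {c: v / total for c, v in inv.items()}

def train_one_epoch(class_weight):
    pass  # here: model.fit_generator(..., epochs=1, class_weight=class_weight)

n_epochs = 3
class_weight = {0: 1/3, 1: 1/3, 2: 1/3}  # start uniform
history = []
for epoch in range(n_epochs):
    train_one_epoch(class_weight)
    class_weight = weights_from_accuracy(evaluate_per_class(epoch))
    history.append(class_weight)

print(history[0])  # after epoch 0, class 1 (lowest accuracy) has the largest weight
```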
I think I could try that. However, in the Valid_checker class I implemented custom early stopping and model saving, so I think I have to return the current accuracy from the on_epoch_end function. Do you have any other comments for me?
– Jeonghwa Yoo
Mar 26 at 12:46
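One note on "returning" the accuracy: Keras ignores callback return values, so the usual pattern is to store the per-class accuracy on the callback itself (or in logs) and read it from the outer loop. A framework-free sketch of that pattern (PerClassAccuracy and _validate are hypothetical names; _validate stands in for the predict_generator pass):

```python
import numpy as np

class PerClassAccuracy:
    """Mimics a Keras callback: on_epoch_end stores its result as an
    attribute instead of returning it, since return values are ignored."""

    def __init__(self):
        self.per_class_acc = None  # read this from the outer training loop

    def on_epoch_end(self, epoch, logs=None):
        y_true, y_pred = self._validate()  # would run model.predict_generator
        classes = np.unique(y_true)
        self.per_class_acc = {
            int(c): float(np.mean(y_pred[y_true == c] == c)) for c in classes
        }
        if logs is not None:
            logs["per_class_acc"] = self.per_class_acc  # also visible in logs

    def _validate(self):
        # Stand-in validation pass with fixed labels and predictions.
        return np.array([0, 0, 1, 1]), np.array([0, 1, 1, 1])

cb = PerClassAccuracy()
cb.on_epoch_end(epoch=0)
print(cb.per_class_acc)  # {0: 0.5, 1: 1.0}
```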
edited Mar 26 at 17:25
answered Mar 26 at 10:54
mujjiga