How to train a CNN with two classes with different frequency?
I am training a simple convolutional neural network (CNN) that should perform binary classification. The package I am using is Keras.
I need my training set to be unbalanced: for example, one of the classes should be trained on 900 images, and the other on only 300.
The code I am using is the following:
from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense

# Small CNN for binary classification
classifier = Sequential()
classifier.add(Conv2D(32, (3, 3),
                      input_shape=(64, 64, 3),
                      activation='relu'))
classifier.add(MaxPooling2D(pool_size=(2, 2)))
classifier.add(Flatten())
classifier.add(Dense(units=128, activation='relu'))
classifier.add(Dense(units=1, activation='sigmoid'))
classifier.compile(optimizer='adam',
                   loss='binary_crossentropy',
                   metrics=['accuracy'])

from keras.preprocessing.image import ImageDataGenerator

# Augmentation for the training set, only rescaling for the test set
train_datagen = ImageDataGenerator(rescale=1./255,
                                   shear_range=0.2,
                                   zoom_range=0.2,
                                   horizontal_flip=True)
test_datagen = ImageDataGenerator(rescale=1./255)

training_set = train_datagen.flow_from_directory('dataset/training_set',
                                                 target_size=(64, 64),
                                                 batch_size=32,
                                                 class_mode='binary')
test_set = test_datagen.flow_from_directory('dataset/test_set',
                                            target_size=(64, 64),
                                            batch_size=32,
                                            class_mode='binary')

classifier.fit_generator(training_set,
                         steps_per_epoch=1200,
                         epochs=30,
                         validation_data=test_set,
                         validation_steps=50)
Right now the model is trained with a batch_size of 32.
I am guessing that this means each batch takes 16 training examples from one class and 16 from the other?
What I need instead is 24 training examples from one class and 8 from the other in every batch.
I suspect I should adjust the flow_from_directory() call for the training set in some way, but unfortunately I could not find anything about this in the Keras documentation.
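For concreteness, here is the kind of thing I imagine might work, although it is only an untested sketch: use two separate flow_from_directory() generators, one per class subdirectory (the folder names 'class_a' and 'class_b' below are placeholders for my real ones), with batch sizes 24 and 8, and merge their outputs into a single batch in place of the fit_generator() call above.
import numpy as np

# Reuse train_datagen from above; restrict each generator to one class folder.
# 'class_a' and 'class_b' are placeholders for the actual subdirectory names.
gen_a = train_datagen.flow_from_directory('dataset/training_set',
                                          classes=['class_a'],
                                          target_size=(64, 64),
                                          batch_size=24,
                                          class_mode=None)  # images only, no labels
gen_b = train_datagen.flow_from_directory('dataset/training_set',
                                          classes=['class_b'],
                                          target_size=(64, 64),
                                          batch_size=8,
                                          class_mode=None)

def fixed_ratio_batches(gen_a, gen_b):
    # Each yielded batch contains 24 images labelled 0 and 8 images labelled 1.
    while True:
        x_a = next(gen_a)
        x_b = next(gen_b)
        x = np.concatenate([x_a, x_b], axis=0)
        y = np.concatenate([np.zeros(len(x_a)), np.ones(len(x_b))])
        yield x, y

classifier.fit_generator(fixed_ratio_batches(gen_a, gen_b),
                         steps_per_epoch=300 // 8,  # limited by the smaller class
                         epochs=30,
                         validation_data=test_set,
                         validation_steps=50)
I do not know whether this is a reasonable way to do it, which is why I am asking.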
Do you have any suggestions?
python keras deep-learning conv-neural-network
asked Mar 28 at 8:26 by petro.y
I can't say with 100% certainty, but as far as I know a batch does not take an equal number of samples from each label; it samples randomly. So if your data is unbalanced, the random sample will also be unbalanced.
– Biranjan
Mar 28 at 9:30
Does that mean that I cannot guarantee that each training batch would have exactly 24 training examples from one of the classes and 8 examples from the other?
– petro.y
Mar 29 at 8:00
There might be some possible way, but I don't really know. However, if the data you are fitting is unbalanced, say with a label ratio of 4:1, then I would imagine random sampling will approximately reflect that same ratio in the batches of your data.
– Biranjan
Mar 29 at 8:58
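One way to see how the batches actually come out is to count the labels a few batches contain. A small diagnostic sketch, assuming the training_set generator defined in the question (the choice of 10 batches is arbitrary):
import numpy as np

# Count the class composition of a few batches produced by flow_from_directory.
counts = np.zeros(2)
for _ in range(10):
    _, labels = next(training_set)                        # class_mode='binary' yields (images, labels)
    counts += np.bincount(labels.astype(int), minlength=2)

print("label counts over 10 batches:", counts)
print("observed class ratio:", counts[0] / max(counts[1], 1))
If the comment above is right, the observed ratio should hover around the 3:1 ratio of the underlying 900/300 training set rather than being fixed at exactly 24:8 per batch.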
0 Answers