adding LSTM layer but getting required positional argument: 'units' error
I am trying to run my first machine learning model, but I am getting the error below on the line that adds the first LSTM layer:

TypeError: __init__() missing 1 required positional argument: 'units'

from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense, LSTM, Dropout

model = Sequential()
model.add(LSTM(input_dim=1,
               output_dim=50,
               return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(output_dim=1))
model.add(Activation('linear'))
start = time.time()
model.compile(loss="mse", optimizer="rmsprop")

Since the message said the parameter units was missing, I also tried the line below,

model.add(LSTM(100,
               input_dim=1,
               output_dim=50,
               return_sequences=True))

but then I get this error message instead, and I don't understand why it didn't appear in my first attempt. What am I missing?

TypeError: ('Keyword argument not understood:', 'input_dim')

python tensorflow keras recurrent-neural-network
asked Mar 28 at 21:32
mHelpMe
1 Answer
units is the first positional parameter of LSTM; it sets the size (the last dimension) of the layer's output. Your first attempt raises the first error because the call never supplies units at all. In your second attempt, the positional 100 satisfies units, so you get past that check and hit the second error instead: output_dim and input_dim are Keras 1 argument names that the Keras 2 API no longer understands.

You should use the input_shape parameter to specify the shape of the first layer's input in this case. Because an LSTM requires three-dimensional input, your first LSTM layer's input_shape takes two values, (timesteps, features); the batch size is omitted and inferred automatically. Assuming your timestep is 10, your code should be changed to the following.
from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense, LSTM, Dropout, Activation

model = Sequential()
model.add(LSTM(units=100, input_shape=(10, 1), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(units=1))
model.add(Activation('linear'))
model.compile(loss="mse", optimizer="rmsprop")
print(model.summary())
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
lstm (LSTM) (None, 10, 100) 40800
_________________________________________________________________
dropout (Dropout) (None, 10, 100) 0
_________________________________________________________________
lstm_1 (LSTM) (None, 100) 80400
_________________________________________________________________
dropout_1 (Dropout) (None, 100) 0
_________________________________________________________________
dense (Dense) (None, 1) 101
_________________________________________________________________
activation (Activation) (None, 1) 0
=================================================================
Total params: 121,301
Trainable params: 121,301
Non-trainable params: 0
_________________________________________________________________
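As a quick sanity check on the Param # column above, the counts can be reproduced by hand: an LSTM layer has four gates, each with a kernel over the input features, a recurrent kernel over the units, and a bias, so its total is 4 * (units * (features + units) + units). A minimal sketch in plain Python (no Keras needed):

```python
def lstm_params(units, features):
    # 4 gates (input, forget, cell, output), each with:
    #   kernel:           features * units weights
    #   recurrent kernel: units * units weights
    #   bias:             units weights
    return 4 * (units * (features + units) + units)

def dense_params(units, features):
    # weight matrix plus one bias per output unit
    return features * units + units

print(lstm_params(100, 1))    # first LSTM, 1 input feature  -> 40800
print(lstm_params(100, 100))  # second LSTM, fed 100 features -> 80400
print(dense_params(1, 100))   # final Dense layer             -> 101
```

These match the summary exactly (40800 + 80400 + 101 = 121,301 total parameters), which confirms the second LSTM is receiving 100 features per timestep from the first.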
thanks for the clear answer, much appreciated. The Activation('linear'): which library is Activation from, as my code doesn't recognise it?
– mHelpMe
Mar 29 at 8:59

@mHelpMe Activation is a Keras layer. I added the import to the answer.
– giser_yugang
Mar 29 at 9:04
answered Mar 29 at 6:00, edited Mar 29 at 9:04
giser_yugang
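A side note on the Activation import discussed in the comments: Keras also accepts the activation inline via the Dense layer's activation argument, which makes the separate Activation layer (and its import) unnecessary. A minimal sketch of the same model, using the public tensorflow.keras import path rather than tensorflow.python.keras:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, Dropout

model = Sequential()
model.add(LSTM(units=100, input_shape=(10, 1), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))
# activation='linear' replaces the separate Activation('linear') layer
model.add(Dense(units=1, activation='linear'))
model.compile(loss="mse", optimizer="rmsprop")
```

Both spellings build the same network; the inline form is simply one layer shorter in model.summary().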