adding LSTM layer but getting required positional argument: 'units' error

I am trying to run my first machine learning model; however, I am getting the error below.




    return_sequences=True))
TypeError: __init__() missing 1 required positional argument: 'units'




from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense, LSTM, Dropout

model = Sequential()

model.add(LSTM(input_dim=1,
               output_dim=50,
               return_sequences=True))

model.add(Dropout(0.2))

model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))

model.add(Dense(output_dim=1))
model.add(Activation('linear'))

start = time.time()

model.compile(loss="mse", optimizer="rmsprop")


Since it said the parameter units was missing, I have also tried the line below.



model.add(LSTM(100,
               input_dim=1,
               output_dim=50,
               return_sequences=True))


I then get the error message below, but I don't understand why it doesn't appear in my first attempt. What am I missing?




TypeError: ('Keyword argument not understood:', 'input_dim')











      python tensorflow keras recurrent-neural-network






      asked Mar 28 at 21:32









      mHelpMe

          1 Answer
units is the first positional parameter of LSTM; it sets the dimensionality of this layer's output. Your first attempt raises the first error because it never passes units. In your second attempt, units is satisfied (the 100 is taken as units), so you get the second error instead: input_dim and output_dim are legacy Keras 1 arguments that this Keras 2 LSTM layer no longer understands.

You should use the input_shape parameter to specify the shape of the first layer's input instead. Since an LSTM requires three-dimensional input, input_shape takes two values, (timesteps, features); the batch size is omitted by default. Assuming your timestep is 10, your code should be changed to the following.

from tensorflow.python.keras.models import Sequential
from tensorflow.python.keras.layers import Dense, LSTM, Dropout, Activation

model = Sequential()
model.add(LSTM(units=100, input_shape=(10, 1), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=False))
model.add(Dropout(0.2))
model.add(Dense(units=1))
model.add(Activation('linear'))
model.compile(loss="mse", optimizer="rmsprop")
print(model.summary())

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm (LSTM)                  (None, 10, 100)           40800
_________________________________________________________________
dropout (Dropout)            (None, 10, 100)           0
_________________________________________________________________
lstm_1 (LSTM)                (None, 100)               80400
_________________________________________________________________
dropout_1 (Dropout)          (None, 100)               0
_________________________________________________________________
dense (Dense)                (None, 1)                 101
_________________________________________________________________
activation (Activation)      (None, 1)                 0
=================================================================
Total params: 121,301
Trainable params: 121,301
Non-trainable params: 0
_________________________________________________________________
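As a sanity check, the parameter counts in the summary above can be reproduced by hand: an LSTM layer has four gates, each with a kernel over the input features, a recurrent kernel over the units, and a bias; a Dense layer has one weight per input per unit plus a bias per unit. A minimal sketch in plain Python (no TensorFlow required):

```python
def lstm_params(input_features, units):
    # 4 gates (input, forget, cell, output), each with:
    #   input kernel:     input_features x units
    #   recurrent kernel: units x units
    #   bias:             units
    return 4 * (input_features * units + units * units + units)

def dense_params(input_features, units):
    # one weight per input per unit, plus one bias per unit
    return input_features * units + units

print(lstm_params(1, 100))    # first LSTM layer  -> 40800
print(lstm_params(100, 100))  # second LSTM layer -> 80400
print(dense_params(100, 1))   # Dense layer       -> 101
```

The Dropout and Activation layers add no parameters, so the total is 40800 + 80400 + 101 = 121,301, matching the summary.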





• Thanks for the clear answer, much appreciated. For Activation('linear'), which library is Activation from? My code doesn't recognise it.

  – mHelpMe
  Mar 29 at 8:59






• @mHelpMe Activation is a Keras layer. I added it to the answer.

  – giser_yugang
  Mar 29 at 9:04
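To connect the shapes end to end: the fixed model above expects input batches of shape (batch_size, timesteps, features), i.e. (batch_size, 10, 1) here. A minimal sketch with dummy numpy data (the batch size of 32 is an arbitrary assumption for illustration):

```python
import numpy as np

timesteps, features = 10, 1  # matches input_shape=(10, 1) in the answer
batch_size = 32              # arbitrary, chosen for illustration

# dummy input batch shaped (batch_size, timesteps, features), as the LSTM expects
x = np.random.rand(batch_size, timesteps, features)
# one target value per sequence, matching the final Dense(units=1)
y = np.random.rand(batch_size, 1)

print(x.shape)  # (32, 10, 1)
print(y.shape)  # (32, 1)
```

Arrays of this shape can be passed straight to model.fit(x, y).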












          edited Mar 29 at 9:04

























          answered Mar 29 at 6:00









          giser_yugang
