Error in using K.function together with K.gradients
I'm writing a test model with Keras in which I want to do some mathematics that depends on the numerical values of a layer's output and of its derivatives.
I'm using the TensorFlow backend.
I use K.function to get the values of the outputs of the Lambda layer and of the derivative layers. However, I get a strange error if the function inside the Lambda layer is a power function, e.g. x**2; if I change x**2 to sin(x), it works fine.
import numpy as np
from keras.models import Model
from keras.layers import Input, Layer, Lambda
from keras import backend as K

x = Input(shape=(1,))

# the Lambda layer
c = Lambda(lambda x: x**2)(x)        # this will cause the error
#c = Lambda(lambda x: K.sin(x))(x)   # but this works fine

class dc_layer(Layer):
    def __init__(self, *args, **kwargs):
        self.is_placeholder = True
        super(dc_layer, self).__init__(*args, **kwargs)

    def call(self, inputs):
        x = inputs[0]
        c0 = inputs[1]
        c1 = K.gradients(c0, x)
        return c1

# the derivatives of the Lambda layer
c1 = dc_layer()([x, c])
c2 = dc_layer()([x, c1])
Then I use K.function to define a function that returns the layer outputs:
# define a function to get the derivatives
get_layer_outputs = K.function([x],[c2])
x_data = np.linspace(0,1,6)
val = get_layer_outputs([x_data])[0]
print(val)
I get the following error message in the Jupyter notebook:
InvalidArgumentError: data[0].shape = [1] does not start with indices[0].shape = [2]
which traces back to
---> 36 val = get_layer_outputs([x_data])[0]
But if I instead look at the c1 layer:
# define a function to get the derivatives
get_layer_outputs = K.function([x],[c1])
x_data = np.linspace(0,1,6)
val = get_layer_outputs([x_data])[0]
print(val)
it works fine.
I guess something is wrong with how I use K.function. Any solutions/suggestions would be appreciated.
======================================================
Additional question:
Even with a very simple model, I get an error when using K.function, as follows:
from keras.layers import Dense   # Dense is not in the imports above

x = Input(shape=(1,))
h = Dense(10, activation='sigmoid')(x)
c = Dense(1)(h)

get_layer_outputs = K.function([x], [c])
x_data = np.linspace(0, 1, 6)
val = get_layer_outputs([x_data])[0]
print(val)
I got
InvalidArgumentError: In[0] is not a matrix
[[Node: dense_24/MatMul = MatMul[T=DT_FLOAT, transpose_a=false, transpose_b=false, _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_input_19_0_0, dense_24/kernel/read)]]
Now I'm really confused about how to use K.function properly. Please help if you have any ideas. Thanks!
tensorflow keras jupyter-notebook python-3.5
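Before looking at the fix, it helps to compare the shape the graph expects with the shape of the array actually fed to K.function; a small diagnostic sketch, assuming the x and x_data defined above:

print(K.int_shape(x))              # (None, 1): Keras expects a 2-D batch, one feature per row
print(np.linspace(0, 1, 6).shape)  # (6,): the fed array is 1-D and lacks the feature axis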
asked Mar 23 at 6:00 by Lihui Chai, edited Mar 23 at 7:05
1 Answer
This works for me: your x_data array was only 1-D (shape (6,)), while the model's input expects a 2-D batch of shape (samples, 1):
import numpy as np
from keras.models import Model
from keras.layers import Input, Layer, Lambda, Dense
from keras import backend as K

x = Input(shape=(1,))

# the Lambda layer
c = Lambda(lambda x: x**2)(x)        # the x**2 Lambda from the question
#c = Lambda(lambda x: K.sin(x))(x)   # the sin(x) variant works as well

class dc_layer(Layer):
    def __init__(self, *args, **kwargs):
        self.is_placeholder = True
        super(dc_layer, self).__init__(*args, **kwargs)

    def call(self, inputs):
        x = inputs[0]
        c0 = inputs[1]
        c1 = K.gradients(c0, x)
        return c1

# the derivatives of the Lambda layer
c1 = dc_layer()([x, c])   # in Keras 2.0.2 you need to unpack the result; Keras 2.2.4 seems fine
c2 = dc_layer()([x, c1])

# define a function to get the derivatives
get_layer_outputs = K.function([x], [c2])
x_data = np.linspace(0, 1, 6)[:, None]   # reshape to (6, 1): add the feature axis the model expects
val = get_layer_outputs([x_data])[0]
print(val)
Output (the second derivative of x**2 with respect to x, i.e. the constant 2, for each of the six inputs):
[[2.]
[2.]
[2.]
[2.]
[2.]
[2.]]
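The "Additional question" above fails for the same reason: Dense performs a matrix multiplication, so its input must be a 2-D batch of shape (samples, features), while np.linspace(0, 1, 6) has shape (6,). A minimal sketch of the same reshape applied to that snippet, assuming the imports from the answer above:

x = Input(shape=(1,))
h = Dense(10, activation='sigmoid')(x)
c = Dense(1)(h)

get_layer_outputs = K.function([x], [c])
x_data = np.linspace(0, 1, 6)[:, None]   # shape (6, 1) rather than (6,)
val = get_layer_outputs([x_data])[0]     # the "In[0] is not a matrix" error goes away
print(val)                               # six values, one per input row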
answered Mar 23 at 9:33 by Kai Aeberli, edited Mar 23 at 10:29