Discussion:
[theano-users] about theano.gradient.DisconnectedInputError
Yang Xiang
2016-01-16 15:02:06 UTC
Hi all,

I encountered theano.gradient.DisconnectedInputError when writing code
for an end-to-end process. I have a series of parameters to update. To
check which parameter caused the disconnect error, I removed them from
the function's parameters one by one. But even after I had removed all
the parameters (params=[]), the error was still there. What does this
case mean?

The error report stated: theano.gradient.DisconnectedInputError: grad
method was asked to compute the gradient with respect to a variable that is
not part of the computational graph of the cost, or is used only by a
non-differentiable operator: <TensorType(float64, 4D)>

Could anyone help?

Thanks.

Yang
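
For context, the usual trigger for this error is asking grad for a variable
that the cost never touches. A minimal sketch with illustrative names (not
Yang's actual parameters):

import numpy as np
import theano
import theano.tensor as tt

x = tt.matrix('x')
w_used = theano.shared(np.ones((3, 3)), name='w_used')
w_unused = theano.shared(np.ones((3, 3)), name='w_unused')  # never feeds the cost

cost = tt.sum(tt.dot(x, w_used))

# w_unused is not part of the computational graph of cost, so this raises
# theano.gradient.DisconnectedInputError.
grads = tt.grad(cost, [w_used, w_unused])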
Daniel Renshaw
2016-01-17 13:53:09 UTC
Are you saying you have code of this form:

import theano.tensor as tt

x = tt.matrix()
c = tt.sum(2 * x)    # a cost that depends only on x
gs = tt.grad(c, [])  # gradient with respect to an empty list of variables

i.e. an attempt to compute the gradient of some cost c with respect to...
nothing, and that is generating the exception whose details you posted?

If so, we'll probably need to see what the cost computation is; can you
share more code? Have you been able to reproduce the problem with simple
code that can be executed without any external dependencies?

Daniel
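
As an aside, theano.grad also takes a disconnected_inputs argument
('raise', 'warn', or 'ignore'), which can help localize the offending
variable without editing the parameter list. A short sketch with made-up
names:

import numpy as np
import theano
import theano.tensor as tt

x = tt.matrix('x')
w = theano.shared(np.ones((3, 3)), name='w')
b = theano.shared(np.zeros(3), name='b')  # never used by the cost
cost = tt.sum(tt.dot(x, w))

# With 'warn', grad emits a warning for each disconnected input instead of
# raising, and returns a zero gradient for it.
grads = tt.grad(cost, [w, b], disconnected_inputs='warn')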
Yang Xiang
2016-01-20 01:24:21 UTC
Thanks Daniel,

I drew out the whole graph and labeled all the parameters along the
path, and finally fixed the problem. The name of one parameter (copied
from the forward RNN to the backward one) was misspelled.

I still have no idea why the empty params list for grad also triggered
this problem, and I suggest the exception should be thrown along with the
name of the disconnected parameter, not just its type information. One
more thing I learned: drawing the graph and labeling the parameters is a
good way to debug this disconnect error.

Yang
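
For anyone who wants to inspect the graph without drawing it by hand,
Theano's printing utilities can render it. A minimal sketch (the stand-in
cost here substitutes for your own):

import numpy as np
import theano
import theano.tensor as tt

# Stand-in graph; substitute your own cost.
x = tt.matrix('x')
w = theano.shared(np.ones((3, 3)), name='w')
cost = tt.sum(tt.dot(x, w))

# Textual dump of the graph feeding cost; named variables appear by name,
# so a misspelled or disconnected parameter stands out.
theano.printing.debugprint(cost)

# Graphical version (requires pydot and graphviz).
theano.printing.pydotprint(cost, outfile='cost_graph.png', var_with_name_simple=True)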

Lijun Wu
2016-12-02 08:29:47 UTC
Hi Yang,

Can you share more about how to draw the graph and label the parameters
for debugging? I met a similar problem, but my usage is different:
I first trained a modelA, then used modelA as part of a function while
training modelB. modelA should stay fixed, but when I try to compile
modelB, it raises this error: DisconnectedInputError: xxxx by a
non-differentiable operator: b.
But I don't even know where this b occurred; could you give some advice?
Thanks.
相洋
2016-12-02 09:28:00 UTC
Hi Lijun,

I have been away from Keras for a while, but I think your problem should
be easy to debug. You can draw the graph yourself by plotting all the
nodes and arcs, or you can use the visualization utilities provided by
Keras: https://keras.io/visualization/.
Hope this is helpful!

Good luck!
Yang
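
The utility behind that page has moved between Keras versions
(keras.utils.visualize_util.plot in older releases, keras.utils.plot_model
in newer ones). A minimal sketch with a stand-in model:

from keras.models import Sequential
from keras.layers import Dense
from keras.utils import plot_model  # older Keras: from keras.utils.visualize_util import plot

# Tiny stand-in model; substitute your own.
model = Sequential([Dense(8, input_dim=4, activation='relu'), Dense(1)])

# Writes an image of the layer graph with names and tensor shapes, which
# makes it easy to see whether every parameter-carrying layer is connected.
plot_model(model, to_file='model.png', show_shapes=True)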
Priyank Pathak
2017-06-19 23:58:22 UTC
I'm facing a similar problem.

I'm trying to implement a VAE for audio.

The code that is causing the problem, in brief:

z_mean = Dense(self.z_dim, init=initialization, activation='linear')(H2)
z_mean = LeakyReLU(alpha=.001)(z_mean)

z_log_var = Dense(self.z_dim, init=initialization, activation='linear')(H2)
z_log_var = LeakyReLU(alpha=.001)(z_log_var)

z = Lambda(self.sampling, output_shape=K.int_shape(z_mean))([z_mean, z_log_var])
H3 = Dense(input_dim - 1, init=initialization, activation='linear')(z)  # causing all the trouble

The gradients fail to reach z_log_var and z_mean. When I do

grads = K.gradients(cost, trainable_vars)

the "Backtrace when that variable is created" points me to the line above.