Discussion:
[theano-users] [theano.gradient.DisconnectedInputError]
Priyank Pathak
2017-06-20 16:11:58 UTC
Hey,

So I'm trying to implement a VAE in Keras, and the following is a snippet
of the code I've written.

z_mean = Dense(self.z_dim, init=initialization, activation='linear')(H2)
z_mean = LeakyReLU(alpha=.001)(z_mean)

z_log_var = Dense(self.z_dim, init=initialization, activation='linear')(H2)
z_log_var = LeakyReLU(alpha=.001)(z_log_var)

z = Lambda(self.sampling, output_shape=K.int_shape(z_mean))([z_mean, z_log_var])

H3 = Dense(input_dim - 1, init=initialization, activation='linear')(z)
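For context, a sampling helper like this usually implements the reparameterization trick; here is a minimal sketch of that pattern (my actual self.sampling isn't shown above, so treat the body as an assumption about what it should look like, written against the Keras 1 backend API):

from keras import backend as K

def sampling(self, args):
    # Reparameterization trick: z = mean + sigma * epsilon with
    # epsilon ~ N(0, 1). Written this way, z stays differentiable
    # w.r.t. z_mean and z_log_var, so gradients can flow back
    # through the Lambda layer.
    z_mean, z_log_var = args
    epsilon = K.random_normal(shape=K.shape(z_mean), mean=0., std=1.)
    return z_mean + K.exp(0.5 * z_log_var) * epsilon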

Now when I later compile the model:

model = Model(input=data_input, output=[xh, z_mean, z_log_var])
grads = K.gradients(cost, trainable_vars)

it gives me the error:
theano.gradient.DisconnectedInputError:

Backtrace when that variable is created:

H3 = Dense(input_dim - 1, init=initialization, activation='linear')(z)
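In case it helps with debugging, theano.grad can be called directly with disconnected_inputs='warn', which makes Theano report the disconnected variables as warnings instead of raising. A sketch, reusing the same cost and trainable_vars as above:

import theano

# Same gradient computation as K.gradients(cost, trainable_vars),
# but with disconnected_inputs='warn' Theano names each variable
# that is unreachable from cost instead of raising
# DisconnectedInputError.
grads = theano.grad(cost, trainable_vars, disconnected_inputs='warn')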


Does anyone have any idea why this error occurs?
My guess is that the gradient fails to propagate from z back to z_mean and
z_log_var.
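A quick way to test that guess (a sketch; z, z_mean and z_log_var are the symbolic tensors built above) is to differentiate a scalar function of z with respect to the two inputs:

import theano

# If this raises DisconnectedInputError, z really is cut off from
# z_mean and z_log_var, e.g. because the sampling step draws a
# random tensor without combining it with z_mean/z_log_var.
check = theano.grad(z.sum(), [z_mean, z_log_var])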

Regards
Priyank Pathak