Discussion:
[theano-users] How can I take a gradient through a random normal distribution?
Sunjeet Jena
2017-06-16 22:43:44 UTC
Permalink
I am working on code to implement a deep RL algorithm where the policy
is given by samples drawn from a normal distribution. But when I
differentiate through the cost function (which of course depends on the
distribution), I get the following error:

"theano.gradient.NullTypeGradError: tensor.grad encountered a NaN. This
variable is Null because the grad method for input 2
(Subtensor{int64:int64:}.0) of the RandomFunction{normal} op is
mathematically undefined. No gradient defined through raw random numbers op"


Is there any way I can solve this?
--
---
You received this message because you are subscribed to the Google Groups "theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to theano-users+***@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
Michael Harradon
2017-06-17 18:59:03 UTC
Permalink
Typically people use the reparameterization trick to handle this. See the
original variational autoencoder paper, and example lasagne implementation
here: https://github.com/Lasagne/Recipes/blob/master/examples/variational_autoencoder/variational_autoencoder.py#L92
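The core idea is to move the randomness into a parameter-free noise variable: instead of sampling z ~ N(mu, sigma) directly (which has no gradient), sample eps ~ N(0, 1) and compute z = mu + sigma * eps, so z is a deterministic, differentiable function of mu and sigma. A minimal NumPy sketch of that trick (illustrative only, not the Lasagne code from the link):

```python
import numpy as np

rng = np.random.default_rng(0)

# distribution parameters (in Theano these would be symbolic outputs
# of the policy network, and gradients would flow through them)
mu, sigma = 1.5, 0.5

# noise drawn independently of the parameters -- no gradient needed here
eps = rng.standard_normal(10000)

# reparameterized sample: z ~ N(mu, sigma), but now dz/dmu = 1 and
# dz/dsigma = eps are both well-defined, unlike differentiating
# through the RandomFunction{normal} op itself
z = mu + sigma * eps
```

In Theano this means using a shared random stream only for eps and building z symbolically from mu and sigma, so tensor.grad never has to differentiate through the raw sampler.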