[theano-users] implementing dropout layer for shared weight matrix
Shadekur Rahman
2017-10-23 08:04:29 UTC
I am trying to implement dropout on the *W_a* matrix:

import numpy as np
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

rng = np.random.RandomState(1234)
srng = RandomStreams(rng.randint(999999))

W_a = theano.shared(self.sample_weight(50, 50))  # 'sample_weight' creates a 2-dimensional random array

#W_a.get_value() is a valid call

W_a = W_a * srng.binomial(size=W_a.shape, p=0.5)

#W_a.get_value() is not a valid call anymore



Can anyone tell me how to implement this so that I can still call *W_a.get_value()* even after applying *dropout*?
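
To make the question concrete, here is a minimal standalone sketch of the kind of thing I imagine: keeping *W_a* itself as the shared variable and binding the masked product to a new name seems to keep *get_value()* working, but I am not sure whether this is the right way to apply dropout to a weight matrix. (The NumPy initializer below just stands in for my *sample_weight* helper, and the *floatX* cast follows the referenced blog post.)

import numpy as np
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

rng = np.random.RandomState(1234)
srng = RandomStreams(rng.randint(999999))

# stand-in for self.sample_weight(50, 50): a 2-dimensional random array
init = rng.randn(50, 50).astype(theano.config.floatX)
W_a = theano.shared(init, name='W_a')

# build the dropout mask and bind the masked product to a *new* name,
# so W_a itself stays a shared variable
mask = srng.binomial(n=1, p=0.5, size=W_a.shape)
W_a_dropped = W_a * T.cast(mask, theano.config.floatX)

print(W_a.get_value().shape)  # still valid: (50, 50)

# the masked version is symbolic and would go into the training graph
f = theano.function([], W_a_dropped)
print(f().shape)  # (50, 50), with roughly half the entries zeroed

Is something along these lines the recommended approach, or should the mask be handled differently?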

*Reference:* http://rishy.github.io/ml/2016/10/12/dropout-with-theano/