Alexander Botev
2017-05-20 15:55:19 UTC
import theano
import theano.tensor as T

a = T.fmatrix()
b = T.sqr(a)
c = T.nnet.sigmoid(a)
g = T.fmatrix()
d = T.Lop(c, a, g)
f = theano.function([a, g], d)
theano.printing.debugprint(f)
Elemwise{mul} [id A] ''   5
 |Elemwise{mul} [id B] ''   3
 | |<TensorType(float32, matrix)> [id C]
 | |Elemwise{scalar_sigmoid} [id D] ''   1
 |   |<TensorType(float32, matrix)> [id E]
 |Elemwise{sub} [id F] ''   4
   |InplaceDimShuffle{x,x} [id G] ''   2
   | |TensorConstant{1.0} [id H]
   |Elemwise{scalar_sigmoid} [id I] ''   0
     |<TensorType(float32, matrix)> [id E]
My question is: why does it compute the sigmoid twice, when it could reuse that computation? Or, if it does reuse it, how can I see that in the graph? I have not switched off any of the optimisations.
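For reference, here is a minimal NumPy sketch (my own, not from the compiled graph) of what the Lop above computes: the vector-Jacobian product g * sigmoid'(a), where sigmoid'(a) = s * (1 - s) with s = sigmoid(a). Note that s is computed once and reused, whereas the debugprint shows two separate scalar_sigmoid nodes:

```python
import numpy as np

def sigmoid(x):
    # Elementwise logistic sigmoid, matching T.nnet.sigmoid
    return 1.0 / (1.0 + np.exp(-x))

a = np.array([[0.0, 1.0], [-1.0, 2.0]], dtype=np.float32)
g = np.ones_like(a)  # the "known gradient" passed to T.Lop

s = sigmoid(a)           # computed once and reused here
vjp = g * s * (1.0 - s)  # g * d(sigmoid(a))/da

print(vjp)
```
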
--
---
You received this message because you are subscribed to the Google Groups "theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to theano-users+***@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.