Discussion:
[theano-users] theano with keras, using evaluate and clone
slavakung
2017-05-17 11:07:30 UTC
Hello,

I am a researcher in optimization looking to work on optimization
algorithms for DNNs. I am writing a custom solver, and to code it I need to
do two things inside the optimizer's get_updates function:
1) Evaluate some aspect of a tensor and put the result in a numpy array
2) Compute a second gradient: a gradient of the loss function with respect
to the parameter tensors, but evaluated at slightly different parameter
values rather than the current ones

#2 proved to be quite a nightmare; it took me two months and many
questions on Stack Overflow and GitHub to find out that this is
actually impossible in tensorflow (strangely enough), so I switched
to theano as the backend. Now I do this as follows,
in the get_updates function:

grads = self.get_gradients(loss, params)
....

replace = {p: npm for p, npm in zip(params, otherparams)}
gradsn = [th.clone(g, replace=replace) for g in grads]

Does this look correct? It seems fine at execution time, but I am not
certain.
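To illustrate what I think clone with a replace dict does: it should act like symbolic substitution, rewriting the graph so every occurrence of a parameter node is replaced by the new node. A toy analogy in sympy (not Theano; the scalar loss and symbol names here are made up, purely to illustrate the idea):

```python
import sympy as sp

# Hypothetical stand-ins: weight w, input x, and a shifted weight w_new.
w, x, w_new = sp.symbols('w x w_new')

# A toy scalar loss and its gradient w.r.t. w (analogous to get_gradients).
loss = (w * x - 1) ** 2
grad = sp.diff(loss, w)            # 2*x*(w*x - 1)

# theano.clone(g, replace={p: npm}) should behave like subs() here:
# the gradient expression is kept, but w is swapped for w_new.
grad_shifted = grad.subs(w, w_new)  # 2*x*(w_new*x - 1)
```

If this analogy holds, the cloned gradients are the same symbolic expressions with the parameter nodes re-pointed at otherparams, which is exactly what I want.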

I am now having trouble with #1. In tensorflow it was easy: you just set

sess = tf.Session()
K.set_session(sess)

at the beginning and then call

K.get_value(tensorvar)

to get the value. But I have not found any equivalent in theano, and when I
try this particular line in my code:

sumnorm = (sum([K.eval(g.norm(2))**2 for g in grads]))**0.5

I get the error:
theano.gof.fg.MissingInputError: ("An input of the graph, used to compute
dot(dense_input_43, dense_211_W), was not provided and not given a value.
Use the Theano flag exception_verbosity='high', for more information on
this error.", dense_input_43)
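My understanding of the error is that the gradient graph still depends on the network's input placeholder (dense_input_43), so evaluating it without feeding any data cannot work. A toy sympy analogy of what I think is going on (not Theano; lambdify stands in for compiling a theano.function, and all names are made up):

```python
import sympy as sp

x, w = sp.symbols('x w')

# The "gradient" still contains the free input symbol x,
# just as my grads still contain the network's input placeholder.
grad = sp.diff((w * x - 1) ** 2, w)   # 2*x*(w*x - 1)

# Asking for a numeric value with no value for x is the analog of
# MissingInputError: x is "an input of the graph" that was never provided.
assert grad.free_symbols == {x, w}

# Compiling a function that takes the missing input does work,
# analogous to theano.function([input, param], expr):
grad_fn = sp.lambdify((x, w), grad)
value = grad_fn(2.0, 3.0)             # 2*2*(3*2 - 1) = 20.0
```

So I suspect the fix is either to compile a function that is fed the model inputs, or to keep the norm computation symbolic inside get_updates instead of pulling it into numpy, but I am not sure which is intended.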


Suggestions? Thank you