Discussion:
[theano-users] TypeError: ('An update must have the same type as the original shared variable
Haree Varshan Jeyaram
2018-07-23 22:11:23 UTC
Hi,


I am getting a TypeError when applying the updates in my optimizer
function. The code and the error are as follows:

*Code:*

def adam(lr, tparams, grads, inp, cost):
    gshared = [theano.shared(p.get_value() * 0., name='%s_grad' % k)
               for k, p in tparams.iteritems()]
    gsup = [(gs, g) for gs, g in zip(gshared, grads)]

    f_grad_shared = theano.function(inp, cost, updates=gsup, profile=False)

    b1 = 0.1
    b2 = 0.001
    e = 1e-8

    updates = []

    i = theano.shared(numpy.float32(0.))
    i_t = i + 1.
    fix1 = 1. - b1 ** i_t
    fix2 = 1. - b2 ** i_t
    lr_t = lr * (tensor.sqrt(fix2) / fix1)

    for p, g in zip(tparams.values(), gshared):
        m = theano.shared(p.get_value() * 0.)
        v = theano.shared(p.get_value() * 0.)
        m_t = (b1 * g) + ((1. - b1) * m)
        v_t = (b2 * tensor.sqr(g)) + ((1. - b2) * v)
        g_t = m_t / (tensor.sqrt(v_t) + e)
        p_t = p - (lr_t * g_t)
        updates.append((m, m_t))
        updates.append((v, v_t))
        updates.append((p, p_t))
    updates.append((i, i_t))

    f_update = theano.function([lr], [], updates=updates,
                               on_unused_input='ignore', profile=False)

    return f_grad_shared, f_update
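
For reference, the per-parameter step that the loop builds symbolically looks like this when executed eagerly in NumPy. This is only an illustrative sketch (the toy shapes, values, and the explicit float32 casts are my assumptions, not part of the original code), but it follows the same update formulas:

```python
import numpy as np

floatX = np.float32  # mirroring theano.config.floatX = 'float32'

# One Adam step for a single toy parameter, following the same
# formulas as the loop above (names and values are illustrative).
b1, b2, e = floatX(0.1), floatX(0.001), floatX(1e-8)
lr = floatX(0.0002)

p = np.zeros((2, 2), dtype=floatX)   # parameter (like a shared variable)
g = np.ones((2, 2), dtype=floatX)    # its gradient
m = np.zeros_like(p)                 # first-moment accumulator
v = np.zeros_like(p)                 # second-moment accumulator

i_t = floatX(1.0)                    # timestep counter after the update
fix1 = floatX(1.0) - b1 ** i_t      # bias corrections
fix2 = floatX(1.0) - b2 ** i_t
lr_t = lr * (np.sqrt(fix2) / fix1)

m_t = (b1 * g) + ((floatX(1.0) - b1) * m)
v_t = (b2 * np.square(g)) + ((floatX(1.0) - b2) * v)
g_t = m_t / (np.sqrt(v_t) + e)
p_t = p - (lr_t * g_t)               # updated parameter

assert p_t.dtype == np.float32       # no silent upcast to float64
```

Because every constant here is cast to float32 up front, the whole step stays float32 end to end.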


*Error:*


  File "optim.py", line 39, in adam
    f_update = theano.function([lr], [], updates=updates,
                               on_unused_input='ignore', profile=False)

  File "C:\Users\hareevarshan\Anaconda2\lib\site-packages\theano\compile\function.py", line 317, in function
    output_keys=output_keys)

  File "C:\Users\hareevarshan\Anaconda2\lib\site-packages\theano\compile\pfunc.py", line 449, in pfunc
    no_default_updates=no_default_updates)

  File "C:\Users\hareevarshan\Anaconda2\lib\site-packages\theano\compile\pfunc.py", line 208, in rebuild_collect_shared
    raise TypeError(err_msg, err_sug)

TypeError: ('An update must have the same type as the original shared
variable (shared_var=<TensorType(float32, matrix)>,
shared_var.type=TensorType(float32, matrix),
update_val=Elemwise{add,no_inplace}.0, update_val.type=TensorType(float64,
matrix)).', 'If the difference is related to the broadcast pattern, you can
call the tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to
remove broadcastable dimensions.')


I'm not able to figure out the issue. Kindly help me in resolving this.
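
For what it's worth, the float64 in update_val.type usually means a float64 constant leaked into the float32 graph. A likely suspect here is the bare Python float constants (b1, b2, e, and the 1. literals): a Python float is a double, Theano typically materializes it as a float64 graph constant (an assumption about the default cast policy), and one float64 operand upcasts the whole update expression. A NumPy sketch of the same promotion rule (names are illustrative):

```python
import numpy as np

p = np.zeros((3, 3), dtype=np.float32)   # like the float32 shared variable

# A bare Python float literal is a double:
assert np.asarray(0.1).dtype == np.float64

# Once a float64 operand enters float32 arithmetic, the whole
# expression is promoted to float64 -- the dtype the error reports.
b1_bad = np.full((3, 3), 0.1)            # float64, like a float64 constant
m_t = (b1_bad * p) + ((1.0 - b1_bad) * p)
assert m_t.dtype == np.float64

# Casting the constant down keeps the update float32:
b1 = np.float32(0.1)
m_t_fixed = (b1 * p) + ((np.float32(1.0) - b1) * p)
assert m_t_fixed.dtype == np.float32
```

If that is the cause, casting b1, b2, e, and the 1. literals with numpy.float32 (or to theano.config.floatX) should make the updates float32 again.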

Thanks.