Discussion:
[theano-users] Theano dtensor3 type variable gives error in GPU
shyam krishna khadka
2017-05-15 21:10:42 UTC
Permalink
I have code that runs smoothly on the CPU with a variable defined as:

l = T.dtensor3('l')


But when I run it on an NVIDIA GPU, it gives this error:

TypeError: Cannot convert Type TensorType(float32, 3D) (of Variable Subtensor{int64:int64:, ::}.0) into Type TensorType(float64, 3D). You can try to manually convert Subtensor{int64:int64:, ::}.0 into a TensorType(float64, 3D).

It seems the datatype is float64, but for the GPU we need to make it float32. What is the equivalent of this for the GPU? I tried

l = T.tensor3('l')

But now I am getting an assertion error:

A.ndim == 3 error
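[Editor's note: a minimal sketch of the usual fix, assuming the standard Theano setup. `T.dtensor3` is hard-wired to float64, which the old GPU backend cannot handle; `T.ftensor3` (or `T.tensor3` with `THEANO_FLAGS=floatX=float32`) declares a float32 variable instead. The `A.ndim == 3` assertion then typically comes from the *data* fed into the compiled function, e.g. a float64 or wrongly-shaped numpy array, so the host-side input must be cast to float32 as well. The symbolic Theano lines are shown as comments since they need a working Theano install; the runnable part below only demonstrates the host-side cast with numpy.]

```python
import numpy as np

# Symbolic side (Theano), shown as comments:
#   import theano.tensor as T
#   l = T.ftensor3('l')        # explicitly float32, 3-D: GPU-friendly
#   # or: l = T.tensor3('l')   # dtype follows theano.config.floatX,
#   #                          # set via THEANO_FLAGS=floatX=float32

# Host side: numpy creates float64 arrays by default,
# which would not match a float32 symbolic variable.
data = np.random.rand(2, 3, 4)
assert data.dtype == np.float64

# Cast the input to float32 before passing it to the compiled function:
data32 = data.astype(np.float32)
assert data32.dtype == np.float32
assert data32.ndim == 3        # rank must match tensor3's expected 3 dims
```

With this, both the symbolic variable and the fed-in array are float32, which is what the GPU backend expects.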
--
---
You received this message because you are subscribed to the Google Groups "theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to theano-users+***@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.