Discussion:
[theano-users] Theano GPU results not reproducible
Wenpeng Yin
2017-07-18 20:20:41 UTC
Hi guys,

I have a long-standing problem when running Theano code on the GPU: even when I run the same program in two command windows (on the same GPU or on different GPUs), they show different performance. Whether the difference is small or large depends on the task. This makes it difficult to judge whether a modification to a program is better or worse.

I cannot find the cause. I always use the same random seed, for example "rng = numpy.random.RandomState(23455)", whenever I create parameters, so the two runs should repeat the same process, right?
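
To be concrete, here is a minimal sketch of the kind of seeding I mean (the shapes and the dropout layer are just illustrative, not my actual model): the NumPy RandomState seeds the parameter initialization, and Theano's own RandomStreams is seeded separately for any random ops inside the graph.

import numpy
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

rng = numpy.random.RandomState(23455)   # seeds NumPy-side parameter initialization
srng = RandomStreams(seed=23455)        # seeds Theano's own random ops (e.g. dropout)

# parameters drawn from the seeded NumPy RNG
W = theano.shared(
    rng.uniform(low=-0.1, high=0.1, size=(100, 50)).astype(theano.config.floatX),
    name='W')

x = T.matrix('x')
mask = srng.binomial(size=x.shape, p=0.5, dtype=theano.config.floatX)
y = T.dot(x * mask, W)
f = theano.function([x], y)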

The only other thing I can think of is that the GPU uses 32-bit floats rather than 64-bit, so some precision is lost?
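
(Just to illustrate the kind of precision gap I mean between float32 and float64, not that this alone would explain the differences:)

import numpy
print(numpy.float32(1.0) + numpy.float32(1e-8))   # 1.0 -- the small term is rounded away
print(numpy.float64(1.0) + numpy.float64(1e-8))   # 1.00000001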

Thanks for any hints.
Jesse Livezey
2017-07-20 01:20:16 UTC
Some operations on the GPU are not deterministic; I think some convolution operations and also reduction operations are two examples. See this thread for more info:
https://groups.google.com/forum/#!searchin/theano-users/atomic$20add%7Csort:relevance/theano-users/g-BF6zwMirM/ojWzbUBPBwAJ
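
As a quick way to see why accumulation order matters in float32, here is a small NumPy-only sketch (not Theano-specific). GPU reductions that use atomic adds accumulate in whatever order the threads happen to run, and the same effect makes two runs differ in the low-order bits, which can then compound over training:

import numpy

rng = numpy.random.RandomState(0)
x = rng.randn(1000000).astype(numpy.float32)

perm = rng.permutation(x.size)
s1 = numpy.sum(x)        # one summation order
s2 = numpy.sum(x[perm])  # same values, different order
print(s1, s2, s1 == s2)  # typically differ in the last bits

Depending on your Theano and cuDNN versions, there may also be config flags to force deterministic convolution algorithms (e.g. dnn.conv.algo_bwd_filter=deterministic and dnn.conv.algo_bwd_data=deterministic); check the config documentation for what your version supports.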
Wenpeng Yin
2017-07-20 19:23:15 UTC
Thanks Jesse, it helps a lot!


Wenpeng Yin
University of Munich
CIS, Oettingenstr. 67, 80538 Munich, Germany
Homepage: https://sites.google.com/site/yinwenpeng1987/