Discussion:
[theano-users] Using multiple GPUs
Alfred Ferrer Florensa
2017-08-22 16:33:31 UTC
Hello,

I am not sure where to ask this, but this seems like the most appropriate place.
I have a small question about how multiple GPUs work in Theano: would this option
(the one from the example at
http://deeplearning.net/software/theano/tutorial/using_multi_gpu.html) be
equally fast:

import numpy
import theano

v01 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev0')
v02 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev0')
v11 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev1')
v12 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev1')

f = theano.function([], [theano.tensor.dot(v01, v02),
                         theano.tensor.dot(v11, v12)])
f()

Or this one, where I use the same device for both dot products:

import numpy
import theano

v01 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev0')
v02 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev0')
v11 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev0')
v12 = theano.shared(numpy.random.random((1024, 1024)).astype('float32'),
                    target='dev0')

f = theano.function([], [theano.tensor.dot(v01, v02),
                         theano.tensor.dot(v11, v12)])
f()
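
For reference, both snippets assume that two GPU contexts have already been set
up, roughly as the tutorial describes. On my machine I do something like the
following before importing theano (the dev0->cuda0 / dev1->cuda1 mapping is
just my local setup):

import os

# Map the context names dev0/dev1 to actual GPUs before theano is imported.
# The cuda0/cuda1 device names are specific to my machine; adjust as needed.
os.environ['THEANO_FLAGS'] = 'contexts=dev0->cuda0;dev1->cuda1'

import theano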


Thanks for your time
Frédéric Bastien
2017-08-23 01:15:59 UTC
The first one is probably faster most of the time.

But this interface isn't well supported in Theano. It is still experimental
and has known crash cases.
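
If you want to measure the difference on your own hardware, a rough check is to
time both compiled functions. A minimal sketch (the iteration count is
arbitrary, and without explicit GPU synchronization the numbers are only
approximate):

import time

def bench(fn, n_iter=50):
    # One warm-up call so compilation and initial transfers are not measured.
    fn()
    start = time.time()
    for _ in range(n_iter):
        fn()
    return (time.time() - start) / n_iter

# f is the function compiled by either of the two snippets above.
print('average seconds per call:', bench(f))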