Discussion:
[theano-users] How to build a different average pooling operation (I'll call it local average pooling)?
Feras Almasri
2017-08-08 11:36:51 UTC
I want to have a node that takes the average of only the activated points
in the last feature map. By activated points I mean any pixel greater
than zero. Instead of taking the global average of the full feature map,
I'd rather take the average of only the activated pixels.
If I just build this out of normal operations, my worry is that the
gradient flow will be broken, in the sense that the locations needed for
backpropagation are not there.

Any hint or advice would be appreciated, thank you.
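
Concretely, the operation I have in mind would be something like this rough
Theano sketch (just an illustration, assuming a flattened feature map x):

import theano.tensor as T

x = T.vector('x')                      # flattened feature map
mask = T.switch(T.gt(x, 0), 1.0, 0.0)  # 1 where a pixel is activated
count = T.maximum(mask.sum(), 1.0)     # guard against an all-zero map
masked_avg = T.sum(x * mask) / count   # average over activated pixels only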
Frédéric Bastien
2017-08-09 18:36:05 UTC
I don't understand the problem with using normal operations. Can you share
that code? I don't see any more problem with that implementation than with
a normal average pooling.
Jesse Livezey
2017-08-09 20:22:00 UTC
I think this idea would be something like

y = [1, 2, 3, 0]

y_current_avgpool = (1 + 2 + 3 + 0) / 4

y_new_avgpool = (1 + 2 + 3) / 3

I'm not sure there is a simple way to do this currently. You could do
sum pooling first, then compute the divisors by counting the number of
non-zero elements in each window, using images2neibs
(http://deeplearning.net/software/theano/library/tensor/nnet/neighbours.html#theano.tensor.nnet.neighbours.images2neibs)
and T.switch.
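
A rough sketch of that recipe (using pool_2d with mode='sum' instead of
images2neibs for brevity; the window size and variable names are just for
illustration):

import theano
import theano.tensor as T
from theano.tensor.signal.pool import pool_2d

x = T.tensor4('x')  # (batch, channels, rows, cols)
ws = (2, 2)         # pooling window, illustrative

# Sum pooling over each window.
window_sums = pool_2d(x, ws=ws, ignore_border=True, mode='sum')

# Count the non-zero elements per window by sum-pooling a 0/1 mask.
mask = T.switch(T.gt(x, 0), 1.0, 0.0)
counts = pool_2d(mask, ws=ws, ignore_border=True, mode='sum')

# Divide, guarding against windows with no activated pixels.
local_avg = window_sums / T.maximum(counts, 1.0)

f = theano.function([x], local_avg)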
Feras Almasri
2017-08-18 08:21:13 UTC
Jesse, yes, this is exactly the idea. I don't want to use the global average
because I don't want the zero (non-activated) points to be included in the
cost value or in the gradient calculation. So the local pooling should give
the average over just the activated area.

By applying the sum alone, the gradient will take all the points in that
output into account, which is not what the function should do. It is more
like a global maximum, which saves the location of the single maximum point,
but here for multiple locations at once.
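
For what it's worth, a quick check (a sketch using the example vector from
above) suggests the masked average already routes the gradient only to the
activated points, since T.grad treats the comparison inside the mask as
constant:

import numpy as np
import theano
import theano.tensor as T

x = T.vector('x')
mask = T.switch(T.gt(x, 0), 1.0, 0.0)
count = T.maximum(mask.sum(), 1.0)
masked_avg = T.sum(x * mask) / count

# The comparison inside the mask has a zero gradient, so only the
# activated entries receive gradient.
f = theano.function([x], [masked_avg, T.grad(masked_avg, x)])
avg, grad = f(np.array([1., 2., 3., 0.], dtype=theano.config.floatX))
print(avg)   # 2.0, i.e. (1 + 2 + 3) / 3
print(grad)  # [0.333, 0.333, 0.333, 0.0]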