Steven Bocco
2017-09-07 13:48:22 UTC
Announcing Theano 0.10.0beta2
This is a beta release for a major version, with new features and bug fixes.
The upgrade is recommended for developers who want to help test and report
bugs, or want to use new features now.
For those using the bleeding edge version in the git repository, we
encourage you to update to the 0.10.0beta2 tag.
What's New
Highlights:
- Support NumPy 1.13
- Support pygpu 0.7
- Added conda recipe
- Optional faster optimization step with new destroy handler
- Added documentation for RNNBlock
- Bug fixes, crash fixes, warning improvements and documentation updates
Interface changes:
- Added new parameter target for MRG functions
Convolution updates:
- Added unshared convolutions
- Added 3D separable convolutions
- Added 3D grouped convolutions
- Removed old conv3d interface
- Deprecated old conv2d interface
- Updated conv documentation
GPU:
- Added a meta-optimizer to select the fastest GPU implementations for
convolutions
- cuDNN:
    - Official support for v6.* and v7.*; support for v5.* will be
      removed in the next release
- Added spatial transformation operation based on cuDNN
- Updated and improved caching system for runtime-chosen cuDNN
convolution algorithms
- Support cuDNN v7 tensor core operations for convolutions with
runtime timed algorithms
- Restricted cuDNN reductions to contiguous inputs
- Automatic addition of cuDNN DLL path to PATH environment variable
on Windows
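The convolution meta-optimizer above is opt-in. As a rough sketch (flag names as commonly used in this Theano series; verify them against your installed version's config documentation), it can be enabled through THEANO_FLAGS together with the usual GPU settings:

```shell
# Enable the GPU backend and the timing-based convolution meta-optimizer,
# which benchmarks the available implementations and picks the fastest.
export THEANO_FLAGS="device=cuda0,floatX=float32,optimizer_including=conv_meta"
```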
New features:
- Added tensor6() and tensor7() in theano.tensor module
- Added boolean indexing for sub-tensors
- Added covariance matrix function theano.tensor.cov
- Added new Theano flag pickle_test_value to help disable pickling test
values
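Since Theano's symbolic interface is modeled on NumPy's, the NumPy equivalents give a feel for what the new boolean indexing and covariance features compute. This is an illustrative sketch of the semantics, not Theano code:

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0, -4.0])

# Boolean indexing: select entries where a condition holds.
# Theano sub-tensors now accept a boolean mask in the same way.
positives = x[x > 0]        # [1.0, 3.0]

# Covariance matrix: theano.tensor.cov follows numpy.cov, where each
# row of the input is a variable and each column an observation.
data = np.array([[1.0, 2.0, 3.0],
                 [2.0, 4.0, 6.0]])
c = np.cov(data)            # 2x2 covariance matrix
```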
Others:
- Kept stack trace for optimizations in new GPU backend
Other more detailed changes:
- Moved all C code files into separate folder c_code in every Theano
module
- Improvements for Jenkins tests
Download and Install
You can download Theano from http://pypi.python.org/pypi/Theano
Installation instructions are available at
http://deeplearning.net/software/theano/install.html
Description
Theano is a Python library that allows you to define, optimize, and
efficiently evaluate mathematical expressions involving multi-dimensional
arrays. It is built on top of NumPy. Theano features:
- tight integration with NumPy: a similar interface to NumPy's.
numpy.ndarrays are also used internally in Theano-compiled functions.
- transparent use of a GPU: perform data-intensive computations much
faster than on a CPU.
- efficient symbolic differentiation: Theano can compute derivatives for
functions of one or many inputs.
- speed and stability optimizations: avoid nasty bugs when computing
  expressions such as log(1 + exp(x)) for large values of x.
- dynamic C code generation: evaluate expressions faster.
- extensive unit-testing and self-verification: includes tools for
detecting and diagnosing bugs and/or potential problems.
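The log(1 + exp(x)) example above can be seen directly in NumPy: the naive expression overflows for large x, while the mathematically equivalent rewrite that Theano's stability optimizations apply stays finite.

```python
import numpy as np

x = 1000.0

# Naive evaluation: exp(1000) overflows to inf in float64,
# so the whole expression becomes inf.
with np.errstate(over='ignore'):
    naive = np.log(1.0 + np.exp(x))

# Stable rewrite: log(1 + exp(x)) == logaddexp(0, x), which is
# computed without forming exp(x) and gives the correct answer.
stable = np.logaddexp(0.0, x)   # 1000.0
```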
Theano has been powering large-scale computationally intensive scientific
research since 2007, but it is also approachable enough to be used in the
classroom (IFT6266 at the University of Montreal).
Resources
About Theano:
http://deeplearning.net/software/theano/
Theano-related projects:
http://github.com/Theano/Theano/wiki/Related-projects
About NumPy:
http://numpy.scipy.org/
About SciPy:
http://www.scipy.org/
Machine Learning Tutorial with Theano on Deep Architectures:
http://deeplearning.net/tutorial/
Acknowledgments
I would like to thank all contributors to Theano. Since release 0.9.0, many
people have helped, notably (in alphabetical order):
- Aarni Koskela
- Adam Becker
- Adam Geitgey
- Adrian Keet
- Adrian Seyboldt
- Aleksandar Botev
- Alexander Matyasko
- amrithasuresh
- Andrei Costinescu
- Anirudh Goyal
- Anmol Sahoo
- Arnaud Bergeron
- Bogdan Budescu
- Boris Fomitchev
- Cesar Laurent
- Chiheb Trabelsi
- Chong Wu
- dareneiri
- Daren Eiri
- Dzmitry Bahdanau
- erakra
- Faruk Ahmed
- Florian Bordes
- fo40225
- Frederic Bastien
- Gabe Schwartz
- Ghislain Antony Vaillant
- Gijs van Tulder
- Holger Kohr
- Jan Schlüter
- Jayanth Koushik
- Jeff Donahue
- jhelie
- João Victor Tozatti Risso
- Joseph Paul Cohen
- Juan Camilo Gamboa Higuera
- Laurent Dinh
- Lilian Besson
- lrast
- Lv Tao
- Matt Graham
- Michael Manukyan
- Mohamed Ishmael Diwan Belghazi
- Mohammed Affan
- morrme
- Murugesh Marvel
- NALEPA
- Pascal Lamblin
- Ramana Subramanyam
- Reyhane Askari
- Saizheng Zhang
- Shawn Tan
- Shubh Vachher
- Simon Lefrancois
- Sina Honari
- Steven Bocco
- Tegan Maharaj
- Thomas George
- Tim Cooijmans
- Vikram
- vipulraheja
- wyjw
- Xavier Bouthillier
- xiaoqie
- Yikang Shen
- Zhouhan LIN
- Zotov Yuriy
Also, thank you to all NumPy and SciPy developers, as Theano builds on their
strengths.
All questions/comments are always welcome on the Theano mailing-lists (
http://deeplearning.net/software/theano/#community )
--
Steven Bocco
--
---
You received this message because you are subscribed to the Google Groups "theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to theano-users+***@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.