[theano-users] Announcing Theano 0.10.0beta4
Steven Bocco
2017-10-17 14:28:58 UTC
Announcing Theano 0.10.0beta4

This is a beta release for a major version, with new features and bug fixes.

This upgrade is recommended for developers who want to help test and report
bugs, or who want to try the new features now.

For those using the bleeding edge version in the git repository, we
encourage you to update to the rel-0.10.0beta4 tag.
What's New

Highlights:

- Announcing that MILA will stop developing Theano
<https://groups.google.com/d/msg/theano-users/7Poq8BZutbY/rNCIfvAEAwAJ>
- Bug fixes, crash fixes, warning improvements and documentation updates

Interface changes:

- Generalized AllocDiag for any non-scalar input

Convolution updates:

- Implemented fractional bilinear upsampling

cuDNN (GPU):

- Disallowed float16 precision for convolution gradients
- Fixed memory alignment detection
- Added profiling in C debug mode (with theano flag cmodule.debug=True)
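The new C-debug-mode profiling is enabled through Theano's flag mechanism. As a minimal sketch (the script name `my_script.py` is a placeholder, not from the release notes), the flag can be set for a single run via the `THEANO_FLAGS` environment variable:

```shell
# Enable C debug mode (includes the new cuDNN profiling output) for one run.
THEANO_FLAGS='cmodule.debug=True' python my_script.py
```

The same setting can instead be placed under the `[cmodule]` section of a `.theanorc` file to make it persistent.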

New features:

- Implemented truncated normal distribution with the Box-Muller transform
- Added L_op() overriding option for OpFromGraph
- Added NumPy C-API based fallback implementation for [sd]gemv_ and
[sd]dot_
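For illustration, the general technique behind the new truncated normal sampler can be sketched in plain NumPy: draw standard normals with the Box-Muller transform, then keep only draws inside the truncation bounds. This is a hedged sketch of the technique, not Theano's actual implementation; the function names and default bounds are invented for the example.

```python
import numpy as np

def box_muller(n, rng):
    # Box-Muller: map two uniforms in (0, 1] to one standard normal draw.
    u1 = 1.0 - rng.random(n)  # shift to (0, 1] so log() never sees 0
    u2 = rng.random(n)
    return np.sqrt(-2.0 * np.log(u1)) * np.cos(2.0 * np.pi * u2)

def truncated_normal(n, low=-2.0, high=2.0, seed=0):
    # Rejection sampling: draw batches and keep values inside [low, high].
    rng = np.random.default_rng(seed)
    out = np.empty(0)
    while out.size < n:
        z = box_muller(n, rng)
        out = np.concatenate([out, z[(z >= low) & (z <= high)]])
    return out[:n]

samples = truncated_normal(10000)
```

Simple rejection is fine for bounds like [-2, 2]; for narrow or far-tail truncation intervals, more specialized samplers are usually preferred because the acceptance rate of plain rejection collapses.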

Other more detailed changes:

- Improved stack trace follow-up for GPU optimizations
- Fixed gradient error for elemwise minimum and maximum when the compared
values are equal
- Fixed gradient for ARange
- Removed ViewOp subclass during optimization

Download and Install

You can download Theano from http://pypi.python.org/pypi/Theano

Installation instructions are available at
http://deeplearning.net/software/theano/install.html
Description

Theano is a Python library that allows you to define, optimize, and
efficiently evaluate mathematical expressions involving multi-dimensional
arrays. It is built on top of NumPy. Theano features:


- tight integration with NumPy: an interface similar to NumPy's;
numpy.ndarray is also used internally in Theano-compiled functions.
- transparent use of a GPU: perform data-intensive computations much
faster than on a CPU.
- efficient symbolic differentiation: Theano can compute derivatives for
functions of one or many inputs.
- speed and stability optimizations: avoid nasty bugs when computing
expressions such as log(1 + exp(x)) for large values of x.
- dynamic C code generation: evaluate expressions faster.
- extensive unit-testing and self-verification: includes tools for
detecting and diagnosing bugs and/or potential problems.
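The stability point above can be seen in plain NumPy, independently of Theano: computing log(1 + exp(x)) naively overflows for large x, while an algebraically equivalent rewrite of the kind Theano's optimizer applies stays finite. This is a sketch of the numerical issue, with function names invented for the example.

```python
import numpy as np

def softplus_naive(x):
    # exp(800) overflows to inf in float64, so the result is inf.
    return np.log(1.0 + np.exp(x))

def softplus_stable(x):
    # For large x, log(1 + exp(x)) ~= x; otherwise use log1p(exp(x)),
    # clamping the exp argument so neither branch of where() overflows.
    x = np.asarray(x, dtype=np.float64)
    return np.where(x > 30.0, x, np.log1p(np.exp(np.minimum(x, 30.0))))
```

With Theano, the same rewrite happens automatically: a graph built as `T.log(1 + T.exp(x))` is replaced by a stable softplus during optimization, so the user-facing expression can stay in its natural mathematical form.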

Theano has been powering large-scale computationally intensive scientific
research since 2007, but it is also approachable enough to be used in the
classroom (IFT6266 at the University of Montreal).
Resources

About Theano:

http://deeplearning.net/software/theano/

Theano-related projects:

http://github.com/Theano/Theano/wiki/Related-projects

About NumPy:

http://numpy.scipy.org/

About SciPy:

http://www.scipy.org/

Machine Learning Tutorial with Theano on Deep Architectures:

http://deeplearning.net/tutorial/
Acknowledgments

I would like to thank all contributors of Theano. Since release 0.9.0, many
people have helped, notably (in alphabetical order):


- Aarni Koskela
- Adam Becker
- Adam Geitgey
- Adrian Keet
- Adrian Seyboldt
- Aleksandar Botev
- Alexander Matyasko
- amrithasuresh
- Andrei Costinescu
- Anirudh Goyal
- Anmol Sahoo
- Arnaud Bergeron
- Bogdan Budescu
- Boris Fomitchev
- Cesar Laurent
- Chiheb Trabelsi
- Chong Wu
- Daren Eiri
- dareneiri
- Dzmitry Bahdanau
- erakra
- Faruk Ahmed
- Florian Bordes
- fo40225
- Frederic Bastien
- Gabe Schwartz
- Ghislain Antony Vaillant
- Gijs van Tulder
- Holger Kohr
- Jan Schlüter
- Jayanth Koushik
- Jeff Donahue
- jhelie
- João Victor Tozatti Risso
- Joseph Paul Cohen
- Juan Camilo Gamboa Higuera
- Laurent Dinh
- Lilian Besson
- lrast
- Lv Tao
- Matt Graham
- Michael Manukyan
- Mohamed Ishmael Diwan Belghazi
- Mohammed Affan
- morrme
- mrTsjolder
- Murugesh Marvel
- naitonium
- NALEPA
- Nan Jiang
- Pascal Lamblin
- Ramana Subramanyam
- Reyhane Askari
- Saizheng Zhang
- Shawn Tan
- Shubh Vachher
- Simon Lefrancois
- Sina Honari
- Steven Bocco
- Tegan Maharaj
- Thomas George
- Tim Cooijmans
- Vikram
- vipulraheja
- wyjw
- Xavier Bouthillier
- xiaoqie
- Yikang Shen
- Zhouhan LIN
- Zotov Yuriy

Also, thank you to all NumPy and SciPy developers, as Theano builds on their
strengths.

All questions/comments are always welcome on the Theano mailing-lists (
http://deeplearning.net/software/theano/#community )
--
Steven Bocco