|October 25 · Issue #12|
This week, a team at MIT tried to scare us with a neural-net-based nightmare machine, Microsoft matched human performance in conversational speech recognition, and several fascinating papers were released.
Happy reading and hacking!
As always, if you enjoy receiving this newsletter, please consider sharing it with friends and colleagues; your support is very much appreciated.
| Historic Achievement: Microsoft researchers reach human parity in conversational speech recognition |
Microsoft has reached the lowest error rate ever recorded in conversational speech recognition. That's the same error rate a human reaches when transcribing a conversation!
| Deep Learning Benchmarks |
Deep learning benchmarks for different GPUs across the most common deep learning libraries. Is it better to buy a brand-new TITAN X Pascal, or can we go for a cheap used TITAN X Maxwell? And what about a GTX 1080?
| Deep Learning papers reading roadmap |
A clearly structured reading roadmap covering the history and basics, methods, and applications of deep learning. If you want to get into the field, start right here.
| Design Patterns for Deep Learning Architectures |
This working draft of an upcoming book is already a good read and has many detailed explanations of common design patterns and methods.
| If I Can Learn to Play Atari, I Can Learn TensorFlow |
A nice summary of new deep learning libraries, tools, updates to existing frameworks and where to begin if you want to explore Deep Learning.
| Introducing TensorFlow Ruby API |
Always wanted to get into TensorFlow, but afraid of Python and C++? Arafat Khan has created an API that lets you use TensorFlow from Ruby.
| A TensorFlow implementation of Zhao et al.'s EBGAN paper |
A TensorFlow implementation of the EBGAN paper listed in the papers section below.
| Nightmare Machine |
MIT put GANs to great use and created a compilation of horror images for Halloween.
| Energy-based Generative Adversarial Network |
Zhao et al. present the EBGAN model, which views the discriminator as an energy function that assigns low energies to regions near the data manifold and higher energies everywhere else.
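The resulting objectives are a pair of simple hinge-style losses. A minimal sketch, using toy scalar energy values in place of the paper's autoencoder-based discriminator (the margin `m` is a hyperparameter from the paper):

```python
# Hedged sketch of the EBGAN objectives: the discriminator pushes energy
# down on real samples and up on fakes, but only within a margin m.
def d_loss(energy_real, energy_fake, margin):
    # hinge term is zero once the fake's energy already exceeds the margin
    return energy_real + max(0.0, margin - energy_fake)

def g_loss(energy_fake):
    # the generator tries to produce samples that receive low energy
    return energy_fake

print(d_loss(0.2, 1.5, 1.0))  # fake already past the margin -> 0.2
print(d_loss(0.2, 0.4, 1.0))  # fake inside the margin is penalized
print(g_loss(1.5))
```

In the paper the energy is the reconstruction error of an autoencoder, which is why any scalar-valued "energy" works in this sketch.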
| Equilibrium Propagation: Bridging the Gap Between Energy-Based Models and Backpropagation |
Equilibrium Propagation is a learning algorithm for energy-based models. The learning process is quite fascinating and suggests intriguing parallels with learning in the human brain.
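The algorithm's two-phase learning rule can be sketched on a toy quadratic energy: relax to a free fixed point, relax again with the output weakly clamped toward a target, then update the weights by contrasting the two fixed points. A minimal sketch, assuming a Hopfield-style energy and a squared-error cost (all sizes and constants here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Hopfield-style energy E(s) = 0.5*||s||^2 - 0.5*s^T W s - b^T s
# (hypothetical tiny setup, not the paper's experimental architecture)
n = 4
W = rng.normal(scale=0.1, size=(n, n))
W = (W + W.T) / 2.0           # symmetric weights, as the energy requires
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.1, size=n)
target = np.array([1.0, 0.0, 0.0, 0.0])  # desired output state

def relax(s, beta=0.0, steps=200, lr=0.1):
    # Gradient descent on the total energy E(s) + beta * C(s),
    # where C(s) = 0.5*||s - target||^2 weakly clamps the state.
    for _ in range(steps):
        grad_E = s - W @ s - b
        s = s - lr * (grad_E + beta * (s - target))
    return s

beta, eta = 0.01, 0.5
s_free = relax(np.zeros(n))            # first phase: free fixed point
s_clamped = relax(s_free, beta=beta)   # second phase: weakly clamped fixed point

# Contrastive update -(1/beta) * (dE/dW at clamped - dE/dW at free);
# for this energy dE/dW_ij = -s_i * s_j, giving:
dW = (np.outer(s_clamped, s_clamped) - np.outer(s_free, s_free)) / beta
W = W + eta * dW
```

The biological appeal is that both phases run the same local dynamics; only the small nudge `beta` toward the target differs between them.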