| December 20 · Issue #20 |
Another week of great articles, publications, code, and frameworks. We picked the best and present the New York Times on AI, predictions for 2017, Andrej Karpathy revisiting backprop, Harvard's advanced machine translation framework, and much more…
We hope you’ll enjoy reading as much as we did and wish you a happy holiday season!
As always, we appreciate you sharing this newsletter with your friends and colleagues, and we would love to see you join our deep learning study group here
| The Great A.I. Awakening - The New York Times |
How Google used artificial intelligence to transform Google Translate, one of its more popular services — and how machine learning is poised to reinvent computing itself. Long but good.
| Predictions for Deep Learning in 2017 |
We have already seen a hugely successful consumer application with Google Translate and there is more to come. More tools in the consumer photo market, music composition, open-source development and more. We are looking forward to everything and will report right here!
| Deep Learning Race: A Survey of Industry Players’ Strategies |
A comparison of the different approaches that the global deep learning players take in their research. Quite interesting to see an overview across the whole machine learning landscape.
| Has Deep Learning Made Traditional Machine Learning Irrelevant? |
A nice comparison of the current state of deep learning and traditional machine learning in research and industry. Sheds some light on competitions, neural nets in production, and how machine learning is actually used today.
| Yes you should understand backprop |
Have you done your homework? Andrej Karpathy explains his motivation for teaching the details of backpropagation in the famous CS231n class. Contains great examples and points to more detailed course materials.
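A central example in the post is how saturating nonlinearities silently kill gradients during backprop. A minimal sketch of that effect for a single sigmoid unit, with toy values of our own (this is not code from the post):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Forward pass through one sigmoid unit: z = sigmoid(w*x + b)
w, b, x = 2.0, 0.0, 10.0
pre = w * x + b          # pre-activation = 20.0, far into saturation
z = sigmoid(pre)         # very close to 1.0

# Backward pass: the local gradient of the sigmoid is z * (1 - z).
# When the unit saturates, this factor collapses toward zero and the
# gradients flowing to w and b vanish -- the "leaky abstraction"
# Karpathy warns about.
dz = 1.0                 # pretend the loss gradient w.r.t. z is 1
dpre = dz * z * (1.0 - z)
dw, db = dpre * x, dpre * 1.0

print(z)   # ~1.0 (saturated)
print(dw)  # tiny: the weight barely gets a learning signal
```

The same reasoning explains "dead" ReLUs and exploding gradients in RNNs, the other failure modes the post walks through.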
| A Visual and Interactive Guide to the Basics of Neural Networks |
Do you prefer some visuals when learning things? J. Alammar has got you covered with this amazing visual and interactive guide that teaches you the basics of neural networks.
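The guide builds everything up from the simplest possible model: a single neuron computing a weighted sum of its inputs plus a bias. Roughly that starting point, with toy numbers of our own rather than the guide's examples:

```python
# One "neuron": dot product of inputs and weights, plus a bias.
def neuron(inputs, weights, bias):
    return sum(i * w for i, w in zip(inputs, weights)) + bias

# 2*0.5 + 3*(-1.0) + 1.0 = -1.0
print(neuron([2.0, 3.0], [0.5, -1.0], 1.0))
```

Stacking layers of these units (plus nonlinearities) is all a basic feed-forward network is, which is exactly the progression the guide visualizes.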
| Building Jarvis |
Mark Zuckerberg describes his experiences while working on his side project, an automated home that relies on deep learning. Nice insights and a fun read.
| What is the Future of Deep Reinforcement Learning (DL + RL)? |
Some insights on where deep reinforcement learning is heading, which obstacles hinder current research and how these obstacles can be removed.
| Uncertainty in Deep Learning |
Yarin Gal has published his PhD thesis, which offers great insights and an introduction to Bayesian deep learning. The full thesis runs 120 pages, but the introduction alone is a nice read.
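A core idea in the thesis is that dropout can be read as approximate Bayesian inference: keep dropout active at prediction time, sample several stochastic forward passes, and treat the spread of the outputs as an uncertainty estimate (Monte Carlo dropout). A miniature sketch of that procedure; the network and all numbers are toy values of our own, not material from the thesis:

```python
import random
random.seed(0)

HIDDEN = [0.4, -1.2, 0.7, 2.0]   # fixed hidden activations for one input
W_OUT  = [0.5, 0.3, -0.8, 1.1]   # fixed output weights
KEEP = 0.5                       # dropout keep probability

def mc_forward():
    # Dropout stays ON at prediction time (the Monte Carlo part).
    out = 0.0
    for h, w in zip(HIDDEN, W_OUT):
        if random.random() < KEEP:
            out += (h / KEEP) * w    # inverted-dropout scaling
    return out

samples = [mc_forward() for _ in range(1000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# mean approximates the deterministic prediction;
# var serves as a rough measure of the model's uncertainty
```

The appeal of the technique is that it needs no change to training: any network already trained with dropout yields uncertainty estimates this way.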
| OpenNMT |
This week Harvard surprised us with a ready-to-use, industrial-strength, open-source machine translation system that is built on the Torch toolkit and looks very promising, either as a finished solution or as a starting point for further development.
| Enny1991/PLSTM |
Phased LSTMs, from Neil et al.'s recent NIPS 2016 paper, implemented in TensorFlow.
| Intel’s Optimized Tools and Frameworks for Machine Learning and Deep Learning | Intel® Software |
Intel continues to invest in machine and deep learning and presents its optimized tools and frameworks.
| Mini World of Bits benchmark |
OpenAI published a benchmark for reinforcement learning agents that consists of tiny websites. Great idea and definitely very useful.
| Tracking the World State with Recurrent Entity Networks |
LeCun et al. present the EntNet, which employs a dynamic long-term memory to maintain and update a model of the world state. The trained model outperforms previous systems on reasoning tasks that require many supporting facts.
| System predicts 85 percent of cyber-attacks using input from human experts |
Researchers from MIT’s CSAIL presented a system that predicts cyber-attacks better than existing systems by incorporating input from human experts.
| Real-time interactive sequence generation and control with Recurrent Neural Network ensembles |
Akten et al. present a method for real-time, gesture-controlled sequence generation backed by ensembles of recurrent neural networks.