| December 13 · Issue #19 |
NIPS 2016 seems to have been a huge blast. This week’s issue features lots of retrospectives, tutorials, and code from the conference for those of us who couldn’t make it.
In this issue we’ll also see how deep learning improves hearing aids, DeepMind will show us how to implement their learning-to-learn paper, we’ll learn how to train GANs, and much more…
PS: We published a wildly popular open-source deep learning curriculum
this week and have decided to create a Slack group to facilitate exchange and discussion of the materials. You can join us here.
| AMD launches Radeon Instinct GPUs to tackle deep learning, artificial intelligence |
AMD is throwing its own hat into the deep learning and AI markets, with a new lineup of Radeon Instinct GPUs.
| The major advancements in Deep Learning in 2016 |
A comprehensive overview of this year’s developments in deep learning.
| Deep Learning Reinvents the Hearing Aid |
Researchers have finally solved the ‘cocktail party problem’, making it possible for hearing aid wearers to pick out a single voice in a crowded room. The article is quite in-depth and features interesting audio examples of speech before and after filtering by the algorithm.
| The Audi Q2 Deep Learning Concept Car |
Audi showcases a 1:8-scale concept vehicle powered by “deep reinforcement learning”. It will be interesting to see what the Germans will come up with in their race to catch up in the autonomous vehicle space. The Q2, at least, looks promising.
| Machined Learnings: NIPS 2016 Reflections |
Insightful reflections on NIPS 2016. Major themes the author noticed at the conference include the use of simulated environments, and GANs and the “GANification” of everything.
| Experiments in Handwriting with a Neural Network |
Really intriguing post on visualizing a generative handwriting model. It makes a valuable point: often it is not that deep learning models are a black box per se, but that we lack the proper tools to make them fully intelligible.
| Deep Learning Cheat Sheet |
Deep learning can be overwhelming when you’re new to the subject. Here are some cheat sheets and tips to get you through it.
| Richard Socher on the future of Deep Learning |
Richard Socher, chief scientist at Salesforce, gives his perspective on the future of the field, with a focus on NLP and its applications.
| Machine Learning Yearning |
Sign up to receive drafts of Andrew Ng’s new book, Machine Learning Yearning, as he writes it.
| Tutorials from NIPS 2016 |
A NIPS attendee was kind enough to put together a list of links to the slides and materials from the tutorials given at the conference.
| GANhacks or "How to Train a GAN?" |
A collection of useful hacks and tricks for training Generative Adversarial Networks.
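One widely cited trick from collections like this is one-sided label smoothing: instead of training the discriminator against a hard target of 1 for real samples, soften it to something like 0.9 so the discriminator doesn’t become overconfident. A minimal NumPy sketch of a smoothed discriminator loss (the function name and the 0.9 value are illustrative, not from the list itself):

```python
import numpy as np

def discriminator_loss(d_real, d_fake, real_label=0.9):
    """Binary cross-entropy for the discriminator with one-sided
    label smoothing: real targets are softened to `real_label`
    (e.g. 0.9) while fake targets stay at 0."""
    eps = 1e-8  # avoid log(0)
    loss_real = -np.mean(real_label * np.log(d_real + eps)
                         + (1 - real_label) * np.log(1 - d_real + eps))
    loss_fake = -np.mean(np.log(1 - d_fake + eps))
    return loss_real + loss_fake

# Discriminator output probabilities for a small batch of real/fake samples
d_real = np.array([0.8, 0.9, 0.7])
d_fake = np.array([0.2, 0.1, 0.3])
print(discriminator_loss(d_real, d_fake))
```

Smoothing only the real targets (and not the fakes) keeps the generator’s gradient signal intact while regularizing the discriminator.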
| tiny-dnn a tiny Deep Learning Framework in C++ |
tiny-dnn is a header-only, dependency-free deep learning framework in C++, intended for use on IoT devices.
| DeepMind's Learning to Learn in TensorFlow |
DeepMind’s open-source TensorFlow implementation of their “Learning to learn by gradient descent by gradient descent” paper.
| Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning |
When captioning an image, some words require more *attention* to the image than others; ‘the’ can be inferred without much reference to the image, whereas ‘dog’ is quite reliant on what is being depicted. This paper proposes a novel adaptive attention model that decides whether to attend to the image, and where.
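The core idea can be sketched as a learned gate that mixes the usual attended image context with a “visual sentinel” vector, whose attention slot lets the decoder fall back on its language model instead of the image. A rough NumPy sketch under those assumptions (variable names are illustrative, not the paper’s code):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def adaptive_context(image_feats, attn_scores, sentinel, sentinel_score):
    """Mix an attended visual context with a non-visual 'sentinel' vector.

    image_feats:    (k, d) features for k image regions
    attn_scores:    (k,) spatial attention logits over the regions
    sentinel:       (d,) sentinel vector decoded from the LSTM memory
    sentinel_score: scalar logit for *not* looking at the image
    """
    # Joint softmax over the k image regions plus one sentinel slot
    alpha = softmax(np.append(attn_scores, sentinel_score))
    beta = alpha[-1]  # gate: how much to ignore the image this step
    context = alpha[:-1] @ image_feats + beta * sentinel
    return context, beta

# A word like 'the' would push the sentinel logit up (high beta);
# a word like 'dog' would push it down (low beta).
feats = np.ones((5, 8))
ctx, beta = adaptive_context(feats, np.zeros(5), np.zeros(8), 2.0)
```

The single joint softmax is the key design choice: the sentinel competes directly with the image regions for attention mass, so the gate and the spatial attention are normalized together.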
| Predicting Brain Age with Deep Learning from raw Imaging Data |
The authors introduce a predictive modelling approach based on deep learning, specifically convolutional neural networks (CNNs), to predict brain aging from raw imaging data. Deviations from healthy brain aging have, in turn, been associated with cognitive impairment and disease.
| Spatially Adaptive Computation Time for Residual Networks |
A deep learning architecture based on Residual Networks that dynamically adjusts the number of layers executed for different regions of the image, significantly improving computational efficiency.
| Research Scientist, Google Brain |
Great opening at the Google Brain group in Montreal; perks include Hugo Larochelle as a colleague.