| October 30 · Issue #63 |
Happy hacking and reading. As always, if you enjoy receiving this newsletter, you can help us by sharing it with friends and colleagues.
See you later this week!
| Colaboratory – Google Docs for Jupyter Notebooks |
Google’s Colaboratory is a research project created to help disseminate machine learning education and research. It’s a Jupyter notebook environment that requires no setup to use.
| Big Data Meets Big Brother as China Moves to Rate its Citizens |
The Chinese government plans to launch its Social Credit System in 2020. The aim? To judge the trustworthiness – or otherwise – of its 1.3 billion residents.
| Hey Siri: An On-device DNN-powered Voice Trigger for Apple’s Personal Assistant |
The “Hey Siri” detector uses a Deep Neural Network (DNN) to convert the acoustic pattern of your voice at each instant into a probability distribution over speech sounds. It then uses a temporal integration process to compute a confidence score that the phrase you uttered was “Hey Siri”. If the score is high enough, Siri wakes up. This article takes a look at the underlying technology.
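The temporal-integration step can be sketched in a few lines. This is a hypothetical illustration, not Apple's implementation: assume the DNN emits, for each short audio frame, a posterior probability that the frame belongs to the trigger phrase, and a sliding-window average turns those per-frame scores into a single confidence that is compared against a threshold.

```python
import numpy as np

def wake_confidence(frame_posteriors, window=20):
    """Moving-average confidence over every `window`-frame span;
    the best window anywhere in the utterance is the final score."""
    scores = np.convolve(frame_posteriors, np.ones(window) / window, mode="valid")
    return scores.max()

THRESHOLD = 0.85  # illustrative value, not Apple's

# Per-frame posteriors for a (made-up) utterance: a burst of high
# probabilities while the phrase is spoken, low elsewhere.
posteriors = np.array([0.1, 0.2, 0.9, 0.95, 0.97, 0.9, 0.92, 0.96,
                       0.94, 0.9, 0.88, 0.93, 0.9, 0.91, 0.95, 0.9,
                       0.89, 0.92, 0.9, 0.94, 0.3, 0.1])
if wake_confidence(posteriors) > THRESHOLD:
    print("wake")
```

The averaging is what makes the detector robust: a single noisy frame cannot wake the device, but a sustained run of high per-frame scores can.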
| A Decade after DARPA: Our View on the State of the Art in Self-Driving Cars |
Argo AI CEO Bryan Salesky surveys advances in self-driving car technology over the past decade and looks at the challenges lying ahead, concluding:
Those who think fully self-driving vehicles will be ubiquitous on city streets months from now or even in a few years are not well connected to the state of the art or committed to the safe deployment of the technology.
| Ever Wonder What A Rugged, Self-Contained AI Camera System Would Look Like? Here It Is. (sponsored) |
Most AI camera systems require a dry, dust-free operating environment and a high speed local network. A new device called DNNCam™ is different. With on-board processing and data storage, DNNCam™ can function as a stand-alone AI system. With a dustproof and waterproof case, DNNCam™ operates in nearly any environment.
| A Visual Guide to Evolution Strategies |
An in-depth post explaining evolution strategies with the help of visual examples. It also includes extensive references to the literature for anyone wanting to take a deeper dive.
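The simplest strategy covered in the post can be sketched in a few lines (a toy illustration, with the objective, population size, and elite count chosen here for demonstration): sample a population of candidates around the current mean, rank them by fitness, and move the mean toward the best samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    """Toy objective: maximize -(x^2 + y^2), optimum at the origin."""
    return -np.sum(x ** 2)

mean = np.array([5.0, -3.0])   # starting guess, far from the optimum
sigma, pop_size, elite = 0.5, 50, 10

for _ in range(200):
    # Sample a population around the current mean.
    samples = mean + sigma * rng.standard_normal((pop_size, 2))
    # Rank by fitness, best first.
    ranked = samples[np.argsort([fitness(s) for s in samples])[::-1]]
    # New mean = average of the elite samples.
    mean = ranked[:elite].mean(axis=0)

print(mean)  # should end up close to (0, 0)
```

Variants in the post (CMA-ES, natural evolution strategies, OpenAI's ES) refine this same loop, mainly by also adapting the sampling distribution instead of keeping sigma fixed.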
| A bit-by-bit Guide to the Equations Governing Differentiable Neural Computers |
The concept of a differentiable neural computer (DNC) was first introduced back in 2016 in the DeepMind paper “Hybrid computing using a neural network with dynamic external memory”. It holds the potential to apply neural networks to a number of algorithmic tasks that have hitherto been inaccessible. This great post delves into the nitty-gritty mathematical foundations of the DNC and explains the architecture of the memory-augmented model in detail. Highly recommended.
| The Neural Net Tank Urban Legend |
AI folklore tells a story about a neural network trained to detect tanks that instead learned to detect the time of day; on investigation, this probably never happened.
| Deep Learning Book Club |
A playlist of videos accompanying the chapters of The Deep Learning Book. Each session is led by an expert such as one of the book’s authors, Ian Goodfellow.
| How to Unit Test Machine Learning Code |
Sorely needed advice from a former Google Brain resident on how to efficiently test deep neural networks and other machine learning systems.
| StackGAN-Pytorch |
A PyTorch implementation of StackGAN, which generates photo-realistic images from text descriptions.
| Dynamic Routing Between Capsules |
Geoffrey Hinton’s highly anticipated paper on capsules. A capsule is a group of neurons whose activity vector represents the instantiation parameters of a specific type of entity such as an object or object part. The length of the activity vector is used to represent the probability that the entity exists and its orientation to represent the instantiation parameters. Active capsules at one level make predictions, via transformation matrices, for the instantiation parameters of higher-level capsules. When multiple predictions agree, a higher-level capsule becomes active. The authors show that a discriminatively trained, multi-layer capsule system achieves state-of-the-art performance on MNIST and is considerably better than a convolutional net at recognizing highly overlapping digits.
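The “length of the activity vector as a probability” idea rests on the squashing non-linearity from the paper, which shrinks a capsule’s raw output vector so its length lies in (0, 1) while preserving its orientation. A minimal NumPy sketch:

```python
import numpy as np

def squash(s, eps=1e-9):
    """Squash a capsule's raw output vector s: the returned vector keeps
    s's direction, but its length becomes ||s||^2 / (1 + ||s||^2),
    which is always in (0, 1) and can act as an existence probability."""
    norm_sq = np.sum(s ** 2)
    return (norm_sq / (1.0 + norm_sq)) * s / (np.sqrt(norm_sq) + eps)

v = squash(np.array([3.0, 4.0]))  # ||s|| = 5
print(np.linalg.norm(v))          # 25/26 ≈ 0.96: entity very likely present
```

Long input vectors are squashed to a length just under 1, short ones to nearly 0, so the downstream routing-by-agreement can treat vector length as “how sure is this capsule that its entity exists”.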
| A generative vision model that trains with high data efficiency and breaks text-based CAPTCHAs |
By drawing inspiration from systems neuroscience, the authors introduce a probabilistic generative model for vision in which message-passing based inference handles recognition, segmentation and reasoning in a unified way.
| Deep Reinforcement Learning: Framework, Applications, and Embedded Implementations |
This paper first presents a general DRL framework, which can be widely utilized in many applications with different optimization objectives, and then focuses on three applications: the cloud computing resource allocation problem, the residential smart grid task scheduling problem, and the building HVAC system optimal control problem.
| Progressive Growing of GANs for Improved Quality, Stability, and Variation |
Nvidia has achieved impressive results by training GANs using a new method. The key idea is to grow both the generator and discriminator progressively: starting from a low resolution, they add new layers that model increasingly fine details as training progresses.
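When a new higher-resolution layer is added, the paper fades it in smoothly: the new layer’s output is blended with a plain upsampling of the previous resolution, with a blend weight alpha ramping from 0 to 1 over training. A minimal sketch of that blending (illustrative NumPy, not Nvidia’s code, with nearest-neighbour upsampling assumed):

```python
import numpy as np

def nearest_upsample(img):
    """Double each spatial dimension by repeating pixels (nearest-neighbour)."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def faded_output(low_res, new_layer_out, alpha):
    """alpha=0: purely the upsampled old pathway;
    alpha=1: purely the newly grown high-resolution layer."""
    return alpha * new_layer_out + (1 - alpha) * nearest_upsample(low_res)

low = np.ones((4, 4))    # stand-in for the old low-resolution output
new = np.zeros((8, 8))   # stand-in for the new layer's output
out = faded_output(low, new, 0.25)
print(out.shape, out.mean())  # (8, 8) 0.75: still mostly the old pathway
```

The gradual ramp means the freshly initialised layer never delivers a sudden shock to the already well-trained smaller-resolution network.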
| Efficient Processing of Deep Neural Networks: A Tutorial and Survey |
This article aims to provide a comprehensive tutorial and survey about the recent advances towards the goal of enabling efficient processing of DNNs. Specifically, it will provide an overview of DNNs, discuss various hardware platforms and architectures that support DNNs, and highlight key trends in reducing the computation cost of DNNs.