Deep Learning Weekly | Issue #67: AI Progress, Computer Vision, Augmenting humans, Population Based Training, Superintelligence, Optimization

Bringing you everything new and exciting in the world of deep learning, from academia to the grubby depths of industry, every week, right to your inbox. Free.

December 7 · Issue #67
Hey and welcome to another week in deep learning!
This week, we take a look at a project that tracks the current progress in AI, read why François Chollet thinks a sudden superintelligence is impossible, get to know the AI Index, and look back at the optimization highlights of 2017.
We move on to a new way of finding good hyperparameter configurations, take an extensive look at the field of computer vision, learn about the use of AI to augment human intelligence, and get a gentle introduction to distributed TensorFlow.
To top it off, we’ve got a speech recognition dataset and model from Mozilla, a tool for generating animations and great papers about AlphaZero, network distillation and segmentation.
Happy reading and hacking!
If you like receiving this newsletter and would like to support our work, you can do so by sharing this issue with friends and colleagues who might find it interesting. Thanks!

AI Progress Measurement
The Impossibility of Intelligence Explosion
Artificial Intelligence Index Report
Optimization for Deep Learning Highlights in 2017
A Year in Computer Vision
Population based training of neural networks
Sound Classification with TensorFlow
Sequence Modeling with CTC
Using Artificial Intelligence to Augment Human Intelligence
Distributed TensorFlow: A Gentle Introduction
Libraries & Code
DeepSpeech: Mozilla’s Open Source Speech Recognition Model and Voice Dataset
TopoSketch Web App: Generating Animations by Sketching in Conceptual Space
StarGAN: Unified Generative Adversarial Networks for Multi-Domain Image-to-Image Translation
Papers & Publications
Learning to Segment Every Thing
Distilling a Neural Network Into a Soft Decision Tree