| August 25 · Issue #54 |
It’s been an exciting week again, so let’s dive right in:
We hope you’ll enjoy reading as much as we did and would appreciate you sharing this newsletter with your friends and colleagues.
See you next week!
| Andrew Ng is raising a $150M AI Fund |
Looks like Andrew Ng is on a roll. Not only did he launch his Coursera course, but he has also created a hefty fund specialized in AI. We’ll see where this goes, but the horizon looks quite bright for AI startups.
| How A.I. Is Creating Building Blocks to Reshape Music and Art |
Telling the story of Douglas Eck’s pursuit of A.I. for creating music and art, this article covers the history, current state, and future of the field, including Google’s Magenta team, DeepDream, and synthetic instruments.
| Deep Learning for Siri’s Voice |
In this article, Apple has shared many details on how they create Siri’s voice and which techniques are used to make the voice sound as natural as possible. They made major changes starting with iOS 10 and have since then achieved great results using a hybrid unit selection text-to-speech system based on mixture density networks, which is described in detail.
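Apple’s post centers on mixture density networks (MDNs), which predict the parameters of a probability distribution rather than a single value. As a rough illustration of the idea (not Apple’s implementation), an MDN is trained by minimizing the negative log-likelihood of the targets under a predicted Gaussian mixture. A minimal NumPy sketch of that loss, with hypothetical parameter names:

```python
import numpy as np

def mdn_nll(y, pis, mus, sigmas):
    """Negative log-likelihood of scalar targets y under a Gaussian
    mixture whose parameters a network would predict per frame.
    pis: (N, K) mixture weights (rows sum to 1)
    mus, sigmas: (N, K) component means and standard deviations."""
    # Per-component Gaussian densities, shape (N, K)
    norm = 1.0 / (np.sqrt(2 * np.pi) * sigmas)
    dens = norm * np.exp(-0.5 * ((y[:, None] - mus) / sigmas) ** 2)
    mix = (pis * dens).sum(axis=1)  # mixture density per sample
    return -np.log(mix).mean()

# Sanity check: one component per target, centered on the target
y = np.array([0.0, 1.0])
nll = mdn_nll(y, pis=np.ones((2, 1)), mus=y[:, None], sigmas=np.ones((2, 1)))
```

With a single unit-variance component centered on each target, the loss reduces to 0.5·log(2π) per sample, which makes the function easy to sanity-check before wiring it into a real network.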
| The New Age of Artificial Intelligence: Giving it Some Heart |
In the last part of his series on empathy in AI, Ben Virdee-Chapman shares his vision on how AI will incorporate empathy to better understand and work with humans, as well as real-world examples where intelligent systems use empathy to achieve better results.
| How to Intentionally Trick Neural Networks |
A great introduction to ‘hacking’ neural networks. Adam Geitgey shows how you can slightly modify a network’s input so that an existing classifier produces a different result. He even covers more advanced scenarios, such as not having access to the classifier you want to fool, and how to protect against such attacks. All this is topped off with a Keras implementation to get your hands dirty right away.
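The core trick is to nudge the input in the direction of the loss gradient’s sign, as in the fast gradient sign method. A toy NumPy sketch on a hypothetical linear softmax classifier (the article’s own Keras version works the same way on a real network):

```python
import numpy as np

def fgsm_perturb(x, w, b, target_class, epsilon=0.1):
    """Move input x by epsilon in the gradient-sign direction that
    raises the score of target_class under a linear softmax model."""
    logits = w @ x + b
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Gradient of the target-class log-probability w.r.t. the input:
    # d log p[t] / dx = w[t] - sum_k p[k] * w[k]
    grad = w[target_class] - probs @ w
    return x + epsilon * np.sign(grad)

# Toy 2-class classifier on 4-dimensional inputs
rng = np.random.default_rng(0)
w = rng.normal(size=(2, 4))
b = np.zeros(2)
x = rng.normal(size=4)

x_adv = fgsm_perturb(x, w, b, target_class=1)
```

Because each step changes the input by at most epsilon per dimension, the perturbed example stays close to the original while the target class’s score climbs.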
| Designing a Deep Learning Project |
This short read summarizes Andrew Ng’s thoughts on designing a deep learning project from his new course. It contains some important tips, and it may just make you enroll in the really well-done Coursera course.
| Contouring learning rate to optimize neural nets |
Who hasn’t gone crazy trying to find the right learning rate? Since we rarely have the resources to simply try everything, these tips and tricks for treating the learning rate as a hyperparameter, and for using visualizations to see what’s really going on, come in really handy.
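One cheap way to visualize the learning rate’s effect is a range test: sweep exponentially spaced rates, train briefly at each, and plot loss against rate. A minimal sketch of generating such a sweep (the bounds here are illustrative, not from the article):

```python
import numpy as np

def lr_sweep(lr_min=1e-5, lr_max=1.0, steps=100):
    """Exponentially spaced learning rates for a quick range test:
    run a few training steps at each and plot loss vs. learning rate."""
    return lr_min * (lr_max / lr_min) ** (np.arange(steps) / (steps - 1))

lrs = lr_sweep()
```

The rate where the plotted loss starts dropping fastest is a reasonable starting point; where it diverges is a hard upper bound.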
| chrisranderson/beholder |
A TensorBoard plugin for visualizing arbitrary tensors in a video as your network trains.
| BenWhetton/keras-surgeon |
Pruning and other network surgery for trained Keras models.
| bethgelab/foolbox |
Python toolbox to create adversarial examples that fool neural networks.
| Predicting the Popularity of Instagram Posts for a Lifestyle Magazine Using Deep Learning |
An interesting paper in which the authors try to predict which posts will become popular. The possible applications are endless, and it’s definitely worth a look.
| Super-Convergence: Very Fast Training of Residual Networks Using Large Learning Rates |
This paper shows a phenomenon where residual networks can be trained using an order of magnitude fewer iterations than used in standard training methods, which the authors named “super-convergence”.
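Super-convergence builds on cyclical learning rates: one long triangular cycle that ramps the rate up to an unusually large peak and back down. A minimal sketch of such a schedule, with illustrative bounds rather than the paper’s exact values:

```python
def one_cycle_lr(step, total_steps, lr_min=0.1, lr_max=1.0):
    """Triangular one-cycle schedule: ramp linearly from lr_min up to
    lr_max at the midpoint, then linearly back down to lr_min."""
    half = total_steps / 2
    if step <= half:
        return lr_min + (lr_max - lr_min) * step / half
    return lr_max - (lr_max - lr_min) * (step - half) / half

# Learning rate at each of 100 training steps
schedule = [one_cycle_lr(s, total_steps=100) for s in range(101)]
```

The large mid-cycle rates act as a regularizer, which is part of why the paper can cut the iteration count so drastically.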