| April 6 · Issue #35 |
As always, we’d appreciate it if you could help us spread the word by sharing this issue with friends and colleagues :)
| New Institute Aims to Make Toronto an ‘Intellectual Centre’ of AI Capability |
Large companies including Google and Air Canada are sponsoring the Vector Institute, which intends to retain and repatriate the AI talent Canada is already producing.
| First In-Depth Look at Google’s TPU Architecture |
Google released an exhaustive comparison of its Tensor Processing Unit’s performance and efficiency against Haswell CPUs and Nvidia Tesla K80 GPUs (see here for the report). The article explores what gives the TPU such a leg up on other hardware for deep learning inference. This is a major step in Google’s march to dominate the AI cloud computing space.
| Classifying White Blood Cells With Deep Learning |
This post demonstrates how to use deep learning techniques to classify white blood cell images. The model classifies white blood cells as polynuclear or mononuclear with 98% accuracy on the reference dataset (includes code & data).
| 29 Amazing Applications of Deep Learning |
A great tour of deep learning applications, including computer vision, robotics, computer-generated art, and even computer hallucinations.
| Lecture Collection | Natural Language Processing with Deep Learning |
The video lectures for Richard Socher’s great course on NLP with deep learning have just been released.
| Failures of Deep Learning |
This insightful talk given at the Simons Institute elaborates on the failures and limitations of deep learning… lest we all get carried away.
| How to Choose a Neural Network |
A handy cheat sheet on choosing the right neural network architecture for the task at hand.
| Jupyter Notebook 5.0 |
This long-awaited release sports some cool new features, such as cell tagging, customizable keyboard shortcuts, copying & pasting cells between notebooks, and a more attractive default style for tables.
| GitHub - Starter code for Evolution Strategies |
The accompanying starter code for OpenAI’s paper “Evolution Strategies as a Scalable Alternative to Reinforcement Learning” (see below).
| Monet to photo (Good Results) |
Enjoy these Monet-to-photo translations generated using CycleGAN.
| Frames Dataset by Maluuba |
This new dataset from Maluuba Inc adds conversational memory to support complex goal-oriented tasks. It is well suited to natural language understanding, machine reading comprehension, goal-oriented dialogue systems, conversational interfaces, and reinforcement learning.
| Evolution Strategies as a Scalable Alternative to Reinforcement Learning |
This paper by OpenAI has been making waves recently because it presents a method for training models competitively and efficiently on CPUs, with no backpropagation required.
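The core idea is simple enough to fit in a few lines: perturb the parameters with random noise, score each perturbation, and nudge the parameters toward the perturbations that scored well. Here is a minimal toy sketch of that idea (our own illustration on a simple quadratic objective, not OpenAI’s code — their version adds reward ranking, antithetic sampling, and parallelization across workers):

```python
import numpy as np

def evolution_strategies(objective, theta, npop=50, sigma=0.1,
                         alpha=0.05, iterations=300, seed=0):
    """Maximize `objective` without backpropagation: estimate the
    gradient from randomly perturbed copies of the parameters."""
    rng = np.random.default_rng(seed)
    theta = np.array(theta, dtype=float)
    for _ in range(iterations):
        # Sample npop random perturbation directions.
        noise = rng.standard_normal((npop, theta.size))
        # Score each perturbed parameter vector.
        rewards = np.array([objective(theta + sigma * n) for n in noise])
        rewards -= rewards.mean()  # center rewards to reduce variance
        # Weighted sum of perturbations approximates the gradient.
        theta += alpha / (npop * sigma) * noise.T @ rewards
    return theta

# Toy objective: reward is highest at theta = [3, -1].
target = np.array([3.0, -1.0])
solution = evolution_strategies(lambda w: -np.sum((w - target) ** 2),
                                np.zeros(2))
```

Because each candidate is evaluated independently, the inner loop parallelizes trivially across CPU cores, which is what makes the approach a practical alternative to backpropagation-based training.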
| English Conversational Telephone Speech Recognition by Humans and Machines |
This important paper set out to verify the popular claim that we have reached human-level performance in speech recognition. The authors performed an independent set of human performance measurements on two conversational tasks and found that human performance may be considerably better than what was earlier reported, giving the community a significantly harder goal to achieve.