| September 25 · Issue #97 |
As always, we hope you’ll enjoy reading this issue as much as we did, and we would appreciate you sharing this newsletter with friends and colleagues.
See you next week!
| Why building your own Deep Learning computer is 10x cheaper than AWS |
Here, Jeff Chen explores the cost of a self-built deep learning rig and compares it to Amazon’s AWS offerings. He gives detailed numbers on when each solution works best and what you should consider when building your own machine.
| From humanitarian programs to new business tools: AI news from Microsoft’s Ignite conference |
At their 2018 Ignite Conference, Microsoft announced the launch of a $40 million, five-year program called AI for Humanitarian Action and updates to Cortana and their machine learning services, as well as a new milestone in text-to-speech synthesis.
| Apple hopes you'll figure out what to do with AI on the iPhone XS |
You probably couldn’t miss the new iPhone model announcements, but the fact that Apple’s latest chip packs an improved version of their “neural engine” was easily overlooked. And for the first time, Core ML, the iOS machine learning framework, can directly benefit from it. This seems to yield great performance, and we’re excited for the upcoming apps.
| Artificial Intelligence Can Reinforce Bias, Cloud Giants Announce Tools For AI Fairness |
As the challenge of building and training models or intelligent systems that do not inherit unwanted bias from their training data becomes more critical, all major cloud providers have started working on tools to tackle the issue.
| How to make a racist AI without really trying |
Very interesting tutorial describing how to build a simple sentiment analyser, which eventually turns out to be a rather racist system. The article shows how to fix that and why it’s important to keep an eye on bias in your model.
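The core idea in the tutorial is that a sentiment classifier trained on top of pretrained word embeddings silently absorbs the associations baked into those embeddings. A minimal sketch of that pipeline, using tiny hand-made vectors as stand-ins for real pretrained embeddings (the word list, vectors, and helper function here are illustrative, not the article’s actual code):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy 2-d "word vectors"; a real pipeline would load pretrained
# embeddings such as GloVe, which is where the bias creeps in.
vectors = {
    "excellent": np.array([0.9, 0.1]),
    "wonderful": np.array([0.8, 0.2]),
    "terrible":  np.array([0.1, 0.9]),
    "awful":     np.array([0.2, 0.8]),
}
# Sentiment lexicon: 1 = positive, 0 = negative.
lexicon = {"excellent": 1, "wonderful": 1, "terrible": 0, "awful": 0}

# Train a classifier from embedding space to sentiment.
X = np.array([vectors[w] for w in lexicon])
y = np.array(list(lexicon.values()))
clf = LogisticRegression().fit(X, y)

def sentence_sentiment(sentence):
    """Score a sentence as the mean positive-class probability
    over the words we have vectors for."""
    words = [w for w in sentence.lower().split() if w in vectors]
    probs = clf.predict_proba([vectors[w] for w in words])[:, 1]
    return float(probs.mean())
```

Because every word with a vector gets a score, neutral words (names, nationalities) inherit whatever associations they have in embedding space, which is exactly the failure mode the article demonstrates and then fixes.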
| An Intuitive Guide to Optimal Transport, Part I: Formulating the Problem |
Optimal transport theory became popular with Wasserstein GANs and is extremely interesting but hard to grasp, especially if math is not your favourite hobby. This series explains all you need to know in two parts and should help you get started.
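As a concrete entry point to the problem the series formulates: when both distributions are n equal-mass points, the discrete Kantorovich problem reduces to an assignment problem, which scipy can solve exactly. A small sketch (the point sets here are made up for illustration):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

src = np.array([[0.0], [1.0], [2.0]])   # source support points, mass 1/3 each
dst = np.array([[0.5], [1.5], [2.5]])   # target support points, mass 1/3 each

# Cost matrix: squared Euclidean distance between every source/target pair.
cost = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)

# Optimal one-to-one transport plan and its average cost under uniform mass.
rows, cols = linear_sum_assignment(cost)
transport_cost = cost[rows, cols].mean()
```

Here each point optimally shifts by 0.5, so the transport cost is 0.25; the general (continuous, unequal-mass) formulation in the article relaxes this one-to-one matching into a coupling between the two distributions.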
| Machine Learning: The Opportunity and the Opportunists |
Very interesting talk on the current state of the art of machine learning across academia and industry, and especially on how the new “AIs” are handled in the media and what possibilities this opens up for everyone involved.
| Microsoft’s machine learning tools for developers get smarter |
Some more details on the tools and services released by Microsoft. A tool for automated selection, testing, and tweaking of models sounds especially interesting.
| jmschrei/apricot |
A handy module for handling large data sets. It implements submodular selection to pick subsets of massive data sets on which machine learning models can be trained quickly.
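To give a feel for the idea (this is a from-scratch sketch of greedy facility-location selection, one of the submodular objectives apricot implements, not apricot’s actual API): repeatedly add the point that most improves how well the chosen subset “covers” the full data set.

```python
import numpy as np

def facility_location_select(X, k):
    """Greedily choose k rows of X that best cover all rows,
    using negative squared distance as the similarity."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    sim = -d2                              # higher = closer
    best = np.full(len(X), -np.inf)        # best similarity to any chosen point
    chosen = []
    for _ in range(k):
        # Gain of candidate j: total improvement in coverage if j is added.
        gains = np.maximum(sim, best[:, None]).sum(axis=0) - best.sum()
        gains[chosen] = -np.inf            # never re-pick a chosen point
        j = int(np.argmax(gains))
        chosen.append(j)
        best = np.maximum(best, sim[:, j])
    return chosen
```

Because the objective is submodular, this simple greedy loop comes with the classic (1 − 1/e) approximation guarantee, which is what makes the approach attractive for subsampling huge training sets.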
| shayneobrien/generative-models |
Impressive repository of annotated, understandable, and visually interpretable PyTorch implementations of: VAE, BIRVAE, NSGAN, MMGAN, WGAN, WGANGP, LSGAN, DRAGAN, BEGAN, RaGAN, InfoGAN, fGAN, FisherGAN.
| Automated Game Design via Conceptual Expansion |
This paper relies on machine learning to learn approximate representations of games. The approach recombines knowledge from these learned representations to create new games via conceptual expansion. The authors evaluate the approach by demonstrating the system’s ability to recreate existing games.
| Identifying Generalization Properties in Neural Networks |
Here, the authors connect a model’s generalization capability with the local properties of a solution under the PAC-Bayes paradigm. In particular, they prove that generalization ability is related to the Hessian, the Lipschitz constant of the Hessian, and the scales of the parameters.