| July 7 · Issue #47 |
There are plenty of great reads in this issue, and we hope you enjoy them as much as we did. As always, we appreciate you sharing this newsletter with your friends and colleagues.
See you next week!
| Under the Hood of a Self-Driving Taxi |
Voyage, a self-driving taxi startup from the Bay Area, explains a part of self-driving cars that doesn’t receive as much public attention as the software: compute, power, and the drive-by-wire kit.
| Torc Robotics unveils self-driving system for consumer cars |
Another entrant has joined the field of those offering self-driving tech to consumer car makers – but this one likely has a bit more experience than most…
| Google Stakes Its Future on a Piece of Software |
This article sheds some light on the role of TensorFlow in Google’s future and how it may enable Google to compete with Amazon in the cloud computing space.
| A 2017 Guide to Semantic Segmentation with Deep Learning |
This detailed article explains semantic segmentation in general, covers the most common approaches and reviews some of the most important papers on the topic. A great read to get started in the field!
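As a quick taste of what segmentation models produce (a minimal sketch with made-up logits, not taken from the article): a network outputs per-pixel class scores, and the segmentation map is simply the per-pixel argmax over classes.

```python
import numpy as np

# Hypothetical per-pixel class logits for a tiny 4x4 image with 3 classes.
# In practice these would come from a fully convolutional network.
rng = np.random.default_rng(0)
logits = rng.standard_normal((4, 4, 3))

# Semantic segmentation assigns every pixel the class with the highest score.
segmentation_map = logits.argmax(axis=-1)

print(segmentation_map.shape)  # one class label per pixel
```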
| Interpreting neurons in an LSTM network |
A well-written study on what a single neuron in an LSTM network actually learns.
| What I’ve learned about neural network quantization |
Some great insights on neural network quantization from the lead of the TensorFlow Mobile/Embedded team, Pete Warden.
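To make the idea concrete, here is a minimal sketch of linear 8-bit quantization (a generic scheme for illustration, not necessarily the exact method discussed in the post): float weights are mapped onto 256 evenly spaced levels between their minimum and maximum, shrinking storage 4x at the cost of a small, bounded reconstruction error.

```python
import numpy as np

def quantize_linear(weights, num_bits=8):
    """Map float weights linearly onto num_bits unsigned integers."""
    lo, hi = float(weights.min()), float(weights.max())
    scale = (hi - lo) / (2 ** num_bits - 1)
    q = np.round((weights - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize_linear(q, scale, lo):
    """Reconstruct approximate float weights from the integer codes."""
    return q.astype(np.float32) * scale + lo

w = np.random.default_rng(1).standard_normal(1000).astype(np.float32)
q, scale, lo = quantize_linear(w)
w_hat = dequantize_linear(q, scale, lo)

# Rounding keeps the reconstruction error within half a quantization step.
print(np.abs(w - w_hat).max(), scale / 2)
```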
| How HBO’s Silicon Valley built “Not Hotdog” with mobile TensorFlow, Keras & React Native |
Awesome article on how the makers of Silicon Valley built their famous Not Hotdog app, covering everything from the frameworks and initial prototypes to the hardware, training process, and even remote model deployment. It contains really great insights into model tuning and architecture decisions and should definitely be on your reading list this weekend!
| YellowFin: An automatic tuner for momentum SGD |
Hand-tuned momentum SGD is still competitive with state-of-the-art methods, which motivated a team at Stanford to create this little library. By automatically tuning the SGD hyperparameters, it can train large models in fewer iterations, and it comes with support for PyTorch and TensorFlow.
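For context, the momentum SGD update that YellowFin tunes has exactly two hyperparameters: the learning rate and the momentum coefficient. A minimal sketch of the update rule itself (illustrative only, not YellowFin’s API):

```python
def momentum_sgd_step(w, velocity, grad, lr=0.1, mu=0.9):
    """One momentum SGD update: lr and mu are the two tuned hyperparameters."""
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

# Minimize f(w) = w^2 (gradient 2w) starting from w = 5.0.
w, v = 5.0, 0.0
for _ in range(500):
    w, v = momentum_sgd_step(w, v, 2 * w)

print(w)  # w converges toward the minimum at 0
```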
| Privacy-preserving generative deep neural networks support clinical data sharing |
A new approach to the privacy issues arising from medical research: the authors trained neural networks to generate realistic medical records of fictional subjects that can be used for research.