Bighead: Airbnb's end-to-end machine learning platform
January 26, 2020
Atul Kale and Xiaohan Zeng offer an overview of Bighead, Airbnb's user-friendly and scalable end-to-end machine learning framework that powers Airbnb's data-driven products. Built on Python, Spark, and Kubernetes, Bighead integrates popular libraries like TensorFlow, XGBoost, and PyTorch and is designed to be used in modular pieces.
Neural Network Distiller: A PyTorch environment for neural network compression
January 11, 2020
Neta Zmora offers an overview of Distiller, an open source Python package for neural network compression research. Neta discusses the motivation for compressing DNNs, outlines compression approaches, and explores Distiller's design, tools, supported algorithms, code, and documentation. Neta concludes with an example implementation of a compression research paper.
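One of the compression approaches the talk outlines is pruning. As a minimal sketch of the idea (plain Python, not Distiller's actual API), magnitude-based pruning zeroes out the weights with the smallest absolute values, on the assumption that they contribute least to the network's output:

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with the smallest magnitude.

    Illustrative only: real frameworks like Distiller prune whole tensors
    and support structured variants (channels, filters), not flat lists.
    """
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # The n_prune-th smallest magnitude becomes the pruning threshold.
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02]
pruned = prune_by_magnitude(weights, sparsity=0.5)
# The three smallest-magnitude weights are zeroed:
# [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

In practice pruning is interleaved with fine-tuning so the remaining weights can compensate for the removed ones.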
Distributed training of deep learning models
December 10, 2019
Mathew Salvaris, Miguel Gonzalez-Fierro, and Ilia Karmanov offer a comparison of two platforms for running distributed deep learning training in the cloud, using a ResNet network trained on the ImageNet dataset as an example. You'll examine the performance of each as the number of nodes scales and learn some tips and tricks as well as some pitfalls to watch out for.
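The core mechanic behind this kind of synchronous data-parallel training can be sketched in plain Python (a stand-in for the platforms compared in the talk, which use collectives such as NCCL or MPI under the hood): each worker computes gradients on its own data shard, the gradients are averaged across workers (an all-reduce), and every worker then applies the identical update:

```python
def allreduce_mean(worker_grads):
    """Average per-worker gradient vectors element-wise.

    Stand-in for a real all-reduce collective; every worker would
    receive this same averaged result.
    """
    n = len(worker_grads)
    dim = len(worker_grads[0])
    return [sum(g[i] for g in worker_grads) / n for i in range(dim)]

def sgd_step(params, grads, lr=0.1):
    """One step of plain SGD with the (already averaged) gradients."""
    return [p - lr * g for p, g in zip(params, grads)]

params = [1.0, -2.0]
# Gradients computed independently on two workers' data shards.
worker_grads = [[2.0, 4.0], [6.0, 0.0]]
avg = allreduce_mean(worker_grads)  # [4.0, 2.0]
params = sgd_step(params, avg)      # same update applied on every worker
```

Because the averaged gradient is identical everywhere, all replicas stay in sync; the scaling behavior the speakers examine comes largely from how cheaply this all-reduce can be performed as node count grows.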
Machine learning with PyTorch
November 23, 2019
PyTorch is a recent deep learning framework from Facebook that is gaining massive momentum in the deep learning community. Its fundamentally flexible design makes building and debugging models straightforward and even fun. Delip Rao and Brian McMahan walk you through PyTorch's capabilities and demonstrate how to use PyTorch to build deep learning models and apply them to real-world problems.