Neural Network Distiller: A PyTorch environment for neural network compression
January 11, 2020
Neta Zmora offers an overview of Distiller, an open source Python package for neural network compression research. Neta discusses the motivation for compressing DNNs, outlines the main compression approaches, and explores Distiller's design and tools, supported algorithms, and code and documentation. Neta concludes by walking through an example implementation of a technique from a compression research paper.
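To give a flavor of the compression approaches the talk outlines, here is a minimal plain-Python sketch of magnitude-based weight pruning, one of the classic techniques implemented in libraries like Distiller. The function name `prune_by_magnitude` and this standalone formulation are illustrative assumptions, not Distiller's actual API (Distiller applies pruning to PyTorch tensors via schedulers configured in YAML).

```python
# Illustrative sketch of magnitude-based weight pruning (hypothetical
# helper, not Distiller's API): zero out the fraction `sparsity` of
# weights with the smallest absolute value.

def prune_by_magnitude(weights, sparsity):
    """Return a copy of `weights` with the smallest-|w| fraction zeroed."""
    n_prune = int(len(weights) * sparsity)
    # Rank weight indices by absolute magnitude, smallest first.
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    pruned = list(weights)
    for i in order[:n_prune]:
        pruned[i] = 0.0  # small-magnitude weights contribute least, so drop them
    return pruned

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.02]
print(prune_by_magnitude(weights, 0.5))  # → [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

In practice, libraries such as Distiller apply this idea per layer on tensors, often gradually increasing sparsity over training steps rather than pruning once.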