Tensor abuse in the workplace
Ted Dunning offers an overview of tensor computing, covering, in practical terms, the high-level principles behind tensor computing systems, and explains how it can be put to good use in a variety of settings beyond training deep neural networks (the most common use case).
| | |
|---|---|
| Talk Title | Tensor abuse in the workplace |
| Speakers | Ted Dunning (MapR, now part of HPE) |
| Conference | Strata + Hadoop World |
| Conf Tag | Big Data Expo |
| Location | San Jose, California |
| Date | March 14-16, 2017 |
| URL | Talk Page |
| Slides | Talk Slides |
| Video | |
Tensors are the latest fad in machine learning, but there is real content beyond the buzzword. Tensors are the basic data type for modern numerical systems in much the same way that matrices were fundamental before. Tensors provide a consistent shorthand for describing a variety of computations in a way that is highly suitable for computation on GPUs, but they also provide a useful formalism for high-performance computation on ordinary processors. The reason this works so well is that tensor operations not only allow the inner loop to be specified using numerical primitives but often also permit the enclosing two or three loops to be specified at the same time, enabling distributed computation with much less communication and thus much higher throughput.

Ted Dunning demystifies modern tensor-based computation systems by showing how they really just implement incredibly simple operations and allow us to express these operations very concisely. While tensor-based systems are often used for developing deep neural networks, Ted shows how they can be used for a number of other computations as well, sometimes in surprising ways, offering examples using TensorFlow that illustrate this simplicity and sophistication.
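The talk's examples use TensorFlow; as a minimal stand-in, the same "specify all the loops at once" idea can be sketched with NumPy's `einsum`, where a single expression names the batch loop and the three matrix-multiply loops that would otherwise be written explicitly (all shapes and names here are illustrative, not from the talk):

```python
import numpy as np

# A batch of four small matrix products. Written by hand this is
# four nested loops (batch n, rows i, cols j, inner index k); as a
# tensor operation it is one expression the runtime can vectorize,
# parallelize, or distribute.
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 3, 5))   # batch of four 3x5 matrices
b = rng.standard_normal((4, 5, 2))   # batch of four 5x2 matrices

# 'bik,bkj->bij' specifies all four loops at once:
# b = batch, i = output row, j = output column, k = summed index.
c = np.einsum('bik,bkj->bij', a, b)

# Reference computation: the explicit nested-loop version.
ref = np.zeros((4, 3, 2))
for n in range(4):
    for i in range(3):
        for j in range(2):
            for k in range(5):
                ref[n, i, j] += a[n, i, k] * b[n, k, j]

assert np.allclose(c, ref)
```

Because the whole loop nest is visible in the one expression, the system is free to choose how to block, reorder, or ship the work across devices, which is the communication-saving point made above.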