January 28, 2020


AI/ML Deployment at the Edge

Talk Title: AI/ML Deployment at the Edge
Speaker: Andrea Gallo (VP of Membership Development, Linaro)
Conference: Open Source Summit + ELC Europe
Location: Lyon, France
Date: Oct 27 - Nov 1, 2019
URL: Talk Page
Slides: Talk Slides

Arm and Linaro launched the AI initiative one year ago to collaborate on an open source inference engine common to all Arm edge devices and to support SoC-specific NN acceleration via a plug-in back-end framework. The mlplatform.org platform hosts the upstream open source work for both Arm NN and the Arm Compute Library. The team, made up of engineers from Arm, Linaro, Qualcomm, TI, and other members, is deploying Arm NN and the Arm Compute Library on edge devices with support for the most widely used frameworks, such as TensorFlow and ONNX.

Andrea Gallo, Linaro VP of Membership Development, provides an overview of the ongoing activities to support multiple SoCs in Arm NN, set up CI and testing infrastructure, and integrate with runtime frameworks and graph compilation technologies.
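To make the stack described above more concrete, the sketch below shows one way a C++ application might drive Arm NN: parse a model with a framework-specific parser, optimize it for a preferred plug-in back end (for example the Neon-accelerated "CpuAcc" backend built on the Arm Compute Library, with "CpuRef" as a fallback), and run a single inference. This is a minimal illustration only, not part of the talk: the file name model.tflite, the tensor names "input" and "output", and the zero-filled input data are placeholders, and exact headers and signatures vary between Arm NN releases.

```cpp
// Minimal Arm NN inference sketch (assumes the Arm NN SDK and its
// TensorFlow Lite parser are installed; names and sizes are placeholders).
#include <armnn/ArmNN.hpp>
#include <armnnTfLiteParser/ITfLiteParser.hpp>

#include <utility>
#include <vector>

int main()
{
    // Parse a TensorFlow Lite model into an Arm NN network graph.
    auto parser = armnnTfLiteParser::ITfLiteParser::Create();
    armnn::INetworkPtr network =
        parser->CreateNetworkFromBinaryFile("model.tflite");

    // Look up the input/output binding points by tensor name (subgraph 0).
    auto inputBinding  = parser->GetNetworkInputBindingInfo(0, "input");
    auto outputBinding = parser->GetNetworkOutputBindingInfo(0, "output");

    // Create the runtime and optimize the graph for the preferred plug-in
    // back ends: Neon-accelerated CPU first, reference CPU as a fallback.
    armnn::IRuntime::CreationOptions options;
    armnn::IRuntimePtr runtime = armnn::IRuntime::Create(options);
    std::vector<armnn::BackendId> backends = { "CpuAcc", "CpuRef" };
    armnn::IOptimizedNetworkPtr optNet =
        armnn::Optimize(*network, backends, runtime->GetDeviceSpec());

    // Load the optimized network onto the runtime.
    armnn::NetworkId netId;
    runtime->LoadNetwork(netId, std::move(optNet));

    // Bind placeholder input/output buffers and run one inference.
    std::vector<float> inputData(inputBinding.second.GetNumElements(), 0.0f);
    std::vector<float> outputData(outputBinding.second.GetNumElements());

    armnn::InputTensors inputs {
        { inputBinding.first,
          armnn::ConstTensor(inputBinding.second, inputData.data()) } };
    armnn::OutputTensors outputs {
        { outputBinding.first,
          armnn::Tensor(outputBinding.second, outputData.data()) } };

    runtime->EnqueueWorkload(netId, inputs, outputs);
    return 0;
}
```

The back-end preference list is where the plug-in framework mentioned in the talk comes in: a silicon vendor can register its own backend (for an NPU or DSP), and the application selects it simply by naming it ahead of "CpuAcc" and "CpuRef" in that list.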