February 20, 2020


Containerized architectures for deep learning


Talk Title: Containerized architectures for deep learning
Speakers: Antje Barth (AWS)
Conference: O’Reilly Artificial Intelligence Conference
Conf Tag: Put AI to Work
Location: London, United Kingdom
Date: October 15-17, 2019
URL: Talk Page
Slides: Talk Slides
Video:

Container and cloud native technologies around Kubernetes have become the de facto standard in modern ML and AI application development. And while many data scientists and engineers tend to focus on tools, the platform that enables those tools is equally important and often overlooked. Antje Barth examines common architecture blueprints and popular technologies used to integrate AI into existing infrastructures and explains how you can build a production-ready containerized platform for deep learning. In particular, she explores Docker and Kubernetes, along with their associated cloud native technologies, and their use and advantages in ML/AI environments. You’ll see a demo of a microservices-based AI application and walk through a typical ML workflow—data ingestion, preprocessing, continuous model training, evaluation, model deployment, and serving—to see the technologies in action.
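The workflow stages named above can be sketched as a minimal pipeline. This is purely illustrative: the function names and the toy threshold "model" are assumptions, not from the talk, and in a containerized platform each stage would typically run as its own service or job rather than as in-process functions.

```python
# Hypothetical sketch of the stages the talk walks through:
# ingestion -> preprocessing -> training -> evaluation -> serving.
# The data and the toy model are illustrative only.

def ingest():
    # Stand-in for reading from a data source or message queue.
    return [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]

def preprocess(rows):
    # Split feature values and labels.
    xs = [x for x, _ in rows]
    ys = [y for _, y in rows]
    return xs, ys

def train(xs, ys):
    # Toy "model": a threshold halfway between class centroids.
    pos = [x for x, y in zip(xs, ys) if y == 1]
    neg = [x for x, y in zip(xs, ys) if y == 0]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def evaluate(threshold, xs, ys):
    # Accuracy on held-out (here: training) data.
    preds = [1 if x >= threshold else 0 for x in xs]
    return sum(p == y for p, y in zip(preds, ys)) / len(ys)

def serve(threshold, x):
    # In production this would sit behind a model-serving endpoint,
    # e.g. a container exposing an HTTP API.
    return 1 if x >= threshold else 0

rows = ingest()
xs, ys = preprocess(rows)
model = train(xs, ys)
accuracy = evaluate(model, xs, ys)
print(accuracy)           # 1.0 on this toy dataset
print(serve(model, 0.8))  # 1
```

On Kubernetes, each of these functions would map to a container image, with the training step often run as a recurring job for continuous retraining.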
