Deploy Spark ML TensorFlow AI models from notebooks to hybrid clouds (including GPUs)
Chris Fregly explores an often-overlooked area of machine learning and artificial intelligence—the real-time, end-user-facing “serving” layer in hybrid-cloud and on-premises deployment environments—and shares a production-ready environment to serve your notebook-based Spark ML and TensorFlow AI models with highly scalable and highly available robustness.
| Talk Title | Deploy Spark ML TensorFlow AI models from notebooks to hybrid clouds (including GPUs) |
| Speakers | Chris Fregly (Amazon Web Services) |
| Conference | Strata Data Conference |
| Conf Tag | Making Data Work |
| Location | London, United Kingdom |
| Date | May 23-25, 2017 |
| URL | Talk Page |
| Slides | Talk Slides |
| Video | |
Chris Fregly explores an often-overlooked area of machine learning and artificial intelligence—the real-time, end-user-facing “serving” layer in hybrid-cloud and on-premises deployment environments. Serving models to end users in real time in a highly scalable, fault-tolerant manner requires an understanding of not only machine learning fundamentals but also distributed systems and scalable microservices. Drawing on his time at both Databricks and Netflix, Chris shares a 100% open source, real-world, hybrid-cloud, on-premises, and NetflixOSS-based production-ready environment to serve your notebook-based Spark ML and TensorFlow AI models with highly scalable and highly available robustness.
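The "serving layer" described above sits between trained models and end users, exposing predictions over a network endpoint. As a rough illustration only (not the talk's actual implementation), here is a minimal stdlib-only Python sketch of such a real-time prediction microservice; the `predict` function, its weights, and the request schema are all hypothetical stand-ins for a real exported Spark ML or TensorFlow model:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    """Stand-in for a real Spark ML / TensorFlow model.

    A production serving layer would load an exported model
    artifact here instead of hard-coding weights."""
    # Hypothetical linear model: weighted sum of the input features.
    weights = [0.5, -0.25, 1.0]
    return sum(w * x for w, x in zip(weights, features))

class PredictionHandler(BaseHTTPRequestHandler):
    """Accepts POST {"features": [...]} and returns {"prediction": ...}."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # In a highly available deployment, many replicas of this process
    # would run behind a load balancer across cloud and on-premises hosts.
    HTTPServer(("0.0.0.0", 8080), PredictionHandler).serve_forever()
```

The scalability and fault tolerance the abstract emphasizes come not from the endpoint itself but from running many such stateless replicas behind a load balancer, which is the pattern NetflixOSS tooling supports.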