December 18, 2019

214 words 2 mins read

How to Build Deep Learning Inference Through Knative Serverless Framework



Talk Title How to Build Deep Learning Inference Through Knative Serverless Framework
Speakers Huamin Chen (Principal Software Engineer, Red Hat), Yehuda Sadeh-Weinraub (Senior Principal Software Engineer, Red Hat)
Conference KubeCon + CloudNativeCon North America
Conf Tag
Location Seattle, WA, USA
Date Dec 9-14, 2018
URL Talk Page
Slides Talk Slides
Video

Knative, a Kubernetes-native serverless framework, was recently made public. Although Knative is powerful and flexible, little is known about how to integrate, use, and extend it. In this talk, we present a deep learning inference demo that walks through Knative end to end. Specifically, we explain the following concepts and processes:

- How to extend Knative Eventing by introducing a new Knative Eventing source: Ceph RADOS Gateway (RGW) PubSub.
- Knative Serving concepts, such as route and configuration. We illustrate these concepts with a Knative route that can be invoked by an RGW PubSub event to trigger a deep learning inference task (a minimal manifest sketch follows this list).
- Knative Build concepts, such as build and buildtemplate. We substantiate these concepts with a real-world source-to-image pipeline that builds the deep learning task.
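As a rough illustration of how the Serving and Eventing pieces fit together, the sketch below shows a Knative Service running an inference container and a generic ContainerSource delivering events to it. This is not the talk's actual demo: the image names, the service name, the environment variables, and the use of a ContainerSource in place of the dedicated RGW PubSub source are assumptions for illustration, and the API versions shown are the current ones rather than the v1alpha1 APIs available at the time of the talk.

```yaml
# Hypothetical sketch: a Knative Service that serves a deep learning model.
# Image name and environment values are placeholders, not from the talk.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: image-classifier                      # assumed name
spec:
  template:
    spec:
      containers:
        - image: docker.io/example/dl-inference:latest   # placeholder image
          env:
            - name: MODEL_PATH
              value: /models/resnet50                     # placeholder model location
---
# Hypothetical sketch: a generic ContainerSource standing in for the talk's
# Ceph RGW PubSub source. It forwards bucket-notification events to the
# Service above, so each event triggers an inference run.
apiVersion: sources.knative.dev/v1
kind: ContainerSource
metadata:
  name: rgw-pubsub-source                     # assumed name
spec:
  template:
    spec:
      containers:
        - image: docker.io/example/rgw-pubsub-adapter:latest   # placeholder adapter image
  sink:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: image-classifier
```

Applying both manifests with `kubectl apply -f` would yield an event-driven inference path similar in shape to the demo, though the talk's actual eventing source and its build pipeline differ in detail.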
