October 12, 2019


Hyperparameter Tuning Using Kubeflow
Talk Title: Hyperparameter Tuning Using Kubeflow
Speakers: Johnu George (Technical Lead, Cloud CTO, Cisco Systems); Richard Liu (Senior Software Engineer, Google)
Conference: KubeCon + CloudNativeCon
Location: Shanghai, China
Date: Jun 23-26, 2019
URL: Talk Page
Slides: Talk Slides

In machine learning, hyperparameter tuning refers to the process of finding the optimal constraints for training models. Choosing optimal hyperparameters can drastically improve the performance of a model, but the search space grows exponentially with each new hyperparameter.

A closely related subfield of automated machine learning is neural architecture search (NAS). In recent research, networks generated by NAS algorithms have even outperformed handcrafted neural networks. However, like hyperparameter tuning, the process can be time-consuming and expensive.

We present Katib, a Kubernetes-native automated machine learning platform for hyperparameter tuning and NAS. Part of the Kubeflow platform, Katib offers a rich set of management APIs in the form of custom resources. We will demonstrate how to configure and run an experiment and compare performance in Katib's UI dashboard.
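To make the exponential growth of the search space concrete, here is a minimal toy sketch (not Katib's API; the hyperparameter names and values are invented for illustration). A grid over three hyperparameters with three candidate values each already requires 27 trials, and every added hyperparameter multiplies that count; random search caps the budget instead of enumerating the full grid.

```python
import itertools
import random

# Hypothetical search space: each entry maps a hyperparameter
# name to its candidate values.
space = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64],
    "num_layers": [2, 4, 8],
}

# Exhaustive grid search: the Cartesian product of all candidate
# lists, so 3 * 3 * 3 = 27 trial configurations.
grid = list(itertools.product(*space.values()))
print(len(grid))  # 27

def sample_trial(space, rng):
    """Random search: draw one value per hyperparameter."""
    return {name: rng.choice(values) for name, values in space.items()}

# Sample a fixed budget of 5 trials instead of running all 27.
rng = random.Random(0)
trials = [sample_trial(space, rng) for _ in range(5)]
for trial in trials:
    print(trial)
```

Adding a fourth hyperparameter with three values would grow the grid to 81 configurations, while the random-search budget stays at whatever you set it to; this is the trade-off that platforms like Katib manage at scale.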
