Securing Apache Kafka
With Apache Kafka 0.9, the community has introduced a number of features to make data streams secure. Jun Rao explains the motivation for making these changes, discusses the design of Kafka security, and demonstrates how to secure a Kafka cluster. Jun also covers common pitfalls in securing Kafka and talks about ongoing security work.
| Talk Title | Securing Apache Kafka |
|------------|-----------------------|
| Speakers   | Jun Rao               |
| Conference | Strata + Hadoop World |
| Conf Tag   | Make Data Work        |
| Location   | New York, New York    |
| Date       | September 27-29, 2016 |
| URL        | Talk Page             |
| Slides     | Talk Slides           |
| Video      |                       |
Kafka, developed at LinkedIn in 2010, was originally an open system to encourage adoption; developers could easily create new data streams, add data to the pipeline, and read data as it was created. It succeeded brilliantly at encouraging developers to build new data applications, improved the reliability of systems and applications, and helped LinkedIn scale its logging and data infrastructure. Unfortunately, as Kafka usage grew at LinkedIn (and at other sites), the problems with a totally open system became apparent. Developers might inadvertently cause production problems when creating new Kafka streams, engineers might change the configuration of critical systems, and employees might get access to sensitive data. As Kafka has been adopted by larger enterprises with more complex security requirements, the Kafka community has had to rethink its architecture.

With Apache Kafka 0.9, the community has introduced a number of features to make data streams secure. Jun Rao explains the motivation for making these changes and the threats that Kafka security mitigates, discusses the design of Kafka security, and demonstrates how to secure a Kafka cluster. Jun also covers common pitfalls in securing Kafka and talks about ongoing security work. Topics include:
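The talk's full topic list is not reproduced here. As a rough illustration of what the Kafka 0.9 security features enable on the client side, the sketch below configures a Java producer to connect to a TLS-enabled broker; the broker address, keystore and truststore paths, passwords, and topic name are placeholder assumptions, not details taken from the talk.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SecureProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker endpoint listening on an SSL port.
        props.put("bootstrap.servers", "broker1.example.com:9093");

        // Encrypt traffic and authenticate the client with TLS (introduced in Kafka 0.9).
        props.put("security.protocol", "SSL");
        props.put("ssl.truststore.location", "/etc/kafka/client.truststore.jks"); // placeholder path
        props.put("ssl.truststore.password", "truststore-secret");                // placeholder secret
        props.put("ssl.keystore.location", "/etc/kafka/client.keystore.jks");     // placeholder path
        props.put("ssl.keystore.password", "keystore-secret");                    // placeholder secret
        props.put("ssl.key.password", "key-secret");                              // placeholder secret

        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Produce one record over the encrypted, authenticated connection.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("secured-topic", "key", "value"));
        }
    }
}
```

The same cluster would typically also enable authorization on the brokers so that ACLs restrict which principals can read or write each topic; the details of that setup are covered in the talk itself.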