March 22, 2020


Using actors for data-driven stream processing


Amanda Kabak explains why there's more to stream processing than serverless workflows. Actors make it possible to create complex calculation meshes that run on cloud resources with cost-effective density.

Talk Title Using actors for data-driven stream processing
Speakers Amanda Kabak (CleanSpark)
Conference O’Reilly Software Architecture Conference
Conf Tag Engineering the Future of Software
Location New York, New York
Date February 24-26, 2020
URL Talk Page
Slides Talk Slides

There are many methods available for processing streaming data, particularly in the IoT space, but every site or project you need to support may require a processing pipeline unique to its configuration. Serverless functions are cool, Amanda Kabak explains, but wiring up sequences of them by hand, or even coding multiple steps into a single function's execution, isn't a sustainable answer to this problem, and processing thousands of incoming data points a minute can lead to large operating costs.

Enter actors. Actors are small units of code that execute in a single-threaded way, each paired with internal state that only that actor can mutate. Within an actor platform, actors provide highly scalable processing power, but their true value in stream processing comes not from any actor individually but from how they can be networked together. Small actors that each perform a simple, testable calculation can be combined into a data-driven calculation mesh that produces high-level derivations, in a thread-safe way, for huge amounts of streaming data.

Imagine, for example, an industrial IoT monitoring solution deployed across sites with different equipment configurations. The only things executives want to know are total downtime in their plant and total cost savings, but those figures are calculated and aggregated differently depending on the type and number of equipment monitored at each site. With a calculation mesh running across networked actors, small, exhaustively tested calculation steps can be combined in unique ways using metadata-based configuration rather than code customization, yielding valuable insights from streaming data.

You'll discover actors conceptually and explore a case study of this approach running in Microsoft's Service Fabric for a platform that monitors and controls renewable energy assets.
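To make the idea concrete, here is a minimal sketch of a metadata-configured calculation mesh. All names here (`Actor`, `detect_downtime`, `mesh_config`, the per-site downtime numbers) are illustrative assumptions, not APIs from the talk or from Service Fabric, where a runtime such as Reliable Actors would supply the mailbox, persistence, and distribution that this toy version only hints at:

```python
from collections import deque

class Actor:
    """A small unit of computation with private state and a mailbox.

    Messages are handled one at a time, so only this actor ever
    mutates its own state -- no locks needed.
    """

    def __init__(self, handler, downstream=None):
        self.state = {}
        self._mailbox = deque()
        self._handler = handler
        self._downstream = downstream or []

    def send(self, msg):
        self._mailbox.append(msg)
        while self._mailbox:  # drain sequentially: single-threaded by construction
            out = self._handler(self.state, self._mailbox.popleft())
            if out is not None:
                for target in self._downstream:
                    target.send(out)

# Two tiny, separately testable calculation steps.
def detect_downtime(state, reading):
    """Emit a downtime event whenever a machine reports 'stopped'."""
    if reading["status"] == "stopped":
        return {"site": reading["site"], "minutes": reading["interval_min"]}

def total_downtime(state, event):
    """Aggregate downtime per site -- the mesh's high-level derivation."""
    state[event["site"]] = state.get(event["site"], 0) + event["minutes"]

# Metadata-based configuration: swapping this dict rewires the mesh for a
# different site's equipment without changing any calculation code.
mesh_config = {
    "aggregator": (total_downtime, []),
    "detector": (detect_downtime, ["aggregator"]),
}
actors = {}
for name, (handler, targets) in mesh_config.items():
    actors[name] = Actor(handler, [actors[t] for t in targets])

# Stream a few hypothetical equipment readings through the mesh.
for reading in [
    {"site": "plant-a", "status": "stopped", "interval_min": 5},
    {"site": "plant-a", "status": "running", "interval_min": 5},
    {"site": "plant-a", "status": "stopped", "interval_min": 5},
]:
    actors["detector"].send(reading)

print(actors["aggregator"].state)  # {'plant-a': 10}
```

Because each step is a plain function over its own state, the detector and the aggregator can be unit-tested exhaustively on their own, and a new site is onboarded by editing `mesh_config` rather than writing custom pipeline code.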
