Flink Jobs as Agents 🤖 – Unlocking Agentic AI with Stream Processing

Breakout Session

Apache Flink is uniquely positioned to serve as the backbone for AI agents, enhancing them with stream processing as a new, powerful tool. We’ll explore how Flink jobs can be transformed into autonomous, goal-driven "Agents" that interact with data streams, trigger actions, and adapt in real time.

We’ll showcase Flink jobs as AI agents through two key stream processing & AI use cases: 1) financial planning and detection of spending anomalies, and 2) demand forecasting and monitoring supply chains for disruptions.
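
To make the pattern concrete, here is a minimal sketch (not code from the session) of a Flink job acting as a goal-driven agent for the spending-anomaly use case: a keyed function keeps a running average per account and emits an action event when a transaction deviates sharply from it. The Transaction class, the 5x threshold, and the print sink are hypothetical placeholders; a real agent would read from Kafka and trigger a model or tool call downstream.

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

public class SpendingAnomalyAgent {

    /** Hypothetical transaction event; in practice this would arrive via a Kafka source. */
    public static class Transaction {
        public String account;
        public double amount;
        public Transaction() {}
        public Transaction(String account, double amount) {
            this.account = account;
            this.amount = amount;
        }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        env.fromElements(
                new Transaction("alice", 42.0),
                new Transaction("alice", 45.0),
                new Transaction("alice", 980.0))
           .keyBy(t -> t.account)
           .process(new AnomalyDetector())
           .print(); // a real agent would trigger a tool call or notify an LLM instead

        env.execute("spending-anomaly-agent");
    }

    /** Flags transactions far above the running average for the account. */
    public static class AnomalyDetector extends KeyedProcessFunction<String, Transaction, String> {
        private transient ValueState<Double> avgSpend;

        @Override
        public void open(Configuration parameters) {
            avgSpend = getRuntimeContext().getState(
                    new ValueStateDescriptor<>("avg-spend", Double.class));
        }

        @Override
        public void processElement(Transaction tx, Context ctx, Collector<String> out) throws Exception {
            Double avg = avgSpend.value();
            if (avg != null && tx.amount > 5 * avg) {
                // "Action" stage of the agent loop: emit an event another system can act on.
                out.collect("ANOMALY account=" + tx.account + " amount=" + tx.amount);
            }
            // Update the per-account average (simple exponential smoothing as a stand-in model).
            avgSpend.update(avg == null ? tx.amount : 0.8 * avg + 0.2 * tx.amount);
        }
    }
}
```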

AI agents need business context. We’ll discuss grounding foundation models with schema registries and data catalogs for contextual intelligence while ensuring data governance and security. We’ll integrate Apache Kafka event streams with data lakes in open table formats such as Apache Iceberg, enabling AI agents to draw on both real-time and historical data for consistent reasoning. We’ll also cover latency optimization for time-sensitive use cases and techniques for preventing hallucinations.
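
One possible shape for that integration is sketched below with Flink’s Table API: a Kafka topic is registered with Avro schemas resolved through a Confluent Schema Registry, an Apache Iceberg catalog holds the historical tables, and a join gives the agent both live and historical context. The topic name, warehouse path, table schema, and database are invented for illustration, and connector option names can differ across Flink and Iceberg versions.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class StreamsMeetLakehouse {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Real-time orders from Kafka; the Avro schema is resolved via Schema Registry.
        tEnv.executeSql(
            "CREATE TABLE orders_live (" +
            "  order_id STRING, customer_id STRING, amount DOUBLE, order_time TIMESTAMP(3)" +
            ") WITH (" +
            "  'connector' = 'kafka'," +
            "  'topic' = 'orders'," +
            "  'properties.bootstrap.servers' = 'broker:9092'," +
            "  'format' = 'avro-confluent'," +
            "  'avro-confluent.url' = 'http://schema-registry:8081'," +
            "  'scan.startup.mode' = 'latest-offset'" +
            ")");

        // Historical data stored in Apache Iceberg tables (database assumed to exist).
        tEnv.executeSql(
            "CREATE CATALOG lake WITH (" +
            "  'type' = 'iceberg'," +
            "  'catalog-type' = 'hadoop'," +
            "  'warehouse' = 's3://my-bucket/warehouse'" +
            ")");

        // Enrich each live order with the customer's historical profile so the agent
        // reasons over both real-time and historical data. The join strategy
        // (regular vs. lookup/temporal) depends on freshness and state requirements.
        tEnv.executeSql(
            "SELECT o.order_id, o.amount, h.avg_amount_90d " +
            "FROM orders_live AS o " +
            "JOIN lake.analytics.customer_history AS h " +
            "ON o.customer_id = h.customer_id").print();
    }
}
```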

Finally, we’ll demonstrate an open-source conversational platform built on Apache Kafka in which multiple AI agents are assigned to a business process and continuously process real-time events, each optimizing for its own goals while interacting and negotiating with the others.
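
As a stripped-down illustration of the idea (not the platform itself), each agent can be a plain Kafka client that consumes a shared negotiation topic, reacts to messages relevant to its goal, and publishes counter-proposals back. The topic name, toy message format, and price threshold below are invented for the example; in practice the decision step would delegate to an LLM.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

/** One "agent" in a multi-agent negotiation: it listens on a shared topic and replies. */
public class ProcurementAgent {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");
        props.put("group.id", "procurement-agent");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {

            consumer.subscribe(List.of("negotiation"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Goal: keep unit price under budget; counter any offer above it.
                    // In a real platform this decision would be delegated to an LLM call.
                    if (record.value().startsWith("OFFER") && extractPrice(record.value()) > 100.0) {
                        producer.send(new ProducerRecord<>("negotiation",
                                record.key(), "COUNTER price=100.0"));
                    }
                }
            }
        }
    }

    private static double extractPrice(String offer) {
        // Offers look like "OFFER price=123.4" in this toy protocol.
        return Double.parseDouble(offer.substring(offer.indexOf('=') + 1));
    }
}
```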

By combining Flink and Kafka, we can build systems that are not just reactive but proactive and predictive, paving the way for next-generation agentic AI.


Steffen Hoellinger

Airy, Inc.