The Current Experience

Your curated path to insights, innovation, and inspiration.

Day 1
Main Keynote

Level up with the latest in AI and data streaming, plus breakthrough product innovations.

DSE Certification

Claim Your Spot in the Hall of Fame

Step up your game—head to the Certification Lounge to earn your Data Streaming Engineer Certification. And it’s free! Boost your professional profile and career prospects, score exclusive swag, and secure your spot in the Confluent Hall of Fame!

AI Alley

Catch Innovation in Action

Explore AI Alley, an immersive space in the Expo Hall where AI and real-time data come to life. Tour the AI-powered art gallery and take part in hands-on workshops and learning experiences by companies including AWS and Databricks.

Current Party

Unwind, Connect, and Celebrate

Step into the spirit of New Orleans with a second line parade, and celebrate with authentic food, live entertainment, and a touch of Halloween magic—ripe with voodoo vibes and spooky surprises.

Confluent Booth

Stop by the Confluent Booth and see what’s possible with the world’s data streaming platform. Discover how we help you build faster, scale smarter, and innovate bolder.

Day 2
Developer Keynote

Dive into Kafka and Flink updates and celebrate open-source innovations.

Meetup Hub

Where New Ideas Come to Life


Organized by Confluent
Hadar Federovsky, Akamai
Swaroop Oggu from Databricks as part of the Current Keynote

Stream On: From Bottlenecks to Streamline with Kafka Streams Template

Tuesday, May 20, 2025
5:30 PM - 6:15 PM

How do you make 10TB of data per hour accessible, scalable, and easy to integrate for multiple internal consumers? In this talk, we’ll share how we overcame storage throughput limitations by migrating to Kafka Streams and developing a unified template application. Our solution not only eliminated bottlenecks but also empowered internal clients to build reliable Kafka Streams applications in just a few clicks—focusing solely on business logic without worrying about infrastructure complexity. We’ll dive into our architecture, implementation strategies, and key optimizations, covering performance tuning, monitoring, and how our approach accelerates adoption across teams. Whether you're managing massive data pipelines or seeking to streamline access for diverse stakeholders, this session will provide practical insights into leveraging Kafka Streams for seamless, scalable data flow.
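The abstract's core idea is a template that owns all the Kafka Streams wiring so internal clients supply only business logic. A minimal sketch of that pattern might look like the following, where every name (`TemplateApp`, the topics, the bootstrap address) is an illustrative assumption, not the speakers' actual code:

```java
import java.util.Properties;
import java.util.function.Function;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

public final class TemplateApp {

    /**
     * The template owns configuration, serdes, and topology wiring;
     * the client contributes only a per-record transformation.
     */
    public static KafkaStreams build(String appId,
                                     String inputTopic,
                                     String outputTopic,
                                     Function<String, String> businessLogic) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, appId);
        // Hypothetical broker address for the sketch.
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        StreamsBuilder builder = new StreamsBuilder();
        builder.stream(inputTopic, Consumed.with(Serdes.String(), Serdes.String()))
               .mapValues(businessLogic::apply)   // client-supplied logic only
               .to(outputTopic, Produced.with(Serdes.String(), Serdes.String()));

        return new KafkaStreams(builder.build(), props);
    }

    public static void main(String[] args) {
        // A client "application" reduces to one lambda.
        KafkaStreams app = build("demo-app", "events-in", "events-out",
                                 value -> value.toUpperCase());
        app.start();
        Runtime.getRuntime().addShutdownHook(new Thread(app::close));
    }
}
```

The session presumably layers performance tuning and monitoring on top of a skeleton like this; the point of the sketch is only the division of labor between template and client.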

Location
Breakout Room 6
Level
Intermediate
Audience
Data Engineer/Scientist, Developer, Executive (Technical)
Track
Apache Kafka

Mike Araujo

Principal Engineer, Medidata Solutions
