Zero to Data Streaming Platform in Under 15 Minutes
Breakout Session
Data streaming engineers need tooling to efficiently provision, maintain, and evolve their data streaming platform. The Confluent Terraform Provider fills that need, providing human-readable infrastructure-as-code to build a Confluent Cloud environment in a matter of minutes.
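As a taste of what that looks like, here's a minimal sketch of wiring up the provider itself. The variable names are placeholders for your own Cloud API key pair, and the version pin is illustrative:

```hcl
# Pin the Confluent Terraform Provider (version constraint is illustrative).
terraform {
  required_providers {
    confluent = {
      source  = "confluentinc/confluent"
      version = "~> 2.0"
    }
  }
}

# Placeholder variables for a Confluent Cloud API key pair.
variable "confluent_cloud_api_key" {
  type      = string
  sensitive = true
}

variable "confluent_cloud_api_secret" {
  type      = string
  sensitive = true
}

provider "confluent" {
  cloud_api_key    = var.confluent_cloud_api_key
  cloud_api_secret = var.confluent_cloud_api_secret
}
```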
In this session, we’ll start from a blank canvas and create a new environment, complete with an Apache Kafka® cluster, stream governance, and stream processing with Flink. Next, we’ll create Kafka topics, define data contracts, and determine how to transform our input data. We won’t forget about security and access control: we’ll create service accounts with the necessary roles and permissions. Finally, we’ll set it all in motion by streaming events into Kafka and querying the output of our new data pipeline.
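As a preview of those steps, here's a hedged sketch using the confluentinc/confluent provider. Resource names like demo-env are illustrative, the Kafka API key is assumed to be provisioned separately, and exact attribute shapes (notably the stream_governance block) may vary by provider version:

```hcl
# Placeholder variables for a pre-provisioned Kafka API key.
variable "kafka_api_key" {
  type      = string
  sensitive = true
}

variable "kafka_api_secret" {
  type      = string
  sensitive = true
}

# A new environment; the stream_governance block is supported in
# recent provider versions.
resource "confluent_environment" "demo" {
  display_name = "demo-env"

  stream_governance {
    package = "ESSENTIALS"
  }
}

# A basic, single-zone Kafka cluster in the new environment.
resource "confluent_kafka_cluster" "demo" {
  display_name = "demo-cluster"
  availability = "SINGLE_ZONE"
  cloud        = "AWS"
  region       = "us-east-2"

  basic {}

  environment {
    id = confluent_environment.demo.id
  }
}

# A compute pool for Flink stream processing.
resource "confluent_flink_compute_pool" "demo" {
  display_name = "demo-flink"
  cloud        = "AWS"
  region       = "us-east-2"
  max_cfu      = 5

  environment {
    id = confluent_environment.demo.id
  }
}

# An input topic, authenticated with the assumed Kafka API key.
resource "confluent_kafka_topic" "orders" {
  topic_name       = "orders"
  partitions_count = 6
  rest_endpoint    = confluent_kafka_cluster.demo.rest_endpoint

  kafka_cluster {
    id = confluent_kafka_cluster.demo.id
  }

  credentials {
    key    = var.kafka_api_key
    secret = var.kafka_api_secret
  }
}

# A service account plus a role binding scoped to the cluster.
resource "confluent_service_account" "app" {
  display_name = "demo-app"
  description  = "Service account for the demo pipeline"
}

resource "confluent_role_binding" "app_admin" {
  principal   = "User:${confluent_service_account.app.id}"
  role_name   = "CloudClusterAdmin"
  crn_pattern = confluent_kafka_cluster.demo.rbac_crn
}
```

A terraform apply of something along these lines stands up the environment end to end; data contracts would then be layered on as schema resources against the governance-enabled Schema Registry.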
When we’re done, you’ll have the tools needed to build and maintain your data streaming platform. Let’s do this!
Sandon Jacobs
Confluent