Background

Breakout Session

Scaling Data Ingestion: Overcoming Challenges with Cell Architecture

Ingesting massive volumes of data, often reaching hundreds of gigabytes per second, poses a significant challenge for companies across various industries, and New Relic is no exception. Cell architecture has proven to be an effective strategy for scaling New Relic's pipelines, yet it has introduced its own set of complexities. In this presentation, Ramesh and Antón will share insights gained from their experience overcoming these challenges.

They will delve into how New Relic uses key technologies such as Apache Flink, Apache Kafka, and S3 with Apache Paimon to enhance its data processing capabilities. Specifically, they will explore how these tools are integrated to deliver advanced functionality, including intelligent routing between Kafka clusters deployed across different regions or clouds, global state capabilities, and ubiquitous data availability.
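
For context on how these pieces can fit together, here is a minimal, hypothetical sketch of a Flink Table API job that reads a Kafka topic and continuously writes it into an Apache Paimon table stored on S3. This is not New Relic's actual pipeline; the bucket, topic, broker, and schema names are illustrative placeholders.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KafkaToPaimonSketch {
    public static void main(String[] args) throws Exception {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Paimon catalog backed by an S3 warehouse (bucket path is a placeholder).
        tEnv.executeSql(
                "CREATE CATALOG paimon WITH ("
                        + " 'type' = 'paimon',"
                        + " 'warehouse' = 's3://example-bucket/warehouse'"
                        + ")");
        tEnv.useCatalog("paimon");

        // Source: a temporary table over a Kafka topic of telemetry events
        // (topic and broker addresses are placeholders).
        tEnv.executeSql(
                "CREATE TEMPORARY TABLE events ("
                        + "  account_id BIGINT,"
                        + "  payload STRING,"
                        + "  event_time TIMESTAMP(3)"
                        + ") WITH ("
                        + "  'connector' = 'kafka',"
                        + "  'topic' = 'telemetry-events',"
                        + "  'properties.bootstrap.servers' = 'broker:9092',"
                        + "  'properties.group.id' = 'paimon-sink',"
                        + "  'scan.startup.mode' = 'latest-offset',"
                        + "  'format' = 'json'"
                        + ")");

        // Sink: a Paimon table materialized on S3, readable by any cell or
        // region that can reach the bucket.
        tEnv.executeSql(
                "CREATE TABLE IF NOT EXISTS events_lake ("
                        + "  account_id BIGINT,"
                        + "  payload STRING,"
                        + "  event_time TIMESTAMP(3)"
                        + ")");

        // Continuously stream Kafka records into the lake table.
        tEnv.executeSql("INSERT INTO events_lake SELECT * FROM events").await();
    }
}
```

Because the Paimon table lives in object storage rather than inside any single Kafka cluster, the same data stays queryable from jobs running in other cells or regions, which is one way to reason about the global state and ubiquitous availability ideas mentioned above.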

Attendees keen on understanding the intricacies of combining diverse storage engines to optimize and scale data pipelines will find this talk immensely beneficial. Join us to uncover actionable strategies and best practices for tackling the complexities of data ingestion and processing at scale.

Antón Rodríguez

New Relic

Ramesh Motaparthy

New Relic
