Speaker Details

Danica Fine
Confluent

Danica Fine is a Staff Developer Advocate at Confluent where she helps others get the most out of Kafka, Flink, and their event-driven pipelines. In her previous role as a software engineer on a streaming infrastructure team, she predominantly worked on Kafka Streams- and Kafka Connect-based projects to support computing financial market data at scale. She can be found on Twitter, tweeting about tech, plants, and baking @TheDanicaFine.

Do you know how your data moves into your Apache Kafka® instance? From the programmer’s point of view, it’s relatively simple. But under the hood, writing to Kafka is a complex process with a fascinating life cycle that’s worth understanding.

Anytime you call producer.send(), those calls are translated into low-level requests that are sent to the brokers for processing. In this session, we’ll dive into the world of Kafka producers to follow a request from an initial call to send(), all the way to disk, and back to the client via the broker’s final response. Along the way, we’ll explore a number of client and broker configurations that affect how these requests are handled and discuss the metrics that you can monitor to help you keep track of every stage of the request life cycle.
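To make the idea concrete, here is a sketch of a few standard producer settings that shape this request path. The values shown are illustrative defaults or examples, not recommendations from the session itself:

```properties
# How many records send() may accumulate into a batch before a request goes out
batch.size=16384
# How long the producer waits for more records before sending a partial batch
linger.ms=10
# How many unacknowledged requests may be in flight per broker connection
max.in.flight.requests.per.connection=5
# Acknowledgement level the broker must satisfy before responding (0, 1, or all)
acks=all
# How long the client waits for the broker's response before retrying or failing
request.timeout.ms=30000
```

Each of these settings corresponds to a stage of the life cycle the session walks through: batching on the client, transmission to the broker, and the broker's acknowledgement back to the client.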

By the end of this session, you’ll know the ins and outs of the read and write requests that your Kafka clients make, making your next debugging or performance analysis session a breeze.
