Spring Kafka Tutorial — Complete Guide
A complete, hands-on Apache Kafka and Spring Kafka tutorial series. Master event streaming from first principles — brokers, topics, producers, consumers, error handling, transactions, Avro serialization, Kafka Streams, and production patterns.
Apache Kafka is the backbone of modern event-driven architectures. This series dismantles it from the ground up — starting from what a broker actually is, through every producer and consumer knob, to exactly-once transactions and Kafka Streams. Every article uses a consistent e-commerce order-processing system as the running example.
37 articles. Real architecture. Production-grade patterns throughout.
Part 1: Apache Kafka Fundamentals
- What Is Apache Kafka: Event Streaming From First Principles
- Kafka Architecture: Brokers, Topics, Partitions, and Replicas
- Consumer Groups, Offsets, and the __consumer_offsets Topic
- KRaft Mode: Running Kafka Without ZooKeeper
- Starting a Kafka Cluster: Single-Broker and 3-Broker with KRaft
- Kafka CLI: Creating Topics, Producing, and Consuming Messages
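As a taste of the CLI article, the basic topic-management commands look like this. A minimal sketch assuming a broker reachable at localhost:9092 and the Kafka scripts on your PATH (inside Docker, prefix each with `docker exec -it <container>`); the topic name `orders` is just the running example:

```shell
# Create the orders topic with 3 partitions
kafka-topics.sh --bootstrap-server localhost:9092 --create \
  --topic orders --partitions 3 --replication-factor 1

# Inspect partition leaders and replicas
kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic orders

# Produce from stdin, then consume from the beginning in another terminal
kafka-console-producer.sh --bootstrap-server localhost:9092 --topic orders
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic orders --from-beginning
```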
Part 2: Spring Kafka Producers
- Kafka Producer in Spring Boot: KafkaTemplate Basics
- Sending Messages with Keys, Headers, and Custom Partitioning
- Producer @Bean Configuration: Beyond application.properties
- Producer Acknowledgments: acks, min.insync.replicas, and Data Durability
- Producer Retries: Backoff, Timeouts, and Retry Strategies
- Idempotent Producers: Eliminating Duplicate Messages
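The keys-and-partitioning article rests on one invariant: records with the same key always land on the same partition, because the default partitioner hashes the key modulo the partition count. A simplified plain-Java sketch of that idea (real Kafka applies murmur2 to the *serialized* key bytes, not `hashCode()`):

```java
public class KeyPartitioner {

    // Same key -> same partition, keys spread across numPartitions.
    // Simplified stand-in for Kafka's default keyed partitioning.
    public static int partitionFor(String key, int numPartitions) {
        // Mask the sign bit so the result is non-negative, as Kafka does.
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // An order ID used as the key keeps all events for one order in order.
        System.out.println("order-1042 -> partition "
                + partitionFor("order-1042", 3));
    }
}
```

This is why choosing the key is a design decision: it fixes both ordering scope and load distribution.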
Part 3: Spring Kafka Consumers
- Kafka Consumer in Spring Boot: @KafkaListener Basics
- Consumer Groups: Parallel Processing and Partition Assignment Strategies
- Offset Management: Auto-Commit vs Manual Acknowledgment
- Seeking to Specific Offsets: Replay, Recovery, and Time-Based Seeking
- Consumer @Bean Configuration: ConcurrentKafkaListenerContainerFactory
- Filtering Messages with RecordFilterStrategy
- Pausing, Resuming, and Stopping Listener Containers
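The listener shape these articles build on is compact. A minimal sketch assuming a topic named `orders` and a container factory configured with `AckMode.MANUAL` (required for the `Acknowledgment` parameter to be injected):

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Component
public class OrderListener {

    @KafkaListener(topics = "orders", groupId = "order-processing")
    public void onOrder(ConsumerRecord<String, String> record, Acknowledgment ack) {
        // Process the record, then commit the offset explicitly.
        System.out.printf("partition=%d offset=%d key=%s%n",
                record.partition(), record.offset(), record.key());
        ack.acknowledge();
    }
}
```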
Part 4: Serialization and Message Handling
- JSON Serialization: JsonSerializer, JsonDeserializer, and Type Mapping
- Avro Serialization with Confluent Schema Registry
- Custom Serializers and Deserializers
- Message Headers: Metadata, Routing, and Custom Header Propagation
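For orientation, the JSON serialization article boils down to a few properties. A sketch of the `application.yml` configuration, assuming your event classes live under a package like `com.example.orders` (placeholder):

```yaml
spring:
  kafka:
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer:
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        # Deserialization fails unless the payload's type is trusted
        spring.json.trusted.packages: "com.example.orders"
```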
Part 5: Error Handling and Resilience
- Error Handling Basics: DefaultErrorHandler and CommonErrorHandler
- Retryable vs Non-Retryable Exceptions: Custom Exception Classification
- Dead Letter Topics: Routing Failed Messages with DeadLetterPublishingRecoverer
- Handling Deserialization Errors Gracefully
- Non-Blocking Retries: @RetryableTopic, BackOff, and the Retry Topic Chain
- Kafka Transactions and Exactly-Once Semantics
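The error-handling pieces above compose into one bean. A minimal sketch wiring `DefaultErrorHandler` to a dead letter topic (by default the recoverer publishes to `<topic>.DLT`); the retry counts and the chosen non-retryable exception are illustrative:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.listener.DefaultErrorHandler;
import org.springframework.util.backoff.FixedBackOff;

public class ErrorHandlingConfig {

    @Bean
    public DefaultErrorHandler errorHandler(KafkaTemplate<Object, Object> template) {
        // After the retries are exhausted, route the failed record to <topic>.DLT
        var recoverer = new DeadLetterPublishingRecoverer(template);
        // 2 retries, 1 second apart (3 delivery attempts total)
        var handler = new DefaultErrorHandler(recoverer, new FixedBackOff(1000L, 2));
        // Skip retries for failures that can never succeed
        handler.addNotRetryableExceptions(IllegalArgumentException.class);
        return handler;
    }
}
```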
Part 6: Advanced Patterns
- @SendTo and @KafkaHandler: Chaining Consumers and Multi-Type Dispatch
- Request-Reply Pattern with ReplyingKafkaTemplate
- Dynamic Listener Containers and Programmatic Topic Registration
- Kafka Streams with Spring Boot: Stateless and Stateful Processing
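To preview the Streams article, a stateful word-of-the-pattern example: counting orders per key. A sketch assuming `@EnableKafkaStreams` is active (which makes the `StreamsBuilder` bean available) and the topic names are the series' running examples:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.springframework.context.annotation.Bean;

public class OrderStreamsConfig {

    @Bean
    public KStream<String, String> orderCounts(StreamsBuilder builder) {
        KStream<String, String> orders = builder.stream("orders");
        // Stateful: count records per key in a local state store,
        // then emit the running counts to a downstream topic.
        orders.groupByKey()
              .count(Materialized.as("order-counts"))
              .toStream()
              .to("order-counts-topic", Produced.with(Serdes.String(), Serdes.Long()));
        return orders;
    }
}
```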
Part 7: Administration, Testing, and Monitoring
- KafkaAdmin and AdminClient: Managing Topics Programmatically
- Testing Kafka Applications: EmbeddedKafka and Testcontainers
- Monitoring: Consumer Lag, Micrometer Metrics, and Actuator Integration
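The testing article centers on `@EmbeddedKafka` from spring-kafka-test. A minimal sketch, assuming the test's `bootstrap-servers` property points at the embedded broker (e.g. `spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}`); the class and topic names are illustrative:

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.context.EmbeddedKafka;

@SpringBootTest
@EmbeddedKafka(partitions = 1, topics = "orders")
class OrderFlowIT {

    @Autowired
    KafkaTemplate<String, String> template;

    @Test
    void publishesOrder() {
        // Send against the in-memory broker; verify via a test
        // consumer or a listener latch in the real test.
        template.send("orders", "order-1", "{\"id\":1}");
    }
}
```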
Part 8: Production Best Practices
What You Will Learn
- How Apache Kafka stores and replays events — brokers, topics, partitions, offsets
- How to run Kafka locally with KRaft mode (no ZooKeeper) using Docker Compose
- Building producers: KafkaTemplate, acknowledgments, retries, idempotence
- Building consumers: @KafkaListener, consumer groups, offset management, manual commits
- Serialization: JSON, Avro with Schema Registry, custom serializers
- Error handling: DefaultErrorHandler, dead letter topics, non-blocking retries
- Exactly-once semantics with Kafka transactions
- Advanced patterns: request-reply, dynamic listeners, Kafka Streams
- Testing: @EmbeddedKafka, Testcontainers, integration test patterns
- Production: monitoring consumer lag, metrics, and the production launch checklist
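The local KRaft setup mentioned above fits in one Compose file. A single-broker sketch assuming the `apache/kafka` image (the pinned version is an example; the environment keys below are the image's standard KRaft settings):

```yaml
services:
  kafka:
    image: apache/kafka:3.7.0
    ports:
      - "9092:9092"
    environment:
      # One node acting as both broker and controller - no ZooKeeper
      KAFKA_NODE_ID: 1
      KAFKA_PROCESS_ROLES: broker,controller
      KAFKA_CONTROLLER_QUORUM_VOTERS: 1@kafka:9093
      KAFKA_LISTENERS: PLAINTEXT://:9092,CONTROLLER://:9093
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_CONTROLLER_LISTENER_NAMES: CONTROLLER
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,CONTROLLER:PLAINTEXT
      # Single broker, so internal topics can't be replicated
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

`docker compose up -d` then gives you a broker on `localhost:9092` for every example in the series.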
Prerequisites
- Java 11+ (examples use Java 21)
- Spring Boot basics (dependency injection, REST controllers)
- Basic understanding of messaging concepts (helpful but not required)
- Docker Desktop installed (for running Kafka locally)
No prior Kafka experience needed.