Event-Driven Architecture: A Deep Dive

by mahidhar

Dive into Event Brokers and Tools

Event Brokers Overview

Event brokers (also known as message brokers or event buses) are middleware systems that facilitate the transmission of events between producers and consumers. They decouple the components that produce events from those that consume them, enhancing scalability and fault tolerance. Some popular event brokers include Apache Kafka, RabbitMQ, and AWS SNS/SQS.

Key Features of Event Brokers

  • Durability: Ensuring messages are not lost.
  • Scalability: Handling increased load by adding more nodes.
  • Fault Tolerance: Continuing to function despite hardware failures.
  • Ordering: Guaranteeing the order of message delivery.
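
To make the decoupling idea concrete, here is a minimal in-memory sketch of a broker in Java. This is illustrative only: producers and consumers know only topic names, never each other. Real brokers such as Kafka add durability, partitioning, and network transport on top of this basic indirection.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// A toy in-memory broker: publishers and subscribers are connected only
// through topic names -- the decoupling that event brokers provide.
class InMemoryBroker {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    void publish(String topic, String event) {
        for (Consumer<String> handler : subscribers.getOrDefault(topic, List.of())) {
            handler.accept(event);
        }
    }
}

public class BrokerDemo {
    public static void main(String[] args) {
        InMemoryBroker broker = new InMemoryBroker();
        List<String> received = new ArrayList<>();
        broker.subscribe("orders", received::add);   // consumer side
        broker.publish("orders", "order-created");   // producer side
        System.out.println(received);                // [order-created]
    }
}
```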

Apache Kafka

Apache Kafka is a distributed event streaming platform designed for high-throughput, fault-tolerant, and scalable messaging. It is widely used for building real-time data pipelines and streaming applications.

Core Concepts

  • Topics: Categories to which records are published.
  • Partitions: Divisions of a topic for parallel processing.
  • Producers: Entities that publish events to topics.
  • Consumers: Entities that read events from topics.
  • Brokers: Kafka servers that store and manage topics.
  • ZooKeeper: Coordinates brokers and stores cluster metadata (recent Kafka versions can instead run in KRaft mode, without ZooKeeper).
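
Partitions are what make records with the same key arrive in order: the producer maps each key to a fixed partition. Kafka's default partitioner actually uses murmur2 hashing; the sketch below simplifies the idea with Java's `hashCode`, so the partition numbers differ from Kafka's, but the principle — same key, same partition — is the same.

```java
public class PartitionSketch {
    // Simplified key -> partition mapping. Kafka's default partitioner
    // uses murmur2 hashing, not hashCode; this only illustrates that a
    // deterministic hash sends a given key to the same partition every time.
    static int partitionFor(String key, int numPartitions) {
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        int p1 = partitionFor("user-42", 3);
        int p2 = partitionFor("user-42", 3);
        System.out.println(p1 == p2);  // true: same key, same partition
    }
}
```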

Setting Up Kafka

  1. Install Kafka: Download Kafka from the official site.
  2. Start ZooKeeper: Kafka (up to the 2.x releases used here) requires ZooKeeper to manage its cluster.

```shell
bin/zookeeper-server-start.sh config/zookeeper.properties
```

  3. Start the Kafka Broker:

```shell
bin/kafka-server-start.sh config/server.properties
```

Kafka Producer Example

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

// Create the producer, send a single record, and release resources
KafkaProducer<String, String> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("my-topic", "key", "value"));
producer.close();
```
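
The example above uses only the required settings. In practice, producers are usually also configured for delivery guarantees. The sketch below shows three commonly tuned reliability settings; the config keys (`acks`, `retries`, `enable.idempotence`) are real Kafka producer configs, but the values are illustrative and should be chosen for your workload.

```java
import java.util.Properties;

public class ProducerConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // Durability-oriented settings: wait for all in-sync replicas to
        // acknowledge, retry transient failures, avoid duplicates on retry.
        props.put("acks", "all");
        props.put("retries", "3");
        props.put("enable.idempotence", "true");
        System.out.println(props.getProperty("acks"));  // all
    }
}
```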

Kafka Consumer Example

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("group.id", "test-group");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("my-topic"));

// Poll in a loop; each call returns whatever records arrived since the last poll
while (true) {
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("offset = %d, key = %s, value = %s%n", record.offset(), record.key(), record.value());
    }
}
```
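
The loop above relies on Kafka's default offset management. Two settings worth knowing are where a new consumer group starts reading and whether offsets are committed automatically. The sketch below uses real Kafka consumer config keys (`auto.offset.reset`, `enable.auto.commit`); the chosen values are one common combination, not the only one.

```java
import java.util.Properties;

public class ConsumerConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "test-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        // Start from the beginning of the topic when this group has no
        // committed offset yet; the default is "latest".
        props.put("auto.offset.reset", "earliest");
        // Disable auto-commit so offsets can be committed manually
        // (consumer.commitSync()) after records are fully processed,
        // giving at-least-once processing semantics.
        props.put("enable.auto.commit", "false");
        System.out.println(props.getProperty("auto.offset.reset"));  // earliest
    }
}
```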

Practical Implementation in Java

Setting Up Your Environment

Before diving into Kafka with Java, ensure you have the following:

  • Java Development Kit (JDK) installed.
  • Apache Kafka installed and running.
  • Your preferred IDE or text editor.

Kafka Producer in Java

Steps to Create a Kafka Producer:

  1. Add Kafka Dependencies: Include Kafka client dependencies in your pom.xml for Maven or build.gradle for Gradle.
```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.7.0</version>
</dependency>
```
  2. Create Producer Properties:

```java
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
```

  3. Send Messages:

```java
KafkaProducer<String, String> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("my-topic", "key", "value"));
producer.close();
```

Kafka Consumer in Java

Steps to Create a Kafka Consumer:

  1. Add Kafka Dependencies: (Same as for the producer).
  2. Create Consumer Properties:
```java
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("group.id", "test-group");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
```

  3. Consume Messages:

```java
KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("my-topic"));

while (true) {
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("offset = %d, key = %s, value = %s%n", record.offset(), record.key(), record.value());
    }
}
```

Kafka Streams

The Kafka Streams API lets you build real-time applications that transform or aggregate data flowing through Kafka topics.
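
To see what "transform or aggregate" means before touching the Kafka Streams API itself, here is a plain-Java analogue of a streaming word count (the canonical Kafka Streams example) over an in-memory list. In Kafka Streams the equivalent pipeline would be expressed with `flatMapValues`, `groupBy`, and `count`; this sketch only mirrors the shape of that computation.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class WordCountSketch {
    public static void main(String[] args) {
        // Stand-in for records read from an input topic
        List<String> lines = List.of("event driven systems", "driven by events");

        // Split each line into words and count occurrences -- the same
        // transform/aggregate shape Kafka Streams expresses as
        // flatMapValues + groupBy + count.
        Map<String, Long> counts = lines.stream()
                .flatMap(line -> Arrays.stream(line.split("\\s+")))
                .collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));

        System.out.println(counts.get("driven"));  // 2
    }
}
```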

Steps to Create a Kafka Streams Application:

  1. Add Kafka Streams Dependencies:

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <version>2.7.0</version>
</dependency>
```

  2. Configure Streams Properties:

```java
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-example");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
```

  3. Build the Stream Topology:

```java
StreamsBuilder builder = new StreamsBuilder();
KStream<String, String> source = builder.stream("input-topic");
source.to("output-topic");
```

  4. Start the Streams Application:

```java
KafkaStreams streams = new KafkaStreams(builder.build(), props);
streams.start();
```

Conclusion

By diving deep into the basics of events, exploring event-driven patterns like Pub/Sub, Event Sourcing, and CQRS, and implementing practical examples with Apache Kafka in Java, you can effectively build robust and scalable event-driven systems. These foundational and advanced concepts will enable you to leverage the full potential of event-driven architecture in real-world applications.