Kafka consumer subscribing to partitions

After starting Kafka from the command line, manually create the topic 'assign_topic' with 2 partitions by running:

.\bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 2 --topic assign_topic

Using the topic created above, complete the following tasks:
(1) Write a producer program that sends universally unique identifiers (UUIDs) as messages to the topic assign_topic;
(2) Write consumer program 1, which subscribes to partition 0 of the topic and consumes only the data in partition 0;
(3) Write consumer program 2, which subscribes to partition 1 of the topic and consumes only the data in partition 1.
Here is the corresponding code:
Producer program:
```java
import org.apache.kafka.clients.producer.*;

import java.util.Properties;
import java.util.UUID;

public class ProducerDemo {
    public static void main(String[] args) {
        String topicName = "assign_topic";

        // Producer configuration
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("batch.size", 16384);
        props.put("linger.ms", 1);
        props.put("buffer.memory", 33554432);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(props);

        // Send 10 messages, each carrying a random UUID as its value (no key)
        for (int i = 0; i < 10; i++) {
            String uuid = UUID.randomUUID().toString();
            producer.send(new ProducerRecord<>(topicName, uuid));
        }
        producer.close();
    }
}
```
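Because the records above have no key, the default partitioner decides which of the two partitions each UUID lands in, so the split between partition 0 and partition 1 is not deterministic. If you want to guarantee that both consumers receive data during testing, a minimal sketch (a drop-in replacement for the send loop above, assuming the same topic and producer) passes the partition index explicitly via the ProducerRecord(topic, partition, key, value) constructor:

```java
// Hypothetical variant of the send loop: alternate messages between partition 0 and 1
for (int i = 0; i < 10; i++) {
    String uuid = UUID.randomUUID().toString();
    int partition = i % 2;  // even i -> partition 0, odd i -> partition 1
    producer.send(new ProducerRecord<>(topicName, partition, null, uuid));
}
```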
Consumer program 1:
```java
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.TopicPartition;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ConsumerDemo1 {
    public static void main(String[] args) {
        String topicName = "assign_topic";

        // Consumer configuration
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "test-group");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);

        // Manually assign partition 0; assign() bypasses consumer-group rebalancing,
        // so this consumer reads only partition 0 of the topic
        TopicPartition partition0 = new TopicPartition(topicName, 0);
        consumer.assign(Collections.singletonList(partition0));

        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset = %d, key = %s, value = %s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```
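One caveat: with the default auto.offset.reset of "latest" and no previously committed offsets, a consumer started after the producer has already finished may print nothing until new messages arrive. A minimal sketch of one way around this (an optional addition, not part of the original exercise) is to seek to the beginning of the assigned partition right after assign(); setting props.put("auto.offset.reset", "earliest") would have a similar effect:

```java
// Optional: read partition 0 from its earliest available offset, so messages
// produced before this consumer was started are also consumed
consumer.assign(Collections.singletonList(partition0));
consumer.seekToBeginning(Collections.singletonList(partition0));
```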
Consumer program 2:
```java
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.common.TopicPartition;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ConsumerDemo2 {
    public static void main(String[] args) {
        String topicName = "assign_topic";

        // Consumer configuration
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "test-group");
        props.put("enable.auto.commit", "true");
        props.put("auto.commit.interval.ms", "1000");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);

        // Manually assign partition 1; assign() bypasses consumer-group rebalancing,
        // so this consumer reads only partition 1 of the topic
        TopicPartition partition1 = new TopicPartition(topicName, 1);
        consumer.assign(Collections.singletonList(partition1));

        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("offset = %d, key = %s, value = %s%n",
                        record.offset(), record.key(), record.value());
            }
        }
    }
}
```
The producer program generates UUIDs as message values and sends them to the topic "assign_topic". Consumer program 1 is assigned partition 0 of "assign_topic" and consumes only the data in partition 0; consumer program 2 is assigned partition 1 and consumes only the data in partition 1. Because the messages are sent without keys, the default partitioner spreads them across the two partitions, so each consumer will see a portion of the UUIDs.
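Both consumer programs poll in an infinite loop and never close the consumer. A common pattern for a clean shutdown, shown here only as a sketch and not part of the original answer, is to call wakeup() from a JVM shutdown hook; poll() then throws WakeupException, which the loop catches before closing:

```java
// Sketch: wrap the poll loop of either consumer so Ctrl+C exits it cleanly
final Thread mainThread = Thread.currentThread();
Runtime.getRuntime().addShutdownHook(new Thread(() -> {
    consumer.wakeup();      // makes the blocked poll() throw WakeupException
    try {
        mainThread.join();  // wait until the main loop has closed the consumer
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
}));

try {
    while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
        for (ConsumerRecord<String, String> record : records) {
            System.out.printf("offset = %d, key = %s, value = %s%n",
                    record.offset(), record.key(), record.value());
        }
    }
} catch (org.apache.kafka.common.errors.WakeupException e) {
    // expected during shutdown; fall through to close()
} finally {
    consumer.close();
}
```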