1) Import the dependency
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.11.0.0</version>
</dependency>
2) Write the code
Classes involved:
KafkaProducer: the producer object used to send data.
ProducerConfig: holds the names of the configuration parameters the producer needs.
ProducerRecord: every message must be wrapped in a ProducerRecord object (the common constructors are sketched below).
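For reference, here is a minimal sketch of the ProducerRecord constructors you will use most often. The topic name "first" is just the example topic from this post, and the class name ProducerRecordExamples is illustrative only.

import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerRecordExamples {

    public static void main(String[] args) {
        // topic + value: no key, so this client version spreads records round-robin across partitions
        ProducerRecord<String, String> noKey =
                new ProducerRecord<>("first", "hello kafka");
        // topic + key + value: records with the same key always go to the same partition
        ProducerRecord<String, String> withKey =
                new ProducerRecord<>("first", "user-1", "hello kafka");
        // topic + partition + key + value: the target partition is fixed explicitly
        ProducerRecord<String, String> withPartition =
                new ProducerRecord<>("first", 0, "user-1", "hello kafka");

        System.out.println(noKey);
        System.out.println(withKey);
        System.out.println(withPartition);
    }
}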
1. Producer API without a callback
package com.young.springbootdemo.kafkademo;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class MyProducer {

    public static void main(String[] args) {
        // 1. Create the producer configuration
        Properties properties = new Properties();
        // 2. Kafka cluster to connect to
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.235.3:9092");
        // 3. ACK level: "all" waits for the leader and all in-sync replicas
        properties.put(ProducerConfig.ACKS_CONFIG, "all");
        // 4. Number of retries
        properties.put(ProducerConfig.RETRIES_CONFIG, 3);
        // 5. Batch size in bytes
        properties.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
        // 6. Linger time in ms (how long to wait for a batch to fill)
        properties.put(ProducerConfig.LINGER_MS_CONFIG, 1);
        // 7. RecordAccumulator buffer size in bytes (32 MB)
        properties.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
        // 8. Key and value serializer classes
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // 9. Create the producer
        KafkaProducer<String, String> producer = new KafkaProducer<>(properties);
        for (int i = 0; i < 10; i++) {
            // 10. Send data (asynchronous; records are batched in the RecordAccumulator)
            producer.send(new ProducerRecord<>("first", "kafka_message_" + i));
        }
        // 11. Close the producer, which flushes any buffered records
        producer.close();
    }
}
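Note that send() above is asynchronous: it only hands the record to the RecordAccumulator and returns a Future<RecordMetadata>. If you want to confirm each write before sending the next one, one option is to block on that Future. Below is a minimal sketch reusing the same broker address and topic as above; the class name MySyncProducer and the printed fields are illustrative, not part of the original example.

package com.young.springbootdemo.kafkademo;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

import java.util.Properties;
import java.util.concurrent.ExecutionException;

public class MySyncProducer {

    public static void main(String[] args) throws ExecutionException, InterruptedException {
        Properties properties = new Properties();
        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.235.3:9092");
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(properties);
        for (int i = 0; i < 10; i++) {
            // get() blocks until the broker acknowledges the record
            RecordMetadata metadata =
                    producer.send(new ProducerRecord<>("first", "sync_message_" + i)).get();
            System.out.println("partition=" + metadata.partition() + ", offset=" + metadata.offset());
        }
        producer.close();
    }
}

Blocking on every record trades throughput for a per-record delivery confirmation, so it is usually reserved for cases where ordering or acknowledgement matters more than speed.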
Start a console consumer to verify the messages. The output below is not in send order: the console consumer reads from all partitions of the topic, and Kafka only guarantees ordering within a single partition.
/opt/kafka_2.11-0.11.0.3/bin # kafka-console-consumer.sh --zookeeper zkKafka1:2181 --topic first
Using the ConsoleConsumer with old consumer is deprecated and will be removed in a future major release. Consider using the new consumer by passing [bootstrap-server] instead of [zookeeper].
kafka_message_1
kafka_message_3
kafka_message_5
kafka_message_7
kafka_message_9
kafka_message_0
kafka_message_2
kafka_message_4
kafka_message_6
kafka_message_8