
Using Kafka in a Java Project

Installing Kafka with Docker

1. Docker must be installed first; if you have not set it up yet, follow a Docker installation guide.

2. Pull the ZooKeeper and Kafka images

docker pull wurstmeister/zookeeper
docker pull wurstmeister/kafka

Kafka registers itself with ZooKeeper, so ZooKeeper must be installed as well.

3. Start the ZooKeeper and Kafka containers

docker run -d --name zookeeper -p 2181:2181 wurstmeister/zookeeper

docker run -d --name kafka --publish 9092:9092 --link zookeeper \
  --env KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 \
  --env KAFKA_ADVERTISED_HOST_NAME=localhost \
  --env KAFKA_ADVERTISED_PORT=9092 \
  wurstmeister/kafka

On a successful start, docker ps lists both containers with a status of "Up" (running).

4. Create a Spring Boot project

4.1 Add dependencies

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
        <exclusions>
            <exclusion>
                <groupId>org.junit.vintage</groupId>
                <artifactId>junit-vintage-engine</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>

4.2 The application.yml file

server:
  port: 9090
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      # Reset the consumer offset automatically when no committed offset exists,
      # so a reconnecting consumer can receive messages from the beginning
      auto-offset-reset: earliest
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      retries: 3  # number of retries
kafka:
  topic:
    my-topic: my-topic
    my-topic2: my-topic2

4.3 Create the Book entity class

public class Book {
    private Long id;
    private String name;

    public Book() {
    }

    public Book(Long id, String name) {
        this.id = id;
        this.name = name;
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    @Override
    public String toString() {
        return "Book{" +
                "id=" + id +
                ", name='" + name + '\'' +
                '}';
    }
}
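The consumer below logs received books via toString(), so it is worth knowing exactly what that format looks like. A quick, self-contained check (the nested class is a trimmed, hypothetical copy of the relevant part of Book, not the entity itself):

```java
public class BookToStringDemo {
    // Trimmed copy of the Book entity: just id, name, and toString
    static class Book {
        private final Long id;
        private final String name;

        Book(Long id, String name) {
            this.id = id;
            this.name = name;
        }

        @Override
        public String toString() {
            return "Book{" + "id=" + id + ", name='" + name + '\'' + '}';
        }
    }

    public static void main(String[] args) {
        System.out.println(new Book(1L, "Kafka"));
        // → Book{id=1, name='Kafka'}
    }
}
```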

4.4 Configure Kafka

@Configuration
public class KafkaConfig {

    @Value("${kafka.topic.my-topic}")
    String myTopic;

    @Value("${kafka.topic.my-topic2}")
    String myTopic2;

    /**
     * JSON message converter
     */
    @Bean
    public RecordMessageConverter jsonConverter() {
        return new StringJsonMessageConverter();
    }

    /**
     * Registering a NewTopic bean creates the topic; if the topic already exists, it is ignored.
     */
    @Bean
    public NewTopic myTopic() {
        // topic name, 2 partitions, replication factor 1
        return new NewTopic(myTopic, 2, (short) 1);
    }

    @Bean
    public NewTopic myTopic2() {
        return new NewTopic(myTopic2, 1, (short) 1);
    }
}

4.5 Controller code

@RestController
@RequestMapping(value = "/book")
public class BookController {

    @Value("${kafka.topic.my-topic}")
    String myTopic;

    @Value("${kafka.topic.my-topic2}")
    String myTopic2;

    private final BookProducerService producer;
    private final AtomicLong atomicLong = new AtomicLong();

    BookController(BookProducerService producer) {
        this.producer = producer;
    }

    @GetMapping("/send")
    public String sendMessageToKafkaTopic(@RequestParam("name") String name) {
        this.producer.sendMessage(myTopic, new Book(atomicLong.addAndGet(1), name));
        this.producer.sendMessage(myTopic2, new Book(atomicLong.addAndGet(1), name));
        return name + " : message sent!";
    }
}
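The controller hands out book ids with an AtomicLong so that concurrent HTTP requests never receive the same id. A minimal sketch of why addAndGet is safe under contention (the class and method names here are illustrative, not part of the project):

```java
import java.util.concurrent.atomic.AtomicLong;

public class AtomicIdDemo {
    // Run `threads` threads that each take `perThread` ids; return the final counter value
    static long allocate(int threads, int perThread) {
        AtomicLong ids = new AtomicLong();
        Thread[] workers = new Thread[threads];
        for (int i = 0; i < workers.length; i++) {
            workers[i] = new Thread(() -> {
                for (int j = 0; j < perThread; j++) {
                    ids.addAndGet(1); // atomic read-modify-write, no lost updates
                }
            });
            workers[i].start();
        }
        for (Thread t : workers) {
            try {
                t.join();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return ids.get();
    }

    public static void main(String[] args) {
        // Every increment survives, even with 10 threads racing
        System.out.println(allocate(10, 1000)); // 10000
    }
}
```

With a plain long field the same test would routinely lose increments, which is why the controller does not use one.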

4.6 Book producer service

@Service
public class BookProducerService {

    private static final Logger logger = LoggerFactory.getLogger(BookProducerService.class);

    private final KafkaTemplate<String, Object> kafkaTemplate;

    // Injected through the constructor
    public BookProducerService(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String topic, Object o) {
        ListenableFuture<SendResult<String, Object>> future = kafkaTemplate.send(topic, o);
        future.addCallback(
                result -> logger.info("Producer sent a message to topic: {} partition: {}",
                        result.getRecordMetadata().topic(),
                        result.getRecordMetadata().partition()),
                ex -> logger.error("Producer failed to send the message, cause: {}", ex.getMessage()));
    }
}
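Because the producer is configured with JsonSerializer, each Book travels through Kafka as a JSON string, which is exactly what the consumer's ObjectMapper parses back. A hand-rolled sketch of that payload shape (illustrative only; the real serializer delegates to Jackson, which also handles escaping and nested types):

```java
public class BookPayloadSketch {
    // Mimics the JSON shape spring-kafka's JsonSerializer produces for a Book
    static String toJson(long id, String name) {
        // A real serializer also escapes quotes and other special characters
        return String.format("{\"id\":%d,\"name\":\"%s\"}", id, name);
    }

    public static void main(String[] args) {
        System.out.println(toJson(1L, "Spring Boot"));
        // → {"id":1,"name":"Spring Boot"}
    }
}
```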

4.7 Book consumer service

@Service
public class BookConsumerService {

    @Value("${kafka.topic.my-topic}")
    private String myTopic;

    @Value("${kafka.topic.my-topic2}")
    private String myTopic2;

    private final Logger logger = LoggerFactory.getLogger(BookConsumerService.class);
    private final ObjectMapper objectMapper = new ObjectMapper();

    @KafkaListener(topics = {"${kafka.topic.my-topic}"}, groupId = "group1")
    public void consumeMessage(ConsumerRecord<String, String> bookConsumerRecord) {
        try {
            Book book = objectMapper.readValue(bookConsumerRecord.value(), Book.class);
            logger.info("Consumer read a message from topic: {} partition: {} -> {}",
                    bookConsumerRecord.topic(), bookConsumerRecord.partition(), book);
        } catch (JsonProcessingException e) {
            e.printStackTrace();
        }
    }

    @KafkaListener(topics = {"${kafka.topic.my-topic2}"}, groupId = "group2")
    public void consumeMessage2(Book book, ConsumerRecord<String, String> bookConsumerRecord)
            throws JsonProcessingException {
        Book value = objectMapper.readValue(bookConsumerRecord.value(), Book.class);
        logger.info("Consumer read a message from topic: {} partition: {} -> {}",
                bookConsumerRecord.topic(), bookConsumerRecord.partition(), value);
        logger.info("Consumer read a message from {} -> {}", myTopic2, book);
    }
}

The overall project directory layout is shown in a screenshot (omitted here).

4.8 Successful startup

If the application starts cleanly, the usual Spring Boot startup log appears in the console. (screenshot omitted)

4.9 Browser access

Visit http://localhost:9090/book/send?name=test in a browser to trigger a send. (screenshot omitted)

4.10 Console output

The producer and consumer log lines defined above appear in the console. (screenshot omitted)

With that, a simple Spring Boot application built on Kafka is complete; deeper study and more advanced uses of Kafka are left for follow-up work.
