
Flink CDC application notes

SQL Server

1. The db history topic or its content is fully or partially missing. Please check database history topic configuration and re-execute the snapshot.

Ran into the following problem. After several attempts, it turned out that the database name's letter case has to be consistent with the database as it exists on SQL Server.

Caused by: io.debezium.DebeziumException: The db history topic or its content is fully or partially missing. Please check database history topic configuration and re-execute the snapshot.
    at io.debezium.relational.HistorizedRelationalDatabaseSchema.recover(HistorizedRelationalDatabaseSchema.java:59) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at io.debezium.schema.HistorizedDatabaseSchema.recover(HistorizedDatabaseSchema.java:38) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at org.apache.flink.cdc.connectors.sqlserver.source.reader.fetch.SqlServerSourceFetchTaskContext.validateAndLoadDatabaseHistory(SqlServerSourceFetchTaskContext.java:187) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at org.apache.flink.cdc.connectors.sqlserver.source.reader.fetch.SqlServerSourceFetchTaskContext.configure(SqlServerSourceFetchTaskContext.java:130) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at org.apache.flink.cdc.connectors.base.source.reader.external.IncrementalSourceStreamFetcher.submitTask(IncrementalSourceStreamFetcher.java:84) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceSplitReader.submitStreamSplit(IncrementalSourceSplitReader.java:261) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceSplitReader.pollSplitRecords(IncrementalSourceSplitReader.java:153) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceSplitReader.fetch(IncrementalSourceSplitReader.java:98) ~[flink-sql-connector-sqlserver-cdc-3.2.0.jar:3.2.0]
    at org.apache.flink.connector.base.source.reader.fetcher.FetchTask.run(FetchTask.java:58) ~[flink-connector-files-1.20.0.jar:1.20.0]
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.runOnce(SplitFetcher.java:165) ~[flink-connector-files-1.20.0.jar:1.20.0]
    at org.apache.flink.connector.base.source.reader.fetcher.SplitFetcher.run(SplitFetcher.java:117) ~[flink-connector-files-1.20.0.jar:1.20.0]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
    at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
CREATE TABLE Member_Extend (
    ID INT,
    MemberID INT,
    PRIMARY KEY (ID) NOT ENFORCED
) WITH (
    'connector' = 'sqlserver-cdc',
    'hostname' = '192.168.1.3',
    'port' = '1433',
    'username' = 'test',
    'password' = 'test',
    'database-name' = 'CrmExtend',
    'table-name' = 'dbo.Member_Extend'
);
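In other words, the value of 'database-name' must use the same letter case as the database on SQL Server (here 'CrmExtend', not 'crmextend'). A quick way to double-check the exact spelling, assuming you can query the server directly, is the sketch below:

-- Run against SQL Server itself to see the exact, case-sensitive database names,
-- then copy the name verbatim into the 'database-name' option of the CDC source table.
SELECT name FROM sys.databases;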

Safely stopping and restarting a job


show jobs
Flink SQL> show jobs;
+----------------------------------+----------+----------+-------------------------+
|                           job id | job name |   status |              start time |
+----------------------------------+----------+----------+-------------------------+
| ce5a0e938563cf52317c5b9055ad102f |  testjob |  RUNNING | 2024-11-15T03:38:47.919 |
+----------------------------------+----------+----------+-------------------------+
4 rows in set

SET state.checkpoints.dir='s3://flink/cdc-1.20/savepoints';

Flink SQL> stop job 'ce5a0e938563cf52317c5b9055ad102f' with savepoint;
+--------------------------------------------------------------+
|                                               savepoint path |
+--------------------------------------------------------------+
| s3://flink/cdc-1.20/savepoints/savepoint-ce5a0e-2935055bb307 |
+--------------------------------------------------------------+
1 row in set

SET execution.savepoint.path='s3://flink/cdc-1.20/savepoints/savepoint-ce5a0e-2935055bb307';
set 'execution.savepoint.ignore-unclaimed-state' = 'true';

Then re-run the original SQL:

insert into flink_user select * from user;
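Putting the steps together, the whole stop-and-resume cycle looks roughly like the sketch below. This is only a sketch: the job id, savepoint path, and the flink_user/user tables are the examples from this article, and the final RESET is an optional extra so that later jobs submitted in the same SQL client session do not also restore from this savepoint.

-- 1. Find the running job and stop it while taking a savepoint
show jobs;
SET state.checkpoints.dir='s3://flink/cdc-1.20/savepoints';
stop job 'ce5a0e938563cf52317c5b9055ad102f' with savepoint;

-- 2. Point the client at the savepoint that was just written and resubmit the same SQL
SET execution.savepoint.path='s3://flink/cdc-1.20/savepoints/savepoint-ce5a0e-2935055bb307';
set 'execution.savepoint.ignore-unclaimed-state' = 'true';
insert into flink_user select * from user;

-- 3. Optional: clear the savepoint path so later jobs start fresh instead of restoring
RESET execution.savepoint.path;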