upsert-kafka
Flink 1.12 introduced a new upsert connector: the upsert-kafka connector. It extends the existing Kafka connector so that it can work in upsert mode. The upsert-kafka connector can be used both as a source and as a sink, and it provides the same basic capabilities and persistence guarantees as the existing Kafka connector. This means it can read compacted topics or write out aggregation results, using the key of each Kafka record to decide how to handle it: a record with a non-null value is treated as an insert or update of that key, while a record with a null value is treated as a delete (tombstone). For more details, see the Apache Flink 1.12 documentation on the Upsert Kafka SQL Connector, as well as articles on building real-time data warehouses and reading/writing Kafka data in upsert fashion. [1][2][3]
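As a minimal sketch of how this is typically used (the table names, topic, and broker address below are hypothetical, and a `datagen` table stands in for a real source), an aggregation result can be written to an upsert-kafka sink via the Java Table API like this:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class UpsertKafkaSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Hypothetical source of page views; datagen just generates random rows for the sketch.
        tEnv.executeSql(
                "CREATE TABLE pageviews (" +
                "  region STRING," +
                "  user_id BIGINT" +
                ") WITH (" +
                "  'connector' = 'datagen'," +
                "  'rows-per-second' = '10'" +
                ")");

        // Sink backed by upsert-kafka. The PRIMARY KEY is mandatory: it becomes the Kafka
        // message key, and a record with a null value is written as a delete tombstone.
        tEnv.executeSql(
                "CREATE TABLE pageviews_per_region (" +
                "  region STRING," +
                "  pv BIGINT," +
                "  PRIMARY KEY (region) NOT ENFORCED" +
                ") WITH (" +
                "  'connector' = 'upsert-kafka'," +
                "  'topic' = 'pageviews_per_region'," +               // hypothetical topic
                "  'properties.bootstrap.servers' = 'localhost:9092'," +
                "  'key.format' = 'json'," +
                "  'value.format' = 'json'" +
                ")");

        // Each updated count per region overwrites the previous value for that key in Kafka.
        tEnv.executeSql(
                "INSERT INTO pageviews_per_region " +
                "SELECT region, COUNT(*) AS pv FROM pageviews GROUP BY region");
    }
}
```

The same `CREATE TABLE` DDL can also be submitted directly from the Flink SQL client; the Java wrapper here only serves to make the example self-contained.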
#### References
- *1* [flink sql clinet 实战：upsert kafka connector -- flink-1.12](https://blog.csdn.net/wuxintdrh/article/details/127434263)
- *2* *3* [Flink1.12 Kafka Upsert 笔记](https://blog.csdn.net/u010834071/article/details/113057879)