java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List
I hit this exception in my very first Flink program; I'm writing it down here to share with everyone.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
org.apache.flink.runtime.client.JobExecutionException: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V
    at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:623)
    at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:123)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1511)
    at Streaming.ReadFromKafka.main(ReadFromKafka.java:41)
Caused by: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerCallBridge.assignPartitions(KafkaConsumerCallBridge.java:42)
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread.reassignPartitions(KafkaConsumerThread.java:405)
    at org.apache.flink.streaming.connectors.kafka.internal.KafkaConsumerThread.run(KafkaConsumerThread.java:243)
If you are seeing this error, you are most likely also writing an introductory Flink program that reads from or writes to Kafka. There is very little material about it online; after a fair amount of searching I did find something useful. I hope this saves you the detour.
【Heads-up】: this is a major beginner-level pitfall.
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.9.0.1</version>
</dependency>
The kafka-clients version must be exactly this one. If you use a newer client such as 1.0.0 instead, the job fails with:

org.apache.flink.runtime.client.JobExecutionException: java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.assign(Ljava/util/List;)V

The root cause is that starting with Kafka 0.10, KafkaConsumer.assign() takes a Collection<TopicPartition> instead of a List<TopicPartition>, so the assign(Ljava/util/List;)V descriptor that the 0.9 connector was compiled against no longer exists in 1.0.0. For reference, the Flink client dependency used here is:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.6.0</version>
</dependency>
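The error above comes from how the JVM links method calls: a call site is resolved by its exact method descriptor, so assign(Ljava/util/List;)V is not satisfied by assign(Ljava/util/Collection;)V even though every List is a Collection. A rough analogy of this exact-descriptor lookup can be shown with plain reflection; the snippet below uses JDK classes (ArrayList instead of KafkaConsumer) so it runs stand-alone without Kafka on the classpath:

```java
import java.util.Collection;
import java.util.List;

public class SignatureCheck {
    /** True if clazz has a public method `name` taking exactly paramType. */
    static boolean hasExactMethod(Class<?> clazz, String name, Class<?> paramType) {
        try {
            clazz.getMethod(name, paramType);
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // ArrayList declares addAll(Collection), not addAll(List).
        // The exact lookup succeeds for Collection and fails for List --
        // analogous to the 0.9 connector calling assign(List) against a
        // kafka-clients jar that only declares assign(Collection).
        System.out.println(hasExactMethod(java.util.ArrayList.class, "addAll", Collection.class)); // true
        System.out.println(hasExactMethod(java.util.ArrayList.class, "addAll", List.class));       // false
    }
}
```

The same mismatch at JVM link time is what surfaces as NoSuchMethodError instead of a compile error: the code compiled fine against kafka-clients 0.9 and only broke when a different jar was on the runtime classpath.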
The full code is as follows:
package Streaming;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;

import java.util.Properties;

/**
 * Created with IntelliJ IDEA.
 * User: @ziyu freedomziyua@gmail.com
 * Date: 2018-09-10
 * Time: 11:25
 * Description: kafka.Streaming.ReadFromKafka
 */
public class ReadFromKafka {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "192.168.2.41:9092");
        properties.setProperty("group.id", "test");

        DataStream<String> stream = env
                .addSource(new FlinkKafkaConsumer09<>("flink-demo", new SimpleStringSchema(), properties));

        stream.map(new MapFunction<String, String>() {
            private static final long serialVersionUID = -6867736771747690202L;

            @Override
            public String map(String value) throws Exception {
                return "Stream Value: " + value;
            }
        }).print();

        try {
            env.execute();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
To run it, once your environment is set up, you only need to add the Flink Kafka connector dependencies to your pom:
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <flink.version>1.6.0</flink.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-java</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-java_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.9.0.1</version>
    </dependency>
    <!-- Flink Connector Kafka | exclude Kafka implementation to use MapR -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
        <version>${flink.version}</version>
    </dependency>
</dependencies>
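Because this error is caused by the wrong kafka-clients jar ending up on the runtime classpath, it is worth verifying which version Maven actually resolves. The standard Maven dependency-tree goal can do this (the filter pattern below is just one way to narrow the output):

```shell
# Show which kafka-clients version is resolved, and which dependency pulls it in
mvn dependency:tree -Dincludes=org.apache.kafka:kafka-clients
```

If the tree shows a version other than 0.9.0.1, another dependency is overriding it and you need to pin or exclude it.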
【Running】
1. Create the flink-demo topic in Kafka.
2. Start a Kafka console producer and consumer and check that messages flow between them.
3. If both of the above work, start Flink.
4. Run the program locally and watch the output.
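Steps 1 and 2 can be done with the console tools shipped in the Kafka distribution. The commands below are a sketch for a Kafka 0.9-era installation; the ZooKeeper address (localhost:2181) is an assumption, while the broker address matches the bootstrap.servers used in the code above:

```shell
# Step 1: create the flink-demo topic (0.9 tooling registers topics via ZooKeeper)
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --replication-factor 1 --partitions 1 --topic flink-demo

# Step 2a: console producer -- type messages to send to the topic
bin/kafka-console-producer.sh --broker-list 192.168.2.41:9092 --topic flink-demo

# Step 2b: console consumer in another terminal -- verify the messages arrive
bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic flink-demo
```

Once a message typed into the producer shows up in the console consumer, the broker side is healthy and any remaining problem is in the Flink job itself.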
The above is a fairly thorny problem I ran into while first learning Flink; I hope it saves others the detour.