Invalid value earliest for configuration auto.commit.interval.ms: Not a number of type INT, and Failed to construct kafka consumer

Kafka consumer API operations

The code is as follows:

package com.qf.kafka.day1;

import org.apache.kafka.clients.consumer.*;
import scala.Int;

import java.util.Collections;
import java.util.Properties;

public class KafkaConsumerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "hadoop01:9092,hadoop02:9092,hadoop03:9092");
        props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "g2");
        props.setProperty(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "earliest");
        // Auto-commit of offsets is enabled by default (enable.auto.commit = true);
        // when it is set to false, the consumer no longer commits offsets automatically
//        props.setProperty(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        // This is the line the error message points to
        KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);
        consumer.subscribe(Collections.singletonList("hadoop"));
        while (true) {
            // the consumer blocks at Kafka for up to one second
            ConsumerRecords<String, String> consumerRecord = consumer.poll(1000);
            for (ConsumerRecord<String, String> record : consumerRecord) {
                System.out.println(String.format("key=%s value=%s topic=%s partition=%d offset=%d timestamp=%d",
                        record.key(), record.value(), record.topic(), record.partition(), record.offset(), record.timestamp()));
            }
        }
    }
}

(1) The error message is as follows:

"C:\Program Files\Java\jdk1.8.0_241\bin\java.exe" "-javaagent:D:\大数据\Big--Data软件\IDEA\IntelliJ IDEA 2021.1.1\lib\idea_rt.jar=11555:D:\大数据\Big--Data软件\IDEA\IntelliJ IDEA 2021.1.1\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_241\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\rt.jar;E:\IDEA__workstation\kafka-2102\target\classes;F:\apache-maven-3.6.3\repository\org\apache\kafka\kafka_2.11\2.3.0\kafka_2.11-2.3.0.jar;F:\apache-maven-3.6.3\repository\org\apache\kafka\kafka-clients\2.3.0\kafka-clients-2.3.0.jar;F:\apache-maven-3.6.3\repository\com\github\luben\zstd-jni\1.4.0-1\zstd-jni-1.4.0-1.jar;F:\apache-maven-3.6.3\repository\org\lz4\lz4-java\1.6.0\lz4-java-1.6.0.jar;F:\apache-maven-3.6.3\repository\org\xerial\snappy\snappy-java\1.1.7.3\snappy-java-1.1.7.3.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\core\jackson-databind\2.9.9\jackson-databind-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\core\jackson-annotations\2.9.0\jackson-annotations-2.9.0.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\core\jackson-core\2.9.9\jackson-core-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\module\jackson-module-scala_2.11\2.9.9\jackson-module-scala_2.11-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\module\jackson-module-paranamer\2.9.9\jackson-module-paranamer-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\thoughtworks\paranamer\paranamer\2.8\paranamer-2.8.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\dataformat\jackson-dataformat-csv\2.9.9\jackson-dataformat-csv-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\datatype\jackson-datatype-jdk8\2.9.9\jackson-datatype-jdk8-2.9.9.jar;F:\apache-maven-3.6.3\repository\net\sf\jopt-simple\jopt-simple\5.0.4\jopt-simple-5.0.4.jar;F:\apache-maven-3.6.3\repository\com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar;F:\apache-maven-3.6.3\repository\org\scala-lang\scala-library\2.11.12\scala-library-2.11.12.jar;F:\apache-maven-3.6.3\repository\org\scala-lang\scala-reflect\2.11.12\scala-reflect-2.11.12.jar;F:\apache-maven-3.6.3\repository\com\typesafe\scala-logging\scala-logging_2.11\3.9.0\scala-logging_2.11-3.9.0.jar;F:\apache-maven-3.6.3\repository\org\slf4j\slf4j-api\1.7.26\slf4j-api-1.7.26.jar;F:\apache-maven-3.6.3\repo
sitory\com\101tec\zkclient\0.11\zkclient-0.11.jar;F:\apache-maven-3.6.3\repository\org\apache\zookeeper\zookeeper\3.4.14\zookeeper-3.4.14.jar;F:\apache-maven-3.6.3\repository\com\github\spotbugs\spotbugs-annotations\3.1.9\spotbugs-annotations-3.1.9.jar;F:\apache-maven-3.6.3\repository\com\google\code\findbugs\jsr305\3.0.2\jsr305-3.0.2.jar;F:\apache-maven-3.6.3\repository\org\apache\yetus\audience-annotations\0.5.0\audience-annotations-0.5.0.jar" com.qf.kafka.day1.KafkaConsumerDemo
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" org.apache.kafka.common.config.ConfigException: Invalid value earliest for configuration auto.commit.interval.ms: Not a number of type INT
	at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:718)
	at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:473)
	at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:466)
	at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:108)
	at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:129)
	at org.apache.kafka.clients.consumer.ConsumerConfig.<init>(ConsumerConfig.java:544)
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:664)
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:644)
	at com.qf.kafka.day1.KafkaConsumerDemo.main(KafkaConsumerDemo.java:23)

Process finished with exit code 1

Solution

This was a careless typo: the value "earliest" was set on the wrong configuration key. Change

props.setProperty(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "earliest");

to:

props.setProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"); // auto.offset.reset defaults to latest; it decides where a consumer group with no committed offset should start consuming
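The two properties are easy to mix up: auto.offset.reset takes a String (earliest, latest, or none) and only applies when the group has no committed offset, while auto.commit.interval.ms takes an INT number of milliseconds controlling how often offsets are auto-committed. Below is a minimal sketch setting both with values of the correct type; the class name OffsetConfigSketch and the 5000 ms interval are only illustrative, and the broker addresses and group id are carried over from the code above.

import org.apache.kafka.clients.consumer.ConsumerConfig;

import java.util.Properties;

public class OffsetConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "hadoop01:9092,hadoop02:9092,hadoop03:9092");
        props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "g2");
        // String value: where a group with no committed offset starts (earliest / latest / none)
        props.setProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        // INT value: auto-commit period in milliseconds (5000 is only an example value)
        props.setProperty(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "5000");
        System.out.println(props);
    }
}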

(2) Failed to construct kafka consumer

The error message is as follows:

"C:\Program Files\Java\jdk1.8.0_241\bin\java.exe" "-javaagent:D:\大数据\Big--Data软件\IDEA\IntelliJ IDEA 2021.1.1\lib\idea_rt.jar=9495:D:\大数据\Big--Data软件\IDEA\IntelliJ IDEA 2021.1.1\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_241\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_241\jre\lib\rt.jar;E:\IDEA__workstation\kafka-2102\target\classes;F:\apache-maven-3.6.3\repository\org\apache\kafka\kafka_2.11\2.3.0\kafka_2.11-2.3.0.jar;F:\apache-maven-3.6.3\repository\org\apache\kafka\kafka-clients\2.3.0\kafka-clients-2.3.0.jar;F:\apache-maven-3.6.3\repository\com\github\luben\zstd-jni\1.4.0-1\zstd-jni-1.4.0-1.jar;F:\apache-maven-3.6.3\repository\org\lz4\lz4-java\1.6.0\lz4-java-1.6.0.jar;F:\apache-maven-3.6.3\repository\org\xerial\snappy\snappy-java\1.1.7.3\snappy-java-1.1.7.3.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\core\jackson-databind\2.9.9\jackson-databind-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\core\jackson-annotations\2.9.0\jackson-annotations-2.9.0.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\core\jackson-core\2.9.9\jackson-core-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\module\jackson-module-scala_2.11\2.9.9\jackson-module-scala_2.11-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\module\jackson-module-paranamer\2.9.9\jackson-module-paranamer-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\thoughtworks\paranamer\paranamer\2.8\paranamer-2.8.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\dataformat\jackson-dataformat-csv\2.9.9\jackson-dataformat-csv-2.9.9.jar;F:\apache-maven-3.6.3\repository\com\fasterxml\jackson\datatype\jackson-datatype-jdk8\2.9.9\jackson-datatype-jdk8-2.9.9.jar;F:\apache-maven-3.6.3\repository\net\sf\jopt-simple\jopt-simple\5.0.4\jopt-simple-5.0.4.jar;F:\apache-maven-3.6.3\repository\com\yammer\metrics\metrics-core\2.2.0\metrics-core-2.2.0.jar;F:\apache-maven-3.6.3\repository\org\scala-lang\scala-library\2.11.12\scala-library-2.11.12.jar;F:\apache-maven-3.6.3\repository\org\scala-lang\scala-reflect\2.11.12\scala-reflect-2.11.12.jar;F:\apache-maven-3.6.3\repository\com\typesafe\scala-logging\scala-logging_2.11\3.9.0\scala-logging_2.11-3.9.0.jar;F:\apache-maven-3.6.3\repository\org\slf4j\slf4j-api\1.7.26\slf4j-api-1.7.26.jar;F:\apache-maven-3.6.3\repos
itory\com\101tec\zkclient\0.11\zkclient-0.11.jar;F:\apache-maven-3.6.3\repository\org\apache\zookeeper\zookeeper\3.4.14\zookeeper-3.4.14.jar;F:\apache-maven-3.6.3\repository\com\github\spotbugs\spotbugs-annotations\3.1.9\spotbugs-annotations-3.1.9.jar;F:\apache-maven-3.6.3\repository\com\google\code\findbugs\jsr305\3.0.2\jsr305-3.0.2.jar;F:\apache-maven-3.6.3\repository\org\apache\yetus\audience-annotations\0.5.0\audience-annotations-0.5.0.jar" com.qf.kafka.day1.KafkaConsumerDemo
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:827)
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:664)
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:644)
	at com.qf.kafka.day1.KafkaConsumerDemo.main(KafkaConsumerDemo.java:25)
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.serialization.StringSerializer is not an instance of org.apache.kafka.common.serialization.Deserializer
	at org.apache.kafka.common.config.AbstractConfig.getConfiguredInstance(AbstractConfig.java:372)
	at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:713)
	... 3 more

Process finished with exit code 1

This happens when the wrong class is configured: a serializer was supplied where a deserializer is required. Using the code above as the example, change

props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");

to:

props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
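Putting both corrections together, a fixed version of the consumer could look like the sketch below. The broker addresses, the group id g2, and the topic hadoop are taken from the original code; the class name KafkaConsumerDemoFixed is just a placeholder, and poll(Duration) is used in place of the deprecated poll(long) overload (both exist in kafka-clients 2.3.0), which is an optional change.

package com.qf.kafka.day1;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class KafkaConsumerDemoFixed {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "hadoop01:9092,hadoop02:9092,hadoop03:9092");
        // Deserializer classes, not serializers
        props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "g2");
        // Start from the earliest offset when the group has no committed offset
        props.setProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("hadoop"));
        while (true) {
            // Block for up to one second waiting for records
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(String.format("key=%s value=%s topic=%s partition=%d offset=%d timestamp=%d",
                        record.key(), record.value(), record.topic(), record.partition(), record.offset(), record.timestamp()));
            }
        }
    }
}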
