A Java Spark job fails when it connects to the cluster. The complete error output is as follows:

20/07/30 11:04:11 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, 192.168.0.102, executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2251)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
20/07/30 11:04:11 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on 192.168.0.102, executor 0: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 1]
20/07/30 11:04:11 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, 192.168.0.102, executor 0, partition 0, PROCESS_LOCAL, 7452 bytes)
20/07/30 11:04:11 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 3, 192.168.0.102, executor 0, partition 1, PROCESS_LOCAL, 7452 bytes)
20/07/30 11:04:11 INFO TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on 192.168.0.102, executor 0: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 2]
20/07/30 11:04:11 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 4, 192.168.0.102, executor 0, partition 0, PROCESS_LOCAL, 7452 bytes)
20/07/30 11:04:11 INFO TaskSetManager: Lost task 1.1 in stage 0.0 (TID 3) on 192.168.0.102, executor 0: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 3]
20/07/30 11:04:11 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 5, 192.168.0.102, executor 0, partition 1, PROCESS_LOCAL, 7452 bytes)
20/07/30 11:04:11 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4) on 192.168.0.102, executor 0: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 4]
20/07/30 11:04:11 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 6, 192.168.0.102, executor 0, partition 0, PROCESS_LOCAL, 7452 bytes)
20/07/30 11:04:11 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 5) on 192.168.0.102, executor 0: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 5]
20/07/30 11:04:11 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 7, 192.168.0.102, executor 0, partition 1, PROCESS_LOCAL, 7452 bytes)
20/07/30 11:04:11 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 6) on 192.168.0.102, executor 0: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 6]
20/07/30 11:04:11 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
20/07/30 11:04:11 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/07/30 11:04:11 INFO TaskSetManager: Lost task 1.3 in stage 0.0 (TID 7) on 192.168.0.102, executor 0: java.lang.ClassCastException (cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD) [duplicate 7]
20/07/30 11:04:11 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/07/30 11:04:11 INFO TaskSchedulerImpl: Cancelling stage 0
20/07/30 11:04:11 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage cancelled
20/07/30 11:04:11 INFO DAGScheduler: ResultStage 0 (count at TestSparkJava.java:21) failed in 2.319 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, 192.168.0.102, executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2251)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
20/07/30 11:04:11 INFO DAGScheduler: Job 0 failed: count at TestSparkJava.java:21, took 2.376507 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, 192.168.0.102, executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2251)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2023)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1972)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1971)
    at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
    at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1971)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:950)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:950)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:950)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2203)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2152)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2141)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:752)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2093)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2114)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2133)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2158)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1227)
    at org.apache.spark.api.java.JavaRDDLike.count(JavaRDDLike.scala:455)
    at org.apache.spark.api.java.JavaRDDLike.count$(JavaRDDLike.scala:455)
    at org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:45)
    at TestSparkJava.main(TestSparkJava.java:21)
Caused by: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
    at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2133)
    at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1305)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2251)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
    at org.apache.spark.scheduler.Task.run(Task.scala:127)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:444)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:447)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:748)
20/07/30 11:04:11 INFO SparkContext: Invoking stop() from shutdown hook
20/07/30 11:04:11 INFO SparkUI: Stopped Spark web UI at http://Desktop:4040
20/07/30 11:04:11 INFO StandaloneSchedulerBackend: Shutting down all executors
20/07/30 11:04:11 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
20/07/30 11:04:11 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/07/30 11:04:11 INFO MemoryStore: MemoryStore cleared
20/07/30 11:04:11 INFO BlockManager: BlockManager stopped
20/07/30 11:04:11 INFO BlockManagerMaster: BlockManagerMaster stopped
20/07/30 11:04:11 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/07/30 11:04:11 INFO SparkContext: Successfully stopped SparkContext
20/07/30 11:04:11 INFO ShutdownHookManager: Shutdown hook called
20/07/30 11:04:11 INFO ShutdownHookManager: Deleting directory /tmp/spark-639e7f1e-b13f-4996-938c-ae65fb283f2e

What makes this error peculiar:

Everything works in local mode; the failure only appears when the job is submitted to a real cluster. In local mode the driver and the executors share a single JVM and classpath, but cluster executors run in separate JVMs that do not have the application's classes on their classpath. The serialized function therefore cannot be re-linked during deserialization and arrives as a bare java.lang.invoke.SerializedLambda, which cannot be cast to scala.Function3.

The fix:

Add setJars to the SparkConf. The path passed to setJars is the jar produced by mvn package, as sketched below.
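In isolation, the change is just the setJars call on the SparkConf. A minimal sketch (the jar path here is illustrative; use whatever mvn package wrote into your project's target/ directory):

SparkConf conf = new SparkConf()
        .setMaster("spark://Desktop:7077")
        // Ship the application jar to the executors; without it they cannot
        // load your classes, and deserializing the job's functions fails as above.
        .setJars(new String[]{"target/testSpark-1.0-SNAPSHOT.jar"})  // illustrative path
        .setAppName("TestSpark");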

The complete, working program:

import org.apache.spark.api.java.*;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function;

public class TestSparkJava {
    public static void main(String[] args) {
        String logFile = "/home/appleyuchi/IdeaProjects/SpringBoot2EnterPrise/第4章-SpringBoot的数据访问/testSpark/ab.txt";
        SparkConf conf = new SparkConf()
                .setMaster("spark://Desktop:7077")
                // Ship the jar built by mvn package to the executors:
                .setJars(new String[]{"/home/appleyuchi/IdeaProjects/SpringBoot2EnterPrise/第4章-SpringBoot的数据访问/testSpark/target/testSpark-1.0-SNAPSHOT.jar"})
                .setAppName("TestSpark");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read the input once and cache it, since it is filtered twice.
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("0"); }
        }).count();
        long numBs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("1"); }
        }).count();

        System.out.println("Lines with 0: " + numAs + ", lines with 1: " + numBs);
        sc.stop();
    }
}
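Hard-coding the jar path is brittle. A sketch of an alternative, assuming the program is launched from the packaged jar itself (e.g. java -cp target/testSpark-1.0-SNAPSHOT.jar TestSparkJava): JavaSparkContext.jarOfClass returns the jar a given class was loaded from, so the path does not need to be spelled out:

// Resolve the jar containing this class and ship it to the executors.
// This only works when the class really was loaded from a jar, not from
// an IDE output directory of loose .class files.
SparkConf conf = new SparkConf()
        .setMaster("spark://Desktop:7077")
        .setJars(JavaSparkContext.jarOfClass(TestSparkJava.class))
        .setAppName("TestSpark");

For the same reason, jobs launched through spark-submit normally never hit this error: the submitted jar is distributed to the executors automatically, so the problem mostly bites when running the driver straight from an IDE.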
