Environment: CDH 6.3.2
Spark version: 2.4.0
The spark-sql wrapper script:
```shell
#!/bin/bash

export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf
# Resolve any chain of symlinks so BIN_DIR ends up as the real directory
# holding this script (a relative link target is resolved against the
# directory of the previous hop).
SOURCE="${BASH_SOURCE[0]}"
BIN_DIR="$( dirname "$SOURCE" )"
while [ -h "$SOURCE" ]
do
  SOURCE="$(readlink "$SOURCE")"
  [[ $SOURCE != /* ]] && SOURCE="$BIN_DIR/$SOURCE"
  BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
done
BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
LIB_DIR=$BIN_DIR/../lib
export HADOOP_HOME=$LIB_DIR/hadoop

# Autodetect JAVA_HOME if not defined
. "$LIB_DIR/bigtop-utils/bigtop-detect-javahome"

# Run the Hive-compatible SQL CLI entry point through spark-submit
exec "$LIB_DIR/spark2/bin/spark-submit" --class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver "$@"
```
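Note that the stock CDH 6 Spark build does not ship the spark-sql CLI (Cloudera omits the Thrift-server module that contains SparkSQLCLIDriver), which is why a wrapper like this is typically paired with a separately installed Apache Spark 2.4.0. Before running anything, it is worth confirming which spark-submit the wrapper actually resolves to; a minimal sketch, assuming the wrapper is on the PATH and using an illustrative jar path:

```shell
# Show where the spark-sql wrapper really lives after symlink resolution.
readlink -f "$(command -v spark-sql)"

# Confirm the jars next to that spark-submit contain the CLI driver class;
# the jar path below is illustrative -- point it at your lib/spark2/jars.
unzip -l lib/spark2/jars/spark-hive-thriftserver_2.11-2.4.0.jar \
  | grep SparkSQLCLIDriver
```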

Running the spark-sql command on the server:
```shell
spark-sql --master yarn --driver-memory 4G --executor-memory 2G --driver-cores 1 --executor-cores 2 -d dt=20220727 -f /opt/sparksql/dwd/xxx.sql
```
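Here `-d` (short for `--define`) sets a substitution variable that the SQL file can reference as `${dt}` (the explicit `${hivevar:dt}` form also works), and `-f` points at the file to execute. The real xxx.sql is not shown in the post; a hypothetical example of how such a file would consume the variable:

```shell
# Hypothetical xxx.sql, for illustration only: the value supplied with
# `-d dt=20220727` is substituted into ${dt} before each statement runs.
cat <<'EOF' > /opt/sparksql/dwd/xxx.sql
INSERT OVERWRITE TABLE dwd.example_table PARTITION (dt='${dt}')
SELECT id, name
FROM ods.example_table
WHERE dt='${dt}';
EOF
```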

It fails with the following error:
```java
[root@cdh01 ~]# spark-sql --master yarn --driver-memory 4G --executor-memory 2G --driver-cores 1 --executor-cores 2 -d dt=20220727 -f /opt/sparksql/dwd/xxx.sql
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.checked.expressions does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.no.partition.filter does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vector.serde.deserialize does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.orderby.no.limit does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.adaptor.usage.mode does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vectorized.input.format does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.input.format.excludes does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.bucketing does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.type.safety does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.cartesian.product does not exist
22/07/28 17:00:47 INFO hive.metastore: Trying to connect to metastore with URI thrift://cdh01:9083
22/07/28 17:00:47 INFO hive.metastore: Connected to metastore.
22/07/28 17:00:48 INFO session.SessionState: Created local directory: /tmp/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e_resources
22/07/28 17:00:48 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e
22/07/28 17:00:48 INFO session.SessionState: Created local directory: /tmp/root/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e
22/07/28 17:00:48 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e/_tmp_space.db
22/07/28 17:00:48 INFO spark.SparkContext: Running Spark version 2.4.0
22/07/28 17:00:48 INFO spark.SparkContext: Submitted application: SparkSQL::192.168.1.20
22/07/28 17:00:48 INFO spark.SecurityManager: Changing view acls to: root
22/07/28 17:00:48 INFO spark.SecurityManager: Changing modify acls to: root
22/07/28 17:00:48 INFO spark.SecurityManager: Changing view acls groups to:
22/07/28 17:00:48 INFO spark.SecurityManager: Changing modify acls groups to:
22/07/28 17:00:48 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/07/28 17:00:48 INFO util.Utils: Successfully started service 'sparkDriver' on port 37899.
22/07/28 17:00:48 INFO spark.SparkEnv: Registering MapOutputTracker
22/07/28 17:00:48 INFO spark.SparkEnv: Registering BlockManagerMaster
22/07/28 17:00:48 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/07/28 17:00:48 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/07/28 17:00:48 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-3e62cbd5-f9c7-43e4-8acb-e1d10a30da9b
22/07/28 17:00:48 INFO memory.MemoryStore: MemoryStore started with capacity 2004.6 MB
22/07/28 17:00:48 INFO spark.SparkEnv: Registering OutputCommitCoordinator
22/07/28 17:00:48 INFO util.log: Logging initialized @2977ms
22/07/28 17:00:48 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
22/07/28 17:00:48 INFO server.Server: Started @3042ms
22/07/28 17:00:48 INFO server.AbstractConnector: Started ServerConnector@30364216{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
22/07/28 17:00:48 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a734c04{/jobs,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2127e66e{/jobs/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1229a2b7{/jobs/job,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51c959a4{/jobs/job/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4fc3c165{/stages,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10a0fe30{/stages/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b6860f9{/stages/stage,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@262816a8{/stages/stage/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1effd53c{/stages/pool,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@46c269e0{/stages/pool/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6920614{/storage,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6069dd38{/storage/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fa23c{/storage/rdd,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@558756be{/storage/rdd/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@433348bc{/environment,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6d1dcdff{/environment/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@102ecc22{/executors,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7ff35a3f{/executors/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26dc9bd5{/executors/threadDump,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@252dc8c4{/executors/threadDump/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@43045f9f{/static,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65b97f47{/,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@255eaa6b{/api,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@43e9089{/jobs/job/kill,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c5dbdf8{/stages/stage/kill,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://cdh01:4040
22/07/28 17:00:49 INFO util.Utils: Using initial executors = 0, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
22/07/28 17:00:49 INFO yarn.Client: Requesting a new application from cluster with 5 NodeManagers
22/07/28 17:00:49 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (38357 MB per container)
22/07/28 17:00:49 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
22/07/28 17:00:49 INFO yarn.Client: Setting up container launch context for our AM
22/07/28 17:00:49 INFO yarn.Client: Setting up the launch environment for our AM container
22/07/28 17:00:49 INFO yarn.Client: Preparing resources for our AM container
22/07/28 17:00:49 INFO yarn.Client: Uploading resource file:/tmp/spark-e18790ed-7bf0-494c-a1b3-1b1e6c45ab66/__spark_conf__8487614089148696127.zip -> hdfs://cdh01:8020/user/root/.sparkStaging/application_1658995826987_0005/__spark_conf__.zip
22/07/28 17:00:49 INFO spark.SecurityManager: Changing view acls to: root
22/07/28 17:00:49 INFO spark.SecurityManager: Changing modify acls to: root
22/07/28 17:00:49 INFO spark.SecurityManager: Changing view acls groups to:
22/07/28 17:00:49 INFO spark.SecurityManager: Changing modify acls groups to:
22/07/28 17:00:49 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
22/07/28 17:00:50 INFO yarn.Client: Submitting application application_1658995826987_0005 to ResourceManager
22/07/28 17:00:50 INFO impl.YarnClientImpl: Submitted application application_1658995826987_0005
22/07/28 17:00:50 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1658995826987_0005 and attemptId None
22/07/28 17:00:51 INFO yarn.Client: Application report for application_1658995826987_0005 (state: ACCEPTED)
22/07/28 17:00:51 INFO yarn.Client:
     client token: N/A
     diagnostics: AM container is launched, waiting for AM container to Register with RM
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: root.users.root
     start time: 1658998850473
     final status: UNDEFINED
     tracking URL: http://cdh01:8088/proxy/application_1658995826987_0005/
     user: root
22/07/28 17:00:52 INFO yarn.Client: Application report for application_1658995826987_0005 (state: ACCEPTED)
22/07/28 17:00:53 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> cdh01,cdh02, PROXY_URI_BASES -> http://cdh01:8088/proxy/application_1658995826987_0005,http://cdh02:8088/proxy/application_1658995826987_0005, RM_HA_URLS -> cdh01:8088,cdh02:8088), /proxy/application_1658995826987_0005
22/07/28 17:00:53 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 7767897012427002665
java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveDelegationTokens$
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2011)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1875)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2209)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1692)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
    at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:647)
    at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:181)
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
    at java.lang.Thread.run(Thread.java:750)
22/07/28 17:00:53 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /jobs, /jobs/json, /jobs/job, /jobs/job/json, /stages, /stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /executors/threadDump, /executors/threadDump/json, /static, /, /api, /jobs/job/kill, /stages/stage/kill.
22/07/28 17:00:53 INFO yarn.Client: Application report for application_1658995826987_0005 (state: RUNNING)
22/07/28 17:00:53 INFO yarn.Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 192.168.1.20
     ApplicationMaster RPC port: -1
     queue: root.users.root
     start time: 1658998850473
     final status: UNDEFINED
     tracking URL: http://cdh01:8088/proxy/application_1658995826987_0005/
     user: root
22/07/28 17:00:53 INFO cluster.YarnClientSchedulerBackend: Application application_1658995826987_0005 has started running.
22/07/28 17:00:53 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38426.
22/07/28 17:00:53 INFO netty.NettyBlockTransferService: Server created on cdh01:38426
22/07/28 17:00:53 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/07/28 17:00:53 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:53 INFO storage.BlockManagerMasterEndpoint: Registering block manager cdh01:38426 with 2004.6 MB RAM, BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:53 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:53 INFO storage.BlockManager: external shuffle service port = 7337
22/07/28 17:00:53 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:54 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /metrics/json.
22/07/28 17:00:54 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3dc39459{/metrics/json,null,AVAILABLE,@Spark}
22/07/28 17:00:54 INFO scheduler.EventLoggingListener: Logging events to hdfs://cdh01:8020/user/spark/applicationHistory/application_1658995826987_0005
22/07/28 17:00:54 INFO util.Utils: Using initial executors = 0, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
22/07/28 17:00:54 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
22/07/28 17:00:54 INFO server.AbstractConnector: Stopped Spark@30364216{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
22/07/28 17:00:54 INFO ui.SparkUI: Stopped Spark web UI at http://cdh01:4040
22/07/28 17:00:54 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
22/07/28 17:00:54 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
22/07/28 17:00:54 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
22/07/28 17:00:54 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
22/07/28 17:00:54 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
22/07/28 17:00:54 INFO cluster.YarnClientSchedulerBackend: Stopped
22/07/28 17:00:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
22/07/28 17:00:54 INFO memory.MemoryStore: MemoryStore cleared
22/07/28 17:00:54 INFO storage.BlockManager: BlockManager stopped
22/07/28 17:00:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
22/07/28 17:00:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
22/07/28 17:00:54 INFO spark.SparkContext: Successfully stopped SparkContext
22/07/28 17:00:54 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Exception when registering SparkListener
    at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2398)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:555)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:48)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:315)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:166)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.spark.lineage.NavigatorAppListener
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
    at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2682)
    at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2680)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
    at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2680)
    at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2387)
    at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2386)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2386)
    ... 22 more
22/07/28 17:00:54 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" org.apache.spark.SparkException: Exception when registering SparkListener
    at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2398)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:555)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:48)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:315)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:166)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.spark.lineage.NavigatorAppListener
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
    at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2682)
    at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2680)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
    at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2680)
    at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2387)
    at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2386)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2386)
    ... 22 more
22/07/28 17:00:54 INFO util.ShutdownHookManager: Shutdown hook called
22/07/28 17:00:54 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-8b86611e-65b4-4793-badb-99f77cbdfe9f
22/07/28 17:00:54 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-e18790ed-7bf0-494c-a1b3-1b1e6c45ab66
```
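Two distinct ClassNotFoundExceptions appear in this log, and both point to a mismatch between the locally installed Apache Spark 2.4.0 client and the CDH 6.3.2 cluster. The fatal one is `Exception when registering SparkListener`, caused by `ClassNotFoundException: com.cloudera.spark.lineage.NavigatorAppListener`: the Cloudera-managed spark-defaults.conf registers the Navigator lineage listeners, but the jar providing them is not on this vanilla Spark build's classpath, so the SparkContext aborts during startup. The earlier `CoarseGrainedClusterMessages$RetrieveDelegationTokens$` RPC error likewise suggests the YARN side is running CDH-patched Spark jars that send a message this client's jars do not know. A workaround sketch, assuming the lineage listeners are the blocker: override the listener settings with empty values for this submission (Spark drops empty entries, so no listener is loaded).

```shell
# Workaround sketch, not a definitive fix: blank out the Cloudera lineage
# listeners inherited from the managed spark-defaults.conf so that the
# SparkContext can start.
spark-sql --master yarn \
  --driver-memory 4G --executor-memory 2G \
  --driver-cores 1 --executor-cores 2 \
  --conf spark.extraListeners= \
  --conf spark.sql.queryExecutionListeners= \
  -d dt=20220727 -f /opt/sparksql/dwd/xxx.sql
```

More permanent alternatives are to comment the `spark.extraListeners` and `spark.sql.queryExecutionListeners` lines out of the client's spark-defaults.conf, or to disable Spark lineage in Cloudera Manager (the `spark.lineage.enabled` setting), so the managed config stops injecting the listeners. Either way, this only clears the listener failure; it does not resolve the underlying client/cluster version skew behind the RPC error.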
