A Spark Streaming application exited on its own after running for 7 days. The log showed: token for xxxx (username): HDFS_DELEGATION_TOKEN owner=xxxx@xxxx.com, renewer=yarn, realUser=, issueDate=1581323654722, maxDate=1581928454722, sequenceNumber=6445344, masterKeyId=1583) is expired, current time: 2020-02-17 16:37:40,567+0800 expected renewal time: 2020-02-17 16:34:14,722+0800. The submitting user had already obtained a Kerberos ticket with kinit, yet the log shows the failure happened when Spark Streaming's checkpoint operation accessed Hadoop and found that the HDFS delegation token had expired.
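The seven-day figure is visible directly in the token metadata from the log: maxDate minus issueDate equals exactly 7 days, which is HDFS's default dfs.namenode.delegation.token.max-lifetime. A quick check of the log's own timestamps (a sketch, using only the numbers quoted above):

```python
from datetime import datetime, timezone, timedelta

# Epoch-millisecond timestamps copied from the log line
issue_date = 1581323654722
max_date = 1581928454722

lifetime_ms = max_date - issue_date
print(lifetime_ms / (1000 * 60 * 60 * 24))  # → 7.0 (days)

# maxDate rendered in the log's +0800 timezone:
# matches the "expected renewal time: 2020-02-17 16:34:14,722+0800" in the log
cst = timezone(timedelta(hours=8))
print(datetime.fromtimestamp(max_date / 1000, tz=cst))
```

So the application did not die at a random point: it survived exactly one full token max-lifetime and failed on the first HDFS access after the token could no longer be renewed.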

Solution:

spark-submit ...other options... --keytab /home/keytabs/xxxx.keytab --principal xxxx --conf spark.hadoop.fs.hdfs.impl.disable.cache=true

Here xxxx only masks the real username, and a real submission will of course need other options as well. With --principal and --keytab, Spark can periodically re-login from the keytab and obtain fresh delegation tokens for a long-running application; setting spark.hadoop.fs.hdfs.impl.disable.cache=true disables the FileSystem object cache, so the checkpoint code gets a FileSystem instance carrying the fresh token instead of a cached instance still holding the expired one.
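Put together, a submission might look like the sketch below. This is illustrative only: the master/deploy-mode, resource settings, main class, and jar name are placeholders of mine, not the original author's values, and xxxx is kept as the redacted user from the post.

```shell
# Sketch only: adjust master, resources, keytab path, and principal
# to your environment. com.example.StreamingApp and streaming-app.jar
# are hypothetical placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --keytab /home/keytabs/xxxx.keytab \
  --principal xxxx \
  --conf spark.hadoop.fs.hdfs.impl.disable.cache=true \
  --class com.example.StreamingApp \
  streaming-app.jar
```

Note that the keytab must be readable on the machine doing the submit; Spark ships it to the cluster and uses it to refresh credentials for as long as the application runs.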

The full exception is shown below (usernames redacted):

20/02/17 16:37:40 ERROR util.Utils: Uncaught exception in thread Thread-5
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken): token (token for hadoop: HDFS_DELEGATION_TOKEN owner=xxxx@xxxx.com, renewer=yarn, realUser=, issueDate=1581323654722, maxDate=1581928454722, sequenceNumber=6445344, masterKeyId=1583) is expired, current time: 2020-02-17 16:37:40,567+0800 expected renewal time: 2020-02-17 16:34:14,722+0800
    at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    at com.sun.proxy.$Proxy16.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    at com.sun.proxy.$Proxy17.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2126)
    at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1262)
    at org.apache.hadoop.hdfs.DistributedFileSystem$20.doCall(DistributedFileSystem.java:1258)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1258)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1418)
    at org.apache.spark.scheduler.EventLoggingListener.stop(EventLoggingListener.scala:232)
    at org.apache.spark.SparkContext$$anonfun$stop$7$$anonfun$apply$mcV$sp$5.apply(SparkContext.scala:1831)
    at org.apache.spark.SparkContext$$anonfun$stop$7$$anonfun$apply$mcV$sp$5.apply(SparkContext.scala:1831)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkContext$$anonfun$stop$7.apply$mcV$sp(SparkContext.scala:1831)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1295)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1830)
    at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:…)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1963)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
