I. Problem

On Windows 10, after a Spark job finishes in local mode, the following error is reported regardless of whether the job produced a result:

Failed to delete: C:\Users\lvacz\AppData\Local\Temp\spark-7921735f-07fa-45db-875e-5a6440eb7e79

Partial log:

21/05/26 13:01:34 WARN SparkEnv: Exception while deleting Spark temp dir: C:\Users\lvacz\AppData\Local\Temp\spark-7921735f-07fa-45db-875e-5a6440eb7e79\userFiles-af608dcd-6f5c-44a3-a295-3667472e8936
java.io.IOException: Failed to delete: C:\Users\lvacz\AppData\Local\Temp\spark-7921735f-07fa-45db-875e-5a6440eb7e79\userFiles-af608dcd-6f5c-44a3-a295-3667472e8936\org.mongodb_mongo-java-driver-3.4.2.jar
    at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
    at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
    at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:103)
    at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1974)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1973)
    at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
21/05/26 13:01:34 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\lvacz\AppData\Local\Temp\spark-7921735f-07fa-45db-875e-5a6440eb7e79
java.io.IOException: Failed to delete: C:\Users\lvacz\AppData\Local\Temp\spark-7921735f-07fa-45db-875e-5a6440eb7e79\userFiles-af608dcd-6f5c-44a3-a295-3667472e8936\org.mongodb_mongo-java-driver-3.4.2.jar
    at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
    at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
    at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
    at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
21/05/26 13:01:34 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:\Users\lvacz\AppData\Local\Temp\spark-7921735f-07fa-45db-875e-5a6440eb7e79\userFiles-af608dcd-6f5c-44a3-a295-3667472e8936
java.io.IOException: Failed to delete: C:\Users\lvacz\AppData\Local\Temp\spark-7921735f-07fa-45db-875e-5a6440eb7e79\userFiles-af608dcd-6f5c-44a3-a295-3667472e8936\org.mongodb_mongo-java-driver-3.4.2.jar
    at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:144)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
    at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:128)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:118)
    at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:91)
    at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1062)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

II. Solution

Pick the approach that matches your situation.

1 Spark installed and deployed locally

1.1 Modify the Spark log configuration

  1. Locate log4j.properties under %SPARK_HOME%\conf. If it does not exist, rename log4j.properties.template to log4j.properties.
  2. Open log4j.properties in a text editor.
  3. Append the following two lines of configuration to the end of the file (or use the PowerShell sketch shown after them):
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
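
For convenience, here is a minimal PowerShell sketch that appends the two lines, assuming the SPARK_HOME environment variable points at your Spark installation:

# Append the two logger overrides to %SPARK_HOME%\conf\log4j.properties.
# Assumes SPARK_HOME is set; back the file up first if unsure.
$conf = Join-Path $env:SPARK_HOME 'conf\log4j.properties'
Add-Content -Path $conf -Value @(
    'log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF',
    'log4j.logger.org.apache.spark.SparkEnv=ERROR'
)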

Running the job again no longer shows the error. One problem remains, though: these settings only hide the message, the deletion itself still fails, so the spark-prefixed temporary directories Spark creates under the temp directory (C:\Users\lvacz\AppData\Local\Temp) keep accumulating.

1.2 Write a script that deletes the temporary files

Note that lvacz in the paths is this machine's user name; replace it with the path that matches your own.

In the %SPARK_HOME%\conf directory, create a PowerShell script named rm_spark_tmpdir.ps1 with the following content:

Remove-Item C:\Users\lvacz\AppData\Local\Temp\spark-* -Recurse -Force
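
A slightly more portable sketch avoids hardcoding the user name by reading the temp location from $env:TEMP, and skips files that are still locked:

# Delete all Spark temp dirs under the current user's temp folder.
# $env:TEMP resolves to C:\Users\<user>\AppData\Local\Temp by default,
# so no user name needs to be hardcoded.
# -ErrorAction SilentlyContinue skips entries still held open by a running JVM.
Remove-Item -Path (Join-Path $env:TEMP 'spark-*') -Recurse -Force -ErrorAction SilentlyContinue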

Hover over the script file name ——> right-click ——> click "Create shortcut".
Rename the created shortcut (optional).

1.3 Add it to startup

Move the shortcut created above (the renamed one, if you renamed it) to the following path:

%APPDATA%\Microsoft\Windows\Start Menu\Programs\Startup
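
Alternatively, this step can be scripted as well. Here is a sketch that creates the Startup shortcut through the WScript.Shell COM object; the script path is an assumption based on 1.2, so adjust it to your setup:

# Create a Startup shortcut that runs the cleanup script at logon.
# Assumes the script from 1.2 lives at %SPARK_HOME%\conf\rm_spark_tmpdir.ps1.
$startup = [Environment]::GetFolderPath('Startup')
$shell   = New-Object -ComObject WScript.Shell
$lnk     = $shell.CreateShortcut((Join-Path $startup 'rm_spark_tmpdir.lnk'))
$lnk.TargetPath = 'powershell.exe'
$lnk.Arguments  = "-NoProfile -ExecutionPolicy Bypass -File `"$env:SPARK_HOME\conf\rm_spark_tmpdir.ps1`""
$lnk.Save()

Pointing the shortcut at powershell.exe -File rather than at the .ps1 itself ensures the script is executed at logon instead of being opened in an editor.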

2 Only a project, no local Spark deployment

  1. Under the project's resources directory (e.g. src/main/resources in a Maven layout), create a file named org/apache/spark/log4j-defaults.properties

Add the following content to the file:

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements.  See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License.  You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

# Set everything to be logged to the console
#log4j.rootCategory=INFO, console
log4j.rootCategory=ERROR, console
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR

# Parquet related logging
log4j.logger.org.apache.parquet.CorruptStatistics=ERROR
log4j.logger.parquet.CorruptStatistics=ERROR

Run the Spark job in the project again; this time the error should no longer appear.

For the follow-up cleanup, refer to steps 1.2 and 1.3 in section "1 Spark installed and deployed locally" above.
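
To confirm the cleanup is working, a quick check for leftover Spark temp directories (run it after a job plus the cleanup script; no output means nothing is left over):

# List any Spark temp dirs still present under the user's temp folder.
Get-ChildItem -Path $env:TEMP -Directory -Filter 'spark-*'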

References

https://www.jianshu.com/p/547892d6657e
