org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.fs.FileAlreadyExistsException)
Table of Contents
- 1. Error message
- 2. Cause
- 3. Solution
1. Error message:
- Parent path is not a directory: /tmp tmp
```
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.fs.FileAlreadyExistsException): Parent path is not a directory: /tmp tmp
PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.fs.FileAlreadyExistsException: Parent path is not a directory: /tmp tmp
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsRecursively(FSNamesystem.java:4561)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4472)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4445)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:880)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:326)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:640)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
Exception in thread "main" org.apache.hadoop.fs.FileAlreadyExistsException: Parent path is not a directory: /tmp tmp
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsRecursively(FSNamesystem.java:4561)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4472)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4445)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:880)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:326)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:640)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3143)
	at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:3108)
	at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1004)
	at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1000)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1000)
	at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:992)
	at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:148)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
	at com.czxy.hadoop.mapreduce.demo03.WordCountApp.main(WordCountApp.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.fs.FileAlreadyExistsException): Parent path is not a directory: /tmp tmp
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsRecursively(FSNamesystem.java:4561)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4513)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4472)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4445)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:880)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:326)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:640)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
	at org.apache.hadoop.ipc.Client.call(Client.java:1504)
	at org.apache.hadoop.ipc.Client.call(Client.java:1441)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:573)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
	at com.sun.proxy.$Proxy11.mkdirs(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3141)
	... 22 more
```
2. Cause
- The HDFS root directory contains a regular *file* named /tmp, which collides with the path that HDFS (and MapReduce job submission) uses as its temporary/staging directory. When job submission tries to create the staging directory under /tmp, the NameNode finds /tmp is a file, not a directory, and rejects the mkdirs call.
3. Solution
Delete the conflicting /tmp file and re-run the job; the problem is resolved.
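One way to do the cleanup from the command line is sketched below. This assumes the `hdfs` client is on the PATH and the current user has permission to delete entries under the HDFS root (the exact permission bits you want on /tmp may differ on your cluster):

```shell
# Confirm that /tmp is a regular file, not a directory:
# in the listing, a directory entry starts with 'd', a file with '-'
hdfs dfs -ls /

# Remove the conflicting file (it goes to the trash unless -skipTrash is added)
hdfs dfs -rm /tmp

# Optionally recreate /tmp as a world-writable directory with the sticky bit,
# mirroring the usual Unix /tmp convention, so the next job can use it directly
hdfs dfs -mkdir /tmp
hdfs dfs -chmod 1777 /tmp
```

After the file is removed, re-submitting the MapReduce job should succeed, since the staging directory can now be created under /tmp as a normal directory.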