Hadoop error AccessControlException: Permission denied: user=vincent, access=WRITE, inode=/:iie4bu:supe
I tried to operate HDFS from Java with the following code:
package hadoop.hdfs;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Use the Java API to operate on the HDFS file system.
 */
public class HDFSAPP {
    public static void main(String[] args) throws IOException, URISyntaxException {
        Configuration configuration = new Configuration();
        FileSystem fileSystem = FileSystem.get(new URI("hdfs://swarm-worker1:9000"), configuration);
        Path path = new Path("/hdfsapi/test");
        boolean mkdirs = fileSystem.mkdirs(path);
        System.out.println(mkdirs);
    }
}
It fails with the following error:
Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=vincent, access=WRITE, inode="/":iie4bu:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3885)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3868)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3850)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6820)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4562)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4532)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4505)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:884)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:328)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:641)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3157)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:3122)
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1005)
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1001)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1001)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:993)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1970)
    at hadoop.hdfs.HDFSAPP.main(HDFSAPP.java:20)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=vincent, access=WRITE, inode="/":iie4bu:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:240)
    at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:162)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3885)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3868)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:3850)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6820)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4562)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4532)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4505)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:884)
    at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:328)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:641)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)
    at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
    at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:575)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
    at com.sun.proxy.$Proxy11.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3155)
    ... 8 more
This happens because of the permissions on the HDFS root directory: its owner is iie4bu and its group is supergroup, while our client is running as the local user vincent, who therefore has no WRITE permission on /.
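The ownership can be confirmed from the command line (a sketch; it assumes an `hdfs` client configured against this cluster, and the ownership shown in the comment is read off the inode info in the error message):

```shell
# The error names inode "/":iie4bu:supergroup:drwxr-xr-x, i.e. only the
# owner iie4bu may write to /; group and others have read/execute only.
hdfs dfs -ls /
```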
We therefore need to specify the user when obtaining the FileSystem handle:

FileSystem fileSystem = FileSystem.get(new URI("hdfs://swarm-worker1:9000"), configuration, "iie4bu");

Note that this three-argument overload also declares InterruptedException, so it must be added to main's throws clause. With this change the program runs successfully.
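Passing the user to FileSystem.get is not the only fix. Two other common options are sketched below, assuming you can either control the client environment or run commands as a privileged HDFS user (the user and path names are taken from this setup):

```shell
# Option 1 (client side): set the user that Hadoop's simple
# authentication picks up, instead of hard-coding it in Java.
export HADOOP_USER_NAME=iie4bu

# Option 2 (cluster side, run as a privileged user): create the target
# directory and hand it over to vincent so the original code works.
hdfs dfs -mkdir -p /hdfsapi
hdfs dfs -chown -R vincent:supergroup /hdfsapi
```

A third option sometimes suggested is disabling permission checking (dfs.permissions.enabled=false in hdfs-site.xml), but that removes all access control on the cluster and is only appropriate for throwaway test environments.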