There is plenty of material online about connecting to Kerberos-enabled HBase from Java code, but much of it is imprecise and leads to all sorts of problems. After consolidating what is available online and testing it myself, I arrived at the following minimal set of configuration items for a successful connection:

java.security.krb5.conf
hadoop.security.authentication
hbase.security.authentication
hbase.regionserver.kerberos.principal
hbase.zookeeper.quorum
hbase.zookeeper.property.clientPort

Testing shows that omitting any one of these items causes HBase connections, reads, and writes to fail. First, the code that connects successfully:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;

import java.io.IOException;

public class HbaseConnector {

    public static Connection getConnection(String zkQuorum, String clientPort, String keyTabPath, String krbConfPath, String principal) throws IOException {
        // krb5.conf is required
        System.setProperty("java.security.krb5.conf", krbConfPath);
        Configuration conf = HBaseConfiguration.create();
        // required
        conf.set("hadoop.security.authentication", "kerberos");
        // required
        conf.set("hbase.security.authentication", "kerberos");
        // required; realm matches the krb5.conf below
        conf.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@HADOOP.COM");
        conf.set("hbase.zookeeper.quorum", zkQuorum);
        conf.set("hbase.zookeeper.property.clientPort", clientPort);
        // (not required)
        conf.set("hbase.master.kerberos.principal", "hbase/_HOST@HADOOP.COM");
        UserGroupInformation.setConfiguration(conf);
        // log in with the keytab
        UserGroupInformation.loginUserFromKeytab(principal, keyTabPath);
        return ConnectionFactory.createConnection(conf);
    }

    public static void main(String[] args) throws IOException {
        Connection connection = getConnection("node101,node102,node103", "2181", "C:/Users/sysea/Desktop/tonseal.keytab", "C:/Users/sysea/Desktop/krb5.conf", "tonseal@HADOOP.COM");
        Admin admin = connection.getAdmin();
        if (admin.tableExists(TableName.valueOf("tonseal:tonseal_table"))) {
            System.out.println("Table tonseal:tonseal_table exists");
        } else {
            System.err.println("Table tonseal:tonseal_table does not exist");
        }
        admin.close();
        connection.close();
    }
}

hbase.zookeeper.quorum and hbase.zookeeper.property.clientPort are always required, whether or not Kerberos authentication is enabled.

The krb5.conf file can be found in the /etc directory on the cluster hosts. A sample krb5.conf looks like this:

[libdefaults]
default_realm = HADOOP.COM
dns_lookup_kdc = false
dns_lookup_realm = false
ticket_lifetime = 86400
renew_lifetime = 604800
forwardable = true
default_tgs_enctypes = aes256-cts
default_tkt_enctypes = aes256-cts
permitted_enctypes = aes256-cts
udp_preference_limit = 1
kdc_timeout = 3000

[realms]
HADOOP.COM = {
  kdc = node101
  admin_server = node101
}
[domain_realm]

Below are the connection errors produced when each of the other configuration items is missing.

Without java.security.krb5.conf, the client cannot determine the realm:

23:14:29.533 [main] DEBUG org.apache.hadoop.security.authentication.util.KerberosName - Kerberos krb5 configuration not found, setting default realm to empty
Exception in thread "main" java.lang.IllegalArgumentException: Can't get Kerberos realm
	at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:65)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:309)
	at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:355)
	at com.seal.HbaseConnector.getConnection(HbaseConnector.java:49)
	at com.seal.Main.createConnectionWithKerberos(Main.java:113)
	at com.seal.Main.main(Main.java:33)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:110)
	at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:63)
	... 5 more
Caused by: KrbException: Cannot locate default realm
	at sun.security.krb5.Config.getDefaultRealm(Config.java:1066)
	... 11 more

Without hadoop.security.authentication, the code falls back to authenticating as the local OS user with the "SIMPLE" method instead of "KERBEROS", and the connection fails:

23:16:32.875 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
23:16:32.875 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
23:16:32.879 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using local user:NTUserPrincipal: sysea
23:16:32.879 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "NTUserPrincipal: sysea" with name sysea
23:16:32.879 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "sysea"
23:16:32.879 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Assuming keytab is managed externally since logged in from subject.
23:16:32.879 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - UGI loginUser:sysea (auth:SIMPLE)
23:16:32.886 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:sysea (auth:SIMPLE) from:org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:347)
23:16:42.112 [hconnection-0x59fd97a8-metaLookup-shared--pool4-t1] WARN org.apache.hadoop.hbase.security.provider.BuiltInProviderSelector - No matching SASL authentication provider and supporting token found from providers for user: sysea (auth:SIMPLE)
23:16:42.113 [hconnection-0x59fd97a8-metaLookup-shared--pool4-t1] DEBUG org.apache.hadoop.hbase.client.RpcRetryingCallerImpl - Call exception, tries=6, retries=11, started=4293 ms ago, cancelled=false, msg=Call to node101/192.168.56.101:16020 failed on local exception: java.io.IOException: java.lang.RuntimeException: Found no valid authentication method from options, details=row 'tonseal:tonseal_table,rowkey9-1615130197768,99999999999999' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=node101,16020,1615095171294, seqNum=-1, see https://s.apache.org/timeout, exception=java.io.IOException: Call to node101/192.168.56.101:16020 failed on local exception: java.io.IOException: java.lang.RuntimeException: Found no valid authentication method from options
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:225)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:378)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:89)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:409)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:405)
	at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:117)
	at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:132)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callMethod(AbstractRpcClient.java:422)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:316)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$200(AbstractRpcClient.java:89)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:572)
	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:45390)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:332)
	at org.apache.hadoop.hbase.client.ScannerCallable.rpcCall(ScannerCallable.java:242)
	at org.apache.hadoop.hbase.client.ScannerCallable.rpcCall(ScannerCallable.java:58)
	at org.apache.hadoop.hbase.client.RegionServerCallable.call(RegionServerCallable.java:127)
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:396)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:370)
	at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:107)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: java.lang.RuntimeException: Found no valid authentication method from options
	at org.apache.hadoop.hbase.ipc.IPCUtil.toIOE(IPCUtil.java:159)
	... 17 more
Caused by: java.lang.RuntimeException: Found no valid authentication method from options
	at org.apache.hadoop.hbase.ipc.RpcConnection.<init>(RpcConnection.java:112)
	at org.apache.hadoop.hbase.ipc.NettyRpcConnection.<init>(NettyRpcConnection.java:97)
	at org.apache.hadoop.hbase.ipc.NettyRpcClient.createConnection(NettyRpcClient.java:76)
	at org.apache.hadoop.hbase.ipc.NettyRpcClient.createConnection(NettyRpcClient.java:39)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.getConnection(AbstractRpcClient.java:350)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callMethod(AbstractRpcClient.java:419)
	... 16 more

Without hbase.security.authentication, the connection also fails:

23:24:34.930 [RPCClient-NioEventLoopGroup-1-1] DEBUG org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler - Unknown callId: -1, skipping over this response of 0 bytes
23:24:34.931 [hconnection-0x2cdd0d4b-metaLookup-shared--pool4-t1] DEBUG org.apache.hadoop.hbase.client.RpcRetryingCallerImpl - Call exception, tries=6, retries=16, started=4530 ms ago, cancelled=false, msg=Call to node101/192.168.56.101:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed, details=row 'tonseal:tonseal_table,rowkey4-1615130670351,99999999999999' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=node101,16020,1615095171294, seqNum=-1, see https://s.apache.org/timeout, exception=org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Call to node101/192.168.56.101:16020 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
	at org.apache.hadoop.hbase.ipc.IPCUtil.wrapException(IPCUtil.java:206)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:378)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:89)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:409)
	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:405)
	at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:117)
	at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:132)
	at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.cleanupCalls(NettyRpcDuplexHandler.java:203)
	at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)
	at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:386)
	at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:351)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)
	at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:81)
	at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:241)
	at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1405)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:262)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:248)
	at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:901)
	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:818)
	at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:164)
	at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
	at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:497)
	at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
	at org.apache.hbase.thirdparty.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
	... 26 more

Some connection examples instead add core-site.xml and hbase-site.xml to the Configuration. That works because those two files already contain the configuration items listed above, so the connection naturally succeeds.

core-site.xml contains the hadoop.security.authentication setting.
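The relevant fragment typically looks like the following (a sketch of a typical Kerberos-enabled core-site.xml; exact contents vary by cluster):

```xml
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
```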

If hbase-site.xml contains all of the HBase-related items above, then once it is added to the Configuration you no longer need to set hbase.security.authentication, hbase.regionserver.kerberos.principal, hbase.zookeeper.quorum, or hbase.zookeeper.property.clientPort yourself.
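For reference, the corresponding hbase-site.xml fragment would look roughly like this (a sketch using this article's example hostnames and realm; take the real values from your cluster's file):

```xml
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@HADOOP.COM</value>
</property>
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>node101,node102,node103</value>
</property>
<property>
  <name>hbase.zookeeper.property.clientPort</name>
  <value>2181</value>
</property>
```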

The code for connecting via core-site.xml and hbase-site.xml:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.security.UserGroupInformation;

import java.io.IOException;

public class HbaseConnector {

    public static Connection getConnectionByFile(String keyTabPath, String krbConfPath, String coreSitePath, String hbaseSitePath, String principal) throws IOException {
        // krb5.conf is required
        System.setProperty("java.security.krb5.conf", krbConfPath);
        Configuration conf = HBaseConfiguration.create();
        conf.addResource(new Path(coreSitePath));
        conf.addResource(new Path(hbaseSitePath));
        UserGroupInformation.setConfiguration(conf);
        // log in with the keytab
        UserGroupInformation.loginUserFromKeytab(principal, keyTabPath);
        return ConnectionFactory.createConnection(conf);
    }

    public static void main(String[] args) throws IOException {
        Connection connection = getConnectionByFile("C:/Users/sysea/Desktop/tonseal.keytab", "C:/Users/sysea/Desktop/krb5.conf", "C:/Users/sysea/Desktop/core-site.xml", "C:/Users/sysea/Desktop/hbase-site.xml", "tonseal@HADOOP.COM");
        Admin admin = connection.getAdmin();
        if (admin.tableExists(TableName.valueOf("tonseal:tonseal_table"))) {
            System.out.println("Table tonseal:tonseal_table exists");
        } else {
            System.err.println("Table tonseal:tonseal_table does not exist");
        }
        admin.close();
        connection.close();
    }
}

One more note: Kerberos tickets have a limited lifetime, so after running for a while the program will no longer be able to read or write HBase unless the ticket is renewed.
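One common pattern (a sketch, not from the original code above) is a daemon thread that periodically calls UserGroupInformation's keytab relogin method; checkTGTAndReloginFromKeytab() re-authenticates from the keytab when the TGT is close to expiry and is a no-op otherwise. The five-minute interval and the error handling here are illustrative choices:

```java
import org.apache.hadoop.security.UserGroupInformation;

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class KerberosRelogin {

    // Call once after loginUserFromKeytab() has succeeded.
    public static ScheduledExecutorService startReloginThread() {
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor(r -> {
                    Thread t = new Thread(r, "kerberos-relogin");
                    t.setDaemon(true); // do not keep the JVM alive
                    return t;
                });
        scheduler.scheduleAtFixedRate(() -> {
            try {
                // No-op unless the TGT is near expiry, so a short period is safe.
                UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
            } catch (Exception e) {
                System.err.println("Kerberos relogin failed: " + e.getMessage());
            }
        }, 5, 5, TimeUnit.MINUTES);
        return scheduler;
    }
}
```

Because relogin happens on a background thread, long-lived Connection objects keep working across ticket expirations without the caller doing anything else.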
