1. Sqoop import of a MySQL table into Hive fails with ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

2021-08-03 21:13:28,937 ERROR hive.HiveConfig: Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
2021-08-03 21:13:28,938 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
    at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
    at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
    ... 12 more

Solution 1:

Append the following to the end of ~/.bashrc: export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*
Then reload the configuration: source ~/.bashrc
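The two steps above in runnable form (this assumes HIVE_HOME is already exported in your environment; the snippet does not set it):

```shell
# Put every Hive jar on Hadoop's classpath so Sqoop's embedded Hive client
# can load org.apache.hadoop.hive.conf.HiveConf, then reload the shell config.
echo 'export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*' >> ~/.bashrc
source ~/.bashrc
```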

Solution 2:

Copy hive-common-3.1.2.jar from Hive's lib directory into Sqoop's lib directory:

[root@singleNode bin]# find /opt/install/hive/ -name 'hive-common*.jar'
/opt/install/hive/lib/hive-common-3.1.2.jar
[root@singleNode bin]# cp /opt/install/hive/lib/hive-common-3.1.2.jar /opt/install/sqoop/lib/

2. Sqoop import from MySQL to Hive fails with ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")

2018-06-22 12:28:32,398 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
    at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
    at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
    at org.apache.logging.log4j.core.jmx.Server.register(Server.java:379)
    at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:171)
    at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:147)
    at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:457)
    at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:246)
    at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:230)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:140)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:113)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:98)
    at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:156)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:121)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:73)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:54)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:661)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:645)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
18/06/22 12:28:32 WARN common.LogUtils: hive-site.xml not found on CLASSPATH
18/06/22 12:28:32 INFO SessionState:
Logging initialized using configuration in jar:file:/home/hive/lib/hive-exec-2.0.0.jar!/hive-log4j2.properties
18/06/22 12:28:32 INFO metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
18/06/22 12:28:32 INFO metastore.ObjectStore: ObjectStore, initialize called
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-api-jdo-4.2.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-api-jdo-4.2.1.jar."
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-rdbms-4.1.7.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-rdbms-4.1.7.jar."
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-core-4.1.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-core-4.1.6.jar."
18/06/22 12:28:33 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/06/22 12:28:33 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/06/22 12:28:41 ERROR bonecp.BoneCP: Unable to start/stop JMX

Solution 1:

Edit the JDK policy file jdk1.8.0_11/jre/lib/security/java.policy.
Specifically, add the following block to the file:

grant {
        permission javax.management.MBeanTrustPermission "register";
};
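A sketch of applying that edit from the shell. The policy path here defaults to a local file for a dry run, so point POLICY at your real <JAVA_HOME>/jre/lib/security/java.policy before running it on a server:

```shell
# Append a grant that lets log4j2 register its JMX MBeans when a
# SecurityManager is active. POLICY defaults to ./java.policy for a dry run.
POLICY=${POLICY:-java.policy}
cat >> "$POLICY" <<'EOF'
grant {
        permission javax.management.MBeanTrustPermission "register";
};
EOF
```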

Solution 2:

Copy hive-site.xml into Sqoop's conf directory.
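As a command, using the install layout from the rest of this post (the `|| echo` is only there so the line degrades gracefully if your paths differ):

```shell
# hive-site.xml tells Sqoop's embedded Hive client where the metastore lives.
cp /opt/install/hive/conf/hive-site.xml /opt/install/sqoop/conf/ \
  || echo "adjust the paths above to your own Hive/Sqoop install"
```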

3. After a VM reboot, Docker containers inside the VM are unreachable: Error response from daemon: Container xxxx is not running

Solution:

Check the state of sysctl net.ipv4.ip_forward; if it is 0, change it to 1.

The commands used:

  • echo 'net.ipv4.ip_forward = 1' >> /usr/lib/sysctl.d/50-default.conf
  • sysctl -p /usr/lib/sysctl.d/50-default.conf
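Before editing anything you can read the live value straight from /proc, which is equivalent to `sysctl net.ipv4.ip_forward`:

```shell
# 1 means IPv4 forwarding is on (what Docker's bridge networking needs),
# 0 means it was reset, e.g. by the reboot described above.
cat /proc/sys/net/ipv4/ip_forward
```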

4. Sqoop NullPointerException: ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException

Solution:

1) Check sqoop-site.xml for missing configuration entries.

2) Add the JSON dependency (you will need to find this jar yourself):

cp /opt/software/java-json.jar /opt/install/sqoop/lib/

5. Sqoop fails at startup with Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader
    at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:370)
    at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:108)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
    at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:530)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.shims.ShimLoader

Solution:

Copy the Hive shims jars into Sqoop's lib directory:

cp /opt/install/hive/lib/hive-shims* /opt/install/sqoop/lib/

6. Sqoop output directory already exists: ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException

2021-08-04 02:02:52,989 ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://singleNode:8020/user/root/temp already exists
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:164)
    at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:277)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

Solution:

Delete the directory named in the error message.
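Using the path from the stack trace above, the cleanup is one HDFS command (it only runs on the cluster itself). Alternatively, adding Sqoop's --delete-target-dir flag to the import command clears the target directory automatically on every run.

```shell
# Remove the leftover output directory from the earlier failed import.
hdfs dfs -rm -r hdfs://singleNode:8020/user/root/temp
```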

7. Sqoop NullPointerException when creating an incremental MySQL-to-Hive import job: ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException

Solution:

Delete hive-exec-3.1.2.jar from Sqoop's lib folder:

rm -f /opt/install/sqoop/lib/hive-exec-3.1.2.jar

8. A Hive job fails with: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

 FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Cause:
Case 1: insufficient YARN resources.
This error comes from the way YARN computes virtual memory. In the example above the job requests 1 GB of memory; YARN multiplies that request by a ratio (2.1 by default) to get the amount of virtual memory the job may use. When the virtual memory the job actually needs exceeds that computed limit, YARN kills the container and this error is reported. Raising the ratio resolves the problem; the parameter is yarn.nodemanager.vmem-pmem-ratio in yarn-site.xml.
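The check described above is a single multiplication, so the numbers are easy to sanity-check:

```shell
# virtual limit = physical allocation (MB) * yarn.nodemanager.vmem-pmem-ratio.
# With the defaults (1024 MB container, ratio 2.1) a container is killed once
# its virtual memory passes roughly 2150 MB.
awk 'BEGIN { printf "%.1f\n", 1024 * 2.1 }'
```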

Solution:

Adjust the following value in Hadoop's yarn-site.xml:

vim /opt/install/hadoop/etc/hadoop/yarn-site.xml
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>2048</value>
  <description>default value is 1024</description>
</property>
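The cause section names yarn.nodemanager.vmem-pmem-ratio as the other knob; raising it in the same yarn-site.xml also avoids the virtual-memory kill. The value 4 below is an illustrative choice, not from the original post (the default is 2.1):

```xml
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4</value>
  <description>virtual memory allowed per MB of physical memory; default is 2.1</description>
</property>
```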

Then restart hiveserver2.

9. MySQL remote-login permission error: java.sql.SQLException: Access denied for user 'root'@'localhost'

Solution:

Grant remote-login permission and flush privileges:

# Start the service
systemctl start mysql
# Change the MySQL password
/usr/bin/mysqladmin -u root password 'root'
# Log in to MySQL and set the permission
mysql -uroot -proot
> update mysql.user set host='%' where host='localhost';
> delete from mysql.user where host<>'%' or user='';
> flush privileges;

10. flink yarn-session.sh fails to start with java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Path

2021-08-09 02:15:33,593 ERROR org.apache.flink.yarn.cli.FlinkYarnSessionCli                [] - Error while running the Flink session.
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Path
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getLocalFlinkDistPathFromCmd(FlinkYarnSessionCli.java:347) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.applyDescriptorOptionToConfig(FlinkYarnSessionCli.java:473) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.toConfiguration(FlinkYarnSessionCli.java:394) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.run(FlinkYarnSessionCli.java:571) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.lambda$main$4(FlinkYarnSessionCli.java:860) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
    at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28) ~[flink-dist_2.12-1.13.2.jar:1.13.2]
    at org.apache.flink.yarn.cli.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:860) [flink-dist_2.12-1.13.2.jar:1.13.2]
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Path
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_171]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_171]
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_171]
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_171]
    ... 7 more

Solution:

Copy flink-shaded-hadoop-3-uber-3.1.1.7.1.1.0-565-9.0.jar into Flink's lib directory (the client builds its classpath from the jars in lib).

Link: https://pan.baidu.com/s/1a9bdL2amXq0lyZMDF2GLhw
Extraction code: ltpc

11. flink fails to start: Maximum Memory: 1536MB, Requested: 1600MB

Solution:

vim /opt/software/hadoop/etc/hadoop/yarn-site.xml

<!-- Maximum memory YARN may allocate to a single container -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>2048</value>
</property>
<!-- Physical memory the NodeManager is allowed to manage -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>2048</value>
</property>

12. Azkaban Multiple Executor Mode: configuring the executor host memory limit filter

Solution:

Modify the web-server configuration conf/azkaban.properties:

# Executor host filter configuration: remove MinimumFreeMemory
# The MinimumFreeMemory filter checks whether an executor host has more than 6 GB of free memory; if not, the web-server will not dispatch jobs to that host
azkaban.executorselector.filters=StaticRemainingFlowSize,CpuStatus
