Spark-submit: System memory 466092032 must be at least 471859200
The following error came up when running in Standalone client mode:
Spark Executor Command: "/usr/local/jdk1.8.0_181/bin/java" "-cp" "/usr/local/spark-2.3.0/conf/:/usr/local/spark-2.3.0/jars/*:/usr/local/hadoop-2.7.6/etc/hadoop/" "-Xmx500M" "-Dspark.driver.port=43614" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@hpmaster:43614" "--executor-id" "32" "--hostname" "192.168.199.212" "--cores" "1" "--app-id" "app-20190212233950-0001" "--worker-url" "spark://Worker@192.168.199.212:7079"
========================================
Exception in thread "main" java.lang.IllegalArgumentException: System memory 466092032 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:217)
at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:199)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:200)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:228)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:65)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:64)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:64)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:188)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:293)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
The relevant Spark source code (UnifiedMemoryManager.getMaxMemory) looks like this:
/**
 * Return the total amount of memory shared between execution and storage, in bytes.
 */
private def getMaxMemory(conf: SparkConf): Long = {
  val systemMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)
  val reservedMemory = conf.getLong("spark.testing.reservedMemory",
    if (conf.contains("spark.testing")) 0 else RESERVED_SYSTEM_MEMORY_BYTES)
  val minSystemMemory = reservedMemory * 1.5
  if (systemMemory < minSystemMemory) {
    throw new IllegalArgumentException(s"System memory $systemMemory must " +
      s"be at least $minSystemMemory. Please use a larger heap size.")
  }
  val usableMemory = systemMemory - reservedMemory
  val memoryFraction = conf.getDouble("spark.memory.fraction", 0.75)
  (usableMemory * memoryFraction).toLong
}
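The numbers in the error message fall straight out of this check. A minimal Java sketch of the arithmetic (values taken from the log above; RESERVED_SYSTEM_MEMORY_BYTES is hard-coded to 300 MB in Spark's source):

```java
public class MemoryCheck {
    // RESERVED_SYSTEM_MEMORY_BYTES, hard-coded to 300 MB in Spark's source
    static final long RESERVED_SYSTEM_MEMORY_BYTES = 300L * 1024 * 1024; // 314572800

    // minSystemMemory = reservedMemory * 1.5
    static long minSystemMemory() {
        return (long) (RESERVED_SYSTEM_MEMORY_BYTES * 1.5); // 471859200
    }

    public static void main(String[] args) {
        // What Runtime.getRuntime().maxMemory() reported under -Xmx500M (from the log)
        long systemMemory = 466092032L;
        System.out.println(minSystemMemory());                // 471859200
        System.out.println(systemMemory < minSystemMemory()); // true -> IllegalArgumentException
    }
}
```

Note that even though the executor was launched with `-Xmx500M` (524288000 bytes), `Runtime.getRuntime.maxMemory` reports somewhat less than the `-Xmx` value (the JVM excludes one survivor space), which is why ~466 MB lands below the 471859200-byte (450 MB) floor.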
In spark-submit, configure --driver-java-options "-Dspark.testing.memory=1073741824". Per the check above, any value of at least 471859200 bytes (450 MB) passes; 1073741824 bytes (1 GB) leaves comfortable headroom.
/usr/local/spark-2.3.0/bin/spark-submit --class com.chy.rdd.initSpark --master spark://hpmaster:7077 --deploy-mode client --executor-memory 500m --driver-java-options "-Dspark.testing.memory=1073741824" --total-executor-cores 1 /usr/local/spark-2.3.0/examples/jars/sparkProject-1.0-SNAPSHOT.jar
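Since spark.testing.memory is a knob intended for Spark's own test suite, an alternative is simply to give the executor a heap above the 450 MB minimum instead of overriding the check. A sketch, reusing the paths and class name from the command above:

```shell
/usr/local/spark-2.3.0/bin/spark-submit \
  --class com.chy.rdd.initSpark \
  --master spark://hpmaster:7077 \
  --deploy-mode client \
  --executor-memory 1g \
  --total-executor-cores 1 \
  /usr/local/spark-2.3.0/examples/jars/sparkProject-1.0-SNAPSHOT.jar
```

With --executor-memory 1g, Runtime.getRuntime.maxMemory comfortably exceeds 471859200 bytes, so the UnifiedMemoryManager check passes without any testing flags.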