Created by Wang, Jerry, last modified on Sep 12, 2015

Start the master with start-master.sh (in the sbin folder).

Then verify the master process with ps aux:
7334 5.6 0.6 1146992 221652 pts/0 Sl 12:34 0:05 /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/sbin/…/conf/:/root/devExpert/spar
Monitor the master node via the URL: http://10.128.184.131:8080
Start two workers:

./spark-class org.apache.spark.deploy.worker.Worker spark://NKGV50849583FV1:7077 (in the bin folder)
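Each worker registers with the master at the spark://host:port URL passed on the command line. As an illustration only (Spark's own URL handling lives in its Scala internals, not in this function), here is a minimal Python sketch of splitting such a URL into host and port:

```python
def parse_master_url(url):
    """Split a spark://host:port master URL into (host, port).

    Illustrative helper, not part of Spark itself.
    """
    prefix = "spark://"
    if not url.startswith(prefix):
        raise ValueError("expected a spark:// URL, got: " + url)
    host, _, port = url[len(prefix):].partition(":")
    return host, int(port)

# The master URL used throughout this walkthrough:
host, port = parse_master_url("spark://NKGV50849583FV1:7077")
```

The hostname here (NKGV50849583FV1) resolves to a loopback address on this machine, which is why the logs below warn about SPARK_LOCAL_IP.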

Submit a job to the cluster:

./spark-submit --class "org.apache.spark.examples.JavaWordCount" --master spark://NKGV50849583FV1:7077 /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
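The JavaWordCount example being submitted splits the input file on whitespace and counts occurrences of each word. A minimal sketch of that same counting logic in plain Python (for reference only; the real example does this with Spark RDD transformations):

```python
def word_count(text):
    """Tally word occurrences in a string, split on whitespace.

    Plain-Python sketch of what the JavaWordCount example computes
    over test.txt; not Spark code.
    """
    counts = {}
    for word in text.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

print(word_count("to be or not to be"))
```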

The job executes successfully:

./spark-submit --class "org.apache.spark.examples.JavaWordCount" --master spark://NKGV50849583FV1:7077 /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
added by Jerry: loading load-spark-env.sh !!!1
added by Jerry:…
/root/devExpert/spark-1.4.1/conf
added by Jerry, number of Jars: 1
added by Jerry, launch_classpath: /root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar
added by Jerry,RUNNER:/usr/jdk1.7.0_79/bin/java
added by Jerry, printf argument list: org.apache.spark.deploy.SparkSubmit --class org.apache.spark.examples.JavaWordCount --master spark://NKGV50849583FV1:7077 /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
added by Jerry, I am in if-else branch: /usr/jdk1.7.0_79/bin/java -cp /root/devExpert/spark-1.4.1/conf/:/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/root/devExpert/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master spark://NKGV50849583FV1:7077 --class org.apache.spark.examples.JavaWordCount /root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar /root/devExpert/spark-1.4.1/bin/test.txt
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/08/15 14:08:02 INFO SparkContext: Running Spark version 1.4.1
15/08/15 14:08:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/15 14:08:03 WARN Utils: Your hostname, NKGV50849583FV1 resolves to a loopback address: 127.0.0.1; using 10.128.184.131 instead (on interface eth0)
15/08/15 14:08:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/08/15 14:08:03 INFO SecurityManager: Changing view acls to: root
15/08/15 14:08:03 INFO SecurityManager: Changing modify acls to: root
15/08/15 14:08:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/08/15 14:08:04 INFO Slf4jLogger: Slf4jLogger started
15/08/15 14:08:04 INFO Remoting: Starting remoting
15/08/15 14:08:04 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.128.184.131:44792]
15/08/15 14:08:04 INFO Utils: Successfully started service 'sparkDriver' on port 44792.
15/08/15 14:08:04 INFO SparkEnv: Registering MapOutputTracker
15/08/15 14:08:04 INFO SparkEnv: Registering BlockManagerMaster
15/08/15 14:08:04 INFO DiskBlockManager: Created local directory at /tmp/spark-6fc6b901-3ac8-4acd-87aa-352fd22cf8d4/blockmgr-4c660a56-0014-4b1f-81a9-7ac66507b9fa
15/08/15 14:08:04 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
15/08/15 14:08:05 INFO HttpFileServer: HTTP File server directory is /tmp/spark-6fc6b901-3ac8-4acd-87aa-352fd22cf8d4/httpd-b4344651-dbd8-4ba4-be1a-913ae006d839
15/08/15 14:08:05 INFO HttpServer: Starting HTTP Server
15/08/15 14:08:05 INFO Utils: Successfully started service 'HTTP file server' on port 46256.
15/08/15 14:08:05 INFO SparkEnv: Registering OutputCommitCoordinator
15/08/15 14:08:05 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/08/15 14:08:05 WARN QueuedThreadPool: 2 threads could not be stopped
15/08/15 14:08:05 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
15/08/15 14:08:05 WARN Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
15/08/15 14:08:06 WARN Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
15/08/15 14:08:06 WARN Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
15/08/15 14:08:06 INFO Utils: Successfully started service 'SparkUI' on port 4045.
15/08/15 14:08:06 INFO SparkUI: Started SparkUI at http://10.128.184.131:4045
15/08/15 14:08:06 INFO SparkContext: Added JAR file:/root/devExpert/spark-1.4.1/example-java-build/JavaWordCount/target/JavaWordCount-1.jar at http://10.128.184.131:46256/jars/JavaWordCount-1.jar with timestamp 1439618886415
15/08/15 14:08:06 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@NKGV50849583FV1:7077/user/Master...
15/08/15 14:08:06 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150815140806-0003
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor added: app-20150815140806-0003/0 on worker-20150815125648-10.128.184.131-53710 (10.128.184.131:53710) with 8 cores
15/08/15 14:08:06 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150815140806-0003/0 on hostPort 10.128.184.131:53710 with 8 cores, 512.0 MB RAM
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor added: app-20150815140806-0003/1 on worker-20150815125443-10.128.184.131-34423 (10.128.184.131:34423) with 8 cores
15/08/15 14:08:06 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150815140806-0003/1 on hostPort 10.128.184.131:34423 with 8 cores, 512.0 MB RAM
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/0 is now LOADING
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/1 is now LOADING
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/0 is now RUNNING
15/08/15 14:08:06 INFO AppClient$ClientActor: Executor updated: app-20150815140806-0003/1 is now RUNNING
... (intermediate log lines garbled in extraction: MemoryStore storing block broadcast_0_piece0 and the job's execution output) ...
15/08/15 14:08:16 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/08/15 14:08:16 INFO SparkContext: Successfully stopped SparkContext
15/08/15 14:08:16 INFO Utils: Shutdown hook called
15/08/15 14:08:16 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/08/15 14:08:16 INFO Utils: Deleting directory /tmp/spark-6fc6b901-3ac8-4acd-87aa-352fd22cf8d4
15/08/15 14:08:16 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
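The "added by Jerry" debug lines earlier show what bin/spark-class ultimately assembles: the RUNNER (the java binary), a -cp classpath joining conf/ and the assembly jar with ":", JVM memory flags, then org.apache.spark.deploy.SparkSubmit followed by the user's arguments. A rough Python sketch of that assembly, under the assumption that it mirrors the shell script's behavior (function and parameter names here are illustrative, not Spark's):

```python
def build_launch_command(runner, classpath_entries, main_class, args,
                         java_opts=("-Xms512m", "-Xmx512m", "-XX:MaxPermSize=256m")):
    """Compose a `java -cp ...` command line the way spark-class does.

    Illustrative sketch only; the real logic lives in bin/spark-class.
    """
    cmd = [runner, "-cp", ":".join(classpath_entries)]  # classpath entries joined with ':'
    cmd += list(java_opts)                              # JVM memory settings
    cmd.append(main_class)                              # org.apache.spark.deploy.SparkSubmit
    cmd += list(args)                                   # user-supplied spark-submit arguments
    return cmd

cmd = build_launch_command(
    "/usr/jdk1.7.0_79/bin/java",
    ["/root/devExpert/spark-1.4.1/conf/",
     "/root/devExpert/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.4.0.jar"],
    "org.apache.spark.deploy.SparkSubmit",
    ["--master", "spark://NKGV50849583FV1:7077"],
)
print(" ".join(cmd))
```

This matches the shape of the "I am in if-else branch" line above, though the real script also appends the datanucleus jars from lib_managed.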

If one worker is shut down:

For more of Jerry's original articles, follow the WeChat official account "汪子熙".
