Build preparation

1. Download the required software

First download the Hadoop 2.8.0 source from the official site and unpack it, then open BUILDING.txt in the extracted directory; the build process and the required software described below follow that document.

(It can be fetched from the command line: wget http://mirrors.hust.edu.cn/apache/hadoop/common/hadoop-2.8.0/hadoop-2.8.0-src.tar.gz)
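
A minimal sketch of unpacking the source and reading BUILDING.txt, assuming the tarball was downloaded into the current directory:

# unpack the source tarball downloaded above
tar zxvf hadoop-2.8.0-src.tar.gz
# read the official build instructions shipped with the source
less hadoop-2.8.0-src/BUILDING.txt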

Requirements:
* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel ( if compiling native hadoop-pipes and to get the best HDFS encryption performance )
* Jansson C XML parsing library ( if compiling libwebhdfs )
* Linux FUSE (Filesystem in Userspace) version 2.6 or above ( if compiling fuse_dfs )
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)

These are the tools required for the build.

They include:

  1. JDK 1.7+
  2. Maven 3.0 or later
  3. FindBugs 1.3.9
  4. ProtocolBuffer 2.5.0
  5. CMake 2.6
  6. zlib-devel
  7. openssl-devel

According to other users' write-ups, autoconf, automake, gcc, and related build tools also need to be installed.

2. Install the software

1> Install JDK 1.7+ and configure its environment variables. This is not repeated here (see the earlier posts for details); a minimal sketch is given below.
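
A minimal sketch of the JDK entries appended to /etc/profile, assuming the JDK was unpacked to /usr/local/java/jdk1.8.0_73 (the path visible in the mvn -version output later; substitute your own install path):

# JDK environment variables (install path assumed; adjust to your own)
export JAVA_HOME=/usr/local/java/jdk1.8.0_73
export PATH=$PATH:$JAVA_HOME/bin

Run source /etc/profile afterwards and check with java -version.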

2> Install the required libraries

yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
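
Optionally, a quick sanity check that the native toolchain is in place (version numbers will differ by system):

gcc --version
cmake --version
openssl version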

3> Install Maven

Unpack it and configure the environment variables (see the earlier posts for the directory layout):

tar zxvf apache-maven-3.3.9-bin.tar.gz
vi /etc/profile

Append the following to the end of the profile:

export MAVEN_HOME=/home/toto/software/apache-maven-3.3.9
export MAVEN_OPTS="-Xms256m -Xmx512m"  (optional)
export PATH=$PATH:$MAVEN_HOME/bin

Then go into /home/toto/software/apache-maven-3.3.9/conf and edit settings.xml to change the location of the local Maven repository:

<localRepository>/home/toto/software/repo</localRepository>
Save the file and reload it with source /etc/profile, then run mvn -version. Output like the following means Maven is installed and configured correctly.

[root@hadoop ~]# mvn -version
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /home/toto/software/apache-maven-3.3.9
Java version: 1.8.0_73, vendor: Oracle Corporation
Java home: /usr/local/java/jdk1.8.0_73/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-573.el6.x86_64", arch: "amd64", family: "unix"
[root@hadoop ~]#

4> Install ProtocolBuffer

Unpack, build, and install it:

tar zxvf protobuf-2.5.0.tar.gz
cd /home/toto/software/protobuf-2.5.0
./configure
make
make install
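
If protoc later fails with a missing libprotoc shared library, refreshing the linker cache usually helps. This step is not in the original post and assumes the default /usr/local install prefix:

# refresh the shared-library cache so libprotoc under /usr/local/lib is found (run as root)
ldconfig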

Run protoc --version. Output like the following means ProtocolBuffer is installed correctly.

[hadoop@Master ~]$ protoc --version
libprotoc 2.5.0
[hadoop@Master ~]$

5> Install FindBugs

Unpack it and configure the environment variables:

unzip findbugs-1.3.9.zip
vi /etc/profile

Append the following to the end of the profile:

export FINDBUGS_HOME=/home/toto/software/findbugs-1.3.9
export PATH=$PATH:$FINDBUGS_HOME/bin

Save the file and reload it with source /etc/profile, then run findbugs -version. Output like the following means FindBugs is installed and configured correctly.

[hadoop@Master ~]$ findbugs -version
1.3.9
[hadoop@Master ~]$ 

Start the build

First make sure the machine has Internet access (see the earlier post on getting a VM online) and keep the network up for the whole build. Then go into the extracted hadoop-2.8.0 source directory and run:

cd /home/toto/software/hadoop-2.8.0-src
mvn package -Pdist,native -DskipTests -Dtar

or, alternatively:

cd /home/toto/software/hadoop-2.8.0-src
mvn package -Pdist,native,docs,src -DskipTests -Dtar

The first command builds only the code (including the native libraries); the second also builds the documentation and source packages, so the first is faster.

Then comes a long wait. When output like the following appears, the build has succeeded.

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  3.533 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.023 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  3.679 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.275 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  2.875 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  4.856 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  4.340 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  4.534 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  5.398 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [03:02 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 11.653 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 24.501 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.112 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [07:28 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 41.608 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 10.673 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  7.225 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.057 s]
[INFO] hadoop-yarn ........................................ SUCCESS [  0.110 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [03:36 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 45.418 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [  0.164 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 12.942 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 19.200 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.315 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  7.855 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 24.347 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  6.439 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [  6.393 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [  3.445 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.075 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.304 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.026 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [  0.155 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [  7.255 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 11.871 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.254 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 26.029 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 25.002 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  3.792 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  7.797 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  5.143 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [  6.771 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.837 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.513 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [  6.842 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  4.355 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 14.910 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.844 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  6.931 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  3.937 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.499 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.268 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.739 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  5.793 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  4.444 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [  4.258 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 47.689 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 19.524 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.305 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  5.581 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 25.708 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.281 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:53 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24:49 min
[INFO] Finished at: 2015-12-11T20:29:45+08:00
[INFO] Final Memory: 110M/493M
[INFO] ------------------------------------------------------------------------

The built package is at hadoop-dist/target/hadoop-2.8.0.tar.gz under the source directory.

To see the final build output:

cd /home/toto/software/hadoop-2.8.0-src/hadoop-dist/target
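
One way to verify that the native libraries actually made it into the distribution (not from the original post, but hadoop checknative is a standard subcommand in this version) is to unpack the tarball and run:

tar zxvf hadoop-2.8.0.tar.gz
cd hadoop-2.8.0
bin/hadoop checknative -a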

Problems encountered during the build

Error 1

Connection to http://repo.maven.apache.org refused

This means the connection to the remote Maven repository was refused. Just re-run the build command and it will resume downloading the jar packages.
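
To avoid rebuilding modules that already succeeded when re-running, Maven's resume flag can restart from a given module. The module name below is only an illustration; take the real one from the Reactor Summary of the failed run:

mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-hdfs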

Error 2

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-nfs: Compilation failure: Compilation failure:
[ERROR] /home/hadoop/toolkits/hadoop-2.7.1-src/hadoop-common-project/hadoop-nfs/src/main/java/org/apache/hadoop/oncrpc/XDR.java:[23,30] package org.jboss.netty.buffer does not exist

This error is probably rare. It happened because repo.maven.apache.org felt slow and I switched to a third-party mirror, misconfiguring Maven's settings.xml in the process. Going back to the default repository fixed it, slow as it is.

Error 3

[ERROR] around Ant part ...<exec dir="/opt/soft/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target" executable="sh" failonerror="true">... @ 10:123 in /opt/soft/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/antrun/build-main.xml
[ERROR] -> [Help 1]

This happens because the Tomcat archive apache-tomcat-6.0.41.tar.gz is large and was not downloaded completely. Go to .../hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/apache-tomcat-6.0.41.tar.gz, delete the incomplete file, and let the build download it again.
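
A sketch of the fix, run from the source root (the relative downloads path is taken from the error above; adjust to your own source directory):

rm hadoop-hdfs-project/hadoop-hdfs-httpfs/downloads/apache-tomcat-6.0.41.tar.gz
mvn package -Pdist,native -DskipTests -Dtar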

Tips:
1. Sometimes downloading a dependency takes forever because the connection to the repository hangs. Press Ctrl+C and re-run the build command.
2. If the build complains that some file is missing, clean the Maven build first (mvn clean) and then recompile, as sketched below.
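
A minimal sketch of that clean-and-rebuild sequence, run from the source directory:

mvn clean
mvn package -Pdist,native -DskipTests -Dtar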

====================================

Closing notes:

It took quite a while of struggling before the build finally succeeded; maybe I am just not very good at this yet. If your build fails, feel free to leave a comment and discuss.

(Reposted from: http://blog.csdn.net/young_kim1/article/details/50324345)
