Hadoop Reading Notes (6): A MapReduce Custom Data Type Demo
Hadoop Reading Notes (1) Introduction to Hadoop: http://blog.csdn.net/caicongyang/article/details/39898629
Hadoop Reading Notes (2) HDFS shell operations: http://blog.csdn.net/caicongyang/article/details/41253927
Hadoop Reading Notes (3) Operating HDFS with the Java API: http://blog.csdn.net/caicongyang/article/details/41290955
Hadoop Reading Notes (4) HDFS architecture: http://blog.csdn.net/caicongyang/article/details/41322649
Hadoop Reading Notes (5) MapReduce word-count demo: http://blog.csdn.net/caicongyang/article/details/41453579
1. Demo description
Compute per-phone mobile traffic statistics from a given log file, using a custom Writable value type.
2. Sample log file
1363157985066 13726230503 00-FD-07-A4-72-B8:CMCC 120.196.100.82 i02.c.aliimg.com 24 27 2481 24681 200
1363157995052 13826544101 5C-0E-8B-C7-F1-E0:CMCC 120.197.40.4 4 0 264 0 200
1363157991076 13926435656 20-10-7A-28-CC-0A:CMCC 120.196.100.99 2 4 132 1512 200
1363154400022 13926251106 5C-0E-8B-8B-B1-50:CMCC 120.197.40.4 4 0 240 0 200
1363157993044 18211575961 94-71-AC-CD-E6-18:CMCC-EASY 120.196.100.99 iface.qiyi.com 视频网站 15 12 1527 2106 200
1363157995074 84138413 5C-0E-8B-8C-E8-20:7DaysInn 120.197.40.4 122.72.52.12 20 16 4116 1432 200
1363157993055 13560439658 C4-17-FE-BA-DE-D9:CMCC 120.196.100.99 18 15 1116 954 200
1363157995033 15920133257 5C-0E-8B-C7-BA-20:CMCC 120.197.40.4 sug.so.360.cn 信息安全 20 20 3156 2936 200
1363157983019 13719199419 68-A1-B7-03-07-B1:CMCC-EASY 120.196.100.82 4 0 240 0 200
1363157984041 13660577991 5C-0E-8B-92-5C-20:CMCC-EASY 120.197.40.4 s19.cnzz.com 站点统计 24 9 6960 690 200
1363157973098 15013685858 5C-0E-8B-C7-F7-90:CMCC 120.197.40.4 rank.ie.sogou.com 搜索引擎 28 27 3659 3538 200
1363157986029 15989002119 E8-99-C4-4E-93-E0:CMCC-EASY 120.196.100.99 www.umeng.com 站点统计 3 3 1938 180 200
1363157992093 13560439658 C4-17-FE-BA-DE-D9:CMCC 120.196.100.99 15 9 918 4938 200
1363157986041 13480253104 5C-0E-8B-C7-FC-80:CMCC-EASY 120.197.40.4 3 3 180 180 200
1363157984040 13602846565 5C-0E-8B-8B-B6-00:CMCC 120.197.40.4 2052.flash2-http.qq.com 综合门户 15 12 1938 2910 200
1363157995093 13922314466 00-FD-07-A2-EC-BA:CMCC 120.196.100.82 img.qfc.cn 12 12 3008 3720 200
1363157982040 13502468823 5C-0A-5B-6A-0B-D4:CMCC-EASY 120.196.100.99 y0.ifengimg.com 综合门户 57 102 7335 110349 200
1363157986072 18320173382 84-25-DB-4F-10-1A:CMCC-EASY 120.196.100.99 input.shouji.sogou.com 搜索引擎 21 18 9531 2412 200
1363157990043 13925057413 00-1F-64-E1-E6-9A:CMCC 120.196.100.55 t3.baidu.com 搜索引擎 69 63 11058 48243 200
1363157988072 13760778710 00-FD-07-A4-7B-08:CMCC 120.196.100.82 2 2 120 120 200
1363157985079 13823070001 20-7C-8F-70-68-1F:CMCC 120.196.100.99 6 3 360 180 200
1363157985069 13600217502 00-1F-64-E2-E8-B1:CMCC 120.196.100.55 18 138 1080 186852 200
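The mapper below splits each line on tab characters and reads the phone number from index 1 and the up/down packet and byte counts from indices 6-9; lines like the `iface.qiyi.com` record above have eleven tab-separated fields (timestamp, phone, MAC, IP, host, site type, upPackNum, downPackNum, upPayLoad, downPayLoad, status), with host/type left empty on some records. As a minimal sketch of that parsing (the blog rendering has collapsed the tabs, so the sample line here is reconstructed from the record above):

```java
public class LogFieldDemo {
    // Field indices assumed by the mapper: 1 = phone number,
    // 6 = upPackNum, 7 = downPackNum, 8 = upPayLoad, 9 = downPayLoad.
    public static String[] fields(String line) {
        // limit -1 keeps empty fields, which matters when host/site type are blank
        return line.split("\t", -1);
    }

    public static void main(String[] args) {
        String line = String.join("\t",
                "1363157993044", "18211575961", "94-71-AC-CD-E6-18:CMCC-EASY",
                "120.196.100.99", "iface.qiyi.com", "视频网站",
                "15", "12", "1527", "2106", "200");
        String[] f = fields(line);
        System.out.println(f[1] + " up=" + f[8] + " down=" + f[9]);
    }
}
```

Note that with a plain `split("\t")` trailing empty fields would be dropped; passing `-1` as the limit keeps them, so the fixed indices stay valid even for records with blank columns.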
3. Code
KpiApp.java
package mapReduce;

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.mapreduce.lib.partition.HashPartitioner;

/**
 * Title: KpiApp.java
 * Package: mapReduce
 * Description: per-phone traffic statistics
 *
 * @author Tom.Cai
 * @created 2014-11-25 22:23:33
 * @version V1.0
 */
public class KpiApp {
    private static final String INPUT_PATH = "hdfs://192.168.80.100:9000/wlan";
    private static final String OUT_PATH = "hdfs://192.168.80.100:9000/wlan_out";

    public static void main(String[] args) throws Exception {
        // Delete the output directory if it already exists; otherwise the job fails.
        FileSystem fileSystem = FileSystem.get(new URI(INPUT_PATH), new Configuration());
        Path outPath = new Path(OUT_PATH);
        if (fileSystem.exists(outPath)) {
            fileSystem.delete(outPath, true);
        }

        Job job = new Job(new Configuration(), KpiApp.class.getSimpleName());

        FileInputFormat.setInputPaths(job, INPUT_PATH);
        job.setInputFormatClass(TextInputFormat.class);

        job.setMapperClass(KpiMapper.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(KpiWite.class);

        job.setPartitionerClass(HashPartitioner.class);
        job.setNumReduceTasks(1);

        job.setReducerClass(KpiReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(KpiWite.class);

        FileOutputFormat.setOutputPath(job, new Path(OUT_PATH));
        job.setOutputFormatClass(TextOutputFormat.class);

        job.waitForCompletion(true);
    }

    static class KpiMapper extends Mapper<LongWritable, Text, Text, KpiWite> {
        @Override
        protected void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            // Tab-separated fields: index 1 is the phone number,
            // indices 6-9 are the up/down packet and byte counts.
            String[] splited = value.toString().split("\t");
            String num = splited[1];
            KpiWite kpi = new KpiWite(splited[6], splited[7], splited[8], splited[9]);
            context.write(new Text(num), kpi);
        }
    }

    static class KpiReducer extends Reducer<Text, KpiWite, Text, KpiWite> {
        @Override
        protected void reduce(Text key, Iterable<KpiWite> value, Context context) throws IOException, InterruptedException {
            // Sum the four counters across all records for this phone number.
            long upPackNum = 0L;
            long downPackNum = 0L;
            long upPayLoad = 0L;
            long downPayLoad = 0L;
            for (KpiWite kpi : value) {
                upPackNum += kpi.upPackNum;
                downPackNum += kpi.downPackNum;
                upPayLoad += kpi.upPayLoad;
                downPayLoad += kpi.downPayLoad;
            }
            context.write(key, new KpiWite(String.valueOf(upPackNum), String.valueOf(downPackNum),
                    String.valueOf(upPayLoad), String.valueOf(downPayLoad)));
        }
    }
}

class KpiWite implements Writable {
    long upPackNum;
    long downPackNum;
    long upPayLoad;
    long downPayLoad;

    public KpiWite() {
    }

    public KpiWite(String upPackNum, String downPackNum, String upPayLoad, String downPayLoad) {
        this.upPackNum = Long.parseLong(upPackNum);
        this.downPackNum = Long.parseLong(downPackNum);
        this.upPayLoad = Long.parseLong(upPayLoad);
        this.downPayLoad = Long.parseLong(downPayLoad);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        // Must read fields in exactly the order write() emits them.
        this.upPackNum = in.readLong();
        this.downPackNum = in.readLong();
        this.upPayLoad = in.readLong();
        this.downPayLoad = in.readLong();
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeLong(upPackNum);
        out.writeLong(downPackNum);
        out.writeLong(upPayLoad);
        out.writeLong(downPayLoad);
    }

    // Fix: without overriding toString(), TextOutputFormat would print the
    // object reference (e.g. "mapReduce.KpiWite@1a2b3c") instead of the counters.
    @Override
    public String toString() {
        return upPackNum + "\t" + downPackNum + "\t" + upPayLoad + "\t" + downPayLoad;
    }
}
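The heart of the custom type is the Writable contract: write() and readFields() must serialize and deserialize the fields in the same order, because Hadoop streams raw bytes between map and reduce with no field names or tags. This round-trip can be sketched in plain Java (a stand-in class with the same fields and logic as KpiWite, so it runs without Hadoop on the classpath):

```java
import java.io.*;

public class WritableRoundTrip {
    // Stand-in for KpiWite: same four counters, same write/readFields order.
    static class Kpi {
        long upPackNum, downPackNum, upPayLoad, downPayLoad;

        void write(DataOutput out) throws IOException {
            out.writeLong(upPackNum);
            out.writeLong(downPackNum);
            out.writeLong(upPayLoad);
            out.writeLong(downPayLoad);
        }

        void readFields(DataInput in) throws IOException {
            // Read in exactly the order written above.
            upPackNum = in.readLong();
            downPackNum = in.readLong();
            upPayLoad = in.readLong();
            downPayLoad = in.readLong();
        }
    }

    static byte[] serialize(Kpi k) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        k.write(new DataOutputStream(bos));
        return bos.toByteArray();  // four longs -> 32 bytes
    }

    static Kpi deserialize(byte[] bytes) throws IOException {
        Kpi k = new Kpi();
        k.readFields(new DataInputStream(new ByteArrayInputStream(bytes)));
        return k;
    }

    public static void main(String[] args) throws IOException {
        Kpi k = new Kpi();
        k.upPackNum = 15; k.downPackNum = 12; k.upPayLoad = 1527; k.downPayLoad = 2106;
        Kpi back = deserialize(serialize(k));
        System.out.println("upPayLoad after round trip: " + back.upPayLoad);
    }
}
```

If the read order in readFields() ever diverges from the write order, the values silently land in the wrong fields, which is the most common bug with hand-written Writables.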
Everyone is welcome to discuss and learn together!
If you find this useful, save it!
Record and share, and we grow together! Feel free to check out my other posts; my blog: http://blog.csdn.net/caicongyang