1. What is MapReduce
Hadoop MapReduce is a software framework for easily writing applications that process multi-terabyte datasets in parallel, in a reliable and fault-tolerant manner, on large clusters of thousands of commodity machines. The key phrases in this definition are:
software framework, parallel processing, reliability and fault tolerance, large clusters, and massive datasets.
2. What does MapReduce do
MapReduce excels at processing big data. Where does this ability come from? It follows directly from MapReduce's design philosophy: divide and conquer.
(1) The Mapper is responsible for "dividing": it breaks a complex task into a number of "simple tasks". "Simple" here has three implications:
first, the scale of the data or computation is greatly reduced compared with the original task; second, computation moves to the data, i.e. each task is scheduled on a node that stores the data it needs; third, these small tasks can run in parallel with almost no dependencies between them.
(2) The Reducer aggregates the results of the map phase. The number of Reducers is up to the user: set the mapred.reduce.tasks parameter in the mapred-site.xml configuration file (the default is 1; in newer Hadoop versions the property is named mapreduce.job.reduces).
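As an illustration, the cluster-wide default could be raised in mapred-site.xml like this (the value 4 below is just an example, not a recommendation):

```xml
<property>
  <name>mapred.reduce.tasks</name>
  <value>4</value>
</property>
```

A single job can also override this in code with `job.setNumReduceTasks(4)`.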
A vivid way to explain MapReduce:
We want to count all the books in a library. You count shelf 1 and I count shelf 2: that is "Map". The more people we have, the faster the counting goes. Then we gather together and add up everyone's tallies: that is "Reduce".
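The book-counting analogy can be sketched in plain Java (no Hadoop involved; the shelf data and helper names are made up for illustration):

```java
import java.util.Arrays;
import java.util.List;

public class LibraryCount {
    // "Map": each worker counts one shelf independently
    static int countShelf(List<String> shelf) {
        return shelf.size();
    }

    // "Reduce": add all the per-shelf tallies together
    static int reduceTotals(int[] tallies) {
        int total = 0;
        for (int t : tallies) total += t;
        return total;
    }

    public static void main(String[] args) {
        List<String> shelf1 = Arrays.asList("book A", "book B", "book C");
        List<String> shelf2 = Arrays.asList("book D", "book E");
        int[] tallies = {countShelf(shelf1), countShelf(shelf2)}; // map phase
        System.out.println(reduceTotals(tallies));                // reduce phase → 5
    }
}
```

The map steps are independent of each other, which is exactly why adding more counters speeds things up.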
The MapReduce pipeline
- InputFormat: the input is first read in through an InputFormat
- InputSplit: the input is then divided into splits
- RecordReader (RR for short): a RecordReader reads records out of each split
- map: the map function processes each record and emits intermediate results
- Combiner: an optional local reduce run on each map node first; it cuts down I/O and improves job performance, but it has a drawback: for operations such as a global average, combining locally gives wrong results
- shuffle - partitioner: the partitioner decides which reducer each intermediate key is sent to
- shuffle - sort: intermediate data is sorted by key during the shuffle
- reduce: the reduce function aggregates the values for each key
- OutputFormat: the final results are written out through an OutputFormat
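The Combiner caveat in the list above can be demonstrated with a small plain-Java example (made-up numbers): averaging per-node averages is not the same as a global average, which is why a combiner is only safe for operations that combine associatively, such as sums and counts.

```java
public class CombinerPitfall {
    // Wrong "combiner": average each node's values locally, then average the averages
    static double averageOfAverages(int[] a, int[] b) {
        return (avg(a) + avg(b)) / 2.0;
    }

    // Right: only sums and counts are combined; the division happens once at the end
    static double globalAverage(int[] a, int[] b) {
        return (double) (sum(a) + sum(b)) / (a.length + b.length);
    }

    static int sum(int[] xs) { int s = 0; for (int x : xs) s += x; return s; }
    static double avg(int[] xs) { return (double) sum(xs) / xs.length; }

    public static void main(String[] args) {
        int[] node1 = {1, 2, 3};  // values seen by one map node
        int[] node2 = {10, 20};   // values seen by another map node
        System.out.println(averageOfAverages(node1, node2)); // 8.5 — wrong
        System.out.println(globalAverage(node1, node2));     // 7.2 — correct
    }
}
```

A correct averaging job would therefore have the combiner emit (sum, count) pairs and let the reducer do the final division.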
The input and output of MapReduce
The MapReduce framework operates on <key,value> pairs: it treats the input of a job as a set of <key,value> pairs and likewise produces a set of <key,value> pairs as the job's output, and the two sets may be of different types.
In fact, three sets of <key,value> pair types appear across the whole pipeline: the input pairs read by map, the intermediate pairs emitted by map, and the output pairs emitted by reduce.
The MapReduce processing flow
Taking WordCount as the example, here is what the map and reduce phases each need to do. WordCount's job is to count the number of occurrences of each word across a set of text files.
Writing a simple WordCount MapReduce job
Writing the Mapper
```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

/**
 * KEYIN:    type of the key the map task reads — the byte offset of each line (Long)
 * VALUEIN:  type of the value the map task reads — one line of text (String)
 * KEYOUT:   key type the map method emits
 * VALUEOUT: value type the map method emits
 *
 * For input like:
 *   hello world welcome
 *   hello welcome
 * the map output is (String, int) pairs such as (world, 1).
 * Hadoop provides its own types (Text, IntWritable, ...) that support
 * serialization and deserialization.
 */
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    // Custom map: cut out the data we need and hand it to the next stage
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Words on a line are separated by whitespace
        String[] words = value.toString().split("\\s+");
        for (String word : words) {
            context.write(new Text(word), new IntWritable(1));
        }
    }
}
```

(Note: the split delimiter must be whitespace to match the space-separated sample input, not "-".)
Writing the Reducer
```java
import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

/**
 * The reducer's input is the mapper's output.
 * KEYIN, VALUEIN, KEYOUT, VALUEOUT: input is (word, 1), output is e.g. (word, 3) —
 * both are (String, int) pairs.
 */
public class WordCountReduce extends Reducer<Text, IntWritable, Text, IntWritable> {
    /**
     * @param key     the word
     * @param values  iterable values; all records with the same key are sent to
     *                the same reducer, e.g. (hello, <1,1,1,1>)
     */
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int count = 0;
        Iterator<IntWritable> iterator = values.iterator();
        while (iterator.hasNext()) {
            IntWritable value = iterator.next();
            count += value.get(); // accumulate
        }
        context.write(key, new IntWritable(count));
    }
}
```
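The end-to-end behavior of the mapper and reducer can be simulated in plain Java without Hadoop (the shuffle's group-by-key step is modeled here with a TreeMap; class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class WordCountSim {
    // Simulate map + shuffle + reduce over a list of input lines
    static Map<String, Integer> wordCount(List<String> lines) {
        // map + shuffle: emit (word, 1) pairs and group them by key
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (String line : lines) {
            for (String word : line.split("\\s+")) {
                grouped.computeIfAbsent(word, w -> new ArrayList<>()).add(1);
            }
        }
        // reduce: sum the value list for each key
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, List<Integer>> e : grouped.entrySet()) {
            int sum = 0;
            for (int one : e.getValue()) sum += one;
            counts.put(e.getKey(), sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> lines = List.of("hello world welcome", "hello welcome");
        System.out.println(wordCount(lines)); // {hello=2, welcome=2, world=1}
    }
}
```

In real Hadoop the grouping and sorting happen across machines during the shuffle, but the data flow is the same.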
Creating and running the Job
```java
// On Windows, hadoop.home.dir must point at a local Hadoop installation
System.setProperty("hadoop.home.dir", "D:\\javaroot\\soft\\hadoop-2.6.0-cdh5.15.1");
// The Hadoop user to run as
System.setProperty("HADOOP_USER_NAME", "hadoop");

Configuration configuration = new Configuration();
configuration.set("fs.defaultFS", "hdfs://192.168.1.100:8020");

// Create the job
Job job = Job.getInstance(configuration);
// Main class of the job
job.setJarByClass(App.class);

// Add a Combiner (word counting is a sum, so the reducer can double as the combiner)
job.setCombinerClass(WordCountReduce.class);

// Custom mapper and reducer classes
job.setMapperClass(WordCountMapper.class);
job.setReducerClass(WordCountReduce.class);

// Map output types
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(IntWritable.class);
// Reduce output types
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);

// Input and output paths (the output directory must not already exist)
FileInputFormat.setInputPaths(job, new Path("/demo/wordcount/input"));
FileOutputFormat.setOutputPath(job, new Path("/demo/wordcount/output"));

// Submit the job and wait for completion
boolean res = job.waitForCompletion(true);
System.exit(res ? 0 : 1);
```
Example: parse a traffic log and compute each phone number's upstream, downstream, and total traffic
The log data:
1363157985066 13726230503 00-FD-07-A4-72-B8:CMCC 120.196.100.82 i02.c.aliimg.com 24 27 2481 24681 200
1363157995052 13826544101 5C-0E-8B-C7-F1-E0:CMCC 120.197.40.4 4 0 264 0 200
1363157991076 13926435656 20-10-7A-28-CC-0A:CMCC 120.196.100.99 2 4 132 1512 200
1363154400022 13926251106 5C-0E-8B-8B-B1-50:CMCC 120.197.40.4 4 0 240 0 200
1363157993044 18211575961 94-71-AC-CD-E6-18:CMCC-EASY 120.196.100.99 iface.qiyi.com 視頻網站 15 12 1527 2106 200
1363157995074 84138413 5C-0E-8B-8C-E8-20:7DaysInn 120.197.40.4 122.72.52.12 20 16 4116 1432 200
1363157993055 13560439658 C4-17-FE-BA-DE-D9:CMCC 120.196.100.99 18 15 1116 954 200
1363157995033 15920133257 5C-0E-8B-C7-BA-20:CMCC 120.197.40.4 sug.so.360.cn 信息安全 20 20 3156 2936 200
1363157983019 13719199419 68-A1-B7-03-07-B1:CMCC-EASY 120.196.100.82 4 0 240 0 200
1363157984041 13660577991 5C-0E-8B-92-5C-20:CMCC-EASY 120.197.40.4 s19.cnzz.com 站點統計 24 9 6960 690 200
1363157973098 15013685858 5C-0E-8B-C7-F7-90:CMCC 120.197.40.4 rank.ie.sogou.com 搜索引擎 28 27 3659 3538 200
1363157986029 15989002119 E8-99-C4-4E-93-E0:CMCC-EASY 120.196.100.99 www.umeng.com 站點統計 3 3 1938 180 200
1363157992093 13560439658 C4-17-FE-BA-DE-D9:CMCC 120.196.100.99 15 9 918 4938 200
1363157986041 13480253104 5C-0E-8B-C7-FC-80:CMCC-EASY 120.197.40.4 3 3 180 180 200
1363157984040 13602846565 5C-0E-8B-8B-B6-00:CMCC 120.197.40.4 2052.flash2-http.qq.com 綜合門戶 15 12 1938 2910 200
1363157995093 13922314466 00-FD-07-A2-EC-BA:CMCC 120.196.100.82 img.qfc.cn 12 12 3008 3720 200
1363157982040 13502468823 5C-0A-5B-6A-0B-D4:CMCC-EASY 120.196.100.99 y0.ifengimg.com 綜合門戶 57 102 7335 110349 200
1363157986072 18320173382 84-25-DB-4F-10-1A:CMCC-EASY 120.196.100.99 input.shouji.sogou.com 搜索引擎 21 18 9531 2412 200
1363157990043 13925057413 00-1F-64-E1-E6-9A:CMCC 120.196.100.55 t3.baidu.com 搜索引擎 69 63 11058 48243 200
1363157988072 13760778710 00-FD-07-A4-7B-08:CMCC 120.196.100.82 2 2 120 120 200
1363157985066 13726238888 00-FD-07-A4-72-B8:CMCC 120.196.100.82 i02.c.aliimg.com 24 27 2481 24681 200
1363157993055 13560436666 C4-17-FE-BA-DE-D9:CMCC 120.196.100.99 18 15 1116 954 200
1363157985066 13726238888 00-FD-07-A4-72-B8:CMCC 120.196.100.82 i02.c.aliimg.com 24 27 10000 20000 200
The implementation:
```java
// Mapper: split each log line and pick out the three fields we need
public class AccessMapper extends Mapper<LongWritable, Text, Text, Access> {
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] lines = value.toString().split("\t");
        String phone = lines[1];
        // the field count varies per record, so index the traffic fields from the end
        long up = Long.parseLong(lines[lines.length - 3]);
        long down = Long.parseLong(lines[lines.length - 2]);
        context.write(new Text(phone), new Access(phone, up, down, up + down));
    }
}

// Reducer: sum upstream and downstream traffic per phone number
public class AccessReduce extends Reducer<Text, Access, NullWritable, Access> {
    /**
     * @param key    the phone number
     * @param values the Access records for that phone
     */
    @Override
    protected void reduce(Text key, Iterable<Access> values, Context context)
            throws IOException, InterruptedException {
        long ups = 0;
        long downs = 0;
        for (Access access : values) {
            ups += access.getUp();
            downs += access.getDown();
        }
        context.write(NullWritable.get(), new Access(key.toString(), ups, downs, ups + downs));
    }
}

// Job setup and submission
public static void main(String[] args) throws Exception {
    // On Windows, hadoop.home.dir must point at a local Hadoop installation
    System.setProperty("hadoop.home.dir", "D:\\javaroot\\soft\\hadoop-2.6.0-cdh5.15.1");
    // System.setProperty("HADOOP_USER_NAME", "hadoop");
    Configuration configuration = new Configuration();
    // configuration.set("fs.defaultFS", "hdfs://192.168.1.100:8020");

    Job job = Job.getInstance(configuration);
    job.setJarByClass(App.class);
    job.setMapperClass(AccessMapper.class);
    job.setReducerClass(AccessReduce.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(Access.class);
    job.setOutputKeyClass(NullWritable.class);
    job.setOutputValueClass(Access.class);

    // Input and output paths
    FileInputFormat.setInputPaths(job, new Path("input"));
    FileOutputFormat.setOutputPath(job, new Path("output"));

    // Submit the job
    boolean res = job.waitForCompletion(true);
    System.exit(res ? 0 : 1);
}
```
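The Access class used above is not shown in the original. A minimal sketch might look as follows (field names are inferred from the code; in the real job the class must declare `implements org.apache.hadoop.io.Writable`, since Access values travel between map and reduce, but the method bodies stay the same):

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

public class Access {
    private String phone;
    private long up;
    private long down;
    private long sum;

    public Access() { }  // Hadoop needs a no-arg constructor for deserialization

    public Access(String phone, long up, long down, long sum) {
        this.phone = phone;
        this.up = up;
        this.down = down;
        this.sum = sum;
    }

    public long getUp()   { return up; }
    public long getDown() { return down; }

    // Serialize the fields in a fixed order
    public void write(DataOutput out) throws IOException {
        out.writeUTF(phone);
        out.writeLong(up);
        out.writeLong(down);
        out.writeLong(sum);
    }

    // Deserialize the fields in the same order they were written
    public void readFields(DataInput in) throws IOException {
        phone = in.readUTF();
        up = in.readLong();
        down = in.readLong();
        sum = in.readLong();
    }

    @Override
    public String toString() {
        return "phone='" + phone + "', up=" + up + ", down=" + down + ", sum=" + sum;
    }
}
```

The toString format here is chosen to match the result listing of the job.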
The final output:
phone='13480253104', up=180, down=180, sum=360
phone='13502468823', up=7335, down=110349, sum=117684
phone='13560436666', up=1116, down=954, sum=2070
phone='13560439658', up=2034, down=5892, sum=7926
phone='13602846565', up=1938, down=2910, sum=4848
phone='13660577991', up=6960, down=690, sum=7650
phone='13719199419', up=240, down=0, sum=240
phone='13726230503', up=2481, down=24681, sum=27162
phone='13726238888', up=12481, down=44681, sum=57162
phone='13760778710', up=120, down=120, sum=240
phone='13826544101', up=264, down=0, sum=264
phone='13922314466', up=3008, down=3720, sum=6728
phone='13925057413', up=11058, down=48243, sum=59301
phone='13926251106', up=240, down=0, sum=240
phone='13926435656', up=132, down=1512, sum=1644
phone='15013685858', up=3659, down=3538, sum=7197
phone='15920133257', up=3156, down=2936, sum=6092
phone='15989002119', up=1938, down=180, sum=2118
phone='18211575961', up=1527, down=2106, sum=3633
phone='18320173382', up=9531, down=2412, sum=11943
phone='84138413', up=4116, down=1432, sum=5548