Table of contents
1. MapReduce job flow
2. Practice
  2.1 Start Hadoop
  2.2 Create a Java project
  2.3 MapReduce shell
  2.4 MapReduce Web UI
3. MapReduce programming practice: counting attributes of objects

Reference: 《Hadoop大数据原理与应用》 (Principles and Applications of Hadoop Big Data)

1. MapReduce job flow

(The job-flow diagram from the book is not reproduced in this post.)

2. Practice

2.1 Start Hadoop

```shell
start-dfs.sh
start-yarn.sh
mr-jobhistory-daemon.sh start historyserver
# The third command is shown as deprecated; the replacement is:
mapred --daemon start historyserver
```

2.2 Create a Java project

WordCountMapper.java:

```java
package com.michael.mapreduce;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import java.io.IOException;

public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    // Custom map method
    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        String[] words = line.split(" ");
        for (String word : words) {
            // context.write() hands the pair to the next stage: shuffle
            context.write(new Text(word), new IntWritable(1));
        }
    }
}
```

WordCountReducer.java:

```java
package com.michael.mapreduce;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import java.io.IOException;

public class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    // Custom reduce method
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        for (IntWritable value : values) {
            sum += value.get();
        }
        context.write(key, new IntWritable(sum));
    }
}
```

WordCountDriver.java — the driver class configures this job:

```java
package com.michael.mapreduce;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.BZip2Codec;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import java.io.IOException;

public class WordCountDriver {
    // args: input and output paths
    public static void main(String[] args)
            throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        // Enable compression of the map output
        conf.setBoolean("mapreduce.map.output.compress", true);
        // Specify the compression codec for the map output
        conf.setClass("mapreduce.map.output.compress.codec", BZip2Codec.class, CompressionCodec.class);

        Job job = Job.getInstance(conf, "word count diy");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(WordCountMapper.class);
        // Reuse the reducer as the combiner
        job.setCombinerClass(WordCountReducer.class);
        job.setReducerClass(WordCountReducer.class);
        // Map output key/value types
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        // Reduce output key/value types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // Input path
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        // Output path
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Compress the job output as well
        FileOutputFormat.setCompressOutput(job, true);
        // Use the same codec as the map output
        FileOutputFormat.setOutputCompressorClass(job, BZip2Codec.class);

        boolean result = job.waitForCompletion(true);
        System.exit(result ? 0 : 1);
    }
}
```

Export wordcount_diy.jar and submit it to Hadoop:

```shell
hadoop jar /home/dnn/eclipse-workspace/HDFS_example/wordcount_diy.jar com.michael.mapreduce.WordCountDriver /InputDataTest /OutputDataTest1
```

View the result:

```shell
hdfs dfs -cat /OutputDataTest1/part-r-00000.bz2
```

This prints garbage, because the file is BZip2-compressed; download it and decompress it locally instead:

```shell
# Download
hdfs dfs -get /OutputDataTest1/part-r-00000.bz2 /home/dnn/eclipse-workspace/HDFS_example/part-r-00000.bz2
# View
bzcat /home/dnn/eclipse-workspace/HDFS_example/part-r-00000.bz2
```

2.3 MapReduce shell

Check the status of a job:

```shell
mapred job -status job_1615849408082_0001
```

```
[dnn@master Desktop]$ mapred job -status job_1615849408082_0001
WARNING: HADOOP_MAPRED_PID_DIR has been replaced by HADOOP_PID_DIR. Using value of HADOOP_MAPRED_PID_DIR.
2021-03-26 04:25:14,881 INFO client.DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at master/192.168.253.130:8032
2021-03-26 04:25:15,939 INFO mapred.ClientServiceDelegate: Application state is completed. FinalApplicationStatus=SUCCEEDED. Redirecting to job history server

Job: job_1615849408082_0001
Job File: hdfs://192.168.253.130:9000/tmp/hadoop-yarn/staging/history/done/2021/03/24/000000/job_1615849408082_0001_conf.xml
Job Tracking URL : http://master:19888/jobhistory/job/job_1615849408082_0001
Uber job : false
Number of maps: 3
Number of reduces: 1
map() completion: 1.0
reduce() completion: 1.0
Job state: SUCCEEDED
retired: false
reason for failure:
Counters: 54
	File System Counters
		FILE: Number of bytes read=6640
		FILE: Number of bytes written=1072644
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=25631
		HDFS: Number of bytes written=4967
		HDFS: Number of read operations=14
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
		HDFS: Number of bytes read erasure-coded=0
	Job Counters
		Launched map tasks=3
		Launched reduce tasks=1
		Data-local map tasks=3
		Total time spent by all maps in occupied slots (ms)=43801
		Total time spent by all reduces in occupied slots (ms)=5037
		Total time spent by all map tasks (ms)=43801
		Total time spent by all reduce tasks (ms)=5037
		Total vcore-milliseconds taken by all map tasks=43801
		Total vcore-milliseconds taken by all reduce tasks=5037
		Total megabyte-milliseconds taken by all map tasks=44852224
		Total megabyte-milliseconds taken by all reduce tasks=5157888
	Map-Reduce Framework
		Map input records=667
		Map output records=3833
		Map output bytes=40605
		Map output materialized bytes=8455
		Input split bytes=358
		Combine input records=3833
		Combine output records=1264
		Reduce input groups=913
		Reduce shuffle bytes=8455
		Reduce input records=1264
		Reduce output records=913
		Spilled Records=2528
		Shuffled Maps =3
		Failed Shuffles=0
		Merged Map outputs=3
		GC time elapsed (ms)=818
		CPU time spent (ms)=3140
		Physical memory (bytes) snapshot=599461888
		Virtual memory (bytes) snapshot=10950950912
		Total committed heap usage (bytes)=385351680
		Peak Map Physical memory (bytes)=167784448
		Peak Map Virtual memory (bytes)=2735529984
		Peak Reduce Physical memory (bytes)=96972800
		Peak Reduce Virtual memory (bytes)=2744360960
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=25273
	File Output Format Counters
		Bytes Written=4967
```

The counters show the combiner at work: the map tasks emitted 3833 records (Map output records=3833), but after combining only 1264 records crossed the shuffle to the reducer (Combine output records=1264, Reduce input records=1264).

2.4 MapReduce Web UI

The job history server is available at http://192.168.253.130:19888/jobhistory

3. MapReduce programming practice: counting attributes of objects
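The map → shuffle → reduce flow from section 1 can be traced without a cluster. The following standalone sketch mirrors the logic of WordCountMapper and WordCountReducer above, with plain collections standing in for Hadoop's Context and for the framework's shuffle stage; the class and method names are illustrative, not part of the Hadoop API.

```java
import java.util.*;

public class WordCountLocal {
    // map: emit one (word, 1) pair per token, like WordCountMapper
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> out = new ArrayList<>();
        for (String word : line.split(" ")) {
            out.add(new AbstractMap.SimpleEntry<>(word, 1));
        }
        return out;
    }

    // shuffle: group values by key, like the framework's sort/merge stage
    static Map<String, List<Integer>> shuffle(List<Map.Entry<String, Integer>> pairs) {
        Map<String, List<Integer>> groups = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            groups.computeIfAbsent(p.getKey(), k -> new ArrayList<>()).add(p.getValue());
        }
        return groups;
    }

    // reduce: sum the grouped values, like WordCountReducer
    static Map<String, Integer> reduce(Map<String, List<Integer>> groups) {
        Map<String, Integer> result = new TreeMap<>();
        for (Map.Entry<String, List<Integer>> e : groups.entrySet()) {
            int sum = 0;
            for (int v : e.getValue()) sum += v;
            result.put(e.getKey(), sum);
        }
        return result;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : new String[]{"hello world", "hello hadoop"}) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(shuffle(pairs))); // {hadoop=1, hello=2, world=1}
    }
}
```

The TreeMap in the shuffle step also reproduces Hadoop's sorted-by-key ordering of reducer input.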
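One detail worth flagging in the mapper: `line.split(" ")` leaves empty strings in the result when the input contains consecutive spaces, and it does not split on tabs, so the job ends up counting empty "words". A common fix (a suggestion here, not a change the book makes) is to trim and split on runs of whitespace:

```java
public class SplitDemo {
    public static void main(String[] args) {
        String line = "hello  world"; // two spaces between the words

        // The mapper's split: the run of spaces yields an empty token
        String[] naive = line.split(" ");
        System.out.println(naive.length); // 3  ("hello", "", "world")

        // Splitting on runs of whitespace avoids the empty token
        String[] robust = line.trim().split("\\s+");
        System.out.println(robust.length); // 2  ("hello", "world")
    }
}
```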
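The driver reuses the reducer as the combiner (`job.setCombinerClass(WordCountReducer.class)`). That is only safe because summing counts is associative and commutative: pre-aggregating each map task's output before the shuffle gives the same final totals. A small standalone check of that property (illustrative names, no Hadoop dependency):

```java
import java.util.*;

public class CombinerCheck {
    // Sum counts per word, as both the combiner and the reducer do
    static Map<String, Integer> sumByWord(List<String> words) {
        Map<String, Integer> m = new TreeMap<>();
        for (String w : words) m.merge(w, 1, Integer::sum);
        return m;
    }

    // Merge two partial count maps, as the reducer does with combined map outputs
    static Map<String, Integer> mergeCounts(Map<String, Integer> a, Map<String, Integer> b) {
        Map<String, Integer> m = new TreeMap<>(a);
        b.forEach((k, v) -> m.merge(k, v, Integer::sum));
        return m;
    }

    public static void main(String[] args) {
        // Two "map task" outputs
        List<String> split1 = Arrays.asList("hello", "world", "hello");
        List<String> split2 = Arrays.asList("hello", "hadoop");

        // No combiner: reduce over all words at once
        List<String> all = new ArrayList<>(split1);
        all.addAll(split2);
        Map<String, Integer> direct = sumByWord(all);

        // With combiner: pre-aggregate each split, then merge at reduce time
        Map<String, Integer> combined = mergeCounts(sumByWord(split1), sumByWord(split2));

        System.out.println(direct.equals(combined)); // true
    }
}
```

An average, by contrast, would not survive this treatment (an average of averages is not the overall average), which is why a reducer can only double as a combiner for operations like sum, min, or max.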