  • Installing the Hadoop plugin for Eclipse

I suspect many people have never heard of the ZModem protocol, let alone the handy rz/sz tools built on it. Good things shouldn't be kept to oneself, so here is the little I know. The following paragraph is copied from SecureCRT's help:

ZModem is a full-duplex file transfer protocol that supports fast data transfer rates and effective error detection. ZModem is very user friendly, allowing either the sending or receiving party to initiate a file transfer. ZModem supports multiple file ("batch") transfers, and allows the use of wildcards when specifying filenames. ZModem also supports resuming most prior ZModem file transfer attempts.

rz and sz are the command-line tools for ZModem file transfers between Linux/Unix and Windows. The Windows side needs a telnet/ssh client that supports ZModem; SecureCRT works. Log in to the Unix/Linux host with SecureCRT (telnet or ssh both work):
  • Run rz to receive a file: SecureCRT pops up a file-selection dialog; choose the file and close the dialog, and the file is uploaded to the current directory.
  • Run sz file1 file2 to send files to Windows (the save directory is configurable).
This is far more convenient than FTP, and the server no longer needs to run an FTP service.
PS: On Linux, rz/sz come from installing the lrzsz-x.x.xx.rpm package; on Unix you can build them from source; for Solaris SPARC, binaries can be downloaded from sunfreeware.
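As a quick illustration, a session might look like this (the file names are made up for the example):

    $ rz                      # pick a Windows-side file in the SecureCRT dialog; it lands in the current directory
    $ sz access.log stats.csv # sends both files down to the configured Windows directory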

     

If you installed hadoop-0.20.2, the Eclipse plugin is located under /home/hadoop/hadoop-0.20.2/contrib/eclipse-plugin.
If you installed hadoop-0.21.0, it is at /home/hadoop/hadoop-0.21.0/mapred/contrib/eclipse/hadoop-0.21.0-eclipse-plugin.jar.

Copy hadoop-0.21.0-eclipse-plugin.jar into the plugins directory of your Eclipse installation, and Eclipse will recognize it automatically, for example as shown below.
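For example (the Eclipse installation path here is an assumption; substitute your own):

    cp /home/hadoop/hadoop-0.21.0/mapred/contrib/eclipse/hadoop-0.21.0-eclipse-plugin.jar /opt/eclipse/plugins/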

     

The environment on this machine is as follows:

    Eclipse 3.6

    Hadoop-0.20.2

    Hive-0.5.0-dev

1. Install the hadoop-0.20.2-eclipse-plugin. Note: the /hadoop-0.20.2/contrib/eclipse-plugin/hadoop-0.20.2-eclipse-plugin.jar bundled with Hadoop has problems under Eclipse 3.6 and cannot run jobs on the Hadoop server; a working build can be downloaded from http://code.google.com/p/hadoop-eclipse-plugin/

2. Open the Map/Reduce perspective: Window -> Open Perspective -> Other... -> Map/Reduce

3. Add a DFS Location: in the Map/Reduce Locations view, click New Hadoop Location and fill in the host and port:

Map/Reduce Master:
  Host: 10.10.xx.xx
  Port: 9001
DFS Master:
  Host: 10.10.xx.xx (just check "Use M/R Master host")
  Port: 9000
User name: root

Then change hadoop.job.ugi under Advanced parameters from the default DrWho,Tardis to root,Tardis; if the option is not visible, restart Eclipse with eclipse -clean.
Otherwise you may get org.apache.hadoop.security.AccessControlException.
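For reference, the same settings expressed in client code would look roughly like this (a sketch using the 0.20-era property names; the host is the placeholder from above):

    Configuration conf = new Configuration();
    conf.set("fs.default.name", "hdfs://10.10.xx.xx:9000"); // DFS Master
    conf.set("mapred.job.tracker", "10.10.xx.xx:9001");     // Map/Reduce Master
    conf.set("hadoop.job.ugi", "root,Tardis");              // user,group -- the Advanced parameter above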

4. Set up the hosts file on the local machine:

10.10.xx.xx zw-hadoop-master. zw-hadoop-master

# Note the trailing dot in the first zw-hadoop-master. entry; without it, running a Map/Reduce job fails with:
java.lang.IllegalArgumentException: Wrong FS: hdfs://zw-hadoop-master:9000/user/root/oplog/out/_temporary/_attempt_201008051742_0135_m_000007_0, expected: hdfs://zw-hadoop-master.:9000
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:352)
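When debugging this, one quick check (not from the original post) is to print the URI the client actually resolves for the default filesystem and compare it with the "expected" value in the exception:

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    System.out.println(fs.getUri()); // should print hdfs://zw-hadoop-master.:9000 to match the hosts entry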

5. Create a new Map/Reduce Project and add Mapper, Reducer, and Driver classes. Note that the auto-generated code is based on the old Hadoop API, so fix it up yourself:

package com.sohu.hadoop.test;

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MapperTest extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);

    public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        // Input lines are '|'-delimited; the third field is the user id.
        String userid = value.toString().split("[|]")[2];
        context.write(new Text(userid), one);
    }
}


package com.sohu.hadoop.test;

import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class ReducerTest extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Sum the occurrence counts for each user id.
        int sum = 0;
        for (IntWritable val : values) {
            sum += val.get();
        }
        result.set(sum);
        context.write(key, result);
    }
}


package com.sohu.hadoop.test;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class DriverTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String[] otherArgs = new GenericOptionsParser(conf, args)
                .getRemainingArgs();
        if (otherArgs.length != 2) {
            System.err.println("Usage: DriverTest <in> <out>");
            System.exit(2);
        }

        // Gzip-compress the job output. These must be set before the Job is
        // constructed: Job copies the Configuration, so later changes to conf
        // would be ignored.
        conf.setBoolean("mapred.output.compress", true);
        conf.setClass("mapred.output.compression.codec", GzipCodec.class,
                CompressionCodec.class);

        Job job = new Job(conf, "Driver Test");
        job.setJarByClass(DriverTest.class);
        job.setMapperClass(MapperTest.class);
        job.setCombinerClass(ReducerTest.class);
        job.setReducerClass(ReducerTest.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
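A note on the design: ReducerTest is safe to reuse as the combiner because the per-key sum is associative and commutative, so partial sums computed on the map side yield the same final counts.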

6. Right-click DriverTest, choose Run As -> Run on Hadoop, and select the corresponding Hadoop Location.
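Alternatively, once the classes are packaged into a jar, the same job can be launched from the shell (the jar name and HDFS paths here are hypothetical):

    hadoop jar drivertest.jar com.sohu.hadoop.test.DriverTest /user/root/oplog /user/root/oplog/out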

  • Original article: https://www.cnblogs.com/wshsdlau/p/3529004.html