The Hadoop filesystem is an abstraction; HDFS is only one of its concrete implementations.
As far as HDFS is concerned, there are two ways to access the filesystem: (1) the command-line interface that ships with HDFS, which works much like shell commands under Linux; (2) the HDFS Java API, used by writing Java programs.
Environment: hadoop-1.0.4, Java 1.7.0_65, Ubuntu 14.04.1 LTS
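For method (1), the commands mirror the familiar Linux shell commands. A minimal sketch (the file and directory names below are only illustrative, not from a real session):

stu@master:~$ hadoop fs -ls /                      # list the root of the default filesystem
stu@master:~$ hadoop fs -put conf.sh /user/stu/    # copy a local file into HDFS
stu@master:~$ hadoop fs -cat /user/stu/conf.sh     # print an HDFS file to stdout

Method (2) is the program below, which reads a file from a Hadoop filesystem and copies it to standard output.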

import java.io.InputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;



// Reads a file from a Hadoop filesystem (e.g. HDFS) and copies it to standard output.
public class FileSystemCat {

    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);  // picks the FileSystem implementation by URI scheme
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));                        // open the file for reading
            IOUtils.copyBytes(in, System.out, 4096, false);     // 4 KB buffer; false = don't close the streams here
        } finally {
            IOUtils.closeStream(in);
        }
    }

}
So frustrating, this program still hasn't run successfully.
At first I couldn't figure out which classes needed to be imported. To find out which package each class in the code comes from, check the API docs: http://hadoop.apache.org/docs/current/api/index.html
Now it compiles with javac, but running it with the hadoop command still fails, with this error:
hadoop FileSystemCat hdfs://conf.sh
Error: Could not find or load main class FileSystemCat
So frustrating!!!
-----------------------------------
I'm sure this is about how Java programs get run, and about the classpath... I need to figure it out! 21:28:54 2014-10-23
------------------------------
Problem solved: hadoop-env.sh has a CLASSPATH setting (HADOOP_CLASSPATH), and its value has to point to the location of the .class file produced by javac. 2014-10-23 23:59:53
Discovered today that HBase can actually be started directly without starting Hadoop first. 2014-10-28 11:12:29
Compiling with javac FileSystemCat.java produces a pile of errors:
stu@master:~$ javac FileSystemCat.java
FileSystemCat.java:4: error: package org.apache.hadoop.conf does not exist
import org.apache.hadoop.conf.Configuration;
                              ^
FileSystemCat.java:5: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FSDataInputStream;
                            ^
FileSystemCat.java:6: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.FileSystem;
                            ^
FileSystemCat.java:7: error: package org.apache.hadoop.fs does not exist
import org.apache.hadoop.fs.Path;
                            ^
FileSystemCat.java:8: error: package org.apache.hadoop.io does not exist
import org.apache.hadoop.io.IOUtils;
                            ^
FileSystemCat.java:17: error: cannot find symbol
        Configuration conf = new Configuration();
        ^
  symbol:   class Configuration
  location: class FileSystemCat
FileSystemCat.java:17: error: cannot find symbol
        Configuration conf = new Configuration();
                                 ^
  symbol:   class Configuration
  location: class FileSystemCat
FileSystemCat.java:18: error: cannot find symbol
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        ^
  symbol:   class FileSystem
  location: class FileSystemCat
FileSystemCat.java:18: error: cannot find symbol
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
                        ^
  symbol:   variable FileSystem
  location: class FileSystemCat
FileSystemCat.java:21: error: cannot find symbol
            in = fs.open(new Path(uri));
                             ^
  symbol:   class Path
  location: class FileSystemCat
FileSystemCat.java:22: error: cannot find symbol
            IOUtils.copyBytes(in, System.out, 4096, false);
            ^
  symbol:   variable IOUtils
  location: class FileSystemCat
FileSystemCat.java:24: error: cannot find symbol
            IOUtils.closeStream(in);
            ^
  symbol:   variable IOUtils
  location: class FileSystemCat
12 errors
stu@master:~$
The fix is to pass the corresponding Hadoop jar as the classpath when compiling, i.e. the following works:
stu@master:~$ javac -classpath /home/stu/hadoop-1.0.4/hadoop-core-1.0.4.jar FileSystemCat.java
Then just copy the generated FileSystemCat.class into the directory configured in hadoop-env.sh:
# Extra Java CLASSPATH elements. Optional.
export HADOOP_CLASSPATH=/home/stu/myclass
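Putting the pieces together, the whole workflow looks roughly like this (the HDFS URI in the last command is only an example that assumes a namenode at master:9000; use a real path from your own cluster):

stu@master:~$ javac -classpath /home/stu/hadoop-1.0.4/hadoop-core-1.0.4.jar FileSystemCat.java
stu@master:~$ cp FileSystemCat.class /home/stu/myclass/       # the directory HADOOP_CLASSPATH points to
stu@master:~$ hadoop FileSystemCat hdfs://master:9000/user/stu/conf.sh

The hadoop launcher script appends HADOOP_CLASSPATH to the JVM classpath, which is why the class is found once it sits in /home/stu/myclass.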