Reposted from http://blog.csdn.net/yfkiss/article/details/7715476 and http://blog.csdn.net/yfkiss/article/details/7721329
Download Hadoop
Hadoop download address:
http://www.apache.org/dyn/closer.cgi/hadoop/core/
The version used here is 1.0.3.
$ mkdir hadoop
$ wget http://www.fayea.com/apache-mirror/hadoop/core/stable/hadoop-1.0.3.tar.gz
Install Java
First check whether Java is already installed with "java -version"; output such as java version "1.7.0_147-icedtea" means it is.
If not, see: http://blog.csdn.net/yang_hui1986527/article/details/6677450
JAVA_HOME and CLASSPATH must be set for the installation to work.
My configuration:
export PATH=${PATH}:/usr/lib/jvm/java-6-openjdk-amd64/bin
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64/
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JRE_HOME}/lib:${JAVA_HOME}/lib/dt.jar:${JAVA_HOME}/lib/tools.jar
Put these four lines in ~/.bashrc.
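After editing ~/.bashrc, re-source it and confirm the settings took effect:
$ source ~/.bashrc
$ echo $JAVA_HOME
$ java -version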
Install ssh
$ sudo apt-get install ssh
After installation, confirm that you can ssh to localhost without a password:
$ ssh localhost
If a password is required, set up key-based login:
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
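To verify, the following should print OK without prompting for a password:
$ ssh localhost 'echo OK'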
Install rsync
rsync is a utility for remote file synchronization on Linux.
$ sudo apt-get install rsync
Configure and start Hadoop
Unpack:
$ tar -zxvf hadoop-1.0.3.tar.gz
Set JAVA_HOME
Edit conf/hadoop-env.sh and find:
# export JAVA_HOME=/usr/lib/j2sdk1.5-sun
and change it to:
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64/
(If you are not sure where Java is installed, locate it with whereis java.)
Next, modify the configuration files.
Edit conf/core-site.xml:
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
Edit conf/hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/work/hadoop_tmp</value>
  </property>
</configuration>
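hadoop.tmp.dir is where HDFS keeps its local data, so the directory must exist and be writable by the user running Hadoop; assuming you can write under /home/work, create it before formatting:
$ mkdir -p /home/work/hadoop_tmp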
Edit conf/mapred-site.xml:
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
Initialize the Hadoop NameNode:
$ bin/hadoop namenode -format
Start the daemons:
$ bin/start-all.sh
Confirm they are running:
$ jps
5146 Jps
4538 TaskTracker
4312 JobTracker
4015 DataNode
4228 SecondaryNameNode
3789 NameNode
Output like the above means startup succeeded.
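You can also check the web interfaces that Hadoop 1.x exposes by default: the NameNode at http://localhost:50070/ and the JobTracker at http://localhost:50030/, for example:
$ curl -s http://localhost:50070/ | head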
Add the following to ~/.bashrc:
alias hadoop='/home/zxm/hadoop/hadoop-1.0.3/bin/hadoop'
alias hls='hadoop fs -ls'
alias hlsr='hadoop fs -lsr'
alias hcp='hadoop fs -cp'
alias hmv='hadoop fs -mv'
alias hget='hadoop fs -get'
alias hput='hadoop fs -put'
alias hrm='hadoop fs -rm'
alias hmkdir='hadoop fs -mkdir'
alias hcat='hadoop fs -cat'
alias hrmr='hadoop fs -rmr'
alias hstat='hadoop fs -stat'
alias htest='hadoop fs -test'
alias htext='hadoop fs -text'
alias htouchz='hadoop fs -touchz'
alias hdu='hadoop fs -du'
alias hdus='hadoop fs -dus'
alias hchmod='hadoop fs -chmod'
alias hchgrp='hadoop fs -chgrp'
alias hchown='hadoop fs -chown'
alias htail='hadoop fs -tail'
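After re-sourcing ~/.bashrc, a quick sanity check of the aliases (demo.txt is just a scratch file for illustration):
$ source ~/.bashrc
$ echo hello > demo.txt
$ hput demo.txt /tmp/
$ hls /tmp
$ hcat /tmp/demo.txt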
Common problems and fixes:
Problem 1: running a hadoop command prints the warning "Warning: $HADOOP_HOME is deprecated."
Fix: add export HADOOP_HOME_WARN_SUPPRESS=TRUE to hadoop-env.sh.
Problem 2: the NameNode fails to start.
Fix: delete /tmp/hadoop* and run bin/hadoop namenode -format again.
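Step by step, one way to do such a clean reset (note this destroys any data already stored in HDFS):
$ bin/stop-all.sh
$ rm -rf /tmp/hadoop*
$ bin/hadoop namenode -format
$ bin/start-all.sh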
Once the single-node Hadoop environment is up, you can set up Hive.
Create directories on HDFS:
$ hadoop fs -mkdir /tmp
$ hadoop fs -mkdir /user/hive/warehouse
Grant permissions:
$ hadoop fs -chmod g+w /tmp
$ hadoop fs -chmod g+w /user/hive/warehouse
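To confirm the directories and their group-write bit:
$ hadoop fs -ls /
$ hadoop fs -ls /user/hive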
Download and unpack Hive:
$ wget http://labs.mop.com/apache-mirror/hive/stable/hive-0.8.1.tar.gz
$ tar -zxvf hive-0.8.1.tar.gz
Set HADOOP_HOME and HIVE_HOME, and add them to ~/.bashrc:
export HADOOP_HOME=/home/zxm/hadoop/hadoop-1.0.3
export HIVE_HOME=/home/work/hadoop/hive-0.8.1
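Optionally (the original post does not do this), put Hive's bin directory on the PATH as well, so hive can be run from anywhere:
export PATH=${PATH}:${HIVE_HOME}/bin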
Multi-user support
(Make sure MySQL is already installed.)
Log in to MySQL:
$ mysql -u root -p
mysql> grant all on hive.* to hive@localhost identified by '123456';
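A quick check that the new account works (a plaintext password on the command line is acceptable on a local test box):
$ mysql -u hive -p123456 -e 'show databases;'
Then point Hive's metastore at MySQL by adding the following properties to conf/hive-site.xml: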
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>123456</value>
  <description>password to use against metastore database</description>
</property>
Download the MySQL JDBC driver:
$ wget http://downloads.mysql.com/archives/mysql-connector-java-5.0/mysql-connector-java-5.0.8.tar.gz
Unpack:
$ tar -zxvf mysql-connector-java-5.0.8.tar.gz
Copy mysql-connector-java-5.0.8-bin.jar into Hive's lib directory:
$ cp mysql-connector-java-5.0.8/mysql-connector-java-5.0.8-bin.jar $HIVE_HOME/lib/
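Before starting an interactive session, you can smoke-test the metastore connection non-interactively (the hive database is created in MySQL on first use, thanks to createDatabaseIfNotExist=true):
$ cd $HIVE_HOME
$ ./bin/hive -e 'SHOW TABLES;'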
Start Hive:
$ cd /home/zxm/hadoop/hive-0.8.1 ; ./bin/hive
Test:
$ ./bin/hive
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
Logging initialized using configuration in jar:file:/home/zxm/hadoop/hive-0.8.1/lib/hive-common-0.8.1.jar!/hive-log4j.properties
Hive history file=/tmp/work/hive_job_log_work_201207051945_218572007.txt
hive> SHOW TABLES;
OK
Time taken: 7.281 seconds
hive> CREATE TABLE pokes (foo INT, bar STRING);
OK
Time taken: 0.398 seconds
hive> SHOW TABLES;
OK
pokes
Time taken: 0.181 seconds
hive> DESCRIBE pokes;
OK
foo int
bar string
Time taken: 0.58 seconds
hive>
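From here you can load the sample data that ships with Hive and query it; a short continuation of the session above (kv1.txt is the sample file shipped under examples/files in the Hive distribution):
hive> LOAD DATA LOCAL INPATH './examples/files/kv1.txt' OVERWRITE INTO TABLE pokes;
hive> SELECT * FROM pokes LIMIT 5;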