Installing Hive on Ubuntu 16.04

     In the previous post we covered how to install Hadoop, but remember that our real goal is to install Hive. So in this post I will walk through how to install Hive.

    1. Environment Preparation

    (1) VMware

    (2) Ubuntu 16.04

    (3) Hadoop

    2. Installing Hive

     (1) Install mysql-server and mysql-client

      $ su hadoop

      $ sudo apt-get install mysql-server mysql-client
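
      During installation you may be prompted to set a root password for MySQL; remember it for the next step. As a quick sanity check (an optional step not in the original post; exact output depends on your machine), confirm the install succeeded:

      $ mysql --version

      $ sudo service mysql status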

     (2) Start the MySQL service

      $ sudo /etc/init.d/mysql start

      

      (3) Log in to MySQL

      $ mysql -u root -p

      Enter the MySQL root password you set during installation.

      Once inside the MySQL shell, run the following statements:

      create user 'hive'@'%' identified by 'hive';

      grant all privileges on *.* to 'hive'@'%' with grant option;

      flush privileges;

      create database if not exists hive_metadata;

      grant all privileges on hive_metadata.* to 'hive'@'%' identified by 'hive';

      grant all privileges on hive_metadata.* to 'hive'@'localhost' identified by 'hive';

      flush privileges;

      exit;

      $ sudo /etc/init.d/mysql restart

      $ mysql -u hive -p

      Enter the password: hive

      show databases;

      If hive_metadata does not exist, run: create database hive_metadata;
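
      To double-check that the grants took effect, you can also run the following inside the same MySQL session (an optional sanity check, not in the original post):

      show grants;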

      (4) Install Hive

      $ su hadoop

      $ cd /usr/local

      $ wget http://mirrors.hust.edu.cn/apache/hive/hive-2.3.3/apache-hive-2.3.3-bin.tar.gz

      Check that the file actually exists at that mirror; if it does not, search for a working download link yourself.

      $ tar zxvf apache-hive-2.3.3-bin.tar.gz

      $ sudo mkdir hive

      $ sudo mv apache-hive-2.3.3-bin hive/hive-2.3.3

      $ cd hive/hive-2.3.3

      $ cd conf

      $ cp hive-default.xml.template hive-site.xml

      $ sudo vim hive-site.xml

      

    <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
    </property>
    <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    </property>
    <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
    </property>

      $ cp hive-env.sh.template hive-env.sh

      $ sudo vim hive-env.sh

    export HADOOP_HOME=/usr/local/hadoop
    export HIVE_CONF_DIR=/usr/local/hive/hive-2.3.3/conf

      $ cd ../bin

      $ vim hive-config.sh

    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    export HIVE_HOME=/usr/local/hive/hive-2.3.3
    export HADOOP_HOME=/usr/local/hadoop

      $ sudo vim /etc/profile

     

    export HIVE_HOME=/usr/local/hive/hive-2.3.3
    export PATH=$PATH:$HIVE_HOME/bin

      $ source /etc/profile
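
      To verify the environment took effect, a quick optional check (hive --version does not touch the metastore, so it should work even before the MySQL driver is installed):

      $ echo $HIVE_HOME

      $ hive --version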

      $ cd /usr/local/hive/hive-2.3.3

      $ wget http://ftp.ntu.edu.tw/MySQL/Downloads/Connector-J/mysql-connector-java-5.1.45.tar.gz

      $ tar zxvf mysql-connector-java-5.1.45.tar.gz

      The tarball already ships the driver as a ready-made jar (mysql-connector-java-5.1.45-bin.jar inside the extracted directory), so there is no need to build one with jar; just copy it into Hive's lib directory:

      $ sudo cp mysql-connector-java-5.1.45/mysql-connector-java-5.1.45-bin.jar lib/
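
      Note: with Hive 2.x the metastore schema normally has to be initialized once before first use, or bin/hive may fail with a metastore schema error. The original post skips this step; a hedged sketch using Hive's bundled schematool (assuming the MySQL settings above):

      $ bin/schematool -dbType mysql -initSchema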

      (5) Test

      $ jps

      Check that Hadoop's NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager processes are all running; if any is missing, stop Hadoop and start it again. For how to stop and restart Hadoop, see the previous post on installing Hadoop.

      $ cd bin

      $ ./hive

      After this command you should land at the Hive prompt:

      hive>
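
      A quick smoke test at the prompt (illustrative statements; any simple HiveQL will do):

      hive> show databases;

      hive> create database test_db;

      hive> drop database test_db;

      hive> exit;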

    3. Troubleshooting

    (1) If running bin/hive fails with:

      

    which: no hbase in (/opt/service/jdk1.7.0_67/bin:/opt/service/jdk1.7.0_67/jre/bin:/opt/mysql-5.6.24/bin:/opt/service/jdk1.7.0_67/bin:/opt/service/jdk1.7.0_67/jre/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hadoop/bin)
    
    SLF4J: Class path contains multiple SLF4J bindings.
    
    SLF4J: Found binding in [jar:file:/opt/apache/hive-2.1.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    
    SLF4J: Found binding in [jar:file:/opt/apache/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

    Whenever you see a "Class path contains multiple XXX bindings" message like this, just remove (or rename) one of the two files listed after "Found binding"; see the example below.
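
    For example, taking Hive's binding out of the classpath (a hedged sketch; the exact jar version under your Hive lib directory may differ, so check with ls first):

    $ mv $HIVE_HOME/lib/log4j-slf4j-impl-*.jar $HIVE_HOME/lib/log4j-slf4j-impl.jar.bak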

    (2) If the error is:

    Call From wuyanjing-virtucal-machie/127.0.0.1 to localhost:9000 failed

    When this error appears, first run jps to check whether Hadoop is actually running. Restarting Hadoop usually fixes the problem; see the sketch below.
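
    For example, restarting the daemons from the Hadoop installation in the previous post (a hedged sketch, assuming HADOOP_HOME=/usr/local/hadoop):

    $ /usr/local/hadoop/sbin/stop-yarn.sh

    $ /usr/local/hadoop/sbin/stop-dfs.sh

    $ /usr/local/hadoop/sbin/start-dfs.sh

    $ /usr/local/hadoop/sbin/start-yarn.sh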

    (3) If the error is:

    Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}

    When this error appears, find every occurrence of ${system:java.io.tmpdir} in hive-site.xml and replace it with a concrete directory, for example with the one-liner below.
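
    For example, pointing the temp paths at /tmp/hive (a hedged sketch; /tmp/hive is an arbitrary choice, and the ${system:user.name} placeholder may need the same treatment):

    $ sudo sed -i 's|\${system:java.io.tmpdir}|/tmp/hive|g' $HIVE_HOME/conf/hive-site.xml

    $ mkdir -p /tmp/hive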
