  • Installing Hive on Ubuntu

    1. Install MySQL; see the link below for reference:

    http://www.cnblogs.com/liuchangchun/p/4099003.html

    2. Before installing Hive, first create a database named hive in MySQL and a table named user inside it:

    create database hive;
    use hive;
    create table user(Host char(20),User char(10),Password char(20));

    3. Log in to MySQL and grant privileges (change the passwords to your own):

    mysql -u root -p
    insert into user(Host,User,Password) values("localhost","hive",password("123"));
    FLUSH PRIVILEGES;
    
    
    GRANT ALL PRIVILEGES ON *.*  TO 'hive'@'localhost' IDENTIFIED BY 'hive';
    FLUSH PRIVILEGES;
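
    To verify the grant, you can try logging in to MySQL as the hive user (it should prompt for the password set above):

    mysql -u hive -p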

    4. After extracting the Hive tarball, configure the environment variables:

    sudo gedit /etc/profile
    #hive
    export HIVE_HOME=/home/sendi/apache-hive-1.1.1-bin
    export PATH=$PATH:$HIVE_HOME/bin
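
    To apply the new variables in the current shell without logging out and back in, you can source the profile and check that HIVE_HOME is set:

    source /etc/profile
    echo $HIVE_HOME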

    5. Create the config files from the templates under hive/conf:

    cp hive-env.sh.template hive-env.sh
    cp hive-default.xml.template hive-site.xml

    6. Edit hive-env.sh and point HADOOP_HOME at your Hadoop installation:

    HADOOP_HOME=/home/sendi/hadoop-2.6.0

    7. Edit hive-site.xml to set the MySQL JDBC driver, database name, username, and password:

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
      <description>JDBC connect string for a JDBC metastore</description>
    </property>
    
    
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
      <description>Driver class name for a JDBC metastore</description>
    </property>
    
    
    <property>
      <name>javax.jdo.option.ConnectionUserName</name>
      <value>hive</value>
      <description>username to use against metastore database</description>
    </property>
    
    
    <property>
      <name>javax.jdo.option.ConnectionPassword</name>
      <value>hive</value>
      <description>password to use against metastore database</description>
    </property>
    
    <property>
      <name>hive.metastore.local</name>
      <value>true</value>
      <description></description>
    </property>

    8. Edit hive-config.sh under hive/bin and set JAVA_HOME and HADOOP_HOME:

    export JAVA_HOME=/usr/lib/jdk/jdk1.7.0_67
    export HADOOP_HOME=/home/sendi/hadoop-2.6.0
    export HIVE_HOME=/home/sendi/apache-hive-1.1.1-bin

    9. Download mysql-connector-java-5.1.27-bin.jar and place it in the $HIVE_HOME/lib directory.
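
    A minimal sketch of this step, assuming the jar has already been downloaded to the current directory (the exact connector version may differ):

    cp mysql-connector-java-5.1.27-bin.jar $HIVE_HOME/lib/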

    10. Create /tmp and /user/hive/warehouse in HDFS and set group write permission on them:

    hadoop fs -mkdir /tmp
    hadoop fs -mkdir /user/hive/warehouse
    hadoop fs -chmod g+w /tmp
    hadoop fs -chmod g+w /user/hive/warehouse
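
    Note that on Hadoop 2.x, hadoop fs -mkdir does not create parent directories by default, so the second command can fail if /user/hive does not exist yet. A sketch using the -p flag instead:

    hadoop fs -mkdir -p /tmp
    hadoop fs -mkdir -p /user/hive/warehouse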

    11. Start Hive by running the hive command.

    12. On startup, you may run into the following problem:

    Logging initialized using configuration in jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-common-1.1.0.jar!/hive-log4j.properties
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/hive/apache-hive-1.1.0-bin/lib/hive-jdbc-1.1.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    [ERROR] Terminal initialization failed; falling back to unsupported
    java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
            at jline.TerminalFactory.create(TerminalFactory.java:101)
            at jline.TerminalFactory.get(TerminalFactory.java:158)
            at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
            at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
            at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
            at org.apache.hadoop.hive.cli.CliDriver.getConsoleReader(CliDriver.java:773)
            at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:715)
            at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
            at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:606)
            at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

    13. The cause is an older version of jline under the Hadoop directory. To fix it (a sketch follows the two steps below):

    1. Go into Hive's lib directory and copy the newer jline jar into the following Hadoop directory:

    /home/sendi/hadoop-2.6.0/share/hadoop/yarn/lib

    2. Delete Hadoop's old jline jar.
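
    A minimal sketch of the fix, assuming the paths used above; the exact jline jar names depend on your Hive and Hadoop versions (typically jline-2.12.jar in Hive 1.x and jline-0.9.94.jar in Hadoop 2.6), so check the actual file names first:

    cd /home/sendi/apache-hive-1.1.1-bin/lib
    cp jline-2.12.jar /home/sendi/hadoop-2.6.0/share/hadoop/yarn/lib/
    rm /home/sendi/hadoop-2.6.0/share/hadoop/yarn/lib/jline-0.9.94.jar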

    14. If you then run into the following problem:

    jiahong@jiahongPC:~/apache/apache-hive-1.1.1-bin$ hive
    15/09/05 21:29:28 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
    
    Logging initialized using configuration in jar:file:/home/jiahong/apache/apache-hive-1.1.1-bin/lib/hive-common-1.1.1.jar!/hive-log4j.properties
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/jiahong/apache/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jiahong/apache/apache-hive-1.1.1-bin/lib/hive-jdbc-1.1.1-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:472)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
        at org.apache.hadoop.fs.Path.initialize(Path.java:206)
        at org.apache.hadoop.fs.Path.<init>(Path.java:172)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:515)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:458)
        ... 8 more
    Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir%7D/$%7Bsystem:user.name%7D
        at java.net.URI.checkPath(URI.java:1804)
        at java.net.URI.<init>(URI.java:752)
        at org.apache.hadoop.fs.Path.initialize(Path.java:203)
        ... 11 more
    jiahong@jiahongPC:~/apache/apache-hive-1.1.1-bin$ hadoop dfs - ls /
    DEPRECATED: Use of this script to execute hdfs command is deprecated.
    Instead use the hdfs command for it.

    15. Edit hive-site.xml as follows; before modifying it, create the corresponding directories (see the sketch after the XML):

    <property>
        <name>hive.exec.scratchdir</name>
        <value>/tmp/hive</value>
        <description>HDFS root scratch dir for Hive jobs which gets created with write all (733) permission. For each connecting user, an HDFS scratch dir: ${hive.exec.scratchdir}/&lt;username&gt; is created, with ${hive.scratch.dir.permission}.</description>
    </property>
    <property>
        <name>hive.exec.local.scratchdir</name>
        <value>/tmp/hive/local</value>
        <description>Local scratch space for Hive jobs</description>
    </property>
    <property>
        <name>hive.downloaded.resources.dir</name>
        <value>/tmp/hive/resources</value>
        <description>Temporary local directory for added resources in the remote file system.</description>
    </property> 
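
    A minimal sketch for creating these directories beforehand, assuming the values above; hive.exec.scratchdir lives on HDFS, while the local scratch and resources directories are on the local filesystem:

    hadoop fs -mkdir -p /tmp/hive
    hadoop fs -chmod 733 /tmp/hive
    mkdir -p /tmp/hive/local /tmp/hive/resources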

    16. Start Hadoop first, then start Hive (a sketch of the startup commands follows); the Hive prompt should come up:
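
    A minimal sketch of the startup sequence, assuming Hadoop's sbin scripts are on the PATH:

    start-dfs.sh
    start-yarn.sh
    hive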

    sendi@sendijia:~/hadoop-2.6.0$ hive
    
    Logging initialized using configuration in jar:file:/home/sendi/apache-hive-1.1.1-bin/lib/hive-common-1.1.1.jar!/hive-log4j.properties
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/sendi/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/sendi/apache-hive-1.1.1-bin/lib/hive-jdbc-1.1.1-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    hive> 
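
    To confirm the metastore connection is working, you can run a simple statement at the prompt, for example:

    hive> show databases;
    hive> show tables;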