
    Hive: Environment Setup

    The Hadoop and MySQL environments are assumed to be set up already; my blog has posts covering both installations.

    I. Download Hive and extract it to the target directory (this post uses hive-1.1.0-cdh5.7.0; download from http://archive.cloudera.com/cdh5/cdh/5/)

    tar zxvf ./hive-1.1.0-cdh5.7.0.tar.gz -C ~/app/

    II. Configure Hive. Reference: the official guide: https://cwiki.apache.org/confluence/display/Hive/GettingStarted#GettingStarted-InstallationandConfiguration

    1. Configure environment variables

    1)vi .bash_profile

        export HIVE_HOME=/home/hadoop/app/hive-1.1.0-cdh5.7.0
        export PATH=$HIVE_HOME/bin:$PATH

    2)source .bash_profile

    source .bash_profile
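    After sourcing the profile, the hive launcher should resolve from $HIVE_HOME/bin. A quick sanity check, using the paths from this post (adjust to your own install location):

```shell
# Paths from this post; change them if your install location differs.
export HIVE_HOME=/home/hadoop/app/hive-1.1.0-cdh5.7.0
export PATH=$HIVE_HOME/bin:$PATH

# PATH should now begin with $HIVE_HOME/bin; on a machine with Hive
# installed, `command -v hive` would then resolve to $HIVE_HOME/bin/hive.
case "$PATH" in
  "$HIVE_HOME/bin":*) echo "hive bin is first on PATH" ;;
  *)                  echo "PATH not updated" ;;
esac
```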

    2. hive-1.1.0-cdh5.7.0/conf/hive-env.sh

    1)cp hive-env.sh.template hive-env.sh

    cp hive-env.sh.template hive-env.sh

     2) vi hive-env.sh and add HADOOP_HOME:

        HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0

    3. hive-1.1.0-cdh5.7.0/conf/hive-site.xml (create this file yourself)

    (The MySQL JDBC driver JAR must be copied into hive-1.1.0-cdh5.7.0/lib by hand.)

     <configuration> 
            <!-- JDBC connection string -->
            <property>
                <name>javax.jdo.option.ConnectionURL</name>
                <!-- Metastore database name: rdb_hive -->
                <!-- createDatabaseIfNotExist=true: create the database automatically if it does not exist -->
                <value>jdbc:mysql://localhost:3306/rdb_hive?createDatabaseIfNotExist=true</value>
            </property>
            <!-- MySQL driver class -->
            <property>
                <name>javax.jdo.option.ConnectionDriverName</name>
                <value>com.mysql.jdbc.Driver</value>
            </property>
            <!-- Username -->
            <property>
                <name>javax.jdo.option.ConnectionUserName</name>
                <value>root</value>
            </property>
            <!-- Password -->
            <property>
                <name>javax.jdo.option.ConnectionPassword</name>
                <value>root</value>
            </property>
        </configuration>
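    As noted above, Hive does not ship the MySQL driver, so it has to be copied into Hive's lib directory yourself. A sketch of that step; the download location and JAR version here are assumptions, use whichever connector JAR you actually downloaded:

```shell
# Hypothetical download location and driver version; adjust to your files.
JAR=~/software/mysql-connector-java-5.1.34-bin.jar
HIVE_LIB=~/app/hive-1.1.0-cdh5.7.0/lib

# Copy the driver into Hive's lib directory (no-op if the JAR is absent).
if [ -f "$JAR" ]; then
  cp "$JAR" "$HIVE_LIB"/
fi
```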

    III. Start Hive

    hive-1.1.0-cdh5.7.0/bin/hive

    Startup log:

    [hadoop@hadoop01 bin]$ ./hive
    which: no hbase in (/home/hadoop/app/hive-1.1.0-cdh5.7.0/bin:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
    /bin:/home/hadoop/app/jdk1.8.0_131/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/home/hadoop/
    .local/bin:/home/hadoop/bin)
    Logging initialized using configuration in jar:file:/home/hadoop/app/hive-1.1.0-cdh5.7.0/lib/
    hive-common-1.1.0-cdh5.7.0.jar!/hive-log4j.properties
    WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
    hive>

    On first startup, Hive automatically creates the metastore database and its tables in MySQL:

    mysql> show tables;
    +--------------------+
    | Tables_in_rdb_hive |
    +--------------------+
    | CDS                |
    | DATABASE_PARAMS    |
    | DBS                |
    | FUNCS              |
    | FUNC_RU            |
    | GLOBAL_PRIVS       |
    | PARTITIONS         |
    | PART_COL_STATS     |
    | ROLES              |
    | SDS                |
    | SEQUENCE_TABLE     |
    | SERDES             |
    | SKEWED_STRING_LIST |
    | TAB_COL_STATS      |
    | TBLS               |
    | VERSION            |
    +--------------------+

    IV. Getting started with Hive

    Use Hive to implement a word count.

    1. Create the table: create table hive_wordcount(context string);

    hive> create table hive_wordcount(context string);
    OK
    Time taken: 1.203 seconds
    hive> show tables;
    OK
    hive_wordcount
    Time taken: 0.19 seconds, Fetched: 1 row(s)

    2. Load the data: load data local inpath '/home/hadoop/data/hello.txt' into table hive_wordcount;

    hive> load data local inpath '/home/hadoop/data/hello.txt' into table hive_wordcount;
    Loading data to table default.hive_wordcount
    Table default.hive_wordcount stats: [numFiles=1, totalSize=44]
    OK
    Time taken: 2.294 seconds
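    `load data local inpath` copies the local file into the table's directory under Hive's warehouse root on HDFS. Assuming hive.metastore.warehouse.dir was left at its default of /user/hive/warehouse, you could confirm on the cluster that the file landed (the check below is guarded so it is a no-op on machines without the hdfs client):

```shell
# Default warehouse location for a table in the `default` database
# (assumption: hive.metastore.warehouse.dir is unchanged).
TABLE_DIR=/user/hive/warehouse/hive_wordcount

# On the cluster, list the table directory to see the loaded file.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -ls "$TABLE_DIR"
fi
```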

    3. Query the table to verify the load succeeded: select * from hive_wordcount;

    Contents of hello.txt:

    Deer Bear River
    Car Car River
    Deer Car Bear
    hive> select * from hive_wordcount;
    OK
    Deer Bear River
    Car Car River
    Deer Car Bear
    Time taken: 0.588 seconds, Fetched: 3 row(s)

    4. Implement the word count in SQL: select word,count(1) from hive_wordcount lateral view explode(split(context,' ')) wc as word group by word;

    hive> select word,count(1) from hive_wordcount lateral view explode(split(context,' ')) wc as 
    word group by word;
    Query ID = hadoop_20180904070404_b23d8c2e-161b-4e65-a2cc-206ce343d9e8
    Total jobs = 1
    Launching Job 1 out of 1
    Number of reduce tasks not specified. Estimated from input data size: 1
    In order to change the average load for a reducer (in bytes):
      set hive.exec.reducers.bytes.per.reducer=<number>
    In order to limit the maximum number of reducers:
      set hive.exec.reducers.max=<number>
    In order to set a constant number of reducers:
      set mapreduce.job.reduces=<number>
    Starting Job = job_1536010835653_0002, 
    Kill Command = /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin/hadoop job  -kill job_1536010835653_0002
    Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
    2018-09-04 07:05:49,279 Stage-1 map = 0%,  reduce = 0%
    2018-09-04 07:06:01,893 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 1.95 sec
    2018-09-04 07:06:10,804 Stage-1 map = 100%,  reduce = 100%, Cumulative CPU 3.44 sec
    MapReduce Total cumulative CPU time: 3 seconds 440 msec
    Ended Job = job_1536010835653_0002
    MapReduce Jobs Launched:
    Stage-Stage-1: Map: 1  Reduce: 1   Cumulative CPU: 3.44 sec   HDFS Read: 8797 HDFS 
    Write: 28 SUCCESS
    Total MapReduce CPU Time Spent: 3 seconds 440 msec
    OK
    Bear    2
    Car     3
    Deer    2
    River   2
    Time taken: 37.441 seconds, Fetched: 4 row(s)

    The result:

    Bear    2
    Car     3
    Deer    2
    River   2
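    What the query does: split(context,' ') turns each line into an array of words, explode emits one row per word, lateral view joins those rows back to the source rows, and group by counts each word. The same computation over the sample data, as a plain shell pipeline:

```shell
# One word per line, sort so duplicates are adjacent, count them,
# then print "word count" to mirror the Hive output.
printf 'Deer Bear River\nCar Car River\nDeer Car Bear\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | awk '{print $2, $1}'
# prints: Bear 2 / Car 3 / Deer 2 / River 2
```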

    Note: I ran into an error while creating the table:

    Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:
    For direct MetaStore DB connections, we don't support retries at the client level.)

    The message points to a problem connecting to MySQL. Online searches turn up two common fixes:

    1. Swap in a different MySQL JDBC driver JAR, e.g. mysql-connector-java-5.1.34-bin.jar. I tried this, and it did not solve the problem for me.

    2. Change the character set of the corresponding MetaStore database in MySQL to latin1. Tested; this is what fixed it for me.
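    The character-set change can be applied from the mysql client. This assumes the metastore database name rdb_hive from the config above; only tables created after the change pick up the new default:

```sql
-- Set the metastore database's default character set to latin1.
ALTER DATABASE rdb_hive DEFAULT CHARACTER SET latin1;
```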

  • Original post: https://www.cnblogs.com/jnba/p/10670920.html