  Apache Hive Data Processing Example

    Following the previous article, which showed how to process data on HDFS with Pig, this article introduces querying and processing data with Apache Hive.

    Apache Hive Overview

    • First and foremost, Hive is data warehouse software
    • It uses HiveQL to structure and query the stored data
    • Execution engines: MapReduce, Tez, Spark (see the sketch after this list)
    • Data storage: HDFS, HBase
    • Typical uses: data mining and analytics, machine learning, ad-hoc queries, and so on
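    A minimal sketch of switching execution engines inside a Hive session; whether Tez or Spark is actually usable depends on how the cluster is built and configured:

    -- default on many CDH builds: classic MapReduce
    SET hive.execution.engine=mr;
    -- switch to Spark or Tez when the corresponding engine is installed
    SET hive.execution.engine=spark;
    SET hive.execution.engine=tez;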

    Hive Usage Example

    • The passwd file is again used as the working file
    beeline> !quit
    [cloudera@quickstart ~]$ hdfs dfs -put /etc/passwd /tmp/
    [cloudera@quickstart ~]$ hdfs dfs -ls /tmp/
    Found 5 items
    drwxrwxrwt   - mapred   mapred              0 2016-12-29 01:05 /tmp/hadoop-yarn
    drwx-wx-wx   - hive     supergroup          0 2016-08-27 10:19 /tmp/hive
    drwxrwxrwt   - mapred   hadoop              0 2016-08-10 14:37 /tmp/logs
    -rw-r--r--   1 cloudera supergroup       2559 2017-02-22 05:34 /tmp/passwd
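
    Before loading the file into Hive, a quick look at the first few lines confirms the colon-delimited layout that the table definition below relies on; the exact accounts will of course vary from system to system:

    [cloudera@quickstart ~]$ hdfs dfs -cat /tmp/passwd | head -n 3
    root:x:0:0:root:/root:/bin/bash
    bin:x:1:1:bin:/bin:/sbin/nologin
    daemon:x:2:2:daemon:/sbin:/sbin/nologin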
    
    • Connect to Hive with beeline
    [cloudera@quickstart ~]$ beeline -u jdbc:hive2://
    scan complete in 24ms
    Connecting to jdbc:hive2://
    Connected to: Apache Hive (version 1.1.0-cdh5.8.0)
    Driver: Hive JDBC (version 1.1.0-cdh5.8.0)
    Transaction isolation: TRANSACTION_REPEATABLE_READ
    Beeline version 1.1.0-cdh5.8.0 by Apache Hive
    0: jdbc:hive2://> 
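
    The bare jdbc:hive2:// URL above starts an embedded Hive session inside the beeline process itself. To connect through a running HiveServer2 instead (assuming it listens on the default port 10000 on this host), the connection would look roughly like:

    [cloudera@quickstart ~]$ beeline -u jdbc:hive2://localhost:10000 -n cloudera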
    
    • Create a table and load data
    
    0: jdbc:hive2://> CREATE TABLE userinfo ( uname STRING, pswd STRING, uid INT, gid INT, fullname STRING, hdir STRING, shell STRING ) ROW FORMAT DELIMITED FIELDS TERMINATED BY ':' STORED AS TEXTFILE;
    0: jdbc:hive2://> LOAD DATA INPATH '/tmp/passwd' OVERWRITE INTO TABLE userinfo;
    0: jdbc:hive2://> select uname,fullname,hdir from userinfo order by uname;
    MapReduce Jobs Launched: 
    Stage-Stage-1: Map: 1  Reduce: 1   Cumulative CPU: 27.83 sec   HDFS Read: 8767 HDFS Write: 1454 SUCCESS
    Total MapReduce CPU Time Spent: 27 seconds 830 msec
    OK
    +----------------+-------------------------------+-------------------------------+--+
    |     uname      |           fullname            |             hdir              |
    +----------------+-------------------------------+-------------------------------+--+
    | abrt           |                               | /etc/abrt                     |
    | adm            | adm                           | /var/adm                      |
    | apache         | Apache                        | /var/www                      |
    | avahi-autoipd  | Avahi IPv4LL Stack            | /var/lib/avahi-autoipd        |
    | bin            | bin                           | /bin                          |
    | cloudera       |                               | /home/cloudera                |
    | cloudera-scm   | Cloudera Manager              | /var/lib/cloudera-scm-server  |
    ...
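
    With the table in place, ordinary aggregate queries work the same way. As an illustrative follow-up (not part of the original session), the accounts can be counted per login shell:

    0: jdbc:hive2://> SELECT shell, COUNT(*) AS cnt FROM userinfo GROUP BY shell ORDER BY cnt DESC;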
    

    Summary

    • beeline provides interactive access to Hive, much as sqlplus does for an Oracle database
    • Other interactive tools include Hive CLI, HCatalog, and WebHCat
    • The corresponding DDL and DML syntax is documented on the official wiki; a few common statements are sketched below
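
    As a pointer, a few common statements for inspecting and cleaning up the example table (illustrative only; see the wiki for the full syntax):

    SHOW TABLES;
    DESCRIBE FORMATTED userinfo;
    DROP TABLE IF EXISTS userinfo;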
  Original article: https://www.cnblogs.com/shenfeng/p/apache_hive_example.html