    Hive Access Interfaces

    Hive provides three client access interfaces:

    1) Hive CLI (Hive Command Line): clients can work directly in command-line mode.

    2) hwi (Hive Web Interface): Hive provides a more intuitive web UI.

    3) hiveserver: Hive exposes a Thrift service; Thrift clients are currently available for C++/Java/PHP/Python/Ruby.

    Let's try each of these three access methods in turn.

    1. Hive CLI

    Simply type the hive command to enter CLI mode:

    [cloud@cloud01 lib]$ hive
    
    Hive history file=/tmp/cloud/hive_job_log_cloud_201110311056_1009535967.txt
    
    hive> show tables;
    
    OK
    
    testhivedrivertable
    
    Time taken: 3.038 seconds
    
    hive> select * from testhivedrivertable;
    
    OK
    
    Time taken: 0.905 seconds
    
    hive> quit;
    
    [cloud@cloud01 lib]$

    For more command-line options, see the Hive CLI page on the official wiki.

    2. Hive hwi

    Hive hwi provides a more intuitive web interface and is more convenient to use.

    1) Start hive hwi

    [cloud@cloud01 ~]$ hive --service hwi
    
    11/10/31 10:14:11 INFO hwi.HWIServer: HWI is starting up
    
    11/10/31 10:14:11 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
    
    11/10/31 10:14:11 INFO mortbay.log: jetty-6.1.14
    
    11/10/31 10:14:11 INFO mortbay.log: Extract jar:file:/data/cloud/hive-0.7.1/lib/hive-hwi-0.7.1.war!/ to /tmp/Jetty_0_0_0_0_9999_hive.hwi.0.7.1.war__hwi__.hf8ccz/webapp
    
    11/10/31 10:14:12 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999

    2) Access Hive through hwi

    My Hive is deployed on 10.46.169.101, and hwi listens on port 9999 by default. Entering http://10.46.169.101:9999/hwi/ in a browser brings up the interface, as shown in the screenshot.

    For more information on hwi, see the hwi page on the official wiki.

    3. hiveserver

    Hive can be exposed to clients as a Thrift service. Hive's Thrift bindings currently cover several languages (C++/Java/PHP/Python/Ruby) and can be found under the src/service/src directory of the Hive source distribution. Hive also ships JDBC and ODBC drivers, which greatly simplify building applications on top of Hive. I tested the JDBC driver with the official example.

    1) Start hiveserver

    [cloud@cloud01 ~]$ hive --service hiveserver
    
    Starting Hive Thrift Server

    2) Create a new Java project named Hive0.7.1Test in Eclipse.

    3) Add the jars under $HIVE_HOME/lib to the project's build path.

    4) Hive tables are stored on HDFS, so the Hadoop core jar is also required. My Hadoop version is 0.20.1.

    5) Create a new class using the code from the official wiki, as follows:

    import java.sql.SQLException;
    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.sql.DriverManager;

    public class HiveJdbcClient {

        private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

        /**
         * @param args
         * @throws SQLException
         */
        public static void main(String[] args) throws SQLException {
            try {
                Class.forName(driverName);
            } catch (ClassNotFoundException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
                System.exit(1);
            }
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://10.46.169.101:10000/default", "", "");
            Statement stmt = con.createStatement();
            String tableName = "testHiveDriverTable";
            stmt.executeQuery("drop table " + tableName);
            ResultSet res = stmt.executeQuery("create table " + tableName
                    + " (key int, value string)");
            // show tables
            String sql = "show tables '" + tableName + "'";
            System.out.println("Running: " + sql);
            res = stmt.executeQuery(sql);
            if (res.next()) {
                System.out.println(res.getString(1));
            }
            // describe table
            sql = "describe " + tableName;
            System.out.println("Running: " + sql);
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1) + "\t" + res.getString(2));
            }
            // load data into table
            // NOTE: filepath has to be local to the hive server
            // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
            String filepath = "/tmp/a.txt";
            sql = "load data local inpath '" + filepath + "' into table "
                    + tableName;
            System.out.println("Running: " + sql);
            res = stmt.executeQuery(sql);
            // select * query
            sql = "select * from " + tableName;
            System.out.println("Running: " + sql);
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(String.valueOf(res.getInt(1)) + "\t"
                        + res.getString(2));
            }
            // regular hive query
            sql = "select count(1) from " + tableName;
            System.out.println("Running: " + sql);
            res = stmt.executeQuery(sql);
            while (res.next()) {
                System.out.println(res.getString(1));
            }
        }
    }

    6) Compile and run. The console output is as follows:

    2011-10-31 11:21:31,703 WARN  [main] conf.Configuration(175): DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
    Running: show tables 'testHiveDriverTable'
    testhivedrivertable
    Running: describe testHiveDriverTable
    key	int
    value	string
    Running: load data local inpath '/tmp/a.txt' into table testHiveDriverTable
    Exception in thread "main" java.sql.SQLException: Query returned non-zero code: 10, cause: FAILED: Error in semantic analysis: Line 1:23 Invalid path '/tmp/a.txt': No files matching path file:/tmp/a.txt
        at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
        at com.zte.allen.hive.HiveJdbcClient.main(HiveJdbcClient.java:53)

    An exception is thrown because the load source /tmp/a.txt cannot be found. This does not affect the rest of the test; in hwi you can see that the table testhivedrivertable has already been created.
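
    The load step fails only because no such file exists on the hiveserver machine. Below is a minimal sketch for generating a Ctrl-A (\u0001) separated /tmp/a.txt matching the (key int, value string) schema; it assumes the code runs on the same host as hiveserver, and the class name and sample rows are only illustrative:

    import java.io.FileWriter;
    import java.io.IOException;

    public class CreateSampleData {
        public static void main(String[] args) throws IOException {
            FileWriter out = new FileWriter("/tmp/a.txt");
            try {
                // two Ctrl-A (\u0001) separated fields per line, matching
                // the (key int, value string) schema of testHiveDriverTable
                for (int i = 1; i <= 5; i++) {
                    out.write(i + "\u0001" + "value" + i + "\n");
                }
            } finally {
                out.close();
            }
        }
    }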

    For more on hiveserver, see the Setting up Hive Server page on the official wiki; there is also a page there describing how various clients (CLI, Java, PHP, Python, ODBC, Thrift, etc.) access Hive.
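
    One more note on the wiki example: it never closes the statement or the connection. A small standalone sketch of the same JDBC access pattern with explicit cleanup (the class name is mine; the driver and URL are the same as in the example above):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class HiveJdbcCleanupExample {
        public static void main(String[] args) throws SQLException, ClassNotFoundException {
            Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:hive://10.46.169.101:10000/default", "", "");
            Statement stmt = con.createStatement();
            try {
                // query the table created by the wiki example
                ResultSet res = stmt.executeQuery("select * from testHiveDriverTable");
                while (res.next()) {
                    System.out.println(res.getInt(1) + "\t" + res.getString(2));
                }
                res.close();
            } finally {
                // release JDBC resources explicitly instead of leaving them
                // open until the JVM exits, as the wiki example does
                stmt.close();
                con.close();
            }
        }
    }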