  • Hive JDBC client program in Eclipse (hive-0.12.0 + hadoop-2.4.0 cluster)

    1. In Eclipse, create a new Other -> Map/Reduce Project

    This project type automatically includes the relevant Hadoop jars.

    In addition, import the Hive jars and the MySQL connector jar:

    hive/lib/*.jar

    mysql-connector-java-5.1.24-bin.jar

    2. Start HiveServer

    Command: bin/hive --service hiveserver &

    After running this command several times without success, it failed with: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:10000.

    Workaround: specify the port explicitly at startup:

    bin/hive --service hiveserver -p 10002

    If the default command starts without errors you can ignore this step, but the port in the code below must then be changed to the default 10000 so that the two match.
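    Whichever port is used, it ends up concatenated into the JDBC URL that the test code opens. A minimal sketch of that assembly (the helper class and method here are illustrative, not part of the original program; host, port, and database match the values used elsewhere in this article):

    ```java
    public class HiveUrlExample {
        // Hypothetical helper (not from the article's code): assembles the
        // HiveServer1 JDBC URL from its parts.
        static String hiveUrl(String host, int port, String db) {
            return "jdbc:hive://" + host + ":" + port + "/" + db;
        }

        public static void main(String[] args) {
            System.out.println(hiveUrl("192.168.1.200", 10002, "default"));
            // prints jdbc:hive://192.168.1.200:10002/default
        }
    }
    ```

    Keeping the port in one place like this makes it harder to start HiveServer on one port and connect on another, which is exactly the mismatch the workaround above can introduce.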


    3. Java test code in Eclipse

    If the project was not created as a Map/Reduce project, you will get: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable

    Fix: the Hadoop jars are missing from the classpath. Recreate the project as Other -> Map/Reduce Project, which pulls in the Hadoop development environment automatically.

    Also prepare a user_info.txt file under /home/hadoop/file/ with the following content (fields separated by tabs, matching the delimiter in the table definition):

    1001  jack    30
    1002  tom    25
    1003  kate    20
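    Before loading the file into Hive, it can help to sanity-check that each line actually splits into the three expected fields. A minimal sketch, assuming a tab field separator (the class name and the hard-coded sample lines are illustrative):

    ```java
    import java.util.Arrays;
    import java.util.List;

    public class UserInfoParser {
        // Splits one line of user_info.txt into (id, name, age),
        // assuming tab is the field delimiter.
        static String[] parseLine(String line) {
            return line.split("\t");
        }

        public static void main(String[] args) {
            List<String> sample = Arrays.asList(
                "1001\tjack\t30",
                "1002\ttom\t25",
                "1003\tkate\t20");
            for (String line : sample) {
                String[] f = parseLine(line);
                System.out.println(f[0] + " " + f[1] + " " + f[2]);
            }
        }
    }
    ```

    If a line splits into more or fewer than three fields, Hive will still load it, but the extra columns are dropped and missing ones come back NULL, so catching this before the load saves a confusing debugging session.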

    //--------------HiveQuery.java-----------------
    package test;

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class HiveQuery {
      // Hive 0.12 JDBC driver class (HiveServer1)
      private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

      public static void main(String[] args) throws SQLException {
        try {
          Class.forName(driverName);
        } catch (ClassNotFoundException e) {
          e.printStackTrace();
          System.exit(1);
        }
        // Port 10002 matches the -p option used when starting HiveServer;
        // use 10000 if the server was started on the default port.
        Connection con = DriverManager.getConnection("jdbc:hive://192.168.1.200:10002/default", "", "");
        Statement stmt = con.createStatement();
        String tableName = "testHiveDriverTable";
        stmt.executeQuery("drop table " + tableName);
        ResultSet res = stmt.executeQuery("create table " + tableName
            + " (id int, name string, age string)"
            + " row format delimited fields terminated by '\t' lines terminated by '\n'");

        // show tables
        String sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        if (res.next()) {
          System.out.println(res.getString(1));
        }

        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
          System.out.println(res.getString(1) + " " + res.getString(2) + " " + res.getString(3));
        }

        // load data into table
        // NOTE: the filepath must be local to the Hive server
        String filepath = "/home/hadoop/file/user_info.txt";
        sql = "load data local inpath '" + filepath + "' into table " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);

        // select * query
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
          System.out.println(res.getInt(1) + " " + res.getString(2) + " " + res.getString(3));
        }

        // regular hive query
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
          System.out.println(res.getString(1));
        }
      }
    }
    //------------end---------------------------------------------
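    One rough edge in the listing above: the ResultSet, Statement, and Connection are never closed, which can leak server-side handles. Since JDBC objects implement AutoCloseable, try-with-resources can release them automatically. The sketch below uses stand-in AutoCloseables (no live HiveServer is assumed) just to show the close order that construct guarantees:

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class CleanupSketch {
        // Records the order in which handles are closed.
        static List<String> closed = new ArrayList<>();

        // Stand-in for a JDBC object (Connection/Statement/ResultSet).
        static AutoCloseable handle(String name) {
            return () -> closed.add(name);
        }

        public static void main(String[] args) throws Exception {
            // try-with-resources closes in reverse declaration order:
            // ResultSet first, then Statement, then Connection.
            try (AutoCloseable con = handle("con");
                 AutoCloseable stmt = handle("stmt");
                 AutoCloseable res = handle("res")) {
                // queries would run here
            }
            System.out.println(closed); // prints [res, stmt, con]
        }
    }
    ```

    The reverse close order matters: closing the Connection first can invalidate the Statement and ResultSet before they get a chance to clean up.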

    4. Output

    Running: show tables 'testHiveDriverTable'

    testhivedrivertable

    Running: describe testHiveDriverTable

    id      int

    name    string

    age     string

    Running: load data local inpath '/home/hadoop/file/user_info.txt' into table testHiveDriverTable

    Running: select * from testHiveDriverTable

    1001 jack 30

    1002 tom 25

    1003 kate 20

    Running: select count(1) from testHiveDriverTable

    3

  • Original post: https://www.cnblogs.com/zhaohz/p/4403066.html