  • Sqoop import of a MySQL table into Hive fails with an error

    [hdfs@node1 root]$ sqoop import --connect jdbc:mysql://node2:3306/cm?charset-utf8 --username root --password 123456 --table USERS --columns "USER_ID,USER_NAME,PASSWORD_HASH" --fields-terminated-by " " --lines-terminated-by " " --hive-import --create-hive-table --hive-table cm_users 

    Warning: /opt/cloudera/parcels/CDH-5.9.2-1.cdh5.9.2.p0.3/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
    Please set $ACCUMULO_HOME to the root of your Accumulo installation.
    18/04/23 13:45:45 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.9.2
    18/04/23 13:45:45 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
    18/04/23 13:45:45 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
    18/04/23 13:45:45 INFO tool.CodeGenTool: Beginning code generation
    18/04/23 13:45:46 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: Access denied for user 'root'@'node1' (using password: YES)
    java.sql.SQLException: Access denied for user 'root'@'node1' (using password: YES)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3973)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3909)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
    at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1710)
    at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1226)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2191)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2222)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2017)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:779)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:389)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:330)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:247)
    at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
    at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
    at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1858)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1658)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

    =========================================================================

    At first I assumed the owner and permissions of the HDFS directory /user/hive needed to be changed, but after changing them the import still failed.
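
    A quick way to confirm that the failure is on the MySQL side rather than in HDFS is to run a trivial query over the same connection from node1, either with the mysql client or with sqoop eval (a sketch, not from the original post; it reuses the host and credentials from the command above):

    # assuming the mysql client is installed on node1; same credentials as the sqoop command
    [hdfs@node1 root]$ mysql -h node2 -P 3306 -u root -p123456 -e "SELECT 1"

    # or let Sqoop itself run a trivial query over the same JDBC URL
    [hdfs@node1 root]$ sqoop eval --connect jdbc:mysql://node2:3306/cm --username root --password 123456 --query "SELECT 1"

    If either command reports the same "Access denied for user 'root'@'node1'" error, the problem is MySQL authentication, not HDFS permissions.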

    After more trial and error, the root cause turned out to be a MySQL privilege problem: the root user was not allowed to connect from node1.
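
    To see which hosts root may currently connect from, you can check the user table on the MySQL server (node2); this check is an illustration, not part of the original session:

    mysql> SELECT user, host FROM mysql.user WHERE user = 'root';

    If there is no row with host 'node1' (or a wildcard '%'), connections from node1 are rejected with exactly this "Access denied" error.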

    mysql> grant all privileges on *.* to root@node1 identified by '123456';
    Query OK, 0 rows affected (0.00 sec)

    mysql> flush privileges;
    Query OK, 0 rows affected (0.00 sec)

    mysql> grant all privileges on *.* to root@node3 identified by '123456';
    Query OK, 0 rows affected (0.00 sec)

    mysql> flush privileges;
    Query OK, 0 rows affected (0.00 sec)

    This grants the root user on the other cluster nodes (node1 and node3) permission to access the MySQL database.
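
    Before rerunning the import, the new grants can be verified from the MySQL prompt (again a suggested check, not from the original post):

    mysql> SHOW GRANTS FOR 'root'@'node1';
    mysql> SHOW GRANTS FOR 'root'@'node3';

    With the grants in place, rerunning the sqoop import command from the top of the post should no longer fail with the access-denied error.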

  • Original post: https://www.cnblogs.com/gxc2015/p/8920030.html