hbase-0.92.1 table backup and restore

Original table structure and data

    hbase(main):021:0* describe 'test'
    DESCRIPTION                                                                                                                                                        ENABLED                                                                                   
     {NAME => 'test', FAMILIES => [{NAME => 'cf1', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', VERSIONS => '3', COMPRESSION => 'NONE', MIN_VERSIONS => '0', TTL = true                                                                                      
     > '2147483647', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}, {NAME => 'cf2', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION =                                                                                           
     > 'NONE', VERSIONS => '3', TTL => '2147483647', MIN_VERSIONS => '0', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'true'}]}                                                                                                                    
    1 row(s) in 0.0670 seconds
    
    hbase(main):022:0> scan 'test'
    ROW                                                              COLUMN+CELL                                                                                                                                                                                 
     row1                                                            column=cf1:age, timestamp=1555771920276, value=21                                                                                                                                           
     row1                                                            column=cf1:name, timestamp=1555771906481, value=zhangsan                                                                                                                                    
     row2                                                            column=cf2:age, timestamp=1555837304256, value=20                                                                                                                                           
     row2                                                            column=cf2:name, timestamp=1555837324252, value=wangba                                                                                                                                      
    2 row(s) in 0.0270 seconds
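For reference, the two rows shown above could have been loaded with `put` commands like the following. This is a hypothetical reconstruction (the original load is not shown in this post); the timestamps in the scan output were assigned automatically at write time.

```shell
# Hypothetical reconstruction of the original data load for table 'test'
hbase shell <<'EOF'
put 'test', 'row1', 'cf1:name', 'zhangsan'
put 'test', 'row1', 'cf1:age',  '21'
put 'test', 'row2', 'cf2:name', 'wangba'
put 'test', 'row2', 'cf2:age',  '20'
EOF
```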

I. Export and Import

    # hbase org.apache.hadoop.hbase.mapreduce.Export
    ERROR: Wrong number of arguments: 0
    Usage: Export [-D <property=value>]* <tablename> <outputdir> [<versions> [<starttime> [<endtime>]] [^[regex pattern] or [Prefix] to filter]]
    
      Note: -D properties will be applied to the conf used. 
      For example: 
       -D mapred.output.compress=true
       -D mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec
       -D mapred.output.compression.type=BLOCK
      Additionally, the following SCAN properties can be specified
      to control/limit what is exported..
       -D hbase.mapreduce.scan.column.family=<familyName>
    # hbase org.apache.hadoop.hbase.mapreduce.Import
    ERROR: Wrong number of arguments: 0
    Usage: Import <tablename> <inputdir>
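As the usage text above shows, Export also accepts an optional version count and time range, plus `-D` properties for output compression. A sketch combining these (the output path and epoch-millisecond timestamps are illustrative, not from this post):

```shell
# Export up to 3 versions of each cell written within the given time window,
# compressing the output SequenceFile with gzip.
hbase org.apache.hadoop.hbase.mapreduce.Export \
  -D mapred.output.compress=true \
  -D mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec \
  test /backup/test_3v 3 1555700000000 1555900000000
```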

1. Export to HDFS

    # hbase org.apache.hadoop.hbase.mapreduce.Export test /backup/test

Or, equivalently, with the full HDFS URI:

    # hbase org.apache.hadoop.hbase.mapreduce.Export test hdfs://sht-sgmhadoopnn-01:9011/backup/test

Output log:

    [root@sht-sgmhadoopdn-02 exp]# hbase org.apache.hadoop.hbase.mapreduce.Export test hdfs://sht-sgmhadoopnn-01:9011/backup/test
    19/04/21 17:45:39 INFO mapreduce.Export: verisons=1, starttime=0, endtime=9223372036854775807
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:45:39 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/opt/hbase-0.92.1/lib/slf4j-log4j12-1.5.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/opt/hadoop-1.0.3/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.3-1240972, built on 02/06/2012 10:48 GMT
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:host.name=sht-sgmhadoopdn-02
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_45
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:java.home=/opt/jdk1.6.0_45/jre
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/opt/hbase/bin/../conf:/opt/jdk1.6.0_45/lib/tools.jar:/opt/hbase/bin/..:/opt/hbase/bin/../hbase-0.92.1.jar:/opt/hbase/bin/../hbase-0.92.1-tests.jar:/opt/hbase/bin/../lib/activation-1.1.jar:/opt/hbase/bin/../lib/asm-3.1.jar:/opt/hbase/bin/../lib/avro-1.5.3.jar:/opt/hbase/bin/../lib/avro-ipc-1.5.3.jar:/opt/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/opt/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/opt/hbase/bin/../lib/commons-cli-1.2.jar:/opt/hbase/bin/../lib/commons-codec-1.4.jar:/opt/hbase/bin/../lib/commons-collections-3.2.1.jar:/opt/hbase/bin/../lib/commons-configuration-1.6.jar:/opt/hbase/bin/../lib/commons-digester-1.8.jar:/opt/hbase/bin/../lib/commons-el-1.0.jar:/opt/hbase/bin/../lib/commons-httpclient-3.1.jar:/opt/hbase/bin/../lib/commons-lang-2.5.jar:/opt/hbase/bin/../lib/commons-logging-1.1.1.jar:/opt/hbase/bin/../lib/commons-math-2.1.jar:/opt/hbase/bin/../lib/commons-net-1.4.1.jar:/opt/hbase/bin/../lib/core-3.1.1.jar:/opt/hbase/bin/../lib/guava-r09.jar:/opt/hbase/bin/../lib/hadoop-core-1.0.0.jar:/opt/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/opt/hbase/bin/../lib/httpclient-4.0.1.jar:/opt/hbase/bin/../lib/httpcore-4.0.1.jar:/opt/hbase/bin/../lib/jackson-core-asl-1.5.5.jar:/opt/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/opt/hbase/bin/../lib/jackson-mapper-asl-1.5.5.jar:/opt/hbase/bin/../lib/jackson-xc-1.5.5.jar:/opt/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/opt/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/opt/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/opt/hbase/bin/../lib/jaxb-api-2.1.jar:/opt/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/opt/hbase/bin/../lib/jersey-core-1.4.jar:/opt/hbase/bin/../lib/jersey-json-1.4.jar:/opt/hbase/bin/../lib/jersey-server-1.4.jar:/opt/hbase/bin/../lib/jettison-1.1.jar:/opt/hbase/bin/../lib/jetty-6.1.26.jar:/opt/hbase/bin/../lib/jetty-util-6.1.26.jar:/opt/hbase/bin/../lib/jruby-complete-1.6.5.jar:/opt/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/opt/h
base/bin/../lib/jsp-api-2.1-6.1.14.jar:/opt/hbase/bin/../lib/libthrift-0.7.0.jar:/opt/hbase/bin/../lib/log4j-1.2.16.jar:/opt/hbase/bin/../lib/netty-3.2.4.Final.jar:/opt/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/opt/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/opt/hbase/bin/../lib/servlet-api-2.5.jar:/opt/hbase/bin/../lib/slf4j-api-1.5.8.jar:/opt/hbase/bin/../lib/slf4j-log4j12-1.5.8.jar:/opt/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/opt/hbase/bin/../lib/stax-api-1.0.1.jar:/opt/hbase/bin/../lib/velocity-1.7.jar:/opt/hbase/bin/../lib/xmlenc-0.52.jar:/opt/hbase/bin/../lib/zookeeper-3.4.3.jar:/opt/hadoop/conf:/opt/hadoop-1.0.3/libexec/../conf:/opt/jdk1.6.0_45/lib/tools.jar:/opt/hadoop-1.0.3/libexec/..:/opt/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/opt/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/opt/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/opt/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-fai
rscheduler-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/opt/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/opt/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/opt/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/opt/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/opt/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/opt/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/opt/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/opt/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/opt/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/opt/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/opt/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/opt/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/opt/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/opt/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/opt/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/opt/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/opt/hadoop-1.0.3/libexec/../lib/native/Linux-amd64-64:/opt/hbase/bin/../lib/native/Linux-amd64-64
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:os.version=3.10.0-514.el7.x86_64
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:user.name=root
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:user.home=/root
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Client environment:user.dir=/opt/hbase-0.92.1/dba/exp
    19/04/21 17:45:45 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=sht-sgmhadoopdn-02:2182,sht-sgmhadoopdn-01:2182,sht-sgmhadoopdn-03:2182 sessionTimeout=60000 watcher=hconnection
    19/04/21 17:45:45 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 24347@sht-sgmhadoopdn-02.telenav.cn
    19/04/21 17:45:45 INFO zookeeper.ClientCnxn: Opening socket connection to server /172.16.101.60:2182
    19/04/21 17:45:45 WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
    19/04/21 17:45:45 INFO client.ZooKeeperSaslClient: Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
    19/04/21 17:45:45 INFO zookeeper.ClientCnxn: Socket connection established to sht-sgmhadoopdn-03/172.16.101.60:2182, initiating session
    19/04/21 17:45:45 WARN zookeeper.ClientCnxnSocket: Connected to an old server; r-o mode will be unavailable
    19/04/21 17:45:45 INFO zookeeper.ClientCnxn: Session establishment complete on server sht-sgmhadoopdn-03/172.16.101.60:2182, sessionid = 0x36a3a9e24d50034, negotiated timeout = 40000
    19/04/21 17:45:45 DEBUG client.HConnectionManager$HConnectionImplementation: Lookedup root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@58648016; serverName=sht-sgmhadoopdn-01,60021,1555762016498
    19/04/21 17:45:45 DEBUG client.HConnectionManager$HConnectionImplementation: Cached location for .META.,,1.1028785192 is sht-sgmhadoopdn-01:60021
    19/04/21 17:45:46 DEBUG client.MetaScanner: Scanning .META. starting at row=test,,00000000000000 for max=10 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@58648016
    19/04/21 17:45:46 DEBUG client.HConnectionManager$HConnectionImplementation: Cached location for test,,1555838328985.681b358885eb10357f9f811b77275b25. is sht-sgmhadoopdn-01:60021
    19/04/21 17:45:46 DEBUG client.MetaScanner: Scanning .META. starting at row=test,,00000000000000 for max=2147483647 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@58648016
    19/04/21 17:45:46 DEBUG mapreduce.TableInputFormatBase: getSplits: split -> 0 -> sht-sgmhadoopdn-01:,
    19/04/21 17:45:46 INFO mapred.JobClient: Running job: job_201904201958_0026
    19/04/21 17:45:47 INFO mapred.JobClient:  map 0% reduce 0%
    19/04/21 17:46:03 INFO mapred.JobClient:  map 100% reduce 0%
    19/04/21 17:46:08 INFO mapred.JobClient: Job complete: job_201904201958_0026
    19/04/21 17:46:08 INFO mapred.JobClient: Counters: 19
    19/04/21 17:46:08 INFO mapred.JobClient:   Job Counters 
    19/04/21 17:46:08 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=14713
    19/04/21 17:46:08 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
    19/04/21 17:46:08 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
    19/04/21 17:46:08 INFO mapred.JobClient:     Rack-local map tasks=1
    19/04/21 17:46:08 INFO mapred.JobClient:     Launched map tasks=1
    19/04/21 17:46:08 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
    19/04/21 17:46:08 INFO mapred.JobClient:   File Output Format Counters 
    19/04/21 17:46:08 INFO mapred.JobClient:     Bytes Written=310
    19/04/21 17:46:08 INFO mapred.JobClient:   FileSystemCounters
    19/04/21 17:46:08 INFO mapred.JobClient:     HDFS_BYTES_READ=71
    19/04/21 17:46:08 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=31358
    19/04/21 17:46:08 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=310
    19/04/21 17:46:08 INFO mapred.JobClient:   File Input Format Counters 
    19/04/21 17:46:08 INFO mapred.JobClient:     Bytes Read=0
    19/04/21 17:46:08 INFO mapred.JobClient:   Map-Reduce Framework
    19/04/21 17:46:08 INFO mapred.JobClient:     Map input records=2
    19/04/21 17:46:08 INFO mapred.JobClient:     Physical memory (bytes) snapshot=81055744
    19/04/21 17:46:08 INFO mapred.JobClient:     Spilled Records=0
    19/04/21 17:46:08 INFO mapred.JobClient:     CPU time spent (ms)=1390
    19/04/21 17:46:08 INFO mapred.JobClient:     Total committed heap usage (bytes)=91226112
    19/04/21 17:46:08 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=1540837376
    19/04/21 17:46:08 INFO mapred.JobClient:     Map output records=2
    19/04/21 17:46:08 INFO mapred.JobClient:     SPLIT_RAW_BYTES=71

2. Inspect the backup files

    # hadoop fs -ls /backup/test
    Found 3 items
    -rw-r--r--   3 root supergroup          0 2019-04-21 17:46 /backup/test/_SUCCESS
    drwxr-xr-x   - root supergroup          0 2019-04-21 17:45 /backup/test/_logs
    -rw-r--r--   3 root supergroup        310 2019-04-21 17:45 /backup/test/part-m-00000
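The part file is a Hadoop SequenceFile of serialized Result objects, so it is not plain text; still, `hadoop fs -text` can decode it enough to confirm the export is non-empty (row keys should be visible amid binary data):

```shell
# Peek at the exported SequenceFile; expect partially binary output
hadoop fs -text /backup/test/part-m-00000 | head
```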

3. Create the new table

    hbase(main):032:0> create 'emp', 'cf1', 'cf2'
    0 row(s) in 1.0590 seconds

4. Import the backup into the new table

    # hbase org.apache.hadoop.hbase.mapreduce.Import emp hdfs://sht-sgmhadoopnn-01:9011/backup/test

Or, equivalently, with a relative HDFS path:

    # hbase org.apache.hadoop.hbase.mapreduce.Import emp /backup/test

Output log:

    [root@sht-sgmhadoopdn-02 exp]# hbase org.apache.hadoop.hbase.mapreduce.Import emp hdfs://sht-sgmhadoopnn-01:9011/backup/test
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    19/04/21 17:49:55 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/opt/hbase-0.92.1/lib/slf4j-log4j12-1.5.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/opt/hadoop-1.0.3/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.3-1240972, built on 02/06/2012 10:48 GMT
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:host.name=sht-sgmhadoopdn-02
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_45
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:java.home=/opt/jdk1.6.0_45/jre
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/opt/hbase/bin/../conf:/opt/jdk1.6.0_45/lib/tools.jar:/opt/hbase/bin/..:/opt/hbase/bin/../hbase-0.92.1.jar:/opt/hbase/bin/../hbase-0.92.1-tests.jar:/opt/hbase/bin/../lib/activation-1.1.jar:/opt/hbase/bin/../lib/asm-3.1.jar:/opt/hbase/bin/../lib/avro-1.5.3.jar:/opt/hbase/bin/../lib/avro-ipc-1.5.3.jar:/opt/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/opt/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/opt/hbase/bin/../lib/commons-cli-1.2.jar:/opt/hbase/bin/../lib/commons-codec-1.4.jar:/opt/hbase/bin/../lib/commons-collections-3.2.1.jar:/opt/hbase/bin/../lib/commons-configuration-1.6.jar:/opt/hbase/bin/../lib/commons-digester-1.8.jar:/opt/hbase/bin/../lib/commons-el-1.0.jar:/opt/hbase/bin/../lib/commons-httpclient-3.1.jar:/opt/hbase/bin/../lib/commons-lang-2.5.jar:/opt/hbase/bin/../lib/commons-logging-1.1.1.jar:/opt/hbase/bin/../lib/commons-math-2.1.jar:/opt/hbase/bin/../lib/commons-net-1.4.1.jar:/opt/hbase/bin/../lib/core-3.1.1.jar:/opt/hbase/bin/../lib/guava-r09.jar:/opt/hbase/bin/../lib/hadoop-core-1.0.0.jar:/opt/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/opt/hbase/bin/../lib/httpclient-4.0.1.jar:/opt/hbase/bin/../lib/httpcore-4.0.1.jar:/opt/hbase/bin/../lib/jackson-core-asl-1.5.5.jar:/opt/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/opt/hbase/bin/../lib/jackson-mapper-asl-1.5.5.jar:/opt/hbase/bin/../lib/jackson-xc-1.5.5.jar:/opt/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/opt/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/opt/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/opt/hbase/bin/../lib/jaxb-api-2.1.jar:/opt/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/opt/hbase/bin/../lib/jersey-core-1.4.jar:/opt/hbase/bin/../lib/jersey-json-1.4.jar:/opt/hbase/bin/../lib/jersey-server-1.4.jar:/opt/hbase/bin/../lib/jettison-1.1.jar:/opt/hbase/bin/../lib/jetty-6.1.26.jar:/opt/hbase/bin/../lib/jetty-util-6.1.26.jar:/opt/hbase/bin/../lib/jruby-complete-1.6.5.jar:/opt/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/opt/h
base/bin/../lib/jsp-api-2.1-6.1.14.jar:/opt/hbase/bin/../lib/libthrift-0.7.0.jar:/opt/hbase/bin/../lib/log4j-1.2.16.jar:/opt/hbase/bin/../lib/netty-3.2.4.Final.jar:/opt/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/opt/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/opt/hbase/bin/../lib/servlet-api-2.5.jar:/opt/hbase/bin/../lib/slf4j-api-1.5.8.jar:/opt/hbase/bin/../lib/slf4j-log4j12-1.5.8.jar:/opt/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/opt/hbase/bin/../lib/stax-api-1.0.1.jar:/opt/hbase/bin/../lib/velocity-1.7.jar:/opt/hbase/bin/../lib/xmlenc-0.52.jar:/opt/hbase/bin/../lib/zookeeper-3.4.3.jar:/opt/hadoop/conf:/opt/hadoop-1.0.3/libexec/../conf:/opt/jdk1.6.0_45/lib/tools.jar:/opt/hadoop-1.0.3/libexec/..:/opt/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/opt/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/opt/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/opt/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-fai
rscheduler-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/opt/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/opt/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/opt/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/opt/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/opt/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/opt/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/opt/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/opt/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/opt/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/opt/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/opt/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/opt/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/opt/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/opt/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/opt/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/opt/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/opt/hadoop-1.0.3/libexec/../lib/native/Linux-amd64-64:/opt/hbase/bin/../lib/native/Linux-amd64-64
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:os.version=3.10.0-514.el7.x86_64
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:user.name=root
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:user.home=/root
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Client environment:user.dir=/opt/hbase-0.92.1/dba/exp
    19/04/21 17:49:56 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=sht-sgmhadoopdn-02:2182,sht-sgmhadoopdn-01:2182,sht-sgmhadoopdn-03:2182 sessionTimeout=60000 watcher=hconnection
    19/04/21 17:49:56 INFO zookeeper.ClientCnxn: Opening socket connection to server /172.16.101.59:2182
    19/04/21 17:49:56 WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
    19/04/21 17:49:56 INFO client.ZooKeeperSaslClient: Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
    19/04/21 17:49:56 INFO zookeeper.ClientCnxn: Socket connection established to sht-sgmhadoopdn-02/172.16.101.59:2182, initiating session
    19/04/21 17:49:56 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 24873@sht-sgmhadoopdn-02.telenav.cn
    19/04/21 17:49:56 WARN zookeeper.ClientCnxnSocket: Connected to an old server; r-o mode will be unavailable
    19/04/21 17:49:56 INFO zookeeper.ClientCnxn: Session establishment complete on server sht-sgmhadoopdn-02/172.16.101.59:2182, sessionid = 0x26a3a9dc0150032, negotiated timeout = 40000
    19/04/21 17:49:56 DEBUG client.HConnectionManager$HConnectionImplementation: Lookedup root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@66922804; serverName=sht-sgmhadoopdn-01,60021,1555762016498
    19/04/21 17:49:56 DEBUG client.HConnectionManager$HConnectionImplementation: Cached location for .META.,,1.1028785192 is sht-sgmhadoopdn-01:60021
    19/04/21 17:49:56 DEBUG client.MetaScanner: Scanning .META. starting at row=emp,,00000000000000 for max=10 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@66922804
    19/04/21 17:49:56 DEBUG client.HConnectionManager$HConnectionImplementation: Cached location for emp,,1555840094033.a8346463e975084ba0398d3bf9c32649. is sht-sgmhadoopdn-03:60021
    19/04/21 17:49:56 INFO mapreduce.TableOutputFormat: Created table instance for emp
    19/04/21 17:49:57 INFO input.FileInputFormat: Total input paths to process : 1
    19/04/21 17:49:57 INFO mapred.JobClient: Running job: job_201904201958_0028
    19/04/21 17:49:58 INFO mapred.JobClient:  map 0% reduce 0%
    19/04/21 17:50:14 INFO mapred.JobClient:  map 100% reduce 0%
    19/04/21 17:50:19 INFO mapred.JobClient: Job complete: job_201904201958_0028
    19/04/21 17:50:19 INFO mapred.JobClient: Counters: 18
    19/04/21 17:50:19 INFO mapred.JobClient:   Job Counters 
    19/04/21 17:50:19 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=13335
    19/04/21 17:50:19 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
    19/04/21 17:50:19 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
    19/04/21 17:50:19 INFO mapred.JobClient:     Launched map tasks=1
    19/04/21 17:50:19 INFO mapred.JobClient:     Data-local map tasks=1
    19/04/21 17:50:19 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
    19/04/21 17:50:19 INFO mapred.JobClient:   File Output Format Counters 
    19/04/21 17:50:19 INFO mapred.JobClient:     Bytes Written=0
    19/04/21 17:50:19 INFO mapred.JobClient:   FileSystemCounters
    19/04/21 17:50:19 INFO mapred.JobClient:     HDFS_BYTES_READ=430
    19/04/21 17:50:19 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=31298
    19/04/21 17:50:19 INFO mapred.JobClient:   File Input Format Counters 
    19/04/21 17:50:19 INFO mapred.JobClient:     Bytes Read=310
    19/04/21 17:50:19 INFO mapred.JobClient:   Map-Reduce Framework
    19/04/21 17:50:19 INFO mapred.JobClient:     Map input records=2
    19/04/21 17:50:19 INFO mapred.JobClient:     Physical memory (bytes) snapshot=91877376
    19/04/21 17:50:19 INFO mapred.JobClient:     Spilled Records=0
    19/04/21 17:50:19 INFO mapred.JobClient:     CPU time spent (ms)=90
    19/04/21 17:50:19 INFO mapred.JobClient:     Total committed heap usage (bytes)=91226112
    19/04/21 17:50:19 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=1535459328
    19/04/21 17:50:19 INFO mapred.JobClient:     Map output records=2
    19/04/21 17:50:19 INFO mapred.JobClient:     SPLIT_RAW_BYTES=120

5. Verify the data in the new table

    hbase(main):034:0> scan 'emp'
    ROW                                                              COLUMN+CELL                                                                                                                                                                                 
     row1                                                            column=cf1:age, timestamp=1555771920276, value=21                                                                                                                                           
     row1                                                            column=cf1:name, timestamp=1555771906481, value=zhangsan                                                                                                                                    
     row2                                                            column=cf2:age, timestamp=1555837304256, value=20                                                                                                                                           
     row2                                                            column=cf2:name, timestamp=1555837324252, value=wangba                                                                                                                                      
    2 row(s) in 0.0450 seconds
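Besides scanning, the restore can be cross-checked by comparing row counts of the source and target tables with the bundled RowCounter MapReduce job (both should report 2 rows here):

```shell
# Count rows in the original and restored tables
hbase org.apache.hadoop.hbase.mapreduce.RowCounter test
hbase org.apache.hadoop.hbase.mapreduce.RowCounter emp
```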

II. Copying a table with CopyTable

    # hbase org.apache.hadoop.hbase.mapreduce.CopyTable
    Usage: CopyTable [--rs.class=CLASS] [--rs.impl=IMPL] [--starttime=X] [--endtime=Y] [--new.name=NEW] [--peer.adr=ADR] <tablename>
    
    Options:
     rs.class     hbase.regionserver.class of the peer cluster
                  specify if different from current cluster
     rs.impl      hbase.regionserver.impl of the peer cluster
     starttime    beginning of the time range
                  without endtime means from starttime to forever
     endtime      end of the time range
     new.name     new table's name
     peer.adr     Address of the peer cluster given in the format
                  hbase.zookeeper.quorum:hbase.zookeeper.client.port:zookeeper.znode.parent
     families     comma-separated list of families to copy
                  To copy from cf1 to cf2, give sourceCfName:destCfName. 
                  To keep the same name, just give "cfName"
    
    Args:
     tablename    Name of the table to copy
    
    Examples:
     To copy 'TestTable' to a cluster that uses replication for a 1 hour window:
     $ bin/hbase org.apache.hadoop.hbase.mapreduce.CopyTable --rs.class=org.apache.hadoop.hbase.ipc.ReplicationRegionInterface --rs.impl=org.apache.hadoop.hbase.regionserver.replication.ReplicationRegionServer --starttime=1265875194289 --endtime=1265878794289 --peer.adr=server1,server2,server3:2181:/hbase --families=myOldCf:myNewCf,cf2,cf3 TestTable
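    The `--starttime`/`--endtime` values in the example above are epoch timestamps in milliseconds (1265875194289 to 1265878794289 is exactly one hour). A small helper can compute such a window; this is a sketch of my own, not part of the CopyTable tool, and it interprets the given wall-clock time as UTC to stay deterministic:

    ```python
    from datetime import datetime, timedelta, timezone

    def copytable_window(end, hours=1):
        """Return (--starttime, --endtime) as epoch milliseconds for CopyTable,
        interpreting the naive datetime `end` as UTC."""
        end = end.replace(tzinfo=timezone.utc)
        start = end - timedelta(hours=hours)
        to_ms = lambda dt: int(dt.timestamp() * 1000)
        return to_ms(start), to_ms(end)

    start_ms, end_ms = copytable_window(datetime(2019, 4, 21, 18, 0))
    print(end_ms - start_ms)  # 3600000 ms = 1 hour, like the example window
    ```

    The computed values would then be passed as `--starttime=<start_ms> --endtime=<end_ms>` on the CopyTable command line.
    
    
    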

    1. Create the new table structure

    hbase(main):035:0> create 'emp1', 'cf1', 'cf2'
    0 row(s) in 1.0610 seconds

    2. Copy the old table's data into the new table

    # hbase org.apache.hadoop.hbase.mapreduce.CopyTable --new.name=emp1 test

    Output log:

    [root@sht-sgmhadoopdn-01 exp]# hbase org.apache.hadoop.hbase.mapreduce.CopyTable --new.name=emp1 test
    19/04/21 18:01:18 DEBUG mapreduce.TableMapReduceUtil: New JarFinder: org.apache.hadoop.util.JarFinder.getJar not available.  Using old findContainingJar
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/opt/hbase-0.92.1/lib/slf4j-log4j12-1.5.8.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/opt/hadoop-1.0.3/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.3-1240972, built on 02/06/2012 10:48 GMT
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:host.name=sht-sgmhadoopdn-01
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:java.version=1.6.0_45
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Sun Microsystems Inc.
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:java.home=/opt/jdk1.6.0_45/jre
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:java.class.path=/opt/hbase/bin/../conf:/opt/jdk1.6.0_45/lib/tools.jar:/opt/hbase/bin/..:/opt/hbase/bin/../hbase-0.92.1.jar:/opt/hbase/bin/../hbase-0.92.1-tests.jar:/opt/hbase/bin/../lib/activation-1.1.jar:/opt/hbase/bin/../lib/asm-3.1.jar:/opt/hbase/bin/../lib/avro-1.5.3.jar:/opt/hbase/bin/../lib/avro-ipc-1.5.3.jar:/opt/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/opt/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/opt/hbase/bin/../lib/commons-cli-1.2.jar:/opt/hbase/bin/../lib/commons-codec-1.4.jar:/opt/hbase/bin/../lib/commons-collections-3.2.1.jar:/opt/hbase/bin/../lib/commons-configuration-1.6.jar:/opt/hbase/bin/../lib/commons-digester-1.8.jar:/opt/hbase/bin/../lib/commons-el-1.0.jar:/opt/hbase/bin/../lib/commons-httpclient-3.1.jar:/opt/hbase/bin/../lib/commons-lang-2.5.jar:/opt/hbase/bin/../lib/commons-logging-1.1.1.jar:/opt/hbase/bin/../lib/commons-math-2.1.jar:/opt/hbase/bin/../lib/commons-net-1.4.1.jar:/opt/hbase/bin/../lib/core-3.1.1.jar:/opt/hbase/bin/../lib/guava-r09.jar:/opt/hbase/bin/../lib/hadoop-core-1.0.0.jar:/opt/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/opt/hbase/bin/../lib/httpclient-4.0.1.jar:/opt/hbase/bin/../lib/httpcore-4.0.1.jar:/opt/hbase/bin/../lib/jackson-core-asl-1.5.5.jar:/opt/hbase/bin/../lib/jackson-jaxrs-1.5.5.jar:/opt/hbase/bin/../lib/jackson-mapper-asl-1.5.5.jar:/opt/hbase/bin/../lib/jackson-xc-1.5.5.jar:/opt/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/opt/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/opt/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/opt/hbase/bin/../lib/jaxb-api-2.1.jar:/opt/hbase/bin/../lib/jaxb-impl-2.1.12.jar:/opt/hbase/bin/../lib/jersey-core-1.4.jar:/opt/hbase/bin/../lib/jersey-json-1.4.jar:/opt/hbase/bin/../lib/jersey-server-1.4.jar:/opt/hbase/bin/../lib/jettison-1.1.jar:/opt/hbase/bin/../lib/jetty-6.1.26.jar:/opt/hbase/bin/../lib/jetty-util-6.1.26.jar:/opt/hbase/bin/../lib/jruby-complete-1.6.5.jar:/opt/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/opt/h
base/bin/../lib/jsp-api-2.1-6.1.14.jar:/opt/hbase/bin/../lib/libthrift-0.7.0.jar:/opt/hbase/bin/../lib/log4j-1.2.16.jar:/opt/hbase/bin/../lib/netty-3.2.4.Final.jar:/opt/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/opt/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/opt/hbase/bin/../lib/servlet-api-2.5.jar:/opt/hbase/bin/../lib/slf4j-api-1.5.8.jar:/opt/hbase/bin/../lib/slf4j-log4j12-1.5.8.jar:/opt/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/opt/hbase/bin/../lib/stax-api-1.0.1.jar:/opt/hbase/bin/../lib/velocity-1.7.jar:/opt/hbase/bin/../lib/xmlenc-0.52.jar:/opt/hbase/bin/../lib/zookeeper-3.4.3.jar:/opt/hadoop/conf:/opt/hadoop-1.0.3/libexec/../conf:/opt/jdk1.6.0_45/lib/tools.jar:/opt/hadoop-1.0.3/libexec/..:/opt/hadoop-1.0.3/libexec/../hadoop-core-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/asm-3.2.jar:/opt/hadoop-1.0.3/libexec/../lib/aspectjrt-1.6.5.jar:/opt/hadoop-1.0.3/libexec/../lib/aspectjtools-1.6.5.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-beanutils-1.7.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-beanutils-core-1.8.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-cli-1.2.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-codec-1.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-collections-3.2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-configuration-1.6.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-daemon-1.0.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-digester-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-el-1.0.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-httpclient-3.0.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-io-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-lang-2.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-logging-1.1.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-logging-api-1.0.4.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-math-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/commons-net-1.4.1.jar:/opt/hadoop-1.0.3/libexec/../lib/core-3.1.1.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-fai
rscheduler-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/opt/hadoop-1.0.3/libexec/../lib/hsqldb-1.8.0.10.jar:/opt/hadoop-1.0.3/libexec/../lib/jackson-core-asl-1.8.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jasper-compiler-5.5.12.jar:/opt/hadoop-1.0.3/libexec/../lib/jasper-runtime-5.5.12.jar:/opt/hadoop-1.0.3/libexec/../lib/jdeb-0.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-core-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-json-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jersey-server-1.8.jar:/opt/hadoop-1.0.3/libexec/../lib/jets3t-0.6.1.jar:/opt/hadoop-1.0.3/libexec/../lib/jetty-6.1.26.jar:/opt/hadoop-1.0.3/libexec/../lib/jetty-util-6.1.26.jar:/opt/hadoop-1.0.3/libexec/../lib/jsch-0.1.42.jar:/opt/hadoop-1.0.3/libexec/../lib/junit-4.5.jar:/opt/hadoop-1.0.3/libexec/../lib/kfs-0.2.2.jar:/opt/hadoop-1.0.3/libexec/../lib/log4j-1.2.15.jar:/opt/hadoop-1.0.3/libexec/../lib/mockito-all-1.8.5.jar:/opt/hadoop-1.0.3/libexec/../lib/oro-2.0.8.jar:/opt/hadoop-1.0.3/libexec/../lib/servlet-api-2.5-20081211.jar:/opt/hadoop-1.0.3/libexec/../lib/slf4j-api-1.4.3.jar:/opt/hadoop-1.0.3/libexec/../lib/slf4j-log4j12-1.4.3.jar:/opt/hadoop-1.0.3/libexec/../lib/xmlenc-0.52.jar:/opt/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-2.1.jar:/opt/hadoop-1.0.3/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/opt/hadoop-1.0.3/libexec/../lib/native/Linux-amd64-64:/opt/hbase/bin/../lib/native/Linux-amd64-64
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:os.version=3.10.0-514.el7.x86_64
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:user.name=root
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:user.home=/root
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Client environment:user.dir=/opt/hbase-0.92.1/dba/exp
    19/04/21 18:01:19 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=sht-sgmhadoopdn-02:2182,sht-sgmhadoopdn-01:2182,sht-sgmhadoopdn-03:2182 sessionTimeout=60000 watcher=hconnection
    19/04/21 18:01:19 INFO zookeeper.ClientCnxn: Opening socket connection to server /172.16.101.58:2182
    19/04/21 18:01:19 WARN client.ZooKeeperSaslClient: SecurityException: java.lang.SecurityException: Unable to locate a login configuration occurred when trying to find JAAS configuration.
    19/04/21 18:01:19 INFO client.ZooKeeperSaslClient: Client will not SASL-authenticate because the default JAAS configuration section 'Client' could not be found. If you are not using SASL, you may ignore this. On the other hand, if you expected SASL to work, please fix your JAAS configuration.
    19/04/21 18:01:19 INFO zookeeper.ClientCnxn: Socket connection established to sht-sgmhadoopdn-01/172.16.101.58:2182, initiating session
    19/04/21 18:01:19 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 12345@sht-sgmhadoopdn-01
    19/04/21 18:01:19 WARN zookeeper.ClientCnxnSocket: Connected to an old server; r-o mode will be unavailable
    19/04/21 18:01:19 INFO zookeeper.ClientCnxn: Session establishment complete on server sht-sgmhadoopdn-01/172.16.101.58:2182, sessionid = 0x16a3a9dc00f0035, negotiated timeout = 40000
    19/04/21 18:01:19 DEBUG client.HConnectionManager$HConnectionImplementation: Lookedup root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@d18d189; serverName=sht-sgmhadoopdn-01,60021,1555762016498
    19/04/21 18:01:19 DEBUG client.HConnectionManager$HConnectionImplementation: Cached location for .META.,,1.1028785192 is sht-sgmhadoopdn-01:60021
    19/04/21 18:01:19 DEBUG client.MetaScanner: Scanning .META. starting at row=emp1,,00000000000000 for max=10 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@d18d189
    19/04/21 18:01:19 DEBUG client.HConnectionManager$HConnectionImplementation: Cached location for emp1,,1555840809230.6fda341441637758b7ea64c63a769f79. is sht-sgmhadoopdn-01:60021
    19/04/21 18:01:19 INFO mapreduce.TableOutputFormat: Created table instance for emp1
    19/04/21 18:01:19 DEBUG client.MetaScanner: Scanning .META. starting at row=test,,00000000000000 for max=10 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@d18d189
    19/04/21 18:01:19 DEBUG client.HConnectionManager$HConnectionImplementation: Cached location for test,,1555838328985.681b358885eb10357f9f811b77275b25. is sht-sgmhadoopdn-01:60021
    19/04/21 18:01:19 DEBUG client.MetaScanner: Scanning .META. starting at row=test,,00000000000000 for max=2147483647 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@d18d189
    19/04/21 18:01:19 DEBUG mapreduce.TableInputFormatBase: getSplits: split -> 0 -> sht-sgmhadoopdn-01:,
    19/04/21 18:01:19 INFO mapred.JobClient: Running job: job_201904201958_0029
    19/04/21 18:01:20 INFO mapred.JobClient:  map 0% reduce 0%
    19/04/21 18:01:36 INFO mapred.JobClient:  map 100% reduce 0%
    19/04/21 18:01:41 INFO mapred.JobClient: Job complete: job_201904201958_0029
    19/04/21 18:01:42 INFO mapred.JobClient: Counters: 18
    19/04/21 18:01:42 INFO mapred.JobClient:   Job Counters 
    19/04/21 18:01:42 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=14788
    19/04/21 18:01:42 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
    19/04/21 18:01:42 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
    19/04/21 18:01:42 INFO mapred.JobClient:     Rack-local map tasks=1
    19/04/21 18:01:42 INFO mapred.JobClient:     Launched map tasks=1
    19/04/21 18:01:42 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
    19/04/21 18:01:42 INFO mapred.JobClient:   File Output Format Counters 
    19/04/21 18:01:42 INFO mapred.JobClient:     Bytes Written=0
    19/04/21 18:01:42 INFO mapred.JobClient:   FileSystemCounters
    19/04/21 18:01:42 INFO mapred.JobClient:     HDFS_BYTES_READ=71
    19/04/21 18:01:42 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=31301
    19/04/21 18:01:42 INFO mapred.JobClient:   File Input Format Counters 
    19/04/21 18:01:42 INFO mapred.JobClient:     Bytes Read=0
    19/04/21 18:01:42 INFO mapred.JobClient:   Map-Reduce Framework
    19/04/21 18:01:42 INFO mapred.JobClient:     Map input records=2
    19/04/21 18:01:42 INFO mapred.JobClient:     Physical memory (bytes) snapshot=77787136
    19/04/21 18:01:42 INFO mapred.JobClient:     Spilled Records=0
    19/04/21 18:01:42 INFO mapred.JobClient:     CPU time spent (ms)=150
    19/04/21 18:01:42 INFO mapred.JobClient:     Total committed heap usage (bytes)=91226112
    19/04/21 18:01:42 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=1539833856
    19/04/21 18:01:42 INFO mapred.JobClient:     Map output records=2
    19/04/21 18:01:42 INFO mapred.JobClient:     SPLIT_RAW_BYTES=71

    3. View the data in the new table

    hbase(main):036:0> scan 'emp1'
    ROW                                                              COLUMN+CELL                                                                                                                                                                                 
     row1                                                            column=cf1:age, timestamp=1555771920276, value=21                                                                                                                                           
     row1                                                            column=cf1:name, timestamp=1555771906481, value=zhangsan                                                                                                                                    
     row2                                                            column=cf2:age, timestamp=1555837304256, value=20                                                                                                                                           
     row2                                                            column=cf2:name, timestamp=1555837324252, value=wangba                                                                                                                                      
    2 row(s) in 0.0240 seconds
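    The scan confirms the copy visually; for larger tables a mechanical check is safer. As a sketch (the function and the in-memory dump format `{row: {column: value}}` are illustrative, not an HBase API), one can compare two scan dumps and report any rows whose cells differ:

    ```python
    def diff_tables(src, dst):
        """Compare two scan dumps of shape {row: {column: value}};
        return the set of row keys whose cells differ between the tables."""
        rows = set(src) | set(dst)
        return {r for r in rows if src.get(r) != dst.get(r)}

    # The data from this post's 'test' table:
    test_dump = {
        "row1": {"cf1:age": "21", "cf1:name": "zhangsan"},
        "row2": {"cf2:age": "20", "cf2:name": "wangba"},
    }
    emp1_dump = dict(test_dump)  # a faithful CopyTable run yields identical cells
    print(diff_tables(test_dump, emp1_dump))  # set()
    ```

    In practice the two dumps would come from scanning the source and destination tables (e.g. via the shell or a client library); an empty result set means every row matched.
    
    
    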
    Original article: https://www.cnblogs.com/ilifeilong/p/10746091.html