  • Hadoop: The Definitive Guide, Study Notes 2 (Sqoop)

    6. Sqoop

    Apache Sqoop is an open source tool that allows users to extract data from structured data stores into Hadoop, or export it back out.

    1) run

    from a local install:     $SQOOP_HOME/bin/sqoop

    from a packaged install:  sqoop (installed to /usr/bin/sqoop by default)
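
    Either form behaves the same. As a quick check, a minimal sketch (assuming the packaged form is on the PATH):

    % $SQOOP_HOME/bin/sqoop version     # local install
    % sqoop version                     # packaged install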

    2) common commands

        sqoop help            # list the available tools

        sqoop help import     # given a tool name, print that tool's usage

        sqoop import          # run a tool

        sqoop-toolname        # alternate way to run a tool (e.g. sqoop-import)

    3) Sqoop Connector

    Sqoop has an extension framework that makes it possible to import data from, and export data to, any external storage system that has bulk data transfer capabilities.

    A Sqoop connector is a modular component built on this framework that enables imports and exports against a particular storage system.
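
    For the built-in JDBC-based connectors, Sqoop picks the connector from the scheme of the --connect string. A minimal sketch (the PostgreSQL host and database names are placeholders, not part of the original example):

    % sqoop import --connect jdbc:mysql://localhost/hadoopDB --table myTest        # MySQL connector
    % sqoop import --connect jdbc:postgresql://localhost/mydb --table myTest       # PostgreSQL connector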

    4) An import example:

    % sqoop import --connect jdbc:mysql://localhost/hadoopDB --table myTest -m 1
     
    • The sqoop import tool runs a MapReduce job that connects to the MySQL database and reads the table.
    • By default the job launches four map tasks; here we ask for a single map task with -m 1.
    • By default the imported data is written as comma-delimited text files (a variant overriding these defaults is sketched after this list).
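
    A hedged variant of the command above showing how those defaults can be overridden; the HDFS target directory and the tab delimiter are illustrative choices, not requirements:

    % sqoop import --connect jdbc:mysql://localhost/hadoopDB --table myTest \
        --num-mappers 4 --fields-terminated-by '\t' --target-dir /user/hadoop/myTest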

    5) generated code

    Besides importing the table contents into HDFS, Sqoop also writes a generated Java source file into the current local directory.

    • The generated class can be used for special processing needs.
    • The codegen tool can generate the source code without performing an import:
    % sqoop codegen  --connect jdbc:mysql://localhost/hadoopDB  --table myTest  --class-name myNeed
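
    A sketch of directing the generated source and compiled classes somewhere other than the current directory; the directory names below are assumptions for illustration:

    % sqoop codegen --connect jdbc:mysql://localhost/hadoopDB --table myTest \
        --class-name myNeed --outdir src/generated --bindir build/classes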
    

    6) import process

    A better import uses a splitting column to divide the table query across multiple map tasks.
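
    A minimal sketch of naming the splitting column explicitly, assuming myTest has a numeric primary key column called id (the column name is an assumption):

    % sqoop import --connect jdbc:mysql://localhost/hadoopDB --table myTest \
        --split-by id -m 4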

