
    MongoDB Connector for Hadoop

    https://github.com/mongodb/mongo-hadoop

     

    Purpose

    The MongoDB Connector for Hadoop is a library that allows MongoDB (or backup files in its data format, BSON) to be used as an input source or output destination for Hadoop MapReduce tasks. It is designed for flexibility and performance, and it makes it easy to integrate data in MongoDB with other parts of the Hadoop ecosystem.

    Current stable release: 1.2.0
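
    To make the pattern above concrete, here is a minimal, hedged sketch of a MapReduce job wired to MongoDB on both ends. It is not taken from the project's examples: the localhost URIs, the demo.in/demo.out namespaces, and the status field are placeholder assumptions; only MongoInputFormat, MongoOutputFormat, BSONWritable, and the MongoConfigUtil helper come from the connector itself.

        import java.io.IOException;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.io.IntWritable;
        import org.apache.hadoop.io.Text;
        import org.apache.hadoop.mapreduce.Job;
        import org.apache.hadoop.mapreduce.Mapper;
        import org.apache.hadoop.mapreduce.Reducer;
        import org.bson.BSONObject;
        import org.bson.BasicBSONObject;

        import com.mongodb.hadoop.MongoInputFormat;
        import com.mongodb.hadoop.MongoOutputFormat;
        import com.mongodb.hadoop.io.BSONWritable;
        import com.mongodb.hadoop.util.MongoConfigUtil;

        public class MongoJobSketch {

            // MongoInputFormat hands each mapper one document at a time: the key
            // is the document's _id, the value is the document as a BSONObject.
            public static class StatusMapper
                    extends Mapper<Object, BSONObject, Text, IntWritable> {
                private static final IntWritable ONE = new IntWritable(1);

                @Override
                protected void map(Object id, BSONObject doc, Context context)
                        throws IOException, InterruptedException {
                    // "status" is a hypothetical document field, for illustration only.
                    context.write(new Text(String.valueOf(doc.get("status"))), ONE);
                }
            }

            public static class CountReducer
                    extends Reducer<Text, IntWritable, Text, BSONWritable> {
                @Override
                protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                        throws IOException, InterruptedException {
                    int sum = 0;
                    for (IntWritable value : values) {
                        sum += value.get();
                    }
                    // Each reduce output becomes one document in the output collection.
                    context.write(key, new BSONWritable(new BasicBSONObject("count", sum)));
                }
            }

            public static void main(String[] args) throws Exception {
                Configuration conf = new Configuration();
                MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/demo.in");
                MongoConfigUtil.setOutputURI(conf, "mongodb://localhost:27017/demo.out");

                Job job = new Job(conf, "mongo-hadoop sketch");
                job.setJarByClass(MongoJobSketch.class);
                job.setInputFormatClass(MongoInputFormat.class);
                job.setOutputFormatClass(MongoOutputFormat.class);
                job.setMapperClass(StatusMapper.class);
                job.setReducerClass(CountReducer.class);
                job.setMapOutputKeyClass(Text.class);
                job.setMapOutputValueClass(IntWritable.class);
                job.setOutputKeyClass(Text.class);
                job.setOutputValueClass(BSONWritable.class);
                System.exit(job.waitForCompletion(true) ? 0 : 1);
            }
        }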

    Features

    • Can create data splits to read from standalone, replica set, or sharded configurations
    • Source data can be filtered with queries using the MongoDB query language (see the sketch after this list)
    • Supports Hadoop Streaming, allowing job code to be written in any language (Python, Ruby, and Node.js are currently supported)
    • Can read data from MongoDB backup files residing on S3, HDFS, or local filesystems
    • Can write data out in .bson format, which can then be imported to any MongoDB database with mongorestore
    • Works with BSON/MongoDB documents in other Hadoop tools such as Pig and Hive
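
    As a small, hedged sketch of the query-filtering feature above (the URI and the status field are placeholders), a filter can be attached to the job configuration through the connector's MongoConfigUtil helper. The filter is ordinary MongoDB query JSON and is evaluated server-side, so non-matching documents are never read into splits:

        import org.apache.hadoop.conf.Configuration;

        import com.mongodb.hadoop.util.MongoConfigUtil;

        public class FilteredInputSketch {
            public static Configuration filteredConf() {
                Configuration conf = new Configuration();
                MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/demo.in");
                // Ordinary MongoDB query JSON; evaluated by mongod, so only
                // documents with status == "A" are ever fed to the mappers.
                MongoConfigUtil.setQuery(conf, "{ \"status\" : \"A\" }");
                return conf;
            }
        }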

    Download

    See the release page.

    Building

    To build, first edit the value for hadoopRelease in ThisBuild in the build.sbt file to select the distribution of Hadoop that you want to build against. For example, to build for CDH4:

    hadoopRelease in ThisBuild := "cdh4"
    

    or for Hadoop 1.0.x:

    hadoopRelease in ThisBuild := "1.0"
    

    To determine which value you need to set in this file, refer to the list of distributions below. Then run ./sbt package to build the jars, which will be generated in the core/target/ directory.

    After successfully building, you must copy the jars to the lib directory on each node in your Hadoop cluster. This is usually one of the following locations, depending on which Hadoop release you are using:

    • $HADOOP_HOME/lib/
    • $HADOOP_HOME/share/hadoop/mapreduce/
    • $HADOOP_HOME/share/hadoop/lib/

    Supported Distributions of Hadoop

    • Apache Hadoop 1.0

      Does not support Hadoop Streaming.

      Build using "1.0" or "1.0.x"

    • Apache Hadoop 1.1

      Includes support for Hadoop Streaming.

      Build using "1.1" or "1.1.x"

    • Apache Hadoop 0.20.*

      Does not support Hadoop Streaming.

      Includes Pig 0.9.2.

      Build using "0.20" or "0.20.x"

    • Apache Hadoop 0.23

      Includes Pig 0.9.2.

      Includes support for Streaming.

      Build using "0.23" or "0.23.x"

    • Cloudera Distribution for Hadoop Release 4

      This is the newest release from Cloudera, based on Apache Hadoop 2.0. The newer MR2/YARN APIs are not yet supported, but MR1 is still fully compatible.

      Includes support for Streaming and Pig 0.11.1.

      Build using "cdh4"

    • Apache Hadoop 2.2

      Includes Pig 0.9.2.

      Includes support for Streaming.

      Build using "2.2" or "2.2.x"

    Configuration

    See the configuration documentation for the full list of options.
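
    In brief, the connector is driven by properties on the job's Configuration object. The key names in the sketch below follow the project's documented options, but treat them as assumptions and verify them against the configuration reference for your release:

        import org.apache.hadoop.conf.Configuration;

        public class ConnectorConfigSketch {
            public static Configuration sketch() {
                Configuration conf = new Configuration();
                // Input and output namespaces (hosts and collections are placeholders).
                conf.set("mongo.input.uri", "mongodb://localhost:27017/demo.in");
                conf.set("mongo.output.uri", "mongodb://localhost:27017/demo.out");
                // Optional server-side filter and projection, in MongoDB query syntax.
                conf.set("mongo.input.query", "{ \"year\" : { \"$gt\" : 2000 } }");
                conf.set("mongo.input.fields", "{ \"year\" : 1, \"bid\" : 1 }");
                // Assumed key for the target split size, in megabytes.
                conf.set("mongo.input.split_size", "16");
                return conf;
            }
        }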

    Streaming

    See the streaming documentation.

    Examples

    See the examples documentation.

    Usage with static .bson (mongo backup) files

    See the BSON usage documentation.
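
    As a hedged sketch of that workflow (the HDFS paths are placeholders; BSONFileInputFormat and BSONFileOutputFormat are the connector's formats for static dump files), a driver can swap the live-MongoDB formats for the file-based ones:

        import java.io.IOException;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.Path;
        import org.apache.hadoop.mapreduce.Job;
        import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
        import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

        import com.mongodb.hadoop.BSONFileInputFormat;
        import com.mongodb.hadoop.BSONFileOutputFormat;

        public class BsonFileJobSketch {
            public static Job configure(Configuration conf) throws IOException {
                Job job = new Job(conf, "bson file sketch");
                job.setJarByClass(BsonFileJobSketch.class);
                // Read documents straight out of a mongodump .bson file; the
                // path could just as well point at S3 or the local filesystem.
                job.setInputFormatClass(BSONFileInputFormat.class);
                FileInputFormat.addInputPath(job, new Path("hdfs:///backups/demo.bson"));
                // Write results back out as .bson, importable with mongorestore.
                job.setOutputFormatClass(BSONFileOutputFormat.class);
                FileOutputFormat.setOutputPath(job, new Path("hdfs:///out/demo"));
                return job;
            }
        }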

    Usage with Amazon Elastic MapReduce

    Amazon Elastic MapReduce is a managed Hadoop framework that allows you to submit jobs to a cluster of customizable size and configuration, without needing to deal with provisioning nodes and installing software.

    Using EMR with the MongoDB Connector for Hadoop allows you to run MapReduce jobs against MongoDB backup files stored in S3.

    Submitting jobs that use the MongoDB Connector for Hadoop to EMR simply requires that the bootstrap actions fetch the dependencies (the MongoDB Java driver, the mongo-hadoop-core library, and so on) and place them into the Hadoop distribution's lib folders.

    For a full example (running the Enron example on Elastic MapReduce), please see the project documentation.

    Usage with Pig

    See the documentation on using Pig with the MongoDB Connector for Hadoop.

    For examples of using Pig with the MongoDB Connector for Hadoop, also refer to the examples section.

    Notes for Contributors

    If your code introduces new features, please add tests that cover them where possible, and make sure the existing test suite still passes. If you're not sure how to write a test for a feature, or you run into a test failure, please post to the Google Group with details and we will try to help.

    Maintainers

    Mike O'Brien (mikeo@10gen.com)

    Contributors

    Support

    Issue tracking: https://jira.mongodb.org/browse/HADOOP/

    Discussion: http://groups.google.com/group/mongodb-user/
