  • [Hadoop] Accessing HDFS from the Java API Client in an HA Setup

    The client needs to specify the nameservice (ns) name, the NameNode RPC addresses, the ConfiguredFailoverProxyProvider, and other related settings.

    Code example:

    package cn.itcast.hadoop.hdfs;
    
    import java.io.FileInputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.URI;
    
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;
    
    public class HDFS_HA {
    
        
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Use the logical nameservice (ns1) as the default filesystem instead of a single NameNode host.
            conf.set("fs.defaultFS", "hdfs://ns1");
            conf.set("dfs.nameservices", "ns1");
            // The two NameNodes that back the ns1 nameservice and their RPC addresses.
            conf.set("dfs.ha.namenodes.ns1", "nn1,nn2");
            conf.set("dfs.namenode.rpc-address.ns1.nn1", "itcast01:9000");
            conf.set("dfs.namenode.rpc-address.ns1.nn2", "itcast02:9000");
            // Proxy provider that lets the client fail over between nn1 and nn2.
            conf.set("dfs.client.failover.proxy.provider.ns1", "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");
            FileSystem fs = FileSystem.get(new URI("hdfs://ns1"), conf, "hadoop");
            // Copy a local file into HDFS as /eclipse; the last argument closes both streams.
            InputStream in = new FileInputStream("c://eclipse.rar");
            OutputStream out = fs.create(new Path("/eclipse"));
            IOUtils.copyBytes(in, out, 4096, true);
        }
    }
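
    To confirm that the client reaches the currently active NameNode through the same failover proxy, the file written above can be read back through the hdfs://ns1 nameservice. The following is a minimal sketch under the same assumptions as the write example (the ns1 configuration, the /eclipse path, and the "hadoop" user); the class name HDFS_HA_Read is only illustrative.

    package cn.itcast.hadoop.hdfs;
    
    import java.net.URI;
    
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;
    
    public class HDFS_HA_Read {
    
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Same HA client settings as the write example; in practice these usually
            // live in hdfs-site.xml / core-site.xml on the client classpath.
            conf.set("fs.defaultFS", "hdfs://ns1");
            conf.set("dfs.nameservices", "ns1");
            conf.set("dfs.ha.namenodes.ns1", "nn1,nn2");
            conf.set("dfs.namenode.rpc-address.ns1.nn1", "itcast01:9000");
            conf.set("dfs.namenode.rpc-address.ns1.nn2", "itcast02:9000");
            conf.set("dfs.client.failover.proxy.provider.ns1", "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider");
            FileSystem fs = FileSystem.get(new URI("hdfs://ns1"), conf, "hadoop");
            // Open the file uploaded above and stream it to stdout; the client talks to
            // whichever NameNode is currently active.
            FSDataInputStream in = fs.open(new Path("/eclipse"));
            IOUtils.copyBytes(in, System.out, 4096, false);
            in.close();
            fs.close();
        }
    }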
  • Original article: https://www.cnblogs.com/junneyang/p/5869413.html