  • A script to sync the Hive install across the cluster

    A programmer's job is to turn everything done by hand into something the computer does, so that you can be a little lazy yourself.

    Below is a very low-tech program for syncing the hive directory; if your cluster has more than 100 or 1,000 nodes, you can simply add a loop (a loop-based sketch follows the script below).

    #!/bin/sh
    
    #================ Hive install sync =================#
    # This script syncs the hive directory from the name #
    # node to the data nodes. Whenever the hive install  #
    # changes, the data nodes must be updated as well,   #
    # otherwise Oozie shell actions that invoke hive may #
    # fail because the hive install on the assigned node #
    # is out of sync.                                     #
    #=====================================================#
    
    # 1. Remove the old hive directory on each data node
    ssh -t hadoop@dwprod-dataslave1 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave2 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave3 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave4 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave5 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave6 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave7 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave8 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave9 rm -r /opt/local/hive
    ssh -t hadoop@dwprod-dataslave10 rm -r /opt/local/hive
    
    # 2. Copy the new hive directory to each data node
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave1:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave2:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave3:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave4:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave5:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave6:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave7:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave8:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave9:/opt/local/
    scp -r -q /opt/local/hive hadoop@dwprod-dataslave10:/opt/local/
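
    For larger clusters, the same two steps can be driven by a loop instead of listing every host by hand. The following is a minimal sketch, assuming the data nodes keep the dwprod-dataslaveN naming pattern used above; NODE_COUNT is a placeholder you would set to the actual number of data nodes.

    #!/bin/sh
    # Loop-based variant of the sync above (a sketch; NODE_COUNT and the
    # dwprod-dataslaveN naming pattern are assumptions taken from the
    # hosts listed in the original script).

    NODE_COUNT=10                  # number of dwprod-dataslaveN data nodes
    HIVE_DIR=/opt/local/hive       # hive install directory on the name node
    REMOTE_USER=hadoop

    for i in $(seq 1 "$NODE_COUNT"); do
        host="dwprod-dataslave${i}"
        # 1. Remove the old hive directory on the data node
        ssh -t "${REMOTE_USER}@${host}" rm -r "$HIVE_DIR"
        # 2. Copy the new hive directory from the name node
        scp -r -q "$HIVE_DIR" "${REMOTE_USER}@${host}:/opt/local/"
    done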

  • Original post: https://www.cnblogs.com/30go/p/8776823.html