
    Practical Project: Checking Host Liveness

    Script Purpose

    Determine whether a host is alive by pinging its IP address.
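
    Both approaches below rely on ping's exit status: on Linux, ping exits 0 only if at least one reply came back. A minimal check against one of the addresses from ip.txt looks like this:

    # -c1 sends a single packet, -W1 waits at most one second for a reply;
    # the exit status is 0 only when a reply was received.
    ping -c1 -W1 172.22.34.78 &>/dev/null && echo "172.22.34.78 is ok." || echo "172.22.34.78 ping is failure!"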

    Script Features

    1. If the host responds to ping, print "xxx is ok."
    
    2. If the host does not respond, give it three attempts in total; print "xxx is ok." as soon as one succeeds, otherwise print "xxx ping is failure!" after the last attempt.
    
    3. The list of IPs is passed in as a text file.
    

    Script Content

    Approach 1:

    #!/usr/bin/bash
    # The IP list file is passed as the first argument.
    ping_success(){
        # Send one packet and wait at most one second for the reply.
        ping -c1 -W1 ${ip} &>/dev/null
        if [ $? -eq 0 ]; then
            echo "${ip} is ok."
            # In bash, "continue" here jumps to the next iteration of the
            # caller's while loop, so the failure message below is skipped.
            continue
        fi
    }
    
    while read ip
    do
        ping_success
        ping_success
        ping_success
        echo "$ip ping is failure!"
    done < "$1"
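
    Relying on "continue" inside the function to jump back to the caller's while loop works in bash but is non-portable. A variant that keeps the loop control in the loop itself is sketched below; the helper name ping_once is mine, not from the original script:

    #!/usr/bin/bash
    # Variant of Approach 1: the function only reports success or failure
    # through its exit status, and the loop decides what to print.
    ping_once(){
        ping -c1 -W1 "$1" &>/dev/null
    }
    
    while read ip
    do
        if ping_once "$ip" || ping_once "$ip" || ping_once "$ip"; then
            echo "${ip} is ok."
        else
            echo "${ip} ping is failure!"
        fi
    done < "$1"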
    

    Approach 2:

    #!/bin/bash
    # The IP list file is passed as the first argument.
    
    while read ip
    do
        # Give each host up to three attempts.
        for count in {1..3}; do
            ping -c1 -w1 ${ip} &> /dev/null
            if [ $? -eq 0 ]; then
                echo "ping ${ip} is ok."
                break
            else
                echo "ping ${ip} is failed: ${count}"
                # After the third failed attempt, report the final result.
                if [ ${count} -eq 3 ]; then
                    echo "ping ${ip} is failed."
                fi
            fi
        done
    done < "$1"
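
    Both approaches probe the hosts one at a time, so with a one-second timeout each unreachable host costs roughly three seconds. A concurrent variant is sketched below as an assumption, not part of the original post; the helper name check_host is made up, and the output order will no longer match the order of the input file:

    #!/bin/bash
    # Concurrent sketch: background one check per host, then wait for all.
    check_host(){
        local ip=$1
        for count in 1 2 3; do
            if ping -c1 -w1 "${ip}" &> /dev/null; then
                echo "${ip} is ok."
                return
            fi
        done
        echo "${ip} ping is failure!"
    }
    
    while read ip
    do
        check_host "${ip}" &
    done < "$1"
    wait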
    

    Contents of ip.txt

    172.22.34.78
    172.22.34.12
    172.22.34.33
    172.22.34.45
    172.22.34.23
    172.22.34.124
    172.22.34.139
    172.22.34.8
    172.22.34.32
    172.22.34.170
    172.22.34.160
    

    Script Execution

    [root@hadoop04 shell_awk]# bash ping_count3_3.sh  ip.txt 
    172.22.34.78 is ok.
    172.22.34.12 ping is failure!
    172.22.34.33 ping is failure!
    172.22.34.45 ping is failure!
    172.22.34.23 is ok.
    172.22.34.124 is ok.
    172.22.34.139 ping is failure!
    172.22.34.8 ping is failure!
    172.22.34.32 ping is failure!
    172.22.34.170 ping is failure!
    172.22.34.160 ping is failure!
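
    Since the messages are fixed strings, the output can be filtered with grep when only one side of the result is needed; the file names below are just examples:

    # Keep only the hosts that answered.
    bash ping_count3_3.sh ip.txt | grep "is ok." > alive.txt
    # Keep only the hosts that never answered.
    bash ping_count3_3.sh ip.txt | grep "failure" > failed.txt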
    