  • scrapy: spoofing the User-Agent and using fake_useragent

    Spoofing the browser identity: some servers do not filter requests very strictly, so when crawling them you do not need an IP proxy at all. Simply disguising the browser information in the request itself is enough.

    Method 1:

    1. Add the following to settings.py; this is a list of User-Agent header strings:

    USER_AGENT_LIST = ['zspider/0.9-dev http://feedback.redkolibri.com/',
                        'Xaldon_WebSpider/2.0.b1',
                        'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) Speedy Spider (http://www.entireweb.com/about/search_tech/speedy_spider/)',
                        'Mozilla/5.0 (compatible; Speedy Spider; http://www.entireweb.com/about/search_tech/speedy_spider/)',
                        'Speedy Spider (Entireweb; Beta/1.3; http://www.entireweb.com/about/search_tech/speedyspider/)',
                        'Speedy Spider (Entireweb; Beta/1.2; http://www.entireweb.com/about/search_tech/speedyspider/)',
                        'Speedy Spider (Entireweb; Beta/1.1; http://www.entireweb.com/about/search_tech/speedyspider/)',
                        'Speedy Spider (Entireweb; Beta/1.0; http://www.entireweb.com/about/search_tech/speedyspider/)',
                        'Speedy Spider (Beta/1.0; www.entireweb.com)',
                        'Speedy Spider (http://www.entireweb.com/about/search_tech/speedy_spider/)',
                        'Speedy Spider (http://www.entireweb.com/about/search_tech/speedyspider/)',
                        'Speedy Spider (http://www.entireweb.com)',
                        'Sosospider+(+http://help.soso.com/webspider.htm)',
                        'sogou spider',
                        'Nusearch Spider (www.nusearch.com)',
                        'nuSearch Spider (compatible; MSIE 4.01; Windows NT)',
                        'lmspider (lmspider@scansoft.com)',
                        'lmspider lmspider@scansoft.com',
                        'ldspider (http://code.google.com/p/ldspider/wiki/Robots)',
                        'iaskspider/2.0(+http://iask.com/help/help_index.html)',
                        'iaskspider',
                        'hl_ftien_spider_v1.1',
                        'hl_ftien_spider',
                        'FyberSpider (+http://www.fybersearch.com/fyberspider.php)',
                        'FyberSpider',
                        'everyfeed-spider/2.0 (http://www.everyfeed.com)',
                        'envolk[ITS]spider/1.6 (+http://www.envolk.com/envolkspider.html)',
                        'envolk[ITS]spider/1.6 ( http://www.envolk.com/envolkspider.html)',
                        'Baiduspider+(+http://www.baidu.com/search/spider_jp.html)',
                        'Baiduspider+(+http://www.baidu.com/search/spider.htm)',
                        'BaiDuSpider',
                        'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0) AddSugarSpiderBot www.idealobserver.com',
                       ]
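
    The middleware in the next step does nothing more than pick one of these strings at random with `random.choice`. A minimal sketch of that selection, using a shortened stand-in for the `USER_AGENT_LIST` above:

    ```python
    import random

    # Shortened stand-in for the USER_AGENT_LIST defined in settings.py.
    USER_AGENT_LIST = [
        'sogou spider',
        'BaiDuSpider',
        'Speedy Spider (http://www.entireweb.com)',
    ]

    # Each call may return a different entry, so successive requests
    # can present different User-Agent strings to the server.
    ua = random.choice(USER_AGENT_LIST)
    print(ua)
    ```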
    
    2. Create a MidWare package at the same level as the spiders directory and add a HeaderMidWare.py file with the following content:
    # encoding: utf-8
    from scrapy.utils.project import get_project_settings
    import random

    settings = get_project_settings()

    class ProcessHeaderMidware():
        """Downloader middleware that adds a random User-Agent to each request."""

        def process_request(self, request, spider):
            # Pick a random header from the list and use it as the User-Agent.
            ua = random.choice(settings.get('USER_AGENT_LIST'))
            spider.logger.info(msg='now entering download middleware')
            if ua:
                request.headers['User-Agent'] = ua
                # Add any desired logging message here.
                spider.logger.info(u'User-Agent is: {} {}'.format(request.headers.get('User-Agent'), request))

     3. Add the following to settings.py:

    DOWNLOADER_MIDDLEWARES = {
        'projectName.MidWare.HeaderMidWare.ProcessHeaderMidware': 543,
    }
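
    You can sanity-check the middleware logic without launching a crawl by driving `process_request` with stand-in objects. A hedged sketch: `DummyRequest` is a made-up stub, not a Scrapy class, and the settings lookup is inlined as a plain list for the test:

    ```python
    import random

    # Stand-in for settings.get('USER_AGENT_LIST').
    USER_AGENT_LIST = ['sogou spider', 'BaiDuSpider']

    class DummyRequest:
        """Minimal stub exposing the .headers dict the middleware writes to."""
        def __init__(self):
            self.headers = {}

    class ProcessHeaderMidware:
        # Same core logic as the middleware above, minus logging.
        def process_request(self, request, spider):
            ua = random.choice(USER_AGENT_LIST)
            if ua:
                request.headers['User-Agent'] = ua

    req = DummyRequest()
    ProcessHeaderMidware().process_request(req, spider=None)
    print(req.headers['User-Agent'])
    ```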
    Method 2: using fake_useragent
    fake_useragent is an open-source project on GitHub.
    1. Install fake_useragent:
    pip install fake-useragent
    2. Create a MidWare package at the same level as the spiders directory and add a user_agent_middlewares.py file with the following content:
    # -*- coding: utf-8 -*-
    from fake_useragent import UserAgent

    class RandomUserAgentMiddlware(object):
        # Rotate the User-Agent randomly on every request.
        def __init__(self, crawler):
            super(RandomUserAgentMiddlware, self).__init__()
            self.ua = UserAgent()
            # Read RANDOM_UA_TYPE from the settings file, defaulting to 'random'.
            self.ua_type = crawler.settings.get('RANDOM_UA_TYPE', 'random')

        @classmethod
        def from_crawler(cls, crawler):
            return cls(crawler)

        def process_request(self, request, spider):  # called by the framework for every request
            def get_ua():
                return getattr(self.ua, self.ua_type)
            request.headers.setdefault('User-Agent', get_ua())
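
    The `get_ua` helper relies on attribute dispatch: `UserAgent` exposes each browser type as an attribute, so the `RANDOM_UA_TYPE` string selects which one `getattr` reads. A minimal sketch of that dispatch with a stand-in class (`StubUA` is hypothetical, not part of fake_useragent):

    ```python
    class StubUA:
        # Mimics fake_useragent.UserAgent: each browser type is an attribute.
        random = 'Mozilla/5.0 (random)'
        chrome = 'Mozilla/5.0 (Chrome)'

    def get_ua(ua, ua_type):
        # Same trick as getattr(self.ua, self.ua_type) in the middleware.
        return getattr(ua, ua_type)

    print(get_ua(StubUA(), 'chrome'))  # → Mozilla/5.0 (Chrome)
    ```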

    3. Add the following to settings.py:

    RANDOM_UA_TYPE = 'random'  # options include 'random', 'chrome', ...

    DOWNLOADER_MIDDLEWARES = {
        'projectName.MidWare.user_agent_middlewares.RandomUserAgentMiddlware': 543,
        'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
    }

    Setting the built-in UserAgentMiddleware to None disables it, so it cannot overwrite the random value.

    With that, the fake_useragent disguise is configured. Compared with the first method, you do not have to maintain a long list of browser headers yourself; the strings are fetched from https://fake-useragent.herokuapp.com/browsers/0.1.7.

    The first time fake_useragent runs you may see some errors; I believe this happens because the project has to fetch and cache the User-Agent data over the network.
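
    One way to soften those first-run failures is to fall back to a fixed string whenever the User-Agent data cannot be loaded. A hedged sketch under assumptions: the fallback string is an arbitrary example, and the guard works whether or not fake-useragent is even installed:

    ```python
    FALLBACK_UA = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)'  # arbitrary example

    def safe_user_agent():
        """Return a random UA, or a fixed fallback if fake_useragent fails."""
        try:
            from fake_useragent import UserAgent
            return UserAgent().random
        except Exception:
            # Covers both a missing package and first-run download/cache errors.
            return FALLBACK_UA

    print(safe_user_agent())
    ```

    The same pattern can sit inside the middleware's `get_ua` helper, so a cache failure degrades to a constant header instead of crashing the crawl.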

    GitHub address: https://github.com/sea1234/fake-useragent

  • Original article: https://www.cnblogs.com/yunlongaimeng/p/11593202.html