  • Launching spiders with custom Scrapy commands

    I. Running a single spider

    Typing the scrapy command in the terminal every time you want to run a spider gets tedious.

    Create a manager.py (any name works) in the project root directory, next to scrapy.cfg.

    from scrapy.cmdline import execute
    
    if __name__ == '__main__':
        execute(["scrapy", "crawl", "quote", "--nolog"])

    II. Running all spiders

    1. At the same level as the spiders directory, create a commands directory (any name), and add an empty __init__.py so it can be imported as a package.

    2. Inside it, create a crawlall.py file that defines what the command does (the resulting layout is sketched after the code below).

    from scrapy.commands import ScrapyCommand
    
    
    class Command(ScrapyCommand):
        # the command is only valid inside a Scrapy project
        requires_project = True
    
        def syntax(self):
            return '[options]'
    
        def short_desc(self):
            return 'Runs all of the spiders'
    
        def run(self, args, opts):
            # spider_loader lists every spider registered in the project
            # (older Scrapy versions exposed this as self.crawler_process.spiders)
            spider_list = self.crawler_process.spider_loader.list()
            for name in spider_list:
                self.crawler_process.crawl(name, **opts.__dict__)
            # start the reactor and block until all spiders have finished
            self.crawler_process.start()
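
    With steps 1 and 2 done, the relevant part of the project tree looks roughly like this (a sketch: "toscrapy" is the project name used in the settings below, and quote.py is assumed to be the spider from section I):

    toscrapy/                  # project root, contains scrapy.cfg
        scrapy.cfg
        manager.py
        toscrapy/
            settings.py
            commands/
                __init__.py    # empty; makes the directory importable
                crawlall.py
            spiders/
                quote.py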

    3. In the project settings (settings.py), register the commands module:

    # COMMANDS_MODULE = '<project name>.<directory name>'
    COMMANDS_MODULE = 'toscrapy.commands'

    4. manager.py now invokes the new command:

    from scrapy.cmdline import execute
    
    if __name__ == '__main__':
        execute(["scrapy", "crawlall", "--nolog"])
  • Original post: https://www.cnblogs.com/wt7018/p/11756236.html