Reference: a blog post covering the same problem: https://www.cnblogs.com/zlslch/p/8318942.html
Problem: a Scrapy 1.8 project runs fine locally on Windows, but on Linux it fails with the following error:
2019-12-10 17:12:14 [twisted] CRITICAL:
Traceback (most recent call last):
  File "/apps/software/anaconda2/lib/python2.7/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/crawler.py", line 104, in crawl
    six.reraise(*exc_info)
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/crawler.py", line 86, in crawl
    self.engine = self._create_engine()
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/crawler.py", line 111, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/core/engine.py", line 69, in __init__
    self.downloader = downloader_cls(crawler)
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/core/downloader/__init__.py", line 86, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/utils/misc.py", line 46, in load_object
    mod = import_module(module)
  File "/apps/software/anaconda2/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/downloadermiddlewares/retry.py", line 22, in <module>
    from scrapy.core.downloader.handlers.http11 import TunnelError
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/core/downloader/handlers/http11.py", line 22, in <module>
    from scrapy.core.downloader.tls import openssl_methods
  File "/apps/software/anaconda2/lib/python2.7/site-packages/scrapy/core/downloader/tls.py", line 4, in <module>
    from service_identity.exceptions import CertificateError
  File "/apps/software/anaconda2/lib/python2.7/site-packages/service_identity/__init__.py", line 7, in <module>
    from . import cryptography, pyopenssl
  File "/apps/software/anaconda2/lib/python2.7/site-packages/service_identity/pyopenssl.py", line 14, in <module>
    from pyasn1_modules.rfc2459 import GeneralNames
  File "/apps/software/anaconda2/lib/python2.7/site-packages/pyasn1_modules/rfc2459.py", line 23, in <module>
    from pyasn1.type import opentype
ImportError: cannot import name opentype
The traceback alone does not make the root cause obvious. By watching the startup process, the blog author traced the failure back to service_identity; the lesson is to make full use of error messages and log output. Note where the final ImportError actually comes from: pyasn1.type.opentype only exists in newer releases of pyasn1, so an outdated pyasn1 breaks the import chain that Scrapy needs. The commonly suggested fix is to force a fresh install, which also pulls in up-to-date dependencies: pip install -I -U service_identity (-I is --ignore-installed, -U is --upgrade).
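To confirm this diagnosis (the missing opentype name lives in pyasn1, so the hypothesis is that pyasn1 is simply too old), the failing import from the bottom of the traceback can be reproduced in isolation, without starting Scrapy at all:

```python
# Reproduce just the failing import from the bottom of the traceback.
# "cannot import name opentype" means the installed pyasn1 predates the
# release that added pyasn1.type.opentype, even though pyasn1 itself imports.
try:
    from pyasn1.type import opentype  # noqa: F401 -- only testing importability
    pyasn1_ok = True
except ImportError:
    pyasn1_ok = False

print("pyasn1 provides opentype: %s" % pyasn1_ok)
```

If this prints False on the Linux machine but True on the Windows machine, the version mismatch is confirmed.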
If the Linux machine has no internet access, the package can be installed offline.
Following the hint, download service_identity from https://pypi.python.org/pypi/service_identity#downloads (the .whl file is the easiest to work with), copy it to the server, and install it:
pip install service_identity-18.1.0-py2.py3-none-any.whl
A .tar.gz source distribution works too: pip can install an sdist directly (pip install service_identity-18.1.0.tar.gz), no unpacking needed.
I installed Scrapy the same way on both machines (yum install scrapy on each), so why does only one of them fail? Most likely the install only checks that the service_identity module exists, without verifying or upgrading the versions of its dependencies, so a stale pyasn1 can survive the install.
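To compare the two machines directly, it helps to print the actual installed versions of every distribution in the failing import chain. A sketch using pkg_resources (bundled with setuptools; the package list here is an assumption based on the traceback):

```python
import pkg_resources  # ships with setuptools

# The distributions that appear in the failing import chain.
names = ("Scrapy", "service-identity", "pyasn1", "pyasn1-modules")

versions = {}
for name in names:
    try:
        versions[name] = pkg_resources.get_distribution(name).version
    except pkg_resources.DistributionNotFound:
        versions[name] = None  # not installed on this machine

for name in names:
    print("%s: %s" % (name, versions[name] or "not installed"))
```

Running this on both machines and diffing the output pinpoints which dependency is stale.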
Reinstalling the package fixes it:
pip install service_identity
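After reinstalling, the whole import chain from the traceback can be sanity-checked without launching the crawler (module names are taken from the traceback; importlib is standard library):

```python
import importlib

# Walk the import chain from the traceback, bottom to top; every module
# should now import cleanly.
checks = {}
for mod in ("pyasn1.type", "pyasn1_modules.rfc2459",
            "service_identity", "scrapy"):
    try:
        importlib.import_module(mod)
        checks[mod] = True
    except ImportError:
        checks[mod] = False
    print("%s: %s" % (mod, "OK" if checks[mod] else "FAILED"))
```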