I am using Scrapy with fake-useragent and keep getting this error on my Linux server:
Traceback (most recent call last):
  File "/usr/local/lib64/python2.7/site-packages/twisted/internet/defer.py", line 1299, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/site-packages/scrapy/core/downloader/middleware.py", line 37, in process_request
    response = yield method(request=request, spider=spider)
  File "/usr/local/lib/python2.7/site-packages/scrapy_fake_useragent/middleware.py", line 27, in process_request
    request.headers.setdefault('User-Agent', self.ua.random)
  File "/usr/local/lib/python2.7/site-packages/fake_useragent/fake.py", line 98, in __getattr__
    raise FakeUserAgentError('Error occurred during getting browser')  # noqa
FakeUserAgentError: Error occurred during getting browser
I keep getting this error on the Linux server when I run several spiders concurrently. It rarely happens on my own laptop. What should I do to avoid it? Do I need more RAM, or something else? The server's specs are 512MB RAM and 1 vCPU.
I'm not sure it's about RAM, or why the error only happens on the Linux server with the lowest specs; fake-useragent builds its user agent list by fetching browser data over the network, and that step seems to be what fails. I worked around it by using fake-useragent's fallback feature. Sadly, scrapy-fake-useragent doesn't expose any setting for it, so I had to override the middleware in middlewares.py like this:
from fake_useragent import UserAgent
from scrapy_fake_useragent.middleware import RandomUserAgentMiddleware


class FakeUserAgentMiddleware(RandomUserAgentMiddleware):

    def __init__(self, crawler):
        super(FakeUserAgentMiddleware, self).__init__(crawler)
        # If fake-useragent fails to get a random user agent, use the most common one
        self.ua = UserAgent(fallback='Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36')
        self.per_proxy = crawler.settings.get('RANDOM_UA_PER_PROXY', False)
        self.ua_type = crawler.settings.get('RANDOM_UA_TYPE', 'random')
        self.proxy2ua = {}
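As a quick sanity check outside Scrapy, the fallback behavior can be exercised on its own. A minimal sketch, assuming the fallback string is abbreviated here for readability:

from fake_useragent import UserAgent

# If fetching or parsing the online browser data fails, ua.random returns
# the fallback string instead of raising FakeUserAgentError.
ua = UserAgent(fallback='Mozilla/5.0 (Windows NT 10.0; Win64; x64)')
print(ua.random)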
Then I activated the middleware in settings.py like this:
DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
    # 'scrapy_fake_useragent.middleware.RandomUserAgentMiddleware': 400,  # disable the original middleware
    'myproject.middlewares.FakeUserAgentMiddleware': 400,
    # omitted
}
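To confirm the custom middleware is actually applied, a throwaway spider can log the header that was sent. A minimal sketch; the spider name and target URL are placeholders, not part of the original setup:

import scrapy

class UACheckSpider(scrapy.Spider):
    # Hypothetical spider used only to verify the middleware; any reachable
    # URL works as a target.
    name = 'ua_check'
    start_urls = ['http://quotes.toscrape.com/']

    def parse(self, response):
        # The downloader middleware sets User-Agent before the request goes
        # out, so the header recorded on the request shows which UA was used.
        self.logger.info('User-Agent sent: %s',
                         response.request.headers.get('User-Agent'))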
UPDATE
I tried updating fake-useragent to version 0.1.5. I was using 0.1.4, and after upgrading, the problem was gone at its root rather than worked around by the fallback.
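To check which version is installed before and after the upgrade, a minimal sketch using pkg_resources (which ships with setuptools):

import pkg_resources

# Prints the installed fake-useragent version, e.g. '0.1.5'.
print(pkg_resources.get_distribution('fake-useragent').version)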