dataabc / weibo-search

Retrieves Weibo search results; the search can be either a Weibo keyword search or a Weibo topic search.
1.7k stars 372 forks

After configuring saving to a MySQL database, running `scrapy crawl search` raises an error #185

Closed banlangen1111 closed 2 years ago

banlangen1111 commented 2 years ago

```
PS D:\Projects\PythonProject\weibo-search> scrapy crawl search
2022-04-10 13:58:37 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method CoreStats.spider_closed of <scrapy.extensions.corestats.CoreStats object at 0x000001E8BD8FBB20>>
Traceback (most recent call last):
  File "d:\programes\python3.8\lib\site-packages\scrapy\crawler.py", line 104, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
AttributeError: 'MysqlPipeline' object has no attribute 'create_database'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "d:\programes\python3.8\lib\site-packages\scrapy\utils\defer.py", line 169, in maybeDeferred_coro
    result = f(*args, **kw)
builtins.AttributeError: 'MysqlPipeline' object has no attribute 'create_database'

2022-04-10 13:58:37 [twisted] CRITICAL: Traceback (most recent call last):
  File "d:\programes\python3.8\lib\site-packages\twisted\internet\defer.py", line 1660, in _inlineCallbacks
    result = current_context.run(gen.send, result)
  File "d:\programes\python3.8\lib\site-packages\scrapy\crawler.py", line 104, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
AttributeError: 'MysqlPipeline' object has no attribute 'create_database'
PS D:\Projects\PythonProject\weibo-search>
```
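The traceback indicates that when Scrapy opens the spider, the pipeline tries to call `self.create_database()` but no such attribute exists on the `MysqlPipeline` instance — which is what happens when the method is accidentally deleted or renamed while editing the pipeline file. A minimal sketch (the class bodies here are hypothetical stand-ins, not the project's actual pipeline code) reproduces the failure mode:

```python
class MysqlPipeline:
    """Sketch of a pipeline with the method intact."""

    def open_spider(self, spider):
        # Scrapy calls open_spider when the crawl starts; the pipeline
        # sets up its database here.
        self.create_database()

    def create_database(self):
        # A real pipeline would connect (e.g. via pymysql) and run
        # CREATE DATABASE IF NOT EXISTS ...; stubbed out for the sketch.
        pass


class EditedPipeline:
    """The same pipeline after create_database was accidentally removed."""

    def open_spider(self, spider):
        self.create_database()  # attribute lookup fails at startup


MysqlPipeline().open_spider(spider=None)  # runs fine

try:
    EditedPipeline().open_spider(spider=None)
except AttributeError as exc:
    print(exc)  # 'EditedPipeline' object has no attribute 'create_database'
```

Because the lookup only happens when the spider opens, the crawl fails immediately at startup rather than at import time, exactly as in the log above.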

dataabc commented 2 years ago

Did you modify the pipelines file?

banlangen1111 commented 2 years ago

> Did you modify the pipelines file?

Yes — after reverting the change it runs fine. Thank you very much!