SpiderClub / weibospider

:zap: A distributed crawler for Weibo, built with celery and requests.

Running login_first.py raises ValueError: not enough values to unpack (expected 3, got 0) #205

Closed: keithkang1986 closed this 4 years ago

keithkang1986 commented 4 years ago

Before submitting an issue, please answer the questions below. Thanks!

1. What did you do? (Please describe your steps clearly enough that the problem can be reproduced.)
   After running step 9, `celery -A tasks.workers -Q login_queue,user_crawler,fans_followers,search_crawler,home_crawler worker -l info -c 1`, I then started login_first.py.

2. What result did you expect? login_first.py to start running.

3. What did you actually get?

```
[2020-04-02 18:42:49,458: INFO/MainProcess] Connected to redis://127.0.0.1:6379/5
[2020-04-02 18:42:49,469: INFO/MainProcess] mingle: searching for neighbors
[2020-04-02 18:42:49,878: INFO/SpawnPoolWorker-1] child process 13780 calling self.run()
C:\data\爬虫\weibospider-1.7.2\config\conf.py:12: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  cf = load(cont)
[2020-04-02 18:42:50,490: INFO/MainProcess] mingle: all alone
[2020-04-02 18:43:13,772: INFO/MainProcess] Received task: tasks.login.login_task[2a5fb61f-1546-4bdc-8ebc-007bc784c678]
[2020-04-02 18:43:13,775: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
Traceback (most recent call last):
  File "c:\programdata\anaconda3\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "c:\programdata\anaconda3\lib\site-packages\celery\app\trace.py", line 537, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
```

4. Which version of WeiboSpider are you using? What is your operating system? Have you read the project's [FAQ](https://github.com/SpiderClub/weibospider/wiki/%E9%A1%B9%E7%9B%AE%E4%BD%BF%E7%94%A8%E5%B8%B8%E8%A7%81%E9%97%AE%E9%A2%98)?
   Windows 10 (the path in the log shows weibospider-1.7.2).
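Separately from the crash, the log also shows a YAMLLoadWarning from config/conf.py line 12. A minimal sketch of how that call could be updated, assuming (per the warning output) that `cont` holds the raw YAML config text already read from disk:

```python
from yaml import load, SafeLoader

# Passing an explicit Loader silences the YAMLLoadWarning and avoids the
# unsafe default loader; SafeLoader is sufficient for plain config data.
# `cont` (the YAML text) is taken from the warning output above.
cf = load(cont, Loader=SafeLoader)
```

This warning is unrelated to the ValueError, but it is a one-line cleanup while in that file.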

thekingofcity commented 4 years ago

Windows is not supported by this project. Try downgrading celery? As I recall, 4.x dropped Windows support. This error is not caused by the project itself.
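For reference, besides downgrading (e.g. `pip install "celery<4"` to pin the last series with official Windows support), a commonly cited workaround for this exact unpack error on Celery 4.x under Windows is to set the `FORKED_BY_MULTIPROCESSING` environment variable before the worker starts. A hedged sketch (not an official fix from this project), placed at the top of the `tasks.workers` module that the step-9 command points at:

```python
import os

# Workaround sketch for Celery 4.x on Windows: billiard checks this env var
# and re-enables its Windows process handling, which avoids the
# "not enough values to unpack (expected 3, got 0)" error raised in
# celery.app.trace._fast_trace_task. Must run before the worker is created.
os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')
```

Alternatively, starting the worker with `--pool=solo` (or `-P solo`) sidesteps the prefork pool entirely, at the cost of processing tasks one at a time; since step 9 already uses `-c 1`, that limitation may not matter here.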