my8100 / scrapydweb

Web app for Scrapyd cluster management, Scrapy log analysis & visualization, Auto packaging, Timer tasks, Monitor & Alert, and Mobile UI. DEMO :point_right: https://github.com/my8100/files
GNU General Public License v3.0

Got status error when stopping a job integrated with headless browser #51

Closed: ServletJunior closed this issue 5 years ago

my8100 commented 5 years ago

Are there any pending jobs in the jobs page after you deleted the timer task?

ServletJunior commented 5 years ago

> Are there any pending jobs in the jobs page after you deleted the timer task?

Whether or not I delete the task, it still shows like this. (screenshot attached)

my8100 commented 5 years ago

See step 4 in https://github.com/my8100/files/blob/master/scrapyd-basic-auth/README.md#try-it-out

ServletJunior commented 5 years ago

> See step 4 in https://github.com/my8100/files/blob/master/scrapyd-basic-auth/README.md#try-it-out

I have refreshed the Jobs page now, but clicking Stop on the task still leaves it running. (screenshot attached)

my8100 commented 5 years ago

But there are no running jobs in your screenshot.

ServletJunior commented 5 years ago

> See step 4 in https://github.com/my8100/files/blob/master/scrapyd-basic-auth/README.md#try-it-out

I created a new task again; clicking Stop or ForceStop on the Jobs page both give this error. (screenshot attached)

my8100 commented 5 years ago

See https://github.com/my8100/scrapydweb/issues/7#issuecomment-485254942

ServletJunior commented 5 years ago

> See #7 (comment)

I checked the logs and there are no errors. I'm using a headless browser to crawl, so I'm not sure if that's the cause.

my8100 commented 5 years ago

It seems so. https://github.com/my8100/scrapydweb/issues/7#issuecomment-485246846 https://github.com/my8100/scrapydweb/issues/7#issuecomment-485247391
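
The comments linked from #7 suggest the job process keeps running because the headless browser is never shut down when the spider is asked to close. A minimal sketch of the usual fix, assuming a Selenium-driven headless Chrome (this is not the code from this issue): quit the browser when the spider closes, so the process can actually exit once the job is cancelled.

```python
import scrapy
from selenium import webdriver


class HeadlessSpider(scrapy.Spider):
    # Hypothetical spider name and URL, for illustration only.
    name = 'headless_example'
    start_urls = ['https://example.com']

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        options = webdriver.ChromeOptions()
        options.add_argument('--headless')
        self.driver = webdriver.Chrome(options=options)

    def parse(self, response):
        # Render the page in the headless browser and scrape from it.
        self.driver.get(response.url)
        yield {'url': response.url, 'title': self.driver.title}

    def closed(self, reason):
        # Scrapy calls closed() when the spider finishes or is cancelled.
        # Quitting the browser here prevents the job process from hanging
        # on a lingering browser subprocess after a Stop/ForceStop.
        self.driver.quit()
```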

ServletJunior commented 5 years ago

> It seems so. #7 (comment) #7 (comment)

Thanks. After running scrapydweb -v in cmd, the job can now be stopped normally. One last question: the Items page shows a 404. (screenshot attached)

my8100 commented 5 years ago

Actually, adding the argument ‘-v’ only changes the logging level of scrapydweb. As for the items page, see https://scrapyd.readthedocs.io/en/stable/config.html#items-dir

BTW, try to communicate in English on GitHub.
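
For reference, a minimal scrapyd.conf sketch based on the Scrapyd config docs linked above (not quoted from this thread): when items_dir is non-empty, Scrapyd stores scraped items under that directory and serves them over HTTP, which is what the Items page links to.

```ini
# scrapyd.conf (a hedged sketch; see the Scrapyd config docs for all options)
[scrapyd]
# A non-empty items_dir makes Scrapyd store scraped items under this directory
# and serve them at /items/, so the Items page stops returning 404.
items_dir = items
```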

ServletJunior commented 5 years ago

> Actually, adding the argument ‘-v’ only changes the logging level of scrapydweb. As for the items page, see https://scrapyd.readthedocs.io/en/stable/config.html#items-dir
>
> BTW, try to communicate in English on GitHub.

OK, thanks! Should FEED_URI be written in the settings.py of the Scrapy project?
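
The thread ends here, but for context: FEED_URI (together with FEED_FORMAT) is a regular Scrapy setting, so it can live in the project's settings.py; newer Scrapy versions replace both with the FEEDS setting. A hedged sketch, not from this thread:

```python
# settings.py of the Scrapy project (a hedged sketch, paths are illustrative).
# %(name)s and %(time)s are expanded by Scrapy with the spider name and a
# timestamp for each run.
FEED_URI = 'file:///tmp/items/%(name)s/%(time)s.json'
FEED_FORMAT = 'json'
```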