ScrapeOps / scrapeops-scrapy-sdk

Scrapy extension that gives you all the scraping monitoring, alerting, scheduling, and data validation you will need straight out of the box.
https://scrapeops.io/
BSD 3-Clause "New" or "Revised" License

How to determine the spider's job status #6

Closed q1179336367 closed 1 year ago

q1179336367 commented 1 year ago

Based on what conditions can the spider's running status be judged as running, waiting, or fail?

josephkearney91 commented 1 year ago

Regarding the conditions that determine whether the spider's status is running, waiting, or fail:

When the spider stops running due to an error in your spider and the logs are not received by our API (due to a connection issue), the SDK switches the status to unknown after 3 retries. The same thing can happen if the error logs are too large to transmit to our API, in which case the SDK stops transmitting to our backend.
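The decision described above can be sketched roughly as follows. This is a hypothetical illustration, not the actual ScrapeOps SDK or backend code; the function name, parameters, and retry limit are assumptions made for clarity:

```python
# Hypothetical sketch (NOT the actual ScrapeOps implementation) of how a
# monitoring backend might classify a job's status from the signals
# described above: clean vs. errored exit, log delivery, and retry count.

MAX_RETRIES = 3  # assumed retry limit before log delivery is abandoned


def classify_status(finished: bool, exit_ok: bool,
                    logs_received: bool, failed_retries: int) -> str:
    """Return one of: 'finished', 'failed', 'running', 'unknown'."""
    if finished and logs_received:
        # The spider closed and its final logs arrived: we can tell
        # whether it ended cleanly or because of an error.
        return "finished" if exit_ok else "failed"
    if logs_received:
        # Logs/stats are still arriving, so the job is alive.
        return "running"
    if failed_retries >= MAX_RETRIES:
        # No data after all retries (connection issue, oversized error
        # logs, etc.) -- the backend can no longer tell what happened.
        return "unknown"
    # Still within the retry window; keep the last known status.
    return "running"
```

For example, a spider that crashed while its logs never reached the API would end up as `unknown` once the retry budget is exhausted, whereas one whose error logs did arrive would be marked `failed`.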