Is there a way to run some code automatically when a job goes from running to finished in Scrapyd? I would like to call an API with the finished job's info and store it in a database, without repeatedly polling Scrapyd's job list.
The way to do that is to write a Scrapy extension that uses the spider_closed signal: https://docs.scrapy.org/en/latest/topics/signals.html
I do that to send crawl stats and more to an API.
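Below is a minimal sketch of such an extension, assuming you want to POST the job id, close reason, and crawl stats as JSON. The `JOB_STATS_API_URL` setting name, the module path, and the payload shape are made up for illustration; adapt them to your own API.

```python
import json
import urllib.request

from scrapy import signals
from scrapy.exceptions import NotConfigured


class JobStatsReporter:
    """Send job info and crawl stats to an external API when a spider closes."""

    def __init__(self, api_url):
        self.api_url = api_url

    @classmethod
    def from_crawler(cls, crawler):
        # Hypothetical setting name; define it in your project's settings.py.
        api_url = crawler.settings.get("JOB_STATS_API_URL")
        if not api_url:
            raise NotConfigured("JOB_STATS_API_URL is not set")
        ext = cls(api_url)
        crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
        return ext

    def spider_closed(self, spider, reason):
        # Scrapyd passes the job id to the spider as the _job argument;
        # it is absent when the spider runs outside Scrapyd.
        payload = {
            "spider": spider.name,
            "job_id": getattr(spider, "_job", None),
            "reason": reason,
            "stats": spider.crawler.stats.get_stats(),
        }
        req = urllib.request.Request(
            self.api_url,
            data=json.dumps(payload, default=str).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
```

Enable it in the project's settings, e.g. `EXTENSIONS = {"myproject.extensions.JobStatsReporter": 500}` (the module path here is an example), and the callback will fire for every spider close, including jobs started by Scrapyd.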