-
I have several machines spidering with scrapyd, and I am monitoring and managing them via the scrapyd API. I love the software, but... I cannot seem to cancel jobs. I make the call to the cancel API…
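For reference, a cancel call I would expect to work is sketched below. The host, project name, and job id are placeholders, and this assumes the default scrapyd port; `cancel.json` only signals the running process, so the job may not stop immediately.

```python
# Sketch of cancelling a scrapyd job via its cancel.json endpoint.
# Host, project, and job id are illustrative placeholders.
import requests

SCRAPYD_URL = "http://localhost:6800"  # assumed default scrapyd port


def cancel_payload(project: str, job_id: str) -> dict:
    """Build the form data that cancel.json expects."""
    return {"project": project, "job": job_id}


def cancel_job(project: str, job_id: str) -> dict:
    """POST to cancel.json and return the decoded JSON response."""
    resp = requests.post(f"{SCRAPYD_URL}/cancel.json",
                         data=cancel_payload(project, job_id))
    resp.raise_for_status()
    return resp.json()
```

A successful response should report `"status": "ok"`; checking the jobs page afterwards confirms whether the process actually stopped.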
-
https://docs.djangoproject.com/en/3.0/howto/deployment/
Go through [checklist](https://docs.djangoproject.com/en/3.0/howto/deployment/checklist/), notably:
* https://docs.djangoproject.com/en/3.…
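The checklist items that most often need changing can be sketched as a production-settings fragment; the hostname and environment-variable name below are illustrative placeholders, not values from this project.

```python
# Minimal production-settings sketch following the Django deployment checklist.
# The hostname and env-var name are illustrative placeholders.
import os

DEBUG = False  # never run with DEBUG = True in production
SECRET_KEY = os.environ["DJANGO_SECRET_KEY"]  # keep the secret out of source control
ALLOWED_HOSTS = ["example.org"]  # hosts this site is allowed to serve

# HTTPS-related hardening from the checklist
SECURE_SSL_REDIRECT = True
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True
```

Running `python manage.py check --deploy` reports which of these are still unset.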
-
(From discussion from https://github.com/open-contracting/kingfisher-process/pull/152#issuecomment-491908018 )
-
Hello there.
I have been using scrapy-cluster for my project, and so far the results have been good; problems only appeared once I started using it at scale. I have a spider that tries to authenticate to a websit…
-
I noticed that if I visit the spider page, I see the following:
`Scrapyd
Available projects: ScrapydWeb_demo
Jobs
Items
Logs
Documentation
How to schedule a spider?
To schedule a spider yo…
-
I installed anaconda, scrapy, and scrapyd on Ubuntu 18.04. When I try to start the scrapyd server, the error below occurs:
```python
(base) lee@ScrapyStation:~$ scrapyd
Unhandled Error
Traceback (most rece…
-
Ability to schedule the scraper to fetch and download albums as often as needed.
-
**Describe the bug**
Navigating to the projects page throws an error.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to the '/project' page
**Environment (please complete the following information):**
## 500 (INTERNAL …
-
I build a small app to deploy, launch and monitor spiders from one big scrapy project.
I am using scrapyd to schedule the spiders through basic Python requests:
`requests.post('http://localhost:68…
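Spelled out in full, a schedule call would look something like the sketch below; port 6800 is the scrapyd default, and the project and spider names are placeholders.

```python
# Sketch of scheduling a spider through scrapyd's schedule.json endpoint.
# Host, project, and spider names are illustrative placeholders.
import requests


def schedule_payload(project: str, spider: str, **spider_args) -> dict:
    """Form data for schedule.json; extra kwargs become spider arguments."""
    data = {"project": project, "spider": spider}
    data.update(spider_args)
    return data


def schedule(project: str, spider: str,
             url: str = "http://localhost:6800", **spider_args) -> str:
    """Schedule a run and return the job id scrapyd assigns."""
    resp = requests.post(f"{url}/schedule.json",
                         data=schedule_payload(project, spider, **spider_args))
    resp.raise_for_status()
    return resp.json()["jobid"]
```

The returned job id is what the other endpoints (`cancel.json`, `listjobs.json`) use to identify the run.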
-
I need to know when the spider finishes. For now I can only poll the status in a loop until it finishes. I asked whether there is some way to pass a callback URL to scrapyd when scheduling a job, when th…
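As far as I know, scrapyd has no built-in webhook or callback mechanism, so polling `listjobs.json` is the usual workaround. A minimal sketch (host, project, and job id are placeholders):

```python
# Sketch of polling scrapyd's listjobs.json until a job reaches "finished".
# Host, project name, and job id are illustrative placeholders.
import time

import requests


def is_finished(listjobs_response: dict, job_id: str) -> bool:
    """True if the job id appears in the 'finished' list of a listjobs.json reply."""
    return any(job.get("id") == job_id
               for job in listjobs_response.get("finished", []))


def wait_for_job(project: str, job_id: str,
                 url: str = "http://localhost:6800",
                 poll_seconds: float = 5.0) -> None:
    """Block until scrapyd reports the job as finished."""
    while True:
        resp = requests.get(f"{url}/listjobs.json", params={"project": project})
        resp.raise_for_status()
        if is_finished(resp.json(), job_id):
            return
        time.sleep(poll_seconds)
```

An alternative, if you control the spiders, is to fire your own notification from the spider's `closed()` hook instead of polling from outside.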