-
I'm trying to deploy a spider created via [Portia](https://portia.readthedocs.io/en/latest/projects.html#deployment). Portia and scrapyd are both on their latest versions.
I'm running the **scrapyd** server just by co…
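For reference, a minimal deploy sketch, assuming a standard `scrapy.cfg` with a deploy target named `default` and a project named `portia_project` (both names are illustrative, not taken from the report):
```sh
# Deploy the Portia-generated project to the running scrapyd instance;
# the target and project names are placeholders.
$ scrapyd-deploy default -p portia_project
```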
-
```
Traceback (most recent call last):
  File "/usr/local/bin/scrapyd", line 5, in <module>
    from pkg_resources import load_entry_point
  File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line …
```
-
Hello, I read the first part of the book, but chapters 3-5 are tough without running Vagrant. I have Windows 10 on a Lenovo laptop; all downloads went OK, but `vagrant up` is still not working... was the wi…
-
I have a spider that extracts two types of items. My item pipeline saves these in two separate files by using two item exporters (currently the standard CSV exporter). This approach creates the two files …
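For context, a minimal sketch of such a pipeline, assuming two hypothetical item classes `BookItem` and `AuthorItem` (class names and file names are illustrative, not from the report):
```python
import scrapy
from scrapy.exporters import CsvItemExporter

class BookItem(scrapy.Item):    # hypothetical item type
    title = scrapy.Field()

class AuthorItem(scrapy.Item):  # hypothetical item type
    name = scrapy.Field()

class SplitCsvPipeline:
    """Routes each item type to its own CSV exporter and file."""

    def open_spider(self, spider):
        # One output file and one exporter per item class.
        self.files = {
            BookItem: open("books.csv", "wb"),
            AuthorItem: open("authors.csv", "wb"),
        }
        self.exporters = {cls: CsvItemExporter(f) for cls, f in self.files.items()}
        for exporter in self.exporters.values():
            exporter.start_exporting()

    def close_spider(self, spider):
        for exporter in self.exporters.values():
            exporter.finish_exporting()
        for f in self.files.values():
            f.close()

    def process_item(self, item, spider):
        # Dispatch on the concrete item class; pass through unknown types.
        exporter = self.exporters.get(type(item))
        if exporter is not None:
            exporter.export_item(item)
        return item
```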
-
I am using scrapyd with a FEED_URI to save items to an FTP endpoint.
Everything works fine, but I would also like the items to be saved locally using the default scrapyd file location, e.g. `items='file:///var/…
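As a point of comparison, Scrapy 2.1+ supports writing to several feeds at once via the `FEEDS` setting, which covers the FTP-plus-local case outside of scrapyd's own `items_dir` handling. A sketch, with placeholder URIs and credentials:
```python
# settings.py sketch: export the same items to an FTP endpoint and a local
# file simultaneously. All paths and credentials below are placeholders.
FEEDS = {
    "ftp://user:password@ftp.example.com/items/%(name)s/%(time)s.csv": {
        "format": "csv",
    },
    "file:///var/lib/scrapyd/items/%(name)s/%(time)s.csv": {
        "format": "csv",
    },
}
```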
-
I have more than one scrapyd instance I want to deploy to. However, when I run the command, it doesn't deploy to all of my scrapyd instances with one command.
```
$ scrapyd-deploy -a -p test
Usage: scrapyd-deploy [options] […
```
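For what it's worth, `-a` (`--deploy-all-targets`) is meant to deploy to every target defined in `scrapy.cfg`; if it only prints the usage text, a workaround sketch is to loop over the targets explicitly:
```sh
# scrapyd-deploy -l lists the configured targets, one "name url" pair per
# line; deploy the project to each target in turn.
for target in $(scrapyd-deploy -l | awk '{print $1}'); do
    scrapyd-deploy "$target" -p test
done
```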
-
Is there a way to retrieve the parameters of a scheduled task?
Currently:
```sh
$ curl http://localhost:6800/schedule.json -d project=myproject -d spider=somespider -d setting=DOWNLOAD_DELAY=2 -d …
```
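As far as I can tell, scrapyd's `listjobs.json` reports job ids and timestamps but does not echo back the original scheduling arguments, so one workaround sketch is to encode whatever you need to recover in an explicit `jobid` and correlate later:
```sh
# schedule.json accepts a caller-supplied jobid; stash a timestamp (or any
# key into your own record of the parameters) in it when scheduling.
$ curl http://localhost:6800/schedule.json \
    -d project=myproject -d spider=somespider \
    -d jobid="somespider-$(date +%s)" \
    -d setting=DOWNLOAD_DELAY=2
# Later, list the jobs and match on that jobid.
$ curl "http://localhost:6800/listjobs.json?project=myproject"
```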
-
When I execute
`curl http://localhost:6800/schedule.json -d project=default -d spider=spider1`
and then check the scrapyd console, I see this exception:
```
[-] Unhandled Error
Traceback (most recent call l…
```
-
The job resource commands (`log`, `items`, `requests`) currently always use the production Hubstorage endpoint (http://storage.scrapinghub.com). This is because the storage endpoint canno…
-
If you try to deploy a project with scrapyd-deploy and your current branch name contains a slash, it reports that the deploy was successful, but the project isn't actually updated.
Example branch name: feature…
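A plausible cause (an assumption, not confirmed by the report) is that scrapyd-deploy derives the version string from the git branch name, and the slash then breaks it silently. A workaround sketch is to pass an explicit, slash-free version:
```sh
# Replace '/' in the branch name so the version string contains no slash;
# the target and project names are placeholders.
$ scrapyd-deploy mytarget -p myproject \
    --version "$(git rev-parse --abbrev-ref HEAD | tr '/' '-')"
```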