-
Hi, when I start the nginx-proxy container, it creates an nginxproxy_default network.
I tried adding
```
networks:
  - bridge
```
to my docker-compose.yml, but to no avail.
When I start the container with 'docker-…
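If the goal is to attach the container to Docker's pre-existing default bridge network instead of the Compose-generated one, one way is `network_mode`. A minimal sketch (the Compose file version, service name, and image are assumptions, not taken from the issue):

```yaml
version: "2"
services:
  nginx-proxy:
    image: jwilder/nginx-proxy
    # network_mode joins the host's default "bridge" network instead of
    # letting Compose create a project-scoped nginxproxy_default network.
    network_mode: bridge
```

Note that a top-level `networks:` key declares networks for Compose to create or look up; simply listing `bridge` there does not attach the service to the host's default bridge.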
-
Hi, thanks for the very helpful API wrapper.
I can't find an API for sending custom arguments to the spider. Is this feature supported in this version?
scrapyd docs: https://scrapyd.readthedocs.io/en/latest/api…
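Per the scrapyd docs, any extra form field sent to `schedule.json` is forwarded to the spider as a keyword argument. A stdlib-only sketch of building such a request (the host, project, spider, and `category` argument are hypothetical):

```python
from urllib.parse import urlencode
from urllib.request import Request

# Extra fields beyond project/spider become spider arguments.
data = urlencode({
    "project": "default",
    "spider": "myspider",
    "category": "books",  # custom spider argument (hypothetical name)
}).encode()

req = Request("http://localhost:6800/schedule.json", data=data)
# urllib.request.urlopen(req) would POST this to a running scrapyd.
print(data.decode())
```

Supplying `data=` makes the request a POST, matching what `curl -d` does.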
-
Per [the docs](http://scrapyd.readthedocs.io/en/latest/install.html#installing-scrapyd-in-ubuntu) I'm trying
```
apt-get install scrapyd
```
in Ubuntu 14.04 but get this error:
> E: Unable to loca…
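An "Unable to locate package" error means apt cannot find the package in the configured repositories; scrapyd is not in Ubuntu's default package index. One common workaround, assuming Python and pip are already present, is to install from PyPI instead:

```shell
# scrapyd is published on PyPI, so pip sidesteps the apt repository.
pip install scrapyd
```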
-
Server response (200):
{"status": "error", "message": "NameError: name 'TRUE' is not defined", "node_name": "ubuntu"}
What is the reason?
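The message points at a Python case-sensitivity slip: the boolean constants are spelled True and False, so a bare TRUE (for example in a settings module or a spider argument that gets evaluated) raises exactly this NameError. A minimal reproduction:

```python
# Python's boolean constants are True/False; TRUE is an undefined name.
flag = True  # valid

try:
    eval("TRUE")  # referencing the undefined name raises NameError
except NameError as exc:
    message = str(exc)

print(message)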
-
Scrapy 1.0 allows us to run full crawler instances within a process thanks to its internal API.
- Docs at http://doc.scrapy.org/en/0.24/topics/practices.html#running-multiple-spiders-in-the-same-proce…
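A minimal sketch of that internal API (this assumes Scrapy >= 1.0 is installed; the spider name and URL are placeholders):

```python
from scrapy.crawler import CrawlerProcess
from scrapy.spiders import Spider

class ExampleSpider(Spider):
    name = "example"
    start_urls = ["http://example.com/"]

    def parse(self, response):
        yield {"title": response.css("title::text").get()}

# CrawlerProcess starts a Twisted reactor and runs every scheduled
# spider inside the current process.
process = CrawlerProcess(settings={"LOG_LEVEL": "INFO"})
process.crawl(ExampleSpider)
process.start()  # blocks until all crawls are finished
```

Additional spiders can be queued with further `process.crawl()` calls before `process.start()`, which is how multiple spiders run in the same process.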
-
I found that the code on https://pypi.python.org/pypi/scrapyd/1.1.0 is not the same as the master branch: there is no DaemonStatus class in it. Would you deploy it again?
``` bash
$ curl http://localhost:6800/…
-
My spider (an endless crawl) stopped after working for 12 hours, and then scrapyd said
`Process died: exitstatus=None project='default' ...`
How can I solve this? Please, I need more verbose output. Is this a memory i…
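For more detail, scrapyd writes a per-job log file for each spider run under its configured logs directory; checking the log for the dead job is usually the first step. A minimal scrapyd.conf fragment (the option name is per the scrapyd docs; the directory path is an assumption):

```
[scrapyd]
logs_dir = /var/log/scrapyd
```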
-
1) Running the Scrapy command from a Python script with subprocess:

```
from subprocess import call
import sys

sys.path.append("SCRAPYPROJECTPATH")
call(["scrapy", "crawl", "example"])
```

2) Running the…
-
I have installed scrapyd via pip.
And I created my customized config file as `~/.scrapyd.conf`. But it didn't work.
I inspected `scrapyd/config.py`. It seems that the function didn't check `~/.scra…
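A sketch of the per-user lookup the issue expects, i.e. including an expanded `~/.scrapyd.conf` among the candidate paths (the paths here are illustrative, not scrapyd's actual search order):

```python
import os

# expanduser() turns "~" into the real home directory, so the
# per-user config file can actually be found on disk.
candidates = [
    "/etc/scrapyd/scrapyd.conf",
    os.path.expanduser("~/.scrapyd.conf"),
]
existing = [path for path in candidates if os.path.exists(path)]
print(existing)
```

Without the `expanduser()` call, a literal `~/.scrapyd.conf` path would never match an existing file, which is consistent with the behavior described above.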
-
I'm using Scrapyd to run Scrapy as a web service.
I would like to use the curl command with parameters like this:
```
curl http://myip:6800/schedule.json -d project=default -d spider=myspider -d domai…