-
`shub schedule` works for this:
``` ini
[deploy]
url = https://dash.scrapinghub.com/api/scrapyd/
project = 1234
```
`shub schedule` fails for this:
``` ini
[deploy:default]
url = https://dash.scrapinghub.c…
-
Just a question: it seems you can't pass Heroku's assigned port when starting scrapyd.
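scrapyd has no command-line flag for the port, but it does read `http_port` from its config file. A minimal sketch of one workaround (writing a `scrapyd.conf` from Heroku's `PORT` environment variable before launching; `scrapyd_conf` is a name I made up):

``` python
import os

def scrapyd_conf(port):
    """Render a minimal scrapyd.conf that pins the HTTP port."""
    return "[scrapyd]\nhttp_port = %s\n" % port

# On Heroku, write the config before launching scrapyd from the same
# directory, e.g. in your start script:
# open("scrapyd.conf", "w").write(scrapyd_conf(os.environ["PORT"]))
```

This is a sketch under the assumption that scrapyd is started from the directory containing the generated `scrapyd.conf`.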
-
### How to convert a .jl file whose items are split by '\n' to CSV? Thanks.
{"url": "http://wap.kpi.com/?fr=ad&bid=xrlst-waps-e243aa93e6b6e031", "label1": "\u70ed\u95e8", "label2": "\u641c\u72d7…
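Assuming each line of the .jl file is one JSON object (as in the sample above), a minimal sketch of the conversion using only the standard library (`jl_to_csv` is a name I made up):

``` python
import csv
import io
import json

def jl_to_csv(jl_text):
    """Convert JSON Lines text (one JSON object per line) to CSV text."""
    rows = [json.loads(line) for line in jl_text.splitlines() if line.strip()]
    if not rows:
        return ""
    # Header: union of keys across all rows, preserving first-seen order.
    fieldnames = []
    for row in rows:
        for key in row:
            if key not in fieldnames:
                fieldnames.append(key)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()
```

For a large file you would stream line by line instead of loading everything into memory, but the idea is the same.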
-
`scrapy startproject myprojectname` puts the following section into `scrapy.cfg`:
``` ini
[deploy]
project = myprojectname
```
When users first run `shub deploy`, this will be transferred to `scrapi…
-
Hi,
I have an issue related to configuration: I need to be able to configure a spider differently when running it in my development environment compared to how it is configured when running it thr…
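One common approach (a sketch, assuming an environment variable such as `SCRAPY_ENV` that you set yourself; the helper name and setting values are illustrative) is to branch in `settings.py`:

``` python
import os

def settings_for(env):
    """Per-environment Scrapy setting overrides (hypothetical helper)."""
    if env == "production":
        return {"DOWNLOAD_DELAY": 2.0, "CONCURRENT_REQUESTS": 8,
                "HTTPCACHE_ENABLED": False}
    return {"DOWNLOAD_DELAY": 0, "CONCURRENT_REQUESTS": 16,
            "HTTPCACHE_ENABLED": True}

# In settings.py you could then apply the overrides like this:
# globals().update(settings_for(os.environ.get("SCRAPY_ENV", "development")))
```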
-
While deploying JavaScript spiders I receive:
```
2015-09-28 14:58:01+0530 [HTTPChannel,8,127.0.0.1] Unhandled Error
Traceback (most recent call last):
  File "C:\Python27\lib\site-pac…
```
-
See the docs here: http://scrapyd.readthedocs.org/en/latest/api.html#daemonstatus-json
Happy for someone to take this on.
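For reference, the `daemonstatus.json` endpoint from the linked docs is a plain GET. A minimal sketch of polling it (`daemonstatus_url` and `check_daemon` are names I made up; requires a running scrapyd):

``` python
import json
import urllib.request

def daemonstatus_url(base_url):
    """Build the daemonstatus.json endpoint URL for a scrapyd instance."""
    return base_url.rstrip("/") + "/daemonstatus.json"

def check_daemon(base_url):
    """Fetch and decode the daemon status from a running scrapyd."""
    with urllib.request.urlopen(daemonstatus_url(base_url)) as resp:
        return json.load(resp)

# Example (against a local scrapyd; per the docs, the response includes
# "status" plus pending/running/finished job counts):
# check_daemon("http://localhost:6800")
```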
-
- I created a spider for a JavaScript URL and tried to deploy and schedule it in scrapyd using these commands:
- scrapyd-deploy your_scrapyd_target -p project_name
- curl http://your_scrapyd_host:6800/schedule.j…
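The truncated curl command above targets scrapyd's `schedule.json` endpoint. A minimal sketch of building that request in Python (`schedule_call` is a name I made up; host, project, and spider names are placeholders):

``` python
import urllib.parse

def schedule_call(host, project, spider, **spider_args):
    """Return the (url, form-encoded body) pair for scrapyd's schedule.json;
    POSTing the body schedules a run of `spider` in `project`."""
    url = "http://%s:6800/schedule.json" % host
    params = {"project": project, "spider": spider}
    params.update(spider_args)  # extra entries are passed as spider arguments
    return url, urllib.parse.urlencode(params)

# Example:
# url, body = schedule_call("your_scrapyd_host", "project_name", "spider_name")
# then POST `body` to `url`
```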
-
We need to add a deployment section to the documentation covering scrapyd-deploy.
-
I encountered a problem when using 'logs_filename' to configure scrapyd.
I then checked the code on GitHub and found that 'logs_filename' was added fairly recently.
I installed this wonderful tool from PyPI, t…