-
For example, my spiders are deployed across multiple servers, all running scrapyd. How do I configure these servers in SpiderKeeper?
I ask because, as far as I can tell, SpiderKeeper is started with `spiderkeeper --server=http://localhost:6800`.
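In case it helps, SpiderKeeper's `--server` option appears to be an append-style flag, so it can seemingly be passed once per scrapyd instance. A minimal sketch of launching it that way (both host addresses are placeholders, not anything from the original question):
``` python
# Sketch: start SpiderKeeper against several scrapyd servers by repeating
# the --server option. The two hosts below are hypothetical placeholders.
import subprocess

subprocess.run([
    "spiderkeeper",
    "--server=http://192.168.0.1:6800",
    "--server=http://192.168.0.2:6800",
])
```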
-
I delete SpiderKeeper projects by calling the API directly:
``` python
import requests

session = requests.Session()
for i in range(2, 19):
    project_delete_url = 'http://localhost:5000/project/{}/delete'.format(i)
    r = session.get(project_delete_url, auth=('admin', 'PASSWORD'))  # password is a placeholder; original snippet was truncated here
```
-
Sorry, stupid question, but where do I put my normal spider and pipeline files?
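For what it's worth, spiders and pipelines live in the standard Scrapy project layout regardless of SpiderKeeper; a typical `scrapy startproject` tree looks roughly like this ("myproject" is a placeholder name):
```
myproject/
├── scrapy.cfg              # deploy config read by scrapyd-deploy
└── myproject/
    ├── settings.py
    ├── items.py
    ├── pipelines.py        # item pipelines go here
    └── spiders/            # spider modules go here
        └── my_spider.py
```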
-
I'm currently developing locally on Windows 10 and have the `SCRAPY_PROJECTS_DIR` setting set to `SCRAPY_PROJECTS_DIR = 'C:/Users/mhill/PycharmProjects/dScrapy/d_webscraping'`.
In that directory, I …
-
Hi, according to the following links:
https://doc.scrapy.org/en/latest/topics/spiders.html#spiderargs
https://scrapyd.readthedocs.io/en/stable/api.html#schedule-json
Params can be …
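A minimal sketch of what those docs describe, assuming a scrapyd instance at `http://localhost:6800` and placeholder project/spider/argument names: any POST parameter to `schedule.json` that isn't a reserved one (`project`, `spider`, `setting`, `jobid`, `priority`, `_version`) is forwarded to the spider as a keyword argument.
``` python
# Schedule a run and pass a spider argument through scrapyd's schedule.json.
# "myproject", "myspider" and the "category" argument are placeholders.
import requests

resp = requests.post(
    "http://localhost:6800/schedule.json",
    data={
        "project": "myproject",
        "spider": "myspider",
        "category": "electronics",      # arrives in the spider as category="electronics"
        "setting": "DOWNLOAD_DELAY=2",  # optional Scrapy setting override
    },
)
print(resp.json())  # e.g. {"status": "ok", "jobid": "..."}
```
On the spider side the argument shows up as usual, e.g. `def __init__(self, category=None, *args, **kwargs)` or via `self.category`.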
-
So that other components can get the config options directly from it.
For example, the poller and scheduler currently store the config, so that they can call `get_spider_queues` with it.
First m…
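For illustration only, here is a minimal sketch of the pattern described above, assuming `get_spider_queues(config)` is importable from `scrapyd.utils` as in older scrapyd releases; the class is a simplified stand-in, not scrapyd's actual poller.
``` python
from scrapyd.config import Config
from scrapyd.utils import get_spider_queues


class QueuePoller:
    """Simplified stand-in: stores the config so it can rebuild the queues later."""

    def __init__(self, config: Config):
        self.config = config                     # kept only to call get_spider_queues again
        self.queues = get_spider_queues(config)  # maps project name -> spider queue

    def update_projects(self):
        # Re-read the per-project queues using the stored config.
        self.queues = get_spider_queues(self.config)
```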
-
A while ago I noticed that scheduled jobs sometimes weren't executed as planned. After carefully checking the program's output log, I found the following error:
```
2018-10-11 16:18:50,165 - SpiderKeeper.app - DEBUG - [sync_job_execution_status]
Job "sync_job_execution_status_job (trigger: interval[0:00:05], next ru…
```
-
Hello, when deploying a new spider project, what exactly do I need to do to deploy it?
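If it helps, the flow documented in SpiderKeeper's README is to build an egg from your Scrapy project with scrapyd-client's `scrapyd-deploy --build-egg output.egg`, then upload that egg on the project's Deploy page in the web UI. A minimal sketch of the build step (the project path is a placeholder):
``` python
# Build output.egg so it can be uploaded via SpiderKeeper's Deploy page.
# The cwd below is a placeholder: it must be the directory containing
# scrapy.cfg, with scrapyd-client installed in the environment.
import subprocess

subprocess.run(
    ["scrapyd-deploy", "--build-egg", "output.egg"],
    cwd="/path/to/your/scrapy_project",
    check=True,
)
```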
-
On Linux: HTTPConnectionPool(host='192.168.0.24', port=6801): Max retries exceeded with url: /listprojects.json (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connectio…
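A quick way to check whether that scrapyd instance is reachable at all ([Errno 111] Connection refused usually means nothing is listening on that host/port, or a firewall is blocking it). The URL is taken from the error above:
``` python
# Probe the scrapyd endpoint that SpiderKeeper is trying to poll.
import requests

try:
    r = requests.get("http://192.168.0.24:6801/listprojects.json", timeout=5)
    print(r.status_code, r.json())
except requests.ConnectionError as exc:
    print("scrapyd is unreachable:", exc)
```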
-
{"status": "error", "message": "Traceback (most recent call last):\\n File \"/Users/qiaolongjin/anaconda2/lib/python2.7/runpy.py\", line 174, in _run_module_as_main\\n \"__main__\", fname, loader,…