-
Operating system: CentOS 7
Steps to reproduce
1. Running auto_manage_spiders.py reports that the upload succeeded, but nothing shows up in the system.
(base) [root@localhost scrapyd_web_manager]# python auto_manage_spiders.py -dp
deploy True
POST Fetch: http://192.168.1.94:5000/1/d…
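To narrow down whether the upload actually reaches the server, a minimal diagnostic sketch like the one below can replay the deploy POST by hand and print the full response. The endpoint path and form field names are assumptions (the URL in the log above is truncated), so adjust them to whatever auto_manage_spiders.py really calls.

```python
# Hypothetical replay of the deploy request -- base URL taken from the log,
# path and field names are guesses and must be adapted.
import requests

BASE_URL = "http://192.168.1.94:5000"   # host seen in the log above
DEPLOY_PATH = "/1/deploy/"              # hypothetical path; the real one is truncated in the log

def deploy_egg(egg_path: str, project: str, version: str) -> None:
    with open(egg_path, "rb") as f:
        resp = requests.post(
            BASE_URL + DEPLOY_PATH,
            data={"project": project, "version": version},
            files={"egg": f},
            timeout=30,
        )
    # A 2xx status alone does not prove the egg was registered;
    # print the body so a silent server-side failure becomes visible.
    print(resp.status_code)
    print(resp.text)

if __name__ == "__main__":
    deploy_egg("dist/myproject-1.0-py3.8.egg", "myproject", "1.0")
```

If the response body reports success but the web UI still shows nothing, the problem is on the server side rather than in the upload script.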
-
1. Allow adding scrapyd hosts directly from the web management interface, including scrapyd hosts reachable over https
2. Add Chinese language support
3. Make the timeout and retry count configurable for each kind of request (a hedged config sketch follows below)
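For items 1 and 3, a hedged sketch of what such settings could look like; none of these option names exist in the project today, they only illustrate the request.

```python
# Illustrative settings only -- names and defaults are assumptions, not real options.
SCRAPYD_HOSTS = [
    "http://192.168.1.94:6800",
    "https://scrapyd.example.com:6443",   # hypothetical https host added via the web UI
]
LANGUAGE = "zh_CN"        # hypothetical i18n switch
REQUEST_TIMEOUT = 10      # seconds; could also be a per-request-type mapping
REQUEST_RETRIES = 3       # retries before a host is reported as down
```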
-
[scrapyd](https://scrapyd.readthedocs.io/) has scheduling, whereas this project starts running a spider immediately when it is scheduled.
The idea is to start [suspended Kubernetes jobs](https://kubernete…
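A minimal sketch of that idea using the official `kubernetes` Python client, assuming a cluster and client version recent enough to support `spec.suspend` (Kubernetes 1.21+); the image, spider and job names are placeholders, not the project's actual implementation.

```python
from kubernetes import client, config

config.load_kube_config()          # or load_incluster_config() when running inside the cluster
batch = client.BatchV1Api()

def create_suspended_job(name: str) -> None:
    """Create the Job up front, but with suspend=True so no pod starts yet."""
    job = client.V1Job(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1JobSpec(
            suspend=True,
            template=client.V1PodTemplateSpec(
                spec=client.V1PodSpec(
                    restart_policy="Never",
                    containers=[client.V1Container(
                        name="spider",
                        image="example/spider:latest",          # placeholder image
                        args=["scrapy", "crawl", "myspider"],   # placeholder spider
                    )],
                ),
            ),
        ),
    )
    batch.create_namespaced_job(namespace="default", body=job)

def release_job(name: str) -> None:
    """Flipping spec.suspend to False is what actually starts the pod."""
    batch.patch_namespaced_job(
        name=name, namespace="default", body={"spec": {"suspend": False}}
    )
```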
-
When attempting to run a spider, a notification pops up telling me to notify the dev team of an unexpected error.
Console:
[28/Jan/2018 23:20:25] "PATCH /api/projects/MayWes/spiders/www.maywes.com HTTP/1…
-
I am experiencing a situation where I am sending thousands of `schedule` requests to schedule Scrapyd jobs, and the server (Ubuntu) kills the scrapyd process because it consumes a critical amount of m…
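As a client-side workaround (a sketch, not a fix for scrapyd itself, and it assumes that bounding the pending queue is what keeps memory in check), the schedule calls can be throttled against Scrapyd's `listjobs.json` so that only a limited number of jobs are ever pending. Host, project and spider names below are placeholders.

```python
import time
import requests

SCRAPYD = "http://localhost:6800"   # placeholder Scrapyd host
PROJECT = "myproject"
SPIDER = "myspider"
MAX_PENDING = 200                   # tune to what the machine can hold in memory

def pending_count() -> int:
    # listjobs.json returns pending/running/finished jobs for a project
    r = requests.get(f"{SCRAPYD}/listjobs.json", params={"project": PROJECT}, timeout=10)
    r.raise_for_status()
    return len(r.json().get("pending", []))

def schedule_many(arg_list):
    for args in arg_list:
        # back off until the pending queue drains below the cap
        while pending_count() >= MAX_PENDING:
            time.sleep(5)
        requests.post(
            f"{SCRAPYD}/schedule.json",
            data={"project": PROJECT, "spider": SPIDER, **args},
            timeout=10,
        ).raise_for_status()
```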
-
We need a small stand-alone web UI that ties in with the rest of the components in #24 to visualize the data generated by the cluster. You should also be able to submit API requests to the cluster.
Preferab…
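A rough sketch of the kind of UI meant here, assuming the cluster exposes a Scrapyd-style JSON API at a single base URL (the URL and endpoint names are placeholders): a tiny Flask app that renders cluster data and forwards API requests.

```python
from flask import Flask, jsonify, request
import requests

CLUSTER_API = "http://cluster:6800"   # placeholder cluster API base URL

app = Flask(__name__)

@app.get("/")
def dashboard():
    # hypothetical endpoint; substitute whatever the cluster actually exposes
    jobs = requests.get(f"{CLUSTER_API}/listjobs.json", timeout=10).json()
    return jsonify(jobs)

@app.post("/api/<path:endpoint>")
def proxy(endpoint):
    # pass API requests straight through to the cluster
    resp = requests.post(f"{CLUSTER_API}/{endpoint}", data=request.form, timeout=10)
    return resp.text, resp.status_code, {
        "Content-Type": resp.headers.get("Content-Type", "application/json")
    }

if __name__ == "__main__":
    app.run(port=8080)
```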
-
Currently, Docker / Kubernetes logs are used for logging. This is sometimes good enough, but in many situations it is not. These logs are often truncated at night (and potentially more often when grown to a…
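One possible direction, sketched under the assumption that spiders can write to a mounted persistent volume (the paths below are placeholders): route Python/Scrapy logging into a rotating file in addition to container stdout, so logs survive truncation.

```python
import logging
from logging.handlers import RotatingFileHandler

handler = RotatingFileHandler(
    "/data/logs/spider.log",        # should point at a mounted persistent volume
    maxBytes=50 * 1024 * 1024,      # rotate at 50 MB
    backupCount=10,                 # keep the last 10 rotated files
)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s"))

root = logging.getLogger()          # Scrapy's loggers propagate to the root logger
root.addHandler(handler)
root.setLevel(logging.INFO)
```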
-
## 廖祥森
#### This week's work
- xposed: cleaned up the SD card following the strategy we discussed earlier; it works, and the cleanup is reasonably fast. The earlier slowness was probably largely due to that account having too much data. Already deployed to the machine.
- Coursework: finished the paper for Introduction to Software Engineering Research, the first Natural Language Processing assignment, and the reading report for Distributed Algorithms.
#### Next week's work
- Not sure yet whether development on the 龚亿通信 side needs to start; in any case, first look into sinosteel and consider…
-
Selected Weibo content
-
**Describe the bug**
I've set the DATABASE_URL option to support MySQL in the correct format and restarted scrapydweb, but none of the DBs in [DB_APSCHEDULER, DB_TIMERTASKS, DB_METADATA, DB_JOBS] had been created a…
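A quick diagnostic sketch (the URL and credentials are placeholders, and it assumes the `pymysql` driver and SQLAlchemy are installed): check that the account behind DATABASE_URL can connect and has the CREATE privilege, since a missing privilege would leave all four databases uncreated without an obvious error.

```python
from sqlalchemy import create_engine, text

# Placeholder connection URL -- substitute the same credentials scrapydweb uses.
DATABASE_URL = "mysql+pymysql://user:password@127.0.0.1:3306"

engine = create_engine(DATABASE_URL)
with engine.connect() as conn:
    # Create and drop a throwaway database to prove the CREATE privilege works.
    conn.execute(text("CREATE DATABASE IF NOT EXISTS scrapydweb_probe"))
    conn.execute(text("DROP DATABASE scrapydweb_probe"))
    print("Connection and CREATE DATABASE privilege OK")
```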