DuoCaiXie closed this issue 2 years ago
Don't run Scrapy as root. Are you running Scrapyd in a Docker image? Did you mount a remote file system for your dbs or eggs directories?
Closing as no response to question in several months.
same problem!
Need more info @frshman. See the questions above.
When I run scrapyd and then run `curl http://localhost:6800/schedule.json -d project=default -d spider=somespider`, the generated log is as follows:
But why? When I run the same spider from the command line with `scrapy crawl xxx`, it works normally. So why does scheduling through Scrapyd fail?
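For anyone comparing the two invocations: `scrapy crawl` runs the spider from the project in your current working directory, while Scrapyd runs it from the egg deployed to the daemon, so the two can behave differently. A minimal sketch of exercising the Scrapyd HTTP API (the project name `default` and spider name `somespider` are placeholders from the thread; substitute your own, and note the reachability check is just so the script degrades gracefully when no Scrapyd is running):

```shell
BASE=http://localhost:6800

# Direct run (inside the Scrapy project directory) -- bypasses Scrapyd entirely:
#   scrapy crawl somespider

# Scheduled run -- Scrapyd executes the spider from the deployed egg.
if curl -fs "$BASE/daemonstatus.json" >/dev/null 2>&1; then
  # Schedule the spider through Scrapyd's HTTP API.
  curl -s "$BASE/schedule.json" -d project=default -d spider=somespider
  # Inspect pending/running/finished jobs for the project.
  curl -s "$BASE/listjobs.json?project=default"
else
  echo "scrapyd is not reachable at $BASE"
fi
```

If the scheduled job fails while the direct run works, comparing the job's log under Scrapyd's `logs/` directory with the local `scrapy crawl` output usually points at an environment difference (user, working directory, or a stale deployed egg).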