I'm using Python 2.7.12 with Scrapyd 1.2.1 and Twisted 19.10.0 running inside a Docker container. I've recently encountered this message in the Scrapyd log:
2020-03-25T00:00:26+0000 [-] Process started: project='foo' spider='bar' job='a136c0c86e2b11eaa7e00242ac110004' pid=11430 log='/var/lib/scrapyd/logs/foo/bar/a136c0c86e2b11eaa7e00242ac110004.log' items=None
...
2020-03-25T00:00:36+0000 [Launcher,11430/stderr] Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 174, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 40, in <module>
    main()
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 35, in main
    with project_environment(project):
  File "/usr/lib/python2.7/contextlib.py", line 17, in __enter__
    return self.gen.next()
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/runner.py", line 13, in project_environment
    app = get_application()
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/__init__.py", line 15, in get_application
    return appfunc(config)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/app.py", line 39, in application
    webservice = TCPServer(http_port, server.Site(webcls(config, app)), interface=bind_address)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/website.py", line 35, in __init__
    self.update_projects()
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/website.py", line 38, in update_projects
    self.poller.update_projects()
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/poller.py", line 31, in update_projects
    self.queues = get_spider_queues(self.config)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/utils.py", line 62, in get_spider_queues
    d[project] = SqliteSpiderQueue(dbpath)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/spiderqueue.py", line 11, in __init__
    self.q = JsonSqlitePriorityQueue(database, table)
  File "/usr/local/lib/python2.7/dist-packages/scrapyd/sqlite.py", line 117, in __init__
    self.conn.execute(q)
sqlite3.OperationalError: database is locked
...
2020-03-25T00:00:37+0000 [-] Process died: exitstatus=1 project='foo' spider='bar' job='a136c0c86e2b11eaa7e00242ac110004' pid=11430 log='/var/lib/scrapyd/logs/foo/bar/a136c0c86e2b11eaa7e00242ac110004.log' items=None
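For context, the traceback bottoms out in Scrapyd's SQLite-backed spider queue, and "database is locked" is what `sqlite3` raises when a second connection tries to write while another connection still holds SQLite's single write lock. A minimal sketch of how that error arises (the path and table name here are illustrative, not Scrapyd's actual schema):

```python
import os
import sqlite3
import tempfile

# Illustrative database file, not Scrapyd's actual dbs directory.
path = os.path.join(tempfile.mkdtemp(), "queue.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE IF NOT EXISTS q (priority INTEGER, message BLOB)")
writer.execute("BEGIN IMMEDIATE")  # take SQLite's single write lock now
writer.execute("INSERT INTO q VALUES (0, x'00')")

# A second connection that tries to write while the lock is held
# gets SQLITE_BUSY once its busy timeout (here 0.1 s) expires,
# which the Python sqlite3 module surfaces as OperationalError.
other = sqlite3.connect(path, timeout=0.1)
err = None
try:
    other.execute("INSERT INTO q VALUES (1, x'01')")
except sqlite3.OperationalError as e:
    err = str(e)

print(err)  # -> database is locked
writer.rollback()  # release the lock
```

In Scrapyd's case the lock could be held by another process sharing the same queue database file, which is consistent with the error appearing only intermittently at spider start.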
When I look at the Jobs page in the Scrapyd web UI, I can see an entry there:
However, the log file is unavailable; requesting it returns a 404 "No Such Resource" error.