It would be really good if there were a simple way to restart the `scrapyd` service after the configuration file `scrapyd.conf` is changed, or in any other circumstance, without killing the currently running scrapy jobs.
When installed on Debian or Ubuntu with `apt-get`, you can normally manage it as a service with `service scrapyd restart`, which is very handy in these situations.
Could you please share how you would restart scrapyd in your Docker setup? I would really appreciate it.
Currently I am stopping the Docker container and starting (`up`) it again, which kills the running jobs :(
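As a stopgap, I could at least wait for the jobs to drain before restarting, something like the sketch below. It polls scrapyd's `daemonstatus.json` endpoint until no jobs are pending or running, then restarts the container. The container name `scrapyd`, the API on `localhost:6800`, and `jq` being installed on the host are all assumptions about my setup:

```sh
# Drain-then-restart sketch: wait until scrapyd reports no pending or
# running jobs, then restart the container.
# Assumptions: container is named "scrapyd", the scrapyd API is
# published on localhost:6800, and jq is available on the host.
while true; do
  active=$(curl -s http://localhost:6800/daemonstatus.json | jq '.pending + .running')
  [ "$active" -eq 0 ] && break
  echo "waiting for $active active job(s)..."
  sleep 10
done
docker restart scrapyd
```

Of course this only avoids killing jobs by waiting them out; it is not a true hot reload of `scrapyd.conf` while jobs keep running, which is what I am really after.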