antoinemartin / django-windows-tools

Django application providing management commands to host Django projects in Windows environments
BSD 2-Clause "Simplified" License
51 stars 13 forks

python process spawned and then disappeared in windows service #2

Closed cangelzz closed 11 years ago

cangelzz commented 11 years ago

Hello, I am facing a weird problem. From the command console I am able to run "manage.py celeryd" and see that the daemon is ready. However, when I install it as a service, the process list shows that pythonservice.exe first creates one python process and two python sub-processes (these should be the 2 workers), and then the sub-processes and the python process disappear. Checking celery.log, it shows the same message that the daemon was ready: [WARNING/MainProcess] celery@HOST ready.

The problem only happens with celeryd, not celerybeat; celerybeat runs without any problem. I tried celery versions 3.0.10 and 3.1b1. Broker: django database. Django 1.4.2.

Thanks very much if you have any answers :)

antoinemartin commented 11 years ago

Hello,

To help track down the issue, could you please change -l info to -l debug in the [celeryd] section of the .ini file, then check your Windows Event Viewer to see if a python exception is reported?
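For reference, a minimal sketch of what the relevant `.ini` section might look like. The option names here are illustrative stand-ins (only the `-l debug` flag is from the discussion above); adapt them to your actual service configuration file:

```ini
[celeryd]
; hypothetical keys for illustration -- check your own service .ini
command=celeryd
; switch the log level from info to debug for more detail in celery.log
parameters=-l debug -f ..\logs\celery.log
```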

Thanks in advance.

cangelzz commented 11 years ago

I've tried your suggestion, but the event viewer doesn't show any exception; otherwise I would have noticed it earlier. The event viewer reports that the service is in the running state, and there are no other errors since pythonservice.exe keeps running.

Here's the debug log:

[2012-11-08 10:07:46,555: DEBUG/Process-1] [Worker] Loading modules.
[2012-11-08 10:07:46,572: DEBUG/Process-1] [Worker] Claiming components.
[2012-11-08 10:07:46,572: DEBUG/Process-1] [Worker] Building boot step graph.
[2012-11-08 10:07:46,572: DEBUG/Process-1] [Worker] New boot order: {ev, queues, beat, pool, mediator, autoreloader, timers, state-db, autoscaler, consumer}
[2012-11-08 10:07:46,572: DEBUG/Process-1] Starting celery.concurrency.processes.TaskPool...
[2012-11-08 10:07:46,572: WARNING/MainProcess] C:\Python\lib\site-packages\billiard\__init__.py:318: RuntimeWarning: force_execv is not supported as the billiard C extension is not installed
  warnings.warn(RuntimeWarning(W_NO_EXECV))
[2012-11-08 10:07:46,602: DEBUG/Process-1] celery.concurrency.processes.TaskPool OK!
[2012-11-08 10:07:46,602: DEBUG/Process-1] Starting celery.worker.mediator.Mediator...
[2012-11-08 10:07:46,618: DEBUG/Process-1] celery.worker.mediator.Mediator OK!
[2012-11-08 10:07:46,618: DEBUG/Process-1] Starting celery.worker.consumer.BlockingConsumer...
[2012-11-08 10:07:46,618: WARNING/MainProcess] celery@myhost ready.
[2012-11-08 10:07:46,618: DEBUG/MainProcess] consumer: Re-establishing connection to the broker...
[2012-11-08 10:07:46,618: INFO/MainProcess] consumer: Connected to django://localhost//.
[2012-11-08 10:07:46,650: DEBUG/MainProcess] consumer: basic.qos: prefetch_count->8
[2012-11-08 10:07:46,697: DEBUG/MainProcess] consumer: Ready to accept tasks!

There is nothing coming after this; the python process disappears.

If the daemon runs from the command line, this is what it displays:

[celery startup banner: celery@myhost v3.0.12 (Chiastic Slide), broker: django://localhost//]

[2012-11-08 10:03:24,138: WARNING/MainProcess] C:\Python\lib\site-packages\billiard\__init__.py:318: RuntimeWarning: force_execv is not supported as the billiard C extension is not installed
  warnings.warn(RuntimeWarning(W_NO_EXECV))
[2012-11-08 10:03:24,200: WARNING/MainProcess] celery@myhost ready.

antoinemartin commented 11 years ago

Hello,

It seems to be an issue with the 3.0 version of Celery. Personally, I'm using 2.4. It seems they have changed the way they manage the workers; beat does not use workers, which would explain why celerybeat is unaffected.

I will try to upgrade in my test environment and get back to you.

cangelzz commented 11 years ago

Thank you, glad we found the cause.

antoinemartin commented 11 years ago

Hello,

I finally found where the bug was coming from. The new Celery version uses Billiard, which is a fork of the standard python multiprocessing package. A small bug in it prevents proper execution of the workers in the context of a service. I monkey patched it.
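To illustrate the technique mentioned above: a monkey patch rebinds an attribute on an already-imported module so that later callers pick up a fixed replacement. This is only a sketch of the general pattern; the module and function names below are hypothetical stand-ins, not the actual code patched inside billiard.

```python
# Sketch of monkey patching a buggy function in a third-party module.
# All names here (fake_billiard, buggy_helper, ...) are illustrative only.
import types

# Stand-in for a third-party module that ships a buggy function.
fake_billiard = types.ModuleType("fake_billiard")

def buggy_helper():
    # Pretend this fails when called from inside a Windows service,
    # where no console is attached to the process.
    raise RuntimeError("no console attached")

fake_billiard.buggy_helper = buggy_helper

def patched_helper():
    # Replacement that behaves correctly in the service context.
    return "worker started"

# The monkey patch: rebind the attribute on the module object before
# any worker code gets a chance to call the broken original.
fake_billiard.buggy_helper = patched_helper

print(fake_billiard.buggy_helper())  # -> worker started
```

The key point is that the patch must run early (e.g. at service startup, before the worker pool is created), so every subsequent lookup of the attribute resolves to the fixed function.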

It should work now if you pull the new version.

cangelzz commented 11 years ago

it works! thanks!