ZoomQuiet / pybeanstalk

Automatically exported from code.google.com/p/pybeanstalk

losing messages from work queue #10

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
Hi, 

Thanks for implementing this in python!

When I run:

  > python simple_clients.py producer localhost 11300

and

  > python simple_clients.py consumer localhost 11300

in another Terminal.app window, it works as advertised UNTIL I kill the producer; then all the messages get lost (the consumer stops getting messages from beanstalkd).

Also, if I put a sleep in the consumer's while loop to simulate long-running worker jobs, the consumer process only gets the LAST job created by the producer and skips all the other jobs that were generated during the sleep.
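
For illustration, the modification described above would look roughly like the sketch below. Only the sleep idea comes from the report; the reserve/Finish calls and the j.data attribute are illustrative stand-ins, not the actual code in simple_clients.py:

    import time

    def consumer_main(connection):
        while True:
            j = connection.reserve()    # block until a job arrives (illustrative name)
            print 'got job:', j.data    # j.data stands in for the payload attribute
            time.sleep(5)               # simulate a long-running worker job
            j.Finish()                  # acknowledge the job (illustrative name)

With a sleep like this in place, the jobs the producer puts during those five seconds are reportedly skipped instead of being reserved on the next loop iteration.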

Not sure if this is expected behavior, but I just wanted to bring it up here and see if you have any input. I was hoping that no messages would be lost unless the server itself was killed (which would be OK).

I'm on OS X 10.5, Python 2.5.4, and beanstalkd 1.3.

thanks!

r.S.

Original issue reported on code.google.com by robspych...@gmail.com on 21 May 2009 at 4:11

GoogleCodeExporter commented 9 years ago
Looks like there might be a problem with the example code:

I took out:

    connection.job = job.Job

from def main() and placed it in def consumer_main(connection).

That seems to have fixed it.
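
A minimal sketch of what that change amounts to, assuming the main()/consumer_main() layout of simple_clients.py; only the connection.job = job.Job assignment is taken from the report, while the import path, ServerConn constructor, and reserve/Finish calls are assumptions used for illustration:

    from beanstalk import serverconn, job  # assumed pybeanstalk import path

    def consumer_main(connection):
        # Moved here from main(): register the Job wrapper class on the
        # connection before entering the consume loop.
        connection.job = job.Job
        while True:
            j = connection.reserve()    # take the next job (illustrative name)
            print 'got job:', j.data    # stand-in for the payload attribute
            j.Finish()                  # acknowledge the job (illustrative name)

    def main():
        connection = serverconn.ServerConn('localhost', 11300)  # assumed constructor
        # connection.job = job.Job    <-- previously assigned here in main()
        consumer_main(connection)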

r.S.

Original comment by robspych...@gmail.com on 21 May 2009 at 4:25

GoogleCodeExporter commented 9 years ago
Hi,

Thanks for the report.  What version of pybeanstalk are you using?

This may be tricky to track down, as it could be either a pybeanstalk error or a beanstalkd error :/ but I have OS X 10.5, so I can try to reproduce it and go from there.

Original comment by sophac...@gmail.com on 21 May 2009 at 4:26

GoogleCodeExporter commented 9 years ago
I'm running these tests with pybeanstalk 0.11.1

r.S.

Original comment by robspych...@gmail.com on 21 May 2009 at 4:31