pricingassistant / mrq

Mr. Queue - A distributed worker task queue in Python using Redis & gevent
MIT License

Unicode bug which is claimed to be fixed in #142 (#191)

Status: Closed (functure closed this issue 6 years ago)

functure commented 6 years ago

I am still getting the error on the line that is claimed to be fixed in #142.

System specs: Debian Wheezy, Python 2.7.14, MRQ 0.2.1, Mongo 3.4.10, Redis 3.0.6

Traceback:

2017-12-13 15:44:17.787133 [DEBUG] mongodb_jobs: Connecting to MongoDB at [('mongo', 27017)]/mymongo...
2017-12-13 15:44:17.788948 [DEBUG] mongodb_jobs: ... connected. (readPreference=Primary())
2017-12-13 15:44:17.789091 [DEBUG] Starting mrq.basetasks.cleaning.RequeueRedisStartedJobs({'{': '}'})
2017-12-13 15:44:17.790134 [INFO] redis: Connecting to Redis at redis...
Traceback (most recent call last):
  File "/usr/local/bin/mrq-run", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/site-packages/mrq/bin/mrq_run.py", line 62, in main
    ret = job.perform()
  File "/usr/local/lib/python2.7/site-packages/mrq/job.py", line 304, in perform
    result = self.task.run_wrapped(self.data["params"])
  File "/usr/local/lib/python2.7/site-packages/mrq/task.py", line 19, in run_wrapped
    return self.run(params)
  File "/usr/local/lib/python2.7/site-packages/mrq/basetasks/cleaning.py", line 105, in run
    unserialized_job_ids = queue_obj.unserialize_job_ids(job_ids)
  File "/usr/local/lib/python2.7/site-packages/mrq/queue.py", line 172, in unserialize_job_ids
    for x in job_ids]
UnicodeEncodeError: 'ascii' codec can't encode character u'\u0ff6' in position 2: ordinal not in range(128)
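For context, the failure mode itself can be reproduced outside MRQ. The sketch below (Python 3 syntax; in Python 2, calling str() on a unicode string triggers the same implicit ASCII encode) uses a hypothetical job ID containing a non-ASCII character, and does not paraphrase the actual unserialize_job_ids internals:

```python
# Hypothetical job ID containing the non-ASCII character from the traceback.
job_id = "ab\u0ff6cd"

try:
    # Forcing an ASCII encode fails exactly the way the traceback shows:
    job_id.encode("ascii")
except UnicodeEncodeError as e:
    # prints: 'ascii' codec can't encode character '\u0ff6' in position 2:
    #         ordinal not in range(128)
    print(e)

# Encoding as UTF-8 (or keeping job IDs ASCII-only) round-trips cleanly:
assert job_id.encode("utf-8").decode("utf-8") == job_id
```

The position reported (2) is the index of the offending character in the string, which matches the `u'\u0ff6' in position 2` from the original log.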

sylvinus commented 6 years ago

Hi @functure !

Could you tell us more about how you're queuing jobs on this queue? This would happen if you queued jobs directly without going through the send_task API.

Thanks!

sylvinus commented 6 years ago

Hi @functure !

Any news on this?

thanks

functure commented 6 years ago

Hi @sylvinus. Really sorry for the extremely late reply. I completely forgot about this after I set the config parameter USE_LARGE_JOB_IDS = True as a workaround. I call mrq.job.queue_jobs to queue jobs, and I use regular queues. If it helps, I can try to send the problematic Redis entries or any other info you may require.
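For reference, the workaround mentioned above is a single flag in the worker configuration. The snippet below is a hedged sketch (the mrq-config.py file name follows the usual MRQ convention, and the description of the flag's effect is inferred from this thread rather than verified against the source):

```python
# mrq-config.py (sketch)
# Setting this to True makes MRQ use the larger, plain job-ID
# representation instead of the compacted one, which appears to
# sidestep the ASCII re-encoding step that raised the
# UnicodeEncodeError in this issue.
USE_LARGE_JOB_IDS = True
```

Note sylvinus's caveat below: switching this flag on an existing deployment can leave incompatible serialized IDs behind, so the DB needs to be emptied after changing it.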

sylvinus commented 6 years ago

OK great! I think it may have been due to switching between the two values of this parameter; in that case you have to completely empty the DB. Thanks!