Koed00 / django-q

A multiprocessing distributed task queue for Django
https://django-q.readthedocs.org
MIT License
1.84k stars 289 forks

django-q is constantly hitting redis server - but using orm as Broker ? #635

Open abadger1 opened 2 years ago

abadger1 commented 2 years ago
1639082983.648143 [0 127.0.0.1:38830] "GET" ":1:django_q:Visitor Express:cluster"
1639082983.648465 [0 127.0.0.1:38830] "SET" ":1:django_q:Visitor Express:cluster" "\x80\x04\x95\x95\x00\x00\x00\x00\x00\x00\x00]\x94(\x8cEdjango_q:Visitor Express:cluster:b2203f69-3c5e-4884-9155-74777b53b942\x94\x8cEdjango_q:Visitor Express:cluster:71398713-e5b8-4a40-8561-c688f44f3097\x94e." "PX" "300000"
1639082983.648709 [0 127.0.0.1:38830] "SET" ":1:django_q:Visitor Express:cluster:71398713-e5b8-4a40-8561-c688f44f3097" "\x80\x04\x95C\x02\x00\x00\x00\x00\x00\x00X<\x02\x00\x00.eJxVUMlOIzEQTaazgwBpEPAJQaAIjtxGggs9B9YcUctJm8SkY3fssiCRRhqEWIR8SyF-JV_Af3DgD7ixlgNI4MNzyeWqt_zP3zxlM-PjpuMjJlsq6tUMMLAGXW6PChzi4in-w6orHivd4drgAVbD5-18Jnzx8OrhzUN2h-CXh8BDnoC7AFQDXSlmwEF0-fdyiOuV4sNkeXZ5Lls5I760DwO6ovr-uqfdxQvcRTeluZBNpiUDoaTBvxkXpCLGcESkrtJMrAGuI3pxOWvHV72-ufEl3AVCAl7P3K5sPV7dsfM_4ejh9-VazzRcyXAJQvIEwye_q_DlfP4ji5oFkZjaoZVNT80SdLNRwgb9KNXqpB9ZmYpmJ_FOqm7hxwxoJk0yVoyu2OIA_MRn6XKbMQ2ce2GlhhUJ8RNhYEBTF7zfiVhJHvUiIwbcm821lQFaQuJaQrao6iopQGkMCxSxmwBmOt_-F1Jr2py6Jd8t-6hptJtiu_KZ9-pS-b6O7Wkfr23U3gEz4MMk:1mvQMJ:rGebQh9m0ZS5vRoTlK3n5UKz6if0ROB473IYgVZi_GM\x94." "PX" "3000"
1639082984.150224 [0 127.0.0.1:38830] "GET" ":1:django_q:Visitor Express:cluster"
1639082984.150595 [0 127.0.0.1:38830] "SET" ":1:django_q:Visitor Express:cluster" "\x80\x04\x95\x95\x00\x00\x00\x00\x00\x00\x00]\x94(\x8cEdjango_q:Visitor Express:cluster:b2203f69-3c5e-4884-9155-74777b53b942\x94\x8cEdjango_q:Visitor Express:cluster:71398713-e5b8-4a40-8561-c688f44f3097\x94e." "PX" "300000"
1639082984.150897 [0 127.0.0.1:38830] "SET" ":1:django_q:Visitor Express:cluster:71398713-e5b8-4a40-8561-c688f44f3097" "\x80\x04\x95?\x02\x00\x00\x00\x00\x00\x00X8\x02\x00\x00.eJxVUMtKQzEQbb19WUUFRf2EClJ06U7QhV4XPosruaS9sY29TW6TCdqCoIgPJDtH_BW_wD_xD9z5nFQFzeJkyGTmPM7y96_ZzOC48fiIyaaKulUDDKxBl9ulAu9w7hxPseKKx0q3uTZ4gJXwbSufCd89fHj49JDdJhjyEHjIE3AXgKqjK8UMOIgO_1ve4Uq5-Dw6PDU_nS1fEF_agz5dUW1vxdPu4BXuoBvTXMgG05KBUNLgRsYFqYgxfCRSV24k1gDXEb24nLWDq1ZbX_0V7gIhAW8nHhY2X26e2OVy-Pg8eb3UNXVXMlyCkDzB8NXvKvw6n_nOompBJKZ6aGXDU7ME3VSUsH4vSrU66UVWpqLRTryTipv9NwOaSZMMFKMrNjkAP_FZutx6TAOXXlipbkVC_EQYGNDUBe93JFaSR93IiD73ZnMtZYCWkLimkE2qOkoKUBrDAkXsRoCZ9p__hdSaFqduyXeHfdQ02kmxVf7Je3F-aG0fW-M-XluvfgEsfcKH:1mvQMK:2FB6lnHgpHGO3aZCutJ3P7P85NBh7ZaE8TPlyjSISkk\x94." "PX" "3000"
1639082984.652708 [0 127.0.0.1:38830] "GET" ":1:django_q:Visitor Express:cluster"
1639082984.653043 [0 127.0.0.1:38830] "SET" ":1:django_q:Visitor Express:cluster" "\x80\x04\x95\x95\x00\x00\x00\x00\x00\x00\x00]\x94(\x8cEdjango_q:Visitor Express:cluster:b2203f69-3c5e-4884-9155-74777b53b942\x94\x8cEdjango_q:Visitor Express:cluster:71398713-e5b8-4a40-8561-c688f44f3097\x94e." "PX" "300000"
1639082984.653283 [0 127.0.0.1:38830] "SET" ":1:django_q:Visitor Express:cluster:71398713-e5b8-4a40-8561-c688f44f3097" "\x80\x04\x95=\x02\x00\x00\x00\x00\x00\x00X6\x02\x00\x00.eJxVUMtKQzEQbb19Kyoo6idUkKJLd4JuvC58dimXtDe2sbfJbTJBWxAU8YFk54i_4hf4J_6AuPM5qQqaxcmQycx5nObvXrOZ4XET8SGTLRX1agYYWIMut0sF3uL8GZ5g1RWPlO5wbXAfq-HbVj4Tvnv48PDpIbtNMOIh8JAn4C4A1UBXihlwEF3-t7zF1Urxaaw8vTCTrZwTX9qHAV1RfW_V0-7gJe6gG9dcyCbTkoFQ0uBGxgWpiDF8IFJXaSbWANcRvbictcOrXl9f-xXuAiEBbybvFzdfrh_ZxUr48DR1tdwzDVcyXIKQPMHw1e8q_Dqf_c6iZkEkpnZgZdNTswTddJSwQT9KtTruR1amotlJvJOqm_s3A5pJkwwVoyu2OAA_9lm63HpMAxdeWKlhRUL8RBgY0NQF73c0VpJHvciIAfdmc21lgJaQuJaQLaq6SgpQGsMCRexGgZnOn_-F1Jo2p27Jd8s-ahrtptiu_OS9tFB-Psf2hI_XNmpfNrTDZA:1mvQMK:XUg57Nll456nwWBdAofHc4WlKNr71t1Mta7Su2U3CDk\x94." "PX" "3000"
1639082985.154894 [0 127.0.0.1:38830] "GET" ":1:django_q:Visitor Express:cluster"
1639082985.155240 [0 127.0.0
abadger1 commented 2 years ago

https://giters.com/Koed00/django-q/issues/359

That issue seems to say this is okay? But it seems so strange that Redis is being hammered with these 0.2-second polls for no apparent reason.

pysean3 commented 2 years ago

Are you sure it's not just the monitoring functionality doing its thing? I'll assume you have Redis set up as your app's cache. Looking at the keys in the log above, with both GET and SET and cluster in the key name, I'd wager that's what it is. You could look through the source code and find where that cache key is generated. Or, disable your app's cache and see if it stops.

abadger1 commented 2 years ago

Yes, Redis is in place as the app's cache - it's used on one DRF query to speed it up.

Why is this related to Django-Q, which is using the ORM 'default' broker - i.e. Django and a PostgreSQL back end in my setup?

Why is it monitoring Redis, when Redis is not being used by Django-Q? I think I'm missing something in my understanding of the 'orm': 'default' setting.
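For context, a minimal Q_CLUSTER configuration using the Django ORM as the broker might look like the sketch below. The worker count and poll interval here are illustrative (django-q polls the broker roughly every 0.2 seconds by default); the cluster name matches the one visible in the cache keys in the log above:

```python
# settings.py (sketch) -- illustrative values, not taken verbatim from this thread
Q_CLUSTER = {
    "name": "Visitor Express",  # cluster name, as seen in the cache keys above
    "workers": 4,               # number of worker processes (illustrative)
    "orm": "default",           # use the 'default' Django database as the broker
    "poll": 0.2,                # broker polling interval in seconds
}
```

With 'orm': 'default', the task queue itself lives in a database table, so no Redis broker is involved in moving tasks.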

pysean3 commented 2 years ago

Two concepts going on: 1) The cluster is the worker process that picks up tasks and processes them. 2) The broker communicates those jobs between your app and the cluster.

With that said:

A) With the ORM 'default' broker, a Postgres table is used to transfer those messages between your app and the cluster. Your app writes a task to the Postgres table; the cluster picks it up from the same table and processes it. This polling happens every 0.2 seconds, or whatever your setting is configured to.

B) Cluster monitoring is independent of which broker is used. It makes sure tasks are being processed, processes are running successfully, etc. The monitoring for the cluster uses your app's cache to do so. If you were using the database backend for caching, that monitoring traffic would also be hitting Postgres. However, you're using Redis for caching, so the cluster monitoring uses Redis. In other words, the cluster is monitoring itself and storing its statistics in Redis; it's not monitoring Redis itself.
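To illustrate point B: which store the monitoring stats land in is decided entirely by Django's CACHES setting, not by the broker choice. A sketch, assuming Django 4.0+'s built-in Redis cache backend (the backend paths and the cache table name are standard Django conventions, not taken from this thread):

```python
# settings.py (sketch)

# With this, cluster monitoring writes its stats to Redis --
# producing GET/SET traffic like the redis-cli trace above.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
    }
}

# With this instead, the same monitoring traffic would hit the database
# (after creating the table with `python manage.py createcachetable`):
# CACHES = {
#     "default": {
#         "BACKEND": "django.core.cache.backends.db.DatabaseCache",
#         "LOCATION": "my_cache_table",  # hypothetical table name
#     }
# }
```

The `:1:django_q:...` prefix on the keys in the log is Django's own cache-key versioning, another hint that this traffic goes through the Django cache framework rather than a broker connection.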

The monitoring process is covered in the docs. Basically, everything described here hits your app's cache, which in your case is Redis: https://django-q.readthedocs.io/en/latest/monitor.html

On a side note, if Redis is "spinning hard" from being hit once every 0.2 seconds, something is seriously wrong with your Redis setup.