NicolasLM / spinach

Modern Redis task queue for Python 3
https://spinach.readthedocs.io
BSD 2-Clause "Simplified" License

Restarting the engine unconditionally resets the max_concurrency values #27

Open 0xDEC0DE opened 9 months ago

0xDEC0DE commented 9 months ago

This one is a bit complicated, but:

Preamble

A test case script:

import time
import spinach

# Engine backed by a Redis broker on localhost.
sp = spinach.Engine(spinach.RedisBroker())

# Allow at most 8 concurrent executions of this task.
@sp.task(name='nap', max_retries=1, max_concurrency=8)
def nap(job_id):
    print(f"{job_id:3} Zzzz...")
    time.sleep(5)

# Enqueue 32 jobs in a single batch.
batch = spinach.Batch()
for i in range(32):
    batch.schedule('nap', i)

sp.schedule_batch(batch)

# Start 32 workers, so max_concurrency is the only limiting factor.
sp.start_workers(32, stop_when_queue_empty=True)

Steps to reproduce

1. Run the script above; the queue drains eight jobs at a time.
2. From a separate Redis client, raise the task's max_concurrency value to 32 (see the sketch below).
3. Run the script again.
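
For step 2, a minimal sketch of raising the limit from a separate client, assuming redis-py and a placeholder key name (the key spinach actually uses lives in its own Redis namespace and may differ):

import redis

r = redis.Redis()

# Placeholder key name for the per-task concurrency limit;
# spinach's real key layout may differ.
r.set('spinach/_max_concurrency/nap', 32)
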
Expected result

The second run of the script runs all 32 tasks simultaneously, like we told it to

Actual behavior

The second run of the script processes the queue 8-at-a-time

Miscellany

Any thoughts and insights on how to best address this would be appreciated.

bigjools commented 9 months ago

This is not a bug; it is 100% intentional, and the comments in the set_concurrency_keys.lua script explain why it is done that way.

This is the only place that sets up the key, so naturally it's going to override the current value if one was set from a different client.

I'd advocate for a second key that allows users to set their own limits, effectively overriding the code's own limit. We can amend the Lua script so that it checks that key first and falls back to the real one. Users could also put a TTL on the second key to make the override temporary.
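
A minimal sketch of the user-facing side of that proposal, assuming redis-py and a hypothetical override key name (the real key would follow spinach's namespace conventions):

import redis

r = redis.Redis()

# Hypothetical override key; the amended Lua script would check it
# first and fall back to the engine-managed key when it is absent.
# The TTL (ex=60) makes the override expire after one minute, so the
# code-defined limit applies again afterwards.
r.set('spinach/_max_concurrency_override/nap', 32, ex=60)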

bigjools commented 9 months ago

In fact, I'd say we should add a tool script for this, so that users don't need to know about our Redis keys. Something like spinach max_concurrency myjobname 32 1m to run the myjobname task at a concurrency of 32 for one minute.
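
A rough sketch of what such a tool could look like, reusing the hypothetical override key from above; the real command would presumably hook into spinach's existing CLI and broker configuration rather than talking to Redis directly:

import argparse
import redis

UNITS = {'s': 1, 'm': 60, 'h': 3600}

def parse_duration(text):
    # Hypothetical duration format: "30s", "1m", "2h".
    return int(text[:-1]) * UNITS[text[-1]]

def main():
    parser = argparse.ArgumentParser(prog='spinach')
    parser.add_argument('command', choices=['max_concurrency'])
    parser.add_argument('task_name')
    parser.add_argument('limit', type=int)
    parser.add_argument('duration')
    args = parser.parse_args()

    # Hypothetical override key, expiring after the requested
    # duration so the code-defined limit takes over again.
    redis.Redis().set(
        f'spinach/_max_concurrency_override/{args.task_name}',
        args.limit,
        ex=parse_duration(args.duration),
    )

if __name__ == '__main__':
    main()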