ryanrain2016 opened this issue 3 months ago
Sure, this is how RedisList pub/sub works - we consume one message, process it, and then take the next one. We can think about concurrency, but it can lead to undefined consuming logic, so I can't promise this feature is possible.
Anyway, you can consume messages in batches - it should speed up your service.
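The sequential behavior described above (take one message, await the handler, take the next) can be sketched as a minimal asyncio simulation - this is an illustration of the consume loop's shape, not FastStream's actual internals:

```python
import asyncio


async def consume_loop(queue: asyncio.Queue, handler) -> None:
    # Take one message, await the handler, then take the next:
    # a slow handler therefore delays consumption of every later message.
    while True:
        msg = await queue.get()
        if msg is None:  # sentinel to stop the demo
            break
        await handler(msg)


async def main() -> list[str]:
    processed: list[str] = []

    async def handler(msg: str) -> None:
        await asyncio.sleep(0.01)  # simulate work
        processed.append(msg)

    queue: asyncio.Queue = asyncio.Queue()
    for i in range(3):
        queue.put_nowait(f"msg-{i}")
    queue.put_nowait(None)

    await consume_loop(queue, handler)
    return processed


print(asyncio.run(main()))  # messages are handled strictly in order
```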
> Anyway, you can consume messages in batches - it should speed up your service.

Do you mean code like this?
```python
@rb.subscriber(list=ListSub("users", batch=True))
async def my_listener(user: List[User]):
    await asyncio.sleep(3)
    print(user, 'from faststream')

async def producer():
    # await rb.publish_batch(User(name="Bob", age=i), list="users")
    await rb.publish_batch(*[User(name="Bob", age=i) for i in range(10)], list="users")
```
It doesn't work. No messages are consumed, with the output below.
```
2024-06-07 14:24:44,921 INFO     - FastStream app starting...
2024-06-07 14:24:44,921 INFO     - users |            - `MyListener` waiting for messages
2024-06-07 14:24:44,922 INFO     - FastStream app started successfully! To exit, press CTRL+C
```
The messages in Redis look like this:
I found a way: just push the message to taskiq from a FastStream handler. taskiq works just as expected.
I'm not sure why your batch example doesn't work, but mine works as expected:
```python
from faststream import FastStream
from faststream.redis import RedisBroker, ListSub

rb = RedisBroker()
app = FastStream(rb)

@rb.subscriber(list=ListSub("users", batch=True))
async def my_listener(user: list[str]):
    print(user, "from faststream")

@app.after_startup
async def producer():
    await rb.publish_batch(*["bob"] * 10, list="users")
```
With the output:
```
2024-06-07 17:59:34,313 INFO     - FastStream app starting...
2024-06-07 17:59:34,314 INFO     - users |            - `MyListener` waiting for messages
2024-06-07 17:59:34,319 INFO     - FastStream app started successfully! To exit, press CTRL+C
2024-06-07 17:59:34,320 INFO     - users | 34df713e-6 - Received
['bob', 'bob', 'bob', 'bob', 'bob', 'bob', 'bob', 'bob', 'bob', 'bob'] from faststream
2024-06-07 17:59:34,320 INFO     - users | 34df713e-6 - Processed
```
Anyway, I'll add a max_workers option, the same as in the NATS subscriber, to support concurrency.
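A `max_workers`-style option essentially means bounding how many handler invocations may run at once. The idea can be sketched with a semaphore - a minimal asyncio illustration of the concept, not FastStream's implementation:

```python
import asyncio
import time


async def handle(msg: int) -> None:
    await asyncio.sleep(0.05)  # simulate slow work (stands in for the 3 s sleep above)


async def consume_concurrently(messages: list[int], max_workers: int) -> float:
    # At most `max_workers` handlers run at the same time.
    sem = asyncio.Semaphore(max_workers)

    async def guarded(msg: int) -> None:
        async with sem:
            await handle(msg)

    start = time.perf_counter()
    await asyncio.gather(*(guarded(m) for m in messages))
    return time.perf_counter() - start


# Ten 50 ms messages: with 10 workers they overlap (~50 ms total);
# with 1 worker they run back to back (~500 ms total).
elapsed = asyncio.run(consume_concurrently(list(range(10)), max_workers=10))
print(f"{elapsed:.2f}s")
```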
**Describe the bug**
It seems tasks don't run in parallel.

**How to reproduce**
Include source code:

And/Or steps to reproduce the behavior:

1. Run the script above.

**Expected behavior**
Tasks should run in parallel.

**Observed behavior**
Tasks run one after another.