Closed mmzeynalli closed 1 week ago
If you want to run tasks on the FastAPI server, you can start a receiver with `run_receiver_task` from `taskiq.api.receiver`. There is also a scheduler.
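Starting the receiver from code boils down to launching a long-lived background task when the app starts and cancelling it on shutdown (with taskiq the call would be something like `run_receiver_task(broker)` inside FastAPI's lifespan). A stdlib-only sketch of that pattern, with all names illustrative:

```python
import asyncio

async def receiver_loop(queue: asyncio.Queue, handled: list) -> None:
    # Stand-in for taskiq's run_receiver_task: pull messages and process them.
    while True:
        task_name = await queue.get()
        handled.append(task_name)
        queue.task_done()

async def main() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    handled: list = []
    # On startup (lifespan enter): launch the receiver in the background.
    receiver = asyncio.create_task(receiver_loop(queue, handled))
    await queue.put("my_task")
    await queue.join()  # wait until the enqueued message is processed
    # On shutdown (lifespan exit): cancel the background receiver.
    receiver.cancel()
    return handled

print(asyncio.run(main()))  # ['my_task']
```

In a real app the startup/shutdown halves would live in the `lifespan` context manager passed to `FastAPI(...)`.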
I see. Is it recommended to run either the worker or the scheduler through this API? The worker's API file at least discourages using it programmatically, but nothing is mentioned in the scheduler's API file.
I wouldn't recommend running the scheduler from code either, because with FastAPI you can run multiple uvicorn workers, which would result in multiple schedulers running at the same time.
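The duplication problem above is easy to see in miniature: if every uvicorn worker process starts its own scheduler over the same schedule, each scheduled task fires once per worker. A toy illustration (names are made up):

```python
fired = []

def start_worker(worker_id: int, schedule: list) -> None:
    # Each worker naively runs its own scheduler over the shared schedule,
    # so every worker fires every entry.
    for task in schedule:
        fired.append((worker_id, task))

schedule = ["send_email"]
for worker_id in range(3):  # e.g. uvicorn --workers 3
    start_worker(worker_id, schedule)

print(len(fired))  # 3 -> the task runs three times instead of once
```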
Hello,
You have to run the worker and scheduler processes separately.
Since you are using Redis to enqueue your tasks anyway, you can run a separate container/process for each part. That lets you scale these processes individually.
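One way to lay that out is a compose file with one service per process; module paths and service names below are examples, not taskiq requirements:

```yaml
services:
  api:
    build: .
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000
  worker:
    build: .
    command: taskiq worker app.tasks:broker
  scheduler:
    build: .
    command: taskiq scheduler app.tasks:scheduler
  redis:
    image: redis:7
```

All three app services share one image and talk to the same Redis, so the worker and scheduler can be scaled independently of the API.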
Yep, got it. I was just wondering whether it would be possible to run taskiq in the same Docker container as FastAPI.
Basically, I need to implement the equivalent of Celery's `apply_async` (with the `eta` argument), since I know in advance when the task should run. However, I struggled to make it work. Here is my example:

When I call `schedule_by_time`, the task is registered in Redis and sits in the queue with the correct data, but when the time comes, it never executes. From the documentation, I understood that I need to run `taskiq scheduler module:scheduler`, but I am not using taskiq standalone; it is part of FastAPI. Do I need to create a new Docker container for taskiq and run that command there, or could it be integrated with FastAPI?
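The likely missing piece: `schedule_by_time` only *records* the schedule entry in Redis; nothing fires it until a scheduler process polls the source and dispatches entries whose time has come, which is exactly what `taskiq scheduler` does. A stdlib-only sketch of that polling idea (all names illustrative, not taskiq's internals):

```python
import asyncio
import time

async def scheduler_loop(schedule: list, results: list) -> None:
    # Stand-in for the `taskiq scheduler` process: repeatedly poll the
    # schedule source and fire entries whose time has come.
    while schedule:
        now = time.monotonic()
        for entry in [e for e in schedule if e[0] <= now]:
            schedule.remove(entry)
            results.append(entry[1])  # "send" the task to the broker
        await asyncio.sleep(0.01)

async def main() -> list:
    results: list = []
    # Equivalent of apply_async(eta=...): register a task to run ~50ms from now.
    schedule = [(time.monotonic() + 0.05, "send_email")]
    await scheduler_loop(schedule, results)
    return results

print(asyncio.run(main()))  # ['send_email']
```

Without some process running this loop, entries sit in Redis forever, which matches the "registered but never executes" symptom.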