Hey folks, first off, great work on this little library - I got a small ETL + indexing pipeline working with FastAPI + some workers pretty quickly. I was looking for a way to keep track of job states and noticed there's no database integration, or even a call to the Cloud Tasks API to poll for job started/running/success/failed states (e.g. like in Celery).
How are you folks doing it? I'm thinking of maybe just assigning deterministic `task_id`s to the tasks and then adding an API that calls the GCP Cloud Tasks API to check task status, which I can then poll from my app (or wherever). The other (less hacky) way, I think, is to just add Redis/Postgres and track job state there.
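For concreteness, here's roughly what I mean by the first option - just a sketch, assuming the official `google-cloud-tasks` client and made-up project/location/queue names. One caveat: as far as I understand, Cloud Tasks deletes a task once it succeeds (or exhausts retries), so "not found" doesn't cleanly distinguish finished from failed-and-gone, which is partly why I'm not sure this is the right approach:

```python
# Minimal sketch of the "poll Cloud Tasks directly" idea.
# PROJECT / LOCATION / QUEUE are hypothetical; swap in your own values.
from fastapi import FastAPI
from google.api_core.exceptions import NotFound
from google.cloud import tasks_v2

app = FastAPI()
client = tasks_v2.CloudTasksClient()

PROJECT, LOCATION, QUEUE = "my-project", "us-central1", "etl-queue"


@app.get("/jobs/{task_id}")
def job_status(task_id: str) -> dict:
    # Deterministic task_id -> fully qualified Cloud Tasks resource name.
    name = client.task_path(PROJECT, LOCATION, QUEUE, task_id)
    try:
        task = client.get_task(name=name)
    except NotFound:
        # Cloud Tasks removes a task after it succeeds (or runs out of retries),
        # so NOT_FOUND is ambiguous: finished, expired, or never created.
        return {"task_id": task_id, "state": "finished_or_unknown"}
    # A task that still exists is either waiting to be dispatched or retrying.
    state = "retrying" if task.dispatch_count > 0 else "queued"
    return {"task_id": task_id, "state": state, "dispatch_count": task.dispatch_count}
```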