sparckles / Robyn

Robyn is a Super Fast Async Python Web Framework with a Rust runtime.
https://robyn.tech/
BSD 2-Clause "Simplified" License

Add support for background tasks #548

Open sansyrox opened 1 year ago

sansyrox commented 1 year ago

Investigate the various libraries available for the same.

18o commented 1 year ago

We could build something like Celery and integrate it with Robyn. A sketch of that idea follows below.
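
For illustration, a minimal sketch of what a Celery-based integration could look like, assuming a Redis broker is available; the broker URL, the process_payload task, and the handler are hypothetical:

from celery import Celery
from robyn import Robyn, Request

# Hypothetical broker URL; any Celery-supported broker would work.
celery_app = Celery("tasks", broker="redis://localhost:6379/0")
app = Robyn(__file__)

@celery_app.task
def process_payload(payload):
    # Long-running work, executed by a separate Celery worker process.
    ...

@app.post("/fun")
async def fun(request: Request):
    # Enqueue the job and return immediately; a Celery worker picks it up.
    # request.body is assumed to hold the raw payload.
    process_payload.delay(request.body)
    return "OK"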

GRBurst commented 9 months ago

Is there any news on this?

Do I understand correctly that this would let us send a response for a request immediately, like an ACK confirming receipt, while a job keeps running in the background? This would be useful for using Robyn with services like AWS SNS, which enforce a 15-second response timeout even though a task may take longer to execute.

Would this enable use cases like:

@app.post("/fun")
async def fun(request: Request) -> Response:
    run_my_background_task(request)

    # Send ACK for receiving the request immediately, even though run_my_background_task is still running
    return Response(
        status_code=status_codes.HTTP_200_OK,
    )

Right now we work around this by spawning a Thread for every request, which is far from optimal, similar to the code below:

from threading import Thread

@app.post("/fun")
async def fun(request: Request) -> Response:
    # my_fun does the long-running work in a freshly spawned thread
    Thread(target=my_fun, args=[request]).start()

    # Send ACK for receiving the request immediately
    return Response(
        status_code=status_codes.HTTP_200_OK,
    )
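
An alternative workaround, sketched on the assumption that Robyn runs async handlers on an asyncio event loop: hand the blocking my_fun to the loop's default thread-pool executor instead of creating a new Thread per request.

import asyncio

# Keep references so pending futures aren't garbage-collected early.
background_tasks = set()

@app.post("/fun")
async def fun(request: Request) -> Response:
    loop = asyncio.get_running_loop()
    future = loop.run_in_executor(None, my_fun, request)
    background_tasks.add(future)
    future.add_done_callback(background_tasks.discard)

    # Send ACK for receiving the request immediately
    return Response(
        status_code=status_codes.HTTP_200_OK,
    )
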
sansyrox commented 9 months ago

Hey @GRBurst 👋

Thank you for sharing a code sample. 😄 I think we can build a plugin that uses a task queue to help with your project. Do let me know if that sounds beneficial to you 😄
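
As a reference point, here is a very small in-process version of that idea, with hypothetical names (task_queue, worker): a single daemon thread consumes jobs enqueued by handlers. A real plugin would also need startup/shutdown hooks and error handling.

import queue
import threading

task_queue = queue.Queue()

def worker():
    # Run queued (func, args) jobs outside the request/response path.
    while True:
        func, args = task_queue.get()
        try:
            func(*args)
        finally:
            task_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

@app.post("/fun")
async def fun(request: Request) -> Response:
    # Enqueue the job and acknowledge the request immediately.
    task_queue.put((my_fun, (request,)))
    return Response(status_code=status_codes.HTTP_200_OK)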

GRBurst commented 9 months ago

> Hey @GRBurst 👋
>
> Thank you for sharing a code sample. 😄 I think we can build a plugin that uses a task queue to help with your project. Do let me know if that sounds beneficial to you 😄

Hey @sansyrox,

thanks for the fast response, very nice :+1:

I was thinking about using a queue as well. However, ideally these requests should be distributed across the configured processes and workers, and I wasn't sure how to make these things play together. Could you shed some light on whether this is possible with (or without) a plugin, and how? I assume a process pool is used under the hood (I haven't looked into the code yet :-D).

And secondly: Is the goal of this issue / feature request different? In what way does it differ?

Looking forward to hearing from you :-)

sansyrox commented 9 months ago

Hey @GRBurst 👋

No problem 😄

> I was thinking about using a queue as well. However, ideally these requests should be distributed across the configured processes and workers, and I wasn't sure how to make these things play together.

I think this is handled by the message queue libraries themselves, though I'm not 100% sure. If not, we can always add some indexing 😄

> And secondly: Is the goal of this issue / feature request different? In what way does it differ?

Yes, that is the goal! But I don't want to implement a message queue library myself (too much work haha), and I am still deciding whether the built-in task queue should be a simple one or whether we should integrate a fully fledged library.
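
For the fully fledged route, a sketch of how a broker-backed queue such as RQ would distribute jobs: separate worker processes started with `rq worker background` consume from a shared Redis queue, independently of Robyn's own processes and workers. The queue name is made up, and request.body is assumed to be a picklable payload.

from redis import Redis
from rq import Queue

# Shared broker; any number of worker processes can consume this queue.
jobs = Queue("background", connection=Redis())

@app.post("/fun")
async def fun(request: Request) -> Response:
    # Arguments must be picklable, so pass the payload rather than the Request object.
    jobs.enqueue(my_fun, request.body)
    return Response(status_code=status_codes.HTTP_200_OK)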