sparckles / Robyn

Robyn is a Super Fast Async Python Web Framework with a Rust runtime.
https://robyn.tech/
BSD 2-Clause "Simplified" License

Add support for yields #527

Open sansyrox opened 1 year ago

sansyrox commented 1 year ago

Issue:

Our current system does not support the use of multiple yield statements within a single function, thereby limiting execution flow control. This becomes especially important when we wish to have execution spanning across multiple requests, which can't be achieved with our existing setup.

Expected Behavior:

A function should be able to include multiple yield statements, with each yield representing a pause point in execution until the next request triggers the continuation of execution from where it left off.

Steps to Reproduce:

1. Write a function that includes multiple yield statements.
2. Attempt to execute the function across multiple requests.
3. Observe that the function does not resume execution from the last yield upon a new request.

Suggested Solution:

Revise the system to support the use of multiple yield statements within a function, where each yield acts as a checkpoint. When a new request is made, the function should continue its execution from the last yield.

This will require modification to the function execution flow and potentially the addition of a mechanism to store the state of a function between requests.
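
For illustration, a handler of the kind described might look something like this. This is hypothetical behaviour, not something Robyn supports today; the @app.get decorator is real Robyn API, but the resume-on-next-request semantics are exactly the feature being requested:

from robyn import Robyn

app = Robyn(__file__)

# Hypothetical: each yield would answer one request, and the next request
# to "/steps" would resume the handler from where it paused.
@app.get("/steps")
def stepwise_handler():
    yield "response to request 1"
    yield "response to request 2"
    yield "response to request 3"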

sansyrox commented 5 months ago

Hey @iiian πŸ‘‹

Thank you for your interest. I should have been clearer: I actually wanted to support yields in regular HTTP requests using HTTP/2 features, but I am unsure if actix supports HTTP/2 at the moment πŸ˜…

iiian commented 5 months ago

@sansyrox I have some suggestions for this ticket that I think will push the conversation in the right direction. These suggestions are the result of a few weeks of investigation.

You said:

Our current system does not support the use of multiple yield statements within a single function, thereby limiting execution flow control. This becomes especially important when we wish to have execution spanning across multiple requests, which can't be achieved with our existing setup.

again, for emphasis:

we wish to have execution spanning across multiple requests

I think the real ask here, then, is for a good multi-request/response API in route handlers for Robyn. I don't believe generators are the best fit for such an API, because when you try to extend generators to support arbitrary multi-request/response scenarios, an impedance mismatch rapidly develops that results in a poor developer experience [2]. The scope of generator use could be restricted to a trivial subset of what is possible, but existing techniques already cover those trivial cases, and none of them require yielding directly from route handlers [3].
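
To make the impedance mismatch concrete: a plain Python generator ties both directions of communication to the same yield expression, so each yield simultaneously emits a value and waits for the next value passed in via send(). A minimal illustration in standard Python, nothing Robyn-specific:

def handler():
    # Each yield both sends a "response" and receives the next "request";
    # the two directions are coupled to the same statement.
    first = yield "what's your name?"
    second = yield f"hi {first}, what's your favourite colour?"
    yield f"{first} likes {second}"

gen = handler()
print(next(gen))          # "what's your name?" (the generator must be primed first)
print(gen.send("Ada"))    # "hi Ada, what's your favourite colour?"
print(gen.send("green"))  # "Ada likes green"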

Instead, a novel multi-requesting API might be send/recv channels as functions provided to async route handlers [1], something like:

@app.get("/", multi_request=True)
async def my_multi_request_handler(send, recv):
    # recv ingests an arbitrary number of requests
    (rq1, id1) = await recv()
    (rq2, id2) = await recv()

    # send takes a response and an id; responses can be sent in a
    # different order than they were received
    await send(ORJSONResponse(...), id2)
    await send(ORJSONResponse(...), id1)

And of course a send/recv-style API lends itself naturally to being plugged into HTTP/2 binary frame streams under the hood.

For brevity's sake, I've omitted a write-up demonstrating the pain of using yields to try to achieve generalized multi-requesting, but if you'd like to see one, I can happily share!


[1] As it turns out, async/await in Python is actually built on top of generators! https://tenthousandmeters.com/blog/python-behind-the-scenes-12-how-asyncawait-works-in-python/

[2] Suffice it to say that the main problem with yield is that it's a cumbersome developer experience to ensure that new requests aren't lost between yield checkpoints, due to the bidirectional nature of yields.

[3] It might also be worth considering something like a StreamingResponse akin to FastAPI/Starlette, where we wrap a generator and simply return the response. https://fastapi.tiangolo.com/advanced/custom-response/#streamingresponse
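
For reference, the FastAPI pattern mentioned in [3] looks roughly like this: an ordinary generator is wrapped in a StreamingResponse and its chunks are written to a single response body. This sketch uses FastAPI's documented API and is not Robyn-specific:

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

def number_stream():
    # An ordinary generator: each yielded chunk is streamed to the client
    # incrementally, but all within one request/response cycle.
    for i in range(5):
        yield f"chunk {i}\n"

@app.get("/stream")
def stream():
    return StreamingResponse(number_stream(), media_type="text/plain")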

sansyrox commented 5 months ago

Hey @iiian πŸ‘‹

Apologies for the delayed reply; I was away for the past month with limited access. Most of these suggestions are great points.

I do have some follow-up questions. Let me draft them properly and get back to you. πŸ˜„