Hi, we'd like to use Chalice for an OpenAI streaming project (streaming results back from the OpenAI SDK using its streaming flag, similar to how ChatGPT streams responses as the AI generates them). It looks like Lambda response streaming is supported in the Node.js runtime, but not yet in the Python runtimes.
There does seem to be a Python API in boto3 to turn this on: `invoke_with_response_stream` on the Lambda client. Is there some limitation that would stop us from using this in Chalice?
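For context, here is a rough sketch of how we'd expect to consume the stream on the client side with boto3's `invoke_with_response_stream`. The function name and payload shape are placeholders; this assumes the target Lambda is configured with `InvokeMode=RESPONSE_STREAM` and that credentials are set up:

```python
import json


def stream_lambda_response(lambda_client, function_name, payload):
    """Invoke a Lambda with response streaming and yield decoded text chunks.

    `lambda_client` is a boto3 Lambda client (or anything exposing the same
    `invoke_with_response_stream` method). `function_name` is a placeholder
    for a function deployed with InvokeMode=RESPONSE_STREAM.
    """
    response = lambda_client.invoke_with_response_stream(
        FunctionName=function_name,
        Payload=json.dumps(payload).encode("utf-8"),
    )
    # The response carries an EventStream; each event holds either a
    # PayloadChunk (a piece of the streamed body) or an InvokeComplete marker.
    for event in response["EventStream"]:
        chunk = event.get("PayloadChunk")
        if chunk is not None:
            yield chunk["Payload"].decode("utf-8")


# Typical use (assumes AWS credentials and a deployed streaming function):
# import boto3
# client = boto3.client("lambda")
# for text in stream_lambda_response(client, "my-streaming-fn", {"prompt": "hi"}):
#     print(text, end="", flush=True)
```
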
thanks!