Closed ghost closed 1 year ago
I thought "gpt-3.5-turbo-0613" was a streaming-compatible model? But after I modified it, a new problem appeared.
Thanks for your contributions, we'll be closing this issue as it has gone stale. Feel free to reopen if you'd like to continue the discussion.
Bug: Streaming response not working for model "gpt-3.5-turbo-0613"
Git SHA: 6a436d822c4ed0f1c66e83f5b5a1ba06c8e85a6e
Operating System:
Windows 11, No WSL / No WSL2
Docker and Docker Compose versions:
Supabase version and setup:
Detailed steps to reproduce the issue:
Run the project using Docker.
Make a chat request using model "gpt-3.5-turbo-0613". Observe that the streaming response is not working as expected.
Problem Description:
In the backend routes file backend/routes/chat_route.py, the if condition that checks whether chat_question.model is present in the streaming_compatible_models array does not appear to include "gpt-3.5-turbo-0613".
Code Snippets:
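The following is a hypothetical sketch of the check described above, not the actual code from chat_route.py. The names chat_question.model and streaming_compatible_models come from the issue text; the list contents and the helper functions are assumptions made for illustration.

```python
# Hypothetical reconstruction of the membership check described in the issue.
# The list contents below are assumed; the real list lives in chat_route.py.
streaming_compatible_models = [
    "gpt-3.5-turbo",
    "gpt-4",
]

def is_streaming_compatible(model: str) -> bool:
    # Exact-match lookup: a dated snapshot such as "gpt-3.5-turbo-0613"
    # is not in the list, so streaming is silently disabled for it.
    return model in streaming_compatible_models

# One possible fix (an assumption, not the project's actual patch):
# also accept dated snapshots by matching on the base model name.
def is_streaming_compatible_fixed(model: str) -> bool:
    return any(
        model == m or model.startswith(m + "-")
        for m in streaming_compatible_models
    )
```

With this sketch, is_streaming_compatible("gpt-3.5-turbo-0613") returns False (reproducing the reported symptom), while the prefix-based variant accepts both the base model and its dated snapshots.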
This issue may cause the system to malfunction when using the mentioned model. Any feedback or suggested fixes would be appreciated.