Overview
Currently the chat session service uses only OpenAI for chat. This was done to get a quick prototype up and running, but ideally we should be able to use other LLMs.
This task is to come up with a plan to abstract the chat session logic so that we can choose between OpenAI's ChatGPT and a locally hosted LLM such as one served by Ollama.
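To make the proposal concrete, here is a minimal sketch of what the abstraction could look like: a common backend interface that the chat session service depends on, with one implementation per provider selected by configuration. All names here (ChatBackend, make_backend, etc.) are hypothetical and would be settled in the design doc; the real implementations would call the OpenAI API and a local Ollama server rather than returning stub strings.

```python
from abc import ABC, abstractmethod

class ChatBackend(ABC):
    """Hypothetical interface the chat session service would code against."""

    @abstractmethod
    def complete(self, messages: list[dict]) -> str:
        """Return the assistant reply for a list of chat messages."""

class OpenAIBackend(ChatBackend):
    def complete(self, messages: list[dict]) -> str:
        # Real version would call the OpenAI chat completions API.
        return "openai:" + messages[-1]["content"]

class OllamaBackend(ChatBackend):
    def complete(self, messages: list[dict]) -> str:
        # Real version would POST to a locally running Ollama server.
        return "ollama:" + messages[-1]["content"]

def make_backend(name: str) -> ChatBackend:
    """Pick a backend from configuration (e.g. an env var or config file)."""
    backends = {"openai": OpenAIBackend, "ollama": OllamaBackend}
    return backends[name]()

# Usage: the session service only ever sees the ChatBackend interface.
backend = make_backend("ollama")
reply = backend.complete([{"role": "user", "content": "hello"}])
```

The key design choice this sketch highlights: the session service holds a ChatBackend, not an OpenAI client, so adding another provider later means adding one class and one factory entry.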
Exit Criteria
A design doc will be created, reviewed, and accepted.