Unskilledcrab / chain-generator


Allow for locally hosted LLM #4

Open Unskilledcrab opened 5 months ago


Overview

Currently, the chat session service uses OpenAI exclusively for chat. This was done to get a quick prototype up and running for the project, but ideally we should be able to use other LLMs.

This task is about coming up with a plan to abstract the chat session logic so that we can choose between OpenAI's ChatGPT and a locally hosted LLM such as Ollama. A rough sketch of what that abstraction could look like is included below.
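
To make the direction concrete, here is one possible shape for the abstraction. This is not existing project code: the class names (`ChatClient`, `OpenAIChatClient`, `OllamaChatClient`), the default model names, and the use of Python with `requests` are all illustrative assumptions; the idea is just that the chat session service depends only on the interface and an implementation is picked via configuration.

```python
# Hypothetical sketch, not the project's actual code.
from abc import ABC, abstractmethod

import requests


class ChatClient(ABC):
    """Provider-agnostic chat interface the session service would depend on."""

    @abstractmethod
    def send(self, messages: list[dict]) -> str:
        """Send a list of {"role": ..., "content": ...} messages, return the reply text."""


class OpenAIChatClient(ChatClient):
    """Implementation backed by OpenAI's chat completions endpoint."""

    def __init__(self, api_key: str, model: str = "gpt-3.5-turbo"):
        self.api_key = api_key
        self.model = model

    def send(self, messages: list[dict]) -> str:
        resp = requests.post(
            "https://api.openai.com/v1/chat/completions",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"model": self.model, "messages": messages},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]


class OllamaChatClient(ChatClient):
    """Implementation backed by a locally hosted Ollama server."""

    def __init__(self, base_url: str = "http://localhost:11434", model: str = "llama3"):
        self.base_url = base_url
        self.model = model

    def send(self, messages: list[dict]) -> str:
        resp = requests.post(
            f"{self.base_url}/api/chat",
            json={"model": self.model, "messages": messages, "stream": False},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["message"]["content"]
```

With something along these lines, swapping providers becomes a configuration choice rather than a code change, which is the property the design doc should aim for.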

Exit Criteria

A design doc will be created, reviewed, and accepted.