Samk13 opened 9 months ago
Would love to see this feature. Phi models would be great for this.
As a workaround, I set up https://github.com/g0t4/term-chat-ollama as an intermediate "proxy" that can forward requests to any OpenAI-compatible completions backend, i.e. Ollama, OpenAI, groq.com, etc.
FYI, video overview here: https://youtu.be/-QcSRmrsND0
@dossjj with this, you can use phi3 by setting the endpoint to https://fake.openai.azure.com:5000/answer?model=phi3
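To make the workaround concrete, here is a minimal sketch of what an OpenAI-compatible chat-completion request to a local backend looks like. It assumes Ollama's default port (11434) and its OpenAI-compatible `/v1/chat/completions` path; the model name `phi3` and the prompt are just examples. The request is built but not sent, since sending requires a running local server.

```python
import json
import urllib.request

# Assumption: Ollama serves an OpenAI-compatible chat-completions API
# at this URL by default.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_chat_request(url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat-completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    req = build_chat_request(OLLAMA_URL, "phi3", "Explain the flags in `tar -xzf`.")
    # Actually sending it needs a running Ollama instance, so that step
    # is left commented out:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
    print(req.full_url)
```

Because the wire format is the same for any OpenAI-compatible backend, only the URL and model name change when pointing the same code at OpenAI, Groq, or the proxy above.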
Description of the new feature/enhancement
The Windows Terminal Chat currently only supports Azure OpenAI Service. This restriction limits developers who work with or are developing their own local Large Language Models (LLMs), or who use tools such as Ollama and need to interface with them directly within the Terminal. The ability to connect to a local LLM service would allow for better flexibility, especially for those concerned with privacy, working offline, or dealing with sensitive information that cannot be sent to cloud services.
Proposed technical implementation details (optional)
Include functionality to support local LLM services by allowing users to configure a connection to local AI models, e.g. a custom endpoint URL and model name.
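A hypothetical settings fragment can illustrate the idea. None of these keys exist in Windows Terminal today; the key names, the Ollama URL, and the model name are all assumptions sketched for discussion:

```json
{
  "experimental.chat": {
    "endpoint": "http://localhost:11434/v1/chat/completions",
    "model": "phi3",
    "apiKey": ""
  }
}
```

Leaving `apiKey` optional would cover local backends that require no authentication while keeping the same shape for hosted OpenAI-compatible services.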
Thanks!