microsoft / terminal

The new Windows Terminal and the original Windows console host, all in the same place!

Feature Request: 🛠 Support for local LLM tools in Terminal Chat, like Ollama #16471

Open Samk13 opened 9 months ago

Samk13 commented 9 months ago

Description of the new feature/enhancement

Terminal Chat in Windows Terminal currently supports only the Azure OpenAI Service. This restriction limits developers who run or build their own local Large Language Models (LLMs), or who use tools such as Ollama, and want to interface with them directly from the Terminal. The ability to connect to a local LLM service would allow greater flexibility, especially for those concerned with privacy, working offline, or handling sensitive information that cannot be sent to cloud services.

Proposed technical implementation details (optional)

Include functionality to support local LLM services by allowing users to configure a connection to local AI models. This would involve the following (a hypothetical configuration sketch follows the list):

  1. Providing an option in the Terminal Chat settings to specify the endpoint of a local LLM service.
  2. Allowing the user to set the port that the local LLM service listens on for incoming requests.
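
For illustration only, such a configuration might look like the sketch below in settings.json. No such schema exists in Windows Terminal today; every property name here is hypothetical, and the port shown is simply Ollama's default.

```jsonc
{
    // Hypothetical Terminal Chat settings for a local LLM backend.
    // None of these property names exist in Windows Terminal today.
    "chat": {
        "provider": "local",
        "endpoint": "http://localhost",   // base URL of the local LLM service
        "port": 11434                     // e.g. Ollama's default listening port
    }
}
```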

Thanks!

dossjjx commented 3 months ago

Would love to see this feature. Phi models would be great for this.

g0t4 commented 2 months ago

As a workaround, I set up https://github.com/g0t4/term-chat-ollama as an intermediate "proxy" that can forward requests to any OpenAI-compatible completions backend, e.g. Ollama, OpenAI, groq.com, etc.

FYI, video overview here: https://youtu.be/-QcSRmrsND0

@dossjjx with this, you can use phi3 by setting the endpoint to https://fake.openai.azure.com:5000/answer?model=phi3
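
For anyone curious how such a proxy works, here is a minimal sketch in Python, not g0t4's actual implementation: it accepts an OpenAI-style chat-completions POST and forwards the body unchanged to a local OpenAI-compatible backend. The upstream URL is Ollama's documented OpenAI-compatible endpoint; the listen port (5000) is an arbitrary assumption.

```python
import json
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Ollama exposes an OpenAI-compatible chat-completions API at this path.
UPSTREAM = "http://localhost:11434/v1/chat/completions"

class ForwardingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the incoming OpenAI-style JSON payload.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)

        # Forward it unchanged to the upstream backend.
        req = urllib.request.Request(
            UPSTREAM,
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        try:
            with urllib.request.urlopen(req) as resp:
                payload = resp.read()
                self.send_response(resp.status)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(payload)))
                self.end_headers()
                self.wfile.write(payload)
        except urllib.error.URLError as err:
            # Surface upstream failures to the caller as a 502.
            self.send_response(502)
            self.end_headers()
            self.wfile.write(json.dumps({"error": str(err)}).encode())

if __name__ == "__main__":
    # Terminal Chat (or any OpenAI-style client) would point at this port.
    HTTPServer(("127.0.0.1", 5000), ForwardingHandler).serve_forever()
```

A real proxy would also need to translate Azure-specific request paths and auth headers into what the local backend expects, which is the part a project like term-chat-ollama handles.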