draupner1 / oai

ChatGPT in the CLI, using OpenAI's API
Apache License 2.0

Any chance you can add Ollama or Mozilla llamafile support? #8

Closed by neural-loop 3 months ago

neural-loop commented 8 months ago

There are now multiple local models we can run. I thought I'd see whether it might be possible to integrate with some of these.
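One reason this is practical: both Ollama and llamafile expose OpenAI-compatible HTTP endpoints, so an existing OpenAI client can often be pointed at a local server just by overriding the base URL. A minimal sketch, assuming the `openai` Python package (v1.x), an Ollama server on its default port 11434, and a model such as `llama3` already pulled (llamafile serves a similar endpoint, by default on port 8080):

```python
# Sketch: reuse the standard openai client against a local Ollama server
# via its OpenAI-compatible endpoint. Model name and ports are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",                      # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```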

draupner1 commented 8 months ago

Interesting. We need a way to configure the feature set for each model (right now it is defined by the OpenAI API). I was thinking along the lines of creating a new GOI for the Google Gemini API, but maybe not?

Secondly, we need a way to connect to a locally running model (so, a resources/Loc-uit.py, and also a command in oai for switching from the OpenAI API to a local one), where the local model could be any of a couple of different ones.

There are some challenges in not ruining ease of use.

Not a finished thought: the application might need yet another hierarchical level to handle many different models, with different functions and message formats. I will need some time to try these two out...
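As a rough illustration of what that extra level could look like, here is a hypothetical sketch of a small backend interface that hides each provider's message format, with a registry a CLI command could use to switch between them. The names (`Backend`, `OpenAIBackend`, `OllamaBackend`) are illustrative only, not the actual structure of oai:

```python
# Hypothetical backend abstraction: one chat() interface, two providers.
from abc import ABC, abstractmethod
import json
import urllib.request


class Backend(ABC):
    @abstractmethod
    def chat(self, messages: list[dict]) -> str:
        """Send a list of {'role', 'content'} messages and return the reply text."""


class OpenAIBackend(Backend):
    def __init__(self, api_key: str, model: str = "gpt-4o-mini"):
        from openai import OpenAI
        self.client = OpenAI(api_key=api_key)
        self.model = model

    def chat(self, messages: list[dict]) -> str:
        resp = self.client.chat.completions.create(model=self.model, messages=messages)
        return resp.choices[0].message.content


class OllamaBackend(Backend):
    def __init__(self, model: str = "llama3", host: str = "http://localhost:11434"):
        self.model, self.host = model, host

    def chat(self, messages: list[dict]) -> str:
        # Ollama's native chat endpoint; stream=False returns a single JSON reply.
        payload = json.dumps(
            {"model": self.model, "messages": messages, "stream": False}
        ).encode()
        req = urllib.request.Request(
            f"{self.host}/api/chat", data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["message"]["content"]


# A CLI "switch" command could simply pick a backend by name.
BACKENDS = {"openai": OpenAIBackend, "ollama": OllamaBackend}
```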

draupner1 commented 3 months ago

The new release, v0.7.0, now has support for Ollama integration.
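For anyone trying this out, a quick way to confirm that a local Ollama server is reachable (and see which models have been pulled) is its `/api/tags` endpoint. This is only a sketch of the server the integration talks to, assuming the default port; it does not show oai's own commands:

```python
# List locally pulled Ollama models to verify the server is up.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = json.loads(resp.read())["models"]

for m in models:
    print(m["name"])  # e.g. "llama3:latest"
```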