Open Armandeus66 opened 2 months ago
Hey,
I'm not a maintainer, but from what I can see you are suggesting creating a common interface for AI chatbots other than ChatGPT?
I've looked into Ollama, and connecting to it is as straightforward as it gets, since Ollama offers a REST API for querying whatever models you prefer to run, the same way you connect to ChatGPT.
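For reference, here is a sketch of what talking to Ollama's local REST API could look like. The endpoint (`http://localhost:11434/api/generate` by default) and the payload shape follow Ollama's documented API; the model name is just a placeholder:

```typescript
// Sketch of a minimal Ollama client. The default local endpoint is assumed;
// the model name below is a placeholder, not a recommendation.
const OLLAMA_URL = "http://localhost:11434/api/generate";

interface OllamaRequest {
  model: string;
  prompt: string;
  stream: boolean; // false = one JSON response instead of a token stream
}

// Pure helper: builds the request body so it can be inspected or reused.
function buildOllamaRequest(model: string, prompt: string): OllamaRequest {
  return { model, prompt, stream: false };
}

// Fires the actual HTTP call; the response JSON carries the text in `response`.
async function queryOllama(model: string, prompt: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildOllamaRequest(model, prompt)),
  });
  const data = await res.json();
  return data.response;
}
```

So the HTTP side really is trivial; the work is on the plugin side, not the transport.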
From what I can see, the modification on the RPG Manager side would require creating a generalized version of the ChatGPTService, which is the "hard" part, since the way ChatGPT works is already very specific.
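To make that "generalized ChatGPTService" idea concrete, one possible shape is a small provider interface that both an OpenAI-backed and an Ollama-backed service could implement. All of these names are hypothetical; none of them exist in the RPG Manager codebase today:

```typescript
// Hypothetical common interface; the names here are made up for illustration.
interface LlmProvider {
  readonly name: string;
  generate(prompt: string): Promise<string>;
}

// An OpenAI-backed provider would wrap the existing ChatGPT logic...
class OpenAiProvider implements LlmProvider {
  readonly name = "openai";
  constructor(private apiKey: string, private model: string) {}
  async generate(prompt: string): Promise<string> {
    // ...call OpenAI's chat completions endpoint with this.apiKey...
    throw new Error("not implemented in this sketch");
  }
}

// ...while a local provider would only need a base URL, no API key.
class OllamaProvider implements LlmProvider {
  readonly name = "ollama";
  constructor(private baseUrl: string, private model: string) {}
  async generate(prompt: string): Promise<string> {
    throw new Error("not implemented in this sketch");
  }
}

// The rest of the plugin would depend only on LlmProvider,
// with the concrete one picked from a plugin setting.
function pickProvider(setting: string, providers: LlmProvider[]): LlmProvider {
  const found = providers.find((p) => p.name === setting);
  if (!found) throw new Error(`unknown provider: ${setting}`);
  return found;
}
```

The hard part the comment above mentions would live inside `OpenAiProvider`: untangling the ChatGPT-specific behavior so the shared interface stays small.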
I'm sure the maintainers would welcome a more detailed proposal for this feature from a design perspective; maybe provide a prototype on a separate branch?
I personally don't see many people having the means to run local LLMs to a satisfying degree, since it requires a capable local machine. A paid version using cloud resources that you could connect to via a REST API sounds more reasonable, but would ultimately fall short of the OpenAI options in my opinion.
A better way to improve the current solution would be to generalize how the OpenAI services are used, e.g. using newer models, fine-tuning models, or building assistants with RAG capabilities that use tools to retrieve from vector databases (all hosted by OpenAI, all paid for by the people working on this feature). Mind you, it's not a small task, and more importantly it's not free (not too expensive, but it would certainly require funding).
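As a concrete example of the "use newer models" part: the model could simply become a user setting rather than a hard-coded constant. A sketch of a payload builder for OpenAI's chat completions endpoint, where the model string would be fed from a plugin setting (the setting itself is hypothetical):

```typescript
// Builds the JSON body for OpenAI's /v1/chat/completions endpoint.
// Making `model` a parameter (supplied from a plugin setting) lets users
// opt into newer models without code changes.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatCompletionBody {
  model: string;
  messages: ChatMessage[];
}

function buildChatCompletionBody(
  model: string,
  messages: ChatMessage[],
): ChatCompletionBody {
  return { model, messages };
}
```

Fine-tuned models fit the same scheme, since OpenAI exposes them under their own model identifiers.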
@carlonicora what do you think of this proposition?
Feature description
Please allow the user the option of using a local LLM.
Solution
Being able to input a local URL, etc. to point to a local LLM installation (LM Studio or Ollama) in lieu of an OpenAI API key would be ideal, and much cheaper too.
Alternatives
There may be other local LLMs I am not aware of.
Additional Information
Thank you.