nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License

[Feature] Enable GPT4All to Make Requests to External Services #2553

Open johmicrot opened 2 months ago

johmicrot commented 2 months ago

Feature Request: Enable GPT4All to Make Requests to External Services

I would like to request the addition of a feature that allows GPT4All to make requests to external services. Specifically, it would be highly beneficial if GPT4All could query an external chat API service hosted on a server.

For instance, if a user has a chat API service running on their server, it would be useful for GPT4All to have the capability to send queries to that server. This functionality would greatly enhance the implementation of Retrieval-Augmented Generation (RAG). By allowing GPT4All to interact with external services, users could query local files while leveraging a fast server to process the RAG prompts.

I see that we currently have the ability to query the OpenAI external service, which is very useful. However, it would be even more advantageous if there were a way to query a user's own LLM server. Implementing this feature would significantly expand the versatility and efficiency of GPT4All in various applications.
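For illustration, a request from GPT4All to a self-hosted chat API speaking the OpenAI-compatible wire format might look like the sketch below. This is not an existing GPT4All feature; the server address, model name, and API key are placeholders.

```python
# Hedged sketch: querying a self-hosted, OpenAI-compatible chat endpoint.
# The base URL, model name, and bearer token below are placeholders.
import requests

BASE_URL = "http://my-llm-server.local:8000/v1"  # hypothetical home-server address

payload = {
    "model": "my-local-model",  # whatever model the remote server exposes
    "messages": [
        {"role": "user", "content": "Summarize the attached LocalDocs excerpt."}
    ],
    "temperature": 0.7,
}

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": "Bearer not-needed"},  # many local servers ignore the key
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```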

manyoso commented 2 months ago

@mcembalest ^^

spktkl commented 1 month ago

Seconding this. Being able to use LocalDocs on my low-octane laptop while the model runs remotely on my GPU at home would be fantastic. I'd hope this wouldn't be too much of a stretch since the app already supports API calls to OpenAI?

cosmic-snow commented 1 month ago

@johmicrot @spktkl Have a look at #2683; it was released with v3.1.1 of the chat application.

Note, though: it works with OpenAI-compatible servers; the feature request here doesn't say whether that is what was meant.
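To make "OpenAI-compatible" concrete: such a server only needs to speak the OpenAI chat-completions wire format. The sketch below is a minimal mock of that endpoint for local testing, with placeholder names and a canned reply; real backends such as llama.cpp's server or vLLM expose the same request/response shape.

```python
# Hedged sketch: a minimal mock of an OpenAI-compatible /v1/chat/completions
# endpoint, useful for checking that a client can reach a custom server.
from typing import List, Optional

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    model: str
    messages: List[dict]
    temperature: Optional[float] = None

@app.post("/v1/chat/completions")
def chat_completions(req: ChatRequest):
    # Return a canned reply in the OpenAI response shape.
    return {
        "id": "chatcmpl-demo",
        "object": "chat.completion",
        "model": req.model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": "Hello from the mock server."},
                "finish_reason": "stop",
            }
        ],
    }

# Run with: uvicorn mock_server:app --port 8000
```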