Open iHaagcom opened 8 months ago
What would you like to see?

Would like to see support for GPT4All and sBERT plugins.

What specifically from GPT4All — like being able to use it as your LLM? Do you have any documentation on sBERT plugins? That is not something I'm familiar with.

You should be able to select one of the local LLM options and put your GPT4All server-mode endpoint there, since they all follow the OpenAI API standard. As an example, I use llamafile for inference, and I put my llamafile endpoint into the LocalAI option.
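To illustrate the suggestion above — any OpenAI-compatible client can be pointed at a local server by swapping the base URL. A minimal sketch using only the standard library, assuming GPT4All's server mode at its commonly documented default (`http://localhost:4891/v1`); the port, path, and model name are assumptions, so check your local tool's docs:

```python
import json
import urllib.request

# Assumption: GPT4All server mode listening locally; llamafile and LocalAI
# expose the same OpenAI-style /v1 routes on their own ports.
LOCAL_ENDPOINT = "http://localhost:4891/v1"

def build_chat_request(prompt, model="gpt4all"):
    """Build an OpenAI-style /chat/completions request aimed at a local server."""
    payload = {
        "model": model,  # placeholder model name; local servers vary
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{LOCAL_ENDPOINT}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello!")
print(req.full_url)
# Send with urllib.request.urlopen(req) once a local server is running.
```

The same request shape works against llamafile or LocalAI — only `LOCAL_ENDPOINT` changes, which is why the commenter above could reuse the LocalAI option for a llamafile endpoint.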