Closed by timothycarambat 7 months ago
LocalAI?
The moderations endpoint will need to be managed. I had to comment it out to make this work.
@AntonioCiolino Have you successfully managed to use LocalAI? Did changing the endpoints do the trick, and if so, does the embedding work?
PR #335
LMStudio integration is live: f499f1ba59f2e9f8be5e44c89a951e859382e005
Supporting LMStudio and LocalAI currently - which should be enough at this time. Other providers will get their own issues.
Is there any tutorial on how to set up a LocalAI model for AnythingLLM?
I was able to set up a model on LocalAI, and I can prompt it via the command line. But when I put the endpoint into AnythingLLM, it never loads the model options.
Are you running LocalAI and AnythingLLM on the same machine using Docker? @amorimds
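A common cause when both run in Docker: `localhost` inside the AnythingLLM container refers to the container itself, not the host machine where LocalAI is listening. A minimal sketch of the usual workaround, rewriting the base URL to `host.docker.internal` (function name and URL are illustrative; on plain Linux Docker you also need `--add-host=host.docker.internal:host-gateway`):

```python
from urllib.parse import urlparse, urlunparse

def docker_reachable_base_url(base_url: str) -> str:
    """Rewrite a localhost URL so a container can reach a service on the host.

    Inside a container, localhost/127.0.0.1 is the container's own loopback;
    host.docker.internal resolves to the host on Docker Desktop (and on Linux
    when started with --add-host=host.docker.internal:host-gateway).
    """
    parts = urlparse(base_url)
    if parts.hostname in ("localhost", "127.0.0.1"):
        netloc = "host.docker.internal"
        if parts.port:
            netloc += f":{parts.port}"
        parts = parts._replace(netloc=netloc)
    return urlunparse(parts)

# Assuming LocalAI's default port 8080; LocalAI exposes an OpenAI-compatible
# /v1/models route, which is what the model dropdown is populated from.
models_url = docker_reachable_base_url("http://localhost:8080") + "/v1/models"
```

If `curl` against that rewritten URL from inside the AnythingLLM container returns a model list, the dropdown should populate.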
Add support for someone to be able to use a Locally hosted LLM.
References for easier integration: https://docs.gpt4all.io/, LocalAI, Replicant, Ollama, LMStudio, HuggingFace