Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities.
https://useanything.com
MIT License
17.01k stars · 1.82k forks

Local LLM Support #118

Closed timothycarambat closed 7 months ago

timothycarambat commented 1 year ago

Add support for using a locally hosted LLM.

References for easier integration:

- GPT4All: https://docs.gpt4all.io/
- LocalAI
- Replicant
- OLLama
- LLMStudio
- HuggingFace

AntonioCiolino commented 1 year ago

LocalAI?

AntonioCiolino commented 11 months ago

The moderations endpoint will need to be handled: LocalAI doesn't implement it, so I had to comment that call out to make this work.
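The workaround above can be sketched as a provider guard: only call the moderations endpoint for providers that actually implement it, instead of letting the request fail against a local server. This is an illustrative sketch, not AnythingLLM's actual code; the function name and provider strings are hypothetical.

```python
def should_moderate(provider: str) -> bool:
    """Return True only for providers that expose /v1/moderations.

    OpenAI implements the moderations endpoint; local OpenAI-compatible
    servers such as LocalAI and LM Studio generally do not, so the chat
    pipeline should skip the call for them rather than error out.
    (Provider identifiers here are hypothetical.)
    """
    return provider == "openai"

# Hypothetical use in a chat pipeline:
for p in ("openai", "localai", "lmstudio"):
    print(p, should_moderate(p))
```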

CK-Daniel commented 10 months ago

@AntonioCiolino Have you successfully managed to use LocalAI? Did changing the endpoints do the trick? And if so, does the embedding work?

franzbischoff commented 8 months ago

PR #335

timothycarambat commented 8 months ago

LMStudio integration is live: f499f1ba59f2e9f8be5e44c89a951e859382e005

timothycarambat commented 7 months ago

Supporting LMStudio and LocalAI currently, which should be enough at this time. Other providers will get their own issues.

amorimds commented 6 months ago

Is there any tutorial on how to set up a LocalAI model for AnythingLLM?

I was able to set up a model on LocalAI and can prompt it via the command line, but when I put the endpoint into AnythingLLM it never loads the model options.
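When AnythingLLM populates the model dropdown, it queries the OpenAI-compatible `/v1/models` route on the base URL you provide, so an empty dropdown usually means that route isn't reachable or returns no models. A minimal sketch of what a healthy response looks like and how the model ids are extracted (the model name in the sample is hypothetical):

```python
import json

# Sample /v1/models response body from an OpenAI-compatible server
# such as LocalAI (the model id below is a hypothetical example).
sample = '{"object": "list", "data": [{"id": "ggml-gpt4all-j", "object": "model"}]}'

def model_ids(body: str) -> list[str]:
    """Extract model ids from an OpenAI-style /v1/models response."""
    return [m["id"] for m in json.loads(body).get("data", [])]

print(model_ids(sample))
```

If a plain `curl <base-url>/v1/models` from the machine running AnythingLLM returns an empty `data` list, the problem is on the LocalAI side rather than in AnythingLLM.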

timothycarambat commented 6 months ago

Are you running LocalAI and AnythingLLM on the same machine using Docker? @amorimds
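The question above points at a common pitfall: inside a Docker container, `localhost` refers to the container itself, not to the host machine where LocalAI is listening. A sketch of the usual workaround, assuming LocalAI's default port 8080 (flags and URLs here are the standard Docker mechanism, not AnythingLLM-specific instructions):

```shell
# On Docker Desktop (macOS/Windows), host.docker.internal already
# resolves to the host. On Linux, map it explicitly when starting
# the AnythingLLM container:
docker run --add-host=host.docker.internal:host-gateway mintplexlabs/anythingllm

# Then set the LocalAI base path in AnythingLLM to the host alias, e.g.:
#   http://host.docker.internal:8080/v1
# and verify the models list is reachable from inside the container:
curl http://host.docker.internal:8080/v1/models
```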