Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

Use gradio.live link from colab notebook as Local AI Base URL or LMStudio Base URL #793

Closed: PpiNeaPpLe closed this issue 5 months ago

PpiNeaPpLe commented 7 months ago

How are you running AnythingLLM?

AnythingLLM desktop app

What happened?

Usually, when I run a program that depends on LocalAI, I can simply paste in the gradio.live link from my Google Colab notebook (which is hosting Mistral 7B v0.2) and the program behaves as if the model were running locally. With AnythingLLM, however, I get an error saying my link must contain "/v1". I have never run into this with other applications. This is my first time using AnythingLLM, and I am trying to avoid running Mistral 7B locally on my PC; I would prefer to serve it from my Google Colab.
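
For anyone hitting the same error, a quick way to check whether a Colab-hosted server actually exposes the OpenAI-compatible API that AnythingLLM expects is to query its `/v1/models` route. A minimal sketch, assuming the gradio.live address below is replaced with your own tunnel URL:

```python
import requests

# Hypothetical tunnel address from the Colab notebook; replace with your own.
BASE_URL = "https://xxxxxxxxxxxxxxxx.gradio.live"

# OpenAI-compatible servers (LocalAI, LM Studio, etc.) list their models here.
# A 404 here suggests the server is not serving the OpenAI API format at all.
resp = requests.get(f"{BASE_URL}/v1/models", timeout=10)
print(resp.status_code)
print(resp.json())
```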

Are there known steps to reproduce?

No response

timothycarambat commented 7 months ago

If you are using LocalAI, its inference endpoint requires /v1 because LocalAI communicates using the OpenAI API format, which is what AnythingLLM speaks. Is there an API endpoint you can use on Gradio to generate a response? Even then, if the request goes through LocalAI, the URL would still need the /v1, since that is simply how the LocalAI API works.
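
To illustrate why the /v1 suffix matters: an OpenAI-compatible client builds request paths like /chat/completions on top of the base URL, so the base must end in /v1 for those paths to resolve. A minimal sketch of that style of call, assuming the Colab server really is OpenAI-compatible; the tunnel URL and model name are placeholders:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://xxxxxxxxxxxxxxxx.gradio.live/v1",  # placeholder tunnel URL; note the /v1
    api_key="not-needed",  # LocalAI typically ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="mistral-7b-instruct-v0.2",  # placeholder; use whatever model the server reports
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```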