apocas / restai

RESTai is an AIaaS (AI as a Service) open-source platform, built on top of LlamaIndex & Langchain. It supports any public LLM supported by LlamaIndex and any local LLM supported by Ollama/vLLM/etc., with precise embeddings usage and tuning, and image generation (DALL-E, SD, Flux).
https://apocas.github.io/restai/
Apache License 2.0

Please add support for LiteLLM to this project #71

Closed. Greatz08 closed this issue 7 months ago.

Greatz08 commented 7 months ago

This project is pretty great, BUT we need more options to use different LLMs. You don't have to worry about building a solution that supports 100+ LLMs yourself, because LiteLLM is another FOSS project that can do that task for you: https://github.com/BerriAI/litellm. You can study their project and see how it could be implemented here, similar to how you implemented support for Ollama. That would be a big win, since many people would easily be able to use many more LLMs, which is what everyone wants. The project would only need three parameters from the user: base URL, model name, and API key. With the general OpenAI API structure it can send the query and return the result (see the sketch below). Many big projects have started adding support for LiteLLM to make things more advanced in an easier way, so study it, and if you have any questions you can ask them; they are pretty responsive. If you want to know more about my personal experience of using it with other great projects like Flowise, I can tell you that too in detail.
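For context, a minimal sketch of the OpenAI-compatible request pattern being described, using only the three parameters mentioned (base URL, model name, API key). The proxy URL, key, and model name are placeholders, not values from this project:

```python
# Hypothetical call to a LiteLLM proxy through its OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",   # placeholder LiteLLM proxy URL
    api_key="sk-litellm-placeholder",   # placeholder key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any model the proxy is configured to route to
    messages=[{"role": "user", "content": "Hello from RESTai"}],
)
print(response.choices[0].message.content)
```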

apocas commented 7 months ago

We already support any LLM that Ollama supports; you don't need any code to add an LLM, you can do it via the browser in this case :) Besides Ollama for local LLMs, any public LLM supported by LlamaIndex is also easily supported.
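For reference, this is roughly how an Ollama-backed LLM is instantiated in LlamaIndex; whether RESTai wires it exactly like this is an assumption, and the model name, URL, and import path (which varies across LlamaIndex versions) are illustrative only:

```python
# Sketch of the LlamaIndex Ollama LLM class that local-model support builds on.
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama2", base_url="http://localhost:11434", request_timeout=60.0)
print(llm.complete("Why is the sky blue?"))
```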

So I believe we are well served regarding LLMs for now :)

apocas commented 7 months ago

https://docs.llamaindex.ai/en/stable/examples/llm/litellm/ it's very easy to add support for LiteLLM in RESTai. I will add it in the next release :)
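For reference, the linked example boils down to something like this; the import path may differ across LlamaIndex versions, and the model name and key are placeholders:

```python
# Minimal sketch of LlamaIndex's LiteLLM wrapper, per the linked docs.
import os
from llama_index.llms.litellm import LiteLLM

os.environ["OPENAI_API_KEY"] = "sk-your-key"  # placeholder

llm = LiteLLM("gpt-3.5-turbo")
print(llm.complete("Hello!"))
```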

apocas commented 7 months ago

Et voilà, you can now use LiteLLM :) With a user that has admin privileges, just add a new LLM using the "LiteLLM" class and specify the parameters you want for this LLM.

[screenshot: adding a new LLM with the "LiteLLM" class in the admin UI]

In master, published in the next release.

Greatz08 commented 7 months ago

@apocas thanks, will test later for sure :-)). The problem with Ollama alone was that you have to download and then run the heavy model on your own system, and only then can you use its base URL in another project like this one as the LLM source to generate responses. With LiteLLM we can use any kind of model: closed-source ones like OpenAI, Claude, or Gemini, open models behind a public API like Groq (which serves Mixtral, Gemma, Llama), or models running locally with Ollama. It solves the issue of running multiple types of LLMs behind a single API structure, which is very convenient and easy to use, and that's why I wanted it in this project.
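To illustrate that single-API point with the litellm library itself: the same call reaches different providers just by changing the model string. The model names, environment variables, and local URL below are examples, not RESTai configuration:

```python
# Same completion() interface, different backends.
from litellm import completion

messages = [{"role": "user", "content": "Hello!"}]

# Closed-source provider (expects OPENAI_API_KEY in the environment)
completion(model="gpt-3.5-turbo", messages=messages)

# Hosted open model via Groq (expects GROQ_API_KEY)
completion(model="groq/mixtral-8x7b-32768", messages=messages)

# Local model served by Ollama
completion(model="ollama/llama2", messages=messages, api_base="http://localhost:11434")
```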