marcusschiesser / unc

Enterprise-ready, privacy-first ChatGPT platform
https://unc.de
GNU Affero General Public License v3.0

[Feature] Add more API endpoint support #21

Open LaDoger opened 10 months ago

LaDoger commented 10 months ago

Currently Unc supports:

  1. OpenAI (Chat completion)
  2. Telegram Bots

We'll also want to support:

  1. OpenAI (other features such as DALL-E)
  2. Stability AI
  3. Hugging Face
  4. Replicate
  5. Discord Bots
marcusschiesser commented 10 months ago

@LaDoger, we will automatically get more LLM models when we integrate llamaindex (#22). The question is, how should we include visual models? We would need a concept for this first.

ishaan-jaff commented 9 months ago

Hi @marcusschiesser @LaDoger, I'm the maintainer of LiteLLM (an abstraction to call 100+ LLMs). We let you create a proxy server to call 100+ LLMs, and I think it can solve your problem (I'd love your feedback if it does not).

Try it here: https://docs.litellm.ai/docs/proxy_server https://github.com/BerriAI/litellm

Using LiteLLM Proxy Server

import openai  # pre-1.0 OpenAI Python SDK (exposes openai.ChatCompletion)
openai.api_base = "http://0.0.0.0:8000"  # proxy url
print(openai.ChatCompletion.create(model="test", messages=[{"role": "user", "content": "Hey!"}]))
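The proxy speaks the OpenAI wire format, so the call above is just an HTTP POST to an OpenAI-style endpoint. A minimal sketch of what gets sent (the `build_chat_request` helper and the proxy URL are assumptions taken from the example above, not part of LiteLLM's API):

```python
import json

PROXY_BASE = "http://0.0.0.0:8000"  # proxy url from the example above

def build_chat_request(model, user_message):
    """Hypothetical helper: return (url, body) for an OpenAI-compatible
    /chat/completions call, mirroring the snippet above."""
    url = f"{PROXY_BASE}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, json.dumps(payload)

url, body = build_chat_request("test", "Hey!")
print(url)   # http://0.0.0.0:8000/chat/completions
print(body)
```

Because the payload is plain JSON over HTTP, any client that already targets the OpenAI API can be pointed at the proxy by swapping the base URL only.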

Creating a proxy server

Ollama models

$ litellm --model ollama/llama2 --api_base http://localhost:11434

Hugging Face Models

$ export HUGGINGFACE_API_KEY=my-api-key #[OPTIONAL]
$ litellm --model huggingface/bigcode/starcoder

Anthropic

$ export ANTHROPIC_API_KEY=my-api-key
$ litellm --model claude-instant-1

Palm

$ export PALM_API_KEY=my-palm-key
$ litellm --model palm/chat-bison
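The commands above differ only in the `--model` flag (plus a provider API-key env var, and `--api_base` for local backends); the client-side request stays identical. A sketch that makes this explicit (`proxy_command` is a hypothetical helper for illustration, not part of the litellm CLI):

```python
def proxy_command(model, api_base=None):
    """Hypothetical helper: build the litellm CLI invocation shown above.
    Only the model (and optionally the backend URL) changes per provider."""
    cmd = ["litellm", "--model", model]
    if api_base:
        cmd += ["--api_base", api_base]
    return " ".join(cmd)

print(proxy_command("ollama/llama2", "http://localhost:11434"))
# litellm --model ollama/llama2 --api_base http://localhost:11434
print(proxy_command("palm/chat-bison"))
# litellm --model palm/chat-bison
```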
marcusschiesser commented 9 months ago

@ishaan-jaff Looking great, but we won't add Python code for the time being.

ishaan-jaff commented 9 months ago

Why not use our proxy for this? That way you don't need to add any Python code.

marcusschiesser commented 9 months ago

@ishaan-jaff sorry, but that would then add one more deployment. Currently it's just one Vercel deployment.