mrseanryan opened this issue 1 year ago
Hi @mrseanryan, I believe we can help with this issue. I'm the maintainer of LiteLLM: https://github.com/BerriAI/litellm
TLDR:
We let you use any LLM as a drop-in replacement for gpt-3.5-turbo.
- You can use llama/gpt/claude (100+ LLMs)
If you don't have access to certain LLMs, you can use our proxy server or spin up your own proxy server with LiteLLM.

The snippet below calls the provider API directly:
```python
from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# falcon call
response = completion(model="falcon-40b", messages=messages)
```
bump on this @mrseanryan @ishaan-jaff
Nice! I hope to give it a try when I have the time ...
Try another LLM - try HuggingFace's Code Llama
see https://github.com/mrseanryan/gpt-dm/issues/6
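If that Code Llama experiment goes through LiteLLM as suggested above, a minimal sketch might look like the following (the `huggingface/` model prefix, the CodeLlama model id, and the `HUGGINGFACE_API_KEY` variable are assumptions based on LiteLLM's Hugging Face integration; verify them against the current docs):

```python
from litellm import completion
import os

# Hugging Face token -- assumed env var name; check LiteLLM's Hugging Face docs
os.environ["HUGGINGFACE_API_KEY"] = "your-hf-token"

messages = [{"content": "Write a Python function that reverses a string.", "role": "user"}]

# the "huggingface/" prefix routes the call to the Hugging Face inference API;
# the model id below is illustrative
response = completion(
    model="huggingface/codellama/CodeLlama-34b-Instruct-hf",
    messages=messages,
)
print(response["choices"][0]["message"]["content"])
```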