rgbkrk / genai

What if GPT could help you notebook?

Add support for Azure, OpenAI, Palm, Anthropic, Cohere Models - using litellm #86

Closed ishaan-jaff closed 1 year ago

ishaan-jaff commented 1 year ago

I'm the maintainer of litellm (https://github.com/BerriAI/litellm), a simple, lightweight package for calling OpenAI, Azure, Cohere, Anthropic, and Replicate API endpoints.

This PR adds support for models from all of the providers mentioned above.

Here's a sample of how it's used:

import os
from litellm import completion

# set API keys as environment variables
# (they can also be set in a .env file; see .env.example)
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)
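Whichever provider is called, the reply can be read the same way, since litellm returns responses in the OpenAI completion format (a minimal usage sketch, assuming the standard OpenAI response shape):

# the reply text lives in the same place for the OpenAI, Cohere, and Anthropic calls above
print(response["choices"][0]["message"]["content"])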

ishaan-jaff commented 1 year ago

@rgbkrk can you take a look at this PR when you get a chance?

Happy to add additional tests if this initial commit looks good to you 😊

rgbkrk commented 1 year ago

We run with stream=True. Does litellm support streaming as well?

ishaan-jaff commented 1 year ago

Yes, here's a notebook with stream=True using litellm

https://colab.research.google.com/drive/1R9c5eD5ZC4f8zBDLFek-WjwbfUTxLYuP?usp=sharing
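For reference, a minimal sketch of what streaming through litellm looks like (not taken from the linked notebook; it assumes litellm mirrors OpenAI's streaming chunk format, with the incremental text in each chunk's delta):

import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "openai key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# with stream=True, completion() returns an iterator of chunks
# instead of a single response object
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)

for chunk in response:
    # assumed OpenAI-style chunk shape: the new text arrives in choices[0]["delta"]
    delta = chunk["choices"][0]["delta"]
    print(delta.get("content", ""), end="")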

rgbkrk commented 1 year ago

Can you show an example that uses your branch here?