smol-ai / developer

the first library to let you embed a developer agent in your own app!
https://twitter.com/SmolModels
MIT License
11.81k stars 1.03k forks

No access to claude api key, can't use #50

Open d3287t328 opened 1 year ago

d3287t328 commented 1 year ago

What do you suggest for those of us without a Claude API key who want to use this? I applied for one, but it doesn't seem like it will be provided to me anytime soon.

swyxio commented 1 year ago
  1. you don't need it for the base experience; the Claude key is only for the generated Chrome extension
  2. I'm working on a hosted thing that can hopefully share keys...
BigwigsNFT commented 1 year ago

Any ideas? This is great, but without Claude the token limit prevents us from using it. I tried chunking and other options, but no luck.

jdingus commented 1 year ago

Same here. I'm assuming the context size from Claude is the defining factor.

openai.error.InvalidRequestError: This model's maximum context length is 8192 tokens. However, your messages resulted in 9037 tokens. Please reduce the length of the messages.

I've seen some videos of people using really detailed prompt.md files.
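The 8192-token overflow in the error above can be worked around by trimming the message history before sending it. Here's a minimal sketch of that idea (a hypothetical helper, not part of smol developer; it uses a rough ~4-characters-per-token estimate rather than the model's real tokenizer, which a library like tiktoken would give you):

```python
# Rough token budgeting for chat messages. The 4-chars-per-token ratio is
# an approximation; for exact counts on OpenAI models, use tiktoken.

def estimate_tokens(text: str) -> int:
    """Cheap estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict], max_tokens: int = 8192) -> list[dict]:
    """Keep the system prompt (if it comes first) plus the most recent
    messages that fit inside the token budget, dropping oldest turns first."""
    system = messages[0] if messages and messages[0]["role"] == "system" else None
    used = estimate_tokens(system["content"]) if system else 0
    rest = messages[1:] if system else messages

    kept = []
    for msg in reversed(rest):  # walk newest-first
        cost = estimate_tokens(msg["content"])
        if used + cost > max_tokens:
            break
        kept.append(msg)
        used += cost
    kept.reverse()
    return ([system] if system else []) + kept
```

Dropping the oldest turns (rather than truncating individual messages) keeps each remaining message intact, which usually degrades output quality less.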

swyxio commented 1 year ago

yeah unfortunately I cant do much about claude access :/

ishaan-jaff commented 1 year ago

Hi @jdingus @d3287t328 @BigwigsNFT I believe I can help with this issue. I'm the maintainer of LiteLLM https://github.com/BerriAI/litellm - we allow you to use any LLM as a drop-in replacement for gpt-3.5-turbo.

You can use LiteLLM in the following ways:

With your own API key:

This calls the provider API directly.

from litellm import completion
import os
## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-key"
os.environ["COHERE_API_KEY"] = "your-key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

Using the LiteLLM Proxy with a LiteLLM Key

This is useful if you don't have access to Claude but want to use the open-source LiteLLM proxy to reach Claude.

from litellm import completion
import os

## set ENV variables to the LiteLLM proxy key
os.environ["OPENAI_API_KEY"] = "sk-litellm-5b46387675a944d2"     # [OPTIONAL] replace with your openai key
os.environ["COHERE_API_KEY"] = "sk-litellm-5b46387675a944d2"     # [OPTIONAL] replace with your cohere key
os.environ["ANTHROPIC_API_KEY"] = "sk-litellm-5b46387675a944d2"  # routes claude calls through the proxy

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# claude call (the model this thread is about)
response = completion(model="claude-instant-1", messages=messages)