Open baberibrar opened 6 days ago
Please paste the full code on how you are loading the keys and initializing aisuite client.
from dotenv import load_dotenv
import os
# Load environment variables from .env file
load_dotenv()
# Access your API keys
openai_api_key = os.getenv("OPENAI_API_KEY")
# anthropic_api_key = os.getenv("ANTHROPIC_API_KEY")
groq_api_key = os.getenv("GROQ_API_KEY")
import aisuite as ai
client = ai.Client()
models = ["groq:grok-beta"]
messages = [
    {"role": "system", "content": "Respond in Pirate English."},
    {"role": "user", "content": "Tell me a joke."},
]

for model in models:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
    print(response.choices[0].message.content)
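Since the question was about how the keys are loaded, one quick sanity check is to confirm the variables are actually visible to the process after `load_dotenv()` runs (a minimal sketch; the variable names match the snippet above):

```python
import os

# Confirm the provider keys made it into the environment after load_dotenv().
key_status = {name: bool(os.getenv(name)) for name in ("OPENAI_API_KEY", "GROQ_API_KEY")}
for name, present in key_status.items():
    print(name, "set" if present else "MISSING")
```

If either prints MISSING, the `.env` file isn't being found or parsed, and the provider client will fail before any request is made.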
Only 2 providers work for me: OpenAI and Ollama.
I see this for Anthropic and Groq:
python -B cls1.py
Traceback (most recent call last):
  File "/Users/cleesmith/aisuite/cls1.py", line 16, in <module>
    response = client.chat.completions.create(model="groq:llama-3.1-8b-instant", messages=messages)
  File "/Users/cleesmith/aisuite/aisuite/client.py", line 108, in create
    self.client.providers[provider_key] = ProviderFactory.create_provider(
  File "/Users/cleesmith/aisuite/aisuite/provider.py", line 46, in create_provider
    return provider_class(**config)
  File "/Users/cleesmith/aisuite/aisuite/providers/groq_provider.py", line 19, in __init__
    self.client = groq.Groq(**config)
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_client.py", line 99, in __init__
    super().__init__(
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_base_client.py", line 824, in __init__
    self._client = http_client or SyncHttpxClientWrapper(
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_base_client.py", line 722, in __init__
    super().__init__(**kwargs)
TypeError: Client.__init__() got an unexpected keyword argument 'proxies'
The following code, and the example code in this repo, both produce the same error:
# pip install 'aisuite[all]'
import aisuite as ai
client = ai.Client()
messages = [
    {"role": "system", "content": "You are a helpful agent, who answers with brevity."},
    {"role": "user", "content": "list planets"},
]
# response = client.chat.completions.create(model="anthropic:claude-3-haiku-20240307", messages=messages)
response = client.chat.completions.create(model="groq:llama-3.1-8b-instant", messages=messages)
print(response.choices[0].message.content)
It works for openai and ollama. Groq is fast, free, and good for testing code, so it would be nice to have it working.
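For what it's worth, that TypeError looks like an httpx version mismatch: httpx 0.28.0 removed the long-deprecated `proxies` argument, and older groq (and anthropic) SDK releases still forward it into `httpx.Client`, which is exactly the frame where the traceback ends. Presumably the installed openai and ollama clients don't hit that code path, which would explain why those two still work. A quick way to check what's installed (a sketch; the import guard is only so it runs even without httpx):

```python
import inspect

# httpx 0.28 removed the deprecated `proxies` kwarg that older
# groq/anthropic SDKs pass through to httpx.Client, causing this TypeError.
try:
    import httpx
except ImportError:
    proxies_supported = None
    print("httpx not installed")
else:
    proxies_supported = "proxies" in inspect.signature(httpx.Client.__init__).parameters
    print(f"httpx {httpx.__version__}: proxies kwarg accepted = {proxies_supported}")
```

If it prints `False`, pinning httpx (`pip install 'httpx<0.28'`) or upgrading the provider SDKs (`pip install --upgrade groq anthropic`) should clear the error.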
pip show aisuite
Name: aisuite
Version: 0.1.6
Summary: Uniform access layer for LLMs
Home-page:
Author: Andrew Ng
Author-email:
License:
Location: /opt/miniconda3/envs/aisuite/lib/python3.11/site-packages
Requires:
Required-by:
related #110