andrewyng / aisuite

Simple, unified interface to multiple Generative AI providers
MIT License
6.86k stars · 586 forks

The error is occurring on the Groq configuration, but the Grok API key is working fine. #68

Open baberibrar opened 6 days ago

baberibrar commented 6 days ago
Traceback (most recent call last):
  File "D:\ai-research-suite\aisuite_ai.py", line 23, in <module>
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
  File "D:\ai-research-suite\venv\Lib\site-packages\aisuite\client.py", line 117, in create
    return provider.chat_completions_create(model_name, messages, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai-research-suite\venv\Lib\site-packages\aisuite\providers\groq_provider.py", line 22, in chat_completions_create
    return self.client.chat.completions.create(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
        messages=messages,
        ^^^^^^^^^^^^^^^^^^
        **kwargs  # Pass any additional arguments to the Groq API
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\resources\chat\completions.py", line 289, in create
    return self._post(
           ~~~~~~~~~~^
        "/openai/v1/chat/completions",
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<31 lines>...
        stream_cls=Stream[ChatCompletionChunk],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\_base_client.py", line 1225, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\_base_client.py", line 920, in request
    return self._request(
           ~~~~~~~~~~~~~^
        cast_to=cast_to,
        ^^^^^^^^^^^^^^^^
    ...<3 lines>...
        remaining_retries=remaining_retries,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\_base_client.py", line 1018, in _request
    raise self._make_status_error_from_response(err.response) from None
groq.AuthenticationError: Error code: 401 - {'error': {'message': 'Invalid API Key', 'type': 'invalid_request_error', 'code': 'invalid_api_key'}}
rohitprasad15 commented 5 days ago

Please paste the full code on how you are loading the keys and initializing aisuite client.

baberibrar commented 5 days ago

> Please paste the full code on how you are loading the keys and initializing aisuite client.

from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv()

# Access your API keys
openai_api_key = os.getenv("OPENAI_API_KEY")
# anthropic_api_key = os.getenv("ANTHROPIC_API_KEY")
groq_api_key = os.getenv("GROQ_API_KEY")

import aisuite as ai
client = ai.Client()

models = ["groq:grok-beta"]

messages = [
    {"role": "system", "content": "Respond in Pirate English."},
    {"role": "user", "content": "Tell me a joke."},
]

for model in models:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
    print(response.choices[0].message.content)
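One thing worth noting about the snippet above: aisuite routes on the text before the colon in the model string, so `"groq:grok-beta"` is sent to Groq and authenticated with `GROQ_API_KEY`. Groq (groq.com) and Grok (x.ai) are different services, so if the key in `GROQ_API_KEY` is actually an xAI/Grok key, Groq's endpoint would reject it with exactly this 401. A minimal sketch of the routing split (the helper name is illustrative, not aisuite's actual API):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    # aisuite model strings are "<provider>:<model>"; the prefix
    # selects which provider client (and which API key) is used.
    provider, _, name = model_id.partition(":")
    return provider, name

# "groq:grok-beta" is routed to the Groq provider, so the request
# is signed with GROQ_API_KEY regardless of what "grok-beta" is.
print(split_model_id("groq:grok-beta"))  # ('groq', 'grok-beta')
```

Also note that `grok-beta` is an xAI model name; Groq hosts models such as `llama-3.1-8b-instant`, so the model id and the key both need to belong to the same service.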
cleesmith commented 3 days ago

Only 2 providers work for me: OpenAI and Ollama.

I see this for Anthropic and Groq:

python -B cls1.py

Traceback (most recent call last):
  File "/Users/cleesmith/aisuite/cls1.py", line 16, in <module>
    response = client.chat.completions.create(model="groq:llama-3.1-8b-instant", messages=messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cleesmith/aisuite/aisuite/client.py", line 108, in create
    self.client.providers[provider_key] = ProviderFactory.create_provider(
                                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cleesmith/aisuite/aisuite/provider.py", line 46, in create_provider
    return provider_class(**config)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cleesmith/aisuite/aisuite/providers/groq_provider.py", line 19, in __init__
    self.client = groq.Groq(**config)
                  ^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_client.py", line 99, in __init__
    super().__init__(
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_base_client.py", line 824, in __init__
    self._client = http_client or SyncHttpxClientWrapper(
                                  ^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_base_client.py", line 722, in __init__
    super().__init__(**kwargs)
TypeError: Client.__init__() got an unexpected keyword argument 'proxies'

The following code, and the example code in this repo, both raise the same error:

# pip install 'aisuite[all]'
import aisuite as ai

client = ai.Client()

messages = [
    {"role": "system", "content": "You are a helpful agent, who answers with brevity."},
    {"role": "user", "content": 'list planets'},
]

# response = client.chat.completions.create(model="anthropic:claude-3-haiku-20240307", messages=messages)
response = client.chat.completions.create(model="groq:llama-3.1-8b-instant", messages=messages)

print(response.choices[0].message.content)

It works for OpenAI and Ollama. Groq is fast, free, and good for testing code, so it would be nice to have it working.

pip show aisuite
Name: aisuite
Version: 0.1.6
Summary: Uniform access layer for LLMs
Home-page: 
Author: Andrew Ng
Author-email: 
License: 
Location: /opt/miniconda3/envs/aisuite/lib/python3.11/site-packages
Requires: 
Required-by:
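The `proxies` TypeError here is a different failure from the 401 above: it is the pattern where an SDK forwards a keyword argument that a newer version of its HTTP layer no longer accepts (httpx 0.28 removed the long-deprecated `proxies` parameter from `httpx.Client.__init__`, which the groq and anthropic SDK releases of that era still passed). A minimal, dependency-free reproduction of the mechanism, with stand-in classes rather than the real libraries:

```python
class HttpClient:
    # Stand-in for httpx.Client >= 0.28, where the deprecated
    # `proxies` keyword was removed from __init__.
    def __init__(self, timeout=None):
        self.timeout = timeout

class SdkClient:
    # Stand-in for an older SDK base client that still forwards
    # `proxies` down to its HTTP layer.
    def __init__(self, **kwargs):
        kwargs.setdefault("proxies", None)
        self.http = HttpClient(**kwargs)

try:
    SdkClient()
except TypeError as exc:
    print(exc)  # "... unexpected keyword argument 'proxies'"
```

Pinning `httpx<0.28` in the environment, or upgrading the groq/anthropic SDKs to releases that no longer pass `proxies`, are the usual workarounds (based on the httpx 0.28 changelog; not verified against aisuite 0.1.6 specifically).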
Quantisan commented 20 hours ago

related #110