MadcowD / ell

A language model programming library.
http://docs.ell.so/
MIT License

Groq client api error #289

Open vayoa opened 2 weeks ago

vayoa commented 2 weeks ago

I'm using this chat example with the Groq client, and I get an error as soon as I input a second user message:

import ell

MODEL = "llama-3.1-70b-versatile"
ell.models.groq.register()

@ell.complex(model=MODEL, temperature=0.7)
def chat_bot(message_history: list[ell.Message]):
    return [
        ell.system("You are a friendly chatbot. Engage in casual conversation."),
    ] + message_history

message_history = []
while True:
    user_input = input("You: ")
    message_history.append(ell.user(user_input))
    response = chat_bot(message_history)
    print("Bot:", response.text)
    message_history.append(response)

Output:

You: hey
Bot: How's it going? How's your day been so far?
You: hello
Traceback (most recent call last):
  File "c:\Users\ew0nd\Documents\otui\v2\ell_test.py", line 39, in <module>
    response = chat_bot(message_history)
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\ew0nd\AppData\Local\Programs\Python\Python311\Lib\site-packages\ell\lmp\_track.py", line 64, in tracked_func
    return func_to_track(*fn_args, **fn_kwargs, _invocation_origin=invocation_id)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\ew0nd\AppData\Local\Programs\Python\Python311\Lib\site-packages\ell\lmp\complex.py", line 68, in model_call
    (result, final_api_params, metadata) = provider.call(ell_call, origin_id=_invocation_origin, logger=_logger if should_log else None)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\ew0nd\AppData\Local\Programs\Python\Python311\Lib\site-packages\ell\provider.py", line 125, in call
    provider_resp = call(**final_api_call_params)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\ew0nd\AppData\Local\Programs\Python\Python311\Lib\site-packages\groq\resources\chat\completions.py", line 287, in create
    return self._post(
           ^^^^^^^^^^^
  File "C:\Users\ew0nd\AppData\Local\Programs\Python\Python311\Lib\site-packages\groq\_base_client.py", line 1244, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^        
  File "C:\Users\ew0nd\AppData\Local\Programs\Python\Python311\Lib\site-packages\groq\_base_client.py", line 936, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\ew0nd\AppData\Local\Programs\Python\Python311\Lib\site-packages\groq\_base_client.py", line 1039, in _request
    raise self._make_status_error_from_response(err.response) from None
groq.BadRequestError: Error code: 400 - {'error': {'message': "'messages.2' : for 'role:assistant' the following must be satisfied[('messages.2.content' : value must be a string)]", 'type': 'invalid_request_error'}}
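To make the 400 above concrete, here is a minimal standalone illustration of the constraint the error message describes. `is_valid_for_groq` is a hypothetical check mirroring the rule in the error text, not part of the Groq SDK:

```python
# Per the error "'messages.2.content' : value must be a string", Groq requires
# assistant messages to carry plain-string content. This hypothetical check
# mirrors that rule; it is not a Groq SDK function.

def is_valid_for_groq(message: dict) -> bool:
    if message["role"] == "assistant":
        return isinstance(message["content"], str)
    return True

# The shape Groq accepts for an assistant turn:
plain = {"role": "assistant", "content": "How's it going?"}

# The shape ell serializes (a list of content blocks), which Groq rejects:
blocks = {"role": "assistant", "content": [{"type": "text", "text": "How's it going?"}]}

print(is_valid_for_groq(plain))   # True
print(is_valid_for_groq(blocks))  # False
```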
reecelikesramen commented 1 week ago

Hey @vayoa, this is a real issue, but it's not isolated to your example; this errors as well:

import ell

MODEL = "llama-3.1-70b-versatile"
ell.models.groq.register()

@ell.simple(model=MODEL, temperature=0, max_tokens=20)
def test():
    return [
        ell.user("hello"),
        ell.assistant("hello there"),
        ell.user("how are you?"),
    ]

print(test())
# Error:
# groq.BadRequestError: Error code: 400 - {'error': {'message': "'messages.1' : for 'role:assistant' the following must be satisfied[('messages.1.content' : value must be a string)]", 'type': 'invalid_request_error'}}

I just handled a similar error with the Instructor example. It looks like Groq can only handle an assistant message of the form {"role": "assistant", "content": <str>}, not what ell gives it: {"role": "assistant", "content": [{"type": "text", "text": <str>}]}.
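A rough sketch of one possible workaround: collapse block-style content into a plain string before the payload reaches Groq. `flatten_for_groq` is a hypothetical helper, not part of ell or the Groq SDK; the two message shapes are the ones compared above:

```python
# Hypothetical sketch: collapse list-of-content-block messages into
# plain-string content, which is the only assistant shape Groq accepts.

def flatten_for_groq(messages: list[dict]) -> list[dict]:
    """Return a copy of `messages` with block-style content joined into strings."""
    flattened = []
    for msg in messages:
        content = msg["content"]
        if isinstance(content, list):
            # Join the text of every {"type": "text", "text": ...} block.
            content = "".join(
                block["text"] for block in content if block.get("type") == "text"
            )
        flattened.append({"role": msg["role"], "content": content})
    return flattened

messages = [
    {"role": "user", "content": "hello"},
    {"role": "assistant", "content": [{"type": "text", "text": "hello there"}]},
]
print(flatten_for_groq(messages))
# [{'role': 'user', 'content': 'hello'},
#  {'role': 'assistant', 'content': 'hello there'}]
```

An actual fix would do this translation inside ell's Groq provider rather than in user code, since that's where messages are serialized.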

If you look at the Groq documentation for their chat completions API, they don't show any examples with JSON content blocks, only content as plain strings; the exception is their vision models. Unlike with Instructor, I think this is intentional on Groq's side, so it will require a change to ell's Groq provider.