Open vayoa opened 2 weeks ago
Hey @vayoa, this is an issue, but it's actually not isolated to your example; this errors as well:
```python
import ell

MODEL = "llama-3.1-70b-versatile"
ell.models.groq.register()

@ell.simple(model=MODEL, temperature=0, max_tokens=20)
def test():
    return [
        ell.user("hello"),
        ell.assistant("hello there"),
        ell.user("how are you?"),
    ]

print(test())
# Error:
# groq.BadRequestError: Error code: 400 - {'error': {'message': "'messages.1' : for 'role:assistant' the following must be satisfied[('messages.1.content' : value must be a string)]", 'type': 'invalid_request_error'}}
```
I just handled a similar error in the Instructor example. It looks like Groq can only handle an assistant message of the form `{"role": "assistant", "content": <str>}`, and not what ell gives it: `{"role": "assistant", "content": [{"type": "text", "text": <str>}]}`.
If you take a look at the Groq documentation for their chat completions API, they don't show any examples with JSON content blocks, only content as strings; the exception is their vision models. Unlike with Instructor, I think this is intentional, so it will require a change to the Groq provider.
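A minimal sketch of the kind of coercion the Groq provider would need — flattening ell's list-of-content-blocks into the plain string Groq expects. The `flatten_content` helper name is hypothetical (not part of ell or groq), and it assumes text-only blocks of the shape shown in the error above:

```python
# Hypothetical helper, not part of ell or the groq client:
# collapse ell-style content blocks into a plain string, since Groq's
# chat completions API requires `content` to be a string for text models.
def flatten_content(message: dict) -> dict:
    content = message["content"]
    if isinstance(content, list):
        # Join the text of each {"type": "text", "text": ...} block.
        content = "".join(
            block["text"] for block in content if block.get("type") == "text"
        )
    return {**message, "content": content}

messages = [
    {"role": "user", "content": "hello"},
    {"role": "assistant", "content": [{"type": "text", "text": "hello there"}]},
    {"role": "user", "content": "how are you?"},
]
# After flattening, every message has string content and Groq accepts it.
print([flatten_content(m) for m in messages])
```

Vision models would need to keep the block structure for image parts, so the provider change presumably has to branch on content type rather than flatten unconditionally.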
I'm using this chat example with the groq client, and I get errors as soon as I try to input a second user message:
Output: