Closed: sklinkert closed this 2 months ago
@devalexandre thanks for approving it :)
@tmc can you merge it?
Using the GenerateContent method, this implementation doesn't work completely. The request body for completions differs from the OpenAI defaults. My JSON request:
```json
{
  "model": "llama3-70b-8192",
  "messages": [
    {
      "role": "system",
      "content": [
        {
          "text": "some text...",
          "type": "text"
        },
        {
          "text": "some text...",
          "type": "text"
        }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "text": "hi",
          "type": "text"
        }
      ]
    }
  ],
  "temperature": 0,
  "max_tokens": 300
}
```
Response:
```
API returned unexpected status code: 400: 'messages.0' : for 'role:system' the following must be satisfied[('messages.0.content' : value must be a string)]
```
@ImPedro29 Have you tried it with the latest version of langchaingo? I had the same error but it worked for me after upgrading it.
I updated, and after some corrections to my embedding engine, the same errors are still appearing:
Using version v0.1.9
POST https://api.groq.com/openai/v1/chat/completions
Body in my message above
It looks like Groq doesn't support the same models that OpenAI GPT supports... Maybe a future Groq update will add them.
Edit: I just tested the GenerateContent method.
Add usage example for groq
Groq offers the same API as OpenAI:
https://github.com/tmc/langchaingo/issues/797
PR Checklist
- Prefix the PR title with the name of the primarily affected package, e.g. `memory: add interfaces for X, Y` or `util: add whizzbang helpers`.
- If applicable, link the issue the PR fixes (e.g. Fixes #123).
- Make sure your code passes the `golangci-lint` checks.