You can set the config like this to use Groq instead of OpenAI:
OPENAI_API_ENDPOINT=https://api.groq.com/openai/v1
OPENAI_KEY=<groq api key>
MODEL=<llama model>
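For reference, the same settings can be checked directly against Groq's OpenAI-compatible endpoint. Here is a minimal sketch using the openai Python client; the placeholder key and the Llama model name are just examples, swap in whatever you put in OPENAI_KEY and MODEL:

```python
# Minimal check that the Groq endpoint accepts the configured key and model.
# Assumes the `openai` Python package (v1+); the model name is only an example.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # same value as OPENAI_API_ENDPOINT
    api_key="<groq api key>",                   # same value as OPENAI_KEY
)

resp = client.chat.completions.create(
    model="llama3-70b-8192",  # same value as MODEL
    messages=[{"role": "user", "content": "say hi"}],
)
print(resp.choices[0].message.content)
```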
This generates commands correctly, but I get this error when it writes the explanation:
{
  "error": {
    "message": "The model `gpt-3.5-turbo` does not exist or you do not have access to it.",
    "type": "invalid_request_error",
    "code": "model_not_found"
  }
}
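Since the error names gpt-3.5-turbo even though MODEL is set to a Llama model, it looks like the explanation request hardcodes the model instead of using the configured one, and Groq rejects it. A rough reproduction of the difference, again with the openai Python client (model names are assumptions):

```python
# Reproduce the failure: Groq's OpenAI-compatible API rejects OpenAI model names,
# so a hardcoded "gpt-3.5-turbo" fails while the configured Llama model works.
# Assumes the `openai` Python package (v1+).
from openai import OpenAI, NotFoundError

client = OpenAI(base_url="https://api.groq.com/openai/v1", api_key="<groq api key>")

for model in ("llama3-70b-8192", "gpt-3.5-turbo"):
    try:
        client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": "explain `ls -la`"}],
        )
        print(model, "-> ok")
    except NotFoundError as err:  # model_not_found is returned as a 404
        print(model, "->", err)
```

If that is indeed the cause, the fix would be for the explanation request to use the MODEL setting as well.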
A temporary workaround is to keep silent mode on, so the explanation step is skipped.