codewiz closed this issue 2 months ago
I think the answer lies in your log output:
The model `gpt-4o` does not exist or you do not have access to it.
I've just tried putting in my own API key with your minimal.lua file and it works as expected. Can you access gpt-4o outside of CodeCompanion?
Topping up my OpenAI account with $10 cured the issue :-)
Can we leave this bug open to report API errors in the UI?
I thought I'd handled that damn out of credit error. It got me last time 😆.
I'll try and handle those other errors better. Thanks for raising this.
I did implement a hacky way of detecting an error from OpenAI back in 84597e0. The challenge is that they like to stream the error message back to you rather than giving it as one full response. This means you have to handle `{`, then `"error": {`, then `"message": "The model gpt-4o does not exist or you do not have access to it."`, and so on, arriving as separate chunks.
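A minimal sketch of the idea described above: keep appending streamed fragments to a buffer and only check for an `"error"` object once the buffer parses as complete JSON. This is illustrative only (CodeCompanion itself is written in Lua); the function name and chunking are assumptions, not the plugin's actual code.

```python
import json

def detect_streamed_error(chunks):
    """Accumulate streamed response fragments; once the buffer is valid
    JSON, return the OpenAI-style error message if present, else None."""
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        try:
            data = json.loads(buffer)  # fails until the JSON is complete
        except json.JSONDecodeError:
            continue  # still mid-stream; keep accumulating
        if isinstance(data, dict) and "error" in data:
            return data["error"].get("message")
    return None

# The error arrives in pieces, roughly like:
chunks = ['{', '"error": {', '"message": "The model gpt-4o does not exist '
          'or you do not have access to it."', '}}']
print(detect_streamed_error(chunks))
```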
On this occasion, it looks like I should have detected that the response contained an error and not allowed the `run_tools` method to be called; the adapter would then have caught the error and displayed it back to you in the UI.
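The guard being described could be sketched like this: if the assembled response carries an error, surface it in the UI and skip tool execution entirely. All names here (`handle_response`, `run_tools`, `show_error`) are hypothetical stand-ins, not CodeCompanion's actual API.

```python
def handle_response(response, run_tools, show_error):
    """If the response contains an OpenAI-style error object, report it
    to the UI and return None instead of running tools."""
    if isinstance(response, dict) and "error" in response:
        show_error(response["error"].get("message", "Unknown API error"))
        return None
    return run_tools(response)
```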
My credit balance is running low (🤑) so I'll keep this issue open until I am confident I've handled it.
Closing this as I have been able to confirm this is now handled.
### Your `minimal.lua` config

### Error messages

### Health check output

### Log output
### Describe the bug

Calling `:CodeCompanionChat` followed by `:w` results in the above error.

Tested with nvim built from git head:

Also tested with Arch system package:
### Reproduce the bug

1. `/usr/bin/nvim --clean -u minimal.lua`
2. `:CodeCompanionChat`
3. `:w`
### Final checks

- [x] I have tested with the `minimal.lua` config file above and still get the issue