nvms / wingman

Your pair programming wingman. Supports OpenAI, Anthropic, or any LLM on your local inference server.
https://marketplace.visualstudio.com/items?itemName=nvms.ai-wingman
ISC License

`Error: Failed to open completion stream: 429 Too Many Requests` #36

Closed: akashagarwal7 closed this issue 7 months ago

akashagarwal7 commented 7 months ago

Hi, first off: great extension! Exactly what I've been looking for.

I keep getting `429 Too Many Requests` whenever I run a prompt. Nothing appears in the chat view either, apart from the correct text selection and the generated prompt (see screenshot).

I've set up the OpenAI key, made sure VSCode and the extension are up to date, restarted VSCode several times, and tried different prompts and text selections. Still no bueno. There are no relevant logs in the Extension Host output either.
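In case it's useful, a minimal direct call like the one below (a rough standalone sketch, not Wingman code; the file name is made up and it assumes Node 18+ for the global `fetch`) should show whether the 429 comes straight from the OpenAI API, independent of the extension:

```typescript
// check-openai.ts (hypothetical helper, not part of Wingman)
// Run with: OPENAI_API_KEY=sk-... npx ts-node check-openai.ts
async function main() {
  const resp = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: "ping" }],
    }),
  });
  // A 429 here would mean the limit is hit before the extension is even involved.
  console.log(resp.status, await resp.text());
}

main();
```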

Would much appreciate some help in debugging this! Cheers.

nvms commented 7 months ago

Hi @akashagarwal7, thank you!

https://help.openai.com/en/articles/6891829-error-code-429-rate-limit-reached-for-requests

You are most likely being rate limited by OpenAI for one of the reasons listed in the article above. Wingman itself doesn't impose any rate limiting; the 429 is the response coming back from OpenAI's platform.
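For genuine per-minute rate limits (as opposed to an exhausted quota), the usual caller-side mitigation is to retry with exponential backoff. A rough sketch of that pattern, not Wingman's actual request code, looks like:

```typescript
// Rough sketch: retry a request with exponential backoff when the API
// answers 429 Too Many Requests. Assumes Node 18+ / DOM types for Response.
async function fetchWithBackoff(
  makeRequest: () => Promise<Response>,
  maxRetries = 3,
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const resp = await makeRequest();
    // Return on any non-429 response, or once retries are exhausted.
    if (resp.status !== 429 || attempt >= maxRetries) return resp;
    const waitMs = 1000 * 2 ** attempt; // 1s, 2s, 4s, ...
    await new Promise((resolve) => setTimeout(resolve, waitMs));
  }
}
```

That won't help if the key's quota is simply used up, which is why the article above starts with checking your usage and billing settings.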

Going to close this issue, because unfortunately there's very likely nothing I can do to help beyond pointing you at the documentation above.

akashagarwal7 commented 7 months ago

Thanks, that was it!