Closed sourabhdesai closed 9 months ago
The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Comments | Updated (UTC) |
|---|---|---|---|---|
| llama-app-frontend | ✅ Ready (Inspect) | Visit Preview | 💬 Add feedback | Nov 26, 2023 0:22am |
Hey @sourabhdesai, thank you for this. I can confirm that the latest version of LlamaIndex has now been fixed. Good work man.
Some folks on discord have been patiently waiting for me to investigate and fix an issue that occurs when updating the llama-index version. They were seeing the following error on the backend when trying to receive a response to a user message:
I did some investigation and found, only by stepping through with a debugger, that an error was being thrown on these lines of the `OpenAI` LLM class and then swallowed rather than printed. That error was:

In a much older version of llama-index, there was a bug where the `api_key` constructor parameter was being ignored. As a workaround, we had passed the `api_key` in through the `additional_kwargs` constructor parameter instead. Fast forward to the current version of llama-index: the bug where the `api_key` constructor param was being ignored has been fixed, and the new OpenAI client library now appears to apply stricter validation to the parameters you can pass to its chat completion APIs. Hence the error we were seeing. The fix was simply to remove the lines here and here where we were passing the extra API key value in `additional_kwargs`.
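To illustrate the failure mode described above, here is a minimal, self-contained sketch. The `chat_completion` function below is a hypothetical stand-in (not the real OpenAI client API) for a strictly-validated chat completion endpoint; it shows how an `api_key` entry smuggled in via extra kwargs, which an older client might have silently ignored, is now rejected outright:

```python
# Hypothetical stand-in for a strictly-validated chat completion API.
# Real client libraries achieve the same effect by validating kwargs;
# Python's own keyword-argument checking models that behavior here.
def chat_completion(model: str, messages: list, temperature: float = 1.0):
    return {"model": model, "messages": messages}

# The old workaround: the API key tucked into additional_kwargs,
# which then gets forwarded to the chat completion call.
additional_kwargs = {"api_key": "sk-..."}

try:
    # Forwarding the extra kwarg now raises instead of being ignored.
    chat_completion("gpt-3.5-turbo", [], **additional_kwargs)
    rejected = False
except TypeError:
    rejected = True

print(rejected)  # True: the unexpected api_key kwarg is rejected
```

The fix matches the description above: stop forwarding the key through `additional_kwargs` and rely on the (now working) `api_key` constructor parameter instead.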