Closed — vishnumg closed this issue 10 months ago
bump
Same problem here; I'm using the Bing Search Tool.
Hi, @vishnumg,
I'm helping the LangChain team manage their backlog and am marking this issue as stale.
It seems like you're experiencing an issue with the "Retrieve Information" tool related to the gpt-3.5-turbo-16k model, resulting in a "Could not parse LLM output" error. RalissonMattias has also commented about a similar problem with the Bing Search Tool. As of now, the issue remains unresolved.
Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, kindly let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and cooperation.
System Info
I have the following prompt:
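The original prompt was not preserved in this report. A minimal sketch of a comparable ReAct-style agent prompt is shown below; the wording is hypothetical and stands in for whatever the reporter actually used.

```python
# Hypothetical stand-in for the original prompt (not preserved in the issue).
# A typical ReAct-style template for an agent with a retrieval tool.
prompt_template = """Answer the following question as best you can.
You have access to the following tools:

{tools}

Use the following format:

Question: the input question you must answer
Thought: you should always think about what to do
Action: the action to take, should be one of [{tool_names}]
Action Input: the input to the action
Observation: the result of the action
... (this Thought/Action/Action Input/Observation can repeat N times)
Thought: I now know the final answer
Final Answer: the final answer to the original question

Question: {input}
{agent_scratchpad}"""
```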
And I use it as follows:
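The exact invocation is also missing from the report. A minimal reproduction sketch, assuming the `initialize_agent` API that was current at the time and a placeholder "Retrieve Information" tool (both assumptions, not the reporter's actual code), might look like this:

```python
from langchain.agents import AgentType, Tool, initialize_agent
from langchain.chat_models import ChatOpenAI


def retrieve_information(query: str) -> str:
    """Placeholder retriever; the real implementation is not shown in the issue."""
    return "..."


tools = [
    Tool(
        name="Retrieve Information",
        func=retrieve_information,
        description="Look up information relevant to the user's question.",
    )
]

# Works with gpt-3.5-turbo; reportedly fails to parse output with gpt-3.5-turbo-16k.
llm = ChatOpenAI(model_name="gpt-3.5-turbo-16k", temperature=0)

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)

agent.run("What is LangChain?")
```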
This works perfectly with gpt-3.5-turbo. However, when I switch to the gpt-3.5-turbo-16k model, I run into two issues.
Information
Related Components
Reproduction
Steps to reproduce behavior:
Expected behavior
With gpt-3.5-turbo, the agent runs as expected. With gpt-3.5-turbo-16k, the run instead fails with errors such as "Could not parse LLM output":
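Not part of the original report, but a commonly suggested mitigation for "Could not parse LLM output" errors is to let the agent executor recover from malformed model responses via `handle_parsing_errors` instead of raising. A sketch, reusing the hypothetical `tools` list from above:

```python
from langchain.agents import AgentType, initialize_agent
from langchain.chat_models import ChatOpenAI

# Assumes the same hypothetical `tools` list defined in the sketch above.
llm = ChatOpenAI(model_name="gpt-3.5-turbo-16k", temperature=0)

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    # Feed the parsing error back to the model instead of raising, so a
    # malformed Thought/Action block does not abort the whole run.
    handle_parsing_errors=True,
    verbose=True,
)
```

This does not explain why the 16k model formats its output differently, but it usually keeps the agent loop alive long enough to inspect the raw output and adjust the prompt.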