Closed mimadiule closed 7 months ago
Hello @mimadiule,
Nice to meet you! I'm Dosu, an AI bot here to assist you with your issues, answer your questions, and help you navigate through the process of contributing to the LangChain repository. While we're waiting for a human maintainer, I'm here to get the ball rolling. Let's see how we can solve this together!
It seems like you've got this under control. If you'd like help or have specific questions, let me know what I can do for you!
System Info
langchain version 0.0.332; affects all platforms.

There is a mistake in the file `qianfan\resources\llm\completion.py`: the endpoint `endpoint="/chat/completions-pro"` is not correct; it should be `endpoint="/chat/completions_pro"`, like below:

```python
"ERNIE-Bot-4": QfLLMInfo(
    endpoint="/chat/completions_pro",
    required_keys={"messages"},
    optional_keys={
        "stream",
        "temperature",
        "top_p",
        "penalty_score",
        "user_id",
        "system",
    },
),
```
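The fix amounts to correcting one key in Qianfan's model-to-endpoint registry. A minimal, self-contained sketch of how such a registry lookup behaves (illustrative only; `QfLLMInfo` here is a simplified stand-in, not the real class from the qianfan package):

```python
from dataclasses import dataclass, field

# Simplified stand-in for qianfan's QfLLMInfo; the real class lives in
# qianfan/resources/llm/completion.py and carries more fields.
@dataclass
class QfLLMInfo:
    endpoint: str
    required_keys: set = field(default_factory=set)
    optional_keys: set = field(default_factory=set)

# Corrected registry entry: note the underscore in "completions_pro".
MODEL_INFO = {
    "ERNIE-Bot-4": QfLLMInfo(
        endpoint="/chat/completions_pro",
        required_keys={"messages"},
        optional_keys={"stream", "temperature", "top_p",
                       "penalty_score", "user_id", "system"},
    ),
}

def resolve_endpoint(model: str) -> str:
    # A typo here ("-" instead of "_") would route every ERNIE-Bot-4
    # request to a nonexistent API path, which is the reported bug.
    return MODEL_INFO[model].endpoint

print(resolve_endpoint("ERNIE-Bot-4"))  # /chat/completions_pro
```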
Who can help?
No response
Information
- [ ] The official example notebooks/scripts
- [ ] My own modified scripts
Related Components
- [x] LLMs/Chat Models
- [ ] Embedding Models
- [ ] Prompts / Prompt Templates / Prompt Selectors
- [ ] Output Parsers
- [ ] Document Loaders
- [ ] Vector Stores / Retrievers
- [ ] Memory
- [ ] Agents / Agent Executors
- [ ] Tools / Toolkits
- [ ] Chains
- [ ] Callbacks/Tracing
- [ ] Async
Reproduction
```python
llm = QianfanLLMEndpoint(streaming=True, temperature=0.5)
llm.model = "ERNIE-Bot-turbo"
llm.model = "ERNIE-Bot-4"
llm.model = "ChatGLM2-6B-32K"
res = llm("hi")
```

This will report an error.
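Note that the three successive `llm.model = ...` assignments overwrite one another, so only the last value is in effect when `llm("hi")` runs; the bug surfaces whenever an affected model such as "ERNIE-Bot-4" is the one selected. A plain-Python illustration of the overwrite behavior (no Qianfan credentials needed; `FakeLLM` is a hypothetical stand-in):

```python
class FakeLLM:
    """Stand-in for QianfanLLMEndpoint; only tracks the model attribute."""
    def __init__(self):
        self.model = None

llm = FakeLLM()
llm.model = "ERNIE-Bot-turbo"
llm.model = "ERNIE-Bot-4"
llm.model = "ChatGLM2-6B-32K"  # each assignment replaces the previous one

print(llm.model)  # ChatGLM2-6B-32K
```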
Expected behavior
The call should complete without error.
Hi, thanks for using Qianfan.
Try upgrading qianfan and retry:

```shell
pip install qianfan --upgrade
```
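Since the fix presumably shipped in a newer qianfan release, it can help to check the installed version before retrying. A hedged sketch using only the standard library (the version-comparison helper is illustrative; it handles simple numeric versions like "0.0.332", and no particular fix version is implied):

```python
from importlib import metadata

def parse_version(v: str) -> tuple:
    # Naive numeric parsing; adequate for "0.0.x"-style version strings.
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def qianfan_is_at_least(minimum: str) -> bool:
    """Return True if an installed qianfan meets the given version."""
    try:
        installed = metadata.version("qianfan")
    except metadata.PackageNotFoundError:
        return False  # qianfan is not installed at all
    return parse_version(installed) >= parse_version(minimum)

# Tuple comparison orders versions numerically, not lexically:
print(parse_version("0.0.332") < parse_version("0.1.0"))  # True
```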
Hi, @mimadiule
I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, you identified a mistake in the file `qianfan\resources\llm\completion.py` which was causing an error when using the model "ERNIE-Bot-4". The issue has received comments from dosubot, danielhjz, and stonekim, with danielhjz suggesting an upgrade to qianfan and providing a command for it. Additionally, stonekim referenced a related pull request.
Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your understanding and cooperation. If you have any further questions or need assistance, feel free to reach out.