Closed: briup1 closed this issue 4 months ago
🤖
Hello,
Thank you for reaching out and providing detailed information about the issue you're facing.
The `QianfanLLMEndpoint` class in LangChain is designed to interact with Baidu Qianfan's hosted open-source or customized models. It requires the `qianfan` Python package to be installed and the environment variables `qianfan_ak` and `qianfan_sk` to be set with your API Key and Secret Key. These keys can be obtained from https://cloud.baidu.com/product/wenxinworkshop.
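For example, the keys can be exported in the process environment before constructing the endpoint. This is a minimal sketch; the uppercase variable names used below are an assumption and may differ by LangChain version, so check the documentation for your release:

```python
import os

# Placeholder values; substitute the API Key and Secret Key obtained from
# the Qianfan console. The uppercase names are an assumption here; some
# versions may read different casing.
os.environ["QIANFAN_AK"] = "your_ak"
os.environ["QIANFAN_SK"] = "your_sk"
```

Setting the variables this way lets you construct `QianfanLLMEndpoint()` without passing the keys as constructor arguments.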
From the error message you provided, it seems the `QianfanLLMEndpoint` class is not receiving the `qianfan_ak` and `qianfan_sk` parameters correctly. The error message "both ak and sk must be provided, otherwise access_token should be provided" indicates that the class received neither the API Key/Secret Key pair nor an access token.
Here is the correct way to use the `QianfanLLMEndpoint` class:

```python
from langchain.llms import QianfanLLMEndpoint

llm = QianfanLLMEndpoint(qianfan_ak="your_ak", qianfan_sk="your_sk", model="ERNIE-Bot-turbo")
res = llm("hi")
print(res)
```
Please replace `"your_ak"` and `"your_sk"` with your actual API Key and Secret Key.
If you're still facing the issue, could you please confirm that you're providing the correct API Key and Secret Key? Also, please make sure the `qianfan` Python package is installed correctly.
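One quick way to confirm the package is importable is a standard-library check like the hypothetical helper below:

```python
import importlib.util

def qianfan_installed() -> bool:
    # True when the qianfan package can be found on the import path.
    return importlib.util.find_spec("qianfan") is not None

print("qianfan installed:", qianfan_installed())
```

If this prints `False`, install the package with `pip install qianfan` before retrying.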
Regarding the changes made to the `QianfanLLMEndpoint` class between LangChain version 0.0.312 and the latest version 0.0.316, I'm sorry, but I couldn't find any specific changes related to this class in the repository. If you're using an older version of LangChain, I recommend updating to the latest version to ensure you have the most recent updates and bug fixes.
I hope this helps! If you have any other questions or need further clarification, please don't hesitate to ask.
Best regards.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Hi, thanks for using Qianfan as your LLM. That is a bug in qianfan 0.0.6; please upgrade qianfan to 0.0.7:

```shell
# for pip
pip install --upgrade qianfan
```
FYI @briup1
ValueError: Argument prompt is expected to be a string. Instead found <class 'list'>. If you want to run the LLM on multiple prompts, use generate instead.
System Info
It was unexpected that I had to provide the access_token when using `QianfanLLMEndpoint`.
Name: langchain Version: 0.0.312
Name: qianfan Version: 0.0.6
Who can help?
No response
Reproduction
error msg:

```
    860 if not isinstance(prompt, str):
    861     raise ValueError(
    862         "Argument `prompt` is expected to be a string. Instead found "
    863         f"{type(prompt)}. If you want to run the LLM on multiple prompts, use "
    864         "`generate` instead."
    865     )
    866 return (
--> 867     self.generate(
    868         [prompt],
    869         stop=stop,
    870         callbacks=callbacks,
    871         tags=tags,
    872         metadata=metadata,
    873         **kwargs,
    874     )
    875     .generations[0][0]
...
    166 )
    167     AuthManager().register(self._ak, self._sk, self._access_token)
    168 else:

InvalidArgumentError: both ak and sk must be provided, otherwise access_token should be provided
```
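The first error in the traceback comes from a plain type check: calling the LLM directly accepts only a string prompt, and lists must go through `generate`. A simplified sketch of that guard (illustrative only, not LangChain's actual source; `call_single` is a hypothetical name):

```python
def call_single(prompt):
    # Single-prompt entry point: rejects anything that is not a string.
    if not isinstance(prompt, str):
        raise ValueError(
            "Argument `prompt` is expected to be a string. Instead found "
            f"{type(prompt)}. If you want to run the LLM on multiple "
            "prompts, use `generate` instead."
        )
    return f"(response to: {prompt})"
```

So a call like `llm(["hi"])` fails with the reported `ValueError`, while `llm("hi")` for one prompt, or `llm.generate(["hi", "hello"])` for several, is the intended usage.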
Expected behavior
Normal operation