Closed liye199033 closed 9 months ago
🤖
Hello,
Thank you for bringing this to our attention. It seems like the `model` parameter is indeed missing from the `_default_params` method in the `baidu_qianfan_endpoint.py` file. This could be causing the issue you're experiencing at line 183 of the `chat_completion.py` file in the `qianfan` module.
To resolve this, you can include the `model` parameter in the `_default_params` method. Here's how you can do it:
```python
@property
def _default_params(self) -> Dict[str, Any]:
    """Get the default parameters for calling OpenAI API."""
    normal_params = {
        "stream": self.streaming,
        "request_timeout": self.request_timeout,
        "top_p": self.top_p,
        "temperature": self.temperature,
        "penalty_score": self.penalty_score,
        "model": self.model,  # Add this line
    }
    return {**normal_params, **self.model_kwargs}
```
This should ensure that the `model` parameter is included in the parameters passed to the `do` or `ado` methods of the `qianfan.ChatCompletion` client, potentially resolving the execution failure at line 183 of the `chat_completion.py` file in the `qianfan` module.
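As a quick, illustrative check (a sketch assuming the patched `_default_params` above; the printed value depends on the endpoint's default `model` field):

```python
from langchain.chat_models import QianfanChatEndpoint

llm = QianfanChatEndpoint(qianfan_ak="xxx", qianfan_sk="xxx")
# After the patch, the model name is part of the default parameters,
# so it is forwarded to qianfan.ChatCompletion.do()/ado().
print(llm._default_params["model"])  # e.g. "ERNIE-Bot-turbo"
```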
As for the error message `erniebot.errors.InvalidParameterError: the length of messages must be an odd number`, it seems the Baidu Qianfan API requires the number of messages to be odd. You might want to check the number of messages you're passing to the `do` method and ensure it is an odd number.
Please try this solution and let us know if it resolves your issue. If not, we'll be happy to assist you further.
Best, Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
I am using the RetrievalQA chain and the odd number error occurs:

```python
llm = QianfanChatEndpoint(qianfan_ak="xxx", qianfan_sk="xxx")
embedding = ErnieEmbeddings(qianfan_ak="xxx", qianfan_sk="xxx")
db = Chroma.from_documents(texts, embedding)
retriever = db.as_retriever()
qa = RetrievalQA.from_chain_type(llm=llm, chain_type="stuff", retriever=retriever)
qa.run("请问事项的名称是什么?")
```
@dosu-bot
🤖
Based on the code snippet you provided, it seems like you're trying to use the `RetrievalQA` chain with the `QianfanChatEndpoint` and `ErnieEmbeddings`. The error message you're encountering is likely due to the way the `QianfanChatEndpoint` interacts with the Baidu Qianfan API.
The `QianfanChatEndpoint` class in the LangChain framework is designed to interact with Baidu's Qianfan chat models. It provides methods to generate responses from the chat models and stream the results if needed. The class requires the `qianfan` Python package, and the API Key (`qianfan_ak`) and Secret Key (`qianfan_sk`) to be set in the environment variables.
The error message you're seeing suggests that the number of messages being sent to the Qianfan API needs to be an odd number. This is likely because the Qianfan API expects a conversation history where each message alternates between the user and the model, starting with the user. With that alternation, an even number of messages means the history ends with a model message, so there is no new user message for the model to respond to.
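For illustration, here is a minimal sketch of an acceptable history using the LangChain message classes (the turns themselves are made up; only the final question comes from the snippet above):

```python
from langchain.schema import HumanMessage, AIMessage

# Alternating human/model turns that end with a human turn: 3 messages, an odd count.
messages = [
    HumanMessage(content="你好"),
    AIMessage(content="你好，请问有什么可以帮您？"),
    HumanMessage(content="请问事项的名称是什么?"),
]
assert len(messages) % 2 == 1  # odd, which is what the ERNIE Bot backend expects
```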
To resolve this issue, you could try ensuring that the conversation history you're passing to the `QianfanChatEndpoint` always has an odd number of messages. If you're using the `RetrievalQA` chain, you might need to modify the `chain_type` or the way you're using the `run` method to ensure this.
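One possible workaround (a sketch, assuming the default chat prompt of the "stuff" chain is what introduces the extra system message) is to pass a plain `PromptTemplate` via `chain_type_kwargs`, so the whole prompt is sent as a single human message and the count stays odd:

```python
from langchain.prompts import PromptTemplate
from langchain.chains import RetrievalQA

# `llm` and `retriever` are assumed to be defined as in the snippet above;
# the template text is illustrative, not the library default.
prompt = PromptTemplate(
    input_variables=["context", "question"],
    template="Use the following context to answer the question.\n{context}\n\nQuestion: {question}",
)

qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=retriever,
    chain_type_kwargs={"prompt": prompt},  # replaces the default two-message chat prompt
)
```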
Unfortunately, without more information about the `ErnieEmbeddings` class and how it interacts with the `QianfanChatEndpoint` class, I can't provide a more specific solution. If you could provide more details about these classes and how you're using them, I might be able to help further.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
Hi, do you still get this problem? I tried to reproduce the issue with the following code (fixing the `ErnieEmbeddings` init field names), and no error was thrown.
```python
from langchain.chat_models import QianfanChatEndpoint
from langchain.embeddings import ErnieEmbeddings
from langchain.vectorstores import Chroma
from langchain.chains import RetrievalQA
from langchain.schema import Document

llm = QianfanChatEndpoint(
    qianfan_ak="your_ak",
    qianfan_sk="your_sk"
)
embedding = ErnieEmbeddings(
    ernie_client_id="your_ak",
    ernie_client_secret="your_sk"
)

texts = ["户政事项地点: shanghai office", "签证事项地点:beijng"]
db = Chroma.from_documents([Document(page_content=text) for text in texts], embedding)
retriever = db.as_retriever()
qa = RetrievalQA.from_chain_type(llm=llm, chain_type="stuff", retriever=retriever)
ans = qa.run("请问办户政的地点是哪里?")
print(ans)
```
CC @liye199033
Hi, @liye199033,
I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, the issue involves a missing model from the _default_params method in the baidu_qianfan_endpoint.py file, leading to a failure in the chat_completion.py file. Dosubot provided a potential solution to include the model parameter in the _default_params method and suggested ensuring an odd number of messages when interacting with the Baidu Qianfan API. Liye199033 mentioned encountering the odd number error while using the RetrievalQA chain and provided code snippets. Danielhjz attempted to reproduce the issue and shared their findings. It seems that there is a potential resolution to the issue.
Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days. Thank you!
System Info
langchain 0.0.295
python 3.11
OS: macOS 12.5
Who can help?
No response
Reproduction
The `model` appears to be missing from the `_default_params` method in the `baidu_qianfan_endpoint.py` file:

```python
@property
def _default_params(self) -> Dict[str, Any]:
    """Get the default parameters for calling OpenAI API."""
    normal_params = {
        "stream": self.streaming,
        "request_timeout": self.request_timeout,
        "top_p": self.top_p,
        "temperature": self.temperature,
        "penalty_score": self.penalty_score,
    }
```
Line 183 of the `chat_completion.py` file in the `qianfan` module failed to be executed:

```python
def do(
    self,
    model: Optional[str] = None,
    endpoint: Optional[str] = None,
    stream: bool = False,
    retry_count: int = 1,
    request_timeout: float = 60,
    backoff_factor: float = 0,
    **kwargs,
) -> QfResponse:
    """
    if model is EB, use EB SDK to deal with the request
    """
    if "messages" in kwargs and isinstance(kwargs["messages"], QfMessages):
        kwargs["messages"] = kwargs["messages"]._to_list()
    if not GLOBAL_CONFIG.DISABLE_EB_SDK:
        if model in ["ERNIE-Bot-turbo", "ERNIE-Bot"]:
            import erniebot  # line 183
```
But even with `model` set, Baidu returns the following error:

```
File "/Users/xxx/Workspace/xxx/xxx/chatbot/venv/lib/python3.11/site-packages/erniebot/backends.py", line 113, in handle_response
    raise errors.InvalidParameterError(emsg)
erniebot.errors.InvalidParameterError: the length of messages must be an odd number
```

because there are two messages in the request.
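For context, here is an illustration (based on the report above, not a dump from the library) of the two messages the "stuff" chain typically hands to the chat model: a system message carrying the retrieved context and a human message carrying the question.

```python
from langchain.schema import SystemMessage, HumanMessage

# Roughly what RetrievalQA's "stuff" chain sends to the chat model.
messages = [
    SystemMessage(content="Use the following pieces of context ... <retrieved documents>"),
    HumanMessage(content="请问事项的名称是什么?"),
]
assert len(messages) % 2 == 0  # even, so erniebot raises InvalidParameterError
```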
Expected behavior
Able to return correct results.