Closed pradeepdev-1995 closed 9 months ago
Hello,
Thank you for reaching out and providing detailed information about your issue. The error message `TypeError: Got unknown type F` suggests that the function `_convert_message_to_dict` is receiving an argument of a type it doesn't recognize. This function is designed to handle instances of `ChatMessage`, `HumanMessage`, `AIMessage`, `SystemMessage`, and `FunctionMessage`. If it receives an instance of any other type, it raises a `ValueError`.

Based on the code snippet you provided, it seems like you're trying to use the `AzureChatOpenAI` class to interact with a deployed model on Azure OpenAI. However, the error might be occurring because the message you're passing to the `AzureChatOpenAI` instance is not an instance of the expected types.

Here is the relevant code snippet from the `AzureChatOpenAI` class:
```python
def _convert_message_to_dict(message: BaseMessage) -> dict:
    if isinstance(message, ChatMessage):
        message_dict = {"role": message.role, "content": message.content}
    elif isinstance(message, HumanMessage):
        message_dict = {"role": "user", "content": message.content}
    elif isinstance(message, AIMessage):
        message_dict = {"role": "assistant", "content": message.content}
    elif isinstance(message, SystemMessage):
        message_dict = {"role": "system", "content": message.content}
    elif isinstance(message, FunctionMessage):
        message_dict = {
            "role": "function",
            "name": message.name,
            "content": message.content,
        }
    else:
        raise ValueError(f"Got unknown type {message}")
    if "name" in message.additional_kwargs:
        message_dict["name"] = message.additional_kwargs["name"]
    return message_dict
```
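To see why a plain string triggers this error, here is a minimal, self-contained sketch using stand-in dataclasses (not the real LangChain classes): a string matches none of the `isinstance` branches, so it falls straight through to the `ValueError`.

```python
from dataclasses import dataclass, field

# Stand-in message classes for illustration only; the real ones live in
# langchain.schema and carry more fields.
@dataclass
class BaseMessage:
    content: str
    additional_kwargs: dict = field(default_factory=dict)

@dataclass
class HumanMessage(BaseMessage):
    pass

@dataclass
class AIMessage(BaseMessage):
    pass

def convert_message_to_dict(message: BaseMessage) -> dict:
    # Same dispatch pattern as the snippet above, trimmed to two branches.
    if isinstance(message, HumanMessage):
        message_dict = {"role": "user", "content": message.content}
    elif isinstance(message, AIMessage):
        message_dict = {"role": "assistant", "content": message.content}
    else:
        # A plain str reaches this branch: it is not a message instance.
        raise ValueError(f"Got unknown type {message}")
    if "name" in message.additional_kwargs:
        message_dict["name"] = message.additional_kwargs["name"]
    return message_dict

print(convert_message_to_dict(HumanMessage(content="Father of computer")))
# {'role': 'user', 'content': 'Father of computer'}

try:
    convert_message_to_dict("Father of computer")  # raw string, no branch matches
except ValueError as e:
    print(e)  # Got unknown type Father of computer
```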
In your code, you're passing a string directly to the `AzureChatOpenAI` instance:

```python
result = llm("Father of computer")
```

However, the `AzureChatOpenAI` class expects a `ChatMessage`, `HumanMessage`, `AIMessage`, `SystemMessage`, or `FunctionMessage` instance, not a string. You should create an instance of one of these classes and pass that to the `AzureChatOpenAI` instance instead. Here's an example using `HumanMessage`:

```python
from langchain.schema import HumanMessage

message = HumanMessage(content="Father of computer")
result = llm(message)
```
Please try this modification and let me know if it resolves your issue. If not, please provide more details about the error, such as the full traceback, so I can assist you further.
Best regards.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
@dosu-bot was close. The documentation calls the model using a list of `HumanMessage`s. You should call the `llm` differently:

```python
from langchain.schema import HumanMessage

message = HumanMessage(content="Father of computer")
result = llm([message])
result
```

```
AIMessage(content="The father of computer is considered to be Charles Babbage. He was an English mathematician, philosopher, and inventor who is best known for conceptualizing and designing an early mechanical general-purpose computer called the Analytical Engine in the 1830s. Although the Analytical Engine was never fully constructed during his lifetime, his work laid the foundation for modern computers. Babbage's contributions to the field of computing make him widely regarded as the father of computer science and the father of the computer.", additional_kwargs={}, example=False)
```
or use `llm.predict`:

```python
result2 = llm.predict("Father of computer")
result2
```

```
"The father of computer is considered to be Charles Babbage. He is known for his work on the Analytical Engine, an early mechanical computer concept. Although the Analytical Engine was never built during Babbage's lifetime, his designs and ideas laid the foundation for modern computing."
```
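The difference between the two calling styles can be sketched with a hypothetical stub (the class name and canned reply below are illustrative, not the real `AzureChatOpenAI`): the chat-style call takes a list of messages and returns an `AIMessage`, while `predict` takes and returns plain strings.

```python
from dataclasses import dataclass

@dataclass
class AIMessage:
    content: str

class StubChatModel:
    """Hypothetical stand-in for a LangChain chat model, for illustration only."""

    def __call__(self, messages: list) -> AIMessage:
        # A real model would send `messages` to the API; we return a canned reply.
        return AIMessage(content="Charles Babbage")

    def predict(self, text: str) -> str:
        # predict() wraps the chat-style call and unwraps .content for you.
        return self([text]).content

llm = StubChatModel()
result = llm(["Father of computer"])
print(type(result).__name__)          # AIMessage
print(result.content)                 # Charles Babbage
print(llm.predict("Father of computer"))  # Charles Babbage
```

So with the real class, `llm([message]).content` and `llm.predict("...")` should give you the same text.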
@joshuasundance-swca The above solution is working. Thank you.
System Info
langchain 0.0.291, Python 3.9
Who can help?
No response
Expected behavior
Expecting the answer