langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

BaseMessagePromptTemplate conversion to message NotImplementedError #22115

Closed MPuust closed 5 months ago

MPuust commented 5 months ago

Example Code

from langchain_core.messages import SystemMessage
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder, HumanMessagePromptTemplate
from langchain_openai import ChatOpenAI
from pydantic import Field, BaseModel

llm = ChatOpenAI(
    openai_api_key='',
    model_name="gpt-4o",
    response_format={"type": "json_object"},
)

template = """
{format_instructions}
---
Type of jokes that entertain today's crowd: {type}
"""

class Response(BaseModel):
    best_joke: str = Field(description="best joke you've heard")
    worst_joke: str = Field(description="worst joke you've heard")

input_variables = {"type": "dad"}
parser = PydanticOutputParser(pydantic_object=Response)
system_message = SystemMessage(content="You are a comedian that has to perform two jokes.")
human_message = HumanMessagePromptTemplate.from_template(template=template,
                                                         input_variables=list(input_variables.keys()),
                                                         partial_variables={
                                                             "format_instructions": parser.get_format_instructions()})
chat_prompt = ChatPromptTemplate.from_messages([system_message, MessagesPlaceholder(variable_name="messages")])
chain = chat_prompt | llm | parser

print(chain.invoke({"messages": [human_message]}))

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/Library/Application Support/JetBrains/PyCharm2024.1/scratches/scratch_2.py", line 37, in <module>
    print(chain.invoke({"messages": [human_message]}))
  File "/venv/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 2393, in invoke
    input = step.invoke(
  File "/venv/lib/python3.9/site-packages/langchain_core/prompts/base.py", line 128, in invoke
    return self._call_with_config(
  File "/venv/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 1503, in _call_with_config
    context.run(
  File "/venv/lib/python3.9/site-packages/langchain_core/runnables/config.py", line 346, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
  File "/venv/lib/python3.9/site-packages/langchain_core/prompts/base.py", line 112, in _format_prompt_with_error_handling
    return self.format_prompt(**_inner_input)
  File "/venv/lib/python3.9/site-packages/langchain_core/prompts/chat.py", line 665, in format_prompt
    messages = self.format_messages(**kwargs)
  File "/venv/lib/python3.9/site-packages/langchain_core/prompts/chat.py", line 1008, in format_messages
    message = message_template.format_messages(**kwargs)
  File "/venv/lib/python3.9/site-packages/langchain_core/prompts/chat.py", line 200, in format_messages
    return convert_to_messages(value)
  File "/venv/lib/python3.9/site-packages/langchain_core/messages/utils.py", line 244, in convert_to_messages
    return [_convert_to_message(m) for m in messages]
  File "/venv/lib/python3.9/site-packages/langchain_core/messages/utils.py", line 244, in <listcomp>
    return [_convert_to_message(m) for m in messages]
  File "/venv/lib/python3.9/site-packages/langchain_core/messages/utils.py", line 228, in _convert_to_message
    raise NotImplementedError(f"Unsupported message type: {type(message)}")
NotImplementedError: Unsupported message type: <class 'langchain_core.prompts.chat.HumanMessagePromptTemplate'>

Description

Converting a message prompt template to a message raises NotImplementedError, even though the docstring for `_convert_to_message` lists BaseMessagePromptTemplate as a supported format.

langchain_core/messages/utils.py, line 186:

def _convert_to_message(
    message: MessageLikeRepresentation,
) -> BaseMessage:
    """Instantiate a message from a variety of message formats.

    The message format can be one of the following:

    - BaseMessagePromptTemplate
    - BaseMessage
    - 2-tuple of (role string, template); e.g., ("human", "{user_input}")
    - dict: a message dict with role and content keys
    - string: shorthand for ("human", template); e.g., "{user_input}"

    Args:
        message: a representation of a message in one of the supported formats

    Returns:
        an instance of a message or a message template
    """
    if isinstance(message, BaseMessage):
        _message = message
    elif isinstance(message, str):
        _message = _create_message_from_message_type("human", message)
    elif isinstance(message, Sequence) and len(message) == 2:
        # mypy doesn't realise this can't be a string given the previous branch
        message_type_str, template = message  # type: ignore[misc]
        _message = _create_message_from_message_type(message_type_str, template)
    elif isinstance(message, dict):
        msg_kwargs = message.copy()
        try:
            try:
                msg_type = msg_kwargs.pop("role")
            except KeyError:
                msg_type = msg_kwargs.pop("type")
            msg_content = msg_kwargs.pop("content")
        except KeyError:
            raise ValueError(
                f"Message dict must contain 'role' and 'content' keys, got {message}"
            )
        _message = _create_message_from_message_type(
            msg_type, msg_content, **msg_kwargs
        )
    else:
        raise NotImplementedError(f"Unsupported message type: {type(message)}")

    return _message

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.4.0: Wed Feb 21 21:44:54 PST 2024; root:xnu-10063.101.15~2/RELEASE_ARM64_T6031
Python Version: 3.9.6 (default, Feb 3 2024, 15:58:27) [Clang 15.0.0 (clang-1500.3.9.4)]

Package Information

langchain_core: 0.2.1
langchain: 0.2.1
langsmith: 0.1.56
langchain_openai: 0.1.7
langchain_text_splitters: 0.2.0

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph langserve

keenborder786 commented 5 months ago

You don't need MessagesPlaceholder here; correct the code as follows:

from langchain_core.messages import SystemMessage
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder, HumanMessagePromptTemplate
from langchain_openai import ChatOpenAI
from pydantic import Field, BaseModel

llm = ChatOpenAI(
    openai_api_key='xxxxxxxxx',
    response_format={"type": "json_object"},
)

template = """
{format_instructions}
---
Type of jokes that entertain today's crowd: {type}
"""

class Response(BaseModel):
    best_joke: str = Field(description="best joke you've heard")
    worst_joke: str = Field(description="worst joke you've heard")

input_variables = {"type": "dad"}
parser = PydanticOutputParser(pydantic_object=Response)
system_message = SystemMessage(content="You are a comedian that has to perform two jokes.")
human_message = HumanMessagePromptTemplate.from_template(template=template)
chat_prompt = ChatPromptTemplate.from_messages([system_message, human_message]) # pass the human message template
chain = chat_prompt | llm | parser
print(chain.invoke({'format_instructions': parser.get_format_instructions(),
                    'type': 'dad'}))  # pass the inputs to the human message template

MPuust commented 5 months ago

That does it, thanks