Closed: MPuust closed this issue 5 months ago
You don't need to use MessagesPlaceholder,
but you should correct the code as follows:
from langchain_core.messages import SystemMessage
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import ChatPromptTemplate, HumanMessagePromptTemplate
from langchain_openai import ChatOpenAI
from pydantic import Field, BaseModel

llm = ChatOpenAI(
    openai_api_key='xxxxxxxxx',
    response_format={"type": "json_object"},
)

template = """
{format_instructions}
---
Type of jokes that entertain today's crowd: {type}
"""

class Response(BaseModel):
    best_joke: str = Field(description="best joke you've heard")
    worst_joke: str = Field(description="worst joke you've heard")

parser = PydanticOutputParser(pydantic_object=Response)

system_message = SystemMessage(content="You are a comedian that has to perform two jokes.")
human_message = HumanMessagePromptTemplate.from_template(template=template)
chat_prompt = ChatPromptTemplate.from_messages([system_message, human_message])  # pass the human message template

chain = chat_prompt | llm | parser
print(chain.invoke({'format_instructions': parser.get_format_instructions(),
                    'type': 'dad'}))  # pass the input to the human message template
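To make the prompt → LLM → parser flow above concrete without needing an API key, here is a stdlib-only sketch of what the chain does: the template is filled with format instructions and input variables, and the parser turns the model's JSON reply into a typed object. All names here (`build_prompt`, `parse_response`, the fake model output) are hypothetical stand-ins, not LangChain APIs.

```python
import json
from dataclasses import dataclass

# Stand-in for the pydantic Response model above.
@dataclass
class Response:
    best_joke: str
    worst_joke: str

# Stand-in for parser.get_format_instructions().
FORMAT_INSTRUCTIONS = (
    'Reply with a JSON object with keys "best_joke" and "worst_joke".'
)

template = """{format_instructions}
---
Type of jokes that entertain today's crowd: {type}"""

def build_prompt(joke_type: str) -> str:
    # Equivalent of chat_prompt filling in {format_instructions} and {type}.
    return template.format(format_instructions=FORMAT_INSTRUCTIONS, type=joke_type)

def parse_response(raw: str) -> Response:
    # Equivalent of the output parser: JSON string -> typed object.
    data = json.loads(raw)
    return Response(best_joke=data["best_joke"], worst_joke=data["worst_joke"])

prompt = build_prompt("dad")

# Pretend the model replied with valid JSON (as response_format json_object enforces):
fake_llm_output = (
    '{"best_joke": "I used to be a banker, but I lost interest.",'
    ' "worst_joke": "Why did the chicken cross the road?"}'
)
result = parse_response(fake_llm_output)
print(result.best_joke)
```

The point is that the chain only works if every placeholder in the template receives a value at invoke time, which is why `format_instructions` is passed alongside `type`.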
That does it, thanks!
Description
MessagePromptTemplate conversion to a message is not implemented, although the docstring says it is.
See langchain_core/messages/utils.py, line 186.