langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Pydantic output parser not working with Gemma on Fireworks AI #23754

Open · gopi-tookitaki opened this issue 3 months ago

gopi-tookitaki commented 3 months ago

Checked other resources

Example Code

```python
from langchain_core.exceptions import OutputParserException
from langchain_core.output_parsers import PydanticOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_fireworks import ChatFireworks

# model_name, pydantic (the target model class), and input are defined elsewhere
model = ChatFireworks(model=model_name)
parser = PydanticOutputParser(pydantic_object=pydantic)

prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer the user query. Wrap the output in json tags\n{format_instructions}"),
    ("human", "{query}"),
]).partial(format_instructions=parser.get_format_instructions())

chain = prompt | model | parser

try:
    output = chain.invoke({"query": input})
except (OutputParserException, InvalidRequestError) as e:  # InvalidRequestError comes from the underlying client SDK
    output = f"An error occurred: {e}"
```

Error Message and Stack Trace (if applicable)

No response

Description

An error occurred: {'error': {'object': 'error', 'type': 'invalid_request_error', 'message': 'jinja template rendering failed. System role not supported'}}

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.5.0: Wed May 1 20:12:58 PDT 2024; root:xnu-10063.121.3~5/RELEASE_ARM64_T6000
Python Version: 3.12.4 (main, Jun 21 2024, 11:46:08) [Clang 15.0.0 (clang-1500.3.9.4)]

Package Information

langchain_core: 0.2.9
langchain: 0.2.5
langchain_community: 0.2.5
langsmith: 0.1.81
langchain_fireworks: 0.1.3
langchain_openai: 0.1.8
langchain_text_splitters: 0.2.1

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

eyurtsev commented 3 months ago

@gopi-tookitaki The chat model does not appear to handle the system message automatically.

As a workaround, update the template so that the instructions are part of the human message, for example as sketched below.
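A minimal sketch of that workaround, reusing `model`, `parser`, and `input` from the snippet above and folding the format instructions into the human message so that no system role is sent to the Gemma chat template:

```python
# Workaround sketch: drop the system message; the instructions live in the human turn instead
prompt = ChatPromptTemplate.from_messages([
    (
        "human",
        "Answer the user query. Wrap the output in json tags\n"
        "{format_instructions}\n\n{query}",
    ),
]).partial(format_instructions=parser.get_format_instructions())

chain = prompt | model | parser
output = chain.invoke({"query": input})  # should no longer hit the "System role not supported" error
```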

gopi-tookitaki commented 3 months ago

Is this an issue with LangChain or with Fireworks, @eyurtsev?