langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Bug Report: Issue with LangChain and OpenAI Template Handling (vLLM) #19857

Closed lavrenalex closed 3 months ago

lavrenalex commented 3 months ago

Checked other resources

Example Code

from langchain_openai import ChatOpenAI
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

endpoint_url = "http://url/v1"
chat = ChatOpenAI(
    model="mistral-medium",
    openai_api_key="EMPTY",
    openai_api_base=endpoint_url,
    max_tokens=5,
    temperature=0,
)

template = (
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

chat_prompt = ChatPromptTemplate.from_messages(
    [system_message_prompt, human_message_prompt]
)

results = chat.invoke(
    chat_prompt.format_prompt(
        input_language="English", output_language="Italian", text="I love programming."
    ).to_messages()
)
print(results)
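For clarity, here is a minimal sketch in plain Python (no LangChain) of what the prompt code above produces: the `{...}` placeholders are filled client-side, and the server only ever receives ordinary role/content messages. The helper name `format_chat_prompt` is illustrative, not a LangChain API.

```python
# Sketch of what ChatPromptTemplate.format_prompt(...).to_messages() yields,
# expressed with plain str.format: template filling happens on the client,
# before anything is sent to the OpenAI-compatible endpoint.
system_template = (
    "You are a helpful assistant that translates {input_language} to {output_language}."
)
human_template = "{text}"

def format_chat_prompt(input_language, output_language, text):
    """Fill both templates and return OpenAI-style chat messages."""
    return [
        {
            "role": "system",
            "content": system_template.format(
                input_language=input_language, output_language=output_language
            ),
        },
        {"role": "user", "content": human_template.format(text=text)},
    ]

messages = format_chat_prompt("English", "Italian", "I love programming.")
```

This makes the point relevant to the bug: since the substitution is done before the HTTP request, the server-side "template not found" error refers to a different template (the model's chat template), not these prompt templates.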

Error Message and Stack Trace (if applicable)

openai.UnprocessableEntityError: Error code: 422 - {'error': 'Template error: template not found', 'error_type': 'template_error'}

Description

I'm encountering an issue with LangChain's OpenAI Chat integration, specifically when trying to use custom templates for dynamic chat prompts. The expected behavior is to seamlessly generate chat prompts using the specified system and human message templates, and then perform a chat session using these prompts. However, the current behavior results in an openai.UnprocessableEntityError with a template error indicating that the template was not found.

This issue arises despite following the documented approach for creating and using ChatPromptTemplate, SystemMessagePromptTemplate, and HumanMessagePromptTemplate within LangChain's framework. The error suggests there's either a problem with the template processing/handling in LangChain or a misconfiguration in the OpenAI chat integration setup.

Even a direct cURL request to the endpoint, bypassing LangChain entirely, returns the same missing-template error.

cURL Testing

curl http://url/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
        "model": "mistralai/Mistral-7B-Instruct-v0.2",
        "messages": [
            {
                "role": "system",
                "content": "When user gives a number, simply respond with the double and say nothing else."
            },
            {
                "role": "user",
                "content": "100"
            }
        ]
    }'
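The same request can be built with the Python standard library, which makes the payload easy to inspect without a live server. The base URL below is the report's placeholder endpoint, and sending is left to the caller.

```python
# Build the chat/completions request from the cURL example above using
# only the standard library; the Request object can be inspected (URL,
# headers, body) without actually contacting a server.
import json
import urllib.request

def build_request(base_url):
    payload = {
        "model": "mistralai/Mistral-7B-Instruct-v0.2",
        "messages": [
            {
                "role": "system",
                "content": "When user gives a number, simply respond with the double and say nothing else.",
            },
            {"role": "user", "content": "100"},
        ],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("http://url/v1")  # placeholder endpoint from the report
# urllib.request.urlopen(req) would send it to a running vLLM/TGI server.
```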

System Info

System Information

OS: Linux
OS Version: #224-Ubuntu SMP Mon Jun 19 13:30:12 UTC 2023
Python Version: 3.10.0 (default, Jan 10 2024, 16:43:55) [GCC 7.5.0]

Package Information

langchain_core: 0.1.37
langchain: 0.1.14
langchain_community: 0.0.30
langsmith: 0.1.38
langchain_mistralai: 0.1.0
langchain_openai: 0.1.1
langchain_text_splitters: 0.0.1

Plutone11011 commented 3 months ago

With gpt-3.5-turbo I get no such error. Since you are using a Mistral model, have you tried the ChatMistralAI chat model instead of ChatOpenAI? An example notebook is provided here

lavrenalex commented 3 months ago

I tried it as well:

    from langchain_core.messages import HumanMessage
    from langchain_mistralai import ChatMistralAI

    chat = ChatMistralAI(mistral_api_key=token, endpoint=url)
    messages = [HumanMessage(content="knock knock")]
    results = chat.invoke(messages)

Response: Status code: 422, Response: {"error":"Template error: template not found","error_type":"template_error"}

In addition to that, I've tried the vLLM as well:

    from langchain_core.messages import HumanMessage, SystemMessage
    from langchain_openai import ChatOpenAI

    chat = ChatOpenAI(
        model=model,
        openai_api_key="EMPTY",
        openai_api_base=url,
        max_tokens=5,
        temperature=0,
    )
    messages = [
        SystemMessage(
            content="You are a helpful assistant that translates English to Italian."
        ),
        HumanMessage(
            content="Translate the following sentence from English to Italian: I love programming."
        ),
    ]
    results = chat.invoke(messages)

Same error:

{'error': 'Template error: template not found', 'error_type': 'template_error'}

lavrenalex commented 3 months ago

Okay, I've resolved this issue. For anyone encountering the same problem: the root cause was on the serving side, not in LangChain. The TGI version I was running could not resolve the model's chat template. Downloading a fresh copy of the model from the Mistral website fixed the error.
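For context, TGI applies the model's chat template when it receives role/content messages, and current Mistral releases ship that template in the model's tokenizer_config.json. Under that assumption, a quick hypothetical check of a local model download looks like this (the helper and file layout are illustrative, not a TGI API):

```python
# Hypothetical diagnostic: if the served model's tokenizer_config.json
# has no "chat_template" entry, TGI has nothing to render chat messages
# with, which matches the "template not found" error in this thread.
# This helper only checks whether a local model directory carries one.
import json
from pathlib import Path

def has_chat_template(model_dir):
    """Return True if model_dir/tokenizer_config.json defines a chat template."""
    config_path = Path(model_dir) / "tokenizer_config.json"
    if not config_path.is_file():
        return False
    config = json.loads(config_path.read_text())
    return bool(config.get("chat_template"))
```

If this returns False for the directory TGI is serving, re-downloading the model (as the resolution above suggests) is the likely fix.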