langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Issue: document_variable_name context was not found in llm_chain input_variables #15314

Closed · summer1704 closed this issue 6 months ago

summer1704 commented 9 months ago

Issue you'd like to raise.

I am trying to add a specific prompt template to my ConversationalRetrievalChain. This is my current code:

PROMPT_TEMPLATE = """ Act as the policies interactive Bot that gives advice on the Company policies, Travel policies, and Information security policies for the company. Do not try to make up an answer. Use only the given pieces of context; do not use your own knowledge. Chat History: {chat_history} Follow Up Input: {question} """ qa_prompt = PromptTemplate(input_variables=["chat_history", "question",], template=PROMPT_TEMPLATE)

chat = ChatOpenAI(
    verbose=True,
    model_name=MODEl_NAME,
    temperature=TEMPERATURE,
    max_retries=MAX_RETRIES,
    streaming=True,
)

qa_chain = ConversationalRetrievalChain.from_llm(
    llm=chat,
    retriever=MyVectorStoreRetriever(
        vectorstore=vectordb,
        search_type="similarity_score_threshold",
        search_kwargs={"score_threshold": SIMILARITY_THRESHOLD, "k": 1},
    ),
    return_source_documents=True,
    combine_docs_chain_kwargs={"prompt": qa_prompt},
)

response = qa_chain(
    {
        "question": query,
        "chat_history": chat_history,
    },
    callbacks=[stream_handler],
)

This is the error I'm currently getting:

qa_chain = ConversationalRetrievalChain.from_llm(
  File "/home/sfm/anaconda3/envs/chat_v2/lib/python3.10/site-packages/langchain/chains/conversational_retrieval/base.py", line 360, in from_llm
    doc_chain = load_qa_chain(
  File "/home/sfm/anaconda3/envs/chat_v2/lib/python3.10/site-packages/langchain/chains/question_answering/__init__.py", line 249, in load_qa_chain
    return loader_mapping[chain_type](
  File "/home/sfm/anaconda3/envs/chat_v2/lib/python3.10/site-packages/langchain/chains/question_answering/__init__.py", line 81, in _load_stuff_chain
    return StuffDocumentsChain(
  File "/home/sfm/anaconda3/envs/chat_v2/lib/python3.10/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for StuffDocumentsChain
__root__
  document_variable_name context was not found in llm_chain input_variables: ['chat_history', 'question'] (type=value_error)

Can you help me figure out the error and correct it?

Suggestion:

No response

dosubot[bot] commented 9 months ago

🤖

Hello @summer1704! 👋

I'm Dosu, an AI here to lend a hand with bugs, questions, and even guide you on your contributor journey while we wait for a human maintainer. Just think of me as your friendly neighborhood bot. 🤖 Let's dive into this issue together, shall we? 🤿

Based on the error message you're encountering, the prompt you pass via combine_docs_chain_kwargs is handed to a StuffDocumentsChain internally, and that chain validates that its prompt declares a variable matching document_variable_name, which defaults to "context".

Your qa_prompt only declares ['chat_history', 'question'], so the validation fails: the chain has nowhere to insert the retrieved documents.

To resolve this error, add "context" to the input_variables of your prompt and reference {context} somewhere in the template text. The retrieved documents are formatted and injected through that placeholder.

For example, if you're creating a ChatPromptTemplate like this:

ChatPromptTemplate(
    input_variables=["foo", "bar"],
    messages=create_messages(),
)

and your chain uses the default document_variable_name, you need to include "context" in the input_variables list (and use it in one of your messages) like this:

ChatPromptTemplate(
    input_variables=["foo", "bar", "context"],
    messages=create_messages(),
)

This will ensure that the document variable is recognized as a valid input when your prompts are formatted. You can refer to test_chat.py for more examples of how to use ChatPromptTemplate.
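If you would rather use a different placeholder name than "context", note that ConversationalRetrievalChain.from_llm forwards combine_docs_chain_kwargs to load_qa_chain, whose "stuff" loader accepts a document_variable_name argument in langchain versions of this vintage. A minimal sketch, assuming an existing vectordb; the "policy_docs" name is purely illustrative:

from langchain.chains import ConversationalRetrievalChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

# Hypothetical template that receives the retrieved documents via {policy_docs}
custom_prompt = PromptTemplate(
    input_variables=["policy_docs", "chat_history", "question"],
    template=(
        "Answer using only these policy excerpts:\n{policy_docs}\n\n"
        "Chat History:\n{chat_history}\n\n"
        "Follow Up Input: {question}"
    ),
)

qa_chain = ConversationalRetrievalChain.from_llm(
    llm=ChatOpenAI(temperature=0),
    retriever=vectordb.as_retriever(),
    combine_docs_chain_kwargs={
        "prompt": custom_prompt,
        # Tell StuffDocumentsChain which prompt variable receives the documents
        "document_variable_name": "policy_docs",
    },
)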

As for adding a specific prompt template to your ConversationalRetrievalChain, a similar problem was resolved by using combine_docs_chain_kwargs={'prompt': qa_prompt} as mentioned in this issue. You are already doing that; the remaining step is adding the {context} variable described above.
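Putting this together with your snippet, a corrected version would look roughly like the sketch below. Only the template and its input_variables change; chat, MyVectorStoreRetriever, vectordb, and SIMILARITY_THRESHOLD come from your own code:

PROMPT_TEMPLATE = """Act as the policies interactive Bot that gives advice on the Company policies, Travel policies, and Information security policies for the company. Do not try to make up an answer. Use only the given pieces of context; do not use your own knowledge.

Context:
{context}

Chat History:
{chat_history}

Follow Up Input: {question}
"""

# "context" must be declared here because StuffDocumentsChain's
# document_variable_name defaults to "context"
qa_prompt = PromptTemplate(
    input_variables=["context", "chat_history", "question"],
    template=PROMPT_TEMPLATE,
)

qa_chain = ConversationalRetrievalChain.from_llm(
    llm=chat,
    retriever=MyVectorStoreRetriever(
        vectorstore=vectordb,
        search_type="similarity_score_threshold",
        search_kwargs={"score_threshold": SIMILARITY_THRESHOLD, "k": 1},
    ),
    return_source_documents=True,
    combine_docs_chain_kwargs={"prompt": qa_prompt},
)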

Please try these suggestions and let me know if you still encounter any issues.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.