langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

'AnthropicLLM' object has no attribute 'prompt' | StuffDocumentChain #19703

Closed umair313 closed 1 month ago

umair313 commented 6 months ago

Checked other resources

Example Code

from langchain.chains import StuffDocumentsChain
from langchain_anthropic import AnthropicLLM
from langchain_community.chat_models import ChatAnthropic
from langchain.prompts import PromptTemplate

prompt_template = """\

Summarize the given text
"{text}"
output:"""
prompt = PromptTemplate.from_template(prompt_template)

# Define LLM
llm = ChatAnthropic(temperature=0, model="claude-3-sonnet-20240229", verbose=True)

# llm_chain = prompt | Anthropic(llm=llm, verbose=True)
llm_chain = AnthropicLLM(verbose=True, temperature=0, model="claude-3-sonnet-20240229")

# Define StuffDocumentsChain
stuff_chain = StuffDocumentsChain(llm_chain=llm_chain, document_variable_name="text", verbose=True)

# documents: a list of Document objects prepared earlier
stuff_chain.run(documents)

Error Message and Stack Trace (if applicable)

/Users/naruto/Desktop/personal/ghi/venv/lib/python3.11/site-packages/langchain_anthropic/llms.py:176: UserWarning: This Anthropic LLM is deprecated. Please use from langchain_community.chat_models import ChatAnthropic instead
  warnings.warn(


AttributeError                            Traceback (most recent call last)
Cell In[39], line 26
     23 llm_chain = AnthropicLLM(verbose=True, temperature=0, model="claude-3-sonnet-20240229")
     25 # Define StuffDocumentsChain
---> 26 stuff_chain = StuffDocumentsChain(llm_chain=llm_chain, document_variable_name="text", verbose=True)
     27 #
     28 # stuff_chain.run([main_docs, consumer_feed_back_docs])
     29 # print(stuff_chain.run(performance_json_docs))

File ~/Desktop/personal/ghi/venv/lib/python3.11/site-packages/langchain_core/load/serializable.py:120, in Serializable.__init__(self, **kwargs)
    119 def __init__(self, **kwargs: Any) -> None:
--> 120     super().__init__(**kwargs)
    121     self._lc_kwargs = kwargs

File ~/Desktop/personal/ghi/venv/lib/python3.11/site-packages/pydantic/v1/main.py:339, in BaseModel.__init__(__pydantic_self__, **data)
    333 """
    334 Create a new model by parsing and validating input data from keyword arguments.
    335
    336 Raises ValidationError if the input data cannot be parsed to form a valid model.
    337 """
    338 # Uses something other than self the first arg to allow "self" as a settable attribute
--> 339 values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
    340 if validation_error:
    341     raise validation_error

File ~/Desktop/personal/ghi/venv/lib/python3.11/site-packages/pydantic/v1/main.py:1048, in validate_model(model, input_data, cls)
   1046 for validator in model.__pre_root_validators__:
   1047     try:
-> 1048         input_data = validator(cls, input_data)
   1049     except (ValueError, TypeError, AssertionError) as exc:
   1050         return {}, set(), ValidationError([ErrorWrapper(exc, loc=ROOT_KEY)], cls)

File ~/Desktop/personal/ghi/venv/lib/python3.11/site-packages/langchain/chains/combine_documents/stuff.py:158, in StuffDocumentsChain.get_default_document_variable_name(cls, values)
    150 @root_validator(pre=True)
    151 def get_default_document_variable_name(cls, values: Dict) -> Dict:
    152     """Get default document variable name, if not provided.
    153
    154     If only one variable is present in the llm_chain.prompt,
    155     we can infer that the formatted documents should be passed in
    156     with this variable name.
    157     """
--> 158     llm_chain_variables = values["llm_chain"].prompt.input_variables
    159     if "document_variable_name" not in values:
    160         if len(llm_chain_variables) == 1:

AttributeError: 'AnthropicLLM' object has no attribute 'prompt'

Description

I am trying to use the stuff documents chain to summarize documents with Anthropic models and I am getting the error mentioned above.
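
Looking at the traceback, the root validator in stuff.py reads values["llm_chain"].prompt.input_variables, so StuffDocumentsChain expects llm_chain to be an LLMChain (which exposes a .prompt), not a bare LLM like AnthropicLLM. Below is a minimal sketch of a workaround under that assumption, reusing the prompt and model from the snippet above; treat it as illustrative rather than a confirmed fix (the newer create_stuff_documents_chain helper would be the other option).

from langchain.chains import LLMChain, StuffDocumentsChain
from langchain_community.chat_models import ChatAnthropic
from langchain.prompts import PromptTemplate

prompt = PromptTemplate.from_template('Summarize the given text\n"{text}"\noutput:')
llm = ChatAnthropic(temperature=0, model="claude-3-sonnet-20240229")

# Wrap the model and the prompt in an LLMChain so the validator can read .prompt
llm_chain = LLMChain(llm=llm, prompt=prompt)

stuff_chain = StuffDocumentsChain(llm_chain=llm_chain, document_variable_name="text")
# summary = stuff_chain.run(documents)  # documents: the list of Document objects to summarize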

System Info

python 3.11.6

langchain==0.1.11
langchain-anthropic==0.1.4
langchain-community==0.0.25
langchain-core==0.1.29
langchain-openai==0.0.8
langchain-text-splitters==0.0.1

theVannu commented 4 months ago

I have the same error when using the OpenAI or ChatOpenAI LLM:

from langchain.chains import StuffDocumentsChain
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    temperature=0,
    model="gpt-4o",
    openai_api_key=config["OPENAI_API_KEY"],
    openai_organization=config["OPENAI_ORGANIZATION"]
)

# Map
map_template = """Write a concise summary of the following content:

{content}

Summary:
"""
map_prompt = PromptTemplate.from_template(map_template)
map_chain = map_prompt | llm

# Reduce
reduce_template = """The following is a set of summaries:
{context}
Take these and distill it into a final, consolidated summary of the main themes. 
Helpful Answer:"""

reduce_prompt = PromptTemplate(
    input_variables=["context"],
    template=reduce_template
)

stuff_documents_chain = create_stuff_documents_chain(llm=llm, prompt=reduce_prompt)
combine_documents_chain = StuffDocumentsChain(llm_chain=stuff_documents_chain, document_variable_name="context")

This is my error

File "/home/meta/.local/lib/python3.10/site-packages/langchain/chains/combine_documents/stuff.py", line 158, in get_default_document_variable_name
    llm_chain_variables = values["llm_chain"].prompt.input_variables

I really can't understand the mistake.
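
This looks like the same root cause as the original report: create_stuff_documents_chain returns an LCEL runnable, not an LLMChain, so it has no .prompt attribute for the StuffDocumentsChain validator to inspect. A minimal sketch, assuming the llm and reduce_prompt defined above and a list of Document objects called docs: the helper already stuffs the documents into the prompt, so the extra StuffDocumentsChain wrapper can simply be dropped.

from langchain.chains.combine_documents import create_stuff_documents_chain

# create_stuff_documents_chain already formats the documents into the prompt,
# so there is no need to wrap it in StuffDocumentsChain.
chain = create_stuff_documents_chain(llm=llm, prompt=reduce_prompt)

# The prompt's input variable is "context"; docs is assumed to be a list of Document objects.
summary = chain.invoke({"context": docs})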