langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Custom Agent Class fails with object has no attribute 'is_single_input' #18292

Closed: shoaibpatel4u closed this issue 5 months ago

shoaibpatel4u commented 8 months ago

Example Code

from typing import Optional

from langchain.agents import AgentType
from langchain.agents.agent_toolkits import SparkSQLToolkit, create_spark_sql_agent
from langchain.callbacks.manager import (
    AsyncCallbackManagerForToolRun,
    CallbackManagerForToolRun,
)
from langchain.chat_models import AzureChatOpenAI
from langchain.tools import BaseTool
from langchain.utilities.spark_sql import SparkSQL


class CustomSQLTool(BaseTool):
    name: str = "SQL_TOOL"
    description: str = "useful when you need to answer questions residing on spark tables"
    llm: AzureChatOpenAI = AzureChatOpenAI()
    k: int = 30

    def _run(
        self, query: str, run_manager: Optional[CallbackManagerForToolRun] = None
    ) -> str:
        """Use the tool."""
        spark_sql = SparkSQL(schema="schema_name", include_tables=["table_name"])
        toolkit = SparkSQLToolkit(db=spark_sql, llm=self.llm)
        spark_sql_agent_executor = create_spark_sql_agent(
            llm=self.llm,  # was `chatllm`, the Azure OpenAI GPT model reference; self.llm avoids the undefined global
            toolkit=toolkit,
            verbose=True,
            top_k=self.k,
            agent_type=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
            early_stopping_method="generate",
            agent_executor_kwargs={"handle_parsing_errors": True},
        )
        return spark_sql_agent_executor.run(query)

    async def _arun(
        self, query: str, run_manager: Optional[AsyncCallbackManagerForToolRun] = None
    ) -> str:
        """Use the tool asynchronously."""
        raise NotImplementedError("CustomSQLTool does not support async")
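
For context, the traceback below points at langchain/agents/conversational/base.py, which suggests the pyfunc wrapper builds a conversational agent around this tool inside predict. The wrapper itself is not shown in the issue, so the following is a hypothetical reconstruction (SQLAgentModel and the "query" column name are placeholders):

import mlflow.pyfunc
from langchain.agents import AgentType, initialize_agent


class SQLAgentModel(mlflow.pyfunc.PythonModel):  # hypothetical wrapper name
    def predict(self, context, model_input):
        # Both the registry test and serving pass the query string wrapped
        # in a pandas DataFrame; "query" is an assumed column name.
        query = model_input["query"][0]
        agent = initialize_agent(
            tools=[CustomSQLTool()],
            llm=AzureChatOpenAI(),
            # building a conversational agent is what triggers
            # _validate_tools -> is_single_input in the traceback
            agent=AgentType.CONVERSATIONAL_REACT_DESCRIPTION,
        )
        return agent.run(query)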

Error Message and Stack Trace (if applicable)

[68f78b9qrn] 2024-02-27T20:13:01.+0000 ERROR src.mlflowserving.scoring_server Encountered an unexpected error while evaluating the model. Verify that the input is compatible with the model for inference. Error ''CustomSQLTool' object has no attribute 'is_single_input''
[68f78b9qrn] Traceback (most recent call last):
[68f78b9qrn]   File "/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/src/mlflowserving/scoring_server/__init__.py", line 457, in transformation
[68f78b9qrn]     raw_predictions = model.predict(data, params=params)
[68f78b9qrn]   File "/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/mlflow/pyfunc/__init__.py", line 492, in predict
[68f78b9qrn]     return _predict()
[68f78b9qrn]   File "/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/mlflow/pyfunc/__init__.py", line 478, in _predict
[68f78b9qrn]     return self._predict_fn(data, params=params)
[68f78b9qrn]   File "/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/mlflow/pyfunc/model.py", line 473, in predict
[68f78b9qrn]     return self.python_model.predict(self.context, self._convert_input(model_input))
[68f78b9qrn]   File "<string>", line 10, in predict
[68f78b9qrn]   File "/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/langchain/agents/conversational/base.py", line 109, in from_llm_and_tools
[68f78b9qrn]     cls._validate_tools(tools)
[68f78b9qrn]   File "/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/langchain/agents/conversational/base.py", line 91, in _validate_tools
[68f78b9qrn]     validate_tools_single_input(cls.__name__, tools)
[68f78b9qrn]   File "/opt/conda/envs/mlflow-env/lib/python3.9/site-packages/langchain/agents/utils.py", line 9, in validate_tools_single_input
[68f78b9qrn]     if not tool.is_single_input:
[68f78b9qrn] AttributeError: 'CustomSQLTool' object has no attribute 'is_single_input'
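
For reference, the check that raises here lives in langchain/agents/utils.py, and is_single_input is defined on BaseTool itself. A paraphrased excerpt (approximate, from the 0.0.x source, for illustration only):

from typing import Sequence

from langchain.tools import BaseTool


# langchain/agents/utils.py (paraphrased)
def validate_tools_single_input(class_name: str, tools: Sequence[BaseTool]) -> None:
    """Raise if any tool takes more than one input."""
    for tool in tools:
        if not tool.is_single_input:
            raise ValueError(
                f"{class_name} does not support multi-input tool {tool.name}."
            )


# langchain/tools/base.py (paraphrased excerpt): on BaseTool the attribute is
# a computed @property, not stored state, so a tool instance resolved against
# a langchain version that lacks the property simply won't have it.
class _BaseToolExcerpt:
    @property
    def is_single_input(self) -> bool:
        """Whether the tool only accepts a single input."""
        keys = {k for k in self.args if k != "kwargs"}
        return len(keys) == 1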

Description

I am using langchain==0.0.330 (also tried 0.0.347) and have created a custom SQL tool class to query Spark tables in my Azure Databricks environment. My custom SQL class is a child class of BaseTool. I register the whole thing as an MLflow custom pyfunc model. Registering the model in the model registry succeeds, and when I load and query the registered model it works fine. However, when I deploy it as a model serving endpoint, it fails, stating that my class CustomSQLTool does not have the attribute is_single_input. As far as I understand, is_single_input is provided by the BaseTool class (it is checked during agent tool validation), so I should not need to override it. Both model serving and the model registry test receive the same input: a string wrapped in a pandas DataFrame.
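
Since registry tests pass but serving fails, one thing to rule out is the serving image resolving a different langchain than the one used at logging time. A minimal sketch of pinning the version at log time, assuming the model is logged with mlflow.pyfunc.log_model (the artifact path and the SQLAgentModel wrapper from the sketch above are placeholders):

import mlflow

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        artifact_path="sql_agent",      # placeholder path
        python_model=SQLAgentModel(),   # the custom pyfunc wrapper
        # pin the serving environment to the logging-time langchain version
        pip_requirements=["langchain==0.0.330", "mlflow"],
    )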

System Info

langchain 0.0.330 with Python 3.9 on Databricks Runtime 12.2 ML.

liugddx commented 8 months ago

Upgrade langchain and try again.
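
If you upgrade to langchain>=0.1, note that the core abstractions were split out into separate packages, so the tool's imports change. A minimal sketch of the updated import paths (assuming langchain-core is installed alongside langchain):

# After upgrading (langchain >= 0.1), BaseTool and the callback
# managers live in langchain_core:
from langchain_core.tools import BaseTool
from langchain_core.callbacks import (
    AsyncCallbackManagerForToolRun,
    CallbackManagerForToolRun,
)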