run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: PydanticSelector fails when used with LangChainLLM #14874

Open jordanparker6 opened 1 month ago

jordanparker6 commented 1 month ago

Bug Description

PydanticSelector validation fails because it expects a LlamaIndex OpenAI LLM. An OpenAILike LLM can be provided via a LangChainLLM wrapper, so this validation shouldn't prevent those LLMs from being used with the selector.

Version

^0.10.55

Steps to Reproduce

llm = LangChainLLM(
    ChatOpenAI(
        model=DEFAULT_FAST_MODEL,
        max_tokens=512,
        api_key=config.openai.api_key,
        base_url=config.openai.api_base,
        model_kwargs={
            "extra_body": {
                "customer_identifier": auth.user_id,
                "metadata": {"service": "retrieval"},
            }
        },
    )
)
LLMSingleSelector.from_defaults(llm=llm)

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 1 month ago

To address the issue with PydanticSelector failing when used with LangChainLLM, you need to modify the validation logic to accept LangChainLLM instances. Here's how you can update the PydanticSelector class to support LangChainLLM:

  1. Import the necessary classes.
  2. Update the validation logic to include LangChainLLM.

Here’s an example of how you might modify the validation logic:

from pydantic import BaseModel, validator
from typing import Union
from llama_index.legacy.llms.openai import OpenAI
from llama_index.llms.langchain import LangChainLLM
from langchain.base_language import BaseLanguageModel

class PydanticSelector(BaseModel):
    llm: Union[OpenAI, LangChainLLM, BaseLanguageModel]

    @validator('llm')
    def validate_llm(cls, v):
        if not isinstance(v, (OpenAI, LangChainLLM, BaseLanguageModel)):
            raise ValueError('llm must be an instance of OpenAI, LangChainLLM, or BaseLanguageModel')
        return v

This modification ensures that the PydanticSelector class can recognize and validate instances of LangChainLLM along with OpenAI and other compatible LLMs [1][2][3].

To continue talking to Dosu, mention @dosu.

logan-markewich commented 1 month ago

The LangChain LLM is a generic wrapper; it doesn't know what the underlying LLM is.

Since the LangChainLLM class does not implement function calling, it can't be used as a Pydantic selector. You'll need to use the regular LLM selector, or use the OpenAI LLM from LlamaIndex.
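The distinction above can be sketched in plain Python. This is an illustrative stand-in, not the actual llama_index classes (the names `LLMMetadata`, `is_function_calling_model`, and `choose_selector` here are simplified stand-ins): a selector should dispatch on the model's advertised capability rather than its concrete type, and a generic wrapper like LangChainLLM cannot advertise function calling for its underlying model.

```python
from dataclasses import dataclass, field


@dataclass
class LLMMetadata:
    # Whether the model supports structured (function-calling) output.
    is_function_calling_model: bool = False


@dataclass
class WrapperLLM:
    # A generic wrapper (like LangChainLLM) cannot know the underlying
    # model's capabilities, so function calling defaults to False.
    metadata: LLMMetadata = field(default_factory=LLMMetadata)


@dataclass
class FunctionCallingLLM:
    # A native integration (like LlamaIndex's OpenAI class) can advertise
    # function-calling support directly.
    metadata: LLMMetadata = field(
        default_factory=lambda: LLMMetadata(is_function_calling_model=True)
    )


def choose_selector(llm) -> str:
    """Pick a selector flavor by capability, not by concrete type."""
    if llm.metadata.is_function_calling_model:
        return "pydantic"  # structured output available
    return "llm"  # fall back to prompt-based (regular LLM) selection


print(choose_selector(WrapperLLM()))          # llm
print(choose_selector(FunctionCallingLLM()))  # pydantic
```

This is why widening the isinstance check, as suggested earlier in the thread, wouldn't be enough on its own: the wrapper would pass validation but still lack the function-calling capability the Pydantic selector relies on.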