langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

create_openai_fn_chain throws an error when providing input not related to a function #10397

Closed Coding-Crashkurse closed 8 months ago

Coding-Crashkurse commented 11 months ago

System Info

langchain 0.0.285 on Windows. A reproducible script is attached.

Who can help?

No response

Information

Related Components

Reproduction

from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from langchain.chains.openai_functions import create_openai_fn_chain

from dotenv import load_dotenv, find_dotenv

load_dotenv(find_dotenv())

database = [
    {"name": "Salami", "price": 9.99},
    {"name": "Margherita", "price": 8.99},
    {"name": "Pepperoni", "price": 10.99},
    {"name": "Hawaiian", "price": 11.49},
    {"name": "Veggie Supreme", "price": 10.49},
]

def get_pizza_info(pizza_name: str) -> dict:
    """Retrieve information about a specific pizza from the database.

    Args:
        pizza_name (str): Name of the pizza.

    Returns:
        dict: A dictionary containing the pizza's name and price or a message indicating the pizza wasn't found.
    """
    for pizza in database:
        if pizza["name"] == pizza_name:
            return pizza
    return {"message": f"No pizza found with the name {pizza_name}."}

def add_pizza(pizza_name: str, price: float) -> dict:
    """Add a new pizza to the database.

    Args:
        pizza_name (str): Name of the new pizza.
        price (float): Price of the new pizza.

    Returns:
        dict: A message indicating the result of the addition.
    """
    for pizza in database:
        if pizza["name"] == pizza_name:
            return {"message": f"Pizza {pizza_name} already exists in the database."}

    database.append({"name": pizza_name, "price": price})
    return {"message": f"Pizza {pizza_name} added successfully!"}

llm = ChatOpenAI(model="gpt-3.5-turbo-0613", temperature=0)

template = """You are an AI chatbot having a conversation with a human.

Human: {human_input}
AI: """
prompt = PromptTemplate(input_variables=["human_input"], template=template)

chain = create_openai_fn_chain(
    [get_pizza_info, add_pizza], llm, prompt, verbose=True
)

result1 = chain.run("I want to add the pizza 'Jumbo' for 13.99")
print(result1)
result2 = chain.run("Who are the main characters of the A-Team?")  # <- this call fails
print(result2)

Traceback:

Traceback (most recent call last):
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\output_parsers\openai_functions.py", line 28, in parse_result
    func_call = copy.deepcopy(message.additional_kwargs["function_call"])
KeyError: 'function_call'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\User\Desktop\LangChain\07_OpenAI_Functions\pizza_store.py", line 63, in <module>
    result1 = chain.run("Who are the main characters of the A-Team?")
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 487, in run
    return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 292, in __call__
    raise e
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\base.py", line 286, in __call__
    self._call(inputs, run_manager=run_manager)
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\llm.py", line 92, in _call
    return self.create_outputs(response)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\llm.py", line 220, in create_outputs
    result = [
             ^
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\chains\llm.py", line 223, in <listcomp>
    self.output_key: self.output_parser.parse_result(generation),
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\output_parsers\openai_functions.py", line 49, in parse_result
    function_call_info = super().parse_result(result)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\User\AppData\Local\Programs\Python\Python311\Lib\site-packages\langchain\output_parsers\openai_functions.py", line 30, in parse_result
    raise OutputParserException(f"Could not parse function call: {exc}")
langchain.schema.output_parser.OutputParserException: Could not parse function call: 'function_call'

### Expected behavior

I would expect behaviour similar to using the vanilla API.

```
import openai

def chat(query):
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=[{"role": "user", "content": query}],
        functions=functions,  # this is new
    )
    message = response["choices"][0]["message"]
    return message

chat("What is the capital of France?")
```

If I run a query not related to the functions, the response message may or may not include a "function_call" key. I can handle this as follows:

```
import json

if message.get("function_call"):
    pizza_name = json.loads(message["function_call"]["arguments"]).get("pizza_name")
    print(pizza_name)
    function_response = get_pizza_info(pizza_name=pizza_name)
    print(function_response)
```
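This branching can be shown end to end with stubbed messages. The dicts below are hand-written stand-ins mimicking the two shapes a Chat Completions response message can take (with and without `"function_call"`), not real API output:

```python
import json

# Hypothetical assistant messages, mimicking the two response shapes.
msg_with_call = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_pizza_info",
        "arguments": '{"pizza_name": "Salami"}',
    },
}
msg_plain = {"role": "assistant", "content": "Paris is the capital of France."}

def handle(message):
    """Dispatch on whether the model decided to call a function."""
    if message.get("function_call"):
        args = json.loads(message["function_call"]["arguments"])
        return ("call", message["function_call"]["name"], args)
    return ("text", message["content"])
```

With this pattern, a plain conversational answer simply falls through to the `("text", ...)` branch instead of raising.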

Is there a workaround, does it work as intended, or is this an unknown bug? I would normally just expect it to work without needing a workaround :)
dosubot[bot] commented 11 months ago

🤖

Hello,

Thank you for bringing this issue to our attention. The error you're encountering is due to the way the create_openai_fn_chain function is designed. It expects the input to be related to a function, and when it's not, it throws an error.

The create_openai_fn_chain function uses the convert_to_openai_function function to handle its inputs. This function checks if the input is a dictionary, a pydantic.BaseModel class, or a Python function. If it's not one of these types, it raises a ValueError.

Here is the relevant code:

def convert_to_openai_function(
    function: Union[Dict[str, Any], Type[BaseModel], Callable]
) -> Dict[str, Any]:
    """Convert a raw function/class to an OpenAI function.

    Args:
        function: Either a dictionary, a pydantic.BaseModel class, or a Python function.
            If a dictionary is passed in, it is assumed to already be a valid OpenAI
            function.

    Returns:
        A dict version of the passed in function which is compatible with the
            OpenAI function-calling API.
    """
    if isinstance(function, dict):
        return function
    elif isinstance(function, type) and issubclass(function, BaseModel):
        return cast(Dict, convert_pydantic_to_openai_function(function))
    elif callable(function):
        return convert_python_function_to_openai_function(function)
    else:
        raise ValueError(
            f"Unsupported function type {type(function)}. Functions must be passed in"
            f" as Dict, pydantic.BaseModel, or Callable."
        )
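For orientation, the target of this conversion is an OpenAI function-schema dict. The one below is hand-written for the `get_pizza_info` function from the reproduction script (illustrative, not actual `convert_to_openai_function` output):

```python
# A hand-written example of the OpenAI function-calling schema that a
# Python function like get_pizza_info is converted into. The structure
# follows the OpenAI function-calling API; the description strings are
# illustrative.
pizza_fn_schema = {
    "name": "get_pizza_info",
    "description": "Retrieve information about a specific pizza from the database.",
    "parameters": {
        "type": "object",
        "properties": {
            "pizza_name": {
                "type": "string",
                "description": "Name of the pizza.",
            },
        },
        "required": ["pizza_name"],
    },
}
```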

To work around this issue, you need to ensure that the input you provide to create_openai_fn_chain is one of the supported types: a dictionary, a pydantic.BaseModel class, or a Python function.

However, I understand that you would like to be able to pass in queries not related to the function without encountering an error. This is currently not supported in LangChain, but it's a valid use case. I recommend opening a feature request for this in the LangChain repository. The maintainers can then consider adding this functionality in a future update.

I hope this helps! If you have any other questions, feel free to ask.

Best, Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

Coding-Crashkurse commented 11 months ago

@dosu-bot: I provided the correct types, two functions to be precise:

chain = create_openai_fn_chain([get_pizza_info, add_pizza], llm, prompt, verbose=True)

My scenario is that I want to query backends with the functions, but ONLY when needed. The use case is a chatbot, where end users should be able to ask whatever they want.

My issue is that I cannot use create_openai_fn_chain in a chatbot that way, since the service will crash if the query is unrelated to the function definitions.

dosubot[bot] commented 11 months ago

🤖

Hello,

Thank you for reaching out and providing a detailed explanation of your issue. The error you're encountering is due to the fact that the create_openai_fn_chain function is expecting a function call in the user input. When it doesn't find one, it raises a KeyError: 'function_call' and an OutputParserException: Could not parse function call: 'function_call'.

To handle any user input, not just those related to the function definitions, without the application crashing when the input is unrelated to the functions, you can modify your chain.run calls to include a try-except block. This will allow the program to continue running even if it encounters an error.

Here's an example of how you can do this:

try:
    result1 = chain.run("I want to add the pizza 'Jumbo' for 13.99")
    print(result1)
except Exception as e:
    print(f"An error occurred: {e}")

try:
    result2 = chain.run("Who are the main characters of the A-Team?")
    print(result2)
except Exception as e:
    print(f"An error occurred: {e}")

In this example, each chain.run call is wrapped in a try-except block. If an error occurs when trying to run the chain with the provided user input, the program will print an error message and continue to the next line instead of crashing.

However, this will only prevent the application from crashing. If you want the chatbot to be able to handle any user input and provide a meaningful response, you might need to implement a more sophisticated error handling mechanism or use a different approach to handle user inputs that are not related to the function definitions.
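One shape such error handling could take is a fallback: run the function-calling chain first, and when the output parser raises because no function call was produced, answer with a plain conversational chain instead. The sketch below is fully self-contained: `FakeFnChain`, `FakePlainChain`, and this local `OutputParserException` are illustrative stand-ins, not LangChain APIs.

```python
class OutputParserException(Exception):
    """Stand-in for langchain.schema.output_parser.OutputParserException."""

class FakeFnChain:
    """Stand-in for a create_openai_fn_chain-style chain: returns a parsed
    function call for related queries, raises for unrelated ones."""
    def run(self, query):
        if "pizza" in query.lower():
            return {"name": "add_pizza", "arguments": {"pizza_name": "Jumbo"}}
        raise OutputParserException("Could not parse function call: 'function_call'")

class FakePlainChain:
    """Stand-in for a plain conversational LLMChain."""
    def run(self, query):
        return f"(plain answer to: {query})"

def answer(query, fn_chain, plain_chain):
    """Try the function chain; fall back to a plain answer when no
    function call was produced."""
    try:
        return fn_chain.run(query)
    except OutputParserException:
        # The model produced no function call; answer conversationally.
        return plain_chain.run(query)
```

The same two-chain pattern could be applied to real chains by catching the real `OutputParserException` around `chain.run`, at the cost of a second LLM call for unrelated queries.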

I hope this helps! Let me know if you have any other questions.


dosubot[bot] commented 8 months ago

Hi, @Coding-Crashkurse,

I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, you raised an issue regarding the create_openai_fn_chain function in the langchain library, which was throwing an error when a query not related to the function was provided. I provided an explanation of the error and suggested using a try-except block to handle unrelated user input without crashing the application. You clarified your scenario and expressed the need to use create_openai_fn_chain in a chatbot, but the service crashes if the query is unrelated to the function definitions.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or the issue will be automatically closed in 7 days.

Thank you!