langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License
92.52k stars 14.81k forks

AmadeusClosestAirport tool should accept any LLM #15847

Closed mspronesti closed 8 months ago

mspronesti commented 8 months ago


Description

AmadeusClosestAirport contains a hardcoded call to ChatOpenAI (see here), whereas it would make more sense to use the same llm passed to the chain/agent at initialization.

In addition, this implies that AmadeusToolkit implicitly depends on openai, which should not be the case.

Example (source code from the docs)

from langchain.agents import AgentType, initialize_agent
from langchain.llms import OpenAI
from langchain_community.agent_toolkits.amadeus.toolkit import AmadeusToolkit

# Set environment variables here
import os

os.environ["AMADEUS_CLIENT_ID"] = "CLIENT_ID"
os.environ["AMADEUS_CLIENT_SECRET"] = "CLIENT_SECRET"
os.environ["OPENAI_API_KEY"] = "API_KEY"
# os.environ["AMADEUS_HOSTNAME"] = "production" or "test"

toolkit = AmadeusToolkit()
tools = toolkit.get_tools()

llm = OpenAI(temperature=0)  # this can be any `BaseLLM`
agent = initialize_agent(
    tools=tools,
    llm=llm,
    verbose=False,
    agent=AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION,
)

# ==> the agent calls `ChatOpenAI` regardless of `llm` <==
agent.run("What is the name of the airport in Cali, Colombia?")
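The fix the issue asks for is ordinary dependency injection: the tool should receive the LLM from its caller rather than constructing ChatOpenAI itself. A minimal, self-contained sketch of that pattern (class and method names here are illustrative stand-ins, not the actual langchain_community implementation):

```python
from typing import Callable, Optional

class ClosestAirportTool:
    """Toy stand-in for AmadeusClosestAirport: the LLM is injected, not hardcoded."""

    def __init__(self, llm: Optional[Callable[[str], str]] = None):
        # Accept any callable LLM; nothing OpenAI-specific is imported here.
        self.llm = llm

    def run(self, location: str) -> str:
        if self.llm is None:
            # In the real toolkit, this is where a default (e.g. ChatOpenAI)
            # could be imported lazily, keeping `openai` an optional dependency.
            raise RuntimeError("No LLM provided and no default configured")
        prompt = f"What is the nearest airport to {location}? Answer with the IATA code."
        return self.llm(prompt)

# Any LLM-like callable works -- here a stub standing in for a real model.
fake_llm = lambda prompt: "CLO"
tool = ClosestAirportTool(llm=fake_llm)
print(tool.run("Cali, Colombia"))  # -> CLO
```

Because the tool only sees a callable, swapping OpenAI for any other `BaseLLM` requires no change to the tool itself.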


dosubot[bot] commented 8 months ago

🤖

It's great to see your interest in improving the LangChain framework! Your suggestion to have the AmadeusClosestAirport tool use the same llm passed to the chain/agent at initialization sounds like a valuable enhancement. Your contribution would be highly appreciated. Thank you for considering making LangChain even better!



S-T-A-R-L-O-R-D commented 8 months ago

I would like to work on this. I am thinking of making a class with a default argument that accepts any base LLM (the default could be OpenAI); if the user wants to use another model, they can specify it when initializing the class object.

Alternatively, we could infer the base LLM from the one passed to the chain/agent at initialization.
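The first idea above, an optional constructor argument with a lazy default, can be sketched in a few lines. This is a hypothetical illustration of the pattern, not the merged fix; the names are made up:

```python
from typing import Callable, Optional

def default_llm_factory() -> Callable[[str], str]:
    # Stand-in for lazily constructing ChatOpenAI. Deferring construction to a
    # factory means the openai dependency is only touched when the caller does
    # not supply their own model.
    return lambda prompt: "default-model answer"

class Toolkit:
    """Toy toolkit: uses the caller's LLM if given, otherwise builds a default."""

    def __init__(self, llm: Optional[Callable[[str], str]] = None):
        self.llm = llm if llm is not None else default_llm_factory()

print(Toolkit().llm("hi"))                        # falls back to the default
print(Toolkit(llm=lambda p: "custom").llm("hi"))  # uses the caller's model
```

The second idea (inferring the LLM from the agent) would require the agent framework to pass its model down to tools at bind time, which is a larger change than adding a constructor argument.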

mspronesti commented 8 months ago

Solved in #15879.