Closed: leonardotorresaltez closed this issue 1 year ago
Hi @leonardotorresaltez - were you able to resolve this issue? I am trying to test a locally running plugin with LangChain, and the call
tool = AIPluginTool.from_plugin_url("http://localhost:8001/.well-known/ai-plugin.json")
hangs even though I can see the GET request complete successfully: "GET /ai-plugin.json HTTP/1.1" 200 OK
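In case it helps narrow things down, here is the quick sanity check I run outside LangChain before constructing the tool - just a sketch for my local setup on port 8001. It fetches the manifest and then the OpenAPI spec URL it points to, since I believe from_plugin_url downloads that spec as well, and that second request may be the one hanging:

import requests

# Sanity-check sketch: fetch the plugin manifest directly, then the OpenAPI
# spec URL it references (standard ai-plugin.json fields name_for_model / api.url).
manifest = requests.get("http://localhost:8001/.well-known/ai-plugin.json", timeout=5).json()
print("manifest ok:", manifest.get("name_for_model"))

api_url = manifest.get("api", {}).get("url")
if api_url:
    spec = requests.get(api_url, timeout=5)
    print("spec ok:", spec.status_code)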
Hi, @leonardotorresaltez! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.
From what I understand, you reported an issue regarding the inconsistency in the AIPluginTool with ChatOpenAI. Sometimes it successfully calls the plugin, but other times it returns a response instructing the user to call a URL to get the response. sanzgiri also commented on the issue, mentioning their own similar issue with a locally running plugin.
Before we close this issue, we wanted to check if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.
Thank you for your contribution to the LangChain repository!
System Info
Hello,
When using AIPluginTool with ChatOpenAI, sometimes the chain calls the plugin and sometimes the response is something like "the user can call the url ... to get the response". Why does this happen?
My code:
import os
import openai
from dotenv import load_dotenv, find_dotenv
from langchain.chat_models import ChatOpenAI
from langchain.tools import AIPluginTool
from langchain.agents import load_tools, ConversationalChatAgent, ZeroShotAgent
from langchain.agents.agent import AgentExecutor
from langchain.chains.conversation.memory import ConversationBufferWindowMemory

# The plugin tool plus the requests_get tool the agent needs to actually call the API.
tool = AIPluginTool.from_plugin_url("http://localhost:5003/.well-known/ai-plugin.json")
tools2 = load_tools(["requests_get"])
tools = [tool, tools2[0]]

_ = load_dotenv(find_dotenv())  # read local .env file
openai.api_key = os.getenv('OPENAI_API_KEY')

llm = ChatOpenAI(openai_api_key=os.getenv('OPENAI_API_KEY'), temperature=0, model_name='gpt-3.5-turbo')

prefix = """Have a conversation with a human, answering the following questions as best you can. You have access to the following tools:"""

memory = ConversationBufferWindowMemory(memory_key="chat_history", k=5, return_messages=True)

custom_agent = ConversationalChatAgent.from_llm_and_tools(llm=llm, tools=tools, system_message=prefix)
agent_executor = AgentExecutor.from_agent_and_tools(agent=custom_agent, tools=tools, memory=memory)
agent_executor.verbose = True

print(agent_executor.agent.llm_chain.prompt)

resp = agent_executor.run(input="What are my store orders for userId Leo ?")
print(resp)
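For comparison, calling the plugin endpoint directly outside the agent always returns the expected data (a minimal sketch; /order/Leo is the route my plugin serves, see the expected behavior below):

import requests

# Direct call to the same endpoint the agent is expected to hit via requests_get.
direct = requests.get("http://localhost:5003/order/Leo", timeout=5)
print(direct.status_code, direct.text)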
Who can help?
No response
Information
Related Components
Reproduction
Execute the code two or three times; you will get a different response each time (a small loop like the sketch below makes this easy to see).
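A quick way to reproduce (a sketch that just re-runs the same question using the agent_executor from the code above and prints each answer):

# Re-run the same question several times to observe the inconsistent behaviour.
for i in range(3):
    print(f"--- run {i + 1} ---")
    print(agent_executor.run(input="What are my store orders for userId Leo ?"))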
Expected behavior
The agent should call the plugin and return the response from http://localhost:5003/order/Leo.