RobertHH-IS closed this issue 1 year ago.
Hello, this code works on my machine (using langchain 0.0.211):
```python
import chainlit as cl
from langchain.tools import Tool
from langchain.memory import ConversationBufferMemory
from langchain.chat_models import ChatOpenAI
from langchain.agents import initialize_agent, AgentType


def foo(bar):
    return "foo"


tools = [
    Tool(
        name="Function1",
        func=foo,
        description="Useful for function1.",
        return_direct=True,
    ),
    Tool(
        name="Function2",
        func=foo,
        description="Useful for function2.",
        return_direct=True,
    ),
]


@cl.langchain_factory(use_async=False)
def load():
    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
    llm = ChatOpenAI(temperature=0, max_tokens=2500)
    agent_chain = initialize_agent(
        tools,
        llm,
        agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
        verbose=True,
        memory=memory,
    )
    return agent_chain
```
What version of Chainlit are you using? I am also on langchain 0.0.211.
I am using Chainlit 0.4.1. If it still does not work, you can use `langchain_run` to manually call the agent and pass the missing key in the input dict.
I am deploying a simple agent with a few tools, but whenever I try to run it, it fails with:

```
raise ValueError(f"Missing some input keys: {missing_keys}")
ValueError: Missing some input keys: {'input'}
```

The tools are as follows:

```python
tools = [
    Tool(
        name="Function1",
        func=Function1,
        description="Useful for function1.",
        return_direct=True,
    ),
    Tool(
        name="Function2",
        func=Function2,
        description="Useful for function2.",
        return_direct=True,
    ),
]
```
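For context, this error comes from LangChain's generic input-key validation: the chain wrapping the conversational agent declares a set of required input keys (here `input`, plus the memory key), and raises when the dict it is called with lacks any of them. A minimal sketch of that check, with the validation logic simplified and the key names assumed from the traceback above (the real check lives inside LangChain's `Chain` base class):

```python
# Simplified sketch of LangChain's input-key validation (assumption:
# condensed from the behavior of langchain.chains.base.Chain; not the
# actual implementation).
def call_chain(expected_keys, inputs):
    missing = set(expected_keys) - set(inputs)
    if missing:
        raise ValueError(f"Missing some input keys: {missing}")
    return {"output": "ok"}


# The conversational agent expects an "input" key plus the memory key.
expected = {"input", "chat_history"}

try:
    call_chain(expected, {"chat_history": []})  # "input" is missing
except ValueError as e:
    print(e)

call_chain(expected, {"chat_history": [], "input": "hello"})  # passes
```

Supplying a dict that contains every expected key, e.g. `{"input": ..., "chat_history": ...}`, satisfies the check.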
I start the agent as follows:

```python
@langchain_factory(use_async=False)
def load():
    memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
    llm = ChatOpenAI(model="gpt-4-0613", temperature=0, max_tokens=2500)
    agent_chain = initialize_agent(
        tools,
        llm,
        agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
        verbose=True,
        memory=memory,
    )
    return agent_chain
```
I do not see any guidance in the docs regarding this. Do prompt templates have to include an `{input}` variable?