nlkitai / nlux

The π—£π—Όπ˜„π—²π—Ώπ—³π˜‚π—Ή Conversational AI JavaScript Library πŸ’¬ β€”Β UI for any LLM, supporting LangChain / HuggingFace / Vercel AI, and more 🧑 React, Next.js, and plain JavaScript ⭐️
https://docs.nlkit.com/nlux

Is it compatible with agent_executors? #24

Closed hellseyfer closed 5 months ago

hellseyfer commented 5 months ago

I tried this without success:

from typing import Any, List, Union

from fastapi import FastAPI
from langchain_core.messages import AIMessage, FunctionMessage, HumanMessage
from langchain_core.runnables import Runnable
from langserve import add_routes
from pydantic import BaseModel

app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="Spin up a simple API server using LangChain's Runnable interfaces",
)

# We need to add these input/output schemas because the current AgentExecutor
# is lacking in schemas.
class Input(BaseModel):
    input: str
    chat_history: List[Union[HumanMessage, AIMessage, FunctionMessage]]

class Output(BaseModel):
    output: Any

def add_route(path: str, chain: Runnable):
    add_routes(
        app,
        runnable=chain,
        path=path,
        enabled_endpoints=["invoke", "stream", "input_schema", "output_schema"],
    )

add_route("/test", agent_executor.with_types(input_type=Input, output_type=Output))

I'm using the LangChain adapter in the frontend. The langchain version is 0.1.4.

salmenus commented 5 months ago

Hi @hellseyfer I haven't tried it with agent_executors, but from what I can see in your example, the issue is most likely that your input schema requires 2 attributes: input and chat_history.

The default @nlux/langchain adapter config assumes that the input schema requires only a single attribute: the user prompt.

If you use a more complex schema, you will need to provide an inputPreProcessor in the frontend to transform the user input so it matches your schema. You can do that via the inputPreProcessor config option (documentation here).
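For reference, a minimal sketch of what such a preprocessor could look like. This is a plain standalone function, not the exact @nlux adapter API (the config key names and message shapes below are assumptions based on the Input schema in the original post; check the nlux docs for the real signature):

```typescript
// Hypothetical message shape mirroring the LangServe Input schema above
// (HumanMessage / AIMessage serialized as { type, content }) -- an assumption.
type ChatMessage = { type: "human" | "ai"; content: string };

// The two-attribute payload the agent_executor's Input schema expects.
type AgentInput = { input: string; chat_history: ChatMessage[] };

// Maps the raw user prompt (plus any prior turns) onto the
// { input, chat_history } shape, so the request body matches the schema
// instead of the adapter's default single-string input.
function inputPreProcessor(
  userPrompt: string,
  history: ChatMessage[] = [],
): AgentInput {
  return { input: userPrompt, chat_history: history };
}
```

The idea is simply that the preprocessor runs on every submission and wraps the prompt before the adapter POSTs it to the LangServe invoke/stream endpoint.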

I created an example here that I expect to work with your schema: https://codesandbox.io/p/sandbox/lingering-wind-jzgmyk?file=%2FApp.tsx%3A15%2C6 (you will need to change the URL to match your LangServe server URL).

Give it a try and let me know.

hellseyfer commented 5 months ago

Worked like a charm. Thank you!