tmc / langchaingo

LangChain for Go, the easiest way to write LLM-based programs in Go
https://tmc.github.io/langchaingo/

issue with conversational agent chain: chain with more then one expected input #664

Open devalexandre opened 8 months ago

devalexandre commented 8 months ago

I'm writing some use cases to document the use of langchaingo, but even after looking at the examples some things are unclear. Using a prompt template is one such case: I tested with both llama2 and openai, and the result was the same error.

2024/03/11 23:02:43 run not supported in chain with more then one expected input
exit status 1

code:

package main

import (
    "context"
    "log"

    "github.com/tmc/langchaingo/agents"
    "github.com/tmc/langchaingo/chains"
    "github.com/tmc/langchaingo/llms/ollama"
    "github.com/tmc/langchaingo/prompts"
)

func main() {
    // Connect to a local Ollama server running the llama2 model.
    llm, err := ollama.New(ollama.WithModel("llama2"))
    if err != nil {
        log.Fatal(err)
    }

    // A Go-template prompt whose variables are pre-filled via PartialVariables.
    promptInput := prompts.PromptTemplate{
        Template: "{{.Name}} are 36 years old and live in {{.City}}.",
        PartialVariables: map[string]any{
            "Name": "Bob",
            "City": "New York",
        },
        TemplateFormat: prompts.TemplateFormatGoTemplate,
    }

    // Build a conversational ReAct agent; WithPrompt replaces its default prompt.
    agent, err := agents.Initialize(
        llm,
        nil,
        agents.ConversationalReactDescription,
        agents.WithPrompt(promptInput),
    )
    if err != nil {
        log.Fatal(err)
    }

    // Chain-call options forwarded to the model at run time.
    options := []chains.ChainCallOption{
        chains.WithTemperature(0.8),
    }

    res, err := chains.Run(context.Background(), agent, "What is the name of the person?", options...)
    if err != nil {
        log.Fatal(err)
    }

    log.Println(res)
}
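
For context, chains.Run only works when the chain declares exactly one expected input, which is what the error above is reporting; chains.Call takes an explicit input map and has no such restriction. A minimal sketch of the Call form, assuming the chain's input key is "input" (the chain's GetInputKeys method reports the real names):

    // Sketch: invoke the same agent through chains.Call instead of chains.Run.
    inputs := map[string]any{
        "input": "What is the name of the person?", // assumed input key
    }
    res, err := chains.Call(context.Background(), agent, inputs, options...)
    if err != nil {
        log.Fatal(err)
    }
    log.Println(res)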
tmc commented 7 months ago

Thanks for the report. I'm going to adjust the issue title and point out the issues in the conversational agent chain.

tmc commented 7 months ago

Part of what's happening here is that supplying agents.WithPrompt overwrites the prompt assembly logic in getConversationalPrompt, so the contents sent to the model don't include any of the embedded templates.
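
If the goal is simply to hand the agent some static facts, one workaround sketch (variable names assumed from the code above) is to render the template yourself with PromptTemplate.Format and fold the result into the run input, leaving the agent's default prompt assembly untouched:

    // Sketch: render the template up front instead of passing it via WithPrompt,
    // so getConversationalPrompt can still assemble the default agent prompt.
    facts, err := promptInput.Format(map[string]any{}) // partials supply Name and City
    if err != nil {
        log.Fatal(err)
    }

    agent, err := agents.Initialize(llm, nil, agents.ConversationalReactDescription)
    if err != nil {
        log.Fatal(err)
    }

    res, err := chains.Run(context.Background(), agent,
        facts+" What is the name of the person?", options...)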

devalexandre commented 7 months ago

Is there any solution for this?