tmc / langchaingo

LangChain for Go, the easiest way to write LLM-based programs in Go
https://tmc.github.io/langchaingo/
MIT License

[BUG FIX] fix a bug for error when using openai function agent & add some examples #924

Open diemus opened 1 week ago

diemus commented 1 week ago


The OpenAI functions agent currently has a bug. When executing a tool, it fails with: API returned unexpected status code: 400: Missing parameter 'name': messages with role 'function' must have a 'name'. The cause is that the result of the previous action is stored as a FunctionChatMessage rather than a ToolChatMessage, and the Plan function does not handle FunctionChatMessage. The function result is therefore sent as a regular user message, so OpenAI rejects the request with a 400 error.
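To illustrate the failure mode, here is a minimal, self-contained sketch (the types and the toAPIMessage helper are hypothetical stand-ins, not langchaingo's actual API): when a message-conversion switch has no case for the function-result message type, the result falls through to a plain user message with no "name" field, which is exactly the parameter the 400 error complains about.

```go
package main

import "fmt"

// Hypothetical stand-ins for chat message types, for illustration only.
type ChatMessage interface{ Role() string }

type FunctionChatMessage struct {
	Name    string // tool name; the OpenAI API requires this for role "function"
	Content string
}

func (FunctionChatMessage) Role() string { return "function" }

// toAPIMessage mirrors the kind of switch an agent's Plan step performs
// when converting intermediate steps to API messages. If the
// FunctionChatMessage case were missing, the default branch would emit a
// user message without a "name" field, triggering the 400 error.
func toAPIMessage(m ChatMessage) map[string]string {
	switch msg := m.(type) {
	case FunctionChatMessage:
		// Preserve the role and the required "name" field.
		return map[string]string{"role": "function", "name": msg.Name, "content": msg.Content}
	default:
		// Fallback: degrade to a plain user message (the buggy path).
		return map[string]string{"role": "user", "content": fmt.Sprintf("%v", m)}
	}
}

func main() {
	out := toAPIMessage(FunctionChatMessage{Name: "calculator", Content: "6"})
	fmt.Println(out["role"], out["name"]) // function calculator
}
```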

The issue is very easy to reproduce:

package main

import (
    "context"
    "fmt"
    "os"

    "github.com/tmc/langchaingo/agents"
    "github.com/tmc/langchaingo/callbacks"
    "github.com/tmc/langchaingo/llms/openai"
    "github.com/tmc/langchaingo/tools"
)

func main() {
    if err := run(); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}

func run() error {
    llm, err := openai.New(openai.WithModel("gpt-4-turbo"))
    if err != nil {
        return err
    }

    agentTools := []tools.Tool{
        tools.Calculator{},
    }

    agent := agents.NewOpenAIFunctionsAgent(
        llm,
        agentTools,
        agents.WithCallbacksHandler(callbacks.LogHandler{}),
    )

    executor := agents.NewExecutor(agent,
        agents.WithMaxIterations(3),
        agents.WithReturnIntermediateSteps(),
    )

    result, err := executor.Call(context.Background(), map[string]any{"input": "what is 3 plus 3 and what is python"})
    if err != nil {
        return err
    }
    fmt.Println(result)
    return nil
}

This PR contains the fix and adds a corresponding example to the repository.