tmc / langchaingo

LangChain for Go, the easiest way to write LLM-based programs in Go
https://tmc.github.io/langchaingo/
MIT License

[Bug Report] OpenAI functions raise exception #638

Open robermar23 opened 7 months ago

robermar23 commented 7 months ago

#459 added agent support for OpenAI functions, and it seems to work. I have created my own simple Tool (it calls an internal API), following the existing implementations, and am able to run a Chain in which OpenAI returns a request to invoke my Tool as a function call.

My Tool completes its step and returns a string response. During Planning for the subsequent step in the chain, an exception is raised because one of the prompt messages now has an item with a Role of Function.

Exception: role function not supported

It looks like the refactoring in #521, which also shipped as part of v0.1.4, broke the ability for chains to continue after an OpenAI function call response.

Specifically, at line 72 of llms/openai/openaillm.go, the Function role falls through to the default case of the switch statement, which returns the "not supported" error.
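
For reference, that switch has roughly this shape (paraphrased from the v0.1.4 source, not a verbatim quote):

// paraphrased sketch of the role switch in llms/openai/openaillm.go (v0.1.4);
// see the file itself for the exact code
switch mc.Role {
case schema.ChatMessageTypeSystem:
    msg.Role = RoleSystem
case schema.ChatMessageTypeAI:
    msg.Role = RoleAssistant
case schema.ChatMessageTypeHuman, schema.ChatMessageTypeGeneric:
    msg.Role = RoleUser
default:
    return nil, fmt.Errorf("role %v not supported", mc.Role)
}

Because schema.ChatMessageTypeFunction has no case of its own, it falls into the default branch and the call fails with the error above.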

The previous llms/openai/openaillm_chat.go implementation did not follow the same logic and allowed function roles.

I am creating my agent and chain like so (portion of code that is relevant):

agent := agents.NewOpenAIFunctionsAgent(c.openAIClient, toolsToUse, agents.NewOpenAIOption().WithSystemMessage(c.chatConfig.SystemPersona))

executor := agents.NewExecutor(agent, toolsToUse)

result, err := chains.Run(ctx, executor, newChat, chains.WithModel(c.chatConfig.Model))

devalexandre commented 6 months ago

How can I reproduce it?

ChrisCPoirier commented 6 months ago

I can confirm this is an issue. Here is an example of attempting RAG (OpenAI -> function -> data -> OpenAI). The example code below sends a request, receives a function call, grabs that data, and attempts to send it back. I added some comments highlighting the specific point of failure.

error: 2024/03/23 17:04:31 role function not supported exit status 1

expected output:

The current weather in Boston is sunny and windy with a temperature of 72 degrees Fahrenheit.

example:

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "log"

    "github.com/tmc/langchaingo/llms"
    "github.com/tmc/langchaingo/llms/openai"
    "github.com/tmc/langchaingo/schema"
)

func main() {
    llm, err := openai.New(openai.WithModel("gpt-3.5-turbo-0613"))
    if err != nil {
        log.Fatal(err)
    }
    ctx := context.Background()

    messages := []llms.MessageContent{
        llms.TextParts(schema.ChatMessageTypeHuman, "What is the weather like in Boston?"),
    }

    resp, err := llm.GenerateContent(ctx,
        messages,
        llms.WithFunctions(functions))
    if err != nil {
        log.Fatal(err)
    }

    choice1 := resp.Choices[0]
    if choice1.FuncCall != nil {
        // add the response from the AI to the conversation
        messages = append(messages, llms.TextParts(schema.ChatMessageTypeAI, choice1.Content))

        // execute the requested function
        switch choice1.FuncCall.Name {
        case "getCurrentWeather":
            // generate data (in real code, parse choice1.FuncCall.Arguments
            // for the requested location and unit)
            data, _ := getCurrentWeather(`boston`, `fahrenheit`)

            // add the function data to the conversation
            messages = append(messages, llms.TextParts(schema.ChatMessageTypeFunction, data))

            // send back to OpenAI for the final response -
            //   this is where it breaks -
            //   the expectation is that the function data is sent back to OpenAI and a formal response is constructed
            resp, err := llm.GenerateContent(ctx,
                messages,
                llms.WithFunctions(functions))

            if err != nil {
                log.Fatal(err)
            }

            fmt.Println(resp.Choices[0].Content)

        default:
            log.Fatalf("function %s not found", choice1.FuncCall.Name)
        }
    }
}

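// getCurrentWeather is a stub tool: it ignores real weather services and
// returns a fixed JSON payload echoing the requested location and unit.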
func getCurrentWeather(location string, unit string) (string, error) {
    weatherInfo := map[string]interface{}{
        "location":    location,
        "temperature": "72",
        "unit":        unit,
        "forecast":    []string{"sunny", "windy"},
    }
    b, err := json.Marshal(weatherInfo)
    if err != nil {
        return "", err
    }
    return string(b), nil
}

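// functions describes the getCurrentWeather tool using OpenAI's
// function-calling parameter schema (JSON Schema).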
var functions = []llms.FunctionDefinition{
    {
        Name:        "getCurrentWeather",
        Description: "Get the current weather in a given location",
        Parameters:  json.RawMessage(`{"type": "object", "properties": {"location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"}, "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}}, "required": ["location"]}`),
    },
}
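
Running this with a valid OPENAI_API_KEY reproduces the failure: the second GenerateContent call returns the "role function not supported" error instead of the final weather summary.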

Possible Fix: One fix I have in mind is to add a FuncCall field to llms.MessageContent (https://github.com/tmc/langchaingo/blob/main/llms/generatecontent.go#L14):

type MessageContent struct {
    Role     schema.ChatMessageType
    Parts    []ContentPart
    FuncCall *schema.FunctionCall
}

This would then allow for something like:

...
// calling code
m := llms.TextParts(schema.ChatMessageTypeFunction, data)
m.FuncCall = choice1.FuncCall
...

And finally, this inside llms/openai (https://github.com/tmc/langchaingo/blob/83bf27c8855714ce8b7eb38b9ded29c831d94797/llms/openai/openaillm.go#L69):

...
case schema.ChatMessageTypeFunction:
    msg.Role = RoleFunction
    if mc.FuncCall != nil {
        msg.Name = mc.FuncCall.Name
    }
...
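
Setting msg.Name here matters because the OpenAI Chat Completions API expects a name field on role "function" messages identifying which function produced the content.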

This solution worked for me when I implemented it locally, but I am not completely familiar with all of the other LLM implementations, so I am not sure whether it fits generically.

I can create a PR if this is the correct approach.

ChrisCPoirier commented 6 months ago

I believe this issue can be closed.

It looks like this was fixed/resolved by this commit: https://github.com/tmc/langchaingo/commit/41746928f0938d9032adec2b36ccc139a2461531

Specifically, this change: https://github.com/tmc/langchaingo/commit/41746928f0938d9032adec2b36ccc139a2461531#diff-2a83212750cde18b3ed9b7043adcdc3e0637e0e903c37b90581c25d9ec5e1447R176