cc @eliben @jba
@lemon-mint Are you able to reproduce this with the official example: https://pkg.go.dev/cloud.google.com/go/vertexai@v0.10.0/genai#example-Tool ?
When SendMessageStream is used, an empty string part ((genai.Text)("")) is appended at the end of the model's turn in the chat history, and the next request then fails:
{CurrentWeather map[location:New York]}
2024/06/11 08:04:07 rpc error: code = InvalidArgument desc = Unable to submit request because it has an empty text parameter. Add a value to the parameter and try again. Learn more: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/gemini
exit status 1
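The extra part shows up in the recorded chat history once the stream has been drained. A minimal sketch of how to inspect it, assuming the session variable from the program below, the cloud.google.com/go/vertexai/genai import, and the github.com/davecgh/go-spew/spew package:

// dumpHistory prints every turn recorded in the chat session so far.
// Called after the streaming loop, it shows the model turn containing the
// FunctionCall followed by the empty genai.Text part.
func dumpHistory(session *genai.ChatSession) {
	for _, c := range session.History {
		spew.Dump(c)
	}
}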
Here is the go-spew dump of the chat history; note the empty (genai.Text) "" part recorded after the FunctionCall in the model turn:
(*genai.Content)(0x140000b8ea0)({
 Role: (string) (len=4) "user",
 Parts: ([]genai.Part) (len=1 cap=1) {
  (genai.Text) (len=37) "What is the weather like in New York?"
 }
})
(*genai.Content)(0x1400029e0f0)({
 Role: (string) (len=5) "model",
 Parts: ([]genai.Part) (len=2 cap=2) {
  (genai.FunctionCall) {
   Name: (string) (len=14) "CurrentWeather",
   Args: (map[string]interface {}) (len=1) {
    (string) (len=8) "location": (string) (len=12) "New York, NY"
   }
  },
  (genai.Text) ""
 }
})
2024/06/11 08:00:51 rpc error: code = InvalidArgument desc = Unable to submit request because it has an empty text parameter. Add a value to the parameter and try again. Learn more: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/gemini
exit status 1
package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/vertexai/genai"
)

// Your GCP project
const projectID = "project-name"

// A GCP location like "us-central1"; if you're using standard Google-published
// models (like untuned Gemini models), you can keep location blank ("").
const location = "us-central1"

func main() {
	ctx := context.Background()
	client, err := genai.NewClient(ctx, projectID, location)
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// To use functions / tools, we have to first define a schema that describes
	// the function to the model. The schema is similar to OpenAPI 3.0.
	//
	// In this example, we create a single function that provides the model with
	// a weather forecast in a given location.
	schema := &genai.Schema{
		Type: genai.TypeObject,
		Properties: map[string]*genai.Schema{
			"location": {
				Type:        genai.TypeString,
				Description: "The city and state, e.g. San Francisco, CA",
			},
			"unit": {
				Type: genai.TypeString,
				Enum: []string{"celsius", "fahrenheit"},
			},
		},
		Required: []string{"location"},
	}

	weatherTool := &genai.Tool{
		FunctionDeclarations: []*genai.FunctionDeclaration{{
			Name:        "CurrentWeather",
			Description: "Get the current weather in a given location",
			Parameters:  schema,
		}},
	}

	model := client.GenerativeModel("gemini-1.0-pro")

	// Before initiating a conversation, we tell the model which tools it has
	// at its disposal.
	model.Tools = []*genai.Tool{weatherTool}

	// For using tools, the chat mode is useful because it provides the required
	// chat context. A model needs to have tools supplied to it in the chat
	// history so it can use them in subsequent conversations.
	//
	// The flow of messages expected here is:
	//
	// 1. We send a question to the model
	// 2. The model recognizes that it needs to use a tool to answer the question,
	//    and returns a FunctionCall response asking to use the CurrentWeather
	//    tool.
	// 3. We send a FunctionResponse message, simulating the return value of
	//    CurrentWeather for the model's query.
	// 4. The model provides its text answer in response to this message.
	session := model.StartChat()

	iter := session.SendMessageStream(ctx, genai.Text("What is the weather like in New York?"))
	for {
		resp, err := iter.Next()
		if err != nil {
			break
		}
		for _, cand := range resp.Candidates {
			for _, part := range cand.Content.Parts {
				fmt.Print(part)
			}
		}
	}
	fmt.Println()

	resp, err := session.SendMessage(ctx, genai.FunctionResponse{
		Name: "CurrentWeather",
		Response: map[string]any{
			"weather": "cold",
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, cand := range resp.Candidates {
		for _, part := range cand.Content.Parts {
			fmt.Println(part)
		}
	}
	fmt.Println("---")
}
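Until the fix lands, one possible workaround (not from the thread, just a hypothetical sketch) is to prune empty genai.Text parts from session.History right before the SendMessage call that carries the FunctionResponse:

// pruneEmptyTextParts removes empty genai.Text parts recorded in the chat
// history. Hypothetical helper, not part of the genai package; call
// pruneEmptyTextParts(session.History) just before session.SendMessage(...).
func pruneEmptyTextParts(history []*genai.Content) {
	for _, c := range history {
		parts := c.Parts[:0]
		for _, p := range c.Parts {
			if t, ok := p.(genai.Text); ok && t == "" {
				continue // drop the empty part appended after streaming
			}
			parts = append(parts, p)
		}
		c.Parts = parts
	}
}

This only touches the locally recorded history, so it should be easy to drop once the upstream fix is released.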
Thanks @lemon-mint. I've sent a PR with a fix.
Client
Vertex AI, Gemini, genai
Environment
macOS 14.5
Go Environment
go version go1.22.2 darwin/arm64
Code
See the full program above.
Expected behavior
The code should successfully send the function_response to the Gemini model and receive a response based on the provided weather information.
Actual behavior
The code fails (via log.Fatal) with the InvalidArgument "empty text parameter" rpc error shown above when sending the function_response: