gptscript-ai / gptscript

Build AI assistants that interact with your systems
https://gptscript.ai

Incorrect model used for tool execution when `--default-model` is passed on the CLI and the script has a model set. #645

Open · sangee2004 opened 3 months ago

sangee2004 commented 3 months ago

gptscript version - v0.0.0-dev-6e92ee7a-dirty

Steps to reproduce the problem:

Pre Req - In my case I don't have gcloud authentication set up, so using github.com/gptscript-ai/gemini-vertexai-provider would fail.
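For reference, a minimal way to confirm this precondition is to pin the Gemini Vertex AI provider directly in a throwaway script and run it; without gcloud credentials the run should fail at the provider. This is a hypothetical sketch for verification only (the file name `check_gemini.gpt` and its prompt are illustrative, not part of the report):

```sh
# Hypothetical sanity check (not from the original report): pin the Gemini
# Vertex AI provider per tool and run it. Without gcloud authentication this
# run is expected to fail when the provider starts.
cat > check_gemini.gpt <<'EOF'
Model: gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider

Say hello.
EOF
gptscript --disable-cache check_gemini.gpt
```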

Scenario 1:

  1. Execute the following script with the default model set to the Gemini Vertex AI provider: `gptscript --disable-cache --debug-messages --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet.gpt`

test_answers_from_internet.gpt:

```
Tools: github.com/gptscript-ai/answers-from-the-internet
Model: gpt-4o

Who won 2024 Superbowl?
```
  2. Script execution succeeds in this case, since the gpt-4o model is used for the LLM calls made through the Node GPTScript SDK when executing the github.com/gptscript-ai/answers-from-the-internet tool:
```
gptscript --disable-cache --debug-messages --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet.gpt
15:55:52 WARNING: Changing the default model can have unknown behavior for existing tools. Use the model field per tool instead.
15:55:53 started  [main]
15:55:53 sent     [main]
15:55:53 messages [request={"model":"gpt-4o","messages":[{"role":"system","content":"\nYou are task oriented system.\nYou receive input from a user, process the input from the given instructions, and then output the result.\nYour objective is to provide consistent and correct results.\nYou do not need to explain the steps taken, only provide the result to the given instructions.\nYou are referred to as a tool.\nYou don't move to the next step until you have a result.\n\nWho won 2024 Superbowl?"}],"temperature":0,"tools":[{"type":"function","function":{"name":"answersFromTheInternet","description":"Uses Google to answer the provided question","parameters":{"properties":{"question":{"description":"the question to ask","type":"string"}},"type":"object"}}}]}]
         content  [1] content | Waiting for model response...
         content  [1] content | <tool call> answersFromTheInternet -> {"question":"Who won 2024 Superbowl?"}
15:55:54 messages [response={"role":"assistant","content":[{"toolCall":{"index":0,"id":"call_70LEmmgIsTXkaQYcb0tm05uC","function":{"name":"answersFromTheInternet","arguments":"{\"question\":\"Who won 2024 Superbowl?\"}"}}}],"usage":{"promptTokens":135,"completionTokens":23,"totalTokens":158}}]
15:55:54 started  [answers-from-the-internet(2)] [input={"question":"Who won 2024 Superbowl?"}]
15:55:54 sent     [answers-from-the-internet(2)]
15:55:54 messages [request={"command":["/usr/bin/env","npm","--prefix","/Users/sangeethahariharan/Library/Caches/gptscript/repos/401d974b425bbdd5637618c492beafbb644cad4c/tool.gpt/node21","run","tool"],"input":"{\"question\":\"Who won 2024 Superbowl?\"}"}]
         content  [2] content | 
         content  [2] content | > tool
         content  [2] content | > node --no-warnings --loader ts-node/esm src/server.ts
         content  [2] content | 
         content  [2] content | slow page: https://www.hollywoodreporter.com/news/general-news/super-bowl-2024-winner-1235822266/
         content  [2] content | ### Sources:
         content  [2] content | - [Wikipedia](https://en.wikipedia.org/wiki/Super_Bowl_LVIII)
         content  [2] content | - [Fox Sports](https://www.foxsports.com/nfl/super-bowl)
         content  [2] content | - [The Hollywood Reporter](https://www.hollywoodreporter.com/news/general-news/super-bowl-2024-winne ...
         content  [2] content | 
         content  [2] content | ### Answer:
         content  [2] content | The Kansas City Chiefs won Super Bowl LVIII, defeating the San Francisco 49ers 25-22 in overtime. Pa ...
15:56:02 messages [response={"err":null,"output":"\n\u003e tool\n\u003e node --no-warnings --loader ts-node/esm src/server.ts\n\n### Sources:\n- [Wikipedia](https://en.wikipedia.org/wiki/Super_Bowl_LVIII)\n- [Fox Sports](https://www.foxsports.com/nfl/super-bowl)\n- [The Hollywood Reporter](https://www.hollywoodreporter.com/news/general-news/super-bowl-2024-winner-1235822266/)\n\n### Answer:\nThe Kansas City Chiefs won Super Bowl LVIII, defeating the San Francisco 49ers 25-22 in overtime. Patrick Mahomes was named the MVP of the game."}]
15:56:02 ended    [answers-from-the-internet(2)] [output=\u003e tool\n\u003e node --no-warnings --loader ts-node/esm src/server.ts\n\n### Sources:\n- [Wikipedia](https://e...]
15:56:02 continue [main]
15:56:02 sent     [main]
15:56:02 messages [request={"model":"gpt-4o","messages":[{"role":"system","content":"\nYou are task oriented system.\nYou receive input from a user, process the input from the given instructions, and then output the result.\nYour objective is to provide consistent and correct results.\nYou do not need to explain the steps taken, only provide the result to the given instructions.\nYou are referred to as a tool.\nYou don't move to the next step until you have a result.\n\nWho won 2024 Superbowl?"},{"role":"assistant","content":"","tool_calls":[{"id":"call_70LEmmgIsTXkaQYcb0tm05uC","type":"function","function":{"name":"answersFromTheInternet","arguments":"{\"question\":\"Who won 2024 Superbowl?\"}"}}]},{"role":"tool","content":"\n\u003e tool\n\u003e node --no-warnings --loader ts-node/esm src/server.ts\n\n### Sources:\n- [Wikipedia](https://en.wikipedia.org/wiki/Super_Bowl_LVIII)\n- [Fox Sports](https://www.foxsports.com/nfl/super-bowl)\n- [The Hollywood Reporter](https://www.hollywoodreporter.com/news/general-news/super-bowl-2024-winner-1235822266/)\n\n### Answer:\nThe Kansas City Chiefs won Super Bowl LVIII, defeating the San Francisco 49ers 25-22 in overtime. Patrick Mahomes was named the MVP of the game.","name":"answersFromTheInternet","tool_call_id":"call_70LEmmgIsTXkaQYcb0tm05uC"}],"temperature":0,"tools":[{"type":"function","function":{"name":"answersFromTheInternet","description":"Uses Google to answer the provided question","parameters":{"properties":{"question":{"description":"the question to ask","type":"string"}},"type":"object"}}}]}]
         content  [1] content | Waiting for model response...
         content  [1] content | The Kansas City Chiefs won Super Bowl LVIII, defeating the San Francisco 49ers 25-22 in overtime. Patrick Mahomes was named the MVP of the game.
15:56:03 messages [response={"role":"assistant","content":[{"text":"The Kansas City Chiefs won Super Bowl LVIII, defeating the San Francisco 49ers 25-22 in overtime. Patrick Mahomes was named the MVP of the game."}],"usage":{"promptTokens":299,"completionTokens":36,"totalTokens":335}}]
15:56:03 ended    [main] [output=The Kansas City Chiefs won Super Bowl LVIII, defeating the San Francisco 49ers 25-22 in overtime. Pa...]
15:56:03 usage    [total=493] [prompt=434] [completion=59]
```

OUTPUT:

The Kansas City Chiefs won Super Bowl LVIII, defeating the San Francisco 49ers 25-22 in overtime. Patrick Mahomes was named the MVP of the game.

Expected Behavior: the Gemini provider should be used for the LLM calls made through the Node GPTScript SDK when executing the github.com/gptscript-ai/answers-from-the-internet tool.
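A quick way to see which model actually handled each request is to filter the `--debug-messages` stream for the model field. This is a hypothetical one-liner, not from the original report; with the current behavior it prints `"model":"gpt-4o"` rather than the Gemini model:

```sh
# Hypothetical check (not from the original report): extract the "model" field
# from each request logged by --debug-messages. With the bug present this
# prints "model":"gpt-4o"; the expectation is "model":"gemini-1.0-pro".
gptscript --disable-cache --debug-messages \
  --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' \
  test_answers_from_internet.gpt 2>&1 | grep -o '"model":"[^"]*"'
```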

sangee2004 commented 3 months ago

The same issue is also seen when executing a chat-enabled script that uses the answers-from-the-internet tool with the --disable-tui or --debug-messages option.

This issue is not seen when executing the script in the TUI.

test_answers_from_internet_chat.gpt:

```
Tools: github.com/gptscript-ai/answers-from-the-internet
Model: gpt-4o
chat: true

You are a good assistant. Wait for the user to ask a question and always return sources for the search result.
```

Pre Req - gcloud authentication is not set up, so using github.com/gptscript-ai/gemini-vertexai-provider would fail.

  1. `gptscript --disable-cache --debug-messages --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet_chat.gpt` --> succeeds --> this is not the expected behavior.

  2. `gptscript --disable-cache --disable-tui --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet_chat.gpt` --> succeeds --> this is not the expected behavior.

  3. `gptscript --disable-cache --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet_chat.gpt` --> fails, as expected.
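For completeness, the warning printed at the start of each run ("Use the model field per tool instead") suggests declaring the model in the script rather than via --default-model. Below is a minimal sketch of that, assuming gcloud authentication is available for the Gemini provider; note this only pins the model for the tool that declares it, so whether the sub-tool's own SDK calls pick it up is exactly what this issue is about:

```sh
# Hypothetical sketch (not part of the report): declare the model in the script
# itself, as the CLI warning recommends, instead of passing --default-model.
# Requires working gcloud authentication for the Gemini Vertex AI provider and
# only applies to the tool that declares the Model field.
cat > test_answers_from_internet_gemini.gpt <<'EOF'
Tools: github.com/gptscript-ai/answers-from-the-internet
Model: gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider
chat: true

You are a good assistant. Wait for the user to ask a question and always return sources for the search result.
EOF
gptscript --disable-cache test_answers_from_internet_gemini.gpt
```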