sangee2004 opened this issue 3 months ago
The same issue is also seen when executing a chat-enabled script that uses the answers-from-the-internet tool with the `--disable-tui` or `--debug-messages` option. The issue is not seen when executing the script in the TUI.
Script:

```
Tools: github.com/gptscript-ai/answers-from-the-internet
Model: gpt-4o
chat: true

You are a good assistant. Wait for the user to ask a question and always return sources for the search result.
```
Prerequisite: I do not have gcloud authentication set up, so using github.com/gptscript-ai/gemini-vertexai-provider would fail.
- `gptscript --disable-cache --debug-messages --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet_chat.gpt` → succeeds. This is not the expected behavior.
- `gptscript --disable-cache --disable-tui --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet_chat.gpt` → succeeds. This is not the expected behavior.
- `gptscript --disable-cache --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet_chat.gpt` → fails, as expected.
gptscript version: `v0.0.0-dev-6e92ee7a-dirty`
Steps to reproduce the problem:
Prerequisite: In my case I don't have gcloud authentication set up, so using github.com/gptscript-ai/gemini-vertexai-provider would fail.

Scenario 1:

```
gptscript --disable-cache --debug-messages --default-model 'gemini-1.0-pro from github.com/gptscript-ai/gemini-vertexai-provider' test_answers_from_internet.gpt
```
The gpt-4o model gets used for the LLM calls made through the Node GPTScript SDK when executing the github.com/gptscript-ai/answers-from-the-internet tool.

Expected behavior: The gemini provider should be used for the LLM calls made through the Node GPTScript SDK when executing the github.com/gptscript-ai/answers-from-the-internet tool.