danny-avila / LibreChat

Enhanced ChatGPT Clone: Features Anthropic, OpenAI, Assistants API, Azure, Groq, GPT-4o, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, langchain, DALL-E-3, ChatGPT Plugins, OpenAI Functions, Secure Multi-User System, Presets, completely open-source for self-hosting. Actively in public development.
https://librechat.ai/
MIT License

[Bug]: function agent call fail #537

Closed · waltcow closed this issue 1 year ago

waltcow commented 1 year ago

Contact Details

No response

What happened?

Using the functions agent to call the Stable Diffusion API fails.

Steps to Reproduce

(two screenshots attached)

What browsers are you seeing the problem on?

Chrome

Relevant log output

ask gpt-plugin log
{
  text: 'Generate me a picture of Sunny',
  conversationId: '88714dd9-e415-4f75-a955-95c27ce61c03',
  endpointOption: {
    chatGptLabel: null,
    promptPrefix: null,
    tools: [ 'stable-diffusion' ],
    modelOptions: {
      model: 'gpt-3.5-turbo-16k',
      temperature: 0.8,
      top_p: 1,
      presence_penalty: 0,
      frequency_penalty: 0
    },
    agentOptions: {
      agent: 'functions',
      skipCompletion: false,
      model: 'gpt-3.5-turbo-0613',
      temperature: 0
    }
  }
}
sendMessage Generate me a picture of Sunny {
  getIds: [Function: getIds],
  user: '64880214bb4969d2ccd9e5c5',
  parentMessageId: '00000000-0000-0000-0000-000000000000',
  conversationId: '88714dd9-e415-4f75-a955-95c27ce61c03',
  overrideParentMessageId: '5aa51825-4f7c-46fb-b2f0-42d8959d3e4a',
  onAgentAction: [Function: onAgentAction],
  onChainEnd: [Function: onChainEnd],
  onStart: [Function: onStart],
  onProgress: [Function: wrapper],
  abortController: AbortController { signal: AbortSignal { aborted: false } }
}
Loading history for conversation 88714dd9-e415-4f75-a955-95c27ce61c03 00000000-0000-0000-0000-000000000000
options
{
  debug: true,
  reverseProxyUrl: 'https://api.openai.com/v1/chat/completions',
  proxy: null,
  chatGptLabel: null,
  promptPrefix: null,
  tools: [ 'stable-diffusion' ],
  modelOptions: {
    model: 'gpt-3.5-turbo-16k',
    temperature: 0.8,
    top_p: 1,
    presence_penalty: 0,
    frequency_penalty: 0
  },
  agentOptions: {
    agent: 'functions',
    skipCompletion: false,
    model: 'gpt-3.5-turbo-0613',
    temperature: 0
  },
  getIds: [Function: getIds],
  user: '64880214bb4969d2ccd9e5c5',
  parentMessageId: '00000000-0000-0000-0000-000000000000',
  conversationId: '88714dd9-e415-4f75-a955-95c27ce61c03',
  overrideParentMessageId: '5aa51825-4f7c-46fb-b2f0-42d8959d3e4a',
  onAgentAction: [Function: onAgentAction],
  onChainEnd: [Function: onChainEnd],
  onStart: [Function: onStart],
  onProgress: [Function: wrapper],
  abortController: AbortController { signal: AbortSignal { aborted: false } }
}
<-----Agent Model: gpt-3.5-turbo-0613 | Temp: 0----->
Requested Tools
[ 'stable-diffusion' ]
Loaded Tools
[ 'stable-diffusion' ]
Loaded agent.
Attempt 1 of 1
[chain/start] [1:chain:AgentExecutor] Entering Chain run with input: {
  "input": "Generate me a picture of Sunny",
  "signal": {},
  "chat_history": ""
}
[llm/start] [1:chain:AgentExecutor > 2:llm:ChatOpenAI] Entering LLM run with input: {
  "messages": [
    [
      {
        "type": "system",
        "data": {
          "content": "Date: June 19, 2023\nYou are a helpful AI assistant. Objective: Resolve the user's query with provided functions.\nThe user is demanding a function response to the query.",
          "additional_kwargs": {}
        }
      },
      {
        "type": "human",
        "data": {
          "content": "\nQuery: Generate me a picture of Sunny",
          "additional_kwargs": {}
        }
      }
    ]
  ]
}
[llm/end] [1:chain:AgentExecutor > 2:llm:ChatOpenAI] [3.06s] Exiting LLM run with output: {
  "generations": [
    [
      {
        "text": "I'm sorry, but as an AI text-based assistant, I am unable to generate pictures. However, I can help you with other tasks or provide information. Is there anything else I can assist you with?",
        "message": {
          "type": "ai",
          "data": {
            "content": "I'm sorry, but as an AI text-based assistant, I am unable to generate pictures. However, I can help you with other tasks or provide information. Is there anything else I can assist you with?",
            "additional_kwargs": {}
          }
        }
      }
    ]
  ],
  "llmOutput": {
    "tokenUsage": {
      "completionTokens": 42,
      "promptTokens": 59,
      "totalTokens": 101
    }
  }
}
message AIChatMessage {
  text: "I'm sorry, but as an AI text-based assistant, I am unable to generate pictures. However, I can help you with other tasks or provide information. Is there anything else I can assist you with?",
  name: undefined,
  additional_kwargs: { function_call: undefined }
}
[chain/end] [1:chain:AgentExecutor] [3.06s] Exiting Chain run with output: {
  "output": "I'm sorry, but as an AI text-based assistant, I am unable to generate pictures. However, I can help you with other tasks or provide information. Is there anything else I can assist you with?",
  "intermediateSteps": []
}
Error in handler Handler, handleChainEnd: TypeError: Cannot read properties of undefined (reading 'action')
this.result {
  output: "I'm sorry, but as an AI text-based assistant, I am unable to generate pictures. However, I can help you with other tasks or provide information. Is there anything else I can assist you with?",
  intermediateSteps: []
}
promptPrefix As a helpful AI Assistant, review and improve the answer you generated using plugins in response to the User Message below. The user hasn't seen your answer or thoughts yet.
Internal Actions Taken: None
Preliminary Answer: "I'm sorry, but as an AI text-based assistant, I am unable to generate pictures. However, I can help you with other tasks or provide information. Is there anything else I can assist you with?"
Reply conversationally to the User based on your preliminary answer, internal actions, thoughts, and observations, making improvements wherever possible, but do not modify URLs.
You must cite sources if you are using any web links. 
Only respond with your conversational reply to the following User Message:
"Generate me a picture of Sunny"
buildPrompt messages [
  {
    messageId: '5aa51825-4f7c-46fb-b2f0-42d8959d3e4a',
    parentMessageId: '00000000-0000-0000-0000-000000000000',
    role: 'User',
    text: 'Generate me a picture of Sunny'
  }
]

https://api.openai.com/v1/chat/completions
{
  model: 'gpt-3.5-turbo-16k',
  temperature: 0.8,
  top_p: 1,
  presence_penalty: 0,
  frequency_penalty: 0,
  stop: undefined,
  max_tokens: 1024,
  stream: true,
  messages: [
    {
      role: 'user',
      name: 'instructions',
      content: '||>Instructions:\n' +
        "As a helpful AI Assistant, review and improve the answer you generated using plugins in response to the User Message below. The user hasn't seen your answer or thoughts yet.\n" +
        'Internal Actions Taken: None\n' +
        `Preliminary Answer: "I'm sorry, but as an AI text-based assistant, I am unable to generate pictures. However, I can help you with other tasks or provide information. Is there anything else I can assist you with?"\n` +
        'Reply conversationally to the User based on your preliminary answer, internal actions, thoughts, and observations, making improvements wherever possible, but do not modify URLs.\n' +
        'You must cite sources if you are using any web links. \n' +
        'Only respond with your conversational reply to the following User Message:\n' +
        '"Generate me a picture of Sunny"'
    },
    {
      role: 'user',
      content: 'Chat History:\n||>User:\nGenerate me a picture of Sunny\n||>Assistant:\n'
    }
  ]
}

Server closed the connection unexpectedly, returning...
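
The "Error in handler Handler, handleChainEnd: TypeError: Cannot read properties of undefined (reading 'action')" line above comes from the chain-end callback reading an agent step's action when the run produced no intermediateSteps at all. A minimal defensive sketch, assuming the callback only needs the last tool action (LibreChat's actual handler may look different):

// Hypothetical guard for a chain-end callback; the real LibreChat handler
// may have a different shape. The point is to skip the `.action` read when
// the agent returned no intermediate steps.
function handleChainEnd(output) {
  const steps = output?.intermediateSteps ?? [];
  if (steps.length === 0) {
    // Agent answered directly without calling a tool; nothing to record.
    return;
  }
  const lastStep = steps[steps.length - 1];
  console.log('Last agent action:', lastStep.action?.tool, lastStep.action?.toolInput);
}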

Screenshots

No response

Code of Conduct

danny-avila commented 1 year ago

The only issue here is the unhandled action. The AI doesn’t know who you’re referring to, so there’s no description it can think of to pass to the function call. In my example, if you’re referring to “Sunny” in the image shared in my PR, there were more messages prior describing it as a Corgi.
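
For contrast, when the model does decide to use the tool, the function call shows up on the message's additional_kwargs instead of the function_call: undefined seen in the log above. A rough illustration of that shape (the name/arguments keys are the standard OpenAI function-calling format; the argument schema and prompt text here are assumptions, not LibreChat's actual tool schema):

// Illustrative only; the argument key and prompt wording are assumed.
message AIChatMessage {
  text: '',
  name: undefined,
  additional_kwargs: {
    function_call: {
      name: 'stable-diffusion',
      arguments: '{"input": "detailed photo of Sunny, a corgi puppy, sitting in a sunny park"}'
    }
  }
}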

waltcow commented 1 year ago

Can you provide more details or docs on the functions agent, @danny-avila?

danny-avila commented 1 year ago

It generally works the same as the original handling of plugins, which you can read about in detail here: https://github.com/danny-avila/LibreChat/blob/main/docs/features/plugins/introduction.md

The main differences are described in detail in #521

I will update the docs with the notes from the PR
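
For reference, here is a generic sketch of how an OpenAI functions agent is typically wired up with LangChain JS. This is not LibreChat's actual code (its plugins client wraps this differently), and the import paths reflect the 2023 langchain package layout:

// Generic LangChain JS functions-agent setup; not LibreChat's implementation.
const { ChatOpenAI } = require('langchain/chat_models/openai');
const { DynamicTool } = require('langchain/tools');
const { initializeAgentExecutorWithOptions } = require('langchain/agents');

async function run() {
  // Agent model pinned to a functions-capable snapshot, as in the log above.
  const model = new ChatOpenAI({ modelName: 'gpt-3.5-turbo-0613', temperature: 0 });

  // Stand-in tool; LibreChat loads its own Stable Diffusion tool class instead.
  const tools = [
    new DynamicTool({
      name: 'stable-diffusion',
      description: 'Generates an image from a detailed text prompt describing the subject.',
      func: async (input) => `Generated image for prompt: ${input}`,
    }),
  ];

  const executor = await initializeAgentExecutorWithOptions(tools, model, {
    agentType: 'openai-functions',
    returnIntermediateSteps: true,
  });

  // A query that already describes the subject gives the model something
  // concrete to pass as the tool input.
  const result = await executor.call({ input: 'Generate a picture of Sunny, a corgi puppy' });
  console.log(result.output, result.intermediateSteps);
}

run();

As the log above shows, when the model has no usable description of the subject it may answer directly instead of emitting a function call, which is exactly the "as an AI text-based assistant, I am unable to generate pictures" path in this issue.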

fuegovic commented 1 year ago

Same here. I was able to use the functions agent with 3.5 when testing #521 before it was merged into main, but since 0.5.1 I always get an error ("as an LLM I can't...") and no plugins are called.

danny-avila commented 1 year ago

Same here. I was able to use the functions agent with 3.5 when testing #521 before it was merged into main, but since 0.5.1 I always get an error ("as an LLM I can't...") and no plugins are called.

Any output with your error, or is the LLM deciding not to use the function?

fuegovic commented 1 year ago

I didn't spot it before, but yes, there's an error in the terminal:

(screenshot of the terminal error attached)