nextcloud / mail

💌 Mail app for Nextcloud
https://apps.nextcloud.com/apps/mail

Regression: Smart reply stopped working after version stable3.7 #10274

Open Tobias4Real opened 1 month ago

Tobias4Real commented 1 month ago

Steps to reproduce

  1. Install the latest Mail app and the "OpenAI and LocalAI integrations" app from the app store.
  2. Set OpenAI, or an AI provider of your own choosing, as the AI provider in the settings, enter the API provider details, and select a model such as 'gpt-4o-mini'. Set the text completion endpoint to 'Chat completions'.
  3. Enable text processing through LLMs in the groupware administration settings.
  4. Open a mail containing text.

Expected behavior

Smart replies will be generated and shown

Actual behavior

No smart replies are shown, but thread summaries do work.

Mail app version

4.0.1

Mailserver or service

private

Operating system

Fedora Linux 40 (Workstation Edition)

PHP engine version

PHP 8.3

Web server

Nginx

Database

PostgreSQL

Additional info

I get the following error in the JavaScript console:

```
TypeError: right-hand side of 'in' should be an object, got undefined
    convertAxiosError convert.js:30
    smartReply AiIntergrationsService.js:37
    fetchMessage ThreadEnvelope.vue:394
    mounted ThreadEnvelope.vue:315
    VueJS 18
    addEnvelopes mutations.js:301
    addEnvelopes mutations.js:295
    wrappedMutationHandler vuex.esm.js:844
    commitIterator vuex.esm.js:466
    commit vuex.esm.js:465
    _withCommit vuex.esm.js:624
    commit vuex.esm.js:464
    boundCommit vuex.esm.js:409
    fetchEnvelopes actions.js:646
    vue.runtime.esm.js:3065
```

And the following error in the Nextcloud log: `Smart reply failed: Failed to decode smart replies JSON output`

The reason this happens is that the LLM wraps the JSON it responds with in a Markdown code fence:

````
```json
…
```
````

The function `AiIntegrationsService::getSmartReply` should check for the fence and prune it before decoding.
ChristophWurst commented 1 day ago

> And the following error in the Nextcloud log: `Smart reply failed: Failed to decode smart replies JSON output`
>
> The reason this happens is that the LLM wraps the JSON it responds with in a code fence.
Good catch, Tobias. It might be possible to avoid the fence with a better prompt or unwrapping on our end.
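A minimal sketch of the prompt-side mitigation (the wording and variable name are assumptions, not the app's actual prompt):

```php
<?php

// Hypothetical prompt hardening, not the app's actual prompt:
// ask the model explicitly for raw JSON so it is less likely to
// wrap the answer in a Markdown code fence.
$systemPrompt = 'Suggest two short replies to the following email. '
    . 'Return a raw JSON array of strings only, without Markdown '
    . 'code fences or any other wrapping.';
```

Since models do not always follow such instructions, unwrapping on the receiving end (as sketched above) would still be a sensible fallback.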