monicahq / monica

Personal CRM. Remember everything about your friends, family and business relationships.
https://beta.monicahq.com
GNU Affero General Public License v3.0

AI Chatbot support for Q&A and Actions, aka ChatGPT for Monica #7026

Open benkaiser opened 8 months ago

benkaiser commented 8 months ago

Is your feature request related to a problem? Please describe. Currently the README states:

Monica does not have built-in AI with integrations like ChatGPT.

However, I couldn't find any issues discussing this topic. Was there a hard decision made here, or is it possible this could be added?

Describe the solution you'd like It would be great to build a chat interface. It could support basic actions such as:

Describe alternatives you've considered (optional) A potential alternative to implementing it in the core codebase would be to build it as a Chrome extension and have it make the needed API requests. That approach has many downsides, poor mobile support being a key one. It could also just be run manually within ChatGPT, but adding the background information for contacts would have to be automated in some way.


sibbl commented 8 months ago

I'd be happy to have the option as it could be a great way to quickly summarize content.

It should be possible to switch between different LLMs easily, since local models will keep getting better and not everyone wants to share their information with large LLM providers that only offer APIs.
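As a rough illustration of what a swappable backend could look like (the class and method names here are hypothetical, not anything Monica ships), a thin provider interface keeps the chat feature independent of any one vendor:

```python
from abc import ABC, abstractmethod


class LlmProvider(ABC):
    """Hypothetical abstraction so the chat feature isn't tied to one vendor."""

    @abstractmethod
    def chat(self, messages: list[dict]) -> str:
        """Send a list of {"role": ..., "content": ...} messages, return the reply text."""


class OpenAiProvider(LlmProvider):
    def __init__(self, client, model: str):
        self.client = client  # e.g. an openai.OpenAI() instance
        self.model = model

    def chat(self, messages: list[dict]) -> str:
        resp = self.client.chat.completions.create(model=self.model, messages=messages)
        return resp.choices[0].message.content


class LocalProvider(LlmProvider):
    """Would wrap a self-hosted model (llama.cpp, Ollama, etc.) behind the same interface."""

    def chat(self, messages: list[dict]) -> str:
        raise NotImplementedError("left as a placeholder in this sketch")
```

The rest of the feature would only ever see `LlmProvider`, so switching to a local model becomes a configuration change rather than a code change.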

HarvsG commented 7 months ago

Just adding a thought. To do this, one would have to send some or all of the database to the LLM as part of the prompt. That is a large number of tokens per request, and the more tokens, the higher the cost. So you would first have to do some context discovery (e.g. the name of the person) and run a search, then send only the results of that search; however, this would severely limit the scope of the chat requests.
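To put rough, purely illustrative numbers on that (assumptions, not measurements): if a contact's notes and history average on the order of 500 tokens and an account holds 1,000 contacts, dumping everything into the prompt is roughly 500,000 tokens per question, beyond most context windows and expensive even where it fits. Searching first and sending only the matching contact keeps a request in the low thousands of tokens.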

benkaiser commented 7 months ago

@HarvsG most of those problems can actually be mitigated by giving the LLM pseudo-functions to run (SEARCH_USERS, USER_INFO, CONVERSATION_HISTORY_FOR_USER) and letting it go through a reasoning loop (meaning multiple round trips), at each step evaluating whether it already has enough information to satisfy the request or whether a given function should be run.
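A minimal sketch of that loop, assuming an OpenAI-style tool-calling API and Monica's REST API (the exact endpoints, token handling, and helper names below are assumptions for illustration, not a definitive implementation):

```python
import json
import os

import requests
from openai import OpenAI  # any LLM with function/tool calling would work

MONICA_API = os.environ["MONICA_API_URL"]          # e.g. https://example.com/api
HEADERS = {"Authorization": f"Bearer {os.environ['MONICA_TOKEN']}"}
client = OpenAI()

# Pseudo-functions the model is allowed to request. The endpoints are assumptions
# based on Monica's REST API and would need checking against the real routes.
def search_users(query: str) -> str:
    return requests.get(f"{MONICA_API}/contacts", params={"query": query}, headers=HEADERS).text

def user_info(contact_id: int) -> str:
    return requests.get(f"{MONICA_API}/contacts/{contact_id}", headers=HEADERS).text

DISPATCH = {"SEARCH_USERS": search_users, "USER_INFO": user_info}

TOOLS = [
    {"type": "function", "function": {
        "name": "SEARCH_USERS",
        "description": "Find contacts matching a name or keyword.",
        "parameters": {"type": "object",
                       "properties": {"query": {"type": "string"}},
                       "required": ["query"]}}},
    {"type": "function", "function": {
        "name": "USER_INFO",
        "description": "Fetch full details for one contact.",
        "parameters": {"type": "object",
                       "properties": {"contact_id": {"type": "integer"}},
                       "required": ["contact_id"]}}},
]

def answer(question: str, model: str = "gpt-4o-mini", max_steps: int = 5) -> str:
    messages = [{"role": "user", "content": question}]
    for _ in range(max_steps):                      # the reasoning loop: multiple round trips
        msg = client.chat.completions.create(
            model=model, messages=messages, tools=TOOLS).choices[0].message
        if not msg.tool_calls:                      # model decided it has enough information
            return msg.content
        messages.append(msg)                        # keep its tool request in the transcript
        for call in msg.tool_calls:                 # run each requested pseudo-function
            args = json.loads(call.function.arguments)
            result = DISPATCH[call.function.name](**args)
            messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
    return "Sorry, I couldn't gather enough information for that."
```

Because the model only ever sees what the pseudo-functions return, the token-cost concern above is bounded by how much each function chooses to send back.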

This approach also extends naturally to modification actions (INSERT_CONVERSATION_ENTRY, UPDATE_USER_DETAILS, etc).
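Building on the same hypothetical sketch, write actions could be registered the same way, ideally gated behind an explicit confirmation before anything touches the database (the endpoint, payload, and `confirm_with_user` helper are again assumptions):

```python
def confirm_with_user(prompt: str) -> bool:
    # Stand-in for a real UI confirmation dialog.
    return input(f"{prompt} [y/N] ").strip().lower() == "y"

def insert_conversation_entry(contact_id: int, summary: str) -> str:
    # Gate every write behind user confirmation so the model can't modify data silently.
    if not confirm_with_user(f"Log a conversation for contact {contact_id}?"):
        return "Cancelled by the user."
    resp = requests.post(f"{MONICA_API}/conversations",   # assumed endpoint and payload
                         json={"contact_id": contact_id, "body": summary},
                         headers=HEADERS)
    return resp.text

DISPATCH["INSERT_CONVERSATION_ENTRY"] = insert_conversation_entry
# ...plus a matching schema entry in TOOLS so the model knows the function exists.
```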