AppFlowy-IO / AppFlowy

Bring projects, wikis, and teams together with AI. AppFlowy is an AI collaborative workspace where you achieve more without losing control of your data. The best open source alternative to Notion.
https://www.appflowy.io
GNU Affero General Public License v3.0
51.35k stars · 3.44k forks

[FR] Add local AI capability (Windows/MacOS/Linux) #4981

Open cabusar opened 4 months ago

cabusar commented 4 months ago

Description

Hello,

Great work on this project, many thanks. :)

Since OpenAI support is already available, and the OpenAI client library can talk to any OpenAI-compatible endpoint, it would be nice to be able to change the destination server so we can use our own LLM instead.

In Python this would mean having access to openai.base_url, as shown here:

import openai

# optional; defaults to `os.environ['OPENAI_API_KEY']`
openai.api_key = '...'

# all client options can be configured just like the `OpenAI` instantiation counterpart
openai.base_url = "https://..."

In this case the OpenAI API key is not relevant and could be anything except null.
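To illustrate the point without depending on the openai package itself, here is a stdlib-only sketch of why only the base URL needs to change: every OpenAI-compatible server accepts the same request shape. The local URL and model name are assumptions (an Ollama-style endpoint on its default port), not anything AppFlowy currently supports.

```python
import json
import urllib.request

# Hypothetical local OpenAI-compatible server (Ollama-style default port).
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(base_url, model, prompt, api_key="not-used"):
    """Build (but do not send) an OpenAI-format chat-completion request.

    Only `base_url` differs between OpenAI, OpenRouter, or a local LLM;
    the payload and headers are identical.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            # Local servers typically ignore the key, but the header
            # must still be present and non-empty.
            "Authorization": f"Bearer {api_key}",
        },
    )

req = build_chat_request(BASE_URL, "llama3", "Hello")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```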

Impact

Users would be able to use any AI service that follows the OpenAI request format, including locally hosted LLMs.

Additional Context

No response

annieappflowy commented 4 months ago

What LLM models/services would you like to use in AppFlowy? Which AI features would you like to run on these models?

basilkorompilias commented 4 months ago

Local AI seems a great suggestion, but for those with older machines who rely on external processing for good AI, it would also be great to support services other than OpenAI, perhaps with a nice little list to choose from. The most universal would be something like: https://openrouter.ai/ https://replicate.com/
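The "nice little list" suggested above could be as simple as a mapping from provider name to base URL, since these services speak the OpenAI request format. A minimal sketch; the OpenRouter URL reflects its documented API endpoint, while the local entry assumes an Ollama-style server, and the function name is purely illustrative:

```python
# Candidate provider list: name -> OpenAI-compatible base URL.
# "local" is an assumption (Ollama's default port); Replicate has its
# own API format, so it is omitted from this OpenAI-compatible sketch.
PROVIDERS = {
    "openai": "https://api.openai.com/v1",
    "openrouter": "https://openrouter.ai/api/v1",
    "local": "http://localhost:11434/v1",
}

def endpoint_for(provider: str) -> str:
    """Resolve a provider name to its chat-completions endpoint."""
    try:
        return PROVIDERS[provider] + "/chat/completions"
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None

print(endpoint_for("openrouter"))
```

With such a table, switching providers becomes a settings change rather than a code change, which is essentially what this feature request asks for.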