toeverything / AFFiNE

There can be more than Notion and Miro. AFFiNE (pronounced [ə'fain]) is a next-gen knowledge base that brings planning, sorting and creating all together. Privacy first, open-source, customizable and ready to use.
https://affine.pro

Add explicit setup instructions for locally hosted AI #7497

Closed: ProfessorDey closed this issue 4 weeks ago

ProfessorDey commented 1 month ago

Description

As I've documented in the discussion on the subject, it is actually relatively easy to redirect the OpenAI requests to a locally hosted solution (I've only tested with text-generation-webui, but others should be usable). This information should be formally documented and the relevant code lines added. I discuss my findings in detail in #7030, but I'll summarise them here:

1) Run text-generation-webui with the following settings. (Note: the API key can potentially be anything, but I only tested the value recommended in the documentation linked below.)

   CMD_FLAGS.txt (this defaults to listening globally on port 5000, so consider additional security in production): `--api --listen --api-key "sk-111111111111111111111111111111111111111111111111"`

   Model selection and settings: any model with an 8k+ context length; tested with Llama-3-8B-Instruct-262k.Q5_K_M.gguf and a set context length of 8192.

   Character selection: due to the https://github.com/oobabooga/text-generation-webui/issues/4320 bug, a new character must be added manually on the text-generation-webui character page. Here is one that can be used; it just needs to match the "None" name, as the context is overridden by AFFiNE: None.json

2) Run AFFiNE at least once to generate the userland config (since Docker requires running as sudo, that usually means /root/.affine/self-hosted/config/).

3) Edit /root/.affine/self-hosted/config/affine.js and modify the Copilot plugin section like so:

AFFiNE.use('copilot', {
  openai: {
    baseURL: 'http://YOUR_PC_IP_ADDRESS:5000/v1', // Use whatever IP address your host PC is assigned, not localhost or it won't escape the docker container
    apiKey: 'sk-111111111111111111111111111111111111111111111111', // Or whatever API Key you set manually
  },
});

4) Run AFFiNE. (Setting environment variables doesn't matter here: OPENAI_* values are ignored because OpenAI's Node.js module is configured directly from affine.js.)
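To sanity-check the redirect independently of AFFiNE, a short script along the lines of the sketch below can be used. It is illustrative only: it assumes the `openai` npm package is installed and reuses the placeholder IP address and API key from the steps above.

```js
// Sketch: confirm the locally hosted endpoint answers OpenAI-style
// chat completion requests. Assumes text-generation-webui is listening
// on YOUR_PC_IP_ADDRESS:5000 with the API key set in CMD_FLAGS.txt.
const OpenAI = require('openai');

const client = new OpenAI({
  baseURL: 'http://YOUR_PC_IP_ADDRESS:5000/v1', // same value as in affine.js
  apiKey: 'sk-111111111111111111111111111111111111111111111111',
});

async function main() {
  // text-generation-webui typically serves whichever model is currently
  // loaded, regardless of the model name sent in the request.
  const completion = await client.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Reply with a short greeting.' }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

If this prints a reply, AFFiNE's copilot requests pointed at the same baseURL should reach the same backend.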

Use case

As self-hosting becomes more established and the kinks are ironed out, I foresee a lot of demand for running our own AI models locally. Not only is this critical for most internal business infrastructure, which AFFiNE and its competitors are well suited to support, but it also allows far greater customisation of how the software operates. Preferably, we'd also be able to directly modify the context string so we can set up our own branded AIs, just as there are plans to let us modify the UI to suit our individual needs.

Anything else?

- Original discussion and research: #7030
- Text-Generation-WebUI's OpenAI API substitution documentation: https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API
- Text-Generation-WebUI character bug: https://github.com/oobabooga/text-generation-webui/issues/4320
- Text-Generation-WebUI "None" character workaround: https://github.com/user-attachments/files/16227649/None.json

Are you willing to submit a PR?

affine-issue-bot[bot] commented 1 month ago

Issue Status: 🆕 Untriaged

🆕 **Untriaged**

The team has not yet reviewed the issue. We usually do it within one business day. Docs: https://github.com/toeverything/AFFiNE/blob/canary/docs/issue-triaging.md

This is an automatic reply by the bot.

SAnBlog commented 1 month ago

Great, I successfully connected!

Aliang-code commented 1 month ago

Is there a way to modify the "model" param in the request? Right now it is always "gpt-4o".
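(One way to see which model the local backend is actually serving, whatever name the request carries, is to query the models endpoint of the OpenAI-compatible API. A sketch, reusing the placeholder base URL and key from the setup above; text-generation-webui generally reports the model it currently has loaded.)

```js
// Sketch: list the models the local OpenAI-compatible endpoint reports,
// using the same placeholder base URL and API key as in affine.js above.
const OpenAI = require('openai');

const client = new OpenAI({
  baseURL: 'http://YOUR_PC_IP_ADDRESS:5000/v1',
  apiKey: 'sk-111111111111111111111111111111111111111111111111',
});

client.models
  .list()
  .then(page => {
    for (const model of page.data) {
      console.log(model.id); // the model(s) actually being served locally
    }
  })
  .catch(console.error);
```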

forehalo commented 4 weeks ago

will track this issue in #7705