This plug integrates various LLMs (Large Language Models) into SilverBullet, the markdown-based note-taking tool, allowing users to perform various AI-related tasks directly within their notes. It requires SilverBullet, as well as access to a self-hosted or SaaS LLM such as Ollama, OpenAI (ChatGPT), or Google Gemini.
silverbullet-ai is very new and is still in early development. It may not work as expected. Please report any issues you encounter, or feature ideas.
If you are new here, start with either the `AI: Chat on current page` command or the custom templated prompts!
Warning: Please back up your notes before using this plug. It inserts and replaces text at certain points and isn't well-tested yet.
Listed below are the commands available in this plug.
After installing the plug, you can access its features through the command palette. To ensure the plug functions correctly, you must set `OPENAI_API_KEY` on the `SECRETS` page. If you do not have a `SECRETS` page, create one and name it `SECRETS`. Then, insert a YAML block as shown below, replacing `"openai key here"` with your actual OpenAI API key:
```yaml
OPENAI_API_KEY: "openai key here"
```
`OPENAI_API_KEY` is currently required for any OpenAI-compatible model, but it may not get used by local models that don't use keys. The secret does not necessarily have to be `OPENAI_API_KEY`; it can be any name you want, as long as you also change the `secretName` for the model to match. This allows you, for example, to have multiple API keys for the same provider.
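As a sketch of that, the config below points two OpenAI-compatible models at differently named secrets; the model names and secret names here are only illustrative:

```yaml
ai:
  textModels:
    # Hypothetical example: two models on the same provider, each reading
    # its API key from a differently named secret on the SECRETS page.
    - name: openai-work
      modelName: gpt-4-0125-preview
      provider: openai
      secretName: OPENAI_API_KEY_WORK
    - name: openai-personal
      modelName: gpt-3.5-turbo-0125
      provider: openai
      secretName: OPENAI_API_KEY_PERSONAL
```

Each `secretName` must match an entry in the `SECRETS` page's YAML block.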
To change the text generation model used by all commands, or other configurable options, open your `SETTINGS` page and change the settings below:
```yaml
ai:
  # Configure one or more image models. Only OpenAI's API is currently supported.
  imageModels:
    - name: dall-e-3
      modelName: dall-e-3
      provider: dalle
    - name: dall-e-2
      modelName: dall-e-2
      provider: dalle
  # Configure one or more text models.
  # Provider may be openai or gemini. Most local or self-hosted LLMs offer an
  # OpenAI-compatible API, so choose openai as the provider for those and
  # change the baseUrl accordingly.
  textModels:
    - name: ollama-phi-2
      modelName: phi-2
      provider: openai
      baseUrl: http://localhost:11434/v1
      requireAuth: false
    - name: gpt-4-turbo
      provider: openai
      modelName: gpt-4-0125-preview
    - name: gpt-4-vision-preview
      provider: openai
      modelName: gpt-4-vision-preview
    - name: gpt-3-turbo
      provider: openai
      modelName: gpt-3.5-turbo-0125
  # The chat section is optional, but may help provide better results
  # when using the Chat On Page command.
  chat:
    userInformation: >
      I'm a software developer who likes taking notes.
    userInstructions: >
      Please give short and concise responses. When providing code,
      do so in python unless requested otherwise.
```
To use Ollama locally, make sure it is running first and the desired models are downloaded. Then, set `baseUrl` to the URL of your Ollama instance:
```yaml
ai:
  textModels:
    - name: ollama-phi-2
      # Run `ollama list` to see a list of models downloaded
      modelName: phi
      provider: openai
      baseUrl: http://localhost:11434/v1
      requireAuth: false
```
`requireAuth`: When using Ollama and Chrome, `requireAuth` needs to be set to `false` so that the `Authorization` header isn't set. Otherwise you will get a CORS error.
Mistral.ai is a hosted service that offers an OpenAI-compatible API. You can use it with settings like this:
```yaml
ai:
  textModels:
    - name: mistral-medium
      modelName: mistral-medium
      provider: openai
      baseUrl: https://api.mistral.ai/v1
      secretName: MISTRAL_API_KEY
```
`MISTRAL_API_KEY` also needs to be set in `SECRETS`, using an API key generated from their web console.
Perplexity.ai is another hosted service that offers an OpenAI-compatible API and various models. You can use it with settings like this:
```yaml
ai:
  textModels:
    - name: sonar-medium-online
      modelName: sonar-medium-online
      provider: openai
      baseUrl: https://api.perplexity.ai
```
`OPENAI_API_KEY` also needs to be set in `SECRETS` to an API key generated from their web console.
Google does not offer an OpenAI-compatible API, so consider the support for Gemini to be very experimental for now. To configure it, you can use these settings:
```yaml
ai:
  textModels:
    - name: gemini-pro
      modelName: gemini-pro
      provider: gemini
      baseUrl: https://api.gemini.ai/v1
      secretName: GOOGLE_AI_STUDIO_KEY
```
Note: The `secretName` defined here means you need to put the API key from Google AI Studio in your `SECRETS` file as `GOOGLE_AI_STUDIO_KEY`.
Note 2: AI Studio is not the same as the Gemini app (previously Bard). You may have access to https://gemini.google.com/app, but it does not offer the API key needed for integrating 3rd-party tools. Instead, you need access to https://aistudio.google.com/app specifically.
DALL-E can be configured for generating images with these settings:
```yaml
ai:
  imageModels:
    - name: dall-e-3
      modelName: dall-e-3
      provider: dalle
    - name: dall-e-2
      modelName: dall-e-2
      provider: dalle
```
`OPENAI_API_KEY` also needs to be set in `SECRETS` to an API key generated in the OpenAI web console.
`baseUrl` can also be set to another API compatible with OpenAI/DALL-E.
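As a sketch of what that could look like, the config below points the image model at an alternative backend; the URL and secret name are placeholders, not a real service, and whether `secretName` is honored for image models is an assumption based on the text-model settings:

```yaml
ai:
  imageModels:
    - name: dalle-compatible
      modelName: dall-e-3
      provider: dalle
      # Placeholder: replace with your OpenAI/DALL-E-compatible endpoint
      baseUrl: https://my-image-api.example.com/v1
      # Assumed to work like text models; verify against your setup
      secretName: MY_IMAGE_API_KEY
```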
OpenAI introduced custom instructions for ChatGPT a while back to help improve the responses from ChatGPT. We are emulating that feature by allowing a system prompt to be injected into each new chat session.
The system prompt is rendered from the pieces below; see the example config above for where to configure these settings:

- Always added: "This is an interactive chat session with a user in a note-taking tool called SilverBullet."
- If `userInformation` is set, this is added: "The user has provided the following information about their self: ${ai.chat.userInformation}"
- If `userInstructions` is set, this is added: "The user has provided the following instructions for the chat, follow them as closely as possible: ${ai.chat.userInstructions}"
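Putting those pieces together, with the `userInformation` and `userInstructions` values from the earlier example config, the rendered system prompt would look roughly like this:

```text
This is an interactive chat session with a user in a note-taking tool called SilverBullet.
The user has provided the following information about their self: I'm a software developer who likes taking notes.
The user has provided the following instructions for the chat, follow them as closely as possible: Please give short and concise responses. When providing code, do so in python unless requested otherwise.
```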
NOTE: All built-in prompts will be replaced with templated prompts eventually.
As of 0.0.6, you can use template notes to create your own custom prompts to send to the LLM.
Template notes make use of all of the template language available to SilverBullet.
To be a templated prompt, the note must have the following frontmatter:

- `tags` must include `template` and `aiPrompt`
- an `aiprompt` object must exist and have a `description` key
- `aiprompt.systemPrompt` can be specified to override the system prompt

For example, here is a templated prompt to summarize the current note and insert the summary at the cursor:
```markdown
---
tags:
- template
- aiPrompt

aiprompt:
  description: "Generate a summary of the current page."
---

Generate a short and concise summary of the note below.

title: {{@page.name}}
Everything below is the content of the note:

{{readPage(@page.ref)}}
```
With the above note saved as `AI: Generate note summary`, you can run the `AI: Execute AI Prompt from Custom Template` command from the command palette, select the `AI: Generate note summary` template, and the summary will be streamed to the current cursor position.
Another example prompt pulls in remote pages via federation and asks the LLM to generate a space script for you:
```markdown
---
tags:
- template
- aiPrompt

aiprompt:
  description: "Describe the space script functionality you want and generate it"
  systemPrompt: "You are an expert javascript developer. Help the user develop new functionality for their personal note taking tool."
  slashCommand: aiSpaceScript
---

SilverBullet space script documentation:

{{readPage([[!silverbullet.md/Space%20Script]])}}

Using the above documentation, please create a space-script following the user's description in the note below. Output only valid markdown with a code block using space-script. No explanations, code in a markdown space-script block only. Must contain silverbullet.registerFunction or silverbullet.registerCommand.

title: {{@page.name}}
Everything below is the content of the note:

{{readPage(@page.ref)}}
```
While this plug is free to use, OpenAI does charge for API usage. Please see their pricing page for the current cost of the various APIs.
To build this plug, make sure you have SilverBullet installed. Then, build the plug with:
```shell
deno task build
```

Or to watch for changes and rebuild automatically:

```shell
deno task watch
```
Then, copy the resulting `.plug.js` file into your space's `_plug` folder. Or build and copy in one command:

```shell
deno task build && cp *.plug.js /my/space/_plug/
```
SilverBullet will automatically sync and load the new version of the plug (or speed up this process by running the {[Sync: Now]} command).
Add the following to your `PLUGS` file, run the `Plugs: Update` command, and off you go!
For in-development code from the main branch:
```yaml
- github:justyns/silverbullet-ai/silverbullet-ai.plug.js
```
For the latest "release" code, mostly also still in development for now:
```yaml
- ghr:justyns/silverbullet-ai/0.4.0
```
You can also use the `Plugs: Add` command and enter the above URL to install.
After installing, be sure to make the necessary config changes in SETTINGS and SECRETS.