[[http://www.gnu.org/licenses/gpl-3.0.txt][file:https://img.shields.io/badge/license-GPL_3-green.svg]] [[https://melpa.org/#/elisa][file:https://melpa.org/packages/elisa-badge.svg]]
ELISA (Emacs Lisp Information System Assistant) is a project designed to help Emacs users quickly find answers to their questions related to Emacs and Emacs Lisp. Utilizing the powerful Ellama package, ELISA provides accurate and relevant responses to user queries, enhancing productivity and efficiency in the Emacs environment. By integrating links to the Emacs info manual after answering a question, ELISA ensures that users have easy access to additional information on the topic, making it an essential tool for both beginners and advanced Emacs users.
ELISA builds an index from info manuals. When you send a message via ~elisa-chat~, it searches the index for semantically similar info nodes, takes the first ~elisa-limit~ of them, adds them to the context, and sends your message to the LLM. The LLM then generates an answer based on the provided context. Besides the generated answer, you also get links to the matching info manual nodes, so you can read the sources directly.
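Conceptually, the flow looks like the following Emacs Lisp pseudocode. The helper names here are illustrative only, not ELISA's real internals:

#+begin_src emacs-lisp
;; Illustrative pseudocode -- `semantic-search', `index' and `llm-answer'
;; are hypothetical names, not functions defined by ELISA.
(defun my/elisa-answer (prompt)
  "Sketch of the retrieval-augmented flow behind `elisa-chat'."
  (let* ((nodes (semantic-search index prompt))  ; similarity search over info nodes
         (context (seq-take nodes elisa-limit))) ; keep the first `elisa-limit' nodes
    ;; The context plus PROMPT go to the LLM, which generates the answer;
    ;; links to the matched info nodes are shown alongside it.
    (llm-answer context prompt)))
#+end_src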
** Installation
You need Emacs 29.2 or newer to use this package.
This package is now on [[https://melpa.org/#/getting-started][MELPA]], so you can simply ~M-x package-install RET elisa~.
*** Alternative method
You can use ~package-vc~ to install ELISA:
#+begin_src emacs-lisp
(package-vc-install "https://github.com/s-kostyaev/elisa")
#+end_src
*** System dependencies
You also need the ~sqlite-vss~ extension. You can download it manually from https://github.com/asg017/sqlite-vss/releases or by calling ~M-x elisa-download-sqlite-vss~.
You can use this package with different LLM providers. By default it uses the [[https://github.com/jmorganca/ollama][ollama]] provider for both embeddings and chat. If that works for you, install [[https://github.com/jmorganca/ollama][ollama]] and pull the models it uses:
#+begin_src shell
ollama pull nomic-embed-text
ollama pull sskostyaev/openchat:8k-rag
#+end_src
The second model is [[https://ollama.com/library/openchat][openchat]] with exactly two tweaks: the context window is extended to 8k, and the temperature is set to 0, which suits RAG (Retrieval Augmented Generation) better. You can also try other models.
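The two tweaks described above can be expressed, roughly, as an Ollama Modelfile. The exact base model tag below is an assumption; check the published model for the authoritative definition:

#+begin_src text
# Hypothetical Modelfile approximating sskostyaev/openchat:8k-rag
FROM openchat
PARAMETER num_ctx 8192
PARAMETER temperature 0
#+end_src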
Create an index for built-in, external, or all info manuals with one of the parse commands described in the Commands section below (for example, ~M-x elisa-async-parse-builtin-manuals~).
This can take some time.
** Commands
*** elisa-chat
Entry point. Performs a similarity search in the index, adds semantically similar info nodes to the context, and queries the LLM with your prompt. Uses ~ellama~ under the hood. Call one of the parse-manuals commands to create the index before using it.
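Besides the interactive ~M-x elisa-chat~, you can call it from Lisp. This sketch assumes ~elisa-chat~ takes the prompt as its argument, which may differ between versions:

#+begin_src emacs-lisp
;; Interactive use: M-x elisa-chat, then type your question.
;; Non-interactive sketch (the PROMPT argument is an assumption):
(elisa-chat "How do I define a minor mode?")
#+end_src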
*** elisa-download-sqlite-vss
Download [[https://github.com/asg017/sqlite-vss][sqlite vss]] extension to provide similarity search.
*** elisa-async-parse-builtin-manuals
Parse built-in Emacs info manuals asynchronously. Can take a long time.
*** elisa-async-parse-external-manuals
Parse external Emacs info manuals asynchronously.
*** elisa-async-parse-all-manuals
Parse all Emacs info manuals asynchronously.
One of the parse functions should be called before ~elisa-chat~ to create the index.
** Configuration
Example configuration. With the default installation you don't need it.
#+begin_src emacs-lisp
(use-package elisa
  :init
  (setopt elisa-limit 5)
  (require 'llm-ollama)
  (setopt elisa-embeddings-provider
	  (make-llm-ollama :embedding-model "nomic-embed-text"))
  (setopt elisa-chat-provider
	  (make-llm-ollama :chat-model "sskostyaev/openchat:8k-rag"
			   :embedding-model "nomic-embed-text")))
#+end_src
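If your Ollama instance runs on another machine, the ~llm-ollama~ provider can usually be pointed at it. The ~:host~ and ~:port~ keywords and the address below are assumptions; check your installed ~llm~ package version for the supported arguments:

#+begin_src emacs-lisp
;; Hypothetical remote-Ollama setup; :host/:port support depends on
;; your `llm' package version, and the address is a placeholder.
(setopt elisa-chat-provider
	(make-llm-ollama :host "192.168.1.10" :port 11434
			 :chat-model "sskostyaev/openchat:8k-rag"
			 :embedding-model "nomic-embed-text"))
#+end_src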
The following variables can be customized for ELISA:
** Contributions
To contribute, submit a pull request or report a bug. This library is planned to be part of GNU ELPA; major contributions must be from someone with FSF papers. Alternatively, you can write a module and share it on a different archive like MELPA.