Propheticus opened 1 month ago
This is a version without screenshots, in line with most of your docs apart from the quickstart.
An SVG image I had added in bulletpoint 2 under 'Jan' was removed by GitHub:
<svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="flex-shrink-0 text-muted-foreground"><rect width="18" height="18" x="3" y="3" rx="2"></rect><path d="m10 10-2 2 2 2"></path><path d="m14 14 2-2-2-2"></path></svg>
(this is a literal steal from your UI)
I hoped it would render locally like it does in Obsidian; replaced it with `<>` instead.
Did you change anything about the embedder settings? When I try to upload a document or use the Jan documentation website from your example above, I get the following after clicking "Save and Embed":

`Pinecone::Invalid ENV settings`
Update: never mind, solved: https://github.com/Mintplex-Labs/anything-llm/issues/1018. My settings showed I was already using LanceDB, but just clicking it and saving again fixed it. Looks like the backend and frontend got out of sync.
@hiro-v I've updated the first comment to reflect the new LLM selection option in ALLM
### AnythingLLM
AnythingLLM embeds documents and web URLs: it cuts them into chunks and stores them in a vector database.
AnythingLLM then uses the vector DB to find the chunks with the highest semantic similarity to your query and adds those as context to the prompt sent to the LLM running on Jan. You can 'pin' a particular document to paste it into the context in its entirety; how well pinning works depends on how well your model handles large contexts.
On top of threads, they've added the concept of workspaces. Per workspace you can embed sets of documents that belong together, so you can have separate workspaces for asking questions about different topics.
This enhances Jan by enabling more advanced RAG applications; Jan itself can currently only attach one document to a thread at a time.
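The retrieval step described above can be illustrated with a toy sketch. This is not AnythingLLM's actual code: the chunks, the 3-dimensional vectors, and the `retrieve`/`build_prompt` helpers are made up for illustration; in reality the vectors come from an embedding model and live in the vector DB (e.g. LanceDB).

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Pretend these chunks were already embedded (fake 3-d vectors).
chunks = [
    ("Jan exposes an OpenAI-compatible API on port 1337.", [0.9, 0.1, 0.0]),
    ("Workspaces group documents that belong together.", [0.1, 0.8, 0.1]),
    ("Pinned documents are pasted into the context whole.", [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, top_k=2):
    """Return the top_k chunk texts most similar to the query vector."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]

def build_prompt(query, query_vec):
    """Prepend the retrieved chunks as context to the user's question."""
    context = "\n".join(retrieve(query_vec))
    return f"Context:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("Which port does Jan use?", [1.0, 0.0, 0.1])
```

The prompt built this way is what gets sent to the LLM running on Jan; pinning a document would simply mean putting its full text into `context` instead of only the best-matching chunks.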
## Setup

### Jan - local API
1. Start the Local API Server in Jan by clicking `<>` in the bottom left corner.

### AnythingLLM
1. Set the Base URL to `http://<IP>:<port>/v1`; if you used the defaults that would be `http://127.0.0.1:1337/v1`.
2. Enter the model's `"id"` as the model name. This looks something like `llama-3-8b-instruct-32k-q8`.
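The Base URL above points at Jan's OpenAI-compatible endpoint, so any OpenAI-style client can talk to it. As a rough sketch (assuming the Local API Server is running on the default address and the standard `/chat/completions` route; the prompt text is made up):

```python
import json
from urllib import request

BASE_URL = "http://127.0.0.1:1337/v1"          # Jan's default local API address
MODEL_ID = "llama-3-8b-instruct-32k-q8"        # the model "id" from Jan

def build_chat_request(base_url: str, model_id: str, user_prompt: str):
    """Build the URL and JSON body for an OpenAI-style chat completion."""
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model_id,
        "messages": [{"role": "user", "content": user_prompt}],
    }).encode()
    return url, body

def send(url: str, body: bytes) -> dict:
    """POST the request to the Local API Server (requires Jan to be running)."""
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)

url, body = build_chat_request(BASE_URL, MODEL_ID, "What is RAG?")
# send(url, body)  # uncomment with the Local API Server running
```

AnythingLLM does the equivalent of this under the hood once you point it at the Base URL and model id.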
### Setting up a workspace
### Your first RAG
1. Upload a document by clicking `↥` next to the cogwheel (⚙️).

This is the principle. You can do the same with text files, audio, CSVs, spreadsheets, ....
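For tabular sources like CSVs, the principle is the same: the file is flattened into text chunks before embedding. A minimal sketch of one way to do that (the column names and data are invented for illustration; AnythingLLM's actual chunking is more involved):

```python
import csv
import io

# Invented sample data for illustration only.
raw = "title,port\nJan local API,1337\n"

def rows_to_chunks(csv_text: str) -> list[str]:
    """Flatten each CSV row into a 'key: value' text chunk for embedding."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [", ".join(f"{k}: {v}" for k, v in row.items()) for row in reader]

chunks = rows_to_chunks(raw)  # ["title: Jan local API, port: 1337"]
```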