Describe the bug
I am using the exact same configuration on Linux and macOS:
```yaml
model: claude:claude-3-5-sonnet-20240620
clients:
  - type: claude
    api_key: <api-key>
  - type: ollama
    api_base: http://localhost:11434
    api_auth: null
    models:
      - name: mxbai-embed-large
        type: embedding
rag_embedding_model: ollama:mxbai-embed-large
rag_chunk_size: 3000    # Specifies the chunk size
rag_chunk_overlap: 150  # Specifies the chunk overlap
# Define document loaders to control how RAG and .file/--file load files of specific formats.
document_loaders:
  # You can add custom loaders using the following syntax:
  #   <file-extension>: <command-to-load-the-file>
  # Note: Use $1 for the input file and $2 for the output file. If $2 is omitted, use stdout as output.
  pdf: 'pdftotext $1 -'                # Load .pdf files; see https://poppler.freedesktop.org to set up pdftotext
  docx: 'pandoc --to plain $1'         # Load .docx files; see https://pandoc.org to set up pandoc
  recursive_url: 'rag-crawler $1 $2'   # Load websites; see https://github.com/sigoden/rag-crawler to set up rag-crawler
```
On Linux, if I type `.rag` in aichat, it works and uses the `ollama:mxbai-embed-large` model; on macOS it outputs `Unknown embedding model 'ollama:mxbai-embed-large'`.
To Reproduce
Use the configuration above, pull `mxbai-embed-large` into Ollama, and call `.rag` in aichat.
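A minimal reproduction from a shell might look like the following sketch (assuming Ollama and aichat are installed locally and the Ollama daemon is running; `.rag` is entered inside the aichat REPL, not at the shell):

```shell
# Pull the embedding model into the local Ollama instance
ollama pull mxbai-embed-large

# Optionally confirm the model is available
ollama list

# Start the aichat REPL with the configuration above, then enter:
#   .rag
aichat
```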
Expected behavior
It works the same way it does on Linux.