w013nad opened this issue 3 months ago
For @codebase you can work around the issue by using the nomic-embed-text embeddings provider from Ollama. However, @docs indexing seems to ignore that setting.
I am getting a very similar error message on macOS with IntelliJ and Continue v0.0.50, even after downloading the tokenizer from Hugging Face manually and placing it at that exact path:
Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "/snapshot/continue-deploy/binary/models/all-MiniLM-L6-v2/tokenizer.json".
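For anyone debugging this: the error means the bundled runtime is configured to resolve models strictly from local files and refuses to fetch them remotely. A minimal Python sketch of that failure mode (the function name and behavior here are hypothetical, for illustration only; Continue's actual runtime is JavaScript):

```python
from pathlib import Path

# Hypothetical illustration of the failure mode: with remote model
# downloads disabled (local_files_only=true / allowRemoteModels=false),
# a missing tokenizer.json raises instead of being fetched from
# Hugging Face.
def resolve_local_tokenizer(model_dir: str) -> Path:
    path = Path(model_dir) / "tokenizer.json"
    if not path.is_file():
        raise FileNotFoundError(
            f'`local_files_only=true` and file was not found locally at "{path}"'
        )
    return path
```

The catch reported above is that even placing the file at the path from the error message does not help, which suggests the packaged binary is not reading from that filesystem location at all.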
Is there any progress on this issue?
I am facing the same issue as well!
I'm facing the same issue. Any help would be appreciated.
Guys, if you are using IntelliJ, please use local Ollama with nomic-embed-text as the embeddingsProvider.
Revise your config.json to:

```json
"embeddingsProvider": {
  "provider": "ollama",
  "model": "nomic-embed-text",
  "apiBase": "http://localhost:11434"
},
```
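For context, here is where that entry might sit in a full config.json (a minimal sketch; the surrounding structure is an assumption based on Continue's config format at these versions, not an official example):

```json
{
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text",
    "apiBase": "http://localhost:11434"
  }
}
```

This assumes Ollama is running locally on its default port (11434) and that the model has already been pulled with `ollama pull nomic-embed-text`.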
That works for most things, but not for @docs, unfortunately.
I'm getting the same issue with IntelliJ Ultimate 2024.1 on Windows with Continue 0.0.55. Unfortunately, using Ollama as a workaround is not an option for me.
Description
This appears to be the same issue as #895, which was fixed in February, but I am hitting it again with the latest versions.