continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains.
https://docs.continue.dev/
Apache License 2.0

Cannot use @Codebase: file was not found locally ... models/all-MiniLM-L6-v2/tokenizer.json #1479

Open w013nad opened 3 months ago

w013nad commented 3 months ago

Relevant environment info

- OS: Windows 11
- Continue:0.9.158
- IDE: VSCode 1.70.2

Description

This appears to be the same issue as #895, which was fixed in February, but I am hitting it again on the latest versions.

To reproduce

  1. Start Continue
  2. Wait for codebase indexing; the error occurs during the embedding step

Log output

Error indexing codebase: Error. local_files_only=true or 'env.allowRemoteModels=false' and file was not found locally at "C:\Users\ndurkee\.vscode\extensions\continue-continue-0.9.158-win32-x64\models\all-MiniLM-L6-v2\tokenizer.json".
neominik commented 3 months ago

For @codebase you can work around the issue by using the nomic-embed-text embeddings provider from Ollama. However, @docs indexing seems to ignore that setting.

I am getting a very similar error message on macOS and IntelliJ with Continue v0.0.50, even when downloading the tokenizer from Hugging Face manually and placing it at that exact path:

Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "/snapshot/continue-deploy/binary/models/all-MiniLM-L6-v2/tokenizer.json".
LeoLiuYan commented 3 months ago

Any progress on this issue?

nggary commented 2 months ago

I am facing the same issue as well!

(Screenshot: 2024-06-24, 12:32 AM)
buzzlightyear2k commented 2 months ago

I'm facing the same issue; any help would be appreciated.

nggary commented 2 months ago

Guys, if you are using IntelliJ, please use local Ollama with nomic-embed-text as the embeddingsProvider.

Revise your config.json to include:

"embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text",
    "apiBase": "http://localhost:11434"
  },
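For context, here is a minimal sketch of how that fragment fits into a full config.json. The `models` entry is a placeholder for whatever chat model you already have configured, and this assumes Ollama is running locally with the embedding model already pulled via `ollama pull nomic-embed-text`:

```json
{
  "models": [
    {
      "title": "Local chat model (placeholder)",
      "provider": "ollama",
      "model": "llama3"
    }
  ],
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text",
    "apiBase": "http://localhost:11434"
  }
}
```

With this in place, @codebase indexing uses the local Ollama endpoint instead of the bundled all-MiniLM-L6-v2 tokenizer, sidestepping the missing-file error.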
neominik commented 2 months ago

> Guys, if you are using IntelliJ, please use local Ollama with nomic-embed-text as the embeddingsProvider.
>
> Revise your config.json to include:
>
>     "embeddingsProvider": {
>       "provider": "ollama",
>       "model": "nomic-embed-text",
>       "apiBase": "http://localhost:11434"
>     },

That works for most things, but not for @docs, unfortunately.

tmibkr commented 1 month ago

I'm getting the same issue with IntelliJ Ultimate 2024.1 on Windows with Continue 0.0.55. Unfortunately, using Ollama as a workaround is not an option for me.