Closed — Greatz08 closed this 3 months ago
Same issue here when trying to index some documentation and then query it from chat:
Error getting context items from docs: Error: lance error: LanceError(IO): Schema error: No field named baseurl. Valid fields are title, "baseUrl", content, path, "startLine", "endLine", vector., /Users/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/lance-0.10.4/src/io/exec/planner.rs:659:20
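The mismatch between `baseurl` and the valid field `"baseUrl"` suggests the filter expression was built with an unquoted identifier: SQL-style planners like Lance's fold unquoted identifiers to lowercase, so camelCase columns must be double-quoted. A minimal sketch of the idea — `quoteIdent` is a hypothetical helper for illustration, not part of Continue or Lance:

```typescript
// Hypothetical helper: double-quote identifiers containing uppercase
// characters so a SQL-style planner does not fold them to lowercase.
function quoteIdent(name: string): string {
  return /[A-Z]/.test(name) ? `"${name}"` : name;
}

// Building a filter against the docs table's camelCase columns:
const filter = `${quoteIdent("baseUrl")} = 'https://docs.continue.dev'`;
// filter is: "baseUrl" = 'https://docs.continue.dev'
```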
@alexsoyes I just solved this here. Either it's in the current pre-release or will be in the next one
@sestinj will this fix be pushed to jetbrains plugin? I'm on 0.0.46 and I'm experiencing this issue as well
when trying to use "Select Code ( CTRL + J ) /edit ..."
Error running handler for "context/getContextItems": Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "C:\snapshot\continue-deploy\binary\models/all-MiniLM-L6-v2/tokenizer.json".
Similar issue on Windows w/ the Jetbrains plugin, this appears in the logs on start up:
[2024-06-07T22:35:39] [info] Starting Continue core...
[2024-06-07T22:35:40] Error updating the vectordb::all-MiniLM-L6-v2 index: Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "C:\snapshot\continue-deploy\binary\models/all-MiniLM-L6-v2/tokenizer.json".
The tokenizer.json file does not exist locally (nor does C:\snapshot, for that matter).
Fresh installation, when using the @folder context:
Error getting context items from folder: Error: Schema Error. No vector column found to create index. Source: Continue - Codestral, GPT-4o, and more
I'm getting this error message.
Error getting context items from folder: SyntaxError: Unexpected end of JSON input
I'm using Linux on a remote VSCode server.
@womd @bradleymize we don't currently support the transformers.js embeddings provider on JetBrains, which is why you're seeing this error. In newer versions we'll have a more clear warning, but for now the solution is to use "ollama" or another embeddings provider: https://docs.continue.dev/walkthroughs/codebase-embeddings#embeddings-providers
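For reference, switching to the Ollama provider is a small change in config.json; the model name below is just the example used in the linked docs — any embedding model you have pulled into Ollama works:

```json
{
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```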
@Devil-Mix it looks like your problem is entirely different. Can you share what your config.json looks like?
@sestinj Thanks for the clarification. Confirmed I could use the code from "Writing a custom EmbeddingsProvider", updating the URL to LM Studio's http://localhost:1234/v1/embeddings, and indexing was successful after restarting IntelliJ.
For anyone else with this issue: I updated config.ts as documented and removed the embeddingsProvider configuration from config.json.
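For anyone following the same route, here is a rough sketch of what the config.ts change can look like. The `modifyConfig`/`embed` shape follows the custom-EmbeddingsProvider walkthrough, but treat the exact field names as assumptions; the types are stubbed locally so the snippet stands alone:

```typescript
// Stub of the relevant slice of Continue's Config type (assumption: the
// real types come from the Continue package, not defined here).
interface EmbeddingsProvider {
  embed: (chunks: string[]) => Promise<number[][]>;
}
interface Config {
  embeddingsProvider?: EmbeddingsProvider;
}

// config.ts entry point: Continue calls modifyConfig after loading config.json.
export function modifyConfig(config: Config): Config {
  config.embeddingsProvider = {
    embed: async (chunks: string[]) => {
      // LM Studio exposes an OpenAI-compatible embeddings endpoint.
      const resp = await fetch("http://localhost:1234/v1/embeddings", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ input: chunks, model: "local-model" }),
      });
      const json = (await resp.json()) as { data: { embedding: number[] }[] };
      return json.data.map((d) => d.embedding);
    },
  };
  return config;
}
```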
Very nice! One other way to approach this would be using the "openai" embeddingsProvider with an apiBase pointing to LMStudio, as it is OpenAI-compatible
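Concretely, that alternative would look something like this in config.json — apiBase pointing at LM Studio's local server, and the model field set to whatever name LM Studio reports for the loaded model:

```json
{
  "embeddingsProvider": {
    "provider": "openai",
    "model": "local-model",
    "apiBase": "http://localhost:1234/v1"
  }
}
```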
@sestinj Thx. It's working fine now. I think it was something to do with adding the workspace.
It's working for me now!
Before submitting your bug report
Relevant environment info
Description
Trying to feed a folder as context and get answers about some of its files, but getting this error.
To reproduce
No response
Log output