CallMeLaNN opened 1 month ago
This is the notification.
I tried removing ~/.continue/index
in case it had gotten corrupted, but even with a fresh index cache the error still appears when using codebase retrieval.
I tried a few projects (web and Python) in case it was project-related; still not working. However, it is fine in a new project with a single file.
So maybe there's a project-specific cache somewhere, but I'm not aware of anywhere other than ~/.continue/index/lancedb/{project path}...
and I already cleared that for a fresh index.
Hopefully someone can tell me where else I can clear the cache.
What happens if you remove the re-ranker definition from your config.json file?
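For anyone following along, a reranker is configured as a top-level `reranker` entry in `config.json`. The provider name, model, and key below are placeholder values (not from this thread), so check the Continue docs for the exact options your provider supports:

```json
{
  "reranker": {
    "name": "cohere",
    "params": {
      "model": "rerank-english-v3.0",
      "apiKey": "<YOUR_API_KEY>"
    }
  }
}
```

Deleting the whole `reranker` entry disables reranking, which makes it easy to confirm whether that step is the one failing.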
You are right. I removed it and it works fine in my project now.
Is there any way to get a log or check further? So far the embedding results contain some irrelevant info, so the LLM can't answer correctly.
See here for logs etc -> https://docs.continue.dev/troubleshooting#llm-prompt-logs
That log doesn't include anything reranker-related. Continue only logs I/O for the LLM; it doesn't log I/O for the embeddings and reranker models before the context is sent to the LLM, as in my log above.
I mean I can't use a reranker for my projects, and I'm not sure how to investigate further without any log.
... but wait, I get it now: the reranker returned an unexpected result. I'll wait a while in case I hit a rate limit.
It turned out that I hit the TPM rate limit, roughly the embedding chunk size times the codebase nRetrieve
param. I can avoid the error by lowering nRetrieve.
I'm not sure if I can adjust the chunk size.
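In case it helps anyone else, `nRetrieve` is set through the codebase context provider's `params` in `config.json`; the numbers below are illustrative, not recommended values:

```json
{
  "contextProviders": [
    {
      "name": "codebase",
      "params": {
        "nRetrieve": 10,
        "nFinal": 5
      }
    }
  ]
}
```

Roughly speaking, the tokens sent to the reranker per query scale with chunk size × nRetrieve, so halving nRetrieve roughly halves the chance of tripping a TPM limit.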
So maybe an improvement could be made to catch and log unexpected responses. I never expected it was due to an invalid response or rate limiting.
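The improvement I have in mind could look something like the sketch below: wrap the reranker call so rate-limit errors and malformed responses get logged instead of failing silently. The `rerank()` stub and the `RerankResult` shape are assumptions for illustration, not Continue's actual internal API.

```typescript
// Hypothetical shape of a reranker result; not Continue's real type.
type RerankResult = { index: number; score: number };

// Stub standing in for a real reranker API call. Here it fails on large
// batches to simulate a provider rate limit (HTTP 429).
async function rerank(query: string, chunks: string[]): Promise<RerankResult[]> {
  if (chunks.length > 2) {
    throw new Error("429: rate limit exceeded");
  }
  return chunks.map((_, i) => ({ index: i, score: 1 - i * 0.1 }));
}

// Wrapper that logs unexpected responses and errors instead of letting
// them surface as an opaque retrieval failure.
async function rerankWithLogging(query: string, chunks: string[]): Promise<RerankResult[]> {
  try {
    const results = await rerank(query, chunks);
    if (!Array.isArray(results) || results.some((r) => typeof r.score !== "number")) {
      console.error("Reranker returned an unexpected response:", results);
      return [];
    }
    return results;
  } catch (err) {
    // Surface rate limits and other failures in the log.
    console.error(`Reranker call failed (query="${query}"):`, err);
    return [];
  }
}
```

With something like this, the log would have pointed straight at the 429 instead of leaving the failure invisible.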
Relevant environment info
Description
I'm evaluating different embedding models and configs. I had no embedding issues before, but since yesterday I've been getting this error. I tried changing settings, tried a different provider, and switched back to the old one, but it doesn't work anymore.
I haven't included my embeddings config above because I tried every provider and it still doesn't work; it was working before. Let me know if you still need an example of mine.
To reproduce
Log output
EDIT: Before this happened, I know my embedding was working fine; I saw the log produce relevant context from my codebase.