makiz1999 opened 1 week ago
Hey, thanks for the report. Please could you explain why you are using codellama:7b-instruct
as the embedding model?
Hi, I was experimenting with different models to see if the output or quality of responses would change. From my observations, the responses are much the same regardless of the model. I get a similar error when using llama3.1:latest:
The response with llama3.1:latest is different but still doesn't make sense or reference the actual file.
Do you know by chance how to fix it? Thanks
Interesting, would have to investigate, maybe something changed with LanceDB recently...
Hey, I just released version 3.19.0. I would recommend using https://ollama.com/library/all-minilm for the embedding model; it is the one I have had the best results with in the past. Let me know if you find a better one.
Many thanks!
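For reference, swapping the embedding model can be checked outside the extension by querying a local Ollama server directly. This is a hedged sketch, not the extension's own code: it assumes Ollama is running on its default port (11434) and that all-minilm has been pulled.

```python
# Sketch: requesting an embedding for a code snippet from a local Ollama
# server using the all-minilm model. Assumes Ollama listens on localhost:11434.
import json
import urllib.request

payload = {"model": "all-minilm", "prompt": "def hello(): return 'world'"}
req = urllib.request.Request(
    "http://localhost:11434/api/embeddings",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Uncomment to send the request against a running Ollama instance:
# with urllib.request.urlopen(req) as resp:
#     embedding = json.load(resp)["embedding"]
#     print(len(embedding))
```

If the call returns a non-empty vector, the embedding model itself is working and any remaining problem is on the extension side.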
Thank you so much! I will try to test it out soon. Do you have recommendations for chat and FIM models as well?
I have installed all-minilm and set it up as the embedding model. Unfortunately, it still doesn't work as expected and can't reference other files. Here is the error after pressing 'Embed workspace documents'.
I think maybe your folder has a space in the name? Please try removing the space for now. I'll fix that bug in the next release.
Many thanks.
Describe the bug
I am trying to use embeddings to create context awareness of my current directory so that I can ask a chatbot for details within a file. I have a python file with a simple function:
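The original report attached a screenshot of the file, which is not reproduced here. As a stand-in, a minimal Python file with one simple function might look like this (the file and function names are hypothetical):

```python
# sample.py - hypothetical stand-in for the file in the original report
def add_numbers(a, b):
    """Return the sum of two numbers."""
    return a + b

print(add_numbers(2, 3))  # prints 5
```

A file this small is a reasonable smoke test: if the chatbot can't reference even a single short function after embedding, the workspace indexing itself is likely failing.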
And this is the chatbot's response:
Logging
After embedding workspace documents:
API Provider
Chat or Auto Complete? Chat
Model Name
codellama:7b-instruct
Desktop (please complete the following information):