Open netandreus opened 11 months ago
Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!
...but I can also be funny or helpful :smilecat: and I can generally give good tips and point to relevant places in the documentation or the code, based on what you wrote in the issue.
Don't engage in conversation with me, I don't support replying (yet)!
Sources:
+1
I'm also interested in better documentation/updates on how to use different embeddings. sentencetransformers is too restrictive; bert.cpp is OK, but it is very unstable and segfaults a lot on my system, and I would also like to use bigger models than the default mini one but can't find a good one. Leveraging llama.cpp embeddings would likely be ideal. Thanks!
@afonsoguerra likewise. Do you have any links on how to set up sentencetransformers with LocalAI? Following the default docs I get an error with autogptq:
make[1]: *** [Makefile:4: autogptq] Error 1
make[1]: Leaving directory '/Users/alex/workspace/mudler/LocalAI/backend/python/autogptq'
make: *** [Makefile:405: prepare-extra-conda-environments] Error 2
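As a side note while the backend build is sorted out: LocalAI exposes an OpenAI-compatible /v1/embeddings endpoint, so the request shape is independent of which backend (sentencetransformers, bert.cpp, llama.cpp) serves it. A minimal sketch of building that request body; the model name here is just the one from this issue's config and the endpoint URL is an assumption for illustration:

```python
import json

def build_embeddings_request(model: str, text: str) -> str:
    """Build the JSON body for an OpenAI-compatible /v1/embeddings call."""
    payload = {"model": model, "input": text}
    return json.dumps(payload)

# Hypothetical local endpoint; adjust host/port to your LocalAI instance.
url = "http://localhost:8080/v1/embeddings"
body = build_embeddings_request("text-embedding-ada-002", "hello world")
# POST `body` to `url` with Content-Type: application/json,
# e.g. with curl or any HTTP client.
```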
This should be merged with issue https://github.com/mudler/LocalAI/issues/1617
LocalAI version:
Environment, CPU architecture, OS, and Version:
Describe the bug
Trying to use llama embeddings:
model: https://huggingface.co/TheBloke/Llama-2-7B-GGUF/resolve/main/llama-2-7b.Q4_0.gguf
./models/text-embedding-ada-002.yaml
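For context, a sketch of what that model config might look like to route embedding requests to the llama.cpp backend; field names follow LocalAI's YAML model configuration and may differ between versions, so treat this as an assumption, not a verified config:

```yaml
# ./models/text-embedding-ada-002.yaml
# Sketch: serve the downloaded GGUF model under the OpenAI embedding
# model name, with embeddings enabled on the llama backend.
name: text-embedding-ada-002
backend: llama
embeddings: true
parameters:
  model: llama-2-7b.Q4_0.gguf
```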
Here is what happens on the server side: