conanak99 opened this issue 1 year ago
Can this be run locally on CPU only?
I would like to avoid adding dependencies just for the purpose of demos.
Hi @jdagdelen, it can be run on CPU only; all-MiniLM-L6-v2 is a lightweight model.
I understand the concern about adding new dependencies. Fixed that by adding a comment in the demo so people know which dependencies to install instead :D
One more thing to note: based on my tests, the embeddings from the local model are not as good as OpenAI's.
- The `result` of this line in `demo/demo.py` returns a list of `tuple(document, similarity)`, so the demo code will not work correctly. This is fixed in this PR.
- … `OPENAI_API_KEY` …
- Added `demo/demo_custom_embed.py` that makes use of SentenceTransformers to generate embeddings, which can be run locally.
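Since `result` is a list of `tuple(document, similarity)`, calling code needs to unpack both elements. A minimal sketch of ranking such tuples (the documents and 2-d embedding vectors here are made up; real embeddings would come from the model):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank(query_vec, docs):
    # docs: list of (document_text, embedding) pairs.
    # Returns a list of (document, similarity) tuples, best match first.
    results = [(text, cosine(query_vec, emb)) for text, emb in docs]
    return sorted(results, key=lambda t: t[1], reverse=True)

docs = [
    ("apple", [1.0, 0.0]),
    ("banana", [0.0, 1.0]),
    ("apple pie", [0.9, 0.1]),
]
for doc, sim in rank([1.0, 0.0], docs):
    print(f"{doc}: {sim:.2f}")
```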