note that it needs to run locally, so compute resources need to be taken into account
[ ] BERT
[ ] DistilBERT - Python. The original BERT model can be computationally intensive; DistilBERT is a smaller, faster, less resource-hungry distillation designed for efficient inference. Pre-trained weights can be downloaded from the Hugging Face Transformers library.
[ ] SentenceTransformers - Python
[x] OpenAI's Ada - supports JS 😉 and Python - not a true sentence-transformer, and as an API-based model it does not run locally; may require more setup than SentenceTransformers.
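Whichever model is chosen, the downstream step is the same: encode sentences to vectors locally and compare them with cosine similarity. A minimal sketch, assuming the `sentence-transformers` package and the small `all-MiniLM-L6-v2` checkpoint (both are assumptions, not from the list above; any of the candidates could be swapped in):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

if __name__ == "__main__":
    try:
        # Assumed model name; ~80 MB on disk, runs comfortably on CPU,
        # which matters given the local-compute constraint noted above.
        from sentence_transformers import SentenceTransformer
        model = SentenceTransformer("all-MiniLM-L6-v2")
        emb = model.encode(["The cat sits on the mat.",
                            "A feline rests on a rug."])
        print(cosine_similarity(emb[0], emb[1]))
    except ImportError:
        print("pip install sentence-transformers to run the demo")
```

The guarded import keeps the similarity helper usable even before the model dependency is installed, which makes it easy to compare candidates later.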