koaning / embetter

just a bunch of useful embeddings
https://koaning.github.io/embetter/
MIT License

Auto-detect MPS availability when using SentenceEncoder or ClipEncoder #107

Closed glebzhelezov closed 2 months ago

glebzhelezov commented 2 months ago

This small PR sets the torch device in SentenceEncoder to mps whenever CUDA is unavailable but MPS is. This speeds up inference on newer Macs and doesn't affect other devices.

Running the following script

from embetter.text import SentenceEncoder
from sklearn.datasets import fetch_20newsgroups

# Embed two newsgroup categories with a small sentence-transformers model.
encoder = SentenceEncoder("all-MiniLM-L6-v2")
cats = ["alt.atheism", "sci.space"]
newsgroups_train = fetch_20newsgroups(subset="train", categories=cats)
embeddings = encoder.fit_transform(newsgroups_train["data"])

on my M3 MacBook Air with 16 GB of RAM takes ~10 seconds with the changes in this PR (and activates the GPU, according to mactop), versus ~15 seconds without them.

koaning commented 2 months ago

Cool, I really like this change!

One small ask though: could you also apply this change to the CLIP model? That one uses sentence-transformers under the hood as well, so it would be nice to include it in this PR.

glebzhelezov commented 2 months ago

@koaning Done!

koaning commented 2 months ago

Grand. Will merge once it looks green.

koaning commented 2 months ago

Just made a new release!