huggingface / setfit

Efficient few-shot learning with Sentence Transformers
https://hf.co/docs/setfit
Apache License 2.0
2.2k stars · 219 forks

Onnx support? #78

Closed blakechi closed 1 year ago

blakechi commented 2 years ago

Hi, really like this work!

Given ONNX's advantage of faster inference, have you considered adding helper functions, like the example below, to export a trained SetFitTrainer to the ONNX format for production use?

If that sounds promising, I will be happy to make this feature work!

Example:

# Train 
trainer.train()

# Compile to onnx
onnx_path = "path/to/store/compiled/model.onnx"
trainer.to_onnx(onnx_path, **onnx_related_kwargs)
lewtun commented 2 years ago

Hi @blakechi thanks for your interest in our work! Instead of compiling SetFitTrainer into ONNX, I think it would be better to have something like an onnx_export() function that lives inside a setfit.onnx module.

Ideally, this function would take a SetFitModel as input and output model.onnx, similar to what was done for Stable Diffusion here: https://github.com/huggingface/diffusers/blob/main/scripts/convert_stable_diffusion_checkpoint_to_onnx.py#L32

Since the current implementation of SetFitModel uses scikit-learn estimators for the classification head, it might be best to wait until we implement a pure PyTorch version in #8
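
A minimal sketch of the interface being proposed here (the function name comes from this thread, but the module layout, argument names, and defaults are assumptions rather than a final API):

# setfit/onnx.py -- hypothetical interface sketch only, not an actual implementation
from setfit import SetFitModel


def onnx_export(model: SetFitModel, output_path: str = "model.onnx", opset: int = 15) -> None:
    """Export a trained SetFitModel (body and classification head) to a single ONNX file.

    A real implementation would trace the Sentence Transformer body (e.g. with
    torch.onnx.export) and convert the head separately: skl2onnx for the current
    scikit-learn head, or a plain trace once the pure PyTorch head from #8 exists.
    """
    raise NotImplementedError("design sketch only")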

blakechi commented 2 years ago

Hi @lewtun, thanks for your reply! onnx_export for SetFitModel sounds good to me.

Sure, happy to lend a hand on adding ONNX support to this work after #8. :grinning:

nbertagnolli commented 2 years ago

I'm not sure if this is helpful, but I was working on deploying some of these models using ONNX, and this is what I came up with so far. If others are looking for a place to start, here is some code that converts the base model and the head so that you can run them separately. I haven't been able to merge them into one graph yet, but hopefully it's a start while we wait for #8 :).

from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType
from onnxruntime import InferenceSession
from pathlib import Path
from setfit import SetFitModel
from transformers import AutoTokenizer
from transformers.convert_graph_to_onnx import convert
import torch
import numpy as np
import sys

def mean_pooling_np(model_output: np.ndarray, attention_mask: np.ndarray):
    # Mean-pool the token embeddings, using the attention mask to ignore padding tokens.
    token_embeddings = model_output[0]
    input_mask_expanded = np.broadcast_to(
        np.expand_dims(attention_mask, axis=2), token_embeddings.shape
    )
    sum_embeddings = np.sum(input_mask_expanded * token_embeddings, axis=1)
    sum_mask = np.clip(input_mask_expanded.sum(1), 1e-9, sys.maxsize)
    return sum_embeddings / sum_mask

trained_model_path = "path/to/your/trained/setfit/model"
onnx_sentence_model_path = "/path/to/save/sentence_model.onnx"
onnx_head_model_path = "/path/to/save/model_head.onnx"

model = SetFitModel.from_pretrained(trained_model_path)

# Convert the sentence transformer model to onnx
convert(
    "pt",
    trained_model_path,
    Path(onnx_sentence_model_path).absolute(),
    15,
    trained_model_path,
)

# Convert sklearn head into ONNX format
initial_type = [("model_head", FloatTensorType([None, 768]))]
onx = convert_sklearn(model.model_head, initial_types=initial_type, target_opset=15)
with open(onnx_head_model_path, "wb") as f:
    f.write(onx.SerializeToString())

# Load and use the models
text = ["some text to do stuff with"]
tokenizer = AutoTokenizer.from_pretrained(
    "sentence-transformers/paraphrase-mpnet-base-v2"
)
session = InferenceSession(onnx_sentence_model_path)
head_session = InferenceSession(onnx_head_model_path)
tokens = tokenizer(text, truncation=True, return_tensors="np")
preds = session.run(None, dict(tokens))
pooled_preds = mean_pooling_np(preds, tokens["attention_mask"])
print(head_session.run(None, {"model_head": pooled_preds.astype(np.float32)}))  # head graph expects float32
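
As a quick sanity check (not part of the original comment), the ONNX embeddings can be compared against the original Sentence Transformer body. This assumes the body uses plain mean pooling without normalization, as paraphrase-mpnet-base-v2 does, so the tolerances may need adjusting for other models:

# Optional parity check between the ONNX pipeline and the original model body.
# Assumes `model`, `text`, and `pooled_preds` from the script above are still in scope.
reference_embeddings = model.model_body.encode(text)  # shape (1, 768)
np.testing.assert_allclose(reference_embeddings, pooled_preds, rtol=1e-3, atol=1e-4)
print("ONNX embeddings match the original model within tolerance")
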
kgourgou commented 1 year ago

Thank you, @nbertagnolli !

A few issues that I had with the script and how I resolved them:

blakechi commented 1 year ago

Sorry for the wait, and the good news is that Issue #8 has been resolved!

Just a suggestion: maybe @nbertagnolli can work on converting the sklearn head and I can work on the PyTorch version, so we each work on the part we are familiar with. :)

What do you think? @lewtun @nbertagnolli

nbertagnolli commented 1 year ago

Happy to help, @blakechi! Do you have a branch you're currently working on, or do you want me to create one and put an initial ONNX script together? Nice work on #8!

blakechi commented 1 year ago

Thanks, @nbertagnolli!

Or maybe we can open two separate PRs? It seems like you are almost ready to open one, so I don't want to hold you back. 😅 But I'm fine either way :)

Maybe you can give us a suggestion? Which way is more convenient for you to manage? @lewtun

nbertagnolli commented 1 year ago

Sounds good, @blakechi! I'll open a PR soon with an initial script for ONNX conversion and we can go from there! :)

blakechi commented 1 year ago

Okay, sounds good to me :)

Make sure you follow @lewtun's suggestion below:

Hi @blakechi thanks for your interest in our work! Instead of compiling SetFitTrainer into ONNX, I think it would be better to have something like an onnx_export() function that lives inside a setfit.onnx module.

Ideally, this function would take a SetFitModel as input and output model.onnx, similar to what was done for Stable Diffusion here: https://github.com/huggingface/diffusers/blob/main/scripts/convert_stable_diffusion_checkpoint_to_onnx.py#L32

Since the current implementation of SetFitModel uses scikit-learn estimators for the classification head, it might be best to wait until we implement a pure PyTorch version in #8

lewtun commented 1 year ago

Hey @nbertagnolli and @blakechi super cool that you're excited to work on the ONNX export for the two heads 🔥 !!

I agree it's best to have an initial PR first so we can hone the design and then iterate from there.

blakechi commented 1 year ago

Do we also need a Python API for the exported ONNX models? I think it would make things much easier for users, since we could handle the tokenizer and the ONNX runtime for them. I found this pattern pretty helpful in Diffusers. We could possibly have it in a setfit.onnx module as well.
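
A minimal sketch of what such a wrapper could look like, reusing the two ONNX files and the mean pooling from the conversion script above (the class name and constructor arguments are assumptions, not an agreed API; the "model_head" input name matches the skl2onnx export above):

# Hypothetical convenience wrapper around the two exported ONNX graphs.
import numpy as np
from onnxruntime import InferenceSession
from transformers import AutoTokenizer


class OnnxSetFitModel:
    def __init__(self, tokenizer_path: str, sentence_onnx_path: str, head_onnx_path: str):
        self.tokenizer = AutoTokenizer.from_pretrained(tokenizer_path)
        self.body_session = InferenceSession(sentence_onnx_path)
        self.head_session = InferenceSession(head_onnx_path)

    def _mean_pool(self, token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
        # Same mean pooling as in the conversion script above.
        mask = np.broadcast_to(np.expand_dims(attention_mask, axis=2), token_embeddings.shape)
        return (mask * token_embeddings).sum(axis=1) / np.clip(mask.sum(axis=1), 1e-9, None)

    def predict(self, texts):
        tokens = self.tokenizer(texts, padding=True, truncation=True, return_tensors="np")
        token_embeddings = self.body_session.run(None, dict(tokens))[0]
        embeddings = self._mean_pool(token_embeddings, tokens["attention_mask"])
        # The sklearn head expects float32 inputs named "model_head", as exported above.
        return self.head_session.run(None, {"model_head": embeddings.astype(np.float32)})[0]


# Example usage (paths are the ones defined in the conversion script above):
# onnx_model = OnnxSetFitModel(trained_model_path, onnx_sentence_model_path, onnx_head_model_path)
# print(onnx_model.predict(["some text to do stuff with"]))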

nbertagnolli commented 1 year ago

I like that idea; I think that could make working with the sklearn heads easier. :)

AnshulP10 commented 1 year ago

@nbertagnolli I tried this approach but I'm getting an issue regarding the inputs. I trained a multi-label model (one-vs-rest classifier) using text inputs.

[screenshot of the error message]

nbertagnolli commented 1 year ago

@AnshulP10 please take a look at the PR we've been working on, #156. @kgourgou pointed out that the above script has some things you need to modify for certain models; the PR hopefully addresses those concerns. It includes a function called export_onnx which should do what you want. Let me know if you still have trouble.

tomaarsen commented 1 year ago

With #156 merged, this feature request has been implemented :)
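
For reference, a usage sketch of the merged exporter (the import path and arguments below are assumptions based on PR #156; check the SetFit documentation for the exact signature):

# Hypothetical usage of the exporter added in #156; names may differ in later releases.
from setfit import SetFitModel
from setfit.exporters.onnx import export_onnx

model = SetFitModel.from_pretrained("path/to/your/trained/setfit/model")
export_onnx(model.model_body, model.model_head, opset=13, output_path="model.onnx")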