guillaume-be / rust-bert

Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
https://docs.rs/crate/rust-bert
Apache License 2.0
2.64k stars 215 forks

Onnx model #292

Closed jafri closed 1 year ago

jafri commented 2 years ago

Worth exploring adding an option to use ONNX instead of libtorch to run models.

The following Python code runs 2x faster than rust-bert summarization on my M1 Mac:

from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("optimum/t5-small")
model = ORTModelForSeq2SeqLM.from_pretrained("optimum/t5-small")
onnx_summarization = pipeline("summarization", model=model, tokenizer=tokenizer)

text = "In findings published Tuesday in Cornell University's arXiv by a team of scientists \
from the University of Montreal and a separate report published Wednesday in Nature Astronomy by a team \
from University College London (UCL), the presence of water vapour was confirmed in the atmosphere of K2-18b, \
a planet circling a star in the constellation Leo. This is the first such discovery in a planet in its star's \
habitable zone — not too hot and not too cold for liquid water to exist. The Montreal team, led by Björn Benneke, \
used data from the NASA's Hubble telescope to assess changes in the light coming from K2-18b's star as the planet \
passed between it and Earth. They found that certain wavelengths of light, which are usually absorbed by water, \
weakened when the planet was in the way, indicating not only does K2-18b have an atmosphere, but the atmosphere \
contains water in vapour form. The team from UCL then analyzed the Montreal team's data using their own software \
and confirmed their conclusion. This was not the first time scientists have found signs of water on an exoplanet, \
but previous discoveries were made on planets with high temperatures or other pronounced differences from Earth. \
\"This is the first potentially habitable planet where the temperature is right and where we now know there is water,\" \
said UCL astronomer Angelos Tsiaras. \"It's the best candidate for habitability right now.\" \"It's a good sign\", \
said Ryan Cloutier of the Harvard–Smithsonian Center for Astrophysics, who was not one of either study's authors. \
\"Overall,\" he continued, \"the presence of water in its atmosphere certainly improves the prospect of K2-18b being \
a potentially habitable planet, but further observations will be required to say for sure. \" \
K2-18b was first identified in 2015 by the Kepler space telescope. It is about 110 light-years from Earth and larger \
but less dense. Its star, a red dwarf, is cooler than the Sun, but the planet's orbit is much closer, such that a year \
on K2-18b lasts 33 Earth days. According to The Guardian, astronomers were optimistic that NASA's James Webb space \
telescope — scheduled for launch in 2021 — and the European Space Agency's 2028 ARIEL program, could reveal more \
about exoplanets like K2-18b."
pred = onnx_summarization(text)

print(pred)

Directly compared to https://github.com/guillaume-be/rust-bert/blob/master/examples/summarization_t5.rs
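To make such a comparison reproducible, the call under test can be timed explicitly over several runs. A minimal sketch of a timing harness, using a stand-in workload since the actual summarization models require downloads; the closure is where the real `model.summarize(...)` or pipeline call would go:

```rust
use std::time::Instant;

/// Time `f` over `runs` iterations after a warmup, returning mean seconds per call.
fn time_fn<F: FnMut()>(mut f: F, warmup: u32, runs: u32) -> f64 {
    for _ in 0..warmup {
        f(); // warmup runs are discarded so first-call overhead is excluded
    }
    let start = Instant::now();
    for _ in 0..runs {
        f();
    }
    start.elapsed().as_secs_f64() / runs as f64
}

fn main() {
    // Stand-in workload; replace with the actual summarization call.
    let mean = time_fn(|| { let _ = (0..10_000u64).sum::<u64>(); }, 2, 5);
    println!("{:.3} µs/call", mean * 1e6);
}
```

Running the same harness over both the libtorch-backed pipeline and an ONNX-backed one, with identical input text and warmup, would put the "2x faster" figure on firmer ground than a single wall-clock run.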

guillaume-be commented 2 years ago

Hello @jafri ,

Yes - this is something I have been looking into over the last few weeks. Unfortunately, the ONNX ecosystem for Rust is not yet at the same level as Python's. I have mainly investigated two implementations so far.

This is indeed a feature that would be interesting to implement, but I am still looking for an ONNX runtime that could reliably run models exported via optimum.

jafri commented 2 years ago

@guillaume-be here is a working example using philschmid/distilbart-cnn-12-6-samsum.

As you found, T5 doesn't work for now due to the TDim used in the original TensorFlow model.

1. Export the model to ONNX
from optimum.onnxruntime import ORTModelForSeq2SeqLM
from transformers import AutoTokenizer

model_checkpoint = "philschmid/distilbart-cnn-12-6-samsum"
save_directory = "tmp/onnx/"

# Export to onnx
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
ort_model = ORTModelForSeq2SeqLM.from_pretrained(model_checkpoint, from_transformers=True)

# Save the onnx model and tokenizer
ort_model.save_pretrained(save_directory)
tokenizer.save_pretrained(save_directory)
2. Run the encoder with tract
use std::{
    path::{Path, PathBuf},
    str::FromStr,
};
use tokenizers::tokenizer::Result;
use tract_onnx::prelude::*;

pub fn onnx2() -> Result<()> {
    // Directory produced by ort_model.save_pretrained(...) above
    let model_dir = PathBuf::from_str("tmp/onnx")?;
    let encoder_path = Path::join(&model_dir, "encoder_model.onnx");
    let decoder_path = Path::join(&model_dir, "decoder_model.onnx");
    let decoder_with_past_path = Path::join(&model_dir, "decoder_with_past_model.onnx");

    let encoder_model = onnx().model_for_path(encoder_path)?.into_runnable()?;

    // Example token ids and matching attention mask for a short input sequence
    let input_ids: Vec<i64> = vec![8774, 48, 19, 3, 9, 182, 307, 1499, 12, 36, 15459, 5, 1];
    let attention: Vec<i64> = vec![1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1];

    let input_ids =
        tract_ndarray::Array2::from_shape_vec((1, input_ids.len()), input_ids)?.into();
    let attention_mask =
        tract_ndarray::Array2::from_shape_vec((1, attention.len()), attention)?.into();

    let model_inputs = tvec!(input_ids, attention_mask);
    let result = encoder_model.run(model_inputs)?;

    println!("{:?}", result);

    Ok(())
}
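The snippet above only runs the encoder; a full summarization pipeline also has to drive the exported decoder step by step. A sketch of the greedy generation loop such an integration would need, with stub functions standing in for the tract encoder/decoder sessions (the token ids, the emitted token 42, and EOS id 0 are illustrative, not a real vocabulary):

```rust
// Greedy seq2seq decoding: the encoder runs once, then the decoder is
// called repeatedly, feeding each predicted token back in until EOS.

fn encode(input_ids: &[i64]) -> Vec<f32> {
    // Stub: a real implementation would run encoder_model.onnx here.
    input_ids.iter().map(|&t| t as f32).collect()
}

fn decode_step(decoder_ids: &[i64], _encoder_states: &[f32]) -> i64 {
    // Stub: a real implementation would run decoder_model.onnx (or the
    // decoder_with_past variant) and argmax the logits; here we emit
    // token 42 three times, then EOS (0).
    if decoder_ids.len() < 4 { 42 } else { 0 }
}

fn greedy_generate(input_ids: &[i64], bos_id: i64, eos_id: i64, max_len: usize) -> Vec<i64> {
    let states = encode(input_ids);
    let mut output = vec![bos_id];
    for _ in 0..max_len {
        let next = decode_step(&output, &states);
        output.push(next);
        if next == eos_id {
            break;
        }
    }
    output
}

fn main() {
    // → [2, 42, 42, 42, 0]
    println!("{:?}", greedy_generate(&[8774, 48, 19], 2, 0, 20));
}
```

In a real integration, `decode_step` would also thread the past key/value tensors from `decoder_with_past_model.onnx` through each iteration to avoid recomputing attention over the whole prefix.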
guillaume-be commented 2 years ago

For information, I have raised https://github.com/sonos/tract/issues/856 to try and solve the issue with the T5 model. Support for ONNX models would require significant library rework and important design choices: the design of the pipelines, their configuration, and especially the text generation pipelines would all be impacted.

I understand BART/DistilBART models work as you illustrated. Before committing to significant design decisions, I would like to get a better understanding of the range of Transformers-exported models that Tract can support, since Tract does not rely on the same ONNX backend as onnxruntime.

As mentioned, this is a prioritized feature on my side; I want to make sure the complexity of the integration is handled correctly.
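One way to contain the design impact on pipeline configuration would be to select the backend through an enum, keeping the rest of the pipeline API unchanged. A hypothetical sketch; `ModelBackend` and its field names are illustrative, not actual rust-bert types:

```rust
// Hypothetical configuration sketch: the pipeline config carries a backend
// variant, and construction dispatches on it. Names are illustrative only.

pub enum ModelBackend {
    /// Existing libtorch weights (e.g. a rust_model.ot file)
    LibTorch { weights_path: String },
    /// Optimum-style ONNX export, split into encoder/decoder files
    Onnx { encoder_path: String, decoder_path: String },
}

pub fn describe(backend: &ModelBackend) -> String {
    match backend {
        ModelBackend::LibTorch { weights_path } => {
            format!("libtorch backend ({weights_path})")
        }
        ModelBackend::Onnx { encoder_path, decoder_path } => {
            format!("onnx backend ({encoder_path}, {decoder_path})")
        }
    }
}

fn main() {
    let backend = ModelBackend::Onnx {
        encoder_path: "tmp/onnx/encoder_model.onnx".into(),
        decoder_path: "tmp/onnx/decoder_model.onnx".into(),
    };
    println!("{}", describe(&backend));
}
```

Matching on the enum at model-construction time would keep the text generation machinery backend-agnostic, which is where most of the rework described above would land.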