guillaume-be / rust-bert

Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
https://docs.rs/crate/rust-bert
Apache License 2.0

summarization_t5.rs does not work for a fine-tuned model #170

Closed: remotejob closed this issue 3 years ago

remotejob commented 3 years ago

I tried to use summarization_t5.rs with my own small fine-tuned model, but it does not work:

Error: Tch tensor error: cannot find the tensor named decoder.block.0.layer.1.EncDecAttention.relative_attention_bias.weight in /home/juno/.cache/.rustbert/tweetsT5_small_sum_fi/weights/2d39aa9b306e3185d829cdb3cd7bc51c3bd964e6956cb106615da6a002b32ce2.9a92b1ce1a67f24b6a37241bba916438c30a4c0ba49d5358c9e99205a766f707

The Python pipeline works correctly.
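For debugging a mismatch like this, the tensor names actually stored in the downloaded .ot archive can be listed with the tch crate (the backend rust-bert builds on). A minimal sketch, with an illustrative path standing in for the hashed cache file from the error above:

extern crate anyhow;
extern crate tch;

use tch::Tensor;

fn main() -> anyhow::Result<()> {
    // Illustrative path: substitute the hashed weights file from the error message.
    let path = "/home/juno/.cache/.rustbert/tweetsT5_small_sum_fi/weights/rust_model.ot";
    // load_multi returns every named tensor stored in the archive, so the output
    // can be compared against the name the loader reports as missing.
    for (name, tensor) in Tensor::load_multi(path)? {
        println!("{} {:?}", name, tensor.size());
    }
    Ok(())
}

Comparing that listing against the name in the error shows which tensor the converted weights are missing. The full program that fails: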

extern crate anyhow;

use rust_bert::pipelines::common::ModelType;
use rust_bert::pipelines::summarization::{SummarizationConfig, SummarizationModel};
use rust_bert::resources::{RemoteResource, Resource};
// use rust_bert::t5::{T5ConfigResources, T5ModelResources, T5VocabResources};

fn main() -> anyhow::Result<()> {
    // let summarization_model = SummarizationModel::new(Default::default())?;

let config_resource = Resource::Remote(RemoteResource::new(
    "https://huggingface.co/remotejob/tweetsT5_small_sum_fi/resolve/main/config.json",
    "tweetsT5_small_sum_fi/config",
));

let vocab_resource = Resource::Remote(RemoteResource::new(
    "https://huggingface.co/remotejob/tweetsT5_small_sum_fi/resolve/main/spiece.model",
    "tweetsT5_small_sum_fi/vocab",
));

let weights_resource = Resource::Remote(RemoteResource::new(
    "https://huggingface.co/remotejob/tweetsT5_small_sum_fi/resolve/main/rust_model.ot",
    "tweetsT5_small_sum_fi/weights",
));

// let config_resource =
//     Resource::Remote(RemoteResource::from_pretrained(T5ConfigResources::T5_SMALL));
// let vocab_resource =
//     Resource::Remote(RemoteResource::from_pretrained(T5VocabResources::T5_SMALL));
// let weights_resource =
//     Resource::Remote(RemoteResource::from_pretrained(T5ModelResources::T5_SMALL));
let summarization_config = SummarizationConfig::new(
    ModelType::T5,
    weights_resource,
    config_resource,
    vocab_resource.clone(),
    vocab_resource,
);
let summarization_model = SummarizationModel::new(summarization_config)?;

let input = ["In findings published Tuesday in Cornell University's arXiv by a team of scientists \

from the University of Montreal and a separate report published Wednesday in Nature Astronomy by a team \ from University College London (UCL), the presence of water vapour was confirmed in the atmosphere of K2-18b, \ a planet circling a star in the constellation Leo. This is the first such discovery in a planet in its star's \ habitable zone — not too hot and not too cold for liquid water to exist. The Montreal team, led by Björn Benneke, \ used data from the NASA's Hubble telescope to assess changes in the light coming from K2-18b's star as the planet \ passed between it and Earth. They found that certain wavelengths of light, which are usually absorbed by water, \ weakened when the planet was in the way, indicating not only does K2-18b have an atmosphere, but the atmosphere \ contains water in vapour form. The team from UCL then analyzed the Montreal team's data using their own software \ and confirmed their conclusion. This was not the first time scientists have found signs of water on an exoplanet, \ but previous discoveries were made on planets with high temperatures or other pronounced differences from Earth. \ \"This is the first potentially habitable planet where the temperature is right and where we now know there is water,\" \ said UCL astronomer Angelos Tsiaras. \"It's the best candidate for habitability right now.\" \"It's a good sign\", \ said Ryan Cloutier of the Harvard–Smithsonian Center for Astrophysics, who was not one of either study's authors. \ \"Overall,\" he continued, \"the presence of water in its atmosphere certainly improves the prospect of K2-18b being \ a potentially habitable planet, but further observations will be required to say for sure. \" \ K2-18b was first identified in 2015 by the Kepler space telescope. It is about 110 light-years from Earth and larger \ but less dense. Its star, a red dwarf, is cooler than the Sun, but the planet's orbit is much closer, such that a year \ on K2-18b lasts 33 Earth days. According to The Guardian, astronomers were optimistic that NASA's James Webb space \ telescope — scheduled for launch in 2021 — and the European Space Agency's 2028 ARIEL program, could reveal more \ about exoplanets like K2-18b."];

//    Credits: WikiNews, CC BY 2.5 license (https://en.wikinews.org/wiki/Astronomers_find_water_vapour_in_atmosphere_of_exoplanet_K2-18b)
let output = summarization_model.summarize(&input);
for sentence in output {
    println!("{}", sentence);
}

Ok(())

}

guillaume-be commented 3 years ago

Hello @remotejob,

Thank you for raising this issue; I was able to reproduce the error. It is caused by a change in the T5 implementation in the Python library that removed the relative attention bias from the decoder cross-attention layers (see https://github.com/huggingface/transformers/pull/8518), so weights exported after that change no longer contain the tensor the Rust loader expects.
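For context, a conceptual sketch of what such a fix involves (the function and parameter names below are hypothetical and illustrative, not the actual rust-bert internals): the relative attention bias embedding has to become optional, created only for layers configured to carry one, instead of being registered unconditionally for every attention layer.

use tch::nn;

// Conceptual sketch with hypothetical names, not the actual rust-bert code:
// making the relative attention bias optional means the variable store no
// longer expects `EncDecAttention.relative_attention_bias.weight` for
// decoder cross-attention layers exported after transformers PR #8518.
fn build_relative_attention_bias(
    p: &nn::Path,
    has_relative_attention_bias: bool,
    relative_attention_num_buckets: i64,
    num_heads: i64,
) -> Option<nn::Embedding> {
    if has_relative_attention_bias {
        // Only layers configured with a relative attention bias register
        // the embedding, so only those names must exist in the weights file.
        Some(nn::embedding(
            p / "relative_attention_bias",
            relative_attention_num_buckets,
            num_heads,
            Default::default(),
        ))
    } else {
        // Cross-attention layers carry no bias tensor after the change.
        None
    }
}

The forward pass then applies the bias only when it is Some, mirroring the Python behaviour after the linked PR.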

I am pushing some changes that resolve the issue in https://github.com/guillaume-be/rust-bert/pull/171. Once this is merged, you should be able to run your T5 model with the Rust implementation as well. Please note that the repository version of the library now relies on Libtorch 1.9, so you would need to update your libtorch dependency to run the repository version of the code.
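If you want to try it before the release, Cargo can point at the repository directly, for example with a git dependency in Cargo.toml such as rust-bert = { git = "https://github.com/guillaume-be/rust-bert" } (standard Cargo syntax; whether to pin a specific revision is up to you), together with a local Libtorch 1.9 installation as noted above.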

I expect these changes to be released in the next version of the library, probably within the next couple of weeks.