ccdv-ai / convert_checkpoint_to_lsg

Efficient Attention for Long Sequence Processing
MIT License

Index out of bounds for BART and other model architectures #6

Open Gimperion opened 1 year ago

Gimperion commented 1 year ago

I keep getting `index 50264 is out of bounds for dimension 0 with size 50264` (or something similar) when converting BART and some other models to LSG.

The issue seems to be this line of code in the `update_global` method: `positions[1:] += u[mask_id].unsqueeze(0)`
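
For reference, this is PyTorch's standard message when a tensor is indexed at a position equal to its first-dimension size; here is a minimal sketch with hypothetical shapes (not the converter's actual tensors):

```python
import torch

# Hypothetical embedding-like table with 50264 rows and a token id equal to that size
u = torch.zeros(50264, 768)
mask_id = 50264

try:
    row = u[mask_id].unsqueeze(0)
except IndexError as e:
    print(e)  # index 50264 is out of bounds for dimension 0 with size 50264
```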

ccdv-ai commented 1 year ago

Hi @Gimperion

Can you share your transformers version and a snippet of the code you used?

Gimperion commented 1 year ago

I replicated the error with both transformers 4.26.0 and 4.28.1. Here's the snippet:

```python
from lsg_converter import LSGConverter

converter = LSGConverter(max_sequence_length=4096)

model, tokenizer = converter.convert_from_pretrained("sshleifer/distilbart-cnn-6-6", block_size=256, sparsity_factor=2)
```

ccdv-ai commented 1 year ago

I think I found the problem @Gimperion. Something is wrong with the model and the tokenizer: the `<mask>` token has index 50264 while the model config states `"vocab_size": 50264`. Since the first token has index 0, a mask token at id 50264 implies 50265 tokens in practice, so the index is out of bounds for the embedding matrix.
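
A quick way to see the mismatch is to compare the tokenizer's mask token id with the vocab size declared in the config; a minimal check using the standard transformers API (the commented values are the ones reported above):

```python
from transformers import AutoConfig, AutoTokenizer

checkpoint = "sshleifer/distilbart-cnn-6-6"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
config = AutoConfig.from_pretrained(checkpoint)

print(tokenizer.mask_token, tokenizer.mask_token_id)  # <mask> 50264
print(config.vocab_size)                              # 50264

# Valid token ids must satisfy id < vocab_size, which fails for <mask> here
if tokenizer.mask_token_id >= config.vocab_size:
    print("mask token id falls outside the embedding matrix")
```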

If you try to run inference with the `<mask>` token, it fails.
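
For example, something along these lines should reproduce the failure (a hypothetical repro sketch, not code from the thread):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "sshleifer/distilbart-cnn-6-6"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# <mask> is encoded as id 50264, one past the last row of the embedding table
inputs = tokenizer("Paris is the <mask> of France.", return_tensors="pt")

try:
    model(**inputs)
except IndexError as e:
    print("inference with <mask> fails:", e)
```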

If you really need to convert the model, you have two possibilities: