abertsch72 / unlimiformer

Public repo for the NeurIPS 2023 paper "Unlimiformer: Long-Range Transformers with Unlimited Length Input"

Running Unlimiformer with the `forward` method #38

Open testzer0 opened 12 months ago

testzer0 commented 12 months ago

Hi, I am currently trying to run the inference example with a slight modification. Instead of calling generate(), I want to pass input_ids and labels to the forward method and obtain the logits. However, doing so with the current implementation triggers device-side assertion errors. How can I achieve this?
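
For context, a minimal sketch of what this comment appears to be attempting. The checkpoint and tokenizer names follow the repo's inference-example.py and should be treated as placeholders; the Unlimiformer wrapping step itself is assumed to have been done as in that script and is omitted here.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder names based on the repo's inference example; swap in your own setup.
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
model = AutoModelForSeq2SeqLM.from_pretrained("abertsch/unlimiformer-bart-booksum-alternating")

# Assumption: the model has already been wrapped with Unlimiformer,
# as done in inference-example.py (the wrapping call is omitted here).

inputs = tokenizer("A very long input document ...", return_tensors="pt")
labels = tokenizer("A short target summary.", return_tensors="pt").input_ids

# Instead of model.generate(**inputs), call forward directly to obtain loss and logits.
with torch.no_grad():
    out = model(input_ids=inputs.input_ids,
                attention_mask=inputs.attention_mask,
                labels=labels)

print(out.loss)          # cross-entropy loss against the labels
print(out.logits.shape)  # (batch, target_length, vocab_size)
```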

abertsch72 commented 11 months ago

Hi @testzer0 -- what is the specific error you're seeing here? Thanks!

m10an commented 8 months ago

I also have issues running the forward method:

```
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-9-5679df71b070> in <cell line: 1>()
----> 1 out = model(**example)
      2 out

2 frames
/content/unlimiformer/src/unlimiformer.py in pre_forward_hook(self, input_ids, attention_mask, labels, **kwargs)
    545                 if input_ids is not None:
    546                     # self.input_ids_size += input_ids.shape[-1]
--> 547                     self.input_ids_size += 1
    548                 if kwargs.get('decoder_input_ids') is not None:
    549                     self.generated_input_ids = torch.cat([self.generated_input_ids, kwargs['decoder_input_ids']], axis=-1)

AttributeError: 'UnlimiformerBART' object has no attribute 'input_ids_size'
```

Here is my Colab notebook (based on inference-example.py) with more details; it reproduces the error.

Thank you!
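
The trace suggests that `input_ids_size` is only initialized on the generation path, so a direct forward() call increments an attribute that does not exist yet. One possible local workaround, a sketch under that assumption and not the maintainers' fix, is to make the increment in `pre_forward_hook` tolerant of the missing attribute:

```python
# Sketch of a local workaround inside pre_forward_hook (src/unlimiformer.py),
# not the maintainers' fix: fall back to 0 when input_ids_size was never set,
# which happens when forward() is called without a prior generate() call
# (generate() presumably initializes the counter elsewhere).
if input_ids is not None:
    self.input_ids_size = getattr(self, 'input_ids_size', 0) + 1
```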

abertsch72 commented 7 months ago

I've just pushed a fix for this issue; please let me know if you're still having problems with the newest version!