huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Why does AttentionMaskConverter.to_causal_4d need input_shape[-1] > 1? #29992

Closed HackGiter closed 2 months ago

HackGiter commented 3 months ago

System Info

transformers 4.37.0

Who can help?

No response

Information

Tasks

Reproduction

import torch
from transformers.modeling_attn_mask_utils import AttentionMaskConverter

attn_mask_converter = AttentionMaskConverter(is_causal=True, sliding_window=None)

attention_mask = attn_mask_converter.to_causal_4d(
    1, 1, 5, dtype=torch.float16, device="cpu"
)
print(attention_mask.shape)

Expected behavior

torch.Size([1,1,1,5])
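
The None result follows from the condition named in the title: in the 4.37.0 source, to_causal_4d only builds the 4D causal bias when input_shape[-1] > 1 (the query length is greater than 1) or a sliding window is configured. With a single query token there are no future positions to hide, so the method returns None and the print above raises instead of producing the expected shape. A minimal sketch contrasting the two cases, assuming the 4.37.0 behavior:

# Sketch assuming transformers 4.37.0: contrast query_length == 1 with query_length > 1.
import torch
from transformers.modeling_attn_mask_utils import AttentionMaskConverter

converter = AttentionMaskConverter(is_causal=True, sliding_window=None)

# query_length == 1: a single query token may attend to every key position,
# so no causal bias is needed and to_causal_4d returns None.
mask_q1 = converter.to_causal_4d(1, 1, 5, dtype=torch.float16, device="cpu")
print(mask_q1)  # None

# query_length > 1: a [batch, 1, query_length, key_value_length] causal bias is built.
mask_q2 = converter.to_causal_4d(1, 2, 5, dtype=torch.float16, device="cpu")
print(mask_q2.shape)  # torch.Size([1, 1, 2, 5])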

ArthurZucker commented 3 months ago

Hey! The best recommendation is to transition to using _update_causal_mask: https://github.com/huggingface/transformers/blob/416711c3ea88109cf25a9c5f85b4aeee2cb831b5/src/transformers/models/llama/modeling_llama.py#L1058
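
For code that stays on 4.37.x, one possible workaround (a sketch, not an official recipe) is to go through to_4d with an explicit 2D padding mask; that path returns a 4D mask even when the query length is 1:

# Sketch assuming the transformers 4.37.0 signature of AttentionMaskConverter.to_4d.
import torch
from transformers.modeling_attn_mask_utils import AttentionMaskConverter

converter = AttentionMaskConverter(is_causal=True, sliding_window=None)

# 2D padding mask over the 5 key/value positions (all ones = nothing is padded).
padding_mask = torch.ones(1, 5, dtype=torch.long)

mask = converter.to_4d(padding_mask, query_length=1, dtype=torch.float16, key_value_length=5)
print(mask.shape)  # torch.Size([1, 1, 1, 5])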

github-actions[bot] commented 2 months ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.