BAAI-DCAI / Bunny

A family of lightweight multimodal models.
Apache License 2.0

After full-parameter fine-tuning on private data, inference with the bunny-qwen2 model fails with a RuntimeError #36

Closed · zezeze97 closed this issue 5 months ago

zezeze97 commented 5 months ago
Traceback (most recent call last):
  File "/research/zhangzr/Bunny/bunny/eval/model_vqa.py", line 112, in <module>
    eval_model(args)
  File "/research/zhangzr/Bunny/bunny/eval/model_vqa.py", line 64, in eval_model
    output_ids = model.generate(
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/transformers/generation/utils.py", line 1544, in generate
    return self.greedy_search(
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/transformers/generation/utils.py", line 2404, in greedy_search
    outputs = self(
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/research/zhangzr/Bunny/bunny/model/language_model/bunny_qwen2.py", line 72, in forward
    return super().forward(
  File "/research/zhangzr/Bunny/bunny/model/language_model/qwen2/modeling_qwen2.py", line 1174, in forward
    outputs = self.model(
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/research/zhangzr/Bunny/bunny/model/language_model/qwen2/modeling_qwen2.py", line 1021, in forward
    attention_mask = _prepare_4d_causal_attention_mask_for_sdpa(
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py", line 398, in _prepare_4d_causal_attention_mask_for_sdpa
    expanded_4d_mask = attn_mask_converter.to_4d(
  File "/research/chengruogu/anaconda3/envs/bunny/lib/python3.10/site-packages/transformers/modeling_attn_mask_utils.py", line 137, in to_4d
    expanded_attn_mask = causal_4d_mask.masked_fill(expanded_attn_mask.bool(), torch.finfo(dtype).min)
RuntimeError: The size of tensor a (862) must match the size of tensor b (1723) at non-singleton dimension 3
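The traceback ends in a broadcast failure inside `masked_fill`: the 4-D causal mask has key length 862 at dimension 3, while the expanded attention mask has length 1723 there, and since neither size is 1 the two tensors cannot be broadcast together. (One common cause in multimodal models is that image features are spliced into the input embeddings while the `attention_mask` still reflects a different token count; that is an assumption about this report, not something confirmed in the thread.) A minimal, dependency-free sketch of the broadcasting rule that trips here, using the shapes from the error:

```python
def broadcastable(shape_a, shape_b):
    """Return True if two tensor shapes are broadcast-compatible.

    Shapes are compared right-to-left; each pair of dimensions must
    be equal, or one of them must be 1 (NumPy/PyTorch broadcasting).
    """
    for a, b in zip(shape_a[::-1], shape_b[::-1]):
        if a != b and a != 1 and b != 1:
            return False
    return True


# Shapes from the traceback:
causal_4d_mask = (1, 1, 862, 862)     # built from the (expanded) input embeddings
expanded_attn_mask = (1, 1, 1, 1723)  # built from the attention_mask that was passed in

# Dimension 3 is 862 vs 1723 — neither is 1, so masked_fill raises the RuntimeError.
print(broadcastable(causal_4d_mask, expanded_attn_mask))  # → False
```

If the mismatch comes from the attention mask not being updated alongside the multimodal embeddings, the fix is to regenerate (or pad) the mask to the post-insertion sequence length before calling `generate`, rather than to change the mask-expansion code itself.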