theovercomer8 / captionr

GIT/BLIP/CLIP Caption tool
MIT License

The size of tensor a (8) must match the size of tensor b (64) at non-singleton dimension 0 #14

Open · roperi opened this issue 1 year ago

roperi commented 1 year ago

Getting this error when using blip = true:

ERROR:root:Exception during BLIP captioning
Traceback (most recent call last):
  File "/content/gdrive/MyDrive/captionr/captionr/captionr_class.py", line 139, in process_img
    new_caption = config._blip.caption(img)
  File "/content/gdrive/MyDrive/captionr/captionr/blip_cap.py", line 56, in caption
    caption = self.blip_model.generate(
  File "/usr/local/lib/python3.8/dist-packages/blip/models/blip.py", line 156, in generate
    outputs = self.text_decoder.generate(input_ids=input_ids,
  File "/usr/local/lib/python3.8/dist-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/transformers/generation/utils.py", line 1490, in generate
    return self.beam_search(
  File "/usr/local/lib/python3.8/dist-packages/transformers/generation/utils.py", line 2749, in beam_search
    outputs = self(
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/blip/models/med.py", line 886, in forward
    outputs = self.bert(
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/blip/models/med.py", line 781, in forward
    encoder_outputs = self.encoder(
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/blip/models/med.py", line 445, in forward
    layer_outputs = layer_module(
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/blip/models/med.py", line 361, in forward
    cross_attention_outputs = self.crossattention(
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/blip/models/med.py", line 277, in forward
    self_outputs = self.self(
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/blip/models/med.py", line 178, in forward
    attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))
RuntimeError: The size of tensor a (8) must match the size of tensor b (64) at non-singleton dimension 0
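
For context on where the shapes come from: the crash is in BLIP's cross-attention, where the beam-search-expanded text tensors and the image embeddings passed as encoder_hidden_states have to agree at dimension 0, and a mismatch like 8 vs 64 suggests the two sides ended up expanded by different factors. The following is only a minimal sketch of that shape invariant; the batch size, num_beams, embedding dimensions, and token id are illustrative assumptions, not values taken from captionr or the installed blip package.

import torch

# Illustrative shapes only; not the actual captionr/BLIP call path.
# Beam search repeats the text inputs num_beams times along dim 0, so the
# image embeddings passed as encoder_hidden_states must be repeated the
# same way (and only once) before they meet in cross-attention.
batch_size, num_beams = 1, 8                          # assumed values
image_embeds = torch.randn(batch_size, 577, 768)      # assumed ViT output shape
input_ids = torch.full((batch_size, 1), 30522)        # assumed BOS token id

# What generate() does to the text side for beam search:
input_ids = input_ids.repeat_interleave(num_beams, dim=0)        # -> (8, 1)

# The image side needs the matching expansion; if it is skipped or applied
# twice, cross-attention sees mismatched sizes at dim 0 and raises the
# RuntimeError shown above.
image_embeds = image_embeds.repeat_interleave(num_beams, dim=0)  # -> (8, 577, 768)

assert input_ids.shape[0] == image_embeds.shape[0]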
theovercomer8 commented 1 year ago

Other users have reported that using WSL can solve this issue. Can you try this?

roperi commented 1 year ago

@theovercomer8

Thanks for your quick response. Sure. But what is that? Or how do I do that?

theovercomer8 commented 1 year ago

Windows Subsystem for Linux. If you wanted to hop on the IlluminatiAI Discord, there are some users there who may be able to help you. I'm on a Mac, so I have no way to reproduce or guide you on this.

roperi commented 1 year ago

Oh, thanks! I'm using the Colab notebook. I'm a Linux kind of person. I'll join the Discord. Thanks!

theovercomer8 commented 1 year ago

Oh, duh. I'm used to seeing that issue from Windows users and didn't even notice that your error isn't coming from a Windows console. :)

Hmm, this is a troubling one. Let's chat.

jcrsantiago commented 1 year ago

Is this project still in development?

theovercomer8 commented 1 year ago

I would say it's on hiatus. I just haven't had any time to give it, and MiniGPT-4 is simply better captioning tech at this point, but it's hard to implement in captionr. I'm not sure what the future of Captionr is.
