bes-dev / pytorch_clip_guided_loss

A simple library that implements CLIP guided loss in PyTorch.
https://pypi.org/project/pytorch-clip-guided-loss/
Apache License 2.0

Transformer version? TypeError: _build_causal_attention_mask() missing 1 required positional argument: 'dtype' #3

Open ImneCurline opened 1 year ago

ImneCurline commented 1 year ago

Python: 3.9, torch: 2.0.0

When I run the sample code, I get the error shown in the title. I found the corresponding position (a method in the transformers library) and added dtype="double".

Then I get the following error:

```
File "/root/anaconda3/envs/point_e_env/lib/python3.9/site-packages/transformers/models/clip/modeling_clip.py", line 758, in _build_causal_attention_mask
    mask = torch.empty(bsz, seq_len, seq_len, dtype=dtype)
TypeError: empty() received an invalid combination of arguments - got (int, int, int, dtype=str), but expected one of:
```
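For reference, this second error happens because torch.empty() expects a torch.dtype object, not a string, so dtype="double" cannot work. A minimal sketch of the difference, using torch.float32 only as a placeholder for whatever dtype the surrounding model code actually runs in:

```python
import torch

bsz, seq_len = 1, 77  # example sizes; 77 is CLIP's text context length

# This reproduces the TypeError: dtype must be a torch.dtype, not a string.
# torch.empty(bsz, seq_len, seq_len, dtype="double")

# Passing a real torch.dtype works (torch.float32 here is only a placeholder).
mask = torch.empty(bsz, seq_len, seq_len, dtype=torch.float32)
mask.fill_(torch.finfo(mask.dtype).min)  # fill with a large negative value
mask.triu_(1)                            # keep only the upper triangle, as a causal mask does
print(mask.shape, mask.dtype)
```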

So is it because of the version of the transformers library? And if so, how can I fix it?
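One way to check whether this is a transformers version mismatch is to inspect what the installed release actually exposes. The snippet below only probes the private helper named in the traceback (assumed here to live on CLIPTextTransformer, as in the file the traceback points at); it makes no claim about which release changed the signature:

```python
import inspect

import transformers
from transformers.models.clip import modeling_clip

print("transformers version:", transformers.__version__)

# The signature of this private helper (and even its existence) varies across
# transformers releases, so probe it defensively.
helper = getattr(modeling_clip.CLIPTextTransformer, "_build_causal_attention_mask", None)
if helper is not None:
    print("_build_causal_attention_mask", inspect.signature(helper))
else:
    print("_build_causal_attention_mask is not defined in this transformers version")
```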