Closed: Drosrin closed this issue 5 days ago
Please provide a list of the versions of the installed libraries, with a focus on PyTorch and Transformers. This information will be helpful in troubleshooting any potential issues.
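For example, a quick way to collect these is something like the following (just a sketch; `flash_attn` may or may not be present in your environment):

```python
# Print the versions relevant to this issue.
import torch
import transformers

print("torch:", torch.__version__)
print("transformers:", transformers.__version__)
print("CUDA available:", torch.cuda.is_available())

try:
    import flash_attn
    print("flash_attn:", flash_attn.__version__)
except ImportError:
    print("flash_attn: not installed")
```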
I'm dumb, I loaded the image in the wrong way :(
Checklist
Describe the bug
When trying to run inference with InternVL2.0 following https://internvl.readthedocs.io/en/latest/internvl2.0/quick_start.html, I encountered the problem above. If `flash_attn` is not installed, I can work around it by changing `torch_dtype=torch.bfloat16` to `torch_dtype=torch.float32`. However, if `flash_attn` is installed, this workaround does not help, because `flash_attn` only supports bf16 and fp16. Changing the `flash_attn` version did not help either. Have I done something wrong, or is there something I'm missing?
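For reference, this is a minimal sketch of the dtype change described above, applied to a standard `transformers` loading call like the one in the quick start (the model id below is just an example):

```python
import torch
from transformers import AutoModel, AutoTokenizer

path = "OpenGVLab/InternVL2-8B"  # example checkpoint; any InternVL2 model id

# Workaround that only helps when flash_attn is NOT installed:
# load the model in fp32 instead of bf16.
model = AutoModel.from_pretrained(
    path,
    torch_dtype=torch.float32,  # the quick start uses torch.bfloat16 here
    low_cpu_mem_usage=True,
    trust_remote_code=True,
).eval()

tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True, use_fast=False)
```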
Reproduction
This is the `run.py` file I used for inference:
Environment
Error traceback