mlfoundations / open_flamingo

An open-source framework for training large multimodal models.
MIT License

Import Error #284

Closed: Hambaobao closed this issue 11 months ago

Hambaobao commented 11 months ago

ImportError: cannot import name '_expand_mask' from 'transformers.models.bloom.modeling_bloom' (/usr/local/lib/python3.9/dist-packages/transformers/models/bloom/modeling_bloom.py)

I got this error with transformers 4.36.1.

anas-awadalla commented 11 months ago

Hi @Hambaobao! Yeah, this appears to be an issue with the latest versions of transformers. The solution is just to downgrade transformers, as done here.
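For anyone hitting the same ImportError, a minimal sketch of making the failure explicit before importing open_flamingo; the "<4.36" bound is an assumption taken from this thread (4.36.1 no longer exposes `_expand_mask`), so check the repo's pinned requirements for the exact supported release.

```python
# Sketch: fail early with a clear message if transformers is too new.
# The "<4.36" bound is an assumption from this thread, where 4.36.1
# no longer exposes `_expand_mask`.
import transformers
from packaging import version

if version.parse(transformers.__version__) >= version.parse("4.36.0"):
    raise ImportError(
        f"transformers {transformers.__version__} no longer exposes `_expand_mask`; "
        "install an older release (e.g. pip install 'transformers<4.36') "
        "before importing open_flamingo."
    )

from open_flamingo import create_model_and_transforms  # now safe to import
```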

Hambaobao commented 11 months ago

Thank you very much, I have solved this problem. I set trust_remote_code=False, but I encountered a new issue: the example given on Hugging Face performs inference on the CPU, and I want to use the GPU for inference. However, the create_model_and_transforms function does not seem to provide parameters for loading the model onto the GPU, such as device_map or device. When I manually use the .cuda() method to move the model and data onto the GPU, I run into some strange issues. Do you have any examples of performing inference on the GPU? 🤣

anas-awadalla commented 11 months ago

Yes, I would suggest moving the model and data to the GPU as you have done. What issues did you run into?
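For reference, a minimal GPU-inference sketch along these lines. The checkpoint, model paths, and image URL are assumptions based on the Hugging Face demo, not something specified in this thread.

```python
import requests
import torch
from PIL import Image
from huggingface_hub import hf_hub_download
from open_flamingo import create_model_and_transforms

# Assumed OpenFlamingo-3B configuration from the Hugging Face demo.
model, image_processor, tokenizer = create_model_and_transforms(
    clip_vision_encoder_path="ViT-L-14",
    clip_vision_encoder_pretrained="openai",
    lang_encoder_path="anas-awadalla/mpt-1b-redpajama-200b",
    tokenizer_path="anas-awadalla/mpt-1b-redpajama-200b",
    cross_attn_every_n_layers=1,
)
checkpoint_path = hf_hub_download("openflamingo/OpenFlamingo-3B-vitl-mpt1b", "checkpoint.pt")
model.load_state_dict(torch.load(checkpoint_path), strict=False)

# create_model_and_transforms has no device argument, so move the model manually.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
model.eval()

# Prepare one image and a prompt, then move the tensors to the same device.
image = Image.open(
    requests.get("http://images.cocodataset.org/val2017/000000039769.jpg", stream=True).raw
)
# Shape expected by the model: (batch, T_img, frames, C, H, W).
vision_x = image_processor(image).unsqueeze(0).unsqueeze(0).unsqueeze(0).to(device)
lang_x = tokenizer(["<image>A photo of"], return_tensors="pt").to(device)

with torch.no_grad():
    generated = model.generate(
        vision_x=vision_x,
        lang_x=lang_x["input_ids"],
        attention_mask=lang_x["attention_mask"],
        max_new_tokens=20,
    )
print(tokenizer.decode(generated[0]))
```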

Hambaobao commented 11 months ago

Thank you for the response; my issue has been resolved. Previously, I wanted to use bf16 precision for inference, but the program reported a type-mismatch error, so I switched to fp32 instead. 🤣
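For completeness, a hedged sketch of keeping precisions consistent, continuing the variables from the sketch above; whether bf16 actually works for a given checkpoint is not something this thread establishes.

```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# fp32 path (what ended up working in this thread): model and inputs share a dtype.
model = model.to(device, dtype=torch.float32)
vision_x = vision_x.to(device, dtype=torch.float32)

# bf16 path (hypothetical): cast the weights and the image tensor together,
# otherwise mixed dtypes trigger exactly the mismatch described above.
# model = model.to(device, dtype=torch.bfloat16)
# vision_x = vision_x.to(device, dtype=torch.bfloat16)
```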

anas-awadalla commented 11 months ago

Great, will close this issue then!