guillaume-be / rust-bert

Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
https://docs.rs/crate/rust-bert
Apache License 2.0

Got unsupported ScalarType BFloat16 #392

Closed MoonKraken closed 1 year ago

MoonKraken commented 1 year ago

I am trying to use convert_model.py to convert a model to .ot so that I can use it with rust_bert. But I am getting the following error:

python utils/convert_model.py ~/Downloads/pytorch_model.bin
Traceback (most recent call last):
  File "/Users/kenk/Documents/Code/OpenSource/rust-bert/utils/convert_model.py", line 67, in <module>
    tensor = v.cpu().numpy()
             ^^^^^^^^^^^^^^^
TypeError: Got unsupported ScalarType BFloat16

I'm on an M1 MBP, and a quick Google search suggested this issue might be expected on this hardware, but I don't know enough about NumPy or PyTorch to know how to fix it. TIA!
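For context, the error is not M1-specific: NumPy simply has no native bfloat16 dtype, so `.numpy()` fails on any bfloat16 tensor regardless of hardware. A bfloat16 value is just the upper 16 bits of an IEEE-754 float32, which can be demonstrated with plain NumPy (no torch required; this is an illustrative sketch, not code from the repository):

```python
import numpy as np

# NumPy has no bfloat16 dtype, which is why v.cpu().numpy() raises
# "Got unsupported ScalarType BFloat16". bfloat16 is the top 16 bits
# of a float32, so its bit pattern round-trips through uint16:

x = np.array([1.0, -2.5, 3.140625], dtype=np.float32)

# Truncate each float32 to its upper 16 bits -> the bfloat16 bit pattern.
bf16_bits = (x.view(np.uint32) >> 16).astype(np.uint16)

# Reconstruct a float32 by padding the lower 16 bits with zeros.
restored = (bf16_bits.astype(np.uint32) << 16).view(np.float32)

# These particular values have short mantissas, so they survive the
# truncation exactly; a general float32 would lose low-order bits.
```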

guillaume-be commented 1 year ago

Hello @Me163 ,

This is not caused by a device issue but by the lack of bfloat16 support in NumPy. I have added a small utility function that should handle this case and cast the tensors to their bfloat16 representation. Note that these would then have to be reloaded as Kind::BFloat16 in the tch-rs crate.

The candidate PR is at https://github.com/guillaume-be/rust-bert/pull/396 - could you please give this a try and let me know if this helps?