huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Enable different torch dtype in sub models #34873

Open zucchini-nlp opened 3 days ago

zucchini-nlp commented 3 days ago

What does this PR do?

Fixes https://github.com/huggingface/transformers/issues/33997. Enables users to specify a different torch dtype for each sub-config, for example loading the vision model in full precision and the text model in half precision.
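
A minimal sketch of how per-sub-model dtypes could be requested, based on the PR description. The dict form of `torch_dtype` and the key names (`"text_config"`, `"vision_config"`, `""` as a fallback) are assumptions for illustration, not confirmed API:

```python
import torch
from transformers import LlavaForConditionalGeneration

# Assumed usage: pass a mapping from sub-config name to dtype instead of a
# single dtype, so each sub-model is loaded in its own precision.
model = LlavaForConditionalGeneration.from_pretrained(
    "llava-hf/llava-1.5-7b-hf",
    torch_dtype={
        "text_config": torch.float16,    # language model in half precision
        "vision_config": torch.float32,  # vision tower in full precision
        "": torch.float16,               # fallback for remaining weights (assumed key)
    },
)

# Quick check that the sub-models ended up with the requested dtypes
print(model.language_model.dtype)  # expected: torch.float16
print(model.vision_tower.dtype)    # expected: torch.float32
```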

HuggingFaceDocBuilderDev commented 3 days ago

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.