huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Should use eigvalsh instead of eigvals for fast & stable covariance matrix diagonalization #34570

Open bohrium opened 1 month ago

bohrium commented 1 month ago

https://github.com/huggingface/transformers/blob/c2820c94916e34baf4486accae74760972183a2f/src/transformers/modeling_utils.py#L2473

Covariance matrices are always symmetric, so torch.linalg.eigvalsh should be used here instead of torch.linalg.eigvals. Switching to it produced a major speedup (>100x in preprocessing and >2x in overall fine-tuning) in a project I'm working on.
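For context, here is a minimal sketch of the difference (the array names and shapes are illustrative, not taken from modeling_utils.py): both routines agree on the eigenvalues of a symmetric matrix, but the general solver ignores the symmetry and works in complex arithmetic.

```python
import torch

# Hypothetical example: build a covariance matrix, which is
# symmetric positive semi-definite by construction.
x = torch.randn(4096, 512, dtype=torch.float64)
cov = torch.cov(x.T)  # (512, 512), symmetric

# General-purpose solver: treats the matrix as arbitrary and returns
# complex eigenvalues, even though they are mathematically real here.
vals_general = torch.linalg.eigvals(cov)

# Symmetric/Hermitian solver: exploits the structure and returns real
# eigenvalues in ascending order; typically much faster and stabler.
vals_sym = torch.linalg.eigvalsh(cov)

# The results agree up to ordering and floating-point noise.
print(torch.allclose(
    vals_general.real.sort().values, vals_sym.sort().values, atol=1e-8
))
```

The speedup comes from eigvalsh dispatching to a dedicated symmetric eigensolver, which skips the more expensive nonsymmetric decomposition that eigvals performs; it also guarantees real output, so no post-hoc handling of spurious imaginary parts is needed.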

Rocketknight1 commented 4 weeks ago

cc @abuelnasr0 @gante - this line comes from #34037.

github-actions[bot] commented 1 day ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

Rocketknight1 commented 1 day ago

cc @bohrium - no reply from the original author; would you be willing to open this PR? I think your reasoning is valid, and the performance improvement would be nice!