bitsandbytes-foundation / bitsandbytes

Accessible large language models via k-bit quantization for PyTorch.
https://huggingface.co/docs/bitsandbytes/main/en/index

PyTorch XLA/PJRT TPU support #1119

Open · opooladz opened this issue 8 months ago

opooladz commented 8 months ago

Feature request

PyTorch XLA/PJRT TPU support for bitsandbytes

Motivation

Support for PyTorch XLA/PJRT would allow faster and more memory-efficient training of models on TPUs.

Your contribution

Happy to provide TPUs.
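
For concreteness, here is a minimal sketch of the kind of usage this request would enable. This is hypothetical and does not run today, since bitsandbytes' k-bit kernels are CUDA-only; `xm.xla_device()` and `bnb.optim.Adam8bit` are existing APIs, but combining them on a TPU is exactly the missing support being requested.

```python
import torch
import torch_xla.core.xla_model as xm
import bitsandbytes as bnb

# Hypothetical: today this device/optimizer combination fails, because
# bitsandbytes dispatches to CUDA kernels only.
device = xm.xla_device()  # PJRT-backed TPU device
model = torch.nn.Linear(1024, 1024).to(device)

# Requested feature: 8-bit optimizer states living on the TPU.
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-3)

x = torch.randn(8, 1024, device=device)
loss = model(x).sum()
loss.backward()
optimizer.step()
xm.mark_step()  # materialize the lazy XLA graph
```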

steveepreston commented 1 month ago

+1 please!

markjayne commented 2 weeks ago

+1 as well please!