teddykoker / torchsort

Fast, differentiable sorting and ranking in PyTorch
https://pypi.org/project/torchsort/
Apache License 2.0

16bit support? #57

Closed pumplerod closed 1 year ago

pumplerod commented 1 year ago

Curious to know if it would be difficult to implement `torch.float16` support, or whether it is perhaps planned for the future. Currently, when trying to use 16-bit precision with torchsort, we get the error: `"arange_cpu" not implemented for 'Half'`
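A minimal sketch that reproduces the error on CPU (the shape and values are illustrative; `torchsort.soft_rank` is one of the library's entry points, which expects a 2-D batch of sequences):

```python
import torch
import torchsort

# A CPU tensor in half precision (shape: batch of 1, sequence of 8)
x = torch.randn(1, 8, dtype=torch.float16)

# Raises: RuntimeError: "arange_cpu" not implemented for 'Half',
# because torchsort uses torch.arange internally, which has no
# half-precision CPU kernel in PyTorch.
ranks = torchsort.soft_rank(x)
```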

teddykoker commented 1 year ago

Hi, currently 16-bit support is limited to CUDA only. fp16 is generally not supported on CPU in PyTorch, which is out of this library's control; in this case `torch.arange`, an operation this library depends on, has no CPU implementation for `Half`. Overall I would not recommend using fp16 on CPU anyway, as many other operations are unsupported and performance is often poor.
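A sketch of what this implies in practice, assuming a CUDA device is available (the float32 round-trip on CPU is an illustrative workaround, not an official recommendation of the library):

```python
import torch
import torchsort

x = torch.randn(1, 8)

# On CUDA, half precision is supported: move the tensor to the GPU
# and cast it before calling torchsort.
if torch.cuda.is_available():
    ranks_gpu = torchsort.soft_rank(x.cuda().half())

# On CPU, keep computation in float32. If the surrounding model runs
# in fp16, cast up before sorting and back down afterwards
# (an assumption for illustration; costs an extra copy each way).
x_half = x.half()
ranks_cpu = torchsort.soft_rank(x_half.float()).half()
```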