jinlow / forust

A lightweight gradient boosted decision tree package.
https://jinlow.github.io/forust/
Apache License 2.0

use `rug::Float` instead of hard-coding `f64` for arbitrary precision #109

Open · alessandromazza98 opened this issue 1 month ago

alessandromazza98 commented 1 month ago

Hi,

have you ever thought about using the `rug::Float` type instead of hard-coding `f32` or `f64`, so that people who use your crate can have arbitrary precision over the data fed into the model?

I think it would be beneficial not only for me, but for all use cases that require better precision than an `f64` can provide (an `f64` significand has 53 bits of precision).
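
For illustration, a minimal sketch of how `rug::Float` lets the precision be chosen per value (assuming the `rug` crate as a dependency; the values and precisions are just examples):

```rust
use rug::Float;

fn main() {
    // with_val takes a precision in bits plus an initial value.
    // 53 bits matches the f64 significand; 200 bits is roughly
    // 60 decimal digits.
    let third_53 = Float::with_val(53, 1) / Float::with_val(53, 3);
    let third_200 = Float::with_val(200, 1) / Float::with_val(200, 3);
    println!("53-bit : {}", third_53);
    println!("200-bit: {}", third_200);
}
```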

I would be open to working on the integration of `Float` in place of `f32`/`f64` if you are open to discussing it and potentially merging it later.
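
A rough sketch of one possible direction, just to make the idea concrete (the trait name and bounds here are hypothetical; the real bound set would follow whatever operations forust's code actually needs):

```rust
use std::ops::{Add, Mul};

// Hypothetical trait standing in for the hard-coded f32/f64.
pub trait BoostFloat: Clone + PartialOrd + Add<Output = Self> + Mul<Output = Self> {
    fn zero() -> Self;
}

impl BoostFloat for f64 {
    fn zero() -> Self {
        0.0
    }
}

// rug::Float could implement the same trait behind a newtype that
// fixes a precision, e.g. Float::with_val(PREC, 0) for zero().

// Generic code then works for any implementor, e.g. a simple sum:
fn sum<T: BoostFloat>(xs: &[T]) -> T {
    xs.iter().cloned().fold(T::zero(), |acc, x| acc + x)
}

fn main() {
    let xs = vec![0.1_f64, 0.2, 0.3];
    println!("{}", sum(&xs));
}
```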

What do you think?

Thanks, Alessandro

jinlow commented 1 month ago

One of the main benefits of using `f32` where it is currently used is speed and memory usage. I am not familiar with the `rug` crate; do you know whether its `Float` type is slower or uses more memory than the built-in `f32` and `f64` types?
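
Not a benchmark, but a quick sketch of the stack footprint (`rug::Float` wraps MPFR, so on top of this each value carries a heap allocation for its mantissa, and the arithmetic is done in software rather than hardware):

```rust
use rug::Float;

fn main() {
    // Stack sizes only; rug::Float also heap-allocates its limbs via MPFR.
    println!("f32:        {} bytes", std::mem::size_of::<f32>());
    println!("f64:        {} bytes", std::mem::size_of::<f64>());
    println!("rug::Float: {} bytes + heap", std::mem::size_of::<Float>());
}
```
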
Another thing to consider would be the interoperability of `Float` with the Python wrapper. I would be curious whether `Float` can be passed back and forth from Python, and whether it has support on the PyO3 side. Does