JuliaAI / MLJLinearModels.jl

Generalized Linear Regressions Models (penalized regressions, robust regressions, ...)
MIT License

Hardcoding of `Float64` in loss #156

Open MartinuzziFrancesco opened 4 months ago

MartinuzziFrancesco commented 4 months ago

Is there a motivation for hardcoding the scaling in the loss penalties as `Float64`? If not, would a more generic definition allow for multiple types of regression outputs (`Float64`, `Float32`, `Float16`)?
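For illustration, here is a hypothetical sketch (not the actual MLJLinearModels code; the type and field names are made up) contrasting a penalty whose scaling is hardcoded to `Float64` with a generic definition parameterized on the element type:

```julia
# Hypothetical example: scaling hardcoded as Float64 forces promotion
# of Float32/Float16 inputs to Float64.
struct L2PenaltyHardcoded
    lambda::Float64
end
(p::L2PenaltyHardcoded)(theta) = p.lambda * sum(abs2, theta) / 2

# A generic definition parameterizes the scalar type, so Float32/Float16
# coefficient vectors stay in their own precision.
struct L2PenaltyGeneric{T<:AbstractFloat}
    lambda::T
end
(p::L2PenaltyGeneric{T})(theta::AbstractVector{T}) where {T} =
    p.lambda * sum(abs2, theta) / 2
```

With the generic version, `L2PenaltyGeneric(0.5f0)(Float32[1.0, 2.0])` returns a `Float32`, whereas the hardcoded version promotes the result to `Float64`.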

MartinuzziFrancesco commented 4 months ago

I see that I overlooked the section in the README that specifies "All computations are assumed to be done in Float64." Of course, the issue goes beyond the simple hardcoding in the loss; my bad.

I can start looking into a way to generalize this in the coming weeks.