inejc closed this 5 years ago
Just double checking to be sure, this fix removes doubles because (most of the time) there's no need for that high precision, right? Or is there some other bug in the library that is caused by having doubles?
I'll take a closer look tomorrow (I guess it's mostly about making sure every use of doubles is found). For now, besides the failing test visible in the CI log, I've noticed that the gradient tests for linear/logistic/Poisson regression and softmax are failing - I assume the comparison tolerance is too strict.
Edit: nvm, didn't see the WIP tag, will check back later.
@matejklemen yeah, we should sacrifice precision for lower memory consumption (it should roughly be halved); ~7 significant digits is plenty for most cases. The failing tests were caused by breeze's function for computing approximate gradients, which we compare against the analytical solution: it used the one-sided version of the finite difference formula. I've now changed that to the two-sided estimate and also had to lower the tolerance for the comparisons. The integration tests for linear models show a negligible change in performance. The tests are passing now, but I want to run this on larger datasets to confirm I didn't break anything.
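For context, a minimal sketch of the difference between the two finite-difference estimates discussed above (a scalar function is used for illustration; the names here are hypothetical and not doddle-model's or breeze's API). The central estimate's truncation error is O(h^2) versus O(h) for the forward estimate, which is why it tracks the analytical gradient more closely:

```scala
object FiniteDifference {
  // One-sided (forward) estimate: truncation error is O(h)
  def forwardDiff(f: Double => Double, x: Double, h: Double = 1e-6): Double =
    (f(x + h) - f(x)) / h

  // Two-sided (central) estimate: truncation error is O(h^2)
  def centralDiff(f: Double => Double, x: Double, h: Double = 1e-6): Double =
    (f(x + h) - f(x - h)) / (2 * h)

  def main(args: Array[String]): Unit = {
    val f: Double => Double = x => x * x * x // f(x) = x^3, so f'(x) = 3x^2
    val x = 2.0
    val exact = 3 * x * x // analytical gradient: 12.0
    val fwdErr = math.abs(forwardDiff(f, x) - exact)
    val ctrErr = math.abs(centralDiff(f, x) - exact)
    // The central estimate is much closer to the analytical gradient,
    // so comparisons against it can use a tighter tolerance
    assert(ctrErr < fwdErr)
    println(f"forward error: $fwdErr%.3e, central error: $ctrErr%.3e")
  }
}
```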
Fixes https://github.com/picnicml/doddle-model/issues/104.