lmfit / uncertainties

Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
http://uncertainties.readthedocs.io/

unumpy vectorized functions extremely slow #75

Open beojan opened 6 years ago

beojan commented 6 years ago

The unumpy vectorized functions are extremely slow. This appears to be because the uncertainty propagation is repeated for every element rather than being done once for the whole array.

rth commented 6 years ago

@beojan There is some related discussion in https://github.com/lebigot/uncertainties/issues/57, doesn't look like there is an easy solution for it unfortunately.

lebigot commented 5 years ago

@beojan, most uncertainty calculations done by uncertainties.unumpy are indeed repeated for every element in turn. Now, in principle it would be possible to do better. I'm not expecting much speedup in general, though. One thing that would be useful would be to profile some calculation and see whether the NumPy vectorization is responsible for most of the calculation time. So @beojan if you have a minimal working example of a calculation that you had in mind, that'd be useful!

beojan commented 5 years ago

If you look at the notes here, you'll see that NumPy's np.vectorize basically generates a for loop (which would presumably cross the Python/C boundary on every iteration to call the function). If that's what's being used, it's going to be very slow indeed.

The only solution I can think of is to have a uarray type that internally stores the value and uncertainty separately, and manipulate these with existing numpy or scipy functions.
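A rough sketch of that struct-of-arrays idea (the `UArray` class and `usin` function are hypothetical, not part of uncertainties; first-order propagation of a single unary function, assuming independent elements):

```python
import numpy as np

class UArray:
    """Toy value/uncertainty container backed by two plain NumPy arrays."""
    def __init__(self, nominal, std_dev):
        self.nominal = np.asarray(nominal, dtype=float)
        self.std_dev = np.asarray(std_dev, dtype=float)

def usin(a: UArray) -> UArray:
    # Linear error propagation: sigma_f = |f'(x)| * sigma_x,
    # computed in one vectorized pass instead of a Python loop.
    return UArray(np.sin(a.nominal), np.abs(np.cos(a.nominal)) * a.std_dev)

x = UArray([0.0, np.pi / 2], [0.1, 0.1])
y = usin(x)
print(y.nominal)  # [0. 1.]
print(y.std_dev)  # |cos(x)| * 0.1, i.e. 0.1 at x=0 and ~0 at x=pi/2
```

The catch, of course, is that this drops the correlation tracking between elements that uncertainties provides; a real implementation would need to keep that information as well.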

lebigot commented 5 years ago

Exactly. Pull requests are welcome as usual!

lebigot commented 5 years ago

https://numpy.org/neps/nep-0018-array-function-protocol.html might be relevant.
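For context, NEP 18 lets a custom array-like type intercept NumPy's public functions, so `np.sum(x)` on a wrapped array can dispatch to custom logic without a per-element loop. A minimal sketch (the `Wrapped` class is illustrative, not part of uncertainties):

```python
import numpy as np

class Wrapped:
    """Minimal NEP 18 container: intercepts np.* calls on its data."""
    def __init__(self, data):
        self.data = np.asarray(data)

    def __array_function__(self, func, types, args, kwargs):
        # Unwrap any Wrapped arguments, run the NumPy function once
        # at C speed, then rewrap the result.
        unwrapped = [a.data if isinstance(a, Wrapped) else a for a in args]
        return Wrapped(func(*unwrapped, **kwargs))

w = Wrapped([1.0, 2.0, 3.0])
total = np.sum(w)  # dispatches to Wrapped.__array_function__
print(total.data)  # 6.0
```

An uncertainties-aware array type could use the same hook to propagate nominal values and standard deviations with whole-array operations.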

lebigot commented 1 year ago

Could https://github.com/sradc/SmallPebble be used as an efficient backend for uncertainties?