Open beojan opened 6 years ago
@beojan There is some related discussion in https://github.com/lebigot/uncertainties/issues/57; unfortunately, it doesn't look like there is an easy solution.
@beojan, most uncertainty calculations done by uncertainties.unumpy are indeed repeated for every element in turn. Now, in principle it would be possible to do better. I'm not expecting much speedup in general, though.
One thing that would be useful would be to profile some calculation and see whether the NumPy vectorization is responsible for most of the calculation time. So @beojan if you have a minimal working example of a calculation that you had in mind, that'd be useful!
If you look at the notes here, you'll see that the NumPy vectorize function basically generates a Python-level for loop (which presumably crosses the Python/C boundary on every iteration to call the function). If that's what's being used, it's going to be very slow indeed.
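To make that concrete, here's a small sketch (the function `f` and the timing setup are illustrative, not from the issue) showing that `np.vectorize` just calls a Python function once per element, in contrast to a native ufunc expression that stays in C:

```python
import timeit
import numpy as np

# np.vectorize is a convenience wrapper, not a performance tool:
# it invokes the Python function f once per array element.
def f(x):
    return x * x + 1.0

vf = np.vectorize(f)
a = np.linspace(0.0, 1.0, 100_000)

# Same result, very different cost: the ufunc expression runs in C.
t_vectorize = timeit.timeit(lambda: vf(a), number=10)
t_ufunc = timeit.timeit(lambda: a * a + 1.0, number=10)
print(f"np.vectorize: {t_vectorize:.4f}s  native ufunc: {t_ufunc:.4f}s")
```

On typical machines the `np.vectorize` version is one to two orders of magnitude slower, which is consistent with the per-element behavior described above.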
The only solution I can think of is a uarray type that internally stores the values and uncertainties separately, and manipulates them with existing NumPy or SciPy functions.
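A minimal sketch of that idea (the `UArray` class and its methods are hypothetical, not part of the uncertainties API): keep nominal values and standard deviations in two separate NumPy arrays and propagate uncertainties with vectorized first-order formulas. Note this toy version assumes independent inputs and ignores the correlation tracking that uncertainties actually provides:

```python
import numpy as np

class UArray:
    """Hypothetical struct-of-arrays container: nominal values and
    standard deviations live in two plain NumPy arrays, so each
    operation is a handful of vectorized calls instead of a Python
    loop over elements. Correlations between quantities are ignored,
    unlike in the uncertainties package itself."""

    def __init__(self, nominal, std_dev):
        self.n = np.asarray(nominal, dtype=float)
        self.s = np.asarray(std_dev, dtype=float)

    def __add__(self, other):
        # For independent inputs: sigma_(a+b) = sqrt(sigma_a^2 + sigma_b^2)
        return UArray(self.n + other.n, np.hypot(self.s, other.s))

    def __mul__(self, other):
        # Relative uncertainties add in quadrature for a product.
        n = self.n * other.n
        rel = np.hypot(self.s / self.n, other.s / other.n)
        return UArray(n, np.abs(n) * rel)

    def __repr__(self):
        return f"UArray({self.n!r}, {self.s!r})"

x = UArray([1.0, 2.0], [0.1, 0.2])
y = UArray([3.0, 4.0], [0.3, 0.4])
print(x + y)
print(x * y)
```

The cost per operation here is independent of how the elements are counted in Python: every step is a whole-array NumPy call.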
Exactly. Pull requests are welcome as usual!
https://numpy.org/neps/nep-0018-array-function-protocol.html might be relevant.
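For reference, NEP 18 lets an array-like class intercept NumPy functions via `__array_function__`. A toy sketch (class and handler names are made up for illustration) of how a value/uncertainty container could hook into, say, `np.concatenate`:

```python
import numpy as np

HANDLED = {}

def implements(np_func):
    """Register a handler for a NumPy function (NEP 18 dispatch)."""
    def decorator(f):
        HANDLED[np_func] = f
        return f
    return decorator

class Uncertain:
    """Toy array-like carrying (nominal, std) pairs that opts in to
    the NEP 18 protocol: supported NumPy functions dispatch to our
    handlers instead of coercing to a plain ndarray."""

    def __init__(self, nominal, std):
        self.n = np.asarray(nominal)
        self.s = np.asarray(std)

    def __array_function__(self, func, types, args, kwargs):
        if func not in HANDLED:
            return NotImplemented
        return HANDLED[func](*args, **kwargs)

@implements(np.concatenate)
def concatenate(arrays, axis=0):
    # Concatenate nominal values and uncertainties separately.
    return Uncertain(
        np.concatenate([a.n for a in arrays], axis=axis),
        np.concatenate([a.s for a in arrays], axis=axis),
    )

a = Uncertain([1.0], [0.1])
b = Uncertain([2.0], [0.2])
c = np.concatenate([a, b])  # dispatched through __array_function__
print(c.n, c.s)
```

This would let code written against the NumPy API work on such a container without uncertainties having to reimplement every function name.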
Could https://github.com/sradc/SmallPebble be used as an efficient backend for uncertainties?
The unumpy vectorized functions are extremely slow. This appears to be because the uncertainty propagation is repeated for every element rather than simply being done once.