lmfit / uncertainties

Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
http://uncertainties.readthedocs.io/

Remove the need for uncertainties.unumpy for NumPy arrays #47

Open mikofski opened 8 years ago

mikofski commented 8 years ago

If I monkeypatch Variable, I can use my existing code without modifications:

>>> import numpy as np
>>> from uncertainties import Variable, ufloat, umath, unumpy
>>> Variable.sin = umath.sin
>>> a = ufloat(1.23, 0.0345)
>>> a
1.23+/-0.0345
>>> b = np.array([a, a, a])
>>> b
array([1.23+/-0.0345, 1.23+/-0.0345, 1.23+/-0.0345], dtype=object)
>>> np.sin(b)
array([0.9424888019316975+/-0.01153120158579534,
       0.9424888019316975+/-0.01153120158579534,
       0.9424888019316975+/-0.01153120158579534], dtype=object)

Unfortunately, though, it doesn't save any time over using the looped methods already in unumpy.

lebigot commented 8 years ago

Interesting. Do you have a pointer to the NumPy documentation for this? This would remove the need for umath altogether.

PS: The correct monkey patch is actually UFloat.sin = …: this makes np.sin([a, 2*a]) work (it does not work with the patch above, because Variable represents only independent variables, not derived expressions like 2*a, which depends on a).
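For illustration, a minimal sketch of that corrected patch (it relies on NumPy's object-dtype fallback of calling each element's method of the same name; UFloat is the common base class of uncertain numbers, available as uncertainties.UFloat, or uncertainties.core.AffineScalarFunc in older versions):

import numpy as np
from uncertainties import ufloat, umath, UFloat

# Patching the base class (not Variable) also covers derived expressions such as 2*a.
UFloat.sin = umath.sin

a = ufloat(1.23, 0.0345)
# NumPy's object-dtype fallback calls each element's .sin() method:
print(np.sin(np.array([a, 2 * a], dtype=object)))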

mikofski commented 8 years ago

See NumPy issue 7519. This is actually an undesirable side effect of the current implementation of ufuncs. Apparently the new __numpy_ufunc__ is the correct way to achieve this behavior, but I haven't tested it yet. Sorry, I opened this ticket before I received those responses.

There is more info on __numpy_ufunc__ in the proposal "A Mechanism for Overriding Ufuncs".

Perhaps I should change the title of this issue to "reimplement unumpy.uarray using __numpy_ufunc__". Or should I just close this? IMO it is probably unwise to rely on this side effect: I have already seen inconsistent results in AlgoPy and was advised not to use it, but to use the UTPM methods instead. See AlgoPy issue 48.

Do you think there is any interest in a pull request for overloading ndarray ufuncs using __numpy_ufunc__? For now, I have met my employer's needs by developing a wrapper inspired by your wrapper.
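For context, a minimal usage sketch of that kind of wrapping, assuming the wrapper being referred to is uncertainties.wrap, which turns a plain float function into one that accepts uncertain numbers and propagates uncertainties numerically:

import math
from uncertainties import ufloat, wrap

# wrap() produces a function that accepts uncertain numbers;
# derivatives are estimated numerically unless supplied explicitly.
usin = wrap(math.sin)

a = ufloat(1.23, 0.0345)
print(usin(a))                               # a single uncertain number
print([usin(x) for x in (a, 2 * a, 3 * a)])  # looped over a sequence, as unumpy effectively does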

lebigot commented 7 years ago

Reference: as of NumPy 1.13, a nicer universal-function overriding mechanism is available (which, I guess, is consistent with the "Mechanism for Overriding Ufuncs" proposal?): https://github.com/numpy/numpy/releases/tag/v1.13.0rc1.

lebigot commented 5 years ago

PS: __numpy_ufunc__ (NumPy 1.11?) is now __array_ufunc__ (NumPy 1.13).

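For reference, a rough sketch (not the library's implementation) of how an object-dtype array subclass could provide unumpy.uarray-style behavior through __array_ufunc__, assuming NumPy >= 1.13; the UArray name and the lookup of a same-named umath function are illustrative only:

import numpy as np
from uncertainties import ufloat, umath

class UArray(np.ndarray):
    """Sketch: an object-dtype ndarray that routes ufuncs through umath."""

    def __new__(cls, values):
        return np.asarray(values, dtype=object).view(cls)

    def __array_ufunc__(self, ufunc, method, *inputs, **kwargs):
        if method != "__call__" or kwargs:
            return NotImplemented
        # Map the NumPy ufunc to the uncertainties function of the same name.
        func = getattr(umath, ufunc.__name__, None)
        if func is None:
            return NotImplemented
        # Broadcast the inputs and apply the umath function element-wise.
        arrays = np.broadcast_arrays(*[np.asarray(x, dtype=object) for x in inputs])
        elements = zip(*(arr.flat for arr in arrays))
        result = np.array([func(*elems) for elems in elements], dtype=object)
        return result.reshape(arrays[0].shape).view(type(self))

a = ufloat(1.23, 0.0345)
print(np.sin(UArray([a, 2 * a])))  # uncertainties propagated element-wise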

mikofski commented 5 years ago

Hi @lebigot, I'm super sorry, but it's not likely I'll get to work on this, so if you choose to close it, I'll understand. Thanks

lebigot commented 5 years ago

No need to be sorry: we can all only do our best. :) I started investigating this. There might be a simple solution.

lebigot commented 5 years ago

https://numpy.org/neps/nep-0018-array-function-protocol.html might be relevant.
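For completeness, a minimal sketch of that NEP 18 pattern (it requires NumPy >= 1.17; the UArray class and the np.mean override are purely illustrative, not part of the uncertainties API):

import numpy as np
from uncertainties import ufloat

HANDLED_FUNCTIONS = {}

def implements(numpy_function):
    """Register a UArray implementation of a top-level NumPy function."""
    def decorator(func):
        HANDLED_FUNCTIONS[numpy_function] = func
        return func
    return decorator

class UArray:
    """Sketch: a duck array of uncertain numbers opting in to NEP 18 dispatch."""

    def __init__(self, values):
        self.data = np.asarray(values, dtype=object)

    def __array_function__(self, func, types, args, kwargs):
        if func not in HANDLED_FUNCTIONS:
            return NotImplemented
        if not all(issubclass(t, UArray) for t in types):
            return NotImplemented
        return HANDLED_FUNCTIONS[func](*args, **kwargs)

@implements(np.mean)
def _mean(arr):
    # Plain Python arithmetic keeps the uncertainty propagation intact.
    return sum(arr.data.flat) / arr.data.size

a = ufloat(1.23, 0.0345)
print(np.mean(UArray([a, 2 * a, 3 * a])))  # dispatched to _mean via __array_function__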

TomNicholas commented 5 years ago

If you re-implement this using the new __array_function__ protocol, then uncertainties arrays could potentially become compatible with wrapping in xarray data objects.

xarray currently wraps either numpy arrays or dask arrays (for distributed computation), but we are in the process of generalising this to wrap any array which implements the __array_function__ protocol. This has been driven mostly by interest in wrapping unit-aware arrays (such as those provided by pint or astropy) and in sparse arrays, but it would be awesome to be able to wrap uncertainty-aware arrays too!

It should even be possible once this is implemented to wrap arrays in a nested fashion, so you could have a unit-aware, uncertainty-propagating, distributed array, wrapped in xarray's high-level objects :open_mouth: