mperrin / poppy

Physical Optics Propagation in Python
BSD 3-Clause "New" or "Revised" License

poppy.zernike functions return references to cached arrays instead of copies #16

Closed. josePhoenix closed this issue 10 years ago.

josePhoenix commented 10 years ago

It's possible to accidentally modify the cached Zernike polynomial arrays returned by zernike_list. Those modifications then show up later when you think you're computing 'fresh' copies of the Zernike polynomials, because the arrays are actually being returned out of zernike._ZCACHE, not re-computed.

To prevent unintentionally modifying the Zernike arrays, we can use the writeable (n.b. not writable) flag provided by NumPy. (Alternatively, we can copy the array on return, at the cost of additional memory use.) Example:

In [70]: temp = np.arange(10)
In [71]: temp
Out[71]: array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
In [73]: temp.flags.writeable = False
In [74]: temp[4] = 1
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-74-76362c99aa88> in <module>()
----> 1 temp[4] = 1

ValueError: assignment destination is read-only
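
For example, the cache helper could look something like this (just a sketch assuming a plain dict keyed on the call parameters, not poppy's actual code):

import numpy as np

_ZCACHE = {}  # module-level cache, as described above

def _cache_and_return(key, array, copy_on_return=False):
    # Option 1: mark the cached array read-only, so in-place edits raise ValueError.
    array.flags.writeable = False
    _ZCACHE[key] = array
    # Option 2: hand the caller an independent, writeable copy instead,
    # at the cost of extra memory.
    return array.copy() if copy_on_return else array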

I also propose replacing the _ZCACHE dict with an LRU cache that can enforce an upper bound on memory usage (so calling zernike_list repeatedly with different arguments doesn't result in excess memory usage). It might be helpful to implement the same kind of caching for hexikes, since they're computed in a similar way.
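
As a rough sketch of what a memory-bounded LRU cache could look like (illustrative only; the class name and byte limit are made up, and this is not an existing poppy class):

from collections import OrderedDict

class ArrayLRUCache(object):
    """Minimal LRU cache bounded by the total nbytes of the stored arrays."""

    def __init__(self, max_bytes=256 * 1024 * 1024):
        self._store = OrderedDict()
        self._max_bytes = max_bytes
        self._nbytes = 0

    def get(self, key):
        arr = self._store.pop(key, None)
        if arr is not None:
            self._store[key] = arr  # re-insert to mark as most recently used
        return arr

    def put(self, key, arr):
        arr.flags.writeable = False  # cached arrays stay read-only (see above)
        old = self._store.pop(key, None)
        if old is not None:
            self._nbytes -= old.nbytes
        self._store[key] = arr
        self._nbytes += arr.nbytes
        while self._nbytes > self._max_bytes and len(self._store) > 1:
            _, evicted = self._store.popitem(last=False)  # drop least recently used
            self._nbytes -= evicted.nbytes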

josePhoenix commented 10 years ago

Oops, another pitfall: the mask_outside / outside kwargs aren't part of the key tuple stored in _ZCACHE.
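
A sketch of what including those kwargs in the key could look like (the signature, defaults, and computation here are placeholders, not poppy's actual ones):

import numpy as np

_ZCACHE = {}

def zernike_list(nterms, npix, mask_outside=True, outside=np.nan):
    # Every argument that affects the result has to appear in the cache key,
    # including mask_outside and outside.
    key = (nterms, npix, mask_outside, outside)
    if key not in _ZCACHE:
        # Placeholder: the real Zernike computation would go here.
        _ZCACHE[key] = [np.zeros((npix, npix)) for _ in range(nterms)]
    return _ZCACHE[key]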

mperrin commented 10 years ago

I wasn't previously aware of the writeable flag option for numpy arrays. Nice tip.

The above all sounds like a good plan to me, yes. In terms of the LRU cache, I see that Python 3.3 introduced a built-in LRU cache decorator (see https://docs.python.org/3/library/functools.html ), and there is a backport of it to Python 2.6+ (http://code.activestate.com/recipes/578078-py26-and-py30-backport-of-python-33s-lru-cache/). So one efficient route might be to bundle the backport for now, with the expectation of eventually standardizing on the built-in Python 3.3 version. Unless you already have another cache implementation in mind?
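
For illustration, usage might look roughly like this (a sketch only: the wrapper name and arguments are placeholders, and note that lru_cache bounds the number of cached entries rather than their total memory):

from functools import lru_cache  # built in on Python 3.3; the backport above covers 2.6+

import numpy as np

@lru_cache(maxsize=128)  # evicts least-recently-used entries once 128 results are cached
def cached_zernike(n, m, npix=512, outside=0.0):
    # Hypothetical wrapper: the real Zernike evaluation would replace the zeros below.
    # lru_cache keys on the (hashable) argument values, so arrays can't be arguments.
    result = np.zeros((npix, npix))
    result.flags.writeable = False  # keep cached arrays read-only, per the fix above
    return result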

josePhoenix commented 10 years ago

Resolved by removing _ZCACHE entirely. If we want to add caching behavior back in later, we should open an issue for that.