Closed christopherpoole closed 10 years ago
Thank you for the patch, I've tested it with my python class vectors and it works fine with them too.
Out of interest, why are you using numpy arrays for such short vectors? I thought the speed/RAM footprint gains were usually from replacing very large lists (thousands of items).
Thanks.
I use IPython exclusively as my Python shell, so using numpy arrays for anything with numbers in it is a natural fit. It is also convenient for drawing random distributions of points from the probability distributions in `numpy.random.*`.
I use IPython too, but I still don't see why it's easier to use numpy arrays for everything; lists are just as easy! However, I can see why the initialisers for large arrays of random data would be useful. The code should clearly work in this case, as a numpy array is a valid numerical list; I do agree on that point.
If I have time, I'll check how it works with other C-implemented array types, and see if I can generalise your solution (I dislike "if using this specific library, do something different" clauses where a general solution is possible). Anything set up as a Python class, even if accelerated, should already work fine.
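One way such a generalisation could look (a sketch only; `general_constructor` is a hypothetical helper, not part of the patch): probe whether the object's own type can rebuild a sequence from a list of items, and fall back to a safe container when it cannot. This catches `numpy.ndarray` without naming it, because `ndarray(...)` interprets its argument as a shape rather than as data.

```python
import numpy as np


def general_constructor(obj):
    """Hypothetical library-agnostic check: return type(obj) if it
    round-trips a list of items, otherwise fall back to tuple, which
    is always safe (and hashable)."""
    cls = type(obj)
    try:
        # A well-behaved sequence type rebuilds [1, 2] from [1, 2].
        # numpy.ndarray instead treats [1, 2] as a shape, so the
        # round-trip comparison fails and we fall back.
        if list(cls([1, 2])) == [1, 2]:
            return cls
    except Exception:
        pass
    return tuple


print(general_constructor([9.0]))       # <class 'list'>
print(general_constructor(np.zeros(3))) # <class 'tuple'>
```

The design choice here is duck typing over an `isinstance` test: any C-accelerated sequence class with a data-accepting constructor passes the probe automatically, with no library-specific clause.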
Enforcing the same return type as the input data breaks for numpy arrays without a specific test. A better solution might be to abandon the forced return-type policy and instead always return a tuple for each vertex; tuples are hashable, so e.g. `set(voro["vertices"])` would then be possible. The present modification mostly maintains the forced return-type policy, which might be important for some users.
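To illustrate the hashability point (the vertex values below are made up for the example): tuples can go into a `set`, so duplicate vertices collapse automatically, while lists and numpy arrays raise `TypeError`.

```python
import numpy as np

# Vertices as tuples (hypothetical values): hashable, so a set
# deduplicates them directly.
points = [(0.0, 0.0, 1.0), (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)]
unique = set(points)
print(len(unique))  # 2

# The same vertices as numpy arrays are unhashable:
try:
    set(np.array(p) for p in points)
except TypeError as e:
    print("unhashable:", e)
```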
When points are passed as a numpy array, the vertices returned by `compute_voronoi` are empty. Specifically, using `obj.__class__` to detect the type results in empty arrays for numpy input, because the type `<type 'numpy.ndarray'>` cannot be used to initialise an array from its data directly. The `get_constructor` function added to `voroplusplus` checks specifically for a numpy array and returns the factory function; otherwise it returns `type(obj)`.
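A minimal sketch of that behaviour (the function body is assumed from the description above, not copied from the patch): for numpy input return the `np.array` factory, since calling `numpy.ndarray(...)` directly interprets its argument as a shape and yields an uninitialised array; for everything else `type(obj)` round-trips fine.

```python
import numpy as np


def get_constructor(obj):
    """Sketch of the described fix: numpy arrays need the np.array
    factory, because numpy.ndarray(data) treats `data` as a shape
    and returns an uninitialised array instead of the values."""
    if isinstance(obj, np.ndarray):
        return np.array  # factory function, not the raw type
    return type(obj)     # lists, tuples, user classes work as-is


verts = [1.0, 2.0, 3.0]
print(get_constructor([0.0])(verts))        # [1.0, 2.0, 3.0]
print(get_constructor(np.zeros(3))(verts))  # array([1., 2., 3.])
```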