Closed by JohannesBuchner 6 years ago
Hey Johannes,
Glad you like the project! A reasonable chunk of the code/tests was based upon your nested sampling results, so it's awesome to hear from you.
I agree that the RadFriends and SupFriends code could probably be sped up by taking better advantage of vectorization, and ideally they would use results related to the global/local covariances to determine distances. I didn't invest too much time in them because I mostly used them as consistency checks for the other methods, but I'd be happy to try to improve them if you feel that's important for a fair comparison/discussion with other approaches.
Hi @joshspeagle,
First of all, this is a really cool project! I like how it brings several new nested sampling developments together and is pure python.
You mention that RadFriends' performance is not so great because a lot of Python calls are needed. I think you should not need a KD tree if you take better advantage of numpy vectorisation.
Just as an example, instead of
you can use:
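(The original before/after snippets did not survive here. A minimal sketch of the kind of vectorisation presumably meant, with illustrative function names, computing the RadFriends bounding radius as the largest nearest-neighbour distance:)

```python
import numpy as np

def max_radius_loop(points):
    """Per-point Python loop: many small numpy calls, slow for large n."""
    n = len(points)
    r = 0.0
    for i in range(n):
        dmin = min(np.linalg.norm(points[i] - points[j])
                   for j in range(n) if j != i)
        r = max(r, dmin)
    return r

def max_radius_vectorized(points):
    """Same quantity via one broadcasted pairwise-distance matrix."""
    diff = points[:, None, :] - points[None, :, :]  # (n, n, d)
    d2 = np.einsum('ijk,ijk->ij', diff, diff)       # squared pairwise distances
    np.fill_diagonal(d2, np.inf)                    # exclude self-distances
    return np.sqrt(d2.min(axis=1).max())            # max over NN distances
```

The vectorized version trades O(n^2 d) memory for doing all the work inside compiled numpy code, which is typically a large win for the point counts used here.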
I have my own implementation of RadFriends in my nested_sampling repo (cneighbors.c, neighbors.py). There I use some C libraries for speed-up, which pays off a lot. I guess you do not want to do that in your pure-python library; Cython or numexpr may help though.
Also, in my experience RadFriends doesn't perform well in some real-world applications because the parameters have very different extents. Using a standardized Euclidean metric (normalising each dimension by its standard deviation) instead helps a lot. SupFriends is pretty pointless in practice compared to RadFriends; I wouldn't mind if it was removed :) .
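The standardized-Euclidean idea amounts to rescaling each axis by its standard deviation before measuring distances. A minimal sketch (names are illustrative, not from either codebase):

```python
import numpy as np

def standardized_nn_distances(points):
    """Nearest-neighbour distances after normalising each axis by its std,
    i.e. a standardized Euclidean metric (illustrative sketch)."""
    std = points.std(axis=0)
    scaled = points / std                            # per-axis normalisation
    diff = scaled[:, None, :] - scaled[None, :, :]   # (n, n, d)
    d2 = np.einsum('ijk,ijk->ij', diff, diff)        # squared pairwise distances
    np.fill_diagonal(d2, np.inf)                     # exclude self-distances
    return np.sqrt(d2.min(axis=1))
```

By construction the result is invariant to rescaling any single parameter, which is exactly what helps when parameters have very different extents.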
Cheers, Johannes