I thought I had addressed this (at least testing on my local versions appeared to silence this warning w/ and w/o sklearn installed), so I'll see what I can do about moving the warnings around to avoid this.
So, digging a bit further, this triggers each time the condition `not _SPARSEWARN and npoints <= 2 * ndim` is met, where `npoints` is the number of live points used in a particular ellipsoid as part of the decomposition.
The fact that you're hitting this all the time suggests your distribution is just getting shredded during the decomposition so that almost every iteration you're constructing >=1 ellipsoid with <=10 live points. Are you constructing your bounding ellipsoids right away? Or is this still happening many iterations into sampling?
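Roughly, the check looks like this (a simplified sketch, not the exact code in bounding.py; `check_sparse` is just an illustrative name):

```python
import warnings

def check_sparse(npoints, ndim):
    # Simplified stand-in for the check near line 1223 of bounding.py:
    # warn when an ellipsoid is built from too few points for its dimensionality.
    if npoints <= 2 * ndim:
        warnings.warn("Volume is sparsely sampled. MLE covariance "
                      "estimates and associated ellipsoid decompositions "
                      "might be unstable.")

# For ndim = 5 the threshold is 2 * 5 = 10, so this is about individual
# ellipsoids containing <= 10 live points, not the 300 live points in the run.
check_sparse(npoints=8, ndim=5)    # warns
check_sparse(npoints=300, ndim=5)  # silent
```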
The fact that `_SPARSEWARN` is somehow not being set globally might have something to do with where the variables are being stored in your workflow. Is there something that prevents the global variable from being set properly?
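For reference, the intended one-shot behaviour is essentially the following (again a simplified sketch; the `global` declaration is what lets the assignment persist across calls):

```python
import warnings

_SPARSEWARN = False  # module-level flag; flipped to True after the first warning

def warn_sparse_once():
    global _SPARSEWARN  # rebind the module-level name, not a function-local one
    if not _SPARSEWARN:
        warnings.warn("Volume is sparsely sampled. MLE covariance "
                      "estimates and associated ellipsoid decompositions "
                      "might be unstable.")
        _SPARSEWARN = True

warn_sparse_once()  # warns once
warn_sparse_once()  # silent afterwards within the same Python process
```

Note that each separate Python process gets its own copy of that module-level flag, so if your sampling runs in parallel worker processes that could be one way the warning keeps reappearing.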
Closing because I failed to replicate this. I’ll re-open if this continues to be an issue.
Since I recently updated dynesty (not sure what previous version I'd had), my runs now output a huge number of UserWarnings, from line 1223 of bounding.py:
```
/root/.local/lib/python3.6/site-packages/dynesty/bounding.py:1223: UserWarning: Volume is sparsely sampled. MLE covariance estimates and associated ellipsoid decompositions might be unstable.
  warnings.warn("Volume is sparsely sampled. MLE covariance "
```
I see in the code that there's a global variable `_SPARSEWARN` that is supposed to be set to True after the first warning so the message doesn't repeat, but for some reason this seems to be failing.
I'm also not convinced that I should be getting this warning in the first place, since I'm using 300 live points for a 5-dimensional problem.
Any suggestions would be helpful! I'm temporarily addressing it by silencing UserWarnings.
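One way to keep that temporary fix narrow, silencing only this particular message rather than all UserWarnings, is something like the following (the message pattern and module path are taken from the warning shown above):

```python
import warnings

# Ignore only the sparse-sampling warning raised in dynesty.bounding,
# leaving all other UserWarnings visible.
warnings.filterwarnings(
    "ignore",
    message="Volume is sparsely sampled",
    category=UserWarning,
    module="dynesty.bounding",
)
```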