nipy / mindboggle

Automated anatomical brain label/shape analysis software (+ website)
http://mindboggle.info

Curvature methods #11

Closed by binarybottle 11 years ago

binarybottle commented 11 years ago

Joachim Giard (3/4/2013):

The CurvatureMain default method is -m 0.

-m 2 is not slower, but it is less exact and does not correspond exactly to the definition of curvature; it is still a good approximation. The three methods take roughly the same time, but it would be much slower to use a larger neighborhood with -m 0 (I can't say exactly how much slower). So I would recommend -m 0 if you have low resolution or if you just want to localize local peaks. -m 1 is not well tested and its filtering uses Euclidean distances, so it is good only for fast visualization. -m 2 is a good approximation, but very large curvatures (negative or positive) are underestimated (saturation effect).

binarybottle commented 11 years ago

Joachim Giard and Arno Klein (3/19/2013):

Arno: Thank you for looking into this. So with -m 0, you're saying that the places where we saw zero curvature really are flat with respect to each vertex's immediate neighbors?

Mesh zoom with zeros in blue and a histogram of curvature values: [screenshot, 2013-05-08 9:38 am]

For comparison, the less creased folds of the gray/white surface have fewer zeros, and the zeros are in areas that are flat, not on the creases: [screenshot, 2013-05-08 10:25 am]

Joachim: That's right.

Arno: And we don't currently have the option to choose a larger neighborhood (like 1 mm), correct?

Joachim: We have the option to do so, but the formula to compute curvature is based on the definition for infinitesimal distances, so if the neighborhood is too large the formula is no longer appropriate and does not reflect the real curvature. Maybe keeping a small neighborhood and smoothing afterwards is a better option.

Arno: The -m 2 Laplacian option is an approximation, you said, not exactly curvature. How close is this approximation?

Joachim: I don't know exactly how close it is, but there is a mapping between the two quantities, and this mapping should be bijective for absolute curvature values that are not too large (in our case, it should be fine). I didn't pursue the analysis further.
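
For background on why such a mapping exists: in the smooth setting, the Laplace-Beltrami operator applied to the surface embedding gives the mean-curvature normal, so per-vertex displacement under Laplacian smoothing tracks mean curvature. This is the standard identity (the sign depends on the orientation convention); the exact discretization used by -m 2 is not spelled out in this thread.

```latex
% Standard smooth-surface identity (H = mean curvature, n = unit normal;
% sign depends on the normal/Laplacian convention):
\[
  \Delta_S \, \mathbf{x} \;=\; 2 H \, \mathbf{n}
\]
% So the displacement of a vertex under one Laplacian smoothing step is
% (approximately) proportional to H, which is the mapping referred to above.
```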

Arno: And independent of the fundus extraction, which values should I report in our shape tables for morphometry?

Joachim: -m 0 is correct, but we have to define the size of the neighborhood we want to use, and that may be a tricky question. It really depends on the data and on what you want to do with it. Any curvature computation is an approximation in any case, because we work with piecewise linear objects. I should read some papers on the use of curvature in this domain to have better answers.

Arno: Since we are dealing with a discretized version of the cortical surface, one strategy would be to estimate the continuous surface (e.g., with b-splines), and another would be to widen the neighborhood to a second set of connected vertices. In the latter case, one could either average the values over the first and second concentric rings (via connected pairs of vertices), or consider second-ring vertices only when the vertex in question has zero curvature with respect to the first-ring vertices.

Joachim: The continuous estimation may be a good option. Be careful with the definition of neighborhood: extending to a second ring will give different results depending on the mesh resolution. It's better to use physical lengths, as is done in the geoDistRing methods of Meshanalyzer.
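
A rough Python sketch of the ring-based averaging Arno describes above, assuming a triangle mesh given as an (F, 3) array of vertex indices; the function names are hypothetical, and, as Joachim notes, a physical-distance neighborhood (as in geoDistRing) would be less resolution-dependent:

```python
# Minimal sketch of ring-based curvature averaging on a triangle mesh.
# `faces` is an (F, 3) array of vertex indices; `curvatures` holds one
# value per vertex. Function names are hypothetical.
import numpy as np

def vertex_neighbors(faces, n_vertices):
    """First-ring (directly connected) neighbors of every vertex."""
    neighbors = [set() for _ in range(n_vertices)]
    for a, b, c in faces:
        neighbors[a].update((b, c))
        neighbors[b].update((a, c))
        neighbors[c].update((a, b))
    return neighbors

def two_ring_mean(curvatures, faces):
    """Average curvature over the first and second rings around each vertex."""
    curvatures = np.asarray(curvatures)
    n = len(curvatures)
    ring1 = vertex_neighbors(faces, n)
    smoothed = np.empty(n)
    for v in range(n):
        # second ring = neighbors of neighbors, minus the first ring and v itself
        ring2 = set().union(*(ring1[u] for u in ring1[v])) - ring1[v] - {v}
        idx = list(ring1[v] | ring2)
        smoothed[v] = curvatures[idx].mean() if idx else curvatures[v]
    return smoothed
```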

binarybottle commented 11 years ago

Joachim Giard (4/3/2013):

I verified the theory behind the curvature computation. The best approximation among the implemented methods is -m 2. It also has the advantage of being based on a smoothing algorithm and of avoiding the zero peak. -m 1 is not really correct, and -m 0 is too sensitive to the local linear geometry of the meshes.

The relaxation factor is controlled by the -n argument (between 0 and 1; 0.7 seems correct). The number of smoothing iterations is hard-coded, and I didn't notice big differences, but if you would like, I can expose it.

(4/9/2013):

I made some tests, and it seems that with -n 2 and above for -m 0, the vector field is regular enough. In the tests I did, I don't even get the peak at 0 with -n 2. So maybe it is better to use it instead of -m 2, because it computes all four curvatures plus the directions.

-m 2 is faster than -m 0 (20 s instead of 2000 s) on my computer; with -m 0 -n 2 it takes 1600 s, so that's longer than what you have. With -m 2 -n 0.7, it's around 15 s.
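
For concreteness, a minimal sketch of driving the curvature program with the settings timed above. The -m and -n flags are the ones discussed in this thread; the binary name and the input/output argument layout are assumptions to be checked against the program's actual usage message.

```python
# Minimal sketch: run the curvature binary with the -m / -n settings
# discussed in this thread. Binary name and input/output argument layout
# are assumptions; check the program's usage message.
import subprocess

def run_curvature(input_vtk, output_vtk, method=2, n_value=0.7,
                  binary="CurvatureMain"):
    # -m: method (0, 1, or 2); -n: relaxation factor (0-1) for -m 2,
    # and apparently a neighborhood/smoothing setting (e.g., 2) for -m 0.
    cmd = [binary, "-m", str(method), "-n", str(n_value),
           input_vtk, output_vtk]
    subprocess.check_call(cmd)

# The two timed settings above (hypothetical file names):
# run_curvature("lh.pial.vtk", "lh.curv.m2.vtk", method=2, n_value=0.7)
# run_curvature("lh.pial.vtk", "lh.curv.m0.vtk", method=0, n_value=2)
```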

Arno: So just to be clear, -m 0 is more accurate but can be too sensitive to local linear geometry, so is the larger neighborhood still a better option than Laplacian smoothing? And why would b-splines be considered better?

Joachim: -m 2 is also quite accurate in absolute value, but -m 0 is directional and provides the four curvatures. I should make a comparison between those two methods. Too large a neighborhood for -m 0 is not good either; there is also a peak, but for other reasons (which I still have to pin down). -n 2 seems ideal (empirically).
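
For reference, the "four curvatures" mentioned here follow the standard definitions in terms of the principal curvatures (textbook background, not specific to this code):

```latex
% Principal curvatures kappa_1 >= kappa_2 give the four per-vertex measures:
\[
  H = \tfrac{1}{2}(\kappa_1 + \kappa_2), \qquad
  K = \kappa_1 \kappa_2, \qquad
  \kappa_{\max} = \kappa_1, \qquad
  \kappa_{\min} = \kappa_2
\]
% -m 0 also returns the tangent direction associated with kappa_min,
% which -m 2 does not provide.
```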

binarybottle commented 11 years ago

Joachim Giard and Arno Klein (5/8/2013):

Arno: Since we are no longer using anchor points to tether fundi (we are using endpoints on fold boundaries), the only curvature measure we are currently using for feature extraction is the mean curvature. This is not to say that it wouldn't be good to include the others in our shape tables and for future algorithmic development, but one of the advantages of CurvatureMain's -m 0 option was that it produced Gaussian, minimum, and maximum curvatures and minimum-curvature directions as well as mean curvature values, whereas -m 2 was more limited.

We need to decide whether to use -m 0, -m 2, or both. Which do you think is more accurate and relevant as a shape measure, and which do you think is more useful (smoother values that look more intuitive from afar) for feature extraction?

Joachim: -m 0 is more relevant (if we solve the peak problem). -m 2 is much faster, and there should be almost a bijection between the two.

(5/10/2013):

I reviewed the equation, the code and I made some tests again.

I would recommend keeping only -m 2.

Visually, -m 2 is better than -m 0. Mathematically, it is a good estimate (more global and smoothed than -m 0), and I did not manage to make the peak at 0 disappear. Moreover, -m 2 is much faster. With the kind of surface we have, it is logical to have a lot of extrema. In the tests I made (-n 0.1, 0.7, 0.9, and 0.99), I get a lot of small curvature values, which correspond to the tops of the gyri.

The only disadvantage is that it is smoothed and that very local curvatures are influenced by the neighborhood. I think that for our applications, something more "accurate" would only reflect irregularities in the mesh.

binarybottle commented 11 years ago

Joachim Giard (5/13/2013):

It's done. I changed a normalization parameter, and it's more consistent with the definition of curvature. Consequently, we now have a Gaussian-shaped histogram.

The range is quite sensitive to the smoothing factor you choose, but there is nothing we can do about that; it's a user choice. I'll keep 0.7, as I said before. The curvature is not normalized between -1 and 1. Tell me if everything seems correct.
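
One quick way to check: load the per-vertex curvature scalars and look at the histogram shape and range. A minimal sketch, where the file name and the scalar-array index are placeholders:

```python
# Minimal sketch: read per-vertex curvature scalars from a legacy VTK file
# and plot their histogram to inspect the (roughly Gaussian) shape and range.
# File name and array index are placeholders.
import vtk
from vtk.util.numpy_support import vtk_to_numpy
import matplotlib.pyplot as plt

reader = vtk.vtkPolyDataReader()
reader.SetFileName("lh.curv.m2.vtk")   # hypothetical output file
reader.Update()

curv = vtk_to_numpy(reader.GetOutput().GetPointData().GetArray(0))

plt.hist(curv, bins=200)
plt.xlabel("mean curvature")
plt.ylabel("vertex count")
plt.title("Curvature histogram")
plt.show()
```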