0todd0000 / spm1d

One-Dimensional Statistical Parametric Mapping in Python
GNU General Public License v3.0

spm on vector coding data #84

Closed: benjidutaillis closed this issue 5 years ago

benjidutaillis commented 6 years ago

Hi Todd, I've run SPM on lower limb vector coding data, looking for significant differences between anticipated and unanticipated conditions of a dynamic cutting task. Unfortunately, due to the relatively large variance within the data set, there is almost no significance among the coupled angles being studied, or among the variance of those coupled angles. I was wondering whether using the built-in smoothing function might help to reduce noise in the signal and aid the analysis, and whether this would be a correct course of action to take? Find below figures of the coupled angles and the variance of the coupled angles.

Thanks, Benji

coupled_angle_plots.pdf cut_coupled_variability_plots.pdf

benjidutaillis commented 6 years ago

An addition to the above: how do you go about calculating the FWHM value from a kinematics data set?

0todd0000 commented 6 years ago

Hi Benji,

The FWHM parameter is computed according to Kiebel et al. (1999). The method is summarized in the attached PDF, which appears in Pataky et al. (2016).
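In rough outline, the estimator normalizes the model residuals to unit variance, measures their mean squared temporal gradient, and converts that to an FWHM. A minimal sketch (the variable names and placeholder data are mine, not from the paper):

```python
import numpy as np

def estimate_fwhm(R):
    """Estimate smoothness (FWHM) from a (J x Q) residual array,
    roughly following Kiebel et al. (1999)."""
    eps = np.finfo(float).eps
    ssq = (R ** 2).sum(axis=0)                # sum of squares at each node
    dx = np.gradient(R, axis=1)               # temporal gradients
    v = (dx ** 2).sum(axis=0) / (ssq + eps)   # normalized gradient variance
    v = v[~np.isnan(v)]                       # drop zero-variance nodes
    resels_per_node = np.sqrt(v / (4 * np.log(2)))
    return 1.0 / resels_per_node.mean()

# e.g. one-sample residuals: subject curves minus the cross-subject mean
y = np.random.randn(10, 101)                  # placeholder (J x Q) data set
print(estimate_fwhm(y - y.mean(axis=0)))
```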

Smoothing the data will cause the FWHM to increase, which causes the critical threshold to decrease. The threshold will decrease to the usual 0D threshold when FWHM is infinite (i.e. perfectly flat 1D trajectories are infinitely smooth and equivalent to 0D scalars).

Provided the true signal is smoother than the noise, a second effect of smoothing is that it forces the data toward normality by removing outliers. This tends to reduce variance, thereby boosting the test statistic. The combination of (a) a reduced threshold and (b) reduced variance tends to increase the chances that the test statistic will cross the threshold when the signal is true.

I wouldn't recommend the smoothing function built into spm1d because it's not usually the type of filter used for kinematics datasets. You'd probably want to use one of the filtering functions in scipy.signal, like a Butterworth low-pass filter: https://docs.scipy.org/doc/scipy/reference/signal.html
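For example, something like this (the sampling rate, cutoff frequency and filter order below are placeholders, not recommendations):

```python
import numpy as np
from scipy import signal

fs = 100.0    # sampling frequency (Hz); placeholder
fc = 6.0      # low-pass cutoff (Hz); placeholder
order = 4     # filter order (effective order doubles under filtfilt)

b, a = signal.butter(order, fc / (0.5 * fs), btype='low')
y = np.random.randn(10, 101)                  # placeholder (trials x nodes)
y_smooth = signal.filtfilt(b, a, y, axis=1)   # zero-phase: no time lag
```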

Todd fwhm.pdf

benjidutaillis commented 6 years ago

Hi Todd,

Thanks for replying so quickly. The raw experimental marker data was originally filtered with a Butterworth filter prior to running inverse kinematics. The kinematic data looks good (find attached), and the SPM results from it show significance, so I'm wary of running a second filter and over-smoothing. However, given the noisy nature of the vector coding waveform, would it be correct to run a second filter on it, and not on the kinematics waveform, prior to running SPM? From previous work in vector coding and coordination variability, it seems most papers take something like a quartile mean (which we previously did), which seems to have a 'smoothing' effect on the data set. I'm not aware of any papers using SPM on vector coding, so I'd be keen to hear your thoughts.

Cheers, Benji kinematics_plots.pdf

0todd0000 commented 6 years ago

Hi Benji,

Tough one... if there is a literature precedent for filtering vector coding results then I think it would be no problem, but I'd probably present the data both ways (filtered and unfiltered) as a type of sensitivity analysis.

It might also be possible to use simulation to judge whether it is appropriate to filter vector coding results. Similar to the power1d interface (www.spm1d.org/power1d/), you could control the type of signal you hope to detect, either at the measurement level or at the vector coding level, then use simulation to calculate the probability of detecting true signal. It is possible that a second filtering step at the vector coding level would increase your chances of detecting true signal, but it's also possible that this filtering could introduce phantom signal. The nature of the true signal, the noise, the filtering algorithm, and the vector coding particulars can all affect the results.
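As a rough illustration, a plain Monte Carlo sketch (this is not the power1d interface; the Gaussian-pulse signal, the noise model and all parameters are assumptions):

```python
import numpy as np
from scipy import ndimage, signal
import spm1d

np.random.seed(0)
J, Q, niter = 10, 101, 100
q = np.arange(Q)
true_signal = 2.0 * np.exp(-0.5 * ((q - 50) / 8.0) ** 2)   # Gaussian pulse
b, a = signal.butter(2, 0.1)            # hypothetical second filtering step
apply_second_filter = True              # toggle to compare detection rates

def smooth_noise(J, Q, fwhm=20.0):
    # Gaussian-smoothed white noise as a stand-in for smooth 1D noise
    sd = fwhm / (2 * np.sqrt(2 * np.log(2)))
    e = ndimage.gaussian_filter1d(np.random.randn(J, Q), sd, axis=1)
    return e / e.std(axis=1, keepdims=True)

ndetect = 0
for _ in range(niter):
    yA = smooth_noise(J, Q)
    yB = true_signal + smooth_noise(J, Q)
    if apply_second_filter:
        yA = signal.filtfilt(b, a, yA, axis=1)
        yB = signal.filtfilt(b, a, yB, axis=1)
    ti = spm1d.stats.ttest2(yB, yA).inference(0.05, two_tailed=True)
    ndetect += ti.h0reject
print('Detection probability: %.3f' % (ndetect / niter))
```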

Todd

benjidutaillis commented 6 years ago

Hi Todd,

Thanks so much for your feedback; I definitely have a few ideas to play with now.

Just one more quick question regarding post hoc corrections. For joint angle data, I've seen you perform an alpha correction using the number of planes of movement (e.g. for knee angles, a Bonferroni correction would be 0.05/3 to account for the frontal, sagittal and transverse planes), due to the assumption that they are anatomically bound to each other; is that correct? For vector coding, would there be a need to perform a post hoc correction on the alpha level, since the coupled angles themselves are being bound?

Cheers, Benji

0todd0000 commented 6 years ago

Hi Benji,

If:

- a1(t) and a2(t) are the two angle trajectories from which the coupling angle b(t) is computed, and
- your hypothesis pertains only to b(t),

then I don't think corrections are necessary for analysis of b(t). However, if you also wish to analyze a1(t) and a2(t), then I think you'd have to correct for multiple comparisons across those three tests. In this case a Bonferroni correction would likely be too conservative, because b may not be independent of a1 and a2. Unfortunately no other post hoc corrections are currently available in spm1d.
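For reference, a corrected analysis might look something like this sketch (placeholder data; assuming spm1d.util.p_critical_bonf is available in your spm1d version, otherwise the classical alpha / n correction works):

```python
import numpy as np
import spm1d

# Bonferroni-style correction across the three tests (b(t), a1(t), a2(t))
alpha, ntests = 0.05, 3
p_crit = spm1d.util.p_critical_bonf(alpha, ntests)   # corrected threshold

YA = np.random.randn(10, 101)   # placeholder condition A (J x Q array)
YB = np.random.randn(10, 101)   # placeholder condition B
ti = spm1d.stats.ttest2(YB, YA).inference(alpha=p_crit, two_tailed=True)
print(ti.h0reject)
```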

Todd

bernard-liew commented 6 years ago

Hi Todd,

Not sure if I should jump in, but because I have been doing vector coding myself, I found this interesting. Do correct me if I'm wrong. This only pertains to the issue of the coupling angle, not its variability.

1) Coupling angle is circular data (see issue https://github.com/0todd0000/spm1dmatlab/issues/66), so traditional/SPM stats would not be appropriate. That could be one reason why you are not seeing results. Line plots do not make useful coupling angle plots; scatter plots do, since the angles need not be smooth. The only cutting data using vector coding is Pollard's, although I have done it with my own data; check whether yours matches up with Pollard's. If it matches well, your data is what it should be. There is a recent paper by Hamill in J Appl Biomech on statistical inference methods for vector coding. Because circular stats are not popular, I stick to the binning approach (see the sketch below) and use traditional stats.
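As a rough sketch of that binning idea (the 45-deg sectors and pattern labels below are illustrative only; published bin definitions vary):

```python
import numpy as np

# Classify each coupling-angle sample (0-360 deg) into a coordination
# pattern, then run traditional 0D stats on the per-trial frequencies.
gamma = np.random.uniform(0, 360, 100)     # placeholder coupling angles (deg)
sector = np.floor(((gamma + 22.5) % 360) / 45).astype(int)     # sectors 0..7
names = np.array(['proximal-phase', 'in-phase', 'distal-phase', 'anti-phase'] * 2)
pattern = names[sector]                    # map the 8 sectors to 4 patterns
freqs = {p: 100 * np.mean(pattern == p) for p in np.unique(pattern)}
print(freqs)                               # percent of samples in each bin
```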

Bernard

benjidutaillis commented 6 years ago

Hi Bernard,

Thanks a lot for getting involved in the conversation. From what we've seen, you're 100% correct: Pollard's is the only other cutting data published, and we based a lot of our analysis on those two papers. From my understanding, circular statistics weren't applicable to our data set (please correct me if I'm wrong here) because 1) our coupling angle is reduced to between 0-90 rather than 0-360, and 2) our task, like Pollard's, was a discrete cutting task, not a gait task.
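For reference, this is roughly how we compute the coupling angle (placeholder data; the fold to 0-90 deg is one convention, and papers differ on the exact mapping):

```python
import numpy as np

theta_prox = np.cumsum(np.random.randn(101))   # placeholder proximal angle
theta_dist = np.cumsum(np.random.randn(101))   # placeholder distal angle

# Coupling angle: direction of the angle-angle vector between frames
gamma = np.degrees(np.arctan2(np.diff(theta_dist), np.diff(theta_prox))) % 360

# One possible fold to the 0-90 deg range (conventions vary across papers)
g = gamma % 180
gamma90 = np.minimum(g, 180.0 - g)
```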

Todd, would there be issues with using SPM on circular data? I would assume that, because the output (the coupled angle) is still 1D, it would still be an appropriate analysis?

Cheers, Benji

0todd0000 commented 6 years ago

If the values are limited to 0-90 there shouldn't be a problem; surpassing 180 deg can pose serious problems, because the values wrap around. Regardless, results from random circular fields have not yet been validated in spm1d, so please use them with caution.
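A quick illustration of the wrap-around problem, using scipy's circular mean:

```python
import numpy as np
from scipy.stats import circmean

angles = np.array([359.0, 1.0])            # either side of the 0/360 deg wrap
print(np.mean(angles))                     # 180.0 -- misleading linear mean
print(circmean(angles, high=360, low=0))   # ~0 (equivalently ~360) -- circular mean
```

Todd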

bernard-liew commented 6 years ago

No worries Benji and Todd.

I worry that this may be moving further away from the realm of SPM per se.

The paper I was referring to is "Comparisons of Segment Coordination: An Investigation of Vector Coding". It discourages compressing data into a 0-90 degree range, since you lose directional information.

Circular data has no relationship to whether a movement is cyclical or acyclical: the coupling angle (0-360), regardless of the movement source, is circular.

Bernard

benjidutaillis commented 6 years ago

Hi Bernard, thanks for linking the Hamill paper; it brings up a lot of good points about analysis with vector coding. If you would like to continue our conversation on VC, I'd be interested to hear about some of your work.

Todd, is there any interest from your end in validating spm1d on random circular fields?

bernard-liew commented 6 years ago

Sure, my email is liew_xwb@hotmail.com if you have further questions on vector coding.

Regards, Bernard