Closed: cbrnr closed this issue 8 years ago
We could also use the CSP implementation in MNE (which we recently adapted anyway). So maybe we could create a CSP function for our backend mechanism and use the existing one in the builtin backend and the MNE one in a new MNE backend. Ugh, this is getting unwieldy - somehow it would be much nicer being able to pick the functions individually instead of bundling them all in a backend...
I always thought regularization is not required for CSP, because in practice you will always have enough data. Do you have a counter-example where this is not the case?
> Ugh, this is getting unwieldy - somehow it would be much nicer being able to pick the functions individually instead of bundling them all in a backend...
Hmm... Refactoring the customization system would probably be too much work at the moment. What's the difference between our CSP and MNE CSP?
> I always thought regularization is not required for CSP, because in practice you will always have enough data. Do you have a counter-example where this is not the case?
Yes - some of my data sets produce this error even though the amount of data is the same in every subject.
> Hmm... Refactoring the customization system would probably be too much work at the moment.
Agreed. Although it would be nice to mix and match specific functions from different sources instead of bundling them in backends. Maybe something for the future.
> What's the difference between our CSP and MNE CSP?
Nothing. I merged our code into theirs https://github.com/mne-tools/mne-python/pull/2630, but MNE CSP is more powerful because it supports regularized covariance estimation and two different averaging variants.
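For reference, a minimal sketch of how those two options could be used with `mne.decoding.CSP` (random placeholder data; the accepted values for `reg` have changed between MNE versions, so a plain shrinkage float is used here):

```python
import numpy as np
from mne.decoding import CSP

# Placeholder data: 40 epochs, 8 channels, 200 samples, two balanced classes.
epochs_data = np.random.randn(40, 8, 200)
labels = np.repeat([0, 1], 20)

# Shrinkage-regularized covariance estimation (reg can also be an estimator
# name such as "ledoit_wolf" in newer MNE versions).
csp_reg = CSP(n_components=4, reg=0.1, cov_est="concat")

# Alternative averaging variant: one covariance matrix per epoch, averaged.
csp_epoch = CSP(n_components=4, reg=None, cov_est="epoch")

features = csp_reg.fit_transform(epochs_data, labels)
```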
I just played around with the data set that doesn't work. Both `sigma1` and `sigma2` are positive definite, but `sigma1 + sigma2` is not. This is why `eigh` fails. Using MNE's CSP object with `reg=True` solves the problem. Using `cov_est="epoch"` (compute a covariance matrix for each epoch and then average) also solves the problem.
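For anyone who wants to reproduce this check, here is a minimal sketch. It assumes `sigma1` and `sigma2` are the per-class covariance matrices (built from random placeholder data below) and that CSP solves the generalized eigenvalue problem with `sigma1 + sigma2` as the second matrix, which is the usual formulation:

```python
import numpy as np
from scipy.linalg import eigh


def is_positive_definite(a):
    """Check positive definiteness via a Cholesky factorization."""
    try:
        np.linalg.cholesky(a)
        return True
    except np.linalg.LinAlgError:
        return False


# Placeholder covariance matrices computed from random data; with the
# problematic data set it is the last check that returns False.
rng = np.random.default_rng(0)
sigma1 = np.cov(rng.standard_normal((8, 500)))
sigma2 = np.cov(rng.standard_normal((8, 500)))

print(is_positive_definite(sigma1))
print(is_positive_definite(sigma2))
print(is_positive_definite(sigma1 + sigma2))

# eigh requires the second matrix of the generalized eigenvalue problem to be
# positive definite, which is where the decomposition fails.
w, v = eigh(sigma1, sigma1 + sigma2)
```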
Do you think we should change the averaging method?
No, I think our default averaging method is fine. I think #165 solves this issue, but it means that our CSP implementation is not really that useful. Anyway, if someone encounters difficulties, they can always use the MNE backend.
For some data sets, CSP fails in the eigenvalue decomposition step. This is likely because the covariance matrices are not positive definite, so we need to regularize them (see the discussion at https://github.com/mne-tools/mne-python/issues/2437).
@mbillingr, how should we go about implementing Ledoit-Wolf shrinkage? I would like to use the `sklearn` implementation, but `sklearn` is only an optional dependency.
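One way to handle the optional dependency would be a guarded import with a plain empirical covariance fallback; a rough sketch (the helper name is made up):

```python
import numpy as np

try:
    from sklearn.covariance import ledoit_wolf  # optional dependency
except ImportError:
    ledoit_wolf = None


def estimate_covariance(x):
    """Estimate a covariance matrix from data of shape (n_samples, n_channels).

    Uses Ledoit-Wolf shrinkage if scikit-learn is available, otherwise falls
    back to the plain empirical covariance.
    """
    if ledoit_wolf is not None:
        cov, _ = ledoit_wolf(x)  # returns (shrunk covariance, shrinkage)
        return cov
    return np.cov(x, rowvar=False)


# Example: 500 samples of an 8-channel signal.
cov = estimate_covariance(np.random.randn(500, 8))
```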