mne-tools / mne-python

MNE: Magnetoencephalography (MEG) and Electroencephalography (EEG) in Python
https://mne.tools
BSD 3-Clause "New" or "Revised" License

BUG: Truncated SVD vs Tikhonov regularization to compute pseudo inverse Cm_inv in beamforming #3832

Closed brovelli closed 7 years ago

brovelli commented 7 years ago

Dear all, the beamforming code is based on the calculation of the pseudo-inverse Cm_inv. For example, in _dics.py at lines 382-385:

    # Calculating regularized inverse, equivalent to an inverse operation
    # after the following regularization:
    # Cm += reg * np.trace(Cm) / len(Cm) * np.eye(len(Cm))
    Cm_inv = linalg.pinv(Cm, reg)

As far as I understand, the two methods are equivalent only when all singular values are kept (TSVD) or when the regularization parameter is 0: http://math.stackexchange.com/questions/1084677/tikhonov-regularization-vs-truncated-svd

The default value is reg=0.1 (I normally use 0.05 to get more focused solutions). Are these methods really equivalent?

1) Truncated SVD:

    Cm_inv = linalg.pinv(Cm, reg)

2) Tikhonov regularization:

    Cm += reg * np.trace(Cm) / len(Cm) * np.eye(len(Cm))
    Cm_inv = linalg.pinv(Cm)

Q1. I tried it empirically (I don't know the maths well enough) and the two give different results when reg=0.05. Is this something we should worry about?

Q2. Is the value of reg equivalent for the two methods? Shouldn't the reg value in the TSVD be reg = reg * np.trace(Cm) / len(Cm)?
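
Here is a minimal, self-contained sketch of what I tried (it uses numpy.linalg.pinv, whose rcond cutoff should behave like the cond argument the current code passes to scipy's pinv; that equivalence is an assumption on my part). It shows that the two expressions give different inverses for reg > 0:

    import numpy as np

    rng = np.random.RandomState(0)
    reg = 0.05

    # Synthetic rank-deficient covariance (20 channels, 15 samples).
    A = rng.randn(20, 15)
    Cm = np.dot(A, A.T) / 15.

    # Method 1 (current code): truncated SVD; singular values below
    # reg * (largest singular value) are dropped before inverting.
    Cm_inv_tsvd = np.linalg.pinv(Cm, rcond=reg)

    # Method 2 (Tikhonov / diagonal loading): regularize, then invert.
    Cm_tik = Cm + reg * np.trace(Cm) / len(Cm) * np.eye(len(Cm))
    Cm_inv_tik = np.linalg.pinv(Cm_tik)

    print(np.allclose(Cm_inv_tsvd, Cm_inv_tik))      # False
    print(np.linalg.norm(Cm_inv_tsvd - Cm_inv_tik))  # clearly non-zero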

Thanks a lot

brovelli commented 7 years ago

I ran the same analysis with both methods, and I found results consistent with previous analyses using FieldTrip only with the second method. Briefly, I performed single-trial power estimates at the source level in the high-gamma range using an anatomical atlas (called "MarsAtlas": Auzias et al., Human Brain Mapping 2016), and I get similar results only when the matrix inversion is performed using the second method. I attach the average high-gamma power for the motor area computed using MNE (modified version of Cm_inv, second method) and the one computed using FieldTrip. It reflects the z-score of the high-gamma activity (HGA) with respect to the baseline. The time courses are similar; the SNR of the MNE version is higher than that of the FieldTrip version because I made some refinements using MNE, but the general trend is there. If I use method 1, currently implemented in MNE, I get very different results.

[Figures: average motor-area HGA computed with MNE and with FieldTrip]

larsoner commented 7 years ago

Those are indeed not equivalent. Do you have time to find a reference in a paper we can use to settle this? I suspect that we have a bug here and FieldTrip is correct, but it would be good to point to some equation in a paper.

@rgoj any comment on why you used truncated SVD instead of Tikhonov regularization?

brovelli commented 7 years ago

OK, thanks for the quick reply, I am relieved indeed.

The original reference I know of for regularization in beamforming techniques is a 1999 paper by Joachim Gross (attached). More specifically, they introduce the need for a regularization parameter alpha to control the spatial resolution of the beamforming filter, at equation 24 in the paragraph "4.3. Regularization (rMinInf)".

This paper set out the formalism for the DICS method published in PNAS by Gross in 2001; what they call "Regularization (rMinInf)" is in fact DICS.
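
For context, and paraphrasing the general form of that formalism rather than quoting equation 24 verbatim, the regularization enters as diagonal loading of the sensor covariance before inversion:

$$
C_\alpha = C + \alpha I, \qquad
W(r) = \bigl[L(r)^{\mathsf T} C_\alpha^{-1} L(r)\bigr]^{-1} L(r)^{\mathsf T} C_\alpha^{-1},
$$

where L(r) is the lead field at source location r, and larger alpha gives smoother, less focal filters. This is the Tikhonov form that the expression reg * np.trace(Cm) / len(Cm) implements, with alpha scaled by the mean sensor variance.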

NB: I noticed that such regularization via truncated SVD is used several times in MNE, both in _dics.py and in _lcmv.py. So if this is a BUG, it will need to be corrected in multiple places.

thanks and bye

Andrea

larsoner commented 7 years ago

Would you have time to make a Pull Request to fix it?

Let's wait and see if another person can confirm it would be the correct fix, though.

brovelli commented 7 years ago

I can do a Pull Request after we have some feedback. But I am not so familiar with MNE, so I don't know exactly where else this regularization is used; it would probably be better if someone else did it. What I changed in _dics.py was the following. From:

    # Calculating regularized inverse, equivalent to an inverse operation
    # after the following regularization:
    # Cm += reg * np.trace(Cm) / len(Cm) * np.eye(len(Cm))
    Cm_inv = linalg.pinv(Cm, reg)

To:

    # Regularize the covariance matrix (Tikhonov / diagonal loading),
    # then compute the pseudo-inverse:
    Cm += reg * np.trace(Cm) / len(Cm) * np.eye(len(Cm))
    Cm_inv = linalg.pinv(Cm)
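
If it helps for the review, here is a small self-contained sketch of the same change wrapped in a helper (the name tikhonov_pinv is just illustrative, not an existing MNE function):

    import numpy as np
    from scipy import linalg

    def tikhonov_pinv(Cm, reg):
        """Invert a covariance matrix after Tikhonov (diagonal) loading.

        The matrix is regularized by adding reg times the mean eigenvalue
        (np.trace(Cm) / len(Cm)) to its diagonal and then pseudo-inverted,
        instead of truncating small singular values.
        """
        n = len(Cm)
        Cm_reg = Cm + reg * np.trace(Cm) / n * np.eye(n)
        return linalg.pinv(Cm_reg)
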
dengemann commented 7 years ago

It looks as if the MNE way of doing things leaked into the beamformer code. I don't quite remember why this notation was preferred. But this does not look like a bug, rather like some explicit decision, potentially related to MEG/EEG fusion and handling of rank-deficient data. @agramfort do you remember what happened here?

larsoner commented 7 years ago

Well even in MNE we use Tikhonov regularization. The comment is incorrect, and it doesn't seem to match what the published literature says. This all points to a bug to me.

dengemann commented 7 years ago

Yes, I agree with this; we should make the implementation as consistent with other toolboxes as needed. For MNE we do two things, the lambda2 Tikhonov regularization and an SVD when applying the whitening; I'm wondering if there is some link. More generally, I'm trying to figure out whether anyone who supervised the GSoC remembers a reason for this choice. Anyway, we should provide the canonical variant.

agramfort commented 7 years ago

Sorry guys for the slow reaction time. Yes, I agree it looks like a bug...

First commit comes from me in 2011:

https://github.com/mne-tools/mne-python/commit/2923459327ef3dbfda0f17e5f0788c38e3e570b7

and comment comes from @rgoj

https://github.com/mne-tools/mne-python/commit/691b5dfea4a24e3b06f80d729e9c09bf9d3fc761

@dengemann I don't think it comes from MNE/dSPM code here.

@brovelli for you we "just" need to replace it by:

    Cm += reg * np.trace(Cm) / len(Cm) * np.eye(len(Cm))
    Cm_inv = linalg.pinv(Cm)

If so, it's an easy pull request. I am really curious about the impact on our examples. Maybe it has little impact on our sample data?

@brovelli thanks for digging deep into this.

dengemann commented 7 years ago

Thanks Alex for taking a look and confirming. So we've got a bug to fix! +1 for a PR and exploration of examples.

brovelli commented 7 years ago

Thanks Alexandre for your reply. Yes, these are the only changes I'd make. I can do a Pull Request tomorrow for both _dics.py and _lcmv.py.

Then, it would surely be helpful to rerun the gallery examples. I am curious to see the impact. What I tried out was single-trial estimates of power in the high-gamma band, and there it did make a difference. But maybe for evoked responses or for power averaged across trials, the effect is less important.

bye

dengemann commented 7 years ago

Let's see. I'm curious too, really. Let me know if you need some help along the way.

brovelli commented 7 years ago

I did the PR, could you check it please? I am still unfamiliar with how GitHub works.

brovelli commented 7 years ago

Check out the new issue #3840 concerning the use of real filters. The two issues are related.