dasdiptyajit opened 1 year ago
At the moment, I am trying to calculate the adjacency matrix for a modified source space (i.e., dipoles in the medial walls are excluded).
Rather than using a modified source space, could you use the standard one but pass `spatial_exclude` to the clustering function? This is exactly the use case that I use this parameter/setup for...
But yes it's possible that we also have some bug with this reduced source space that we should fix.
Yes, `spatial_exclude` seems like a workaround for these tests. The problem I have may be rather specific to my case: the medial wall dipoles were excluded in the forward modelling/source reconstruction, and the data was morphed to 'fsaverage' with the same dipole configuration. So I can't switch to new parameter settings for the cluster tests, or I will lose consistency between the pipeline and the results.

Is there a workaround to reduce the full `spatial_src_adjacency` to a set of labels somehow?
> the medial wall dipoles were excluded in the forward modelling/source reconstruction and the data was morphed to 'fsaverage' with similar dipoles config.
I don't see a problem with morphing to the complete `fsaverage` space and then using `spatial_exclude`. I'd expect the results to be the same as morphing to `fsaverage` with the dipoles excluded. (The only difference should be that the dipoles right next to the medial wall end up with slightly less activation, because they can spread to the medial wall during morphing; but this should wash out in the `stat_fun` anyway, because it will be the case for any condition(s) you morph, and the relevant standard-deviation term should account for it.)
> Is there a work around to reduce the full `spatial_src_adjacency` to a set of labels somehow?
The `spatial_src_adjacency` for `fsaverage` should just be a block-diagonal `(20484, 20484)` sparse matrix. In principle you should be able to subselect the vertices you used via proper use of `np.searchsorted` and knowing there are 10242 vertices per hemisphere.
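A minimal sketch of that subselection, with a toy block-diagonal matrix standing in for the real `(20484, 20484)` adjacency (sizes shrunk and entries random purely for illustration; the indexing is the part that carries over):

```python
import numpy as np
from scipy import sparse

# Toy stand-in for the full adjacency: block-diagonal, one block per
# hemisphere (the real sizes would be 10242 + 10242 = 20484).
n_per_hemi = 6
adj_hemi = sparse.random(n_per_hemi, n_per_hemi, density=0.5, format="csr",
                         random_state=0)
adjacency = sparse.block_diag([adj_hemi, adj_hemi], format="csr")

# Vertices actually used, with right-hemisphere indices offset by the
# left-hemisphere vertex count (as in the stc.vertices bookkeeping).
used_lh = np.array([0, 2, 3])
used_rh = np.array([1, 4]) + n_per_hemi
used = np.concatenate([used_lh, used_rh])

# Subselect rows and columns to obtain the reduced adjacency; since
# `used` is sorted, np.searchsorted(used, v) maps a kept original vertex
# index v to its row in the reduced matrix.
adjacency_restricted = adjacency[used][:, used]
print(adjacency_restricted.shape)  # (5, 5)
```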
Great! Thanks for the clarification. I will try to give it a go for both options.
For the second option, this is what I have so far:

```python
import numpy as np
import mne

# subjects_dir points at the FreeSurfer subjects directory
fs_labels = mne.read_labels_from_annot(subject='fsaverage', parc='HCPMMP1',
                                       subjects_dir=subjects_dir)
# labels are sorted by name, so the '???' (medial wall) labels come first:
# fs_labels[0] is '???-lh', fs_labels[1] is '???-rh'
fsaverage_vertices = [np.arange(10242), np.arange(10242)]
exclude_lh = np.isin(fsaverage_vertices[0], fs_labels[0].vertices)
exclude_rh = np.isin(fsaverage_vertices[1], fs_labels[1].vertices)
# offset right-hemisphere indices by the number of lh vertices
spatial_exclude = np.concatenate([np.where(exclude_lh)[0],
                                  np.where(exclude_rh)[0] + len(fsaverage_vertices[0])])
```
```python
In [34]: spatial_exclude
Out[34]: array([    8,    23,    36, ..., 20203, 20204, 20205])

In [35]: spatial_exclude.shape
Out[35]: (1742,)
```
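As a toy sanity check of that index construction (made-up label vertices and 8 vertices per hemisphere in place of the real 10242):

```python
import numpy as np

n_lh = 8  # stand-in for the 10242 lh vertices of fsaverage
fsaverage_vertices = [np.arange(n_lh), np.arange(n_lh)]
label_lh = np.array([1, 4])  # hypothetical '???-lh' medial-wall vertices
label_rh = np.array([0, 5])  # hypothetical '???-rh' medial-wall vertices

exclude_lh = np.isin(fsaverage_vertices[0], label_lh)
exclude_rh = np.isin(fsaverage_vertices[1], label_rh)
# rh indices are offset by the lh vertex count, as in the snippet above
spatial_exclude = np.concatenate(
    [np.where(exclude_lh)[0], np.where(exclude_rh)[0] + n_lh])
print(spatial_exclude)  # [ 1  4  8 13]
```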
Description of the problem
I would like to perform some spatio-temporal clustering tests on the template `fsaverage` data. At the moment, I am trying to calculate the adjacency matrix for a modified source space (i.e., dipoles in the medial walls are excluded). However, if I modify the source space object, `mne.spatial_src_adjacency` raises an error since `src[0]["use_tris"]` for the modified object returns an empty array.

Steps to reproduce
Link to data
No response
Expected results

`mne.spatial_src_adjacency(src_inv_restrict)` should return a sparse matrix.

Actual results

`mne.spatial_src_adjacency(src_inv_restrict)` returns `*** ValueError: zero-size array to reduction operation maximum which has no identity`

Additional information
none