Hello! 👋 Thanks for opening your first issue here! ❤️ We will try to get back to you soon. 🚴🏽♂️
I think it's worth trying. Now we need to see how to handle the fact that STC subsamples the high-resolution FreeSurfer surfaces. I'm not sure how to handle this with GIFTI unless we upsample, creating very large files.
This would be great. Currently using an amalgamation of MNE-C, Freesurfer, and AFNI to get surface estimates into .nii files.
Hello again. @jbteves and I finally have some bandwidth to work more on this.
@agramfort & @larsoner, can you direct us to where the subsampling of the surfaces happens?
Also @jhouck - can you point us to your code that does work-arounds?
@pmolfese I think we're planning on working this into nibabel
First, see the discussions and hopefully join them (next meeting is a week from tomorrow IIRC!) -- it would be nice not to duplicate efforts here:
https://nipy.discourse.group/t/collecting-use-cases/58 https://nipy.discourse.group/t/surface-api-follow-up-meetings/79/9 https://github.com/nipy/nibabel/pull/1056
But to answer your question, the subsampling happens here (and in associated helper functions); it's not the easiest code to read:
I'm interested in joining these discussions; how might I join? I see a standing Zoom room in your second link, so should I simply join at the appropriate time?
@jbteves it would be good to drop a comment in https://nipy.discourse.group/t/surface-api-follow-up-meetings/79/9 to mention your interest just so that other folks know you're coming
@pmolfese the code is not the best. It relies on the fact that a .stc file is basically a set of concatenated FreeSurfer paint (.w) files. With the old MNE-C (2.7.4, ca. 2016), I was using mne_make_movie to extract the surface at the time of interest and export it to a .w file, then FreeSurfer's mri_surf2vol to convert the .w file to a NIfTI file. For GIFTI I think you could use mri_surf2surf. To avoid the old code you can set ftype='w' when you save the SourceEstimate in MNE-Python.
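For reference, a minimal sketch of that last step in MNE-Python (the filename and the 100 ms latency are purely illustrative; this assumes the standard SourceEstimate.save signature with an ftype argument):

import mne

# load an existing source estimate (example filename)
stc = mne.read_source_estimate('sample_audvis-meg')

# .w ("paint") files hold a single time point, so crop to the latency of interest first
stc_t = stc.copy().crop(tmin=0.1, tmax=0.1)

# writes one file per hemisphere (sample_audvis-meg-lh.w / -rh.w) that
# FreeSurfer tools such as mri_surf2vol or mri_surf2surf can then read
stc_t.save('sample_audvis-meg', ftype='w')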
I will add my name to the linked discussion point too. For MNE-nirs I want to move to source reconstruction next and was planning to integrate tightly with nilearn.
@pmolfese why did you prefer to export as GIFTI? I'm just curious as I'm new to this topic. Some of the fNIRS software exports as NIfTI, and I wondered if there was a specific reason for choosing between the two?
@rob-luke - GIFTI is mostly for comfort in handling source data from other software. There are certainly programs for converting between NIfTI and GIFTI. I suspect ultimately both would be useful.
@jbteves you pinged me with an email for a status update here -- so far there has been some progress at the nibabel end. I think our plan is basically to wait for nibabel to make SurfaceImage available, then allow something like stc.as_gifti or so to export a SurfaceImage class from our SourceEstimate class in MNE. (Or maybe someday SourceEstimate can be a wrapper for SurfaceImage -- but that would be more work.) In any case, I think any Gifti export would hopefully come from going STC->SurfaceImage (implemented at the MNE end) then SurfaceImage->Gifti (implemented at the nibabel end).
So I think at this point the bottleneck is at the nibabel end, maybe there is some way you could help their effort?
https://nipy.discourse.group/t/preliminary-notes-on-surface-representations/57 https://nipy.discourse.group/t/surface-api-follow-up-meetings/79 https://github.com/nipy/nibabel/pull/1056
Looks like the next meeting is scheduled for 9AM EST Jan 28th if you want to join and discuss!
Alright, thanks for letting me know!
Hi @larsoner, we're pretty eager to get this working and happy to help the nibabel efforts. From the nibabel meeting this morning and our own poking around in the codebase, we have some questions:
for building the geometry we have found the necessary vertices on the decimated surfaces, but we cannot seem to find the triangles necessary to complete the geometry and thus get ready to build the matching GIFTIs for export.
Any help or pointers you could provide on finding the decimated triangles would be much appreciated!
-Pete (and @jbteves).
for building the geometry we have found the necessary vertices on the decimated surfaces, but we cannot seem to find the triangles necessary to complete the geometry and thus get ready to build the matching GIFTIs for export.
Where did you find the decimated vertices? From stc.vertices? Typically that stc.vertices will correspond to some src[0]['vertno'] from a SourceSpaces instance. If you look at that instance, it should have what you want, e.g.:
>>> import numpy as np
>>> import mne
>>> src = mne.read_source_spaces(mne.datasets.sample.data_path() + '/subjects/sample/bem/sample-oct-6-src.fif')
>>> src[0]['tris'].shape # high-density
(310810, 3)
>>> src[0]['use_tris'].shape # subsampled
(8192, 3)
>>> src[0]['vertno'].shape # vertex numbers
(4098,)
>>> np.array_equal(np.unique(src[0]['use_tris']), np.sort(src[0]['vertno']))
True
Thanks for the response! Sorry, I may be misunderstanding something. We use something like this:
>>> se_rh = mne.read_source_estimate("sub-stc-rh.stc")
>>> right_vert = se_rh.rh_vertno
To index into the source vertices. However, we still need the triangles that represent the faces of a complete mesh/surface; those exist for the high-resolution sample. How might we get those triangles for the low-resolution surface?
We use something like this:
Your stc.rh_vertices should correspond to src[1]['vertno'] in my code above, as long as no vertices were removed during forward computation. Each STC corresponds to some SourceSpaces instance (the one that is preserved in fwd['src'] when it's computed, and inv['src'] when it's computed), and that SourceSpaces instance will (usually) contain both the full-resolution triangulation (src[0]['tris'] for left and src[1]['tris'] for right) and the decimated, low-resolution triangulation (src[0]['use_tris'] and src[1]['use_tris']).
In other words, what I was trying to express with the commands above is "tris is the dense triangulation, use_tris is the decimated one, and the set of vertex numbers in the decimated one matches exactly the vertno attribute". The implied part that is probably the missing link for you is that "the vertno key of the two source space elements should match stc.vertices", which in code would be something like this (for an STC that has no "holes" / missing vertices):
assert np.array_equal(src[0]['vertno'], stc.vertices[0])
assert np.array_equal(src[1]['vertno'], stc.vertices[1])
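To make that missing link concrete, here is a small sketch (under the same no-holes assumption) that rebuilds a self-contained low-resolution left-hemisphere mesh by re-indexing use_tris so it refers to positions within vertno instead of the dense surface:

import numpy as np

s = src[0]                # left-hemisphere surface source space
vertno = s['vertno']      # sorted dense-surface vertex numbers kept after decimation
rr_low = s['rr'][vertno]  # coordinates of the decimated vertices
# use_tris indexes into the dense surface; since vertno is sorted,
# searchsorted remaps each entry to its position in the decimated vertex set
tris_low = np.searchsorted(vertno, s['use_tris'])
# rr_low and tris_low now form a complete low-resolution mesh whose rows
# line up with stc.vertices[0] (given the asserts above hold)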
@larsoner ah, thanks so much! That makes things much more clear for me. I'll hopefully get some toy code together for this soon.
@larsoner after the surface API meeting prior to last week it seemed like the ball was moving back towards "MNE should implement the solution using the new API." While we're waiting for the API in nibabel to be fully implemented, could we merge in a more formalized version of the below gist?
https://gist.github.com/jbteves/384bfdd3777190f59620c3d988a314fd
I imagine some version of src2subsurf (everything before line 50) with a suitable API and naming scheme could live in mne/source_space.py. Are you interested in doing this @jbteves?
For an API/naming, perhaps mne.source_space.get_decimated_surfaces(src) that returned len(surf_source_spaces) (usually == 2) dicts, each with 'rr' and 'tris' keys, would be general enough? The list of dict rather than list of list allows us to later add other entries (e.g., nn) for each surface source space if we realize they're helpful.
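Something along those lines could look like the sketch below (illustrative only; the helper that eventually lands in #10421 may differ in details):

import numpy as np

def get_decimated_surfaces(src):
    """Return a list of dicts with 'rr' and 'tris' per surface source space (sketch)."""
    surfaces = []
    for s in src:
        if s['type'] != 'surf':  # skip volume/discrete source spaces
            continue
        vertno = s['vertno']
        rr = s['rr'][vertno]     # decimated vertex coordinates
        # remap the decimated triangulation from dense-surface indices
        # to indices into the decimated vertex set
        tris = np.searchsorted(vertno, s['use_tris'])
        surfaces.append(dict(rr=rr, tris=tris))
    return surfaces

With the usual two-hemisphere surface source space this returns two dicts (left, right), which leaves room to add keys like 'nn' later.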
Sure, I can take a crack at this @larsoner. Not sure on what precise timeline though.
Opened #10421 for you to take a look.
Should this be reopened? I think we're still missing the "write to gifti" part.
I think the "partially addresses #XYZ" was misread by GitHub, agreed we should reopen
Thanks! For the GIFTI part should I add another function, or are we wanting to wait for the surface API changes?
I think we want nibabel to do the heavy lifting on that, so we'll need to wait at the MNE end for them to have GIFTI export.
One idea of a backward and forward compatible way for us to get this very easily would be to make BaseSourceEstimate a subclass of nibabel's surface API class. If this works it might be a nice way to transition
Wondering if we can revisit this since it's been over a year. Trying to make sure perfect doesn't stand in the way of progress.
We have a working prototype in our specific use cases that may need to be broadened a bit: https://github.com/nimh-cmn/stc2gii_hack
Agreed, I think the nibabel stuff has stalled (?), and in theory as long as we set the API up decently enough we're future-compatible with using their code anyway. So +1 for adding a new method stc.export_gifti(fname, subject=None, subjects_dir=None) to the SourceEstimate class. Then maybe we want a mne.source_estimate.read_gifti(fname, subject=None) or so? That way you could do round-trip testing: after a save-load round trip of the GIFTI you should get back the same data, vertices, etc. (or as much information as you can).
Feel free to open a PR to integrate the "hack" (nice name :smile: ) code into MNE
Thanks @larsoner! We'll also need a method to save the decimated surface as GIFTI. Any preferences on a name (and namespace) for that?
Maybe if you pass fname='something.gii' we should automatically create 'something-lh.gii' for the decimated left surf and 'something-rh.gii' with the decimated right hemi surf. Would that be a reasonable workflow? If not feel free to propose something that would work better / make more sense -- the use case isn't totally clear to me, I've never used or thought much about gifti and what it can/can't do as a format...
Ah sorry, I'm not typing clearly enough even for me!
In GIFTI (I believe) there's the geometry surface file that has the shape, and then we overlay the functional data on top of it. These correspond in MNE terms to the Source Space and the STC, respectively. The functional data (the STC here) doesn't have shape, just the nodes onto which those time data map on the geometry. For the STCs, I agree with stc.export_gifti() for saving the functional data.
But for the downsampled surfaces / geometry I think we'll need to add a case to mne.write_source_spaces() as well, like a format= argument where folks can specify either FIF or GIFTI. We could attempt to get mne.read_source_spaces to also read these downsampled surfaces eventually as well.
For the STCs, I agree with stc.export_gifti() for saving the functional data.
But for the downsampled surfaces / geometry I think we'll need to add a case to mne.write_source_spaces() as well.
Or perhaps a cleaner API would be:
stc.export_gifti(fname, src=fwd['src'])
i.e., require the user to pass a fname filename and the src source space that was used in creating the STC. (There's plenty of precedent for this in viz and morphing functions so I don't think it's too onerous.) And then export_gifti can just write out whatever file(s) with whatever information is needed to actually use with some downstream gifti tool.
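As a rough idea of what such an export_gifti could do internally, here is a hedged sketch built on nibabel's GIFTI objects (the method itself doesn't exist yet; the file-naming scheme, intent codes, and unit conversion are assumptions):

import numpy as np
import nibabel as nib

def export_gifti(stc, fname, src):
    """Sketch: write decimated geometry and functional data as per-hemisphere GIFTIs."""
    base = fname[:-4] if fname.endswith('.gii') else fname
    n_lh = len(stc.vertices[0])  # rows of stc.data are lh vertices then rh vertices
    for hemi_idx, hemi in enumerate(('lh', 'rh')):
        s = src[hemi_idx]
        vertno = s['vertno']
        rr = s['rr'][vertno] * 1000.  # MNE stores meters; surface GIFTIs are typically mm
        tris = np.searchsorted(vertno, s['use_tris'])
        # geometry file: pointset + triangles
        geom = nib.gifti.GiftiImage(darrays=[
            nib.gifti.GiftiDataArray(rr.astype(np.float32),
                                     intent='NIFTI_INTENT_POINTSET'),
            nib.gifti.GiftiDataArray(tris.astype(np.int32),
                                     intent='NIFTI_INTENT_TRIANGLE'),
        ])
        nib.save(geom, f'{base}-{hemi}.surf.gii')
        # functional file: one data array per time point
        data = stc.data[:n_lh] if hemi == 'lh' else stc.data[n_lh:]
        func = nib.gifti.GiftiImage(darrays=[
            nib.gifti.GiftiDataArray(col.astype(np.float32),
                                     intent='NIFTI_INTENT_TIME_SERIES')
            for col in data.T
        ])
        nib.save(func, f'{base}-{hemi}.func.gii')

Usage would then be something like export_gifti(stc, 'sub-01_task-audvis', src=fwd['src']).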
Wanted to get thoughts from others who may find this feature useful, and insights into possible straightforward ways to implement.
Describe the new feature or enhancement
In an effort to compare/overlay results in fMRI and EEG/MEG, several of us are seeking an easy way to export MNE source results to GIFTI (and potentially NIFTI in the future). GIFTI would allow other programs (e.g. AFNI, Freesurfer) to then work with source-space data in useful ways (e.g. constrain solutions based on fMRI, make pretty pictures).
Describe your proposed implementation
Add options to mne.SourceSpaces.save() or, better, a new method .export() to allow the export of GIFTI images. For added convenience, include options for hemisphere, time window of interest, and overwrite. This would be a wrapper around nibabel's GIFTI implementation with the nuances of writing the relevant data handled.
https://nipy.org/nibabel/reference/nibabel.gifti.html
Additional comments
This would be decently useful for folks doing simultaneous EEG-fMRI, joint MEG and fMRI analysis, or anyone interested in taking source estimates from MNE into other software (FieldTrip and Brainstorm both have GIFTI imports).