MICA-MNI / BrainStat

A statistics and context decoding toolbox for neuroimaging.
https://brainstat.readthedocs.io

[FIX] Read permission error in genetic decoding module on Windows. #229

Closed · Bountainmicycle closed this issue 2 years ago

Bountainmicycle commented 2 years ago

Hi,

I'm trying to run the following code from Tutorial 02 of the Python implementation on my Windows machine:

import numpy as np
from brainstat.context.genetics import surface_genetic_expression
from brainstat.datasets import fetch_parcellation, fetch_template_surface

schaefer_400 = fetch_parcellation("fsaverage5", "schaefer", 400)
surfaces = fetch_template_surface("fsaverage5", join=False)

expression = surface_genetic_expression(schaefer_400, surfaces, space="fsaverage5")
print(expression)

The program terminates with the following stack trace:

(ENV) C:\Users\user\Desktop\folder>python example.py
C:\Users\user\ENV\lib\site-packages\nilearn\datasets\__init__.py:93: FutureWarning: Fetchers from the nilearn.datasets module will be updated in version 0.9 to return python strings instead of bytes and Pandas dataframes instead of Numpy arrays.
  warn("Fetchers from the nilearn.datasets module will be "
2021-09-24 14:44:59.327 (   0.678s) [                ] vtkPythonAlgorithm.cxx:112    ERR| vtkPythonAlgorithm (0000012B864B49E0): Failure when calling method: "ProcessRequest":
Traceback (most recent call last):
  File "C:\Users\user\ENV\lib\site-packages\vtkmodules\util\vtkAlgorithm.py", line 152, in ProcessRequest
    return vtkself.ProcessRequest(request, inInfo, outInfo)
  File "C:\Users\user\ENV\lib\site-packages\vtkmodules\util\vtkAlgorithm.py", line 198, in ProcessRequest
    return self.RequestData(request, inInfo, outInfo)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\io_support\gifti_support.py", line 123, in RequestData
    _write_gifti(vtkPolyData.GetData(inInfo[0], 0), self.__FileName)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\decorators.py", line 41, in _wrapper_wrap
    data = func(*args, **kwds)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\io_support\gifti_support.py", line 68, in _write_gifti
    nb.save(g, opth)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\loadsave.py", line 99, in save
    img.to_filename(filename)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\filebasedimages.py", line 333, in to_filename
    self.to_file_map()
  File "C:\Users\user\ENV\lib\site-packages\nibabel\gifti\gifti.py", line 880, in to_file_map
    f = file_map['image'].get_prepare_fileobj('wb')
  File "C:\Users\user\ENV\lib\site-packages\nibabel\fileholders.py", line 70, in get_prepare_fileobj
    obj = ImageOpener(self.filename, *args, **kwargs)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\openers.py", line 113, in __init__
    self.fobj = opener(fileish, *args, **kwargs)
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\user\\AppData\\Local\\Temp\\tmprzrqjss5.gii'
2021-09-24 14:44:59.383 (   0.733s) [                ]       vtkExecutive.cxx:753    ERR| vtkCompositeDataPipeline (0000012B961BB1C0): Algorithm vtkPythonAlgorithm(0000012B864B49E0) returned failure for request: vtkInformation (0000012B9BA8A4B0)
  Debug: Off
  Modified Time: 983
  Reference Count: 2
  Registered Events: (none)
  Request: REQUEST_DATA
  FORWARD_DIRECTION: 0
  ALGORITHM_AFTER_FORWARD: 1
  FROM_OUTPUT_PORT: -1

2021-09-24 14:44:59.414 (   0.764s) [                ] vtkPythonAlgorithm.cxx:112    ERR| vtkPythonAlgorithm (0000012B9BABF7B0): Failure when calling method: "ProcessRequest":
Traceback (most recent call last):
  File "C:\Users\user\ENV\lib\site-packages\vtkmodules\util\vtkAlgorithm.py", line 152, in ProcessRequest
    return vtkself.ProcessRequest(request, inInfo, outInfo)
  File "C:\Users\user\ENV\lib\site-packages\vtkmodules\util\vtkAlgorithm.py", line 198, in ProcessRequest
    return self.RequestData(request, inInfo, outInfo)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\io_support\gifti_support.py", line 123, in RequestData
    _write_gifti(vtkPolyData.GetData(inInfo[0], 0), self.__FileName)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\decorators.py", line 41, in _wrapper_wrap
    data = func(*args, **kwds)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\io_support\gifti_support.py", line 68, in _write_gifti
    nb.save(g, opth)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\loadsave.py", line 99, in save
    img.to_filename(filename)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\filebasedimages.py", line 333, in to_filename
    self.to_file_map()
  File "C:\Users\user\ENV\lib\site-packages\nibabel\gifti\gifti.py", line 880, in to_file_map
    f = file_map['image'].get_prepare_fileobj('wb')
  File "C:\Users\user\ENV\lib\site-packages\nibabel\fileholders.py", line 70, in get_prepare_fileobj
    obj = ImageOpener(self.filename, *args, **kwargs)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\openers.py", line 113, in __init__
    self.fobj = opener(fileish, *args, **kwargs)
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\user\\AppData\\Local\\Temp\\tmpns89h1zj.gii'
2021-09-24 14:44:59.427 (   0.777s) [                ]       vtkExecutive.cxx:753    ERR| vtkCompositeDataPipeline (0000012B961BAEF0): Algorithm vtkPythonAlgorithm(0000012B9BABF7B0) returned failure for request: vtkInformation (0000012B96667E90)
  Debug: Off
  Modified Time: 1162
  Reference Count: 2
  Registered Events: (none)
  Request: REQUEST_DATA
  FORWARD_DIRECTION: 0
  ALGORITHM_AFTER_FORWARD: 1
  FROM_OUTPUT_PORT: -1

If you use BrainStat's genetics functionality, please cite abagen (https://abagen.readthedocs.io/en/stable/citing.html).
Traceback (most recent call last):
  File "C:\Users\user\ENV\lib\site-packages\abagen\images.py", line 352, in check_atlas
    atlas = check_img(atlas)
  File "C:\Users\user\ENV\lib\site-packages\abagen\images.py", line 218, in check_img
    raise TypeError('Provided image must be an existing filepath or a '
TypeError: Provided image must be an existing filepath or a pre-loaded niimg-like object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\user\Desktop\folder\example.py", line 8, in <module>
    expression = surface_genetic_expression(schaefer_400, surfaces, space="fsaverage5")
  File "C:\Users\user\ENV\lib\site-packages\brainstat-0.2.5-py3.9.egg\brainstat\context\genetics.py", line 104, in surface_genetic_expression
    atlas = check_atlas(labels, geometry=surfaces, space=space)
  File "C:\Users\user\ENV\lib\site-packages\abagen\images.py", line 366, in check_atlas
    coords, triangles = check_geometry(geometry, space, donor=donor,
  File "C:\Users\user\ENV\lib\site-packages\abagen\images.py", line 419, in check_geometry
    coords, triangles = map(list, zip(*[
  File "C:\Users\user\ENV\lib\site-packages\abagen\images.py", line 420, in <listcomp>
    load_gifti(img).agg_data() for img in surface
AttributeError: 'str' object has no attribute 'agg_data'

Help would be appreciated.

ReinderVosDeWael commented 2 years ago

Holy 🤬 you weren't kidding when you said it was a long stack trace. It's a bit of a tricky one to decode, so bear with me for the debugging.

First things first, can you just confirm that your drive isn't full? A full drive can lead to issues with temporary files.

What I think is happening is that we create a temporary file with 'w+b' permissions and then try to open it again for reading while it is still open. I've seen before that macOS/Linux will happily allow this but Windows will throw errors - this would also explain why I cannot reproduce it on my MacBook (I, unfortunately, don't have access to a Windows device). So let's first check whether that guess is correct.
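
For reference, here's a minimal sketch of the failure mode I have in mind (illustration only, not BrainStat code; the Python docs note that whether a NamedTemporaryFile can be reopened by name while it is still open is platform-dependent):

import tempfile

with tempfile.NamedTemporaryFile(suffix=".gii") as f:
    # While the NamedTemporaryFile is still open, Windows holds the file with
    # exclusive access, so opening the same path a second time raises
    # PermissionError; macOS/Linux happily allow the second open().
    with open(f.name, "wb") as second:
        second.write(b"data")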

Can you open up brainstat/context/genetics.py and modify the following loop:

Original:

    temp_surfaces: List[Path] = []
    for i, surface in enumerate(surfaces):
        if not isinstance(surface, str) and not isinstance(surface, Path):
            temp_surfaces.append(tempfile.NamedTemporaryFile(suffix=".gii"))
            write_surface(surface, temp_surfaces[i].name, otype="gii")

Modified:

    temp_surfaces: List[Path] = []
    for i, surface in enumerate(surfaces):
        if not isinstance(surface, str) and not isinstance(surface, Path):
            temp_surfaces.append(tempfile.NamedTemporaryFile(suffix=".gii"))
            write_surface(surface, temp_surfaces[i].name, otype="gii")
            temp_surfaces[-1].seek(0)

Does this resolve the issue?

Bountainmicycle commented 2 years ago

There's plenty of space on the drive.

I had a feeling this might be a Windows issue (because it always is).

Unfortunately, the modification didn't help; the program produces exactly the same stack trace.

ReinderVosDeWael commented 2 years ago

Right - let's try this differently. Do you still get the same stack trace if you replace everything up to the line expression = ... with the following? (The idea is to load the GIFTIs into in-memory objects ourselves rather than handing abagen bare file paths.)

    # Deal with the input parameters.
    if isinstance(surfaces, str):
        surfaces = [surfaces]
    elif surfaces is None:
        surfaces = []

    surfaces_gii = []
    for surface in surfaces:
        if not isinstance(surface, str) and not isinstance(surface, Path):
            with tempfile.NamedTemporaryFile(suffix=".gii") as f:
                write_surface(surface, f.name, otype="gii")
                surfaces_gii.append(nib.load(f.name))
        else:
            surfaces_gii.append(nib.load(surface))

    # Use abagen to grab expression data.
    print(
        "If you use BrainStat's genetics functionality, please cite abagen (https://abagen.readthedocs.io/en/stable/citing.html)."
    )
    atlas = check_atlas(labels, geometry=surfaces_gii, space=space)

Bountainmicycle commented 2 years ago

Yep, still the same.

By the way, I reverted the modification you suggested earlier. Is that the correct approach, or should I try all suggested changes together?

ReinderVosDeWael commented 2 years ago

That... is very odd. Can you copy the stack trace just so I can confirm that nothing changed at all?

Bountainmicycle commented 2 years ago

Aah sorry, my bad.

Here's the new one.

(ENV) C:\Users\user\Desktop\folder>python example.py
C:\Users\user\ENV\lib\site-packages\nilearn\datasets\__init__.py:93: FutureWarning: Fetchers from the nilearn.datasets module will be updated in version 0.9 to return python strings instead of bytes and Pandas dataframes instead of Numpy arrays.
  warn("Fetchers from the nilearn.datasets module will be "
2021-09-24 19:46:09.143 (   0.642s) [                ] vtkPythonAlgorithm.cxx:112    ERR| vtkPythonAlgorithm (000001C9DAF0F3D0): Failure when calling method: "ProcessRequest":
Traceback (most recent call last):
  File "C:\Users\user\ENV\lib\site-packages\vtkmodules\util\vtkAlgorithm.py", line 152, in ProcessRequest
    return vtkself.ProcessRequest(request, inInfo, outInfo)
  File "C:\Users\user\ENV\lib\site-packages\vtkmodules\util\vtkAlgorithm.py", line 198, in ProcessRequest
    return self.RequestData(request, inInfo, outInfo)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\io_support\gifti_support.py", line 123, in RequestData
    _write_gifti(vtkPolyData.GetData(inInfo[0], 0), self.__FileName)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\decorators.py", line 41, in _wrapper_wrap
    data = func(*args, **kwds)
  File "C:\Users\user\ENV\lib\site-packages\brainspace\vtk_interface\io_support\gifti_support.py", line 68, in _write_gifti
    nb.save(g, opth)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\loadsave.py", line 99, in save
    img.to_filename(filename)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\filebasedimages.py", line 333, in to_filename
    self.to_file_map()
  File "C:\Users\user\ENV\lib\site-packages\nibabel\gifti\gifti.py", line 880, in to_file_map
    f = file_map['image'].get_prepare_fileobj('wb')
  File "C:\Users\user\ENV\lib\site-packages\nibabel\fileholders.py", line 70, in get_prepare_fileobj
    obj = ImageOpener(self.filename, *args, **kwargs)
  File "C:\Users\user\ENV\lib\site-packages\nibabel\openers.py", line 113, in __init__
    self.fobj = opener(fileish, *args, **kwargs)
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\user\\AppData\\Local\\Temp\\tmpzcn_ic9h.gii'
2021-09-24 19:46:09.205 (   0.704s) [                ]       vtkExecutive.cxx:753    ERR| vtkCompositeDataPipeline (000001C9DAC8EEA0): Algorithm vtkPythonAlgorithm(000001C9DAF0F3D0) returned failure for request: vtkInformation (000001C9E0961FF0)
  Debug: Off
  Modified Time: 983
  Reference Count: 2
  Registered Events: (none)
  Request: REQUEST_DATA
  FORWARD_DIRECTION: 0
  ALGORITHM_AFTER_FORWARD: 1
  FROM_OUTPUT_PORT: -1

Traceback (most recent call last):
  File "C:\Users\user\Desktop\folder\example.py", line 8, in <module>
    expression = surface_genetic_expression(schaefer_400, surfaces, space="fsaverage5")
  File "C:\Users\user\ENV\lib\site-packages\brainstat\context\genetics.py", line 96, in surface_genetic_expression
    surfaces_gii.append(nib.load(f.name))
  File "C:\Users\user\ENV\lib\site-packages\nibabel\loadsave.py", line 46, in load
    raise ImageFileError(f"Empty file: '{filename}'")
nibabel.filebasedimages.ImageFileError: Empty file: 'C:\Users\user\AppData\Local\Temp\tmpzcn_ic9h.gii'

ReinderVosDeWael commented 2 years ago

Right - that one is actually quite different ;). The temporary file is still open when we nib.load it, which is why it shows up as empty. I think replacing everything up to expression = ... with the following should fix the issue; the file is now created with delete=False, closed, and only then loaded and unlinked, so Windows never has to reopen a file that is still held open.

    # Deal with the input parameters.
    if isinstance(surfaces, str):
        surfaces = [surfaces]
    elif surfaces is None:
        surfaces = []

    surfaces_gii = []
    for surface in surfaces:
        if not isinstance(surface, str) and not isinstance(surface, Path):
            with tempfile.NamedTemporaryFile(suffix=".gii", delete=False) as f:
                name = f.name
                write_surface(surface, name, otype="gii")
            surfaces_gii.append(nib.load(name))
            (Path(name)).unlink()
        else:
            surfaces_gii.append(nib.load(surface))

    # Use abagen to grab expression data.
    print(
        "If you use BrainStat's genetics functionality, please cite abagen (https://abagen.readthedocs.io/en/stable/citing.html)."
    )
    atlas = check_atlas(labels, geometry=surfaces_gii, space=space)
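
As an aside, this close-before-reopen dance could be factored into a small helper so we don't repeat it at every call site. A sketch only (closed_tempfile is a hypothetical name, not part of BrainStat):

import tempfile
from contextlib import contextmanager
from pathlib import Path

@contextmanager
def closed_tempfile(suffix: str = ""):
    """Yield the path of a *closed* temporary file so callers can freely
    reopen it (Windows-safe), and delete it on exit."""
    with tempfile.NamedTemporaryFile(suffix=suffix, delete=False) as f:
        name = f.name
    try:
        yield name
    finally:
        Path(name).unlink()

The loop above could then do the write_surface and nib.load calls inside a single with closed_tempfile(suffix=".gii") as name: block.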

Bountainmicycle commented 2 years ago

Yes! That indeed fixed it! Took a little longer to calculate than I expected, but it worked. Thanks!

ReinderVosDeWael commented 2 years ago

Great! It will take especially long on the first run, as abagen has to download all the requisite files. But yeah, even after that it can still take a couple of minutes.

I'll push this change with the next release.

Bountainmicycle commented 2 years ago

While trying to run the meta-analytic context decoding tutorial, I encountered what looks like a similar error.

Code:

from brainstat.context.meta_analysis import surface_decoder
from brainstat.datasets import fetch_mask, fetch_template_surface
from brainstat.tutorial.utils import fetch_abide_data

civet_mask = fetch_mask("civet41k")
civet_surface_mid = fetch_template_surface("civet41k", layer="mid", join=False)
civet_surface_white = fetch_template_surface("civet41k", layer="white", join=False)
subject_thickness, demographics = fetch_abide_data(sites=["PITT"])
thickness = subject_thickness.mean(axis=0)

meta_analysis = surface_decoder(
    civet_surface_mid,
    civet_surface_white,
    [thickness[: len(thickness) // 2], thickness[len(thickness) // 2 :]],
)
print(meta_analysis)

Stack trace:

(ENV) C:\Users\user\Desktop\folder>python example2.py
C:\Users\user\ENV\lib\site-packages\nilearn\datasets\__init__.py:93: FutureWarning: Fetchers from the nilearn.datasets module will be updated in version 0.9 to return python strings instead of bytes and Pandas dataframes instead of Numpy arrays.
  warn("Fetchers from the nilearn.datasets module will be "
Downloading data from https://files.osf.io/v1/resources/mb37e/providers/osfstorage/601dafe77ad0a80119d9483c ...
 ...done. (4 seconds, 0 min)
Extracting data from C:\Users\user\brainstat_data\surface_data\ffe6e19e0ca4fbc83eda2396c68ee5fc\tpl-civet.tar.gz..... done.
Fetching thickness data for subject 56 out of 56: : 56it [02:43,  2.91s/it]
2021-09-26 13:28:31,104 - brainstat - INFO - Fetching Neurosynth feature files. This may take several minutes if you haven't downloaded them yet.
2021-09-26 13:28:31,105 - brainstat - INFO - Downloading Neurosynth data files.
Traceback (most recent call last):
  File "C:\Users\user\Desktop\folder\example2.py", line 11, in <module>
    meta_analysis = surface_decoder(
  File "C:\Users\user\ENV\lib\site-packages\brainstat\context\meta_analysis.py", line 67, in surface_decoder
    feature_files = tuple(_fetch_precomputed(data_dir, database=database))
  File "C:\Users\user\ENV\lib\site-packages\brainstat\context\meta_analysis.py", line 127, in _fetch_precomputed
    return _fetch_precomputed_neurosynth(data_dir)
  File "C:\Users\user\ENV\lib\site-packages\brainstat\context\meta_analysis.py", line 147, in _fetch_precomputed_neurosynth
    with open(zip_file.name, "wb") as fw:
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\user\\brainstat_data\\neurosynth_datac4_1oc4h.zip'

ReinderVosDeWael commented 2 years ago

Yup - that's the same issue. Again, I can't test this myself as I'm not on Windows, so I hope the following works. Change the following two things in brainstat/context/meta_analysis.py:

Exchange function _fetch_precomputed_neurosynth for:

def _fetch_precomputed_neurosynth(data_dir: Path) -> Generator[Path, None, None]:
    """Downloads precomputed Neurosynth features and returns the filepaths."""

    json = read_data_fetcher_json()["neurosynth_precomputed"]
    url = json["url"]

    existing_files = data_dir.glob("Neurosynth_TFIDF__*z_desc-consistency.nii.gz")

    if len(list(existing_files)) != json["n_files"]:
        logger.info("Downloading Neurosynth data files.")
        response = urllib.request.urlopen(url)

        # Open, close, and reopen file to deal with Windows permission issues.
        with tempfile.NamedTemporaryFile(prefix=str(data_dir), suffix=".zip", delete=False) as f:
            name = f.name

        with open(name, "wb") as fw:
            fw.write(response.read())

        with zipfile.ZipFile(name, "r") as fr:
            fr.extractall(data_dir)

        (Path(name)).unlink()

    return data_dir.glob("Neurosynth_TFIDF__*z_desc-consistency.nii.gz")

and I think you'd get the same error again later on, so also replace surface_decoder with:

def surface_decoder(
    pial: Union[str, BSPolyData, Sequence[Union[str, BSPolyData]]],
    white: Union[str, BSPolyData, Sequence[Union[str, BSPolyData]]],
    stat_labels: Union[str, np.ndarray, Sequence[Union[str, np.ndarray]]],
    *,
    interpolation: str = "linear",
    data_dir: Optional[Union[str, Path]] = None,
    database: str = "neurosynth",
) -> pd.DataFrame:
    """Meta-analytic decoding of surface maps using NeuroSynth or NeuroQuery.

    Parameters
    ----------
    pial : str, BSPolyData, sequence of str or BSPolyData
        Path of a pial surface file, BSPolyData of a pial surface or a list
        containing multiple of the aforementioned.
    white : str, BSPolyData, sequence of str or BSPolyData
        Path of a white matter surface file, BSPolyData of a pial surface or a
        list containing multiple of the aforementioned.
    stat_labels : str, numpy.ndarray, sequence of str or numpy.ndarray
        Path to a label file for the surfaces, numpy array containing the
        labels, or a list containing multiple of the aforementioned.
    database : str, optional
        Name of the database to use for decoding, either 'neurosynth' or
        'neuroquery', by default 'neurosynth'.
    interpolation : str, optional
        Either 'nearest' for nearest neighbor interpolation, or 'linear'
        for trilinear interpolation, by default 'linear'.
    data_dir : str, optional
        The directory of the dataset. If none exists, a new dataset will
        be downloaded and saved to this path. If None, the directory defaults to
        your home directory, by default None.

    Returns
    -------
    pandas.DataFrame
        Table with correlation values for each feature.
    """

    data_dir = Path(data_dir) if data_dir else data_directories["NEUROSYNTH_DATA_DIR"]
    data_dir.mkdir(exist_ok=True, parents=True)

    logger.info(
        "Fetching Neurosynth feature files. This may take several minutes if you haven't downloaded them yet."
    )
    feature_files = tuple(_fetch_precomputed(data_dir, database=database))

    mni152 = load_mni152_brain_mask()

    with tempfile.NamedTemporaryFile(suffix=".nii.gz", delete=False) as f:
        name = f.name

    multi_surface_to_volume(
        pial=pial,
        white=white,
        volume_template=mni152,
        output_file=name,
        labels=stat_labels,
        interpolation=interpolation,
    )

    stat_volume = nib.load(name)
    mask = (stat_volume.get_fdata() != 0) & (mni152.get_fdata() != 0)
    stat_vector = stat_volume.get_fdata()[mask]

    feature_names = []
    correlations = np.zeros(len(feature_files))

    logger.info("Running correlations with all Neurosynth features.")
    for i in range(len(feature_files)):
        feature_names.append(re.search("__[A-Za-z0-9]+", feature_files[i].stem)[0][2:])  # type: ignore
        feature_data = nib.load(feature_files[i]).get_fdata()[mask]
        keep = np.logical_not(
            np.isnan(feature_data)
            | np.isinf(feature_data)
            | np.isnan(stat_vector)
            | np.isinf(stat_vector)
        )
        correlations[i], _ = pearsonr(stat_vector[keep], feature_data[keep])

    df = pd.DataFrame(correlations, index=feature_names, columns=["Pearson's r"])
    return df.sort_values(by="Pearson's r", ascending=False)

Bountainmicycle commented 2 years ago

This indeed seems to have fixed the current problem, but a new issue emerged, related to pyembree (which I had just installed alongside Cython, since neither is in the requirements.txt).

Stack trace:

(ENV) C:\Users\\Desktop\folder>python example2.py
C:\Users\\ENV\lib\site-packages\nilearn\datasets\__init__.py:93: FutureWarning: Fetchers from the nilearn.datasets module will be updated in version 0.9 to return python strings instead of bytes and Pandas dataframes instead of Numpy arrays.
  warn("Fetchers from the nilearn.datasets module will be "
Fetching thickness data for subject 56 out of 56: : 56it [00:12,  4.38it/s]
2021-09-26 20:06:27,105 - brainstat - INFO - Fetching Neurosynth feature files. This may take several minutes if you haven't downloaded them yet.
C:\Users\\ENV\lib\site-packages\nilearn\datasets\struct.py:360: FutureWarning: Default resolution of the MNI template will change from 2mm to 1mm in version 0.10.0
  warnings.warn("Default resolution of the MNI template will change "
Traceback (most recent call last):
  File "C:\Users\\Desktop\folder\example2.py", line 11, in <module>
    meta_analysis = surface_decoder(
  File "C:\Users\\ENV\lib\site-packages\brainstat\context\meta_analysis.py", line 74, in surface_decoder
    multi_surface_to_volume(
  File "C:\Users\\ENV\lib\site-packages\brainstat\mesh\interpolate.py", line 169, in multi_surface_to_volume
    surface_to_volume(
  File "C:\Users\\ENV\lib\site-packages\brainstat\mesh\interpolate.py", line 61, in surface_to_volume
    ribbon_points = cortical_ribbon(pial_mesh, wm_mesh, volume_template)
  File "C:\Users\\ENV\lib\site-packages\brainstat\mesh\interpolate.py", line 255, in cortical_ribbon
    pial_trimesh = trimesh.ray.ray_pyembree.RayMeshIntersector(
  File "C:\Users\\ENV\lib\site-packages\trimesh\exceptions.py", line 28, in __getattribute__
    raise super(ExceptionModule, self).__getattribute__('exc')
  File "C:\Users\\ENV\lib\site-packages\trimesh\ray\__init__.py", line 5, in <module>
    from . import ray_pyembree
  File "C:\Users\\ENV\lib\site-packages\trimesh\ray\ray_pyembree.py", line 9, in <module>
    from pyembree import __version__ as _ver
ImportError: cannot import name '__version__' from 'pyembree' (unknown location)

ReinderVosDeWael commented 2 years ago

Did you install pyembree as described in the installation guide (https://brainstat.readthedocs.io/en/master/generic/install.html)? If so, then unfortunately this is an issue in trimesh that I cannot solve within BrainStat. In the short term, the MATLAB implementation of the surface decoder is independent of pyembree, so that may provide a solution.
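
If it helps with debugging, trimesh records at import time whether its embree backend loaded. A quick check (assuming trimesh's has_embree flag, which I believe recent versions expose):

import trimesh.ray

# True only if trimesh's internal "from pyembree import __version__"
# succeeded; False means the ray_pyembree backend is unavailable.
print(trimesh.ray.has_embree)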

Bountainmicycle commented 2 years ago

I'm currently not working with Anaconda, so I installed it from the repo (https://github.com/scopatz/pyembree). The repo's README also refers to the conda installation, so I assumed a manual installation would behave identically. Unfortunately, I don't have MATLAB, so that's not an option for me. Thanks for your help.

ReinderVosDeWael commented 2 years ago

Yeah, unfortunately I don't think you will be able to use the surface decoder without installing conda.