mne-tools / mne-bids

MNE-BIDS is a Python package that allows you to read and write BIDS-compatible datasets with the help of MNE-Python.
https://mne.tools/mne-bids/
BSD 3-Clause "New" or "Revised" License

write_raw_bids does not produce a valid SET file #1122

Closed: laemtl closed this issue 1 year ago

laemtl commented 1 year ago

Description of the problem

The function write_raw_bids does not produce a valid .set file that EEGLAB can read. When importing a .set file generated by write_raw_bids into EEGLAB, we consistently get the following error:

[Screenshot (23): EEGLAB error message shown when importing the .set file]

Steps to reproduce

import os

import mne
from mne_bids import BIDSPath, write_raw_bids

file = r'C:\Users\Lae\Documents\test_data\DCC0003_444597_V01_FACE.set'
bids_root = r'C:\Users\Lae\Documents\test_data'
os.makedirs(bids_root, exist_ok=True)

# Read the original EEGLAB file without preloading the data
raw = mne.io.read_raw_eeglab(input_fname=file, preload=False, verbose=True)

# Build the BIDS path and write the dataset; the resulting .set file is the
# one EEGLAB cannot open
bids_basename = BIDSPath(subject='001', session='V1', task='test',
                         acquisition='eeg', run=None, root=bids_root)
write_raw_bids(raw, bids_basename, allow_preload=False, overwrite=False,
               verbose=True)

Expected results

A valid EEGLAB .set file.

Actual results

The file can't be opened; see the error above.

Additional information

Platform: Windows-10-10.0.19041-SP0
Python: 3.8.5 (tags/v3.8.5:580fbb0, Jul 20 2020, 15:57:54) [MSC v.1924 64 bit (AMD64)]
Executable: C:\Users\Lae\Documents\hbcd-eeg2bids\Scripts\python.exe
CPU: Intel64 Family 6 Model 142 Stepping 9, GenuineIntel: 4 cores
Memory: Unavailable (requires "psutil" package)

mne: 1.2.0
numpy: 1.20.3 {OpenBLAS 0.3.13.dev with 4 threads}
scipy: 1.9.2
matplotlib: 3.6.1 {backend=TkAgg}
sklearn: 1.1.2
numba: 0.56.2
nibabel: Not found
nilearn: Not found
dipy: Not found
openmeeg: Not found
cupy: Not found
pandas: 1.5.0
pyvista: Not found
pyvistaqt: Not found
ipyvtklink: Not found
vtk: Not found
qtpy: Not found
ipympl: Not found
pyqtgraph: Not found
pooch: v1.6.0

mne_bids: 0.12
mne_nirs: Not found
mne_features: 0.2.1
mne_qt_browser: Not found
mne_connectivity: Not found
mne_icalabel: Not found

agramfort commented 1 year ago

it's likely a bug in https://github.com/jackz314/eeglabio

can you open an issue there?


sappelhoff commented 1 year ago

cc @jackz314

jackz314 commented 1 year ago

I don't think mne-bids uses eeglabio for exporting to the SET format; it looks like it uses an internal function, copyfile_eeglab, and the bug should be fixed there.

@laemtl you might want to consider the Raw.export function instead for now, which uses eeglabio to export to eeglab format.
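
A minimal sketch of that workaround, assuming the recording can be loaded into an MNE Raw object first (the file names below are illustrative):

import mne

# Load the original EEGLAB recording; export() needs access to the data
raw = mne.io.read_raw_eeglab("DCC0003_444597_V01_FACE.set", preload=True)

# fmt="eeglab" makes Raw.export write the file via eeglabio
raw.export("sub-001_task-test_eeg.set", fmt="eeglab", overwrite=True)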

laemtl commented 1 year ago

Thanks @jackz314, I will give it a try! I believe the issue happens here, where savemat is used: https://github.com/mne-tools/mne-bids/blob/213c3789390dfacc285418d8f54c6575ca104e7f/mne_bids/copyfiles.py#L556 I also tracked down the cause: the error I'm seeing comes from EEG.srate and EEG.xmin being stored as uint16 and uint8.
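
For reference, a quick way to confirm those dtypes in a written .set file (a sketch only; the path is illustrative, and depending on how the file was saved the fields may sit at the top level instead of under an "EEG" key):

from scipy.io import loadmat

mat = loadmat("sub-001_ses-V1_task-test_acq-eeg_eeg.set")
eeg = mat["EEG"] if "EEG" in mat else mat

# Reported above as uint16 / uint8 rather than the doubles EEGLAB expects
print(eeg["srate"][0, 0].dtype)
print(eeg["xmin"][0, 0].dtype)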

sappelhoff commented 1 year ago

thanks @jackz314 I somehow forgot about that.

@laemtl -- interesting: do you think we could fix that error while still using savemat?

laemtl commented 1 year ago

@sappelhoff @jackz314 What about using eeglabio? Since its last release it does a pretty good job of converting all the values back to float.

sappelhoff commented 1 year ago

My principle with all the copyfile functions that we have is:

That said, if we don't find an easy fix for our copyfile_eeglab, I would be in favor of switching our mne-bids code to use eeglabio.

@laemtl I'd be grateful if you could spend some time to see whether copyfile_eeglab could be fixed for your needs:

https://github.com/mne-tools/mne-bids/blob/0b1a61d7660b35438b78898878ef0c8b51d8716b/mne_bids/copyfiles.py#L498-L573
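
If an in-place fix turns out to be enough, one option is the casting idea discussed above. This is only a sketch, not the eventual patch; the file names and the exact loadmat/savemat options that copyfile_eeglab uses are assumptions:

import numpy as np
from scipy.io import loadmat, savemat

mat = loadmat("sub-001_eeg.set", appendmat=False)
eeg = mat["EEG"]  # (1, 1) MATLAB struct array

# Re-cast the scalar header fields that EEGLAB expects to be doubles
for field in ("srate", "xmin", "xmax"):
    eeg[field][0, 0] = np.asarray(eeg[field][0, 0], dtype=np.float64)

savemat("sub-001_eeg.set", mat, appendmat=False)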

laemtl commented 1 year ago

@sappelhoff My idea is to modify copyfiles.py and use eeglabio's export_mne_raw instead of loadmat/savemat:

import mne
from eeglabio.utils import export_mne_raw  # eeglabio's MNE export helper

raw = mne.io.read_raw(...)
export_mne_raw(raw, "file_name.set")

So the logic in copyfiles.py will remain the same, but the read/write logic will follow the EEGLAB spec more closely and prevent the integer conversion.

sappelhoff commented 1 year ago

Okay, I guess that would be better. Would you be willing to submit a pull request to implement this behavior?

laemtl commented 1 year ago

Sure :)

sappelhoff commented 1 year ago

who knows, maybe this will also fix https://github.com/mne-tools/mne-bids/issues/1120

cc @kaare-mikkelsen

laemtl commented 1 year ago

@sappelhoff Given the side effects of mne.io.read_raw and export_mne_raw, which can lead to errors while reading/saving the data, I opted for a simpler solution. See #1126.

jasmainak commented 1 year ago

The fact that the file could not be copied indicates a problem with the original file, no? I think that's outside the purview of MNE-BIDS.

laemtl commented 1 year ago

@jasmainak Are you talking about #1120?

sappelhoff commented 1 year ago

"The fact that the file could not be copied"

If an .fdt file is present, we don't strictly copy: we read the .set file, edit its file pointer to the .fdt file, and write it back (leaving everything else untouched). I think this is still where the issue with the different integer types occurs. Can you confirm, @laemtl?
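
For context, a minimal sketch of that read/edit/write pattern (illustrative file names, not the literal copyfile_eeglab code):

from scipy.io import loadmat, savemat

mat = loadmat("sub-001_eeg.set", appendmat=False)
eeg = mat["EEG"]

# Point the header at the renamed .fdt data file and write everything back
eeg["data"][0, 0] = "sub-001_eeg.fdt"
savemat("sub-001_eeg.set", mat, appendmat=False)

# Per the discussion above, this loadmat/savemat round trip is where fields
# such as EEG.srate end up with small integer types instead of doubles.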

jasmainak commented 1 year ago

I commented in #1126. Maybe eeglabio is the way to go if it preserves integer types better than savemat? We don't want to take on the maintenance burden of writing EEGLAB files; it's opening a can of worms.

laemtl commented 1 year ago

@sappelhoff Yes, loadmat converts numbers without a decimal part to integer types.

laemtl commented 1 year ago

@jasmainak eeglabio implements the same logic that I did here, casting values in between the loadmat and savemat calls. The issue with eeglabio is that it requires the data to be loaded as an mne.io.BaseRaw object, which parses/rejects the data according to certain criteria, and I think it's preferable to minimize dataset validation/alteration.

jasmainak commented 1 year ago

Okay, I leave the decision to @sappelhoff! It was just a drive-by comment :)

laemtl commented 1 year ago

@jasmainak I edited #1126 based on your comment. Thanks!