SophieHerbst opened this issue 1 month ago

We have a dataset with specific artifacts and are trying to apply tSSS with `mf_st_duration` set to the full duration of the signal. The problem is that the resting-state run is much shorter than the experimental run, so we cannot set one duration that fits all runs. What would be the best fix for this? Can we set two versions of `mf_st_duration`, one per task? Thanks in advance!
One option would be to internally use `min(st_duration, raw.times[-1])` and document that behavior. Then in your code you could set `st_duration = 10000` or whatever and it would automatically be trimmed to the duration of each run.
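As a standalone illustration of that trimming in plain MNE-Python (the file name is hypothetical; the clamp mirrors the `min()` suggestion above):

```python
import mne

# Hypothetical input file; substitute your own run.
raw = mne.io.read_raw_fif("sub-01_task-rest_raw.fif", preload=True)

# Clamp the tSSS buffer to the run length so a deliberately large
# value (e.g. 10000 s) never exceeds the available data.
st_duration = min(10000.0, raw.times[-1])

raw_tsss = mne.preprocessing.maxwell_filter(raw, st_duration=st_duration)
```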
For now we decided to use the duration of the resting-state run. Would it be worth making the use of `raw.times[-1]` a config option?
Rather than explicitly making that an option, I would follow my comment one above ☝️: you can just set `st_duration = 10000` or whatever.
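In pipeline terms, that would look something like the following in the config file. This is only a sketch, assuming the automatic trimming proposed above is in place; `use_maxwell_filter` and `mf_st_duration` are the relevant pipeline options, and the value is illustrative:

```python
# config.py -- sketch only; assumes st_duration is internally clipped
# via min(st_duration, raw.times[-1]) as proposed above.
use_maxwell_filter = True  # run Maxwell filtering / tSSS
mf_st_duration = 10000.0   # longer than any run; effectively "whole run"
```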