Closed: apadee closed this issue 3 years ago
I have not looked at the code, but what you describe sounds like a bug. I would start by coding a simple test that fails on master -- for example a two-epoch `EpochsArray` where -0.5 to 0 has some signal that is coherent and 0 to 0.5 does not -- and show that it fails there. Then when you write your code you already have the test in place that should pass once things are fixed. A PR for this would be great!
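The suggested failing test could be sketched along these lines. This is a minimal, numpy-only construction of the two-epoch data (the `mne.EpochsArray` wrapper and the actual `spectral_connectivity` call are left out so the sketch stays self-contained); all values here are illustrative assumptions, not code from the repository:

```python
import numpy as np

rng = np.random.default_rng(0)
sfreq = 250.0
times = np.linspace(-0.5, 0.5, 250, endpoint=False)
pre = times < 0  # the window that should show coherence

# Two epochs, two channels: before t=0 both channels carry the same 10 Hz
# oscillation (plus a little noise); from t=0 on they are independent noise.
data = rng.standard_normal((2, 2, times.size))
common = np.sin(2 * np.pi * 10.0 * times[pre])
data[:, :, pre] = common + 0.1 * rng.standard_normal((2, 2, pre.sum()))

# Sanity check of the construction: channels correlate strongly pre-stimulus
# and only weakly post-stimulus. A real test would wrap `data` in
# mne.EpochsArray (with tmin=-0.5) and compare spectral_connectivity results
# for tmin/tmax = (-0.5, 0) versus (0, 0.5).
r_pre = np.corrcoef(data[0, 0, pre], data[0, 1, pre])[0, 1]
r_post = np.corrcoef(data[0, 0, ~pre], data[0, 1, ~pre])[0, 1]
```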
Describe the bug
I'm trying to compute EEG functional connectivity in various time intervals relative to the stimulus. I run into a problem with the `tmin` and `tmax` arguments in `mne.connectivity.spectral_connectivity`.

If the data object passed to `mne.connectivity.spectral_connectivity` is of type `mne.Epochs`, then `spectral._prepare_connectivity` should use the times from `mne.Epochs.times`. Otherwise, `times` is generated using `np.linspace()` from the sampling frequency `sfreq` and the number of data samples (`n_times_in`). This is done in `_prepare_connectivity` with a subset of the data, which itself tries to create `times_in` through `_get_and_verify_data_sizes`.

Here, I want to get from -0.4 to -0.1 seconds relative to the event (a 300 ms span ending 100 ms before the event). Here is an evoked plot with the full time axis, with the interval in which I want to compute connectivity marked.
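To illustrate the mismatch this causes, a small sketch comparing a zero-based time axis against the real epoch axis (the exact `np.linspace` call inside `_prepare_connectivity` may differ slightly; the numbers here are just for the example):

```python
import numpy as np

sfreq = 250.0
n_times_in = 250  # one second of data per epoch

# Time axis as spectral_connectivity regenerates it: it starts at 0
times = np.linspace(0, n_times_in / sfreq, n_times_in, endpoint=False)

# Time axis the epochs actually have, e.g. epochs.times running -0.5 .. 0.5 s
epoch_times = times - 0.5

# A tmin/tmax window of -0.4 to -0.1 s selects nothing on the zero-based
# axis, but selects the intended ~300 ms on the real epoch axis.
mask_zero_based = (times >= -0.4) & (times <= -0.1)
mask_epoch = (epoch_times >= -0.4) & (epoch_times <= -0.1)
```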
However, the time points from `epochs.times` are lost in the execution of `spectral._prepare_connectivity` and replaced with a generated time axis. This makes the time points provided in `tmin` and `tmax` no longer consistent with the time axis of the data object.

A possible fix for this issue is to pass `times_in` to both `spectral._prepare_connectivity` and `spectral._get_and_verify_data_sizes`, with which I then get my expected results. Something like:

Definition of `_prepare_connectivity`:

Calling `_prepare_connectivity`:
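Since the original snippets did not survive here, the following is only a simplified sketch of the idea behind the fix, not MNE's actual code or signature (the function name and parameters are illustrative): prefer a caller-supplied time axis, and fall back to the zero-based `np.linspace` axis only when none is given.

```python
import numpy as np

def _prepare_times_sketch(n_times_in, sfreq, tmin, tmax, times_in=None):
    """Illustrative stand-in for the time handling in _prepare_connectivity:
    use the caller-supplied axis (e.g. epochs.times) when available, and
    only fall back to a zero-based axis otherwise."""
    if times_in is None:
        # current behaviour: a generated axis that always starts at 0,
        # losing the offset encoded in epochs.times
        times_in = np.linspace(0, n_times_in / sfreq, n_times_in,
                               endpoint=False)
    tmin_true = times_in[0] if tmin is None else max(tmin, times_in[0])
    tmax_true = times_in[-1] if tmax is None else min(tmax, times_in[-1])
    mask = (times_in >= tmin_true) & (times_in <= tmax_true)
    return times_in[mask], mask

# Passing the real epoch axis makes tmin=-0.4, tmax=-0.1 select the
# intended pre-stimulus window instead of an empty one.
epoch_times = np.linspace(-0.5, 0.5, 250, endpoint=False)
times, mask = _prepare_times_sketch(epoch_times.size, 250.0, -0.4, -0.1,
                                    times_in=epoch_times)
```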
Steps to reproduce
Expected results

I would expect the `times` from the `epochs` object to be used, rather than time points generated from 0.

Actual results
With a bit of guidance, I'm happy to make a PR to resolve the issue :)
Additional information