sametdumankaya opened this issue 2 years ago
Hi 👋 Thanks for reaching out and opening your first issue here! We'll try to come back to you as soon as possible. ❤️
Hi, could you attach a piece of reproducible code and data for us to see what's going on? Thanks
Hello @DominiqueMakowski, I am also facing the same issue at a 500 Hz sampling frequency. I have 88,000 ECG data points from PhysioNet with various sampling frequencies. I used upsampling and downsampling to bring them all to 500 Hz. The problem arises when I try to use NeuroKit's ecg_analyze function on them:
df, info = nk.ecg_process(data[1], sampling_rate=500)
R_peaks = info['ECG_R_Peaks']
temp_df = nk.ecg_analyze(df,sampling_rate=500)
Here data holds the twelve leads of the ECG signal and I am passing only lead 2 as data[1]. But it raises this error for some of the ECG signals. I have also pasted the error here:
NeuroKit error: the window cannot contain more data points than the time series. Decrease 'windows'.
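In case it helps with reproduction, here is a minimal sketch of my pipeline. The simulated signal (nk.ecg_simulate) is only a stand-in for a PhysioNet lead since I cannot attach the data here, and it will not necessarily trigger the error on its own; the failure seems to depend on how many R-peaks end up in a given record.

import neurokit2 as nk

# Stand-in for one PhysioNet lead: simulate ECG at 360 Hz, then resample to 500 Hz
ecg = nk.ecg_simulate(duration=180, sampling_rate=360, heart_rate=70)
ecg_500 = nk.signal_resample(ecg, sampling_rate=360, desired_sampling_rate=500)

# Same calls as above; ecg_analyze is where the window error is raised for some records
df, info = nk.ecg_process(ecg_500, sampling_rate=500)
R_peaks = info["ECG_R_Peaks"]
temp_df = nk.ecg_analyze(df, sampling_rate=500)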
Having the same issue using the WESAD dataset's PPG data (64 Hz) and ECG data (700 Hz) windowed into 10-second segments. After windowing Subject 10, I end up with 549 windows of data; ECG has 7000 data points per window, PPG has 640.
Interestingly, the code runs and retrieves HRV measures from the peaks for two iterations on PPG and nearly seventy iterations on ECG, then fails.
I'm happy to supply code if needed.
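For context, the loop is roughly shaped like this (a sketch with placeholder variable names, not my exact code):

import neurokit2 as nk

fs_ecg = 700          # WESAD chest ECG sampling rate
win = 10 * fs_ecg     # 10-second windows -> 7000 samples each

# Split the full ECG recording into consecutive 10-second segments
segments = [ecg_signal[i:i + win] for i in range(0, len(ecg_signal) - win + 1, win)]

hrv_per_window = []
for segment in segments:
    cleaned = nk.ecg_clean(segment, sampling_rate=fs_ecg)
    peaks, info = nk.ecg_peaks(cleaned, sampling_rate=fs_ecg)
    # Fails with the window error on one of the later segments
    hrv_per_window.append(nk.hrv(peaks, sampling_rate=fs_ecg))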
Hi @ZacDair, apologies for the late reply! Could you please attach your code as well as data so that we can try reproducing the error?
Hi, I have the same issue. Is there any solution for it? Best regards.
Hi everyone, @ZacDair, @Soumadip-Saha, I found the source of this error: it shows up when the number of RR intervals (RRI) is not the same across windows, so the default DFA windows can end up larger than the RRI series itself. To work around it, you can replace _hrv_dfa with the following version, which derives the window boundaries from the length of the RRI series:

def _hrv_dfa(peaks, rri, out, n_windows="default", **kwargs):
    if "dfa_windows" in kwargs:
        dfa_windows = kwargs["dfa_windows"]
    else:
        # Adapt the window boundaries to the number of available RR intervals
        dfa_windows = [(4, rri.shape[0] - 1), (rri.shape[0], None)]

    # Determine max beats
    if dfa_windows[1][1] is None:
        max_beats = len(peaks) / 10
    else:
        max_beats = dfa_windows[1][1]

    # No. of windows to compute for short and long term
    if n_windows == "default":
        n_windows_short = int(dfa_windows[0][1] - dfa_windows[0][0] + 1)
        n_windows_long = int(max_beats - dfa_windows[1][0] + 1)
    elif isinstance(n_windows, list):
        n_windows_short = n_windows[0]
        n_windows_long = n_windows[1]

    # Compute DFA alpha1
    short_window = np.linspace(dfa_windows[0][0], dfa_windows[0][1], n_windows_short).astype(int)
    # For monofractal
    out["DFA_alpha1"] = fractal_dfa(rri, multifractal=False, windows=short_window, **kwargs)[0]
    # For multifractal
    mdfa_alpha1 = fractal_dfa(
        rri, multifractal=True, q=np.arange(-5, 6), windows=short_window, **kwargs
    )[1]
    out["DFA_alpha1_ExpRange"] = mdfa_alpha1["ExpRange"]
    out["DFA_alpha1_ExpMean"] = mdfa_alpha1["ExpMean"]
    out["DFA_alpha1_DimRange"] = mdfa_alpha1["DimRange"]
    out["DFA_alpha1_DimMean"] = mdfa_alpha1["DimMean"]

    # Compute DFA alpha2
    # Sanitize max_beats
    if max_beats < dfa_windows[1][0] + 1:
        warn(
            "DFA_alpha2 related indices will not be calculated. "
            "The maximum duration of the windows provided for the long-term correlation is smaller "
            "than the minimum duration of windows. Refer to the `windows` argument in `nk.fractal_dfa()` "
            "for more information.",
            category=NeuroKitWarning,
        )
        return out
    else:
        long_window = np.linspace(dfa_windows[1][0], int(max_beats), n_windows_long).astype(int)
        # For monofractal
        out["DFA_alpha2"] = fractal_dfa(rri, multifractal=False, windows=long_window, **kwargs)[0]
        # For multifractal
        mdfa_alpha2 = fractal_dfa(
            rri, multifractal=True, q=np.arange(-5, 6), windows=long_window, **kwargs
        )[1]
        out["DFA_alpha2_ExpRange"] = mdfa_alpha2["ExpRange"]
        out["DFA_alpha2_ExpMean"] = mdfa_alpha2["ExpMean"]
        out["DFA_alpha2_DimRange"] = mdfa_alpha2["DimRange"]
        out["DFA_alpha2_DimMean"] = mdfa_alpha2["DimMean"]

    return out
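If you prefer not to edit the installed package, you can also monkey-patch the function at runtime. This is only a sketch under two assumptions: that _hrv_dfa lives in neurokit2.hrv.hrv_nonlinear in your installed version (worth checking), and that the NeuroKitWarning import path below matches your version.

import numpy as np
from warnings import warn

import neurokit2 as nk
import neurokit2.hrv.hrv_nonlinear as hrv_nonlinear_module
from neurokit2.misc import NeuroKitWarning  # import path may differ between versions

fractal_dfa = nk.fractal_dfa  # so the patched function can resolve fractal_dfa

# ... paste the patched _hrv_dfa definition from above here ...

# Replace NeuroKit's internal function with the patched one
hrv_nonlinear_module._hrv_dfa = _hrv_dfa

# ecg_signal is a placeholder for your own recording
df, info = nk.ecg_process(ecg_signal, sampling_rate=500)
results = nk.ecg_analyze(df, sampling_rate=500)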
I have exactly the same problem: Error during ECG processing: NeuroKit error: the window cannot contain more data points than the time series. Decrease 'scale'.
I suggest the issue is reopened, as there is no fix without rewriting part of NeuroKit's functionality, as far as I understand. That defeats the purpose of using NeuroKit.
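Until it is addressed upstream, one interim workaround (not a proper fix) is to skip the nonlinear indices where the DFA windowing happens and compute only the time- and frequency-domain HRV measures, if those are sufficient for your analysis. A rough sketch, where ecg_signal is a placeholder for your own recording:

import pandas as pd
import neurokit2 as nk

cleaned = nk.ecg_clean(ecg_signal, sampling_rate=500)
peaks, info = nk.ecg_peaks(cleaned, sampling_rate=500)

# Time- and frequency-domain HRV only; nk.hrv_nonlinear (which runs the DFA) is skipped
hrv_time = nk.hrv_time(peaks, sampling_rate=500)
hrv_freq = nk.hrv_frequency(peaks, sampling_rate=500)
hrv_indices = pd.concat([hrv_time, hrv_freq], axis=1)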
Hi, would you help us fix it by providing reproducible code and data?
Is this still relevant? If so, what is blocking it? Is there anything you can do to help move it forward?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
Hey, any update on this? I am facing a similar error.
I wanted to analyze my ECG data (sampling rate of 500 Hz), however I came across this bug and I don't know how to update the 'windows' parameter.
Code:
processed_data, info = nk.bio_process(ecg=df["V3"], sampling_rate=500)
results = nk.bio_analyze(processed_data, sampling_rate=500)
Error:
ValueError: NeuroKit error: the window cannot contain more data points than the time series. Decrease 'windows'.
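Until there is a proper fix, a defensive option is to catch the error and skip (or revisit) the affected recordings instead of letting the whole run crash. A sketch along the lines of the code above:

import neurokit2 as nk

processed_data, info = nk.bio_process(ecg=df["V3"], sampling_rate=500)

try:
    results = nk.bio_analyze(processed_data, sampling_rate=500)
except ValueError as error:
    if "window cannot contain more data points" in str(error):
        results = None  # skip this recording, or analyze a longer segment instead
    else:
        raise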