Hi - My setup has a Cognionics headset and a heart rate monitor connected to the same PC via Bluetooth, with each peripheral sending its data as an LSL stream that is recorded by the LabRecorder app on the same PC. Because this is a single-PC setup, I am having trouble understanding the clock offset measurements that I'm seeing in my xdf file and how they relate to latency and jitter.
My code to analyze this xdf file is in python so I make use of the pyxdf library.
import pyxdf

# Read the raw XDF recording
in_file = 'baseline.xdf'
data, header = pyxdf.load_xdf(in_file)

# Disaggregate the streams by type
for stream in data:
    [stream_name] = stream['info']['name']
    [stream_type] = stream['info']['type']
    if stream_type == 'ECG':
        ecg_stream = stream
    elif stream_type == 'EEG':
        eeg_stream = stream
When I examine ecg_stream, I see a series of timestamps and clock offsets. I can also see a time_series of what I assume is the time at which each sample is received.
After reading the FAQs and the Time Synchronization page, I still have the following questions.
I have a series of timestamps in 'time_series' for each peripheral. I assume these are the local CPU clock timestamps without any corrections applied? Per the docs: "If the user of the API does not provide a time stamp for a sample (e.g., obtained from the LSL clock), it will be implicitly time-stamped by the library at the time of submission."
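For what it's worth, here is my current understanding of the related dejittering step, sketched on synthetic data (the 10 Hz rate, sample count, and 5 ms noise level are all made up). As I understand it, pyxdf's dejitter_timestamps option roughly amounts to regressing the arrival timestamps on the sample index, since the device itself samples at a steady rate:

```python
import numpy as np

# Hypothetical: 100 samples at a nominal 10 Hz, with timestamps that
# carry transmission jitter (as they would when the library stamps each
# sample on arrival rather than at acquisition time).
rng = np.random.default_rng(0)
nominal_rate = 10.0
n = 100
ideal = np.arange(n) / nominal_rate
raw_stamps = ideal + rng.normal(0, 0.005, n)  # ~5 ms of jitter

# Fit a straight line of timestamp vs. sample index (a simplified
# version of what dejittering does).
idx = np.arange(n)
slope, intercept = np.polyfit(idx, raw_stamps, 1)
dejittered = intercept + slope * idx

print(np.std(np.diff(raw_stamps)))   # on the order of the jitter
print(np.std(np.diff(dejittered)))   # effectively zero
```

Is that roughly right, i.e. do the corrected stamps I see in the file already have this applied?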
Because I am only using a single computer for data capture and recording, what does the clock offset measurement actually measure? "The clock offsets are measured using a protocol that is similar to the Network Time Protocol. Each offset measurement involves a brief sequence of n UDP packet exchanges (n=8 by default) between the two involved computers."
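From poking at the loaded streams, I believe pyxdf exposes the recorded offset measurements per stream as stream['clock_times'] and stream['clock_values'], and that synchronize_clocks=True adds an interpolated offset to each raw stamp. Here is my mental model on made-up numbers (pyxdf actually fits the offset series more robustly, so plain np.interp is a simplification, and the ~0.1 ms offset magnitude is just my guess for a loopback/single-PC setup):

```python
import numpy as np

# Hypothetical offset series, as I believe pyxdf exposes it per stream:
# clock_times  - LSL times at which offsets were measured
# clock_values - measured offsets (recorder clock minus outlet clock)
clock_times = np.array([0.0, 5.0, 10.0])
clock_values = np.array([1.0e-4, 1.2e-4, 1.1e-4])  # ~0.1 ms, single-PC scale

time_stamps = np.linspace(0.0, 10.0, 11)  # raw sample stamps

# Estimate the offset at each sample time and add it to the raw stamp.
offsets_at_samples = np.interp(time_stamps, clock_times, clock_values)
corrected = time_stamps + offsets_at_samples
```

If that is right, then on a single PC I would expect the offsets to be tiny and the correction to be nearly a no-op — is that the intended behavior?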
Is there any way to estimate latency and jitter based on the data contained in this xdf file?
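My tentative understanding is that a constant latency (the delay between acquisition and timestamping) cannot be recovered from the file alone, but that jitter shows up as spread in the intervals between consecutive raw (non-dejittered) timestamps. This is the estimate I had in mind, run here on synthetic stamps (the 250 Hz rate and 0.5 ms noise are invented):

```python
import numpy as np

def interval_jitter(time_stamps):
    """Mean and std. dev. of inter-sample intervals; the std. dev.
    is a simple proxy for timestamping jitter."""
    intervals = np.diff(time_stamps)
    return np.mean(intervals), np.std(intervals)

# Synthetic 250 Hz stream with ~0.5 ms of timestamping noise.
rng = np.random.default_rng(1)
stamps = np.arange(1000) / 250.0 + rng.normal(0, 0.0005, 1000)
mean_dt, jitter = interval_jitter(stamps)
```

Would applying this to the stream's raw time_stamps give a meaningful jitter figure, or does Bluetooth buffering make it mostly measure the transport burstiness instead?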
How would I go about estimating the # of samples that were dropped / lost, if any?
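The approach I was considering is to compare the actual sample count against what the nominal rate implies over the recording's span, and to flag gaps much longer than one sample period. A sketch on synthetic data (the 100 Hz rate and the deleted block of 10 samples are fabricated; for a real stream I assume the rate would come from stream['info']['nominal_srate']):

```python
import numpy as np

def estimate_dropped(time_stamps, nominal_rate):
    """Estimate missing samples from the nominal rate, and locate
    gaps longer than 1.5 sample periods."""
    duration = time_stamps[-1] - time_stamps[0]
    expected = int(round(duration * nominal_rate)) + 1
    missing = expected - len(time_stamps)
    gaps = np.flatnonzero(np.diff(time_stamps) > 1.5 / nominal_rate)
    return missing, gaps

# Synthetic 100 Hz stream with 10 samples knocked out of the middle.
stamps = np.arange(500) / 100.0
stamps = np.delete(stamps, np.arange(200, 210))
missing, gaps = estimate_dropped(stamps, 100.0)
```

Is this a reasonable approach, or is there a more direct indicator of loss in the xdf file itself?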
Thanks!