Hi David,
It looks like someone has already created a Xsens LSL integration: https://github.com/Torres-SMIL/xsens_labstreaminglayer_link Click on the "Releases" link on the right of the page to get the application.
(Note: I found this link by browsing the list of known supported devices; but you can also check labstreaminglayer.org)
Then you'll need a way to create LSL streams for your cameras. Video cameras are always a bit tricky with LSL. First, the data format doesn't support compression, so storing raw video through LSL is quite inefficient. Second, many cameras drop frames, so we can't simply assume that frame events sent to LSL correspond 1:1 to frames captured in a video file.
To get around the first problem, you can instead save the video in its native format and stream only the camera frame timestamps to LSL. However, you have to verify that your camera system either never drops frames (unlikely) or that it provides a way of telling you when it does (e.g., it gives you frame indices and you can note that a frame index has been skipped). If it does drop frames, you might consider setting it up as an "irregular" stream so the XDF import machinery doesn't try to dejitter the timestamps. A sketch of this approach is below.
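Here is a minimal sketch of that "timestamps only" idea using pylsl. The stream name, `source_id`, and the `frame_source` iterator are placeholders for whatever your camera SDK actually provides; the stream is declared with `IRREGULAR_RATE` so XDF importers keep the timestamps as-is, and gaps in the frame counter are flagged as dropped frames.

```python
from pylsl import StreamInfo, StreamOutlet, local_clock, IRREGULAR_RATE

def stream_frame_markers(frame_source, stream_name="Camera1_FrameMarkers"):
    """Push one LSL marker per captured frame, carrying the frame index.

    frame_source is assumed to yield (frame_index, image) tuples from your
    camera SDK; the video itself should be written to disk separately in
    its native format.
    """
    info = StreamInfo(
        name=stream_name,
        type="VideoFrameIndex",
        channel_count=1,
        nominal_srate=IRREGULAR_RATE,   # irregular: importers won't dejitter
        channel_format="int32",
        source_id=stream_name,          # should be unique and stable per camera
    )
    outlet = StreamOutlet(info)

    last_index = None
    for frame_index, _image in frame_source:
        now = local_clock()             # LSL clock when the frame arrived
        # detect dropped frames via gaps in the camera's frame counter
        if last_index is not None and frame_index != last_index + 1:
            print(f"dropped frames between {last_index} and {frame_index}")
        last_index = frame_index
        outlet.push_sample([frame_index], now)
```

On the analysis side you can then match the recorded frame indices in the XDF file against the frames in the natively saved video to align the two recordings.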
If you search around on GitHub and elsewhere, you will probably find various conversations about using LSL to sync cameras, and even a couple of implementations.
P.S., before you start writing your own camera software, you might want to check out what DeepLabCut has for your camera system. It might give you a good head start.
Closing as answered and stale.
Hi all,
We have a lab setup with three cameras connected via TCP/IP, and we also use an Xsens system to measure movement. Can LSL handle both and be used to synchronise the data from the two sources? The cameras will be running at 25 Hz and the Xsens sensors at 60 Hz. It would be very helpful for us to synchronise them.
Thank you for any info.
Best,
David