Closed jevogel closed 1 year ago
For future visitors: I was able to resolve my problem by mapping `pleorasink` instances to Ethernet interfaces using MAC addresses (with the `mac` property) instead of IP addresses. From looking at the debug log, it seems there may have been some sort of race condition while the `pleorasink` elements were initializing and searching for the correct Ethernet interface to use.
Actually, that's false. I ultimately need to create four `pleorasink` elements in one pipeline, and using the MAC addresses only worked with three of them.
@jevogel, sorry I never got to look at this issue until now. Great job explaining the situation; if only all reporters gave such detail. To be honest, I've only ever used two streams max. I haven't made significant changes since the commit you mentioned, except that today I'm adding some basic attribute setting and stream-channel selection to allow grabbing both streams from a dual-stream camera. It sounds like you have a workaround, though not a satisfying one; I'll close this if that's the case. Thanks!
### Questions

Background information follows this section. How can I get three GigE Vision streams working from a single pipeline with `pleorasink`?

### Goal

I want to create a GStreamer pipeline that connects a single source to three GigE Vision sinks (`pleorasink`), but I haven't been able to get it to work.

### Test setup
To test the GigE Vision camera emulation, I am running the MATLAB (R2023a) Image Acquisition Explorer on my Windows 11 desktop machine ("host") connected to the MPSoC module running GStreamer on Debian ("target"). I am also using Wireshark on the host for debugging (so I can confirm whether GVCP and/or GVSP packets are being transmitted/received).
I'm using a slightly modified version of gst-plugins-vision, forked from commit 8a5478b344c0284684cc1e566332519695aedb40.

### What works
If I run three pipelines in parallel, I am able to detect all three cameras on the host and stream from each one. Example pipeline command:
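The example command itself didn't survive in this copy of the thread; a single-sink pipeline along these lines (the `videotestsrc` source, the caps, and the MAC address are placeholder assumptions, not the reporter's actual command) is the kind of thing being described:

```shell
# Sketch of one emulated camera per pipeline (placeholders throughout).
# The mac property binds the emulated GigE Vision camera to a specific
# Ethernet interface; substitute a real interface MAC address.
gst-launch-1.0 videotestsrc ! \
  "video/x-raw,format=GRAY8,width=640,height=480,framerate=10/1" ! \
  pleorasink mac="00:11:22:33:44:01"
```

Running three such pipelines in three separate processes is the configuration that works.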
If I run just two `pleorasink`s in one pipeline, I am also able to detect and stream from each camera on the host. Example:

It also worked to add `filesink`s after the `tee` alongside the two `pleorasink`s (that is, both cameras streamed and both video files were valid and showed the same video as the streams).

### What doesn't work
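The two-sink example was also lost in this copy; a sketch of one source fanned out through `tee` to two `pleorasink`s, including the `filesink` variant just mentioned (source element, caps, MACs, and file locations are all placeholder assumptions):

```shell
# Sketch: one source, tee fan-out to two pleorasinks, plus the
# filesink branches described above (all values are placeholders).
gst-launch-1.0 videotestsrc ! \
  "video/x-raw,format=GRAY8,width=640,height=480,framerate=10/1" ! \
  tee name=t \
  t. ! queue ! pleorasink mac="00:11:22:33:44:01" \
  t. ! queue ! pleorasink mac="00:11:22:33:44:02" \
  t. ! queue ! filesink location=cam1.raw \
  t. ! queue ! filesink location=cam2.raw
```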
If I run three `pleorasink`s in one pipeline, I only detect two of them, and neither will stream frames. Example:

It isn't always the first two that are detected; sometimes it's the last two, or the first and the third (CAM1 and CAM3).
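Again the original command is missing here; the failing three-sink case would be the same fan-out with a third branch (element choices and MACs are placeholder assumptions):

```shell
# Sketch of the failing case: three pleorasinks in one pipeline.
gst-launch-1.0 videotestsrc ! \
  "video/x-raw,format=GRAY8,width=640,height=480,framerate=10/1" ! \
  tee name=t \
  t. ! queue ! pleorasink mac="00:11:22:33:44:01" \
  t. ! queue ! pleorasink mac="00:11:22:33:44:02" \
  t. ! queue ! pleorasink mac="00:11:22:33:44:03"
```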
I've also tried changing various properties on `pleorasink` and `queue`, none of which changed my observations:

`pleorasink`:
- `max-lateness=150000000` (with `framerate=10/1` on the source)
- `sync=false`
- `async=false`
- `num-internal-buffers=10`

`queue`:
- `leaky=downstream`
- `max-size-buffers=1`
The only thing that made any difference was setting `auto-multicast=true` on `pleorasink`: with that, all three cameras were detected on the host, but still none of them streamed frames.

Based on this and my reading, I suspect an issue with buffers, synchronization, or threading, but I'm not sure what to look into next.