Closed: tyounger24 closed this issue 4 years ago.
Could you elaborate on "does not work"? Got any GST_DEBUG logs?
I've been testing an extremely similar pipeline to your last snippet and it works fine. The only differences are that I'm using 480p videos from a file instead of the network, 854x480 sinks and a 1708x960 output, and imxipuvideosink.
Actually, could it just be that you're missing a queue between the compositor and the sink?
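In a compiled application that queue has to be created, added, and linked explicitly. A minimal sketch, assuming compositor and sink elements already built and named `comp` and `sink` (names illustrative, not from the actual code):

```c
#include <gst/gst.h>

/* Hypothetical fragment: decouple the compositor from the sink with a queue.
 * "comp" and "sink" stand in for whatever compositor/sink elements were built. */
static gboolean link_with_queue(GstBin *pipeline, GstElement *comp, GstElement *sink)
{
    GstElement *q = gst_element_factory_make("queue", NULL);
    if (!q)
        return FALSE;
    gst_bin_add(pipeline, q);
    /* compositor src pad -> queue -> sink */
    return gst_element_link_many(comp, q, sink, NULL);
}
```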
I've done some further development and have more insight into the issue.
As can be seen, my application expects 4 different video cameras. Unfortunately, I only have two cameras for development. In fact, my application differs slightly from the example in a way that turns out to be critical: it uses /SubStream for all cameras (MainStream is intended for the full-screen view, SubStream for the quad view).
I added additional features so that if a camera is not available, I substitute a PNG image indicating such.
After testing this feature, it turns out that the imxg2dcompositor does work.
To reproduce my problem, use two cameras, connecting to each of them twice.
I have been having some challenges creating a fully functioning GStreamer application.
I have prototyped all the functions using gst-launch-1.0, and have had success, but my final solution needs to be compiled into an application.
I have been able to get every single element in my test pipelines to function in a compiled application; the issue is that I can't get all of the elements to work together. When I say I've written the examples into an application, I mean I'm creating all the elements and a pipeline, adding them to the pipeline, linking them, etc., not just passing the pipeline string to a parse command.
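For clarity, the manual construction pattern described above looks roughly like this. The elements here are placeholders, not the actual pipeline in question:

```c
#include <gst/gst.h>

/* Simplified sketch of manual pipeline construction (no gst_parse_launch).
 * Element choices are illustrative only. */
int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstElement *pipeline = gst_pipeline_new("quad-view");
    GstElement *src  = gst_element_factory_make("videotestsrc", "src");
    GstElement *conv = gst_element_factory_make("videoconvert", "conv");
    GstElement *sink = gst_element_factory_make("autovideosink", "sink");
    if (!pipeline || !src || !conv || !sink)
        return -1;

    gst_bin_add_many(GST_BIN(pipeline), src, conv, sink, NULL);
    if (!gst_element_link_many(src, conv, sink, NULL))
        return -1;

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    /* ... run a GMainLoop, watch the bus for errors/EOS, etc. ... */
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```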
For example, I have written the following pipeline into my application:
After sorting out sometimes pads, my application works great.
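For anyone hitting the same thing: sometimes pads (e.g. on rtspsrc or decodebin) only appear once the stream is known, so they have to be linked from a "pad-added" callback rather than at construction time. A sketch, with `depay` standing in for whatever downstream element comes next:

```c
#include <gst/gst.h>

/* Link a dynamically created pad to the downstream element's sink pad.
 * "depay" is a placeholder for the next element in the branch. */
static void on_pad_added(GstElement *src, GstPad *new_pad, gpointer user_data)
{
    GstElement *depay = GST_ELEMENT(user_data);
    GstPad *sinkpad = gst_element_get_static_pad(depay, "sink");

    if (!gst_pad_is_linked(sinkpad))
        gst_pad_link(new_pad, sinkpad);
    gst_object_unref(sinkpad);
}

/* Registration, somewhere during pipeline setup:
 *   g_signal_connect(rtspsrc, "pad-added", G_CALLBACK(on_pad_added), depay);
 */
```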
I have also compiled this into my application:
After sorting out request pads, my application works great.
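The request-pad side looks roughly like this. The `xpos`/`ypos` property names match videomixer/compositor sink pads; the imx compositors may expose different pad properties, so treat this as a sketch:

```c
#include <gst/gst.h>

/* Mixer/compositor elements expose request sink pads ("sink_%u"):
 * request one per input branch and set its placement properties.
 * (On GStreamer >= 1.20, gst_element_request_pad_simple() replaces
 * the deprecated gst_element_get_request_pad().) */
static GstPad *attach_branch(GstElement *mixer, GstElement *branch_tail,
                             gint xpos, gint ypos)
{
    GstPad *mixpad = gst_element_get_request_pad(mixer, "sink_%u");
    GstPad *srcpad = gst_element_get_static_pad(branch_tail, "src");

    g_object_set(mixpad, "xpos", xpos, "ypos", ypos, NULL);
    gst_pad_link(srcpad, mixpad);
    gst_object_unref(srcpad);
    /* Release with gst_element_release_request_pad() on teardown. */
    return mixpad;
}
```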
I have also compiled this into my application:
When $MIXER is set to "videomixer", my application functions correctly; however, CPU usage is high. When $MIXER is set to imxg2dcompositor or imxipucompositor, the pipeline does not work in my application.
What do I need to do differently with the imx compositor than I do with the videomixer?
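For testing both paths from one binary, the mixer can be chosen by factory name at runtime, mirroring the $MIXER substitution from the gst-launch prototype (a hypothetical sketch, not the actual code):

```c
#include <gst/gst.h>

/* Hypothetical: pick the mixer factory by name from the MIXER environment
 * variable, falling back to videomixer if it is unset. */
static GstElement *make_mixer(void)
{
    const gchar *name = g_getenv("MIXER");
    if (name == NULL)
        name = "videomixer";  /* or imxg2dcompositor / imxipucompositor */
    return gst_element_factory_make(name, "mix");
}
```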
If I can get the imx compositor to work with h264 live video, I would expect I could have a simpler pipeline:
This does not work with the videomixer element, though, as videomixer won't scale its inputs.
Does anyone have any pointers on how the videomixer and imx*compositor differ, especially when it comes to the difference between an RTSP live video feed and a videotestsrc?