wittenator opened this issue 2 years ago
Hi @wittenator, you aren't wrong about the custom GStreamer pipeline; however, the issue is interpreting the open-ended data that comes out of the appsink element. As you can see in the code, I currently have specific logic that deals with the different formats.
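To illustrate what I mean, here is a minimal standalone sketch (not the actual gstCamera code, and videotestsrc stands in for a real camera): whatever comes out of an appsink has to be interpreted by inspecting the negotiated caps, and each format then needs its own handling downstream.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Any pipeline ending in an appsink looks the same from the application side.
pipeline = Gst.parse_launch(
    "videotestsrc num-buffers=1 "
    "! video/x-raw,format=BGRx,width=640,height=480 "
    "! appsink name=sink"
)
sink = pipeline.get_by_name("sink")
pipeline.set_state(Gst.State.PLAYING)

sample = sink.emit("pull-sample")            # blocks until a frame arrives
caps = sample.get_caps().get_structure(0)    # negotiated caps describe the buffer
fmt = caps.get_value("format")               # e.g. "BGRx", "NV12", "I420", ...
width = caps.get_value("width")
height = caps.get_value("height")

# Each format then needs its own logic: channel order, stride/padding,
# whether the memory is system or NVMM, and so on.
print(f"received {width}x{height} frame in format {fmt}")

pipeline.set_state(Gst.State.NULL)
```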
What is the error that you are currently getting with the RGB stream?
Hi @dusty-nv, I ran into a similar issue with a USB camera from The Imaging Source. While I understand there can be issues with sending arbitrary data into the pipeline, there are a lot of cameras out there that use their own GStreamer source (in this case tcambin), so it would be nice to be able to specify a custom launch string. I saw someone else asking about Basler cameras (which use pylonsrc), so it seems like this could be useful, especially with high-resolution cameras that need scaling or other preprocessing.

I made a simple patch that handles a custom:// URI and then takes whatever is specified by --launch-string. Please find it attached in case it is useful.
Here is an example:
detectnet custom:// rtp://239.80.8.8:5406 --headless --threshold=0.35 --launch-string="tcambin ! nvvidconv ! capsfilter caps=video/x-raw(memory:NVMM),format=NV12,width=2880,height=2160"
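For reference, here is a rough standalone sketch of the same idea (this is not the attached patch; the function name and the launch strings are just placeholders): when the input URI uses the custom:// scheme, skip the normal pipeline construction and hand the --launch-string value straight to GStreamer, appending only the appsink that the application reads from.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def build_pipeline(uri, launch_string=None):
    """Hypothetical URI handling: custom:// defers entirely to --launch-string."""
    if uri.startswith("custom://"):
        if not launch_string:
            raise ValueError("custom:// requires --launch-string")
        # The user is responsible for ending the launch string in raw video;
        # only the appsink that frames are pulled from is appended here.
        desc = launch_string + " ! appsink name=sink max-buffers=4 drop=true"
    else:
        # Stand-in for the normal URI handling (csi://, v4l2://, rtp://, ...).
        desc = "videotestsrc ! video/x-raw,format=RGB ! appsink name=sink"
    return Gst.parse_launch(desc)

# Mirrors the command line above, with videotestsrc standing in for tcambin:
pipeline = build_pipeline(
    "custom://",
    launch_string="videotestsrc ! videoconvert ! video/x-raw,format=NV12",
)
pipeline.set_state(Gst.State.PLAYING)
```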
Thanks a lot for the great software! I am currently working with a MIPI CSI-2 camera whose drivers are not yet included in the Jetson kernel, so it can only be used as a v4l2src camera. However, I have a kernel driver that debayers the data from the camera and publishes the images as a BGRx or xRGB stream. It seems that such a use case is currently not modeled by the gstCamera code that builds the GStreamer pipeline string.

Instead of hacking the gstCamera code, I wondered if it would make sense to add the capability to supply a custom GStreamer pipeline to a videoSource, to make the model a bit more flexible. As far as I can see, that would also fit nicely with the DeepStream plugins offered by NVIDIA.
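To make the gap concrete, the per-format handling that is currently missing for my case looks roughly like this (a standalone sketch under the assumption that the driver emits packed 4-bytes-per-pixel frames; gstCamera would presumably implement such conversions differently, e.g. on the GPU):

```python
import numpy as np

def bgrx_to_rgb(raw: bytes, width: int, height: int) -> np.ndarray:
    """Packed BGRx (bytes B, G, R, padding) -> HxWx3 RGB array."""
    frame = np.frombuffer(raw, dtype=np.uint8).reshape(height, width, 4)
    return frame[:, :, [2, 1, 0]].copy()     # drop the padding byte, swap B<->R

def xrgb_to_rgb(raw: bytes, width: int, height: int) -> np.ndarray:
    """Packed xRGB (padding byte first) -> HxWx3 RGB array."""
    frame = np.frombuffer(raw, dtype=np.uint8).reshape(height, width, 4)
    return frame[:, :, 1:4].copy()

# Tiny self-check with a dummy 2x2 BGRx frame (B=1, G=2, R=3, padding=0):
rgb = bgrx_to_rgb(bytes([1, 2, 3, 0] * 4), width=2, height=2)
assert rgb[0, 0].tolist() == [3, 2, 1]
```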