Could you share the camera name as listed in the APWeb GUI? It might be using the wrong pipeline. The list of pipelines used by APStreamline can be found here.
The way it works is that APStreamline first checks the camera's capabilities, using ioctls to find out which formats and resolutions it supports, and picks the pipeline accordingly. As such, there is no APStreamline pipeline for the video/x-raw caps listed in the gst-launch command you've provided. I can patch this in if you can give me the output of v4l2-ctl --list-formats-ext for the ZED camera. Unfortunately, each target device has a different pipeline for dealing with cameras (e.g. amd64 PCs use x264enc, Tegra/RPi use omxh264enc), so I think I might need to rewrite some other components before adding support for this.
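For reference, here is a minimal, self-contained sketch of the kind of V4L2 capability probe described above. It is not APStreamline's actual code; the /dev/video1 device path is an assumption. It enumerates the same information that v4l2-ctl --list-formats-ext prints.

// Minimal sketch of a V4L2 capability probe (illustrative only).
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstring>
#include <cstdio>

int main()
{
    const char* device = "/dev/video1";   // assumption: the ZED enumerates here
    int fd = open(device, O_RDWR);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    // Enumerate the pixel formats the driver reports.
    v4l2_fmtdesc fmt;
    std::memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    while (ioctl(fd, VIDIOC_ENUM_FMT, &fmt) == 0) {
        std::printf("format %u: %s\n", fmt.index,
                    reinterpret_cast<const char*>(fmt.description));

        // For each format, enumerate the discrete frame sizes it supports.
        v4l2_frmsizeenum size;
        std::memset(&size, 0, sizeof(size));
        size.pixel_format = fmt.pixelformat;
        while (ioctl(fd, VIDIOC_ENUM_FRAMESIZES, &size) == 0) {
            if (size.type == V4L2_FRMSIZE_TYPE_DISCRETE) {
                std::printf("  %ux%u\n", size.discrete.width, size.discrete.height);
            }
            size.index++;
        }
        fmt.index++;
    }
    close(fd);
    return 0;
}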
Here is what I get after starting the RTSP Server:
How can I execute v4l2-ctl --list-formats-ext?
thanks!
Run v4l2-ctl --list-formats-ext in a terminal on the Tegra board or over SSH. The output will list the video formats, framerates, and resolutions the camera supports. You might need to run sudo apt install v4l-utils on the Tegra for this command to work.
Thanks - here you go:
Yes, that would explain it: APStreamline does not support pipelines for cameras using the Bayer format. I don't have a ZED camera on hand to test, so I won't be able to add this myself, but feel free to submit a pull request if you can get it to work by modifying this section of code: https://github.com/shortstheory/adaptive-streaming/blob/8a92c2283d0b8c824a1174c580a74b76644f0819/src/RTSPStreamServer/RTSPAdaptiveStreaming.cpp#L50
Let me know if you have any questions. The logic to automatically detect the ZED camera would need to be added as well.
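As an illustration of the detection piece, a hypothetical sketch that identifies the ZED by the card name its V4L2 driver reports. The real detection logic in RTSPStreamServer is organised differently, and the assumption that the card string contains "ZED" is mine, not the project's.

// Hypothetical sketch of ZED auto-detection by V4L2 card name (assumption,
// not the actual APStreamline detection code).
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstring>

bool is_zed_camera(const char* device)
{
    int fd = open(device, O_RDONLY);
    if (fd < 0) {
        return false;
    }
    v4l2_capability cap;
    std::memset(&cap, 0, sizeof(cap));
    bool zed = false;
    if (ioctl(fd, VIDIOC_QUERYCAP, &cap) == 0) {
        // The driver reports the camera name in the 'card' field.
        zed = std::strstr(reinterpret_cast<const char*>(cap.card), "ZED") != nullptr;
    }
    close(fd);
    return zed;
}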
I would be happy to test, but making all the required changes is a bit over my head, I'm afraid. What about a little tutorial on how to connect other cameras? I'm sure this issue will come up with other webcams too once APStreamline adoption increases.
Sure. This is a great suggestion and it's definitely something I want to look into as the project moves forward. The problem is that I'm not sure what the best way of supporting many different camera types is: looking at their capabilities and using a generic pipeline, or writing a new pipeline for each particular camera? I've seen that the latter tends to give better results, but it might bloat the code more than I would like.
As a starting point, it would be helpful to document how the currently supported camera types are implemented in APStreamline. That would eventually let me write the required extension myself. I doubt that a generic pipeline would work for all the cameras out there. In my view, the best approach would be to put the camera-specific pipeline code into a configuration script that the user can tweak, like in cherrypy, where the streaming pipeline is included in the start_udp_stream.sh script.
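A minimal sketch of that configuration-file idea, assuming a simple name=pipeline text format and a hypothetical camera_pipelines.conf file; neither is part of APStreamline.

// Sketch: keep the launch string for each camera in a plain text file the
// user can edit, and look it up by camera name at runtime.
#include <fstream>
#include <string>

// Each line in camera_pipelines.conf: <camera name>=<GStreamer launch string>
std::string lookup_pipeline(const std::string& camera_name,
                            const std::string& conf_path = "camera_pipelines.conf")
{
    std::ifstream conf(conf_path);
    std::string line;
    while (std::getline(conf, line)) {
        auto pos = line.find('=');
        if (pos != std::string::npos && line.substr(0, pos) == camera_name) {
            return line.substr(pos + 1);
        }
    }
    return "";  // fall back to the built-in pipelines when no entry is found
}

With something like this, a user could change a camera's pipeline by editing the file, without touching the C++ code.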
@shortstheory Hi Arnav, I finally figured out the RTSP pipeline for the ZED:
v4l2src device=/dev/video1 ! video/x-raw,format=YUY2 ! nvvidconv ! video/x-raw(memory:NVMM),width=2560,height=720,format=I420 ! omxh265enc ! rtph265pay name=pay0 pt=96
How can I put this into the APStreamline code? Thanks for your support!
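For context, a self-contained sketch of how that launch string could be served over RTSP using the gst-rtsp-server API directly. This is not how APStreamline wires it up internally; the /zed mount point and the default port are assumptions.

// Standalone sketch: serve the ZED launch string above with gst-rtsp-server.
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main(int argc, char* argv[])
{
    gst_init(&argc, &argv);
    GMainLoop* loop = g_main_loop_new(NULL, FALSE);

    GstRTSPServer* server = gst_rtsp_server_new();
    GstRTSPMountPoints* mounts = gst_rtsp_server_get_mount_points(server);

    GstRTSPMediaFactory* factory = gst_rtsp_media_factory_new();
    // The working ZED pipeline from the comment above, wrapped in "( ... )"
    // as gst_rtsp_media_factory_set_launch expects.
    gst_rtsp_media_factory_set_launch(factory,
        "( v4l2src device=/dev/video1 ! video/x-raw,format=YUY2 ! nvvidconv ! "
        "video/x-raw(memory:NVMM),width=2560,height=720,format=I420 ! "
        "omxh265enc ! rtph265pay name=pay0 pt=96 )");
    gst_rtsp_mount_points_add_factory(mounts, "/zed", factory);
    g_object_unref(mounts);

    gst_rtsp_server_attach(server, NULL);
    g_print("ZED stream ready at rtsp://<jetson-ip>:8554/zed\n");
    g_main_loop_run(loop);
    return 0;
}

A client could then open rtsp://<jetson-ip>:8554/zed in any RTSP-capable player.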
Great to hear! I haven't gotten around to thinking about how support for multiple cameras should be done, as it will require a rewrite of the code to make the camera detection and pipeline setup more modular. I will update the instructions once I've thought of a good way to do this.
Hi @mtbsteve, I have added support for the ZED camera to an upcoming release of APStreamline. It does not support the web app yet (I still need to rewrite the way interprocess communication is done!), but the code is much easier to work with now, and I have made it easier to add new cameras by using config files and inheritance. While it might take me a few weeks to make an official release, you can find the current build of the new version at: https://github.com/shortstheory/adaptive-streaming/tree/refactor202005
I have tested it with the ZED2 camera on the Xavier NX and it works for me!
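Purely as an illustration of the config-files-plus-inheritance approach mentioned above, a hypothetical sketch; the class and method names are assumptions and do not match the actual APStreamline v2.0 sources.

// Illustrative only: a base Camera interface with a ZED specialisation.
#include <string>

class Camera {
public:
    virtual ~Camera() = default;
    // Return the part of the launch string that follows the v4l2src element.
    virtual std::string encode_pipeline(int width, int height, int fps) const = 0;
};

class ZEDCamera : public Camera {
public:
    std::string encode_pipeline(int width, int height, int fps) const override
    {
        // Side-by-side YUY2 frames converted to NVMM memory for the HW encoder.
        return "video/x-raw,format=YUY2,width=" + std::to_string(width) +
               ",height=" + std::to_string(height) +
               ",framerate=" + std::to_string(fps) + "/1 ! nvvidconv ! "
               "video/x-raw(memory:NVMM),format=I420 ! omxh265enc ! "
               "rtph265pay name=pay0 pt=96";
    }
};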
Nice! Thanks! Looking forward to it once it's in master.
Hi, I've now merged in my commits for the v2.0 release of APStreamline! It should now work with the ZED camera out of the box. Please refer to this for the configuration settings. I'll gradually be updating the documentation with the steps for adding a new camera.
I am unable to get a feed from a Stereolabs ZED camera via RTSP. When I start the RTSP server on the video tab in APWeb, the ZED is correctly displayed as cam1 in addition to the Jetson dev board's built-in camera (cam0). However, I cannot display the cam1 video stream on the client machine. RTSP streaming from the built-in camera (cam0) works well.
In contrast, streaming from the ZED camera via UDP works perfectly well. I am using the following GStreamer settings to display the left camera view on a client PC:
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw, width=3840, height=1080 ! videocrop top=0 left=0 right=1920 bottom=0 ! tee name=t ! queue ! videoconvert ! omxh264enc ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.0.1.111 port=5600 t. ! queue ! videoconvert
I assume that the camera settings need to be tweaked somewhere in the APStreamline code. Any ideas what I would need to adapt? Thanks!
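For completeness, a sketch of the client-side receiver implied above (the command shown runs on the Jetson and pushes RTP/H.264 over UDP to the PC at 10.0.1.111:5600). The decoder and sink elements here are generic assumptions for a desktop client, not part of APStreamline.

// Sketch of a UDP/RTP H.264 receiver for the client PC.
#include <gst/gst.h>

int main(int argc, char* argv[])
{
    gst_init(&argc, &argv);
    GError* error = NULL;
    GstElement* pipeline = gst_parse_launch(
        "udpsrc port=5600 caps=\"application/x-rtp,media=video,"
        "clock-rate=90000,encoding-name=H264,payload=96\" ! "
        "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink",
        &error);
    if (!pipeline) {
        g_printerr("Failed to build pipeline: %s\n", error->message);
        return 1;
    }
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    GMainLoop* loop = g_main_loop_new(NULL, FALSE);
    g_main_loop_run(loop);
    return 0;
}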