Open rwang5203 opened 1 year ago
one path to create a streaming setup with Basler cameras is to use the gstreamer plugin in gst-plugin-pylon: https://github.com/basler/gst-plugin-pylon
It is focused on 100% GStreamer API usage of Basler cameras.
For a hybrid API approach, we would recommend pypylon together with an appsrc element.
Hi Thies, thanks for your reply! I tried the pylon GStreamer plugin. When I tried to encode the stream with H.265 using the omxh265enc element, it said "WARNING: erroneous pipeline: could not link pylonsrc0 to omxh265enc-omxh265enc0, ..."
My command is
gst-launch-1.0 pylonsrc ! "video/x-raw,width=1920,height=1080,framerate=120/1,format=YUY2" ! omxh265enc ! videoconvert ! xvimagesink
. Did I do something wrong here?
The pipeline you listed can't work, because it tries to pipe H.265-encoded video directly into a display sink.
You will have to work through the options of the encoding plugin to achieve low latency and low jitter.
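To preview locally while still exercising the encoder, one option is to decode immediately after encoding. A sketch, assuming an NVIDIA Jetson platform where the OMX elements omxh265enc/omxh265dec are available (element names vary by platform):

```shell
# Encode, then decode again so the display sink receives raw video
# (xvimagesink cannot consume H.265 directly). videoconvert sits
# before the encoder, since omxh265enc typically does not accept
# YUY2 input directly.
gst-launch-1.0 pylonsrc \
  ! "video/x-raw,width=1920,height=1080,framerate=120/1,format=YUY2" \
  ! videoconvert \
  ! omxh265enc \
  ! h265parse \
  ! omxh265dec \
  ! videoconvert \
  ! xvimagesink sync=false
```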
Regarding your camera-to-encoder connection: which platform do you run on? Have you verified (using videotestsrc) that omxh265enc can encode 120 fps on your platform?
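A throughput check along those lines might look like the following (a sketch; videotestsrc, num-buffers, and fakesink are standard GStreamer facilities, but the encoder element name depends on your platform):

```shell
# videotestsrc is not live, so the pipeline runs as fast as the
# encoder allows. 1200 frames finishing in under 10 s means the
# encoder sustains more than 120 fps at 1080p; gst-launch prints
# the elapsed time as "Execution ended after ..." when it finishes.
gst-launch-1.0 videotestsrc num-buffers=1200 \
  ! "video/x-raw,width=1920,height=1080,framerate=120/1" \
  ! omxh265enc \
  ! h265parse \
  ! fakesink
```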
Gotcha, thanks! Regarding using videotestsrc to test whether it works: not yet, but I will test it in the coming days. Right now it doesn't necessarily have to be 120 FPS; just enough to capture everything within the pipeline's bandwidth limit would be ideal, so maybe 30/60 FPS?
Another question: could you also give me some suggestions (about the pipeline, procedure, etc.) on how to encode the stream into H.265, transmit the bytes via UDP, and decode and view it on another host with the low latency and jitter requirements I mentioned earlier? Some example pipeline commands for the sender and receiver would be very helpful. Thank you!
I tried with the following pipeline on the transmitter side:
gst-launch-1.0 pylonsrc ! video/x-raw,width=1920,height=1200,format=UYVY ! videoconvert ! omxh265enc bitrate=500000 ! 'video/x-h265, stream-format=(string)byte-stream' ! rtph265pay ! udpsink host=192.168.1.224 port=8888
And the error message popped out:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstPylonSrc:pylonsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstPylonSrc:pylonsrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.029212197
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Would you be able to provide some solutions to that?
Have a look at https://forums.developer.nvidia.com/t/live-rtp-works-on-nvv4l2h265enc-but-not-omxh265enc/178234 You may have to add h265parse after the encoder for the OMX elements.
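For reference, a matching receiver for the sender pipeline above might look like this (a sketch; the udpsrc caps are assumed to match rtph265pay's defaults, and avdec_h265 stands in for whatever decoder the receiving host offers):

```shell
# Receive the RTP/H.265 stream sent to port 8888, depayload,
# decode, and display. sync=false reduces display latency at the
# cost of smoothness.
gst-launch-1.0 udpsrc port=8888 \
    caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H265,payload=96" \
  ! rtph265depay \
  ! h265parse \
  ! avdec_h265 \
  ! videoconvert \
  ! autovideosink sync=false
```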
Thanks Frank, the problem was resolved after adding h265parse to the pipeline as you suggested.
But frequent jitter still appeared in the decoded picture, any suggestions regarding that?
Hmm, this could be a UDP network issue. Did you consider using RTSP with gst-rtsp-server instead?
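A minimal way to try that is the test-launch example program that ships with gst-rtsp-server (a sketch; build test-launch from the project's examples directory first, and adjust the encoder and caps to your platform):

```shell
# Server: serves the camera stream at rtsp://<server-ip>:8554/test.
# test-launch wraps a gst-launch-style pipeline description; the
# payloader must be named pay0.
./test-launch "( pylonsrc ! videoconvert ! omxh265enc ! h265parse ! rtph265pay name=pay0 pt=96 )"

# Client: playbin handles RTSP setup, depayloading, and decoding; a
# small latency value trades robustness for responsiveness.
gst-launch-1.0 playbin uri=rtsp://<server-ip>:8554/test latency=50
```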
Not yet. A quick question though: if I adjust the width and height of the camera, is there a way to downscale the picture instead of cropping it? When I went from 1920x1080 to 1280x720, the picture just got cropped, which is not what I wanted; I want the resolution to go down without cropping. Is there a way to set that in the GStreamer pipeline? Thanks!
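Setting width/height in the pylonsrc caps changes the camera's capture region, which crops. Downscaling can instead be done in the pipeline with the standard videoscale element. A sketch based on the earlier sender pipeline:

```shell
# Grab full 1920x1080 frames from the camera, then scale them down
# to 1280x720 in software before encoding, rather than cropping at
# the sensor. videoconvert before videoscale lets the scaler and
# encoder negotiate a common format.
gst-launch-1.0 pylonsrc \
  ! video/x-raw,width=1920,height=1080 \
  ! videoconvert \
  ! videoscale \
  ! video/x-raw,width=1280,height=720 \
  ! omxh265enc ! h265parse \
  ! rtph265pay ! udpsink host=192.168.1.224 port=8888
```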
Hi, I am currently trying to encode the Basler camera stream with H.264/H.265 and transmit it as bytes over UDP so that another host can decode and view it in real time with low latency and no jitter. Is there a way to do this via pypylon? Or do I need to use the GStreamer plugin to complete this task? Thanks!