Closed frankSDeviation closed 4 years ago
Hi, the problem with the "Error in the image processing loop" error is that it's very hard to diagnose. As you might have read in #5, it usually means that a corrupted frame was present, and debugging that is quite hard. Now, when you say "I did exactly as instructed", do you mean you replaced the code block as shown here: https://github.com/zingmars/gst-pylonsrc/issues/5#issuecomment-477503237?
For debugging, I would first recommend using the pylonsrc from https://github.com/joshdoe/gst-plugins-vision (you can also try https://github.com/AB-Eskild/gst-plugins-vision/tree/bugfix/pylonsrc_handle_corupt_frames, a branch for a PR that attempts to tackle this issue). That repository has a slightly refactored version of the plugin that should be better at showing where the error occurred (although it will still fail if a frame fails to grab, iirc).
One thing to try is to run the pipeline, then open pylon viewer and run it for a while to see if there are any errors or warnings there. Basler cameras save settings in RAM, and running the pipeline will set a lot of values to the plugin's defaults, so what you see in pylon viewer should be very close to what the camera would be outputting to gstreamer. https://github.com/joshdoe/gst-plugins-vision/issues/14#issuecomment-626822532 reports that changing the framerate, resolution and other parameters seems to affect the frequency of the warnings/errors, so you might also want to try that.
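When comparing against pylon viewer, it also helps to capture a verbose log from the plugin itself. A small helper script like the following can do that; the pipeline and file names here are illustrative (fakesink is used so only the capture path is exercised), and GST_DEBUG/GST_DEBUG_FILE are standard GStreamer environment variables:

```shell
# Write a helper script that runs an illustrative pipeline with verbose
# pylonsrc logging, saving the log to pylonsrc.log for later inspection.
cat > run-debug.sh <<'EOF'
#!/bin/sh
# GST_DEBUG=pylonsrc:5 enables DEBUG-level output for the pylonsrc element;
# GST_DEBUG_FILE redirects that output to a file instead of stderr.
GST_DEBUG=pylonsrc:5 GST_DEBUG_FILE=pylonsrc.log \
  gst-launch-1.0 pylonsrc camera=0 ! fakesink
EOF
chmod +x run-debug.sh
cat run-debug.sh
```

Running the script on a machine with the plugin installed leaves the full element log in pylonsrc.log, which is much easier to search than terminal scrollback.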
Hello Zingmar,
Thank you for the response. The only thing I did from issue #5 was change "goto error" to "return GST_FLOW_OK;". I was not too sure where to place that code block.
I am having issues using the vision plugins. I am fairly sure I've installed everything correctly, but I can't be 100% certain, and I am not sure if I am doing things correctly. After installing the plugin, should I just be able to run the pylonsrc plugin, or is there another way I am supposed to use the vision plugin?
I have run the Basler pylon viewer after running the pipeline and everything runs just fine; I do not get any errors at all. I also want to mention that I am able to see the video stream locally by running this pipeline:
gst-launch-1.0 pylonsrc camera=0 imageformat=mono8 width=1600 height=1200 ! videoconvert ! xvimagesink
I will take a look at issue #14 in the vision plugin. I would also like to say that I am not a software engineer; my background is primarily hardware design, so I apologize if I say some silly things or ask dumb questions about compiling or installing software.
Thank you, Frank
Try running the UDP pipeline with continuous=false set.
I am having issues using the vision plugins. I am fairly sure I've installed everything correctly, but I can't be 100% certain, and I am not sure if I am doing things correctly. After installing the plugin, should I just be able to run the pylonsrc plugin, or is there another way I am supposed to use the vision plugin?
From what I remember you should be able to just make install it and run, though you'll probably need to uninstall this version first. Also, you might need to clean the gstreamer cache (~/.gstreamer-1.0, iirc?).
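For reference, the uninstall/reinstall/cache-clean cycle might look like the script below. This assumes an autotools-style build (make uninstall in the old plugin's tree, make install in the gst-plugins-vision tree); a cmake build would use its own install target. Note that for GStreamer 1.x the registry cache normally lives under ~/.cache/gstreamer-1.0 (~/.gstreamer-1.0 was the 0.10 location):

```shell
# Write out a reinstall helper; the make targets assume autotools-style
# build trees, which is an assumption, not something stated in the thread.
cat > reinstall-plugin.sh <<'EOF'
#!/bin/sh
set -e
# Run in the old gst-pylonsrc source tree: remove the installed copy.
sudo make uninstall
# Run in the gst-plugins-vision build tree: build and install the new copy.
make
sudo make install
# Wipe the registry cache so gstreamer rescans plugins on the next run.
rm -rf ~/.cache/gstreamer-1.0
# Confirm the new element is found.
gst-inspect-1.0 pylonsrc
EOF
chmod +x reinstall-plugin.sh
cat reinstall-plugin.sh
```

If gst-inspect-1.0 still shows the old plugin afterwards, checking GST_PLUGIN_PATH and the install prefix is the usual next step.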
Thank you for the response. The only thing I did from issue #5 was change "goto error" to "return GST_FLOW_OK;". I was not too sure where to place that code block.
That probably didn't work because you told gstreamer that the plugin succeeded in grabbing a frame when it didn't actually provide one. I've uploaded two patch files that you can try: 0001-Retry-capture-when-a-frame-failed-to-grab.patch is the solution provided in #5, and 0001-Grab-frame-after-failure.patch will make the plugin retry the capture if it fails (although I haven't tested it). You can apply a patch file using git am --3way <patch file>.
Download: pylonsrc-patches.zip
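The git am --3way workflow can be demonstrated end to end with a synthetic patch, since the actual patch files aren't reproduced here. The repo, file contents, and commit subject below are all made up for the demo; only the commands themselves carry over to applying the real patches:

```shell
# Demo of the `git am --3way` workflow on a throwaway repo: create a commit,
# export it as a patch file, rewind, then re-apply it with git am.
set -e
ROOT=$PWD
rm -rf am-demo && mkdir am-demo && cd am-demo
git init -q repo
cd repo
git config user.email dev@example.com
git config user.name dev
echo "original line" > gstpylonsrc.c
git add gstpylonsrc.c
git commit -qm "initial"
# "Author" side: make a change and export it as a mailbox-format patch file.
echo "patched line" > gstpylonsrc.c
git commit -qam "Retry capture when a frame failed to grab"
git format-patch -1 -o .. -q
# "Consumer" side: rewind to the pre-patch state, then apply the patch.
# --3way lets git fall back to a three-way merge if context lines drifted.
git reset -q --hard HEAD~1
git am --3way ../0001-Retry-capture-when-a-frame-failed-to-grab.patch
cd "$ROOT"
grep patched am-demo/repo/gstpylonsrc.c
```

When a patch "just wouldn't take", git am --abort followed by git apply --reject on the same file is a common way to see exactly which hunks conflict.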
Hello Zingmars,
Thank you for all of your help. First, I tried applying the patch, but for some reason it just wouldn't take. After that I began to play with the gst-plugins-vision code and was finally able to stream over UDP with almost zero latency! This is the pipeline I am using on my host machine:
gst-launch-1.0 -v pylonsrc camera=1 pixel-format=mono8 width=1600 height=1200 ! "video/x-raw,format=GRAY8" ! videoflip method=vertical-flip ! videoconvert ! x264enc bitrate=30000 speed-preset=superfast qp-min=30 tune=zerolatency ! rtph264pay ! udpsink host=XXX.XXX.XXX.XXX port=XXXX
And this is the pipeline I am running on the destination machine:
gst-launch-1.0 -v udpsrc port=XXXX caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! queue ! decodebin ! videoconvert ! videoscale ! autovideosink sync=true
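For anyone tuning this receiver for latency, one knob worth experimenting with is an explicit rtpjitterbuffer before depayloading. The variant below is a sketch, not a drop-in recommendation: the latency value (in milliseconds) is a starting point to adjust, the port is a placeholder, and avdec_h264 stands in for whatever decoder decodebin would have picked:

```shell
# Write out a lower-latency receiver variant; lower jitter-buffer latency
# reduces delay but tolerates less network jitter. Saved as a script rather
# than executed, since gst-launch-1.0 may not be installed here.
cat > receive-lowlat.sh <<'EOF'
#!/bin/sh
gst-launch-1.0 -v udpsrc port=5000 \
  caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
  ! rtpjitterbuffer latency=50 ! rtph264depay ! avdec_h264 \
  ! videoconvert ! autovideosink sync=false
EOF
chmod +x receive-lowlat.sh
cat receive-lowlat.sh
```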
I have another question: is it possible to send two UDP streams from two different cameras at the same time? My application requires two cameras running on the same machine. Also, since my original question was answered, should I open another issue and ask this there?
Thank you, Frank
Glad to hear it works for you.
If you have multiple cameras connected and you run pylonsrc with no parameters, it will list the cameras. You can then pick which one to use by specifying a camera=number parameter. This way you can run multiple pipelines with multiple cameras.
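As a sketch of that multi-pipeline setup, one sender command per camera can be generated with each stream on its own UDP port. The host address and encoder settings below are placeholders patterned on the pipelines earlier in this thread:

```shell
# Generate one sender command per camera index, each targeting its own port.
# HOST and the element chain are placeholders, not values from the thread.
HOST=192.168.0.44
for CAM in 0 1; do
  PORT=$((5000 + CAM))   # camera 0 -> 5000, camera 1 -> 5001
  echo "gst-launch-1.0 pylonsrc camera=$CAM pixel-format=mono8 width=800 height=600" \
       "! videoconvert ! x264enc tune=zerolatency ! rtph264pay" \
       "! udpsink host=$HOST port=$PORT"
done > sender-commands.txt
cat sender-commands.txt
```

The receiving machine then runs one udpsrc pipeline per port.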
Thanks again, Zingmars! I am currently working on getting two video streams sent over UDP. I have been successful in doing this with the following pipeline:
gst-launch-1.0 pylonsrc camera=0 pixel-format=mono8 width=800 height=600 ! "video/x-raw,format=GRAY8" ! videoflip method=vertical-flip ! videoconvert ! x264enc bitrate=30000 speed-preset=superfast qp-min=30 tune=zerolatency ! rtph264pay ! udpsink host=XXX.XXX.XXX port=XXXX | gst-launch-1.0 pylonsrc camera=1 pixel-format=mono8 width=800 height=600 ! "video/x-raw,format=GRAY8" ! videoflip method=vertical-flip ! videoconvert ! x264enc bitrate=30000 speed-preset=superfast qp-min=30 tune=zerolatency ! rtph264pay ! udpsink host=XXX.XXX.XXX port=XXXX
I will start playing with videomixer to see if I can combine both streams over UDP into one window. Thanks a ton for the help. I will close this issue now.
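A side note on chaining the two gst-launch-1.0 invocations with | above: it happens to work because both processes run concurrently, but the shell is actually wiring the first pipeline's stdout into the second's stdin. Backgrounding each sender with & keeps them fully independent. A sketch using placeholder commands (sleep stands in for the long-running gst-launch-1.0 processes) so it runs anywhere:

```shell
# Launch two senders as independent background jobs instead of a '|' chain.
run_sender() {
  # $1 = camera index; this echo/sleep pair is a placeholder for a real
  # gst-launch-1.0 sender pipeline from this thread.
  echo "starting sender for camera $1"
  sleep 1
}
run_sender 0 > sender0.log &
run_sender 1 > sender1.log &
wait   # block until both background senders exit
cat sender0.log sender1.log
```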
Hello, my name is Frank and I have been playing with your plugin for the last few weeks. I can successfully display my Basler camera's output on my host machine using xvimagesink. My goal is to stream the camera's image over UDP with the lowest latency possible. After trying many different pipelines, I was finally able to send a UDP stream to another computer, but the stream only comes up for about one second and then dies. This is the pipeline I am currently running:
gst-launch-1.0 -c -v pylonsrc imageformat=mono8 ! "video/x-raw,format=GRAY8" ! decodebin ! videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=192.168.0.44 port=5000
As you can see, it is nothing too crazy. I ran the pipeline with GST_DEBUG=pylonsrc:5 and got these results:
The error message that comes up is "Error in the image processing loop." I looked at issue #5, but that solution did not work for me; I did exactly as instructed in that thread and it did not fix my issue.
This is the pipeline I am running on the external machine to display the UDP stream. I have used it to play a recorded video file from the Basler camera, and it works fine.
gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink sync=false
I am not sure if this is the correct way of doing things. Any help will be greatly appreciated.
Thank you, Frank