MattsProjects / pylon_gstreamer

A robust integration of Basler's Pylon API with GStreamer. Delivers applications as ready-to-run standalone compiled executables (gst-launch-1.0 is not needed). Designed for reliability and easy access to performance optimizations. Note: This is not a plugin. It is an integration using GStreamer's GstAppSrc element.
Apache License 2.0

Video Streaming Issue #21

Open frankSDeviation opened 4 years ago

frankSDeviation commented 4 years ago

Hello, I have stumbled upon your code and I would like to say that it is awesome! I wish I had found this a year ago! I have been trying to get it up and running and have been able to get it to at least display a black image. My goal is to stream two Basler USB cameras (mono) on a multicast network with the lowest latency possible. I have also downloaded zingmars' pylonsrc program, and I have been able to stream one camera locally using this GStreamer pipeline:

gst-launch-1.0 pylonsrc imageformat=mono8 width=1600 height=1200 ! videoconvert ! xvimagesink

His program works great but is limited to what can be done with it and I love how yours is written in C++ and can be modified to build a very good application.

I have been having problems trying to get your code to just show a live image. I have tried using the "-window" option but all I get is a black screen. This happens with any other option as well. I then noticed that Harendracmaps had a similar issue and they were able to get theirs to work by commenting "camera.ResetCamera()" out of the code. I have tried this as well but I was not able to get mine to work. I then ran a debug command and got this message:

lter_transform: warning: invalid video buffer received
0:00:05.607316439 23261 0x55ba0caa3000 ERROR default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 1920000 < 5760000
0:00:05.607345874 23261 0x55ba0caa3000 WARN videofilter gstvideofilter.c:293:gst_video_filter_transform: warning: invalid video buffer received
0:00:05.623998313 23261 0x55ba0caa3000 ERROR default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 1920000 < 5760000
0:00:05.624026576 23261 0x55ba0caa3000 WARN videofilter gstvideofilter.c:293:gst_video_filter_transform: warning: invalid video buffer received
0:00:05.640742131 23261 0x55ba0caa3000 ERROR default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 1920000 < 5760000
0:00:05.640772160 23261 0x55ba0caa3000 WARN videofilter gstvideofilter.c:293:gst_video_filter_transform: warning: invalid video buffer received
0:00:05.657838839 23261 0x55ba0caa3000 ERROR default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 1920000 < 5760000
0:00:05.657894996 23261 0x55ba0caa3000 WARN videofilter gstvideofilter.c:293:gst_video_filter_transform: warning: invalid video buffer received
0:00:05.674299667 23261 0x55ba0caa3000 ERROR default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 1920000 < 5760000
0:00:05.674357139 23261 0x55ba0caa3000 WARN videofilter gstvideofilter.c:293:gst_video_filter_transform: warning: invalid video buffer received
0:00:05.691215050 23261 0x55ba0caa3000 ERROR default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 1920000 < 5760000
0:00:05.691271613 23261 0x55ba0caa3000 WARN videofilter gstvideofilter.c:293:gst_video_filter_transform: warning: invalid video buffer received
0:00:05.707745445 23261 0x55ba0caa3000 ERROR default video-frame.c:175:gst_video_frame_map_id: invalid buffer size 1920000 < 5760000
0:00:05.707800085 23261 0x55ba0caa3000 WARN videofilter gstvideofilter.c:293:gst_video_filter_transform: warning: invalid video buffer received
^C
Sending EOS event to pipeline...
End of stream
Stopping pipeline...
Sending EOS event...
Stopping Camera image acquistion and Pylon image grabbing...

Press Enter to exit.

This continues for what appears to be every frame of the video. All I get is a black image. I also tried using "xvimagesink" in my -parse pipeline. I am not sure what I am doing wrong, and any help is greatly appreciated.

Cheers, frank

BryanRacic commented 4 years ago

Hi @frankSDeviation ,

Could you share the context in which you're calling the program/library? Are you using the "demopylongstreamer" sample from the terminal? If so, can you provide the exact command line you're currently using (preferably the -window one)?

I haven't seen this error message before, but it looks like it could stem from the provided width and height. You may want to try width=2560 height=2560; this should at least meet the minimum buffer size criterion and possibly give more insight into the issue.

frankSDeviation commented 4 years ago

Hello @BryanLikesToProgram ,

Thank you for the response! I was afraid no one would respond for months, if at all! I ran "./demopylongstreamer" so that I could see the program usage, options, and examples. After reading it, I assumed that simply passing the "-window" option on the command line would play a live image from my Basler camera. After running "./demopylongstreamer" with this option, I get this image:

[screenshot]

I had a similar issue using Zingmar's code, and I was able to fix it by changing the image format to "mono8". I noticed that I can parse my old pipeline and thought that it should work, so I added "-parse gst-launch-1.0 pylonsrc imageformat=mono8 width=1600 height=1200 ! videoconvert ! xvimagesink" to my command and removed the "-window" option. I also have not changed anything in the code yet, since this demo program should be able to just launch my Basler camera, correct?

Another thing I just noticed: if I run the "-window" command without rescaling, the GStreamer window that opens is green, not black. Not sure exactly what this means...

Also, I would like to mention that my background is mostly electronics hardware design. Most of my "programming" experience is bash scripting on Linux (I don't know if you consider bash scripting programming). I am trying to get GStreamer either to launch a very low-latency video stream, or to run a GStreamer pipeline from a bash script over SSH, driven by a GUI made with Python.

Again, thank you much for all the help!

BryanRacic commented 4 years ago

I think you're definitely on the right track. In my experience with Pylon, a green display usually indicates an internal data-stream error.

If you run just: sudo ./demopylongstreamer -window

Do you still get just a green screen?

Also have you tried writing your stream to an h264 file instead of a display window? sudo ./demopylongstreamer -h264file myvideo.h264 18000

Next, can you try sudo ./demopylongstreamer -aoi 640 480 -framerate 15 -window and then sudo ./demopylongstreamer -aoi 640 480 -framerate 15 -rescale 320 240 -window, and let me know your results?

Finally, are you getting the same results when you run PylonViewer (provided in the Basler Pylon SDK)?

frankSDeviation commented 4 years ago

Running as sudo did not help. I am able to view the camera without issues using the PylonViewer. I am able to write the stream to an h264 file, but the thumbnail it creates is the same green screen. When I try opening it with VLC, I get a black screen that does not play anything. The codec information in VLC is blank. Also, the file is only 14 KB.

When I run "sudo ./demopylongstreamer -aoi 640 480 -framerate 15 -window", I am able to see a green screen with an AOI of 640x480, and the camera speed is 14.9979 fps.

When I run "sudo ./demopylongstreamer -aoi 640 480 -framerate 15 -rescale 320 240 -window", I get a smaller black window.

Just some additional information: I am running an i7-8665UE CPU @ 1.70GHz × 8, with 8 GB of RAM.

So I know it's not a hardware issue. I am able to stream the images using ffmpeg, but ffmpeg has a terrible delay, which is why I am trying to use GStreamer. I have been able to use GStreamer with other cameras, and I measured the latency to be about 250 ms. I am getting a latency of about 5-10 seconds with ffmpeg. This is my ffmpeg version, if it helps at all:

ffmpeg version N-98478-g1ec2b3de5a

As I mentioned above, I am able to view the camera using Zingmar's code without issues, but I cannot do a UDP stream using his code for some reason.

Cheers, Frank

MattsProjects commented 4 years ago

Hello Frank, when using Zingmar's plugin, which sink do you use to display the images: autovideosink, or something else? A tricky part with GStreamer is that not all systems have/support the same plugins. I run into this a lot, especially with the display and h264 plugins (in demopylongstreamer you'll see I check for something like 4 different h264 plugins). I think -window just uses autovideosink, but if a different display plugin works in a gst-launch pipeline with Zingmar's plugin, we can try using that plugin in this code.

frankSDeviation commented 4 years ago

Hello Matt, thank you for responding. I am using xvimagesink to display the video on my device with Zingmar's plugin. If I don't use the "-window" option and just use the "-parse" option with my xvimagesink pipeline, shouldn't it work? I have also changed this part of CPipelineHelper.cpp:

// Create gstreamer elements
convert = gst_element_factory_make("videoconvert", "converter");
sink = gst_element_factory_make("xvimagesink", "sink"); // depending on your platform, you may have to use some alternative here, like ("autovideosink", "sink")

I then save it and run make again. When I relaunch the program, I still only get a green window.

Also, I forgot to mention that I do not have the supplementary package installed. I have tried installing it, but I am not 100% sure it's installed properly.

BryanRacic commented 4 years ago

Hi @frankSDeviation ,

Do you have Pylon installed? Does a green window appear in the Pylon Viewer application?

frankSDeviation commented 4 years ago

Hello @BryanLikesToProgram ,

I have Pylon 5.2 installed, I believe. Everything in the Pylon software works just fine: I can see clear images, and I am able to adjust settings without any issues.

I was able to write a bash script that launches the display remotely using Zingmar's code. This is a quick-and-dirty fix for now, but I eventually need to be able to do a UDP stream from two cameras with the lowest latency possible.