MattsProjects / pylon_gstreamer

A robust integration of Basler's Pylon API with GStreamer. Delivers applications as ready-to-run standalone compiled executables (gst-launch-1.0 is not needed). Designed for reliability and easy access to performance optimizations. Note: This is not a plugin. It is an integration using GStreamer's GstAppSrc element.
Apache License 2.0

while running demopylongstreamer facing errors. #18

Closed harendracmaps closed 4 years ago

harendracmaps commented 4 years ago

I ran the following command on TX2 (by doing ssh) inside a docker. I am facing the errors as follows. Not sure how to fix it. Any help would be appreciated.

./demopylongstreamer/demopylongstreamer -rescale 320 240 -parse "gst-launch-1.0 videotestsrc ! videoflip method=vertical-flip ! videoconvert"

----------------OUTPUT-------------

Press CTRL+C at any time to quit.
Resetting camera to default settings...
Initializing camera and driver...
Using Camera : Basler acA1920-40uc (24656034)
Camera Area Of Interest : 1920x1200
Camera Speed : 40.9987 fps
Images will be scaled to : 320x240
Applying this Pipeline to the CInstantCameraAppSrc:
gst-launch-1.0 videotestsrc ! videoflip method=vertical-flip ! videoconvert...
Pipeline Made.
Starting Camera image acquisition and Pylon driver Grab Engine...
Starting pipeline...
ERROR from element source24656034: Internal data flow error.
Debugging info: gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline/GstBin:sourcebin24656034/GstAppSrc:source24656034:
streaming task paused, reason not-linked (-1)
Stopping pipeline...
Sending EOS event...
Stopping Camera image acquisition and Pylon image grabbing...

Press Enter to exit. ^C Sending EOS event to pipeline...

(demopylongstreamer:20466): GStreamer-CRITICAL **: gst_element_send_event: assertion 'GST_IS_ELEMENT (element)' failed ^C

BryanRacic commented 4 years ago

Hello Harendra,

It looks like you're missing a video sink after your videoconvert; for this case I'd recommend the "autovideosink".

Your command should look like this:

./demopylongstreamer -parse "gst-launch-1.0 videotestsrc ! videoflip method=vertical-flip ! videoconvert ! video/x-raw,format=I420 ! autovideosink"

Please pay special attention to the caps filter after the videoconvert element, which is required when using the system-preferred video sink on the TX2.

As per the header of demopylongstreamer.cpp:

If you are using demopylongstreamer with the -parse argument in order to use your own pipeline, add a caps filter after the normal videoconvert and before autovideosink:

./demopylongstreamer -parse "gst-launch-1.0 videotestsrc ! videoflip method=vertical-flip ! videoconvert ! video/x-raw,format=I420 ! autovideosink"

MattsProjects commented 4 years ago

Hi Harendra, How many cameras do you have plugged in when using the -parse method?

harendracmaps commented 4 years ago

1 camera. Is it something to do with working remotely on TX2 (using ssh)? Also I am running the demopylongstreamer binary inside a docker.

My requirement is to stream the video to a webpage. How do I do that? Because I don't see any IP or port in the command. Am I using the correct command (pipeline)?

BryanRacic commented 4 years ago

I believe the "No protocol specified" error originates from X11 not the pipeline. This means you cannot open/display a GUI application. Your options would include forwarding the X server over ssh, or using a VNC client instead.
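For example, forwarding X over ssh would look something like this (a sketch; the user and hostname are placeholders, and X11 forwarding must be enabled on the TX2's sshd):

```shell
# -X enables X11 forwarding, so GUI windows opened in the remote
# session render on your local display.
ssh -X user@tx2-hostname
# Inside the session, ssh sets DISPLAY automatically, so a sink like
# autovideosink can open a window on the machine you connected from.
```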

However, the sample pipeline provided is intended to display the camera's output in a local window. For streaming the encoded video over IP you can use the "-h264stream" argument followed by the target IP address.

./demopylongstreamer -h264stream 192.168.2.102

harendracmaps commented 4 years ago

Bryan/Matts, I ran the following command

./demopylongstreamer -rescale 120 350 -h264stream 192.168.3.253

Output of above command:

Press CTRL+C at any time to quit.
Resetting camera to default settings...
Initializing camera and driver...
Using Camera : Basler acA1920-40uc (24656034)
Camera Area Of Interest : 1920x1200
Camera Speed : 40.9987 fps
Images will be scaled to : 120x350
Creating Pipeline for streaming images as h264 video across network to: 192.168.3.253:554...
Start the receiver PC first with this command:
gst-launch-1.0 udpsrc port=554 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! autovideosink sync=false async=false -e
Then press enter to continue...

Trying omxh264enc encoder... Could not make omxh264enc encoder.
Trying imxvpuenc_h264... Could not make imxvpuenc_h264 encoder.
Trying v4l2h264enc... Could not make v4l2h264enc encoder.
Trying x264enc...
Pipeline Made.
Starting Camera image acquisition and Pylon driver Grab Engine...
Starting pipeline...
Starting main loop run...
^C Sending EOS event to pipeline...
End of stream
Stopping pipeline...
Sending EOS event...
Stopping Camera image acquisition and Pylon image grabbing...

Press Enter to exit.

Since I want to stream the output as an RTSP stream, I ran the client command as follows:

gst-launch-1.0 udpsrc port=554 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96 ! rtph264depay ! decodebin ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96

Again I am asking the same query: how can I validate that the demopylongstreamer binary is generating valid media data? I saved 1000 frames to a file mymovie.h264 and it was 2.1 MB in size. Also VLC could not play the file.

./demopylongstreamer -camera 24656034 -aoi 640 480 -framerate 15 -rescale 320 240 -h264file mymovie.h264 1000

OUTPUT of above command

Press CTRL+C at any time to quit.
Resetting camera to default settings...
Initializing camera and driver...
Using Camera : Basler acA1920-40uc (24656034)
Camera Area Of Interest : 640x480
Camera Speed : 14.9997 fps
Images will be scaled to : 320x240
Creating Pipeline for saving images as h264 video on local host: mymovie.h264...
Could not make omxh264enc encoder.
Trying imxvpuenc_h264... Could not make imxvpuenc_h264 encoder.
Trying x264enc...
Pipeline Made.
Starting Camera image acquisition and Pylon driver Grab Engine...
Starting pipeline...
Starting main loop run...
End of stream
Stopping pipeline...
Sending EOS event...
Stopping Camera image acquisition and Pylon image grabbing...

Press Enter to exit.

Request you to please assist me on this.

BryanRacic commented 4 years ago

Hi Harendra,

I'm a bit confused as to what you're asking for help with. If you'd like to know more about just encoding a video file, try running the -h264file argument with the filename mymovie.mp4 instead of mymovie.h264. It looks like the pipeline is writing data to the file, and if it's encoded properly you should be able to open it in a media player like VLC.

It is kind of odd that the pipeline couldn't find omxh264enc if you're running on a TX2. Did you install NVIDIA JetPack, the device drivers, etc.? This shouldn't prevent you from running a normal file encoding pipeline, but it won't take advantage of hardware acceleration. I'd recommend installing NVIDIA's accelerated GStreamer packages.
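As a side note, a raw .h264 elementary stream often won't open in ordinary players without a container. One way to wrap it afterwards (a sketch on my part, assuming the standard h264parse and mp4mux GStreamer elements are installed; this is not something demopylongstreamer does itself):

```shell
# Wrap the raw H.264 elementary stream in an MP4 container so that
# media players like VLC can open it directly.
gst-launch-1.0 filesrc location=mymovie.h264 ! h264parse ! mp4mux ! filesink location=mymovie.mp4
```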

However, if you're having trouble with video streaming you need to have gstreamer pipelines running simultaneously on both computers. The TX2 should be running: ./demopylongstreamer -h264stream 192.168.3.253

And the recipient system (with a display) needs to be running:

gst-launch-1.0 udpsrc port=554 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! autovideosink sync=false async=false -e

I understand you'd like to do some more advanced stuff with the stream receiving pipeline, but when debugging I'd recommend running this simple application first.

MattsProjects commented 4 years ago

I agree with Bryan. For me I have only ever tried to stream h264 to a receiver which displays it. I haven't tried streaming to a receiver which saves the movie. I think there is a better way than de-pay'ing it and then re-paying it though. As long as the sender isn't giving errors, it's generating valid data btw. It is actually possible that 1000 frames results in a 2.1MB filesize, because that is what h.264 is trying to accomplish. Essentially, only parts of the image that changed are transmitted, so if the scene is not moving, the file size will be less.
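One quick way to sanity-check the saved file (a sketch, assuming a local display and the standard GStreamer decoder plugins are available) is to decode and display it directly:

```shell
# Parse and software-decode the raw H.264 stream and render it in a
# window; if frames appear, the sender produced valid data.
gst-launch-1.0 filesrc location=mymovie.h264 ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```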

harendra247 commented 4 years ago

Thanks @BryanLikesToProgram @MattsProjects. As you suggested I tried running it on the host and am facing the same issue. The window is black; nothing is getting rendered. There are no error logs either. Not sure how to proceed. I ran the following command:

./demopylongstreamer -rescale 640 480 -parse "videoflip method=vertical-flip ! videoconvert ! autovideosink"

Additional Info:

System: x86_64 (Ubuntu 16.04)
GStreamer version: 1.8.3
Basler pylon SDK version: 5.2.0

[Screenshot attached: Screen Shot 2020-04-16 at 8 47 23 PM]

BryanRacic commented 4 years ago

Hi Harendra,

If you're running this on the TX2 I'd recommend you run: ./demopylongstreamer -parse "gst-launch-1.0 videotestsrc ! videoflip method=vertical-flip ! videoconvert ! video/x-raw,format=I420 ! autovideosink"

As I explained before, you need the capsfilter to use autovideosink on the TX2.

If this still does not work, you can debug gstreamer by setting the GST_DEBUG environment variable:

GST_DEBUG=2 ./demopylongstreamer -parse "gst-launch-1.0 videotestsrc ! videoflip method=vertical-flip ! videoconvert ! video/x-raw,format=I420 ! autovideosink"

This will let you know of any high level errors or warnings
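If the console output is too noisy, GStreamer can also redirect the debug log to a file via the standard GST_DEBUG_FILE environment variable (the log path here is just an example):

```shell
# Level 2 = errors and warnings only; the log is written to /tmp/gst.log
# instead of stderr, so it doesn't interleave with the app's own output.
GST_DEBUG=2 GST_DEBUG_FILE=/tmp/gst.log ./demopylongstreamer -parse "gst-launch-1.0 videotestsrc ! videoflip method=vertical-flip ! videoconvert ! video/x-raw,format=I420 ! autovideosink"
```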

harendra247 commented 4 years ago

Thanks @BryanLikesToProgram for your assistance.

Now I am not trying on the TX2. I am trying it on x86_64 (Ubuntu 16.04) with GStreamer version 1.8.3 and Basler pylon SDK version 5.2.0. Once it starts working I'll go back to the TX2.

Finally I could make it work: I had to comment out the "camera.ResetCamera()" statement. IMO, there should be some flag to control the reset.

Because after running demopylongstreamer, if I try running PylonViewerApp, a black screen gets rendered there as well.

[Screenshot attached: Screen Shot 2020-04-18 at 4 58 05 PM]

MattsProjects commented 4 years ago

Hi Harendra, I'm not sure I fully understand, could you rephrase? Are you changing any settings in pylon viewer before running gstreamer? E.g. do you start with a black image and adjust the exposure manually to get a good image? The ResetCamera() function just returns the camera to default settings, so it can be commented out if it's not needed. Easy to add a command line argument to demopylongstreamer too. Good idea :)

harendracmaps commented 4 years ago

Hey @MattsProjects, with PylonViewerApp we can configure the camera for a good quality image. But when I run demopylongstreamer, due to the reset function the image turns black. I commented out the function and it started working.