alliedvision / gst-vimbasrc

Official vimbasrc element for use of Vimba with GStreamer

Unable to Make RTSP Stream #4

Open · basitraza1214 opened this issue 3 years ago

basitraza1214 commented 3 years ago

I was hoping that I could make an RTSP stream, but I have tried everything including the examples in EXAMPLES.md and the README, and so far no result. Any help would be much appreciated.

NiklasKroeger-AlliedVision commented 3 years ago

Do you get any error messages when you try to run the provided RTSP server example? Please follow the RTSP instructions in EXAMPLES.md. This should help you get an RTSP server running. Displaying the stream can then be done with something like the VLC media player; again, instructions for that are included in the linked file. Please provide any error messages and a detailed description of where you are having problems.
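
For orientation, a minimal sketch of such a gst-rtsp-server based script in Python (this is not the exact EXAMPLES.md file; the camera ID, the exact pipeline elements and the /stream1 mount point are placeholders you will need to adapt):

import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstRtspServer", "1.0")
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

server = GstRtspServer.RTSPServer()
factory = GstRtspServer.RTSPMediaFactory()
# Launch string roughly along the lines of EXAMPLES.md: grab with vimbasrc,
# encode with x264enc, payload as RTP. Adjust the camera ID to your setup.
factory.set_launch(
    "( vimbasrc camera=DEV_1AB22D01BBB8 ! videoconvert "
    "! x264enc ! rtph264pay name=pay0 pt=96 )"
)
factory.set_shared(True)
server.get_mount_points().add_factory("/stream1", factory)
server.attach(None)

print("Stream ready at rtsp://<host-ip>:8554/stream1")
GLib.MainLoop().run()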

basitraza1214 commented 3 years ago

Yes, I have followed the instructions given in the EXAMPLES.md file. Although I do not get any errors, I am still unable to access the stream through VLC.

NiklasKroeger-AlliedVision commented 3 years ago

To get some information on what the vimbasrc element is doing, you can enable logging output for it. To do so, please set the environment variable GST_DEBUG=vimbasrc:DEBUG in the terminal where you start the RTSP server. I hope this brings some more insight into what is happening inside vimbasrc.

It might also make sense to get some output from the other elements in the pipeline. The GST_DEBUG variable supports setting different log levels for different elements. You could for example set it to GST_DEBUG=warning,vimbasrc:DEBUG to get WARNING output from all elements and DEBUG output just from vimbasrc. More information on how you can set log levels can be found in the official documentation.
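
If exporting the variable in the terminal is inconvenient, the same log levels can also be set from inside the Python script. A small sketch (GStreamer picks up GST_DEBUG when Gst.init() runs, so it has to be set beforehand):

import os
# Must be set before Gst.init() for the log levels to take effect.
os.environ["GST_DEBUG"] = "warning,vimbasrc:DEBUG"

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
Gst.init(None)

# Alternatively, adjust the thresholds after initialisation:
Gst.debug_set_active(True)
Gst.debug_set_threshold_from_string("warning,vimbasrc:DEBUG", True)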

basitraza1214 commented 3 years ago

Thanks for the input. I tried your suggestion and found out that GStreamer is successfully creating an RTSP stream, but when I try to access the stream through VLC it gives me an error:

Your input can't be opened:

I have saved the output of the shell command in a text file for your reference. shell_output.txt

NiklasKroeger-AlliedVision commented 3 years ago

That log indicates that the RTSP server responds with code 503 and the reason Service Unavailable. Maybe something goes wrong with the GStreamer pipeline. Generally the RTSP server will only start the GStreamer pipeline when an actual request for a stream comes in; if no stream is requested, the RTSP server will not run the pipeline. So getting a Service Unavailable error might mean that the pipeline could not be started successfully.
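
One way to check whether the pipeline itself is the problem is to run the launch string from the media factory on its own, outside of the RTSP server. A sketch (replace the pipeline with whatever your factory actually uses; a fakesink is appended so it can run standalone):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Same elements as in the factory launch string, terminated with a fakesink
# so the pipeline can run without the RTSP server attaching a client.
pipeline = Gst.parse_launch("vimbasrc ! videoconvert ! x264enc ! rtph264pay ! fakesink")
print("set_state:", pipeline.set_state(Gst.State.PLAYING))

# Wait a few seconds for an error message on the bus.
bus = pipeline.get_bus()
msg = bus.timed_pop_filtered(5 * Gst.SECOND, Gst.MessageType.ERROR)
if msg:
    err, debug = msg.parse_error()
    print("ERROR:", err, debug)
pipeline.set_state(Gst.State.NULL)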

Could you also provide the debugging output (see above on how to enable it) from the shell where you start the RTSP server? Ideally I would also like to take a look at the GStreamer pipeline that is used in that RTSP server, so maybe you could also attach the Python file you copied from EXAMPLES.md and are running to start the server?

basitraza1214 commented 3 years ago

Turns out that if I run the RTSP stream with debugging off, the output I get when trying to access the stream through VLC is saved in output_without_debugging.txt, whereas the output with debugging on is saved as shell_output.txt. To enable debugging, my environment variable is GST_DEBUG=6,vimbasrc:6. I am using the following command to save the shell output to a text file: command > filename. Upon running hostname -I I get

192.168.32.10, 172.17.0.1

So I am opening the stream in VLC as

rtsp://172.17.0.1:8554/stream1

Please also find the Python file attached as rtsp_test.txt. shell_output.txt output_without_debugging.txt rtsp_test.txt

NiklasKroeger-AlliedVision commented 3 years ago

The output you provided all looks fine to me... Sorry, I do not see any obvious errors in vimbasrc.

The only thing I see that I have never personally encountered is the error message from (I guess) the x264 encoder:

x264 [error]: baseline profile doesn't support 4:4:4

To me this sounds like the encoder has a problem with the image data that is passed to it, or the data that is requested from it. Looking for baseline in the documentation for that element, I only find some reference to that word on the output side of the element (the src pad). I have to admit that at this point I really do not know too much about the workings of the encoder and the rtph payloader that follows it. The example worked for me as you are using it so I did not anticipate any problems that far down the pipeline...
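
One thing that might be worth trying (I have not tested this): the baseline profile of x264 only accepts 4:2:0 chroma subsampling, so forcing the frames into I420 before the encoder could avoid that error, for example by adding a capsfilter after videoconvert in the factory launch string (factory here refers to the RTSPMediaFactory in the server script):

# Hypothetical adjustment: force 4:2:0 (I420) in front of x264enc so the
# baseline profile can handle the input format.
factory.set_launch(
    "( vimbasrc ! videoconvert ! video/x-raw,format=I420 "
    "! x264enc ! rtph264pay name=pay0 pt=96 )"
)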

It does not seem like the problem is really related to vimbasrc itself, but rather somehow connected to the x264enc element. I have previously gotten great and quick help on the GStreamer IRC channel. Maybe that would be a place where you can ask followup questions. Some information on how to report bugs to GStreamer itself can be found on their "Contributing to GStreamer" page. There it says that the IRC channel can be found on freenode, but I believe that this might have changed recently. Possibly the new channel is now on the OFTC network (though I am not sure about this). At least it seems like there are quite a few people in that channel.

Sorry I do not have anything else to offer on this... Feel free to update here again if you have some new insights and I will take a look again. But as I said, I am really not that experienced with elements other than vimbasrc...

basitraza1214 commented 3 years ago

Hey, is there a way to save the stream locally on the device?

NiklasKroeger-AlliedVision commented 3 years ago

You should be able to save the recorded images as separate pictures, or as a video file. For the video file you might however run into the same issue that you are seeing with the RTSP server, because it also encodes the images with a video codec. The example linked uses x264enc, just like the RTSP server example. You could still give it a try.
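
If you prefer to drive the recording from Python rather than gst-launch-1.0, a rough sketch could look like the following (the pipeline elements are placeholders, not the exact EXAMPLES.md pipeline). Sending EOS before shutting down gives the muxer a chance to finalise the file, which is also what the -e / --eos-on-shutdown flag of gst-launch-1.0 does:

import time
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Placeholder recording pipeline: encode with x264enc, mux, write to file.
pipeline = Gst.parse_launch(
    "vimbasrc ! videoconvert ! video/x-raw,format=I420 "
    "! x264enc ! matroskamux ! filesink location=out.mkv"
)
pipeline.set_state(Gst.State.PLAYING)

time.sleep(10)  # record roughly 10 seconds

# Send EOS so the muxer can write its headers, then wait until it has drained.
pipeline.send_event(Gst.Event.new_eos())
pipeline.get_bus().timed_pop_filtered(
    Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR
)
pipeline.set_state(Gst.State.NULL)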

Looking through the list of GStreamer elements I also found the encodebin and encodebin2 elements. I have never used them myself, but maybe they would also be an interesting way to encode the frames as video. This might even be worth trying for the RTSP stream. Unfortunately I cannot give much input on this...

basitraza1214 commented 3 years ago

The file is being saved, but it is just a text file named output.avi. Do I need to add additional flags, such as the -e flag or something?

NiklasKroeger-AlliedVision commented 3 years ago

Could you provide the complete pipeline as well as the debug output you get when you run the gst-launch-1.0 command? Running the pipeline from EXAMPLES.md produces a video file for me. How do you exit the pipeline? It should run until you cancel the execution with ctrl-c.

basitraza1214 commented 3 years ago

save_output.txt shell_output_save.txt

I am exiting the pipeline by pressing ctrl-c twice.

NiklasKroeger-AlliedVision commented 3 years ago

The vimbasrc debugging output looks fine as far as I can tell. No errors are reported, and it looks like you should be recording about 28 seconds worth of video frames. The file itself is being created but is empty? That is odd. Maybe we need to enable some more output for the other elements as well:

GST_DEBUG=INFO,vimbasrc:TRACE

This will give a lot of output, as it should also print a line for every frame that is recorded by Vimba. In case you are not doing so already, you can redirect the shell output to a text file like so:

gst-launch-1.0 vimbasrc ! other ! elements ! filesink >output_file.txt 2>&1

This should make it easier to provide the entire output, so it does not need to be copied from the terminal, which can be annoying if there is a lot of text.

basitraza1214 commented 3 years ago

Please find the requested output attached: save_output.txt

NiklasKroeger-AlliedVision commented 3 years ago

That log shows that Vimba is only receiving incomplete frames. By default, incomplete frames are not passed down the pipeline, so that explains why nothing is written to the video file. It also appears that only one image is recorded every couple of seconds, which seems like a very low framerate.

Are you able to view the camera stream with Vimba Viewer on the system you are trying to use for this? The output seems to indicate that there is a problem getting frames from the camera to the computer...

basitraza1214 commented 3 years ago

Yes, I am able to view the frames with Vimba Viewer as well as asynchronous_grab.py.

NiklasKroeger-AlliedVision commented 3 years ago

The vimbasrc element uses the exact same image transfer method that is used in our asynchronous grab examples, so if those work, vimbasrc should also be able to receive frames. This is very odd!
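
For comparison, this is roughly what the asynchronous transfer looks like in a condensed VimbaPython sketch (simplified from the asynchronous_grab.py example you are running; vimbasrc does essentially the same internally):

import time
from vimba import Vimba

def frame_handler(cam, frame):
    # Called asynchronously for every received frame; requeue it afterwards.
    print("Received frame:", frame.get_id())
    cam.queue_frame(frame)

with Vimba.get_instance() as vimba:
    cam = vimba.get_all_cameras()[0]
    with cam:
        cam.start_streaming(frame_handler)
        time.sleep(5)
        cam.stop_streaming()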

Does this problem only appear with the video encoding pipeline? During development we have seen issues with some pipelines if the camera attempts to transfer a high number of frames per second (first entry in our list of known issues). In those situations Vimba received many incomplete frames as well. This only happened if I actually did some processing in the pipeline. I verified this by iteratively taking elements out of the pipeline like this:

First run shows no incomplete frames in debugging output:
vimbasrc camera=DEV_1AB22D01BBB8 num-buffers=1000 ! video/x-raw,format=GRAY8 ! fakevideosink

Second run shows no incomplete frames in debugging output:
vimbasrc camera=DEV_1AB22D01BBB8 num-buffers=1000 ! video/x-raw,format=GRAY8 ! videoscale ! videoconvert ! queue ! fakevideosink 

Third run actually displaying the video shows nearly all frames incomplete:
vimbasrc camera=DEV_1AB22D01BBB8 num-buffers=1000 ! video/x-raw,format=GRAY8 ! videoscale ! videoconvert ! queue ! autovideosink

To me that indicated that pipelines that require more processing power themselves (e.g. displaying the frame) somehow impact the frame transfer. Maybe you could try something similar with the help of the fakevideosink to see if the pipeline itself is the problem.

My guess at that point was that GStreamer was processing frames slower than the camera tried to submit them, and that this somehow leads to Vimba being starved of buffers or something similar. Perhaps the asynchronous image transfer thread was even not scheduled enough time to successfully receive frames... Perhaps what you are seeing is a similar issue. But since the camera you are using only sends approx. 20 fps at most, this seems odd. And in your logging output I only saw one frame every 2 seconds or so. Unless you are using a very low-powered system, GStreamer should definitely be able to keep up at that rate. So my guess is still that something is messed up with the camera connection at this point.

Unfortunately I do not have any further ideas of what you might try at this point. If the transfer works with the viewer and the asynchronous examples, I do not see why vimbasrc should have any problems. If you feel up to it, you can of course take a look at the examples we provide for asynchronous grabbing and the source code of vimbasrc to see if you can find places where vimbasrc may be optimized further. But I did not see any further low-hanging fruit in the implementation...