Monash-Connected-Autonomous-Vehicle / Twizy-Autoware-Shadow-Repo


Stream cameras from PX2 #5

Open dylan-gonzalez opened 1 year ago

dylan-gonzalez commented 1 year ago

Current plan for the Twizy is to put a new PC in the back to do all the processing.

We will still need the PX2 because the cameras are GMSL and GMSL-USB interfaces are hard to find and super expensive. So we will be streaming the cameras from the PX2 to the new PC across the Twizy's network.

Take a look at GStreamer. You will need to use Driveworks' Sensor Abstraction Layer (SAL) to grab the camera frames and then send these frames using GStreamer.

I think some good first steps would be to:

- Read the Driveworks SAL docs

ZileiChen commented 1 year ago

To see the source code of sample_camera_gmsl, navigate to the path: /usr/local/driveworks/samples/src

Looking through files to find examples and reading through them to understand the code. My basic understanding is that, in order to retrieve camera frames from the cameras, certain Driveworks methods, including dwSensorCamera_readFrame() and dwSensorCamera_getImage(), return an address to the camera frames. In order to write this to GStreamer or anything we want ourselves, we need to make a copy/clone/edit of one of the sample files in /usr/local/driveworks/samples/src/. All source code within these files follows the same format:

Make a new child class that inherits from DriveWorksSample and override the following methods with code of your own:

- onProcess()
- onInitialize()
- onRelease()
- onResizeWindow()
- onKeyDown()
- onRender()

ZileiChen commented 1 year ago

There are details that are not fully understood, such as all the error checks and other parameters/variables.

ZileiChen commented 1 year ago

To install GStreamer, the list of all Ubuntu 16.04 (Xenial) packages can be found here: https://packages.ubuntu.com/xenial/allpackages Not too sure which ones we need, since there are at least 50 different packages.

ZileiChen commented 1 year ago

(screenshot)

It appears that there is no space left.

(screenshot)

I've attempted sudo apt-get clean, autoclean, and autoremove to free up some disk space, but it is still not enough.
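A small helper along these lines can show where the space is actually going before deciding what to remove (biggest_dirs is an illustrative helper written for this note, not an existing tool):

```shell
# biggest_dirs DIR N: print the N largest immediate subdirectories of DIR.
# Assumes GNU du/sort, as on the PX2's Ubuntu.
biggest_dirs() {
  du -h --max-depth=1 "$1" 2>/dev/null | sort -h | tail -n "$2"
}

# Example usage on the root filesystem (ten biggest entries):
# biggest_dirs / 10
df -h /   # overall usage of the root filesystem
```

This is how the Docker data mentioned later in the thread would show up as a single large directory.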

dylan-gonzalez commented 1 year ago

Ok, maybe we'll have to reduce the partition size of the other Tegra?

ZileiChen commented 11 months ago

Found out where a large portion of the storage is being used:

(screenshot)

Removed all unused containers and images using 'docker system prune'

Then installed gstreamer using the following command:

sudo apt-get install gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav

Sourced from https://ubuntuforums.org/archive/index.php/t-2462688.html

Someone explained that Ubuntu comes with the basic GStreamer packages, and these are the only extra ones you need for most GStreamer needs.
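A quick way to confirm what actually got installed (have_tool is an ad-hoc helper for this note, not a GStreamer tool):

```shell
# Report whether a command-line tool is on PATH.
have_tool() {
  command -v "$1" >/dev/null 2>&1 && echo "found $1" || echo "missing $1"
}

have_tool gst-launch-1.0
have_tool gst-inspect-1.0
# Once installed, gst-inspect-1.0 <element> (e.g. x264enc) shows whether a
# given plugin made it in with the packages above.
```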

ZileiChen commented 11 months ago

To stream to another computer, this is a starting point

(screenshot of a gst-launch-1.0 network streaming example)

Sourced from https://gstreamer.freedesktop.org/documentation/tools/gst-launch.html?gi-language=c

Although this may not work, since we need to use the camera abstraction layer to retrieve camera frames and have yet to figure out how to integrate that with GStreamer.

https://github.com/stanislavkuskov/jetson_gmsl_camera_streamer This may also work, but I'm still trying to figure out how it works.

We can convert the gst-launch-1.0 command in the screenshot above into an application to be more specific, since gst-launch-1.0 is, in simple terms, a shortcut for writing an application and a way to test whether pipelines work.

https://forums.developer.nvidia.com/t/nvmedia-capture-with-gstreamer/54252 Someone has attempted this before and failed, but we could take ideas from it and improve on it.

Robin-Chea commented 11 months ago

tried to use:

gst-launch-1.0 v4l2src device= /dev/video0 ! 'video/x-raw, width=1280, height=720, format=UYVY' ! videoconvert ! x264enc speed-preset=ultrafast tune=zerolatency ! rtph264pay ! udpsink host=192.168.1.181 port=5000

Problem: where is the live video actually located? We get this error:

Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not open device '/dev/vcs' for reading and writing.
Additional debug info:
v4l2_calls.c(620): gst_v4l2_open (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
system error: Permission denied
Setting pipeline to NULL ...
Freeing pipeline ...
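Two things may be worth ruling out here: the stray space in "device= /dev/video0" (the shell passes the path as a separate argument, so the device property is likely never set), and plain device permissions. check_dev below is an illustrative helper for this note, not part of GStreamer:

```shell
# check_dev PATH: report whether a device node exists and is read/writable
# by the current user.
check_dev() {
  if [ ! -e "$1" ]; then echo "absent"
  elif [ -r "$1" ] && [ -w "$1" ]; then echo "ok"
  else echo "no-permission"; fi
}

check_dev /dev/video0
# If this prints "no-permission", adding the user to the video group is the
# usual fix on Ubuntu:
#   sudo usermod -aG video "$USER"   # then log out and back in
```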

ZileiChen commented 11 months ago

Attempted to run tutorials given by Gstreamer, ran into certain compilation errors:

(screenshot)

Solved by:

(screenshot)

Running the application then results in:

(screenshot)

Assuming it's not because we have no display connected.

ZileiChen commented 11 months ago

After testing on the PX2 with monitors connected, it still doesn't work. Also, Tegra B cannot run sample_camera_gmsl, whereas Tegra A can and it works. However, running the multiple camera samples will crash and require the PX2 to be restarted for them to work again. This is an issue since it is not possible to SSH into Tegra A.

Robin-Chea commented 11 months ago

Need to check what version the PX2 is on.

(screenshot)

https://forums.developer.nvidia.com/t/camera-name-in-dev-folder/41222 This was posted on 2-16, so we should have the feature? Checking the /dev folder, there is no video0 on Tegra A or B, which means it is located somewhere else.

dylan-gonzalez commented 11 months ago

@Robin-Chea @ZileiChen @dtonda8 Seems that the issue lies specifically with the PX2. I remember that we had the issue with running the multiple camera samples causing the PX2 to crash, but we had it fixed by reflashing the PX2.

I think some good next steps might be to make sure that we can first get gstreamer working on Ubuntu 16 on a normal Hive PC. Once we're confident we have that working, then we can try to start focusing more on the issues with the PX2 I think.

Also @Robin-Chea can you edit your previous message to include the link that screenshot is from?

Might be good to also take a look at the camera crashing log from earlier this year.

Robin-Chea commented 11 months ago

We attempted the first GStreamer tutorial at https://gstreamer.freedesktop.org/documentation/tutorials/basic/index.html However, since the PX2 has no internet connection, it could not play the URL. We instead downloaded and moved a random mp4 video called test.mp4 into the Downloads folder. Since we are not playing from the network, we changed playbin uri=https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm in the tutorial to filesrc location=///home/nvidia/Downloads/test_vid/test.mp4. This allowed the video to be played on the whole screen.

After that we tried to just use the command-line tool gst-launch-1.0, specifically the network streaming section of https://gstreamer.freedesktop.org/documentation/tools/gst-launch.html?gi-language=c

Transmitter:

gst-launch-1.0 v4l2src ! video/x-raw-yuv,width=128,height=96,format='(fourcc)'UYVY ! videoconvert ! ffenc_h263 ! video/x-h263 ! rtph263ppay pt=96 ! udpsink host=192.168.1.1 port=5000 sync=false

Receiver:

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, clock-rate=90000,payload=96 ! rtph263pdepay queue-delay=0 ! ffdec_h263 ! xvimagesink

All we did was change a few things so that it could play our video instead which was done by looking at the video section of https://gstreamer.freedesktop.org/documentation/tools/gst-launch.html?gi-language=c

Transmitter:

gst-launch-1.0 filesrc location=/home/nvidia/Downloads/test_vid/test.mp4 ! decodebin ! x264enc ! video/x-h264,profile=baseline ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.181 port=5000

Receiver:

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink

However, it streamed the video on the PX2 monitor.

This was on the receiver laptop.

After a while we found out that it was because we were running the receiver command on the Tegra, meaning we ran the receiver and transmitter on the same PC, so that's why it appeared on the PX2 monitor.

We then decided to run the transmitter on the laptop using gst-launch-1.0 filesrc location=/home/david/Downloads/test.mp4 ! decodebin ! x264enc ! video/x-h264,profile=baseline ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.181 port=5000 and on the PX2 the receiver command gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink

This allowed the video from the laptop to be streamed over the network to the px2.

At this point I just realised that on the PX2, all we had to do was change the host IP address to the IP address of our laptops, which would allow the video from the PX2 to be streamed onto our laptops. We might test this the next time we go in to work on it.

dylan-gonzalez commented 11 months ago

Awesome!! Next time, if the network in the workshop is down, you can just run an Ethernet cable from the PX2 to the wall (direct to eduroam).

Robin-Chea commented 11 months ago

Streaming from the PX2 to another laptop over the network works. This was done by changing the udpsink host to the specific IP.

We can probably do this for multiple videos using:

gst-launch-1.0 \
  filesrc location=/home/nvidia/Downloads/test_vid/test1.mp4 ! decodebin ! x264enc ! video/x-h264,profile=baseline ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.1.224 port=5000 \
  filesrc location=/home/nvidia/Downloads/test_vid/test2.mp4 ! decodebin ! x264enc ! video/x-h264,profile=baseline ! rtph264pay config-interval=1 pt=97 ! udpsink host=192.168.1.224 port=5001
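The one-liner above could also be generated from a loop, which scales past two files. This is a dry-run sketch (it prints the commands rather than running them; build_senders is an illustrative helper, and the paths/host are the ones from the thread):

```shell
# build_senders FILE...: print one sender pipeline per file, incrementing
# the UDP port and RTP payload type each time.
build_senders() {
  port=5000
  pt=96
  for f in "$@"; do
    echo "gst-launch-1.0 filesrc location=/home/nvidia/Downloads/test_vid/$f ! decodebin ! x264enc ! video/x-h264,profile=baseline ! rtph264pay config-interval=1 pt=$pt ! udpsink host=192.168.1.224 port=$port &"
    port=$((port + 1))
    pt=$((pt + 1))
  done
}

build_senders test1.mp4 test2.mp4
```

Piping the output to sh (or dropping the echo and keeping the trailing &) would actually launch the streams in parallel.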

Next step is to get the camera working.

dylan-gonzalez commented 10 months ago

Hey @ZileiChen @dtonda8 @Robin-Chea I might have a meeting with you guys sometime in the next week or so to see where we're at with this

But in the meantime just work on your new issues

dtonda8 commented 9 months ago

Streaming a single camera works! I played around with gst-launch-1.0 and managed to find one working solution to stream a single camera.

My laptop (receiver):

gst-launch-1.0 -e udpsrc port=5000 caps="application/x-rtp" ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink

PX2 (sender): Streams the camera to a file (assuming test1.h264 is an empty file created previously)

cd /usr/local/driveworks/bin/
./sample_camera_gmsl --write-file=/home/nvidia/Downloads/test1.h264

On new terminal: Reads file and streams it.

gst-launch-1.0 filesrc location=test1.h264 ! h264parse ! avdec_h264 ! videoconvert ! videorate ! video/x-raw,framerate=30/1 ! x264enc tune=zerolatency bitrate=3000 speed-preset=ultrafast ! video/x-h264,profile=baseline ! rtph264pay ! udpsink host=192.168.1.218 port=5000

You'll have to execute the last bash command as soon as the camera has begun writing to the file. This is hard to time right, so I've grouped the commands into shell scripts here.
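The timing workaround in such a grouping script could look like the sketch below: start the camera writer in the background, then block until the capture file has data before launching the streaming pipeline (wait_for_data is an illustrative helper; the paths are the ones above):

```shell
# wait_for_data FILE TIMEOUT: block until FILE exists and is non-empty,
# polling once per second; fail after TIMEOUT seconds.
wait_for_data() {
  i=0
  while [ ! -s "$1" ]; do
    i=$((i + 1))
    [ "$i" -gt "$2" ] && return 1
    sleep 1
  done
}

# ./sample_camera_gmsl --write-file=/home/nvidia/Downloads/test1.h264 &
# wait_for_data /home/nvidia/Downloads/test1.h264 10 &&
#   gst-launch-1.0 filesrc location=test1.h264 ! ...   # streaming pipeline above
```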

https://github.com/Monash-Connected-Autonomous-Vehicle/Twizy-Autoware-Shadow-Repo/assets/99521514/141d179d-bb78-4d9e-b623-c8cbaff3c6c7

The latency is about 600ms. There's quite a few params we could modify.
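One dry-run way to explore those params is to pull them out as variables; the values below match the pipeline above, and the command is printed rather than run:

```shell
# Knobs that plausibly affect latency/quality. Values match the pipeline above.
BITRATE=3000        # kbit/s for x264enc
PRESET=ultrafast    # x264enc speed-preset
FPS=30              # output framerate
HOST=192.168.1.218
PORT=5000

CMD="gst-launch-1.0 filesrc location=test1.h264 ! h264parse ! avdec_h264 ! videoconvert ! videorate ! video/x-raw,framerate=$FPS/1 ! x264enc tune=zerolatency bitrate=$BITRATE speed-preset=$PRESET ! video/x-h264,profile=baseline ! rtph264pay ! udpsink host=$HOST port=$PORT"
echo "$CMD"   # run with: eval "$CMD"
```

Lowering the bitrate or trying a different speed-preset would be the obvious first experiments.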

The sample_camera_multiple_gmsl does not have a parameter to write to files so it's a bit trickier. We could write a shell script to launch each camera individually, or stream .raw files from recordings (probably not this since a 10 sec .raw file recording takes 1 GB of storage🤯).
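The "launch each camera individually" idea could look like the dry-run sketch below. Note that --camera-index is a guessed flag name, not confirmed: check ./sample_camera_gmsl --help for the real per-camera selection arguments.

```shell
# launch_cameras N: print one capture command per camera, each writing its
# own file (pair each file with its own streaming pipeline and port).
# --camera-index is hypothetical; verify against sample_camera_gmsl --help.
launch_cameras() {
  n=0
  while [ "$n" -lt "$1" ]; do
    echo "./sample_camera_gmsl --camera-index=$n --write-file=/home/nvidia/Downloads/cam$n.h264 &"
    n=$((n + 1))
  done
}

launch_cameras 2
```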

The biggest drawback with this method (i.e. writing to a file, then streaming from it) is that the files can get large. 1 second of streaming takes about 0.95 MiB of storage, i.e. about 3.3 GiB/hr. So, for 8 cameras that's about 27 GiB of storage required for an hour of streaming before needing to restart and clear storage.
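Checking the arithmetic (0.95 MiB/s is the figure measured above):

```shell
# Storage cost of stream-to-file capture, from the measured 0.95 MiB/s.
awk 'BEGIN {
  per_sec  = 0.95                    # MiB per second, one camera (measured)
  per_hour = per_sec * 3600 / 1024   # GiB per hour, one camera
  eight    = per_hour * 8            # GiB per hour, eight cameras
  printf "one camera: %.1f GiB/hr; eight cameras: %.1f GiB/hr\n", per_hour, eight
}'
# prints: one camera: 3.3 GiB/hr; eight cameras: 26.7 GiB/hr
```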

(05/01/2024) Update on storage issue above: I believe the code below can directly stream (rather than saving to a file then streaming), hence avoiding the file storage issue:

./sample_camera_gmsl --write-file=/dev/stdout | gst-launch-1.0 -e fdsrc ! h264parse ! avdec_h264 ! videoconvert ! videorate ! video/x-raw,framerate=30/1 ! x264enc tune=zerolatency bitrate=3000 speed-preset=ultrafast ! video/x-h264,profile=baseline ! rtph264pay ! udpsink host=192.168.1.218 port=5000
dtonda8 commented 9 months ago

Possible to stream 2 cameras at ~22 fps using this sender.sh script:

(screenshot)

It does output this error and often quits streaming within 15s:

(screenshot)

I can stream for a while when I first start up the PX2, but the more I test it (in one sitting), the more unreliable it becomes.

Here is the output of top when running sender.sh:

(screenshot)

When using files instead of pipes, streams seem to be more reliable (the only downside is the storage they take).

Single camera streams are still smooth. I might try H.265 instead of H.264.