rodizio1 / EZ-WifiBroadcast

Affordable Digital HD Video Transmission made easy!
GNU General Public License v2.0

Using OpenCV with EZ-WifiBroadcast #71

Open tictag opened 6 years ago

tictag commented 6 years ago

Hello Rodizio,

Just got my first video stream to work [RPi Zero W with ZeroCam to RPi3 using 2 x RT5370-based nano USB adapters] and it looks great! :) Really fantastic work (from you and Befinitiv)!!

My query does not need a comprehensive answer, and I don't mind doing all the digging to get something working, but I don't want to waste my time investigating if my use case simply isn't an option. I plan to take two separate video sources connected to an RPi, do some rudimentary image processing, then merge them together using OpenCV ... and here's the query ... could I then use EZ-WifiBroadcast to transmit that composite video stream to a receiver?

I've read every page of your wiki and everything revolves around broadcasting a source video from Tx to Rx. Basically, can my source video actually be an output stream from OpenCV?

I don't mind putting the hours in to investigate/troubleshoot etc., but not if this use case is simply not doable.

I would appreciate your thoughts.

Oh, and if you don't get back to me before new year, Happy New year!! :)

rodizio1 commented 6 years ago

The wifibroadcast tx (as well as my further developed tx_rawsock) basically just send out what they receive on stdin, so if you can pipe your OpenCV output into it somehow it should work.
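
For reference, the stock transmit command in the image already has this shape (a sketch assembled from the .profile lines quoted later in this thread; the variables are the ones set in /root/.profile, and raspivid is just a stand-in for any program that writes H.264 to stdout):

raspivid -w $WIDTH -h $HEIGHT -fps $FPS -b $BITRATE -g $KEYFRAMERATE -t 0 -o - \
  | /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH \
      -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS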

tictag commented 6 years ago

Ahhh, so it isn't just 'hard wired' to transmit the camera stream; anything 'piped' (I'm not 100% sure what that means just yet) into tx_rawsock should be transmitted. That sounds brilliant. I've just read tbeauchant's post re an IP cam, and s/he is using gstreamer ...

I then edited the .profile file to pipe the output h264 stream from gstreamer into tx_rawsock

Looks like I have my starting point :) Do you have any further docs around tx_rawsock?

Many thanks for all your (and Befinitiv's) hard work on this. Seriously, there is no way I'd ever be able to complete this project without you.

ivanskj commented 6 years ago

I am piping a gstreamer pipeline from a Sony QX1 MJPEG liveview URL to tx: souphttpsrc -> h264enc. I can confirm that it works, but I still have to figure out bitrates and all that stuff to get more range. There is also some more latency on the stream. I will try the tx_rawsock.

geofrancis commented 6 years ago

I would be very interested in using a Sony camera with wifibroadcast; currently I am having to use a B101 HDMI converter. If you could switch between it and the Pi camera it would be awesome.

tictag commented 6 years ago

rodizio1,

Whilst I understand the purpose of EZ-WiFiBroadcast is to simplify setup, i.e. just download a couple of SD card images and you're good to go, I'm going to need to install a whole lot of software to complete my project, e.g. OpenCV, NumPy, Python etc. Is there a 'manual install' doc available? That is, a doc that outlines the steps required to essentially create a working image from, for example, a vanilla Raspbian distribution like Raspbian Stretch Lite?

careyer commented 6 years ago

Maybe the easier way to go is to find a way to boot the EZ-WifiBroadcast image so that it does not autostart all the RX, TX, OSD and Telemetry processes but welcomes you with a login prompt. Connecting the ethernet port to your router would allow for installing all the necessary packages then.

@rodizio1: is there a best practice how to boot the image so that it does not enter transmission / reception mode automatically? I tried variant No. 1 from https://github.com/bortek/EZ-WifiBroadcast/wiki/Logging-into-the-linux-console but it was kind of problematic since there were overlays from the OSD all over the place and my nano editor was in the background ;-) However No. 2 might work just fine... haven't tried this yet. @tictag: Give it a try! ;-)

tictag commented 6 years ago

The second method simply provides you with a remote command prompt much in the same way as CTRL-C does locally.

It would certainly be better for me to have the manual install instructions. I've been trying to install and configure stuff on top of the image but just keep running into problems: bash commands that are not present, make commands (e.g. for numpy) that appear to run/compile forever (48 hours before I quit; it should take 3-4 hours). And of course I'm scared to do any kind of apt-get update/upgrade for fear of modified drivers/firmware being overwritten.

Whilst I do certainly believe that some would prefer the EZ (image-based) installation, for others this might cause more problems than it solves.

It would be great to have a manual install method.

rodizio1 commented 6 years ago

There are no instructions, as I just made changes to the image and did not document them apart from the changelog. Instructions on how to make the changes would need to be constantly updated as Raspbian is constantly changing, and I don't have the time nor motivation for that. Finding out what has changed is quite easy: simply take the default Raspbian Lite image and compare it with my image. "meld" is a good tool for that, for example.
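
A rough sketch of such a comparison (the image names and START_SECTOR values are placeholders; take the real start sectors of the root partitions from the fdisk output):

fdisk -l raspbian-lite.img        # note the start sector of the second (rootfs) partition
fdisk -l ezwifibroadcast.img
sudo mkdir -p /mnt/raspbian /mnt/ezwbc
sudo mount -o loop,ro,offset=$((START_SECTOR_A * 512)) raspbian-lite.img /mnt/raspbian
sudo mount -o loop,ro,offset=$((START_SECTOR_B * 512)) ezwifibroadcast.img /mnt/ezwbc
diff -rq /mnt/raspbian /mnt/ezwbc > changed-files.txt   # or open the two mount points in meld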

In the long run, I want to get away from Raspbian and use something smaller and more manageable like buildroot.

In general:

Regarding logging in: Hmm, the OSD still being in front of the console when logging in locally is indeed not so nice. I've changed the profile script so that it automatically does a "killall osd" to quit the OSD after logging in.

tictag commented 6 years ago

Wow, so this has been a voyage of discovery!! I have now managed to get python3.4.2, numpy1.8.2, gstreamer1.0 and opencv3.4.0 all installed and working simultaneously with EZ-WiFiBroadcast v1.6RC3 on a Raspberry Pi 3. This has required me to: resize file systems, find random files, compile code(!), edit configuration files and solve many other problems along the way but ... at least it works!

I'm working on a fully documented bash script to automate this whole thing and I'll upload once I've tested it for others to use should they wish to.

rodizio1 thank you for your candour, I totally get it. As I am eventually going to be writing an instructable for my project, I will want to start off with a vanilla Raspbian image, that is, I am going to try to use that "meld" tool. What was the exact version of original Raspbian source image?

tictag commented 6 years ago

...p.s. I only put the (!) after 'compile code' because this is the first time I have ever compiled code. Yep, that's how much of a layman I am!

Now that I have everything I need, I'm gonna have to stop wifibroadcast from snagging the hardware (i.e. camera) and instead have OpenCV do this. OpenCV then needs to process the video streams before piping them out through gstreamer to the wifibroadcast tx_rawsock device.

Why does it sound so easy when it's just words? ;)

lgangitano commented 6 years ago

@tictag I'm interested in your solution, since I'm experimenting with the same setup (video feed multicasted to OpenCV and EZ-Wifibroadcast) for realtime object tracking and streaming. Would you share your results performance-wise on Pi3?

tictag commented 6 years ago

Of course, happy to. I'm just at the point where I'm adding my thermal stream, so tomorrow I'll probably be looking at piping the streams into wifibroadcast, as opposed to it capturing the stream itself. On the receive side, I'll be extracting the two streams from wifibroadcast and piping them into OpenCV for further processing.

...and I have no idea how to do this yet!! Don't let me sound like I know what I'm doing! ;)

tictag commented 6 years ago

rodizio1 thank you for your candour, I totally get it. As I am eventually going to be writing an instructable for my project, I will want to start off with a vanilla Raspbian image, that is, I am going to try to use that "meld" tool. What was the exact version of original Raspbian source image?

Bump...

rodizio1 commented 6 years ago

Sorry, I never wrote that down (and in retrospect I found out that there is no version number or similar inside the Raspbian images ...)

What I remember is that version 1.0 was released around the 15th of May 2016 and used kernel 4.4.9 or 4.4.11, so it must be a Raspbian release from around that time with that kernel.

You can find the old Raspbian releases and changelogs here: http://downloads.raspberrypi.org/raspbian/images/ http://downloads.raspberrypi.org/raspbian/release_notes.txt

careyer commented 6 years ago

@tictag : I am very interested in the automated bash script that you created to automate the installation of additional components to EZ-WifiBroadcast1.6RC3

I'm working on a fully documented bash script to automate this whole thing and I'll upload once I've tested it for others to use should they wish to.

In my use case I need to install the following components:

Being a total Linux noob, this might help me get started with a bit less trouble. Thank you very much in advance!

BTW: Patrick Duffy from DIY Drones (http://diydrones.com/profiles/blogs/thermal-imaging-for-your-drone-on-a-budget) sent me a ready-to-run image demoing the integration of the FlirOne with the Raspberry. It works flawlessly. However, it does not build on WiFiBroadcast but on normal WiFi streaming via gstreamer and fixed IPs over the local WLAN. It also supports transmission of multiple video streams (thermal image, HD video from the Flir & RaspiCam video) - i.e. I am also following your progress in #76

tictag commented 6 years ago

Happy to help a fellow noob! If you only need gstreamer then my script probably won't help so much; tbh, the most complicated thing (for me) has been compiling OpenCV. Mind you, it should help with compiling the kernel. Definitely we can work together on this :) (... blind leading the blind ;)

careyer commented 6 years ago

@tictag : That is good news! I will try to install gstreamer first (I suppose the way to go is to do it the regular way with apt-get?) and then get back to you. I think the FlirOne device driver needs to be compiled as well :-|. Am I correct that you are using this driver for your project as well? Last (and probably most complicated) will be recompiling the kernel in order to get v4l2loopback support added. For this I definitely need some help.

BTW: Here is a screenshot of what I was able to achieve yesterday evening with the Patrick Duffy image. The frame rate was surprisingly good - I believe it was definitely more than the regular 8-9 fps. It felt more like 15-20 fps, which I was positively surprised about:

tictag commented 6 years ago

Looking hot! Ahem, sorry...

Yes, gstreamer just via apt-get, though I did have a few issues installing. To resolve them:

# Missing packages during installs
apt-get update  # do not 'upgrade'

# Running out of diskspace installing most things
nano /etc/fstab  # for device /tmp (ramdisk used as temp scratchdisk), change size=50M, CTRL-X, Y to save then reboot
resize2fs /dev/mmcblk0p2  # this is the 2nd SD card partition, mounted as /dev/root (the root filesystem)
df -h  # make sure /dev/root size is the same size as your partition and there's plenty space 'avail'

# 'Fuse' directory not found during gstreamer install
mkdir /dev/fuse
chmod 777 /dev/fuse
apt-get install fuse
apt-get install gstreamer1.0

That should get you gstreamer installed.
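
A quick sanity check afterwards (a sketch; gst-inspect-1.0 ships with the gstreamer tools, and omxh264enc is the hardware encoder used in the pipelines later in this thread):

gst-inspect-1.0 --version
gst-inspect-1.0 omxh264enc | head -n 5   # should print plugin details rather than "No such element"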

tictag commented 6 years ago

...oh, I forgot to answer your question: no, I'm not using the FLIR One. I'm using the FLIR Lepton OEM module on a breakout board, connected to an RPi Zero via SPI (Serial Peripheral Interface). I'm using pylepton to control it, OpenCV for image processing and gstreamer to stream it via wifibroadcast's tx_rawsock.

Does the FLIROne connect via USB?

careyer commented 6 years ago

Yes, the FlirOne connects via USB and features both a thermal camera and an HD camera for visible light. Both video streams can be accessed with the Linux driver.

Update:

rodizio1 commented 6 years ago

Say, what are you guys installing there exactly? Gstreamer is already installed (used for encapsulating the raw h264 video for Mission Planner), and there is no package called "gstreamer1.0".

careyer commented 6 years ago

Hi rodizio... We are trying to get some FLIR thermal cameras working with EZ-Wifibroadcast. In order to make the Flir cameras work we need a very complicated gstreamer pipeline with modules from gstreamer good, bad and ugly... that is why we need the full-blown gstreamer suite.

apt-get update #(not upgrade)
apt-get install gstreamer1.0

installs the whole suite and works flawlessly as far as I can tell. It installs all the necessary additional modules.

I also succeeded in installing the FlirOne USB driver (flir8p1-gpl); however, now I am stuck installing the v4l2loopback kernel module. :-(

I did:

sudo apt-get install linux-headers-rpi
sudo wget https://raw.githubusercontent.com/notro/rpi-source/master/rpi-source -O /usr/bin/rpi-source && sudo chmod +x /usr/bin/rpi-source && /usr/bin/rpi-source -q --tag-update
rpi-source
sudo apt-get install bc
rpi-source

git clone https://github.com/umlaeute/v4l2loopback
cd v4l2loopback
make
make install

Everything runs just fine without any errors, but the kernel module v4l2loopback won't show up:

root@wifibroadcast(rw):~/v4l2loopback# sudo modprobe v4l2loopback
modprobe: FATAL: Module v4l2loopback not found.
root@wifibroadcast(rw):~/v4l2loopback#

Any ideas?
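
One way to narrow this down (a sketch, assuming the module was built in the current directory): the vermagic string baked into the .ko has to match the running kernel exactly, and make install has to drop the module somewhere under /lib/modules/$(uname -r).

uname -r                                                  # running kernel release
modinfo ./v4l2loopback.ko | grep vermagic                 # kernel the module was built for
find /lib/modules/$(uname -r) -name 'v4l2loopback.ko*'    # where (or whether) it was installed
sudo depmod -a && sudo modprobe v4l2loopback              # rebuild the module index and retry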

careyer commented 6 years ago

@rodizio1 : I think I now know what the problem is with compiling the v4l2loopback kernel module.

The commands above load the kernel sources (or headers) for 4.9.35-v7+, whereas EZ-WifiBroadcast v1.6RC3 runs on 4.9.35-v7 (without the +). Maybe that is why the kernel module compiles okay but does not get installed properly?

Can you please advise me on how to install the correct kernel sources/headers for 4.9.35-v7 (and how exactly to apply the patches you made)? I took a look at https://github.com/bortek/EZ-WifiBroadcast/tree/master/kernel/readme.md but I do not completely understand how to do it. You know... me just being a Linux noob!

I suppose I do not have to recompile the whole kernel, only the kernel module, but I do not get how to do it correctly. :-(

Thank you so much! Your help is so much appreciated

careyer commented 6 years ago

Okay, success! I finally solved the problem... rpi-source always adds a '+' to the release number of kernels downloaded from GitHub to indicate that it is a local copy. The make process of the v4l2loopback kernel module copes just fine with that, but unfortunately modprobe gets into serious trouble if the kernel release string differs even slightly. It simply won't start, and make install will also copy the compiled module to a wrong destination.... GRRRR!! I had to manually alter the release number string all over the filesystem (hidden files, config files.... god knows where I found this damn attached '+'). Now it works! The FlirOne is now functional in the EZ-WifiBroadcast image 1.6RC3... Now I just have to figure out how to pipe its output to the tx pipeline.

Cheers!

careyer commented 6 years ago

@rodizio1 CC: @ivanskj / @tictag / @tbeauchant I was finally able to get all the prerequisites for operating the FlirOne thermal camera right. The driver is installed and is linked to the /dev/video3 device on the system. I also have a working gstreamer pipeline. However, as of now this gets streamed via a standard WiFi connection (UDP connection via a separate NIC) and not via WifiBroadcast.

I do not fully understand yet how I can pipe my gstreamer output to WifiBroadcast and have it transferred. I found the tx_rawsock call in .profile but I do not understand all the parameters. Can you please help me to make that happen and guide me a little bit on what needs to be changed and where (on the Air and Ground side respectively)? That would be AWESOME! 👍

My Gstreamer pipeline looks like this now: gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128, framerate=1/10 ! rtpvrawpay ! udpsink host=192.168.0.100 port=9002

I feel the low-budget DIY thermal drone is only one step away. ;-) Cheers & thank you!

tictag commented 6 years ago

I've yet to figure this out myself, but I do have a novice understanding of 'piping' in Linux. Piping is where you take the output of one process and feed it to the input of another; ever see that cult film 'The Human Centipede'? So,

ls -p > directory.txt

... would send the current directory listing into a file instead of printing it to the console (strictly speaking that's redirection; a true pipe feeds the output into another program rather than a file).

I've seen examples on the t'interweb where the output of raspivid (the default application used for pi cameras) is then 'piped' into the tx_rawsock application, thus:

raspivid <parameters> | tx_rawsock <parameters>

I'm not quite sure yet how to do this with OpenCV but as I'll be doing the image processing using Python, I'll likely be using the cv.ImageWriter() method, piped into gstreamer, which in turn pipes into tx_rawsock.
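
A rough sketch of what that chain could look like, under a few assumptions (process_frames.py is a hypothetical OpenCV script that writes raw BGR frames to stdout; the frame geometry is made up, the tx_rawsock parameters mirror the .profile variables, and -q keeps gst-launch's own status messages off stdout):

python3 process_frames.py \
  | gst-launch-1.0 -q fdsrc fd=0 \
      ! videoparse format=bgr width=640 height=480 framerate=30/1 \
      ! videoconvert ! omxh264enc ! h264parse ! fdsink fd=1 \
  | /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH \
      -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS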

Btw, if you have seen 'The Human Centipede', I apologise for associating that imagery with Linux piping!! ;)

ivanskj commented 6 years ago

Use fdsink as the last element in gstreamer


careyer commented 6 years ago

Thank you! I am familiar (as far as a Linux noob like me can be) with the concept of piping in Linux. However, I am struggling a little bit with the actual combination of things - to be precise:

I think that I need to change the following line(s) in /root/.profile.

Transmitting the video at the AirPi:

Line 688: nice -n -9 raspivid -w $WIDTH -h $HEIGHT -fps $FPS -b 3000000 -g $KEYFRAMERATE -t 0 $EXTRAPARAMS -ae 40,0x00,0x8080FF -a "\n\nunder-voltage or over-temperature on TX!" -o - | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

Line 717: nice -n -9 raspivid -w $WIDTH -h $HEIGHT -fps $FPS -b $BITRATE -g $KEYFRAMERATE -t 0 $EXTRAPARAMS -a "$ANNOTATION" -ae 22 -o - | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

Receiving and displaying the video at the GroundPi:

Line 836ff:

tmessage "Starting RX ... (FEC: $VIDEO_BLOCKS/$VIDEO_FECS/$VIDEO_BLOCKLENGTH)"
ionice -c 1 -n 3 /root/wifibroadcast/rx -p 0 -d 1 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH $NICS | ionice -c 1 -n 4 nice -n -10 tee >(ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo2 > /dev/null 2>&1) >(ionice -c 1 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo4 > /dev/null 2>&1) >(ionice -c 3 nice /root/wifibroadcast_misc/ftee /root/videofifo3 > /dev/null 2>&1) | ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo1 > /dev/null 2>&1
RX_EXITSTATUS=${PIPESTATUS[0]}
check_exitstatus $RX_EXITSTATUS
ps -ef | nice grep "$DISPLAY_PROGRAM" | nice grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | nice grep "rx -p 0" | nice grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | nice grep "ftee /root/videofifo" | nice grep -v grep | awk '{print $2}' | xargs kill -9
ps -ef | nice grep "cat /root/videofifo" | nice grep -v grep | awk '{print $2}' | xargs kill -9
done

Any help or recommendations would be greatly appreciated.

PS: Right now my TX pipeline is:

gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128, framerate=1/10 ! rtpvrawpay ! udpsink host=192.168.0.100 port=9002

And my RX pipeline is:

gst-launch-1.0 udpsrc port=9002 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW, sampling=(string) YCbCr-4:2:0, depth=(string)8, width=(string)160, height=(string)128, colorimetry=(string)BT601-5, payload=(int) 96" ! rtpvrawdepay ! autovideosink

tbeauchant commented 6 years ago

@tictag @careyer

Piping is the process of grabbing the stdout of one program and feeding it into the stdin of another. In the .profile script this is what is achieved on lines 688 & 717: raspivid ... | tx_rawsock

Raspivid outputs the h264 data to its stdout; this data is grabbed and sent to the stdin of tx_rawsock, which in turn does its magic to send it over the wifi.

If you'd like to achieve the same thing with gstreamer, you'll have to replace the raspivid part of the command with your gstreamer pipeline and make sure gstreamer outputs to stdout. This can be achieved by using the following sink: gstreamer .... ! filesink location=/dev/stdout. Be aware that the video stream is not the only data that gstreamer will output on stdout. Every line displayed in the console when you run the software is also on stdout, so depending on the situation, this may or may not cause issues in the datastream.

In order to get rid of this issue, I have created a fifo using mkfifo, and modified tx_rawsock to use it (look in the source code, there is a line where it opens the /dev/stdin, replace it with the name of the fifo file you created). Then in gstreamer simply output the data to the fifo: filesink location=/path/to/my/fifo

A fifo is simply a special type of file on the filesystem that acts as a buffer.
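
A minimal sketch of that fifo setup (paths and pipeline details are placeholders; redirecting tx_rawsock's stdin from the fifo is an untested alternative to patching the source, the idea being that only filesink writes into the fifo, so no console chatter ends up in the stream):

mkfifo /root/videoin
gst-launch-1.0 -q v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128 \
    ! videoconvert ! omxh264enc ! h264parse ! filesink location=/root/videoin &
/root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH \
    -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS < /root/videoin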

Also, the reason tx_rawsock is used twice in the .profile is to output different text messages on the display depending on whether the script has detected some issues or not. Usually for testing I disable both these lines and run the command line myself from the terminal.

Good luck, Theo

careyer commented 6 years ago

@tbeauchant : Great, I did some research on my own and came up with a similar approach in the meantime. However, your hint with the fifo buffer is worth gold 🥇. Sounds like the implementation on the AirPi side of things is straightforward and quite easy.

What worries me most is how to process the gstreamer data at the GroundPi. I do not understand what this line of code exactly does and where the actual video decoding and displaying via HDMI takes place.

Line 837 in .profile:

ionice -c 1 -n 3 /root/wifibroadcast/rx -p 0 -d 1 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH $NICS | ionice -c 1 -n 4 nice -n -10 tee >(ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo2 > /dev/null 2>&1) >(ionice -c 1 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo4 > /dev/null 2>&1) >(ionice -c 3 nice /root/wifibroadcast_misc/ftee /root/videofifo3 > /dev/null 2>&1) | ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo1 > /dev/null 2>&1

I guess I have to put my gstreamer decoding pipeline somewhere... but where? Thanks in advance! - Thomas

rodizio1 commented 6 years ago

I have explained that tee/ftee/fifo thing sometime ago on the rcgroups thread (for telemetry but it's the same principle for video), if you search for my username and 'ftee' or 'telemetryfifo', you should find it.

For using gstreamer instead of hello_video for display, check the 'DISPLAY_PROGRAM' variable, it should be possible to change that (it's set depending on fps configured, you probably don't need that with gstreamer).

Regarding the kernel: I think the easier and cleaner way would've been not using rpi-source and simply downloading and extracting/compiling the sources from the link in the /kernel directory readme here.

careyer commented 6 years ago

@rodizio1 : I have explained that tee/ftee/fifo thing sometime ago on the rcgroups thread (for telemetry but it's the same principle for video), if you search for my username and 'ftee' or 'telemetryfifo', you should find it.

Thank you very much! I think I found it. For better reference I will link it here: https://www.rcgroups.com/forums/showpost.php?p=36551372&postcount=1897 . As far as I understand now, I do not need to change anything at the tee mimic, just replace the display program?

@rodizio1 : For using gstreamer instead of hello_video for display, check the 'DISPLAY_PROGRAM' variable, it should be possible to change that (it's set depending on fps configured, you probably don't need that with gstreamer).

Hilarious, I also found that in .profile, lines 1644-1658:

if [ "$FPS" == "59.9" ]; then
    DISPLAY_PROGRAM=/opt/vc/src/hello_pi/hello_video/hello_video.bin.48-mm
else

    if [ "$FPS" -eq 30 ]; then
    DISPLAY_PROGRAM=/opt/vc/src/hello_pi/hello_video/hello_video.bin.30-mm
    fi
    if [ "$FPS" -lt 60 ]; then
    DISPLAY_PROGRAM=/opt/vc/src/hello_pi/hello_video/hello_video.bin.48-mm
#   DISPLAY_PROGRAM=/opt/vc/src/hello_pi/hello_video/hello_video.bin.240-befi
    fi
    if [ "$FPS" -gt 60 ]; then
    DISPLAY_PROGRAM=/opt/vc/src/hello_pi/hello_video/hello_video.bin.240-befi
    fi
fi

I can replace hello_video.bin with gstreamer here. I am just wondering how I can specify all of the gstreamer pipeline parameters here (they are quite massive: udpsrc port=9002 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW, sampling=(string) YCbCr-4:2:0, depth=(string)8, width=(string)160, height=(string)128, colorimetry=(string)BT601-5, payload=(int) 96" ! rtpvrawdepay ! autovideosink).
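
One possible way to handle the long pipeline (an assumption, not a tested setup): the .profile appears to feed $DISPLAY_PROGRAM through a pipe (it later kills "cat /root/videofifo" processes), so a tiny wrapper script that reads the stream from stdin can hold the whole pipeline, and DISPLAY_PROGRAM can simply point at the script. The script name and the caps for the 160x128 raw stream below are guesses.

cat > /root/gst_display.sh << 'EOF'
#!/bin/bash
# hypothetical wrapper: display a raw I420 160x128 stream arriving on stdin
exec gst-launch-1.0 -q fdsrc fd=0 ! videoparse format=i420 width=160 height=128 framerate=10/1 \
    ! videoconvert ! fbdevsink
EOF
chmod +x /root/gst_display.sh
# then, in .profile:
DISPLAY_PROGRAM=/root/gst_display.sh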

@rodizio1 : Regarding the kernel: I think the easier and cleaner way would've been not using rpi-source and simply downloading and extracting/compiling the sources from the link in the /kernel directory readme here.

Thankfully this problem is solved. In fact I did something similar to your suggestion: I used rpi-source with the "--uri" parameter and specified the Raspberry 4.9.35 kernel sources you have linked in this very Git repo under /kernel. Works just fine. The only problem was that rpi-source adds a "+" to the kernel.release string to indicate that this was a kernel downloaded from a GitHub source. As a consequence, the kernel module I needed to compile was built with a differing kernel.release string, causing problems at runtime. The problem was solved by manually eliminating the "+" in three different files before the compile. Works like a charm ;-).

BTW: The files to remove the "+" Character in are:

/root/linux/.scmversion
/root/linux-be2540e540f5442d7b372208787fb64100af0c54/include/config/kernel.release
/root/linux-be2540e540f5442d7b372208787fb64100af0c54/include/generated/utsrelease.h

rodizio1 commented 6 years ago

Thank you very much! I think I found it. For better reference I will link it here: https://www.rcgroups.com/forums/showpost.php?p=36551372&postcount=1897 .

I meant this one, it describes the telemetry part, but it's the same logic for video: https://www.rcgroups.com/forums/showpost.php?p=37651300&postcount=2792

As far as I understand now I do not need to change anything at the tee-mimic. Just need to replace the displaying program?

It depends on what you want to do and how you want to do it. Replacing the display program with your gstreamer pipeline should work. But then you only have one stream at the rx? I thought you guys wanted to send two different video streams to the rx? In that case you'd have to run another rx and the tee/ftee/fifo logic described in the post above.
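
A rough sketch of what such a second link could look like (the port number 2, the 4/2/750 FEC values and the pipeline are purely illustrative; pick a port the image does not already use for telemetry or R/C):

# AirPi: second tx_rawsock instance fed by the FLIR pipeline
gst-launch-1.0 -q v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128 \
    ! videoconvert ! omxh264enc ! h264parse ! fdsink fd=1 \
  | /root/wifibroadcast/tx_rawsock -p 2 -b 4 -r 2 -f 750 -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

# GroundPi: matching rx instance, written into its own fifo for whatever display pipeline reads it
mkfifo /root/videofifo_flir
/root/wifibroadcast/rx -p 2 -d 1 -b 4 -r 2 -f 750 $NICS > /root/videofifo_flir &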

I'd recommend reading some bash scripting tutorial and maybe also a basic tutorial that explains some basic unix concepts like I/O, redirection, pipes, stdin/stdout/stderr, sockets, shell, processes, some basic shell commands, special characters and escaping them etc.

careyer commented 6 years ago

@rodizio1 / @ivanskj : Thank you very much! I followed your recommendation and studied all sorts of things (tutorials, manpages...) that I could find with a little help from Google! =) Indeed I made considerable progress: I could successfully transfer the gstreamer datastream via tx_rawsock to the ground. The OSD on my GroundPi shows that those packets do indeed arrive. Since the FlirOne has only a low resolution of 160x120 pixels it needs considerably less bandwidth, which is also reflected in a lower packet count displayed via the OSD. So far so good 👍

The very last thing I am struggling with is displaying the received data via HDMI. I get an "invalid RTP payload" error from gstreamer:

It outputs approximately 50 of those lines scrolling down the screen and then finally gives up.

May I kindly ask you to have a quick glance at my edited .profile file? I introduced a new variable FLIRONE="Y" at the top of the file and marked each location that I have edited with ### FLIRONE ### to allow for an easy understanding of what has been changed. All edits/changes follow this pattern:

    ### FLIRONE ###
    if [ "$FLIRONE" == "Y" ]; then
       new code 
    else 
       original code
    fi

Here is a link to my File: http://www.ip-networx.de/content/GitHub/editedProfile.txt

Any help is much appreciated! Thanks in advance! -- Cheers! Thomas

ivanskj commented 6 years ago

I am also transmitting a lower-res picture via wbc. It is the liveview from a Sony QX1 at about 640x480, and I am struggling to understand the datarate and bitrate settings. As I understand from the wiki, a lower bitrate equals longer range. But how low can I go, and how can I confirm that the txpower is actually maxed? (Atheros)

Does anyone know how much latency is added when displaying the video feed in QGroundControl via USB tether?

careyer commented 6 years ago

@rodizio1 : Updated my above post... I now figured out where the video gets displayed... However the data received seems not to be in the anticipated format. (error: invalid RTP payload)

The problem might be caused by trouble transferring RTP through a FIFO: https://stackoverflow.com/questions/24530911/gstreamer-rtpvp8depay-cannot-decode-stream When going through a named pipe, the RTP may not be packetized properly. (Via a UDP connection my pipeline works OK.) I'm stuck! :-( ...

@ivanskj Maybe you are familiar with this problem? How have you solved it with your Sony QX1? It sounds as if you are already one step further here. (BTW: txpower should already be maxed out for Atheros by default.)

ivanskj commented 6 years ago

I have to check my notes, but if I remember correctly I did not use rtp pay/depay. I can post my complete gstreamer pipeline when I get home.

Are you using h264?

careyer commented 6 years ago

@ivanskj : Thank you! Your pipeline would be very much appreciated. I think RTP is indeed the cause of the trouble, since it is a packet-oriented network protocol usually supposed to go over UDP. Rethinking things, Rodizio's tx_rawsock seems to be a transparent tunnel for a bytestream (is that correct?). It just makes sense that the receiving side can't determine what the heck to do with a constant stream of bytes in which no RTP packets can easily be delimited.

Unfortunately I am not using h264 but a raw videostream.

I think I just have to leave out the RTP elements in the gstreamer pipeline:

@TX: instead of:
gst-launch-1.0 v4l2src device=/dev/video3 ! "video/x-raw ..." ! rtpvrawpay ! filesink location=/dev/stdout
@TX: use:
gst-launch-1.0 v4l2src device=/dev/video3 ! "video/x-raw ..." ! filesink location=/dev/stdout

and for the receiver side of things:

@RX: instead of:
gst-launch-1.0 filesrc location=/root/videofifo1 ! "application/x-rtp ..." ! rtpvrawdepay ! fbdevsink
@RX: use:
gst-launch-1.0 filesrc location=/root/videofifo1 ! "application/x-rtp ..." ! fbdevsink
or:
gst-launch-1.0 filesrc location=/root/videofifo1 ! "application/x-rtp ..." ! videoconvert ! fbdevsink

Hoping that the fbdevsink element will deal with the input (autovideosink somehow cannot be used without running an X server).

Anyhow, your pipeline is very welcome. I am an absolute beginner with gstreamer and might just be missing the magic thingy. Thanks!

ivanskj commented 6 years ago

I had a hard time figuring out how to open that raw .img file of the SD card on my Mac, but I got it eventually!

My pipeline looks like this: gst-launch-1.0 souphttpsrc location=http://192.168.122.1:8080/liveview/liveviewstream is-live=true do-timestamp=true ! image/jpeg,width=640,height=424 ! jpegparse ! omxmjpegdec ! omxh264enc control-rate=1 target-bitrate=$BITRATE ! h264parse config-interval=5 ! fdsink | nice -n -9 /root/wifibroadcast/tx -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

You would have to experiment with the input part of your pipeline since yours is a bit different from mine. I get an MJPEG stream from the IP address of the camera. The JPEGs are parsed, decoded and then encoded into an h264 stream. h264 is very good for streaming over a solution like this. Surely there could be many improvements made to this, but I am not well versed enough in gstreamer to say, yet.

This is on v1.5 of WBC, so any adaptations to 1.6 need to be considered. I am waiting for the 1.6 final before I resume the hunt for long range.

If you encounter any problems with the omx parts, let me know and I will check which version of gstreamer I am using, because I remember experimenting with different gstreamer builds. Omx is the hardware-accelerated framework used on the RPi.

Hope this helps in any way

careyer commented 6 years ago

@ivanskj : Thanks! I wanted to avoid encoding the video to h264 and rather send the raw video... I will give it a try anyway. What does your decoding pipeline look like? Encoding seems to work, but decoding gives me no output (and no error either).

ivanskj commented 6 years ago

The RX is untouched since I am using H264. I wanted to test uncompressed MJPEG but I could not find a Mission Planner app that shows a JPEG stream. H264 is good at recovering from packet loss.

What is the reason for using raw? What is displaying the received data?

ivanskj commented 6 years ago

One way to test is to use filesink and try to play the file locally before you send it through tx. It has to be packaged and muxed correctly, e.g. mkv.
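
A concrete version of that test could look like this (a sketch; the caps are assumptions and omxplayer is just one convenient local player):

# record ~300 frames into a Matroska file, then play it back locally before involving tx at all
gst-launch-1.0 -e v4l2src device=/dev/video3 num-buffers=300 ! video/x-raw,width=160,height=128 \
    ! videoconvert ! omxh264enc ! h264parse ! matroskamux ! filesink location=/tmp/test.mkv
omxplayer /tmp/test.mkv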

ivanskj commented 6 years ago

What type of format is /dev/video3 sending?

careyer commented 6 years ago

@ivanskj: Oh man... I did not consider that hello_video is an h264 player after all. I tried with gstreamer for hours and hours, but it does not display anything if the datarate is too low (the thermal camera has a negligible datarate due to its low resolution). Encoding to h264 and using hello_video for playback, it worked right out of the box :-) Thanks man!

Nevertheless, I think raw video might be the better solution here for two reasons:

  1. the initial proof of concept using RTP/UDP had considerably less lag
  2. the datarate of the h264 encoded 160x120 video is so tiny that WifiBroadcast even struggles at times

@rodizio1 : Regarding No.2 the packet counter indicates that only very few packets do arrive. I think I may have to tweak the VIDEO_BLOCKS / VIDEO_FECS / VIDEO_BLOCKLENGTH variables to get less lag and more packets per timeframe arriving? Here is how it looks right now:

Video: https://youtu.be/5aDpvRuO9cI

Here is a little update on my attempt to transmit rawvideo instead:

I avoid using rtp in my gstreamer pipeline now and instead transfer only the raw datastream as is.

I decided to keep things as simple as possible and, instead of using my FLIR, use the nifty little videotestsrc test pattern included with gstreamer. I first built a local gstreamer pipeline (encode -> decode -> display) to check everything works OK:

gst-launch-1.0 videotestsrc ! video/x-raw ! videoparse ! autovideoconvert ! fbdevsink

This works flawlessly and yields the expected result, a clean test pattern: https://youtu.be/m8TwNFyYmSs

I then split this working pipeline between my AirPi and GroundPi as follows:

AirPi: gst-launch-1.0 videotestsrc ! video/x-raw ! fdsink fd=1 | tx_rawsock ...
GroundPi: gst-launch-1.0 filesrc location=/root/videofifo1 ! videoparse ! autovideoconvert ! fbdevsink

This however results in a disturbed test pattern: https://youtu.be/soXGVjsfk4A

I also did a short video to explain and demonstrate things. When transferring the video from my FLIR (which has a small OSD showing °C) I can see parts of this OSD flickering in the messed-up output from time to time, i.e. data from the camera is arriving but for some reason gets mixed up.

Video: https://youtu.be/4QUvAJTst7A

@rodizio1 : Any idea what might be the reason for the raw video getting mixed up? Am I doing something wrong? I am so close to getting this thing working - I do not want to give up on this. Thanks!

careyer commented 6 years ago

@ivanskj

What type of format is /dev/video3 sending?

I don't exactly know what format it is. I just know that this rtp pipeline worked like butter for it.

TX: gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128, framerate=1/10 ! rtpvrawpay ! udpsink host=192.168.0.100 port=9002

RX: gst-launch-1.0 udpsrc port=9002 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)RAW, sampling=(string) YCbCr-4:2:0, depth=(string)8, width=(string)160, height=(string)128, colorimetry=(string)BT601-5, payload=(int) 96" ! rtpvrawdepay ! autovideosink

ivanskj commented 6 years ago

I had latency of approx. 120-150 ms when using h264 encoding; you need to use the omx elements and check that you have sufficient gpu_mem if you go that route.

Maybe you are missing the pay/depay parts and the received data is not put together correctly? Try with the jpegpay element as a test.

rodizio1 commented 6 years ago

Sorry, I have no idea about gstreamer. In general, check CPU load when doing things in software.

Regarding the latency and wbc: with 8/4/1024 FEC it'll wait until it has 8x1024 bytes of data, then add FEC packets and send them out. The rx waits until it has received those eight 1024-byte packets (and the FEC packets if data was missing).

So if your Flir only puts out, let's say, 3 Mbit/s instead of the default 6 Mbit/s, you'd need to halve either the packet size or the number of packets to maintain the same wbc transmission latency, or use a combination of both. I'd try 6/3/750 or 4/2/750.
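
The arithmetic behind that suggestion, spelled out (assumed back-of-the-envelope numbers, not from the thread):

# one FEC group is VIDEO_BLOCKS x VIDEO_BLOCKLENGTH bytes of video data
#   8 x 1024 bytes = 8192 bytes = 65536 bits  -> ~11 ms to fill at 6 Mbit/s, ~22 ms at 3 Mbit/s
#   6 x 750  bytes = 4500 bytes = 36000 bits  -> ~12 ms to fill at 3 Mbit/s, i.e. back to roughly the default latency
echo "8/4/1024 group: $((8*1024*8)) bits, 6/3/750 group: $((6*750*8)) bits"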

ivanskj commented 6 years ago

Thank you for the clarification regarding FEC. That will definitely help a lot!

I experienced the RX overriding the FEC value to 1400 due to CTS_protection, even when it was disabled in the config. Any pointers as to why this could be happening?

careyer commented 6 years ago

@rodizio1 : Thank you very much for the excellent explanation of VIDEO_BLOCKS / VIDEO_FECS / VIDEO_BLOCKLENGTH and how they act together. That makes perfect sense (and is much more intuitive to understand than the explanation in the Wiki ;-). You should consider adding this information there. It really helps a lot.

Is there any way to test whether some additional/unexpected data gets in between the raw video packets while being transmitted via tx_rawsock? It pretty much looks as if some additional bytes get injected here and therefore corrupt the rendering of the raw video frames on the GroundPi. I believe it must be something like this, because I can clearly see parts of the video test pattern flickering at wrong positions in the output. As far as I understand, tx_rawsock transfers a stream of data transparently to the rx, am I right?

So in theory it should be possible to transform a pipeline like (simplified):

(Pi1)rawvideo -> (Pi1)display (i.e. locally displaying the video)

into:

(Pi1)rawvideo -> (Pi1)tx_rawsock -> (Pi2)rx -> (Pi2)display (i.e. streaming video from Pi1 to Pi2)

That is what I do with the above pipelines, which results in a mixed-up rendering of the content.

The reason why this works with h264 encoded frames is simple:

Even if additional/unexpected data is transferred together with the h264 frames via tx_rawsock, the GroundPi will only look for valid h264 frames in the output of rx, render them and discard the rest, whereas in a raw video scenario each additional byte gets interpreted as content of the video and thus disturbs the anticipated output. Maybe something else (other than the raw video) from stdout gets into the tx_rawsock tunnel? I connected my AirPi to HDMI and observed that a counter of transmitted packets is displayed/incremented during transmission. Since this output is displayed on the console (stdout?), does it get transferred via tx_rawsock as well?

Maybe this might also be the problem? From @tbeauchant:

Be aware that the video stream is not the only data that gstreamer will output on stdout. Every line displayed in the console when you run the software is also on stdout, so depending on situations, this may or may not cause issues in the datastream.

I cannot comprehend though that gstreamer writes something to stdout other than the video. Can I check that somehow? @tbeauchant recommends writing the video to a custom fifo (created with mkfifo) instead of stdout to avoid this problem, and tweaking tx_rawsock to read from this custom fifo instead of stdin. I tried to modify tx_rawsock but it gives me errors on compile (missing references). It would be great to see a version of it that is able to read from a custom fifo file if one is specified (maybe add an additional launch parameter? -i input, default /dev/stdin).
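
One way to check that (a sketch; gst-launch-1.0 prints its own progress lines such as "Setting pipeline to PAUSED ..." on stdout unless it is started with -q/--quiet, which is a likely source of the extra bytes):

# write the same short test stream to a file with and without -q and compare
gst-launch-1.0    videotestsrc num-buffers=30 ! video/x-raw,width=160,height=128 ! fdsink fd=1 > /tmp/with_messages.raw
gst-launch-1.0 -q videotestsrc num-buffers=30 ! video/x-raw,width=160,height=128 ! fdsink fd=1 > /tmp/quiet.raw
ls -l /tmp/with_messages.raw /tmp/quiet.raw        # any size difference is non-video data
head -c 64 /tmp/with_messages.raw | hexdump -C     # leaked status text shows up here as ASCII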

To sum things up: using raw video for an already low-res source has the advantages that more data is transmitted (still a fraction of what the HD video stream consumes) and latency is lower because there is no encoding step, and EZ-Wifibroadcast seems to do a better job at higher data rates anyway.

I can't tell how much I appreciate your help and support! -- Thanks Rodizio!!! --

@ivanskj : Also many thanks to you for guiding me to the h264 approach - at least things basically work now! BTW: I already use the hardware-accelerated h264 encoder (omxh264enc). I think I do not need the pay/depay part since I am not payloading to a network protocol here. I couldn't depay on the GroundPi (using hello_video) anyway ;-)

ivanskj commented 6 years ago

What I meant by using the pay/depay is that I believe you can force the GroundPi to identify the correct packets and discard the other info. Also, by using them I believe the receiving Pi knows where to start reading the image data. Without them, all the received data may look like unorganized bits and bytes.