rodizio1 / EZ-WifiBroadcast

Affordable Digital HD Video Transmission made easy!
GNU General Public License v2.0

Using OpenCV with EZ-WifiBroadcast #71

Open tictag opened 6 years ago

tictag commented 6 years ago

Hello Rodizio,

Just got my first video stream to work [RPi Zero W with ZeroCam to RPi3 using 2 x RT5370-based nano USB adapters] and it looks great! :) Really fantastic work (from you and Befinitiv)!!

My query does not need a comprehensive answer; I don't mind doing all the digging to get something working, but I don't want to waste my time investigating if my use case simply isn't an option. I plan to take two separate video sources connected to an RPi, do some rudimentary image processing, then merge them together using OpenCV ... and here's the query ... could I then use EZ-WifiBroadcast to transmit that composite video stream to a receiver?

I've read every page of your wiki and everything revolves around broadcasting a source video from Tx to Rx. Basically, can my source video actually be an output stream from OpenCV?

I don't mind putting the hours in to investigate/troubleshoot etc., but not if this use case is simply not doable.

I would appreciate your thoughts.

Oh, and if you don't get back to me before new year, Happy New Year!! :)

careyer commented 5 years ago

@pullenrc : No you do not need to recompile the kernel. You just need to compile the v4l2loopback kernel module. For that you need to download the correct kernel sources and headers. (Beware of the "+" kernel string problem, see above).
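
A quick way to see the mismatch (a sketch; the point is that the module's vermagic must match the running kernel exactly, including any trailing "+"):

    uname -r                               # running kernel, e.g. 4.9.35-v7
    modinfo v4l2loopback | grep vermagic   # kernel the module was built against
                                           # (or: modinfo ./v4l2loopback.ko in the build dir)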

Not sure though: what do you need the v4l2loopback device for? You say you already have a video stream in OpenCV... so what do you need the v4l2loopback devices for? AFAIK v4l2loopback is for creating "virtual video devices" - I guess you don't need that since you already have your stream in OpenCV? However, when it comes to sending a video stream via WBC you need to "frame" it somehow, so that the GroundPi can determine the start and end of a frame correctly, i.e. you have to encode the data stream in a way that breaks it up into detectable video frames. The simplest thing to do is to encode the video into a streaming format such as H.264, which takes care of all that. This is where the hardware H.264 encoder comes in handy. =D
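
A minimal sketch of that idea, assuming the OpenCV output is exposed as a V4L2 device (the device number, resolution, and bitrate below are placeholders, not values from this thread): hardware-encode the raw frames and pipe the resulting H.264 elementary stream into the wifibroadcast tx, just like the stock raspivid pipeline does.

    # hedged sketch: hardware-encode a raw V4L2 source and hand it to tx_rawsock
    gst-launch-1.0 v4l2src device=/dev/video1 \
        ! video/x-raw,width=640,height=480,framerate=30/1 \
        ! omxh264enc control-rate=1 target-bitrate=2000000 \
        ! h264parse config-interval=3 \
        ! fdsink fd=1 \
    | /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS \
        -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS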

JRicardoPC commented 5 years ago

Finally I can launch modprobe; I found out how in an issue opened by you (@careyer) on rpi-source's GitHub. For now I tried this on a Pi 3, but now I need to do the same on a Pi Zero, and I have an issue because the kernel is different (again -.-). I use the same SD card, but the Pi 3 uses kernel 4.9.35-v7 and the Pi Zero uses kernel 4.9.35.

Update: I downloaded the kernel from this GitHub (4.9.35), compiled and installed it, but v4l2loopback has the same problem and I still can't run make.

kwatkins commented 5 years ago

@careyer I got everything working - well, all except streaming with tx_rawsock using gstreamer with the FLIR One. It looks like all is well on the TX side: flirone is reading/sending data, etc., and I'm using the last bash .profile you linked above. I'm not seeing any HDMI out on the RX side though. Do you have a latest version you can share?

@JRicardoPC I just took out the code that appends the + in rpi-source, although I haven't done this on the RX side yet; I didn't think that side needed to be modified for those changes. I was seeing the same exec format error you had, due to the version mismatch on the .ko. Basically, wipe all that out, take out the code adding the + from rpi-source, and do the steps again. There is also an SSL certificate issue you might encounter; they cover it on the repo, but I just modified the python script to ignore SSL certs.

careyer commented 5 years ago

@kwatkins : I am happy my project inspired you. Unfortunately my latest working version is based off 1.6RC3. From 1.6RC4 onwards some change to tx_rawsock and the shared-memory handling has been introduced which I cannot retrace - it prohibits calling tx_rawsock with a port parameter other than 0 (e.g. "-p 10"); more precisely, it gives an error message on the TX side. I can only advise going back to RC3 as of now. Sorry & best of luck

kwatkins commented 5 years ago

@careyer good to know. I'll give this a go with 1.6RC3.

kwatkins commented 5 years ago

Sir @careyer, if you get a chance, can you post the working gstreamer pipeline (TX, and RX too if it was changed) that you got working with the FLIR One and EZ-WifiBroadcast? Better yet, the /root/.profile you ended up going with for EZ-Wifibroadcast 1.6RC3 would be greatly appreciated 👍

For everyone (including me) struggling to get the images set up with v4l2 etc., these are the steps that should get you there. This was used for the TX but should work for RX as well. (A quick verification sketch follows the steps below.)

Run all below as root from the v1.6RC3 image (EZ-Wifibroadcast-1.6RC3.img)

  1. resize, install some tools, update the pi

    resize the partition using https://github.com/bortek/EZ-WifiBroadcast/wiki/Tips-and-tricks
    dpkg-reconfigure ca-certificates
    apt-get update
    apt-get -y install vim screen
  2. get v4l2 working using https://github.com/bortek/EZ-WifiBroadcast/issues/71

    # fuse and gstreamer for video streams 
    mkdir /dev/fuse
    chmod 777 /dev/fuse
    apt-get -y install fuse
    apt-get -y install gstreamer1.0
    # kernel sources for v4l2loopback module
    apt-get -y install linux-headers-rpi
    wget https://raw.githubusercontent.com/notro/rpi-source/master/rpi-source -O /usr/bin/rpi-source && sudo chmod +x /usr/bin/rpi-source 
    #edit /usr/bin/rpi-source following https://github.com/notro/rpi-source/issues/37 and removing https://github.com/notro/rpi-source/blob/master/rpi-source#L350 
    # AND  add 'import ssl' to top AND change download(...) to, 
    #      ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    #      res = urllib2.urlopen(url,context=ctx).read()
    # AND add "--no-check-certificate" to wget calls (what a mess…)
    apt-get -y install bc
    /usr/bin/rpi-source -q --tag-update
    rpi-source
    # finally, install v4l2 loopback for /dev/video* 
    export GIT_SSL_NO_VERIFY=true
    mkdir ~/flir && cd ~/flir && git clone https://github.com/umlaeute/v4l2loopback
    cd v4l2loopback && make install
    depmod -a 
  3. install flir tools and set up the ez-wifi .profile gstreamer

    apt-get -y install libusb-1.0-0-dev
    cd ~/flir && git clone https://github.com/fnoop/flirone-v4l2.git
    cd flirone-v4l2 && make 
    # edit /etc/rc.local and  before the exit 0 add 
    # modprobe v4l2loopback devices=5 
    # sleep 5
    # cd /root/flir/flirone-v4l2 &&  ./flirone ./palettes/Iron2.raw &
    #grab a .profile to start with (thanks @careyer !) 
    mv /root/.profile /root/.profile-original
    wget http://www.ip-networx.de/content/GitHub/editedProfile.txt -O /root/.profile
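
A quick sanity check before wiring the devices into the .profile (a sketch; v4l-utils is an extra package, not part of the steps above):

    # load the loopback module and confirm the virtual /dev/video* nodes show up
    modprobe v4l2loopback devices=5
    ls /dev/video*
    apt-get -y install v4l-utils   # optional, only needed for the listing below
    v4l2-ctl --list-devices
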
careyer commented 5 years ago

@kwatkins, you are welcome... I made several other changes to .profile, but I am happy to share the TX and RX pipeline commands with you:

TX: nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128,framerate=10/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 10 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

RX: ionice -c 1 -n 3 /root/wifibroadcast/rx -p 10 -d 1 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH $NICS | ionice -c 1 -n 4 nice -n -10 tee >(ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo2 > /dev/null 2>&1) >(ionice -c 1 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo4 > /dev/null 2>&1) >(ionice -c 3 nice /root/wifibroadcast_misc/ftee /root/videofifo3 > /dev/null 2>&1) | ionice -c 1 -n 4 nice -n -10 /root/wifibroadcast_misc/ftee /root/videofifo1 > /dev/null 2>&1

You can decode and display it via the standard hello_video playback.
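
For bench testing on the GroundPi, a hedged alternative to hello_video (assumes the gstreamer1.0 OMX plugins are installed; the fifo name is taken from the RX pipeline above, and the sink element may need swapping depending on the display setup):

    # read the H.264 stream from the fifo, decode it in hardware, show it on screen
    cat /root/videofifo1 | gst-launch-1.0 fdsrc fd=0 ! h264parse ! omxh264dec ! autovideosink sync=false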

kwatkins commented 5 years ago

@careyer and for others giving this a go: I was able to get it all streaming, thermal-vision style. Following the setup steps above on the TX/AirPi (https://github.com/bortek/EZ-WifiBroadcast/issues/71#issuecomment-428784909) to get the Flir One streaming, the only changes were to the TX/AirPi "/root/.profile", replacing the line where raspivid was used with:

nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=160,height=128,framerate=10/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS

Also, since I wasn't attaching a camera, I set CAM=1 in the .profile for the TX/AirPi (ez-wifi checks whether a camera is attached to determine if it's RX or TX).

Port '-p 0' worked fine and no changes were needed to view the video stream on the RX/GroundPi. Again, this is using the v1.6RC3 image; from what @careyer found, this might not work on later images.

My next steps are getting this all into something I can just attach to the bird and have it work. I'm using a Li-ion Battery HAT on top of the TX Pi0 for power (https://www.waveshare.com/li-ion-battery-hat.htm), which has a 5V regulated power out. I'm using a mini-USB hub for testing; it includes an ethernet port, which I totally recommend when you ssh root@ezwifibrdcast-tx to mess with different streams. Power-wise, I plan to solder the USB from the wifi adapter directly to the Pi, splitting out the power input from the battery HAT. I'm also trying out some other, smaller USB hubs in the hope that they work.

Past that, it's MAVLink time, finding a way to fuse all this together. Going to try using the Mavic Pro with a MAVLink wrapper (https://github.com/diux-dev/rosettadrone). I also have an Emlid Navio2 (autopilot HAT for Raspberry Pi powered by ArduPilot); if I can get the AirPi working and streaming off that directly, that's some sort of win I haven't wrapped my head around :)

kwatkins commented 5 years ago

[image: flirpup]

Heyyooo - the Zero4U (https://www.adafruit.com/product/3298) seems to be solid, a little hub that pogo-jumps onto the USB test pads. Got the TX all contained, power from the HAT on top. Now to find something that will case it, attach it, and fly.

careyer commented 5 years ago

Alright! Congratulations! I am happy that you got it right and working :+1: I am also using the Zero4U - it is a decent hub!

JRicardoPC commented 5 years ago

Hello, finally I could fix the problem, and now I can stream from the PiCamera and the Lepton thermal camera. Now I need to go one step further and send the two streams at the same time; for this I'm trying to use videomixer. My first try on my computer worked well, but when I try it on the Pi Zero I have trouble connecting fdsink fd=1 with videomixer.

To start, I'm trying to send only a videotestsrc.

My TX pipeline: nice -n -9 gst-launch-1.0 videotestsrc ! video/x-raw,width=80,height=60,framerate=10/1 ! videobox border-alpha=0 top=0 bottom=0 ! videomixer name=mix sink_0::xpos=240 sink_0::ypos=180 sink_0::zorder=10 ! videoconvert ! xvimagesink videotestsrc ! video/x-raw,format=AYUV,framerate=5/1,width=320,height=240 ! fdsink fd=1 ! mix. | nice -n -9 /root/wifibroadcast/tx_rawsock -p 10 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS
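
For reference, one way to keep everything in a single pipeline is a sketch along these lines (untested on a Pi Zero, and not the fix eventually used later in this thread): fdsink is a sink, so it cannot feed into videomixer; it has to come last, after the mixed picture has been encoded.

    # hedged sketch: mix two test sources, encode the composite, then hand the
    # H.264 stream to fdsink / tx_rawsock as usual
    gst-launch-1.0 -e \
        videomixer name=mix sink_1::xpos=240 sink_1::ypos=180 sink_1::zorder=10 \
            ! videoconvert ! omxh264enc control-rate=1 target-bitrate=600000 \
            ! h264parse config-interval=3 ! fdsink fd=1 \
        videotestsrc ! video/x-raw,width=320,height=240,framerate=10/1 ! videoconvert ! mix.sink_0 \
        videotestsrc ! video/x-raw,width=80,height=60,framerate=10/1 \
            ! videobox border-alpha=0 ! videoconvert ! mix.sink_1 \
    | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS \
        -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS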

RespawnDespair commented 5 years ago

In the image from dino.de there are two streams sent side-by-side. On the ground you can switch which stream you want to use for display; this works because the second stream, from the FLIR camera, is very low bandwidth (320x240 at a few FPS). I see from your command line that you do the same. I am working on a one-size-fits-all solution that will allow multiple streams to be sent side-by-side, plus v4l2 support in the image. This should enable you to use the /dev/video0 input. Would that be OK for your application?

pullenrc commented 5 years ago

@Tigchelaar That would be awesome! Looking forward to it.

careyer commented 5 years ago

@RespawnDespair : The FLIRone code in the dino.de image is essentially my code. I developed it, and dino.de added the logic to switch between the different video streams afterwards. As you noticed correctly, both video streams are transferred side-by-side (in parallel). This is possible because the 2nd stream consumes only very little bandwidth: 160x120 @ 9fps. As you noticed, it also makes use of the v4l2loopback kernel module (which has to be compiled separately for every platform in order to start correctly, i.e. Pi0, Pi2, Pi3...). It would be very convenient to have v4l2loopback support for all SoCs built into EZ-WBC already. So yes, v4l2loopback is also a thing that I introduced.

Essentially the HD video (Pi Cam) is written to videofifo1 at the GroundPi and the FLIR stream is written to videofifo5. We decided to do so because we wanted to be able to quickly switch between the HD and thermal views. At the flick of a switch, hello_video is essentially restarted with a different videofifo (1 for HD, 5 for thermal).
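
A minimal sketch of that flick-of-a-switch logic (the fifo numbers come from the description above; the gstreamer player used here is an assumption for illustration - the dino.de image restarts hello_video instead):

    # run as a script on the GroundPi: restart the display pipeline on the other fifo
    play_fifo() {
        killall gst-launch-1.0 2>/dev/null
        cat "$1" | gst-launch-1.0 fdsrc fd=0 ! h264parse ! omxh264dec ! autovideosink sync=false &
    }
    play_fifo /root/videofifo1   # HD Pi Cam view
    # ...when the switch is toggled:
    play_fifo /root/videofifo5   # FLIR thermal view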

I am not sure what you mean with /dev/video0 input though?

Now that we have the dino.de "AirSwitches" (which can either drive GPIOs or trigger software actions at the AirPi), it might also be possible to send everything via the same tx_rawsock port "-p 0" and decide on the air side which stream to put into the radio link, i.e. no longer transfer both videos in parallel but switch on the air side which video gets fed into the standard video pipeline. However, that might result in bigger latency when switching between streams, and hello_video on the GroundPi might crash while switching?

zipray commented 5 years ago

@careyer Thank you very much for sharing the image from dino.de. I used a USB hub to connect an Insta360 Air to the air-side Pi Zero, and it couldn't start without the Raspberry Pi camera. With the Raspberry Pi camera installed it boots, but HDMI does not output the Insta360 Air video. Do I need any special settings?

careyer commented 5 years ago

@zipray : Sorry, the dino.de image contains support for the FLIRone only. For the Insta360 you need to modify the .profile and alter the tx_rawsock command as in https://github.com/bortek/EZ-WifiBroadcast/issues/71#issuecomment-406746281. You either have to connect a PiCamera, or search in the .profile for a variable called "FLIRforcecamera" (or something like that... I don't remember its name; it should be at the very beginning of .profile) and set it to "=Y". This will make the Pi believe that there is a Pi camera connected and start as AirPi.
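
A hedged sketch of that edit (the variable name is only as remembered above, so treat it as an assumption and check what grep actually reports before running the sed):

    grep -in "forcecamera" /root/.profile                        # locate the flag and its exact name
    sed -i 's/^FLIRforcecamera=.*/FLIRforcecamera=Y/' /root/.profile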

zipray commented 5 years ago

@careyer Thank you very much. I find that the video stream obtained from the Insta360 Air is a dual-camera image that has not been stitched [image: qq 20181114093908], even when using the FPV_VR app by @Consti10 [image: cache_66f6c8c8eef2d737], which leads to a very funny phenomenon when using ground VR head tracking in the FPV_VR app: https://youtu.be/MraI-Ff2G3A Is there any way to get panoramic video directly from the Insta360 Air?

careyer commented 5 years ago

@zipray : Congrats! You did it! :+1: ... Unfortunately I have no idea how to correct/convert that output to a different type of projection. Maybe @johnboiles or someone else can help?

This reference of different types of projections might be helpful: https://en.wikipedia.org/wiki/List_of_map_projections - it seems we are dealing with a "Nicolosi globular" (also called "double fisheye") projection here.

Update: Maybe these links can be helpful:

JRicardoPC commented 5 years ago

Finally I made a videomixer with the PiCamera and the Lepton module. I tried several approaches, and in the end I used v4l2sink to write the mixed video to a virtual device and then send that in a second pipeline. The result: [image: videomixer]

And my code in TX:

nice -n -9 gst-launch-1.0 v4l2src do-timestamp=true device=/dev/video1 ! video/x-raw,width=80,height=60,framerate=10/1 ! videobox border-alpha=0 top=0 bottom=0 ! videomixer name=mix sink_0::xpos=240 sink_0::ypos=180 sink_0::zorder=10 ! videoconvert ! v4l2sink device=/dev/video3 sync=false v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=320,height=240 ! videoconvert ! mix. &

sleep 5

nice -n -9 gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=320,height=240,framerate=25/1 ! omxh264enc control-rate=1 target-bitrate=600000 ! h264parse config-interval=3 ! fdsink fd=1 | nice -n -9 /root/wifibroadcast/tx_rawsock -p 0 -b $VIDEO_BLOCKS -r $VIDEO_FECS -f $VIDEO_BLOCKLENGTH -t $VIDEO_FRAMETYPE -d $VIDEO_WIFI_BITRATE -y 0 $NICS
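
One assumption worth spelling out (based on the rc.local snippet earlier in this thread): /dev/video3 in the second pipeline has to be one of the loopback nodes created beforehand, e.g. with modprobe v4l2loopback devices=5, while /dev/video0 and /dev/video1 are presumably the PiCamera and Lepton capture devices.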

careyer commented 5 years ago

Awesome! Thanks for sharing

careyer commented 5 years ago

@zipray : This might help you https://github.com/96fps/ThetaS-video-remap - we just need to find a better remap projection. Or: https://github.com/raboof/dualfisheye2equirectangular

jnoxro commented 5 years ago

@kwatkins : I am happy my project inspired you. Unfortunately my latest working version is based off 1.6RC3. From 1.6RC4 onwards some change to tx_rawsock and the shared-memory handling has been introduced which I cannot retrace - it prohibits calling tx_rawsock with a port parameter other than 0 (e.g. "-p 10"); more precisely, it gives an error message on the TX side. I can only advise going back to RC3 as of now. Sorry & best of luck

Hey, do you happen to have a download link for the RC3 image? I can't find it for the life of me.

alisadra commented 5 years ago

@careyer you say "... GRRRR!! I had to manually alter the release number string all over the filesystem (hidden files, config files.... god knows where I found this damn attached '+'). Now it works!"

and

" ...I did a grep command and searched the whole filesystem for any occurrence of the altered kernel.version string."

What is your bash command to find all occurrences of the "+" in the kernel version string? Which grep command do I need? Can someone help? Thanks.
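
For reference, a command along these lines should turn up the leftovers (a sketch, not careyer's exact command; substitute the release string reported by uname -r):

    # recursively search the filesystem for the "+"-suffixed kernel release,
    # skipping pseudo-filesystems, and list the files that still contain it
    grep -rIl --exclude-dir={proc,sys,dev,run} "4.9.35-v7+" / 2>/dev/null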

msergiu80 commented 5 years ago

@careyer Hi, I am looking at the convo above, and what a great way to achieve what you had in mind - congrats. I broke 2 PiCams in a week by pressing on the sensor, and to be honest I am not really a fan of these cams; they are not really suited for airborne use and the outside environment. I would like to use a Logitech C920 at 720p with wifibroadcast - I love the image quality and the autofocus feature - but I have no idea where to start. Well, I know I need to start in the .profile, but the changes discussed in the convo above lose me on some points.

I am using Lelik's 1.6 rc6 image, since it was the only one that allowed parameter loading in an external mission planner LAN-tethered from the GroundPi. Even so, it only works until plugging in the RC :) Anyway ...

Main questions are:

  1. How do I force the Pi into TX mode without a PiCam? I tried changing CAM=1 in two instances of the .profile but it doesn't work.
  2. On how many lines do I have to replace raspivid with gstreamer in the .profile?
  3. What needs to be done on the GroundPi in order to switch the video layer to an RX gstreamer pipeline?

Sorry if I am asking questions that were answered before, but it is a long discussion above and maybe I missed a few. Thanks in advance!

careyer commented 5 years ago

@msergiu80 : I am sorry that I cannot be of much help with this. I abandoned the EZ-Wifibroadcast project long ago and switched to the much more capable and full-featured Open.HD project. It features USB cam and secondary cam support, as well as picture-in-picture and a custom QGC app - all right out of the box. It has very active ongoing development as well.

msergiu80 commented 5 years ago

Yes, tried that too; the issue is the same: no loading of parameters in the external app, and no documentation about USB cameras :) Should I open a ticket there?

careyer commented 5 years ago

No need for that =). Just join the open Telegram support channel (link in the wiki). It works, but it basically needs a bit of configuration and understanding. We still need to update the wiki on that rather new feature ;-)