notzed opened this issue 5 years ago
Great idea, thanks for writing! I had hacked up something similar for gstreamer a while ago, but I've never had a good idea about how to integrate these sorts of bridge modules into the main repository, as we want to keep the number of external dependencies as low as possible. Can you build your module with an unmodified libavdevice, or do you need to patch it?
FFmpeg doesn't have pluggable backends as such. It is extensible, but new devices need to be part of the FFmpeg source tree.
The changes to libfreenect2 are trivial: https://www.zedzone.space/patches/freenect2-rawrgb.diff
Note: the OpenGLPacketPipeline() constructor argument order might need fixing.
And just for interest's sake, this is the current work in progress against ffmpeg 4.0:
https://www.zedzone.space/patches/ffmpeg-kinect2.diff
I still need to finish/change a few things, update it to the latest ffmpeg and prepare it for submission there. It should be straightforward; I just haven't had the time to do it yet. Being partly C++ may be an issue for ffmpeg, but they already have one partly-C++ indev.
Building ffmpeg with the patch (PKG_CONFIG_PATH has to include libfreenect2's pkgconfig directory):
mkdir build ; cd build
../configure --enable-freenect2
Example capturing all 3x streams to a file:
./ffmpeg -f kinect2 -i 0 -map 0:0:0 -c:v:0 copy -map 0:1:1 -c:v:1 jpegls -map 0:2:2 -c:v:2 jpegls capture.mov
'q' to finish.
stream.0 - direct copy of the jpeg data from the camera
stream.1 - IR as 16-bit integer lossless jpeg
stream.2 - depth as 16-bit integer lossless jpeg
"ffplay capture.mov" plays it back and lets you switch between the streams by pressing 'c'.
It'd also be nice to store metadata such as serial # and camera intrinsic parameters, but how to do that is a question for the ffmpeg devs. There is the possibility for other options such as registered frames as extra/alternative streams.
I've made a more complete patch; it retains the original order of the opengl pipeline constructor arguments and bumps the version number.
Again, just out of interest, I've included an updated and more complete ffmpeg patch against 4.1.x head. Once the freenect2 patch is in, I can work on submitting this.
If they don't attach properly (the web interface refuses to accept the files):
https://www.zedzone.space/patches/0001-Add-flag-to-enable-raw-passthrough-for-rgb-pipelines.patch
https://www.zedzone.space/patches/0001-avdevice-Added-Kinect-for-Windows-V2-device-via-libf.patch
Z
Additional Information:
Hi guys,
I've written a kinect2 FFmpeg driver for libavdevice using libfreenect2.
Currently the driver exports 3x video streams: RGB as the source jpeg (which can be trivially written as mjpeg), IR as raw 16-bit integer, and the depth data as raw 16-bit integer in 0.1mm units (ostensibly to match matlab's capture output). There are options to select which streams to capture, and ffmpeg lets you remux and encode however you want. Cameras are accessed by serial # or index. Capturing all 3 streams directly (i.e. no transcode, mjpeg+raw+raw) from 1 camera to a given container, on a 4-core-8-thread intel-i7-something dell laptop, takes about 25% of one core (using the opengl depth pipeline).
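To make the depth units concrete, here is a sketch based only on the 0.1mm scaling stated above: dividing a raw 16-bit sample by 10 gives millimetres, so the full 16-bit range tops out around 6.5m. The sample value used is arbitrary, not real sensor output.

```shell
# Depth samples are 16-bit integers in 0.1 mm units, so raw/10 = millimetres.
# 12345 is an arbitrary example value, not real sensor output.
raw=12345
mm=$(awk "BEGIN { printf \"%.1f\", $raw / 10 }")
echo "$raw raw units = $mm mm"
# prints: 12345 raw units = 1234.5 mm
```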
I will probably try to submit it upstream if it seems worthwhile to do so.
As background, I was initially going to write a bespoke tool to capture frames and write them to a video container via ffmpeg's libraries, but this approach was less work and more reusable.
Z