Closed martin-petzold closed 1 year ago
I think you have to set
insmod v4l2loopback exclusive_caps=1
Chromium doesn't like video devices that are also sinks.
We already do that (in /etc/modprobe.d/v4l2loopback.conf):
options v4l2loopback video_nr=9
options v4l2loopback exclusive_caps=1
options v4l2loopback card_label="daA4200"
And (/etc/modules-load.d/v4l2loopback.conf):
v4l2loopback
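As a side note, modprobe merges multiple `options` lines for the same module, so the three lines above should be equivalent to this single-line form (a sketch using the same path and values as above):

```
# /etc/modprobe.d/v4l2loopback.conf -- equivalent single-line form
options v4l2loopback video_nr=9 exclusive_caps=1 card_label="daA4200"
```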
v4l2-ctl lists the device:
$ v4l2-ctl --list-devices
():
/dev/v4l-subdev0
mxc-isi-m2m (platform:32e00000.isi:m2m_devic):
/dev/video0
FSL Capture Media Device (platform:mxc-md):
/dev/media0
daA4200 (platform:v4l2loopback-000):
/dev/video9
I can also list the device information:
$ sudo v4l2-ctl --device=/dev/video9 --all
Driver Info:
Driver name : v4l2 loopback
Card type : daA4200
Bus info : platform:v4l2loopback-000
Driver version : 5.10.52
Capabilities : 0x85208001
Video Capture
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x05208001
Video Capture
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Priority: 0
Video input : 0 (loopback: ok)
Format Video Capture:
Width/Height : 1920/1080
Pixel Format : 'H264' (H.264)
Field : None
Bytes per Line : 0
Size Image : 8294400
Colorspace : sRGB
Transfer Function : sRGB
YCbCr/HSV Encoding: ITU-R 601
Quantization : Limited Range
Flags :
Format Video Output:
Width/Height : 1920/1080
Pixel Format : 'H264' (H.264)
Field : None
Bytes per Line : 0
Size Image : 8294400
Colorspace : sRGB
Transfer Function : sRGB
YCbCr/HSV Encoding: ITU-R 601
Quantization : Limited Range
Flags :
Streaming Parameters Video Capture:
Frames per second: 30.000 (30/1)
Read buffers : 2
Streaming Parameters Video Output:
Frames per second: 30.000 (30/1)
Write buffers : 2
User Controls
keep_format 0x0098f900 (bool) : default=0 value=0
sustain_framerate 0x0098f901 (bool) : default=0 value=0
timeout 0x0098f902 (int) : min=0 max=100000 step=1 default=0 value=0
timeout_image_io 0x0098f903 (bool) : default=0 value=0
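Incidentally, the "Size Image" value above is consistent with a worst-case buffer of four bytes per pixel for the 1920x1080 frame (my interpretation of the loopback's reported size, not something documented):

```shell
# "Size Image" above is 8294400 bytes; that equals width * height * 4,
# i.e. a worst-case 4-bytes-per-pixel buffer for a 1920x1080 frame.
w=1920; h=1080
echo $((w * h * 4))   # prints 8294400
```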
But I cannot change the default device with sudo v4l2-ctl --device 9 (no effect); as far as I can tell, --device only selects the target for a single v4l2-ctl invocation rather than setting a system-wide default. Still, Chromium should find the device.
$ cat /sys/module/v4l2loopback/parameters/exclusive_caps
Y,N,N,N,N,N,N,N
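If I read the v4l2loopback documentation correctly, exclusive_caps is a per-device list (one entry per loopback device, in creation order, not per video number), so with a single device only the first entry applies. A quick shell sketch of how I read that value:

```shell
# exclusive_caps is a comma-separated list, one entry per loopback device.
# With a single loopback device, only the first entry applies.
caps="Y,N,N,N,N,N,N,N"   # value read from /sys above
first=${caps%%,*}         # strip everything after the first comma
echo "exclusive_caps for the first loopback device: $first"   # prints: ... Y
```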
$ sudo udevadm info --name=/dev/video9
P: /devices/virtual/video4linux/video9
N: video9
L: 0
E: DEVPATH=/devices/virtual/video4linux/video9
E: DEVNAME=/dev/video9
E: MAJOR=81
E: MINOR=1
E: SUBSYSTEM=video4linux
E: USEC_INITIALIZED=1622764477
E: ID_V4L_VERSION=2
E: ID_V4L_PRODUCT=daA4200
E: ID_V4L_CAPABILITIES=:video_output:
E: TAGS=:uaccess:seat:
E: CURRENT_TAGS=:uaccess:seat:
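Note that ID_V4L_CAPABILITIES only lists :video_output: and no :capture: entry. As far as I understand v4l2loopback with exclusive_caps=1, the device only announces the capture capability once a producer is attached, which would explain why Chromium ignores the idle device. A small sketch of the check a device enumerator effectively performs (hypothetical, for illustration only):

```shell
# The udev capability string reported above, while no producer was attached:
caps=":video_output:"
case "$caps" in
  *:capture:*) echo "device looks like a camera" ;;
  *)           echo "no capture capability - device will likely be skipped" ;;
esac
```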
For some reason I also cannot stream a raw format to v4l2sink:
$ gst-launch-1.0 pylonsrc cam::OffsetX=1144 cam::OffsetY=1016 ! "video/x-raw,format=YUY2,width=1920,height=1080,framerate=30/1" ! v4l2sink device=/dev/video9
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstPylonSrc:pylonsrc0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstPylonSrc:pylonsrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.443985152
Setting pipeline to NULL ...
Freeing pipeline ...
This pipeline works (after adding 'tee'), and now the device IS found by Chromium:
gst-launch-1.0 pylonsrc cam::OffsetX=1144 cam::OffsetY=1016 ! "video/x-raw,format=YUY2,width=1920,height=1080,framerate=30/1" ! tee ! queue ! v4l2sink device=/dev/video9
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:03.2 / 99:99:99.
I am currently not sure whether Chromium accepts H.264 video as camera input. We will proceed with a raw stream; Chromium will then use its internal software encoder, or a hardware encoder (if supported), for streaming.
It seems I am very close to achieving my goal of streaming from the Pylon GStreamer plugin to Chromium. I would like to share my approach, and maybe get some help with the last step.
My platform is a custom board with an NXP i.MX8MP. The camera, a daA4200-30mci-AF90-AR1335, is connected via CSI-2. The OS is a custom Debian 11, and the kernel is 5.10 with some patches from NXP. We had to customize the device tree to get the camera supported.
I can stream the camera to Wayland directly, and this works well:
gst-launch-1.0 pylonsrc cam::OffsetX=1144 cam::OffsetY=1016 ! "video/x-raw,format=YUY2,width=1920,height=1080,framerate=30/1" ! queue ! waylandsink
Now my goal is to make the camera stream available as an HTML / JavaScript media device within Chromium, and also to use it for WebRTC media sessions (video calls / conferences).
As far as I understand, Chromium currently has no direct support for hardware decoders / encoders (especially H.264). However, there seem to be some patches from NXP that are close to final and bring this support (though for a later kernel, not 5.10). With hardware encoder support in Chromium, it should be possible to stream raw video to the v4l2loopback device and let Chromium handle everything else (if I am right?). Even without hardware support, Chromium should fall back to its internal software encoder / decoder (which will cause some CPU load). I have postponed this approach for later.
However, in my case I have a GStreamer plugin provided by NXP (vpuenc_h264) which can encode the raw stream to H.264. My idea now is to start a GStreamer pipeline which takes the raw video from pylonsrc, encodes it to H.264 (hardware encoder), and forwards it to the v4l2loopback device. It looks like this:
gst-launch-1.0 pylonsrc cam::OffsetX=1144 cam::OffsetY=1016 ! "video/x-raw,format=YUY2,width=1920,height=1080,framerate=30/1" ! queue ! vpuenc_h264 ! v4l2sink device=/dev/video9
The pipeline is running. Unfortunately, on my first try Chromium did not find the media device. I should mention that I am using a special Chromium build in which I can also change some configuration regarding media streams (i.e. audio and video capture).
Maybe this can help someone, and maybe someone has an idea whether this approach should actually work and what could be wrong?