crossan007 opened this issue 3 years ago
This also seems to be a problem with f62fb9076b6313e5eb82fdcaceadb6b3052f346e from v4l2loopback (the primary branch was renamed from master to main, so my local mirror was out of date).
The output of v4l2-ctl seems to indicate that my device (/dev/video42) does support MJPEG format:
v4l2-ctl --all -d /dev/video42
Driver Info:
Driver name : v4l2 loopback
Card type : Dummy video device (0x002A)
Bus info : platform:v4l2loopback-042
Driver version : 5.4.83
Capabilities : 0x85208002
Video Output
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Device Capabilities
Device Caps : 0x05208002
Video Output
Video Memory-to-Memory
Read/Write
Streaming
Extended Pix Format
Priority: 2
Video output: 0 (loopback in)
Format Video Output:
Width/Height : 640/480
Pixel Format : 'JPEG' (JFIF JPEG)
Field : None
Bytes per Line : 0
Size Image : 1228800
Colorspace : sRGB
Transfer Function : sRGB
YCbCr/HSV Encoding: ITU-R 601
Quantization : Limited Range
Flags :
Streaming Parameters Video Capture:
Frames per second: 30.000 (30/1)
Read buffers : 8
Streaming Parameters Video Output:
Frames per second: 30.000 (30/1)
Write buffers : 8
User Controls
keep_format 0x0098f900 (bool) : default=0 value=0
sustain_framerate 0x0098f901 (bool) : default=0 value=0
timeout 0x0098f902 (int) : min=0 max=100000 step=1 default=0 value=0
timeout_image_io 0x0098f903 (bool) : default=0 value=0
What distro are you running?
But yes, it looks like the format of the v4l2loopback device (I guess it's just passing on the info from GStreamer) is not matching up to what a usual MJPEG device (e.g. a webcam) would have.
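For comparison (device paths here are just examples), a typical UVC webcam usually advertises 'MJPG' (Motion-JPEG) on its capture side, whereas the loopback device above reports 'JPEG' (JFIF JPEG):
# formats of a real webcam
v4l2-ctl -d /dev/video0 --list-formats
# formats of the v4l2loopback device
v4l2-ctl -d /dev/video42 --list-formats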
Does the gstreamer v4l2sink have any way to configure this?
Otherwise maybe we need to add another V4L2 format to say we support JPEG as well as MJPEG, as I have seen this before.
And FYI we have seen buffer issues (frame tearing) from a bug in v4l2loopback before.
> What distro are you running?
Linux octopi 5.4.83-v7l+ #1379 SMP Mon Dec 14 13:11:54 GMT 2020 armv7l GNU/Linux
> Does the gstreamer v4l2sink have any way to configure this?

GStreamer has a concept of caps, which define the format/encapsulation of the media at various points in the stream. The ! "links" the various sources/pads/plugins/sinks to the next, and some plugins have implicit caps. In this case avenc_mjpeg is linked to v4l2sink, which will output image/jpeg.
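For what it's worth, the caps can also be stated explicitly in the pipeline. A sketch (avenc_mjpeg already emits image/jpeg, so the extra capsfilter mostly just makes the negotiation visible):
gst-launch-1.0 videotestsrc ! "video/x-raw, width=640, height=480" ! avenc_mjpeg ! "image/jpeg" ! v4l2sink device=/dev/video42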
> Otherwise maybe we need to add another V4L2 format to say we support JPEG as well as MJPEG, as I have seen this before.

I feel like this is on the right path, but I don't know enough about the specifics of inter-process format-type communication.
The bug may lie with v4l2loopback, but since GStreamer is happily emitting JPEG to the virtual camera, I think the issue lies with mjpg-streamer, since the problem only manifests while attempting to open the stream.
As a side note, GStreamer supports a concept of appsink and shmsink, where it can use a FIFO pipe or shared memory as the sink of a pipeline so that other applications can read whatever the pipeline produces - it would be neat to add an input_gstreamer plugin to mjpg-streamer.
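A rough sketch of that idea (the elements are real GStreamer elements, but the socket path is just an example and the hypothetical input_gstreamer reader does not exist yet):
# publish the encoded JPEG frames over shared memory; a reader would attach to the same socket path
gst-launch-1.0 videotestsrc ! avenc_mjpeg ! shmsink socket-path=/tmp/mjpeg-shm wait-for-connection=false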
This seems relevant: https://www.mail-archive.com/linux-media@vger.kernel.org/msg135384.html
Hello Hans,
On Monday, 1 October 2018 at 10:43 +0200, Hans Verkuil wrote:
> It turns out that we have both JPEG and Motion-JPEG pixel formats defined.
>
> Furthermore, some drivers support one, some the other and some both.
>
> These pixelformats both mean the same.
>
> I propose that we settle on JPEG (since it seems to be used most often) and
> add JPEG support to those drivers that currently only use MJPEG.
Thanks for looking into this. As per the GStreamer code, I see 3 aliases for JPEG: V4L2_PIX_FMT_MJPEG/JPEG/PJPG. I don't know the context, this code was written before I knew GStreamer existed. It's possible there is a subtle difference, I have never looked at it, but clearly all our JPEG decoders handle these as being the same.
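For reference, all three aliases are distinct fourccs in the V4L2 userspace header, which can be listed with something like (header path may vary by distro):
# show the JPEG-related pixel format defines
grep -E "V4L2_PIX_FMT_(MJPEG|JPEG|PJPG)" /usr/include/linux/videodev2.h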
It "seems like" the GStreamer pipeline is outputting V4L2_PIX_FMT_JPEG: /* JFIF JPEG */
based on the output of v4l2-ctl --all -d /dev/video42
posted above. And, the current source of that file matches what was discussed in the mailing list:
https://github.com/GStreamer/gst-plugins-good/blob/master/sys/v4l2/gstv4l2object.c#L969
I'm not sure whether this should work in mjpegstreamer, though it seems it should work? (https://github.com/jacksonliam/mjpg-streamer/blob/85f89a8c321e799fabb1693c5d133f3fb48ee748/mjpg-streamer-experimental/plugins/input_uvc/input_uvc.c#L415)
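A quick way to double-check this while the GStreamer pipeline is running (assuming v4l2-ctl is installed):
# fourcc currently set on the loopback device
v4l2-ctl -d /dev/video42 --get-fmt-video
# formats the capture side advertises to readers such as mjpg-streamer
v4l2-ctl -d /dev/video42 --list-formats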
(Possibly) Related issues: https://github.com/jacksonliam/mjpg-streamer/issues/229 https://github.com/jacksonliam/mjpg-streamer/issues/133 https://stackoverflow.com/questions/22575532/how-to-take-picture-with-mjpg-streamer https://www.eevblog.com/forum/thermal-imaging/question-about-flir-one-for-android/125/
So, it looks like the call to xioctl on line 334 of v4l2uvc.c is a wrapper for IOCTL_VIDEO which, according to https://www.kernel.org/doc/html/v4.9/media/uapi/v4l/func-ioctl.html, returns either 0 or -1 and sets errno.
I've modified the code a little to print errno on failure, and I seem to be getting error 22:
MJPG Streamer Version: git rev: 310b29f4a94c46652b20c4b7b6e5cf24e532af39
i: Using V4L2 device.: /dev/video42
i: Desired Resolution: 640 x 480
i: Frames Per Second.: -1
i: Format............: JPEG
i: TV-Norm...........: DEFAULT
Error: 22. Unable to set format: 1196444237 res: 640x480
Init v4L2 failed !! exit fatal
i: init_VideoIn failed
I can't seem to find a description of the enum linking the 22 to something useful. The closest I could find is this: https://www.kernel.org/doc/html/v4.9/media/uapi/gen-errors.html#id1 which seems to be lacking the int counterpart of the enum values :/
UPDATE: it looks like errno is platform-specific. I found what I think is the right enum at /usr/include/asm-generic/errno-base.h on my RasPi (on my PC it was C:\Program Files (x86)\Windows Kits\10\Include\10.0.18362.0\ucrt\errno.h, but that one was missing 22), and errno 22 seems to correlate with EINVAL.
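Both lookups can be done straight from the shell; as a bonus, the format value in the error above, 1196444237, is 0x47504a4d, which read as a little-endian fourcc is 'MJPG', i.e. V4L2_PIX_FMT_MJPEG:
# errno 22 is EINVAL ("Invalid argument") on Linux
grep -w 22 /usr/include/asm-generic/errno-base.h
# the rejected pixel format from the log line above, shown as hex (0x47504a4d = fourcc 'MJPG')
printf '0x%x\n' 1196444237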
I GOT IT TO WORK!
Had to do some code-hacking (see https://github.com/crossan007/mjpg-streamer/commit/a1380b1cf010c78389994502e94d8441026c7b97).
By hard-coding to JPEG, I can run the GStreamer pipeline:
gst-launch-1.0 videotestsrc ! "video/x-raw, width=640, height=480, fps=30/1" ! avenc_mjpeg ! v4l2sink device=/dev/video42
And consume with the mjpg_streamer command:
./mjpg_streamer -o "output_http.so -w ./www-octopi -n" -i "input_uvc.so -d /dev/video42 -n"
-n was necessary because mjpg_streamer tried to init all of the controls on the virtual webcam:
UVCIOC_CTRL_ADD - Error at Pan (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Tilt (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Pan Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Pan/tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_ADD - Error at Focus (absolute): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Pan (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Tilt (relative): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Pan Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Pan/tilt Reset: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Focus (absolute): Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at LED1 Mode: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at LED1 Frequency: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Disable video processing: Inappropriate ioctl for device (25)
UVCIOC_CTRL_MAP - Error at Raw bits per pixel: Inappropriate ioctl for device (25)
By using avenc_mjpeg in GStreamer to encode the JPEG (rather than letting mjpg_streamer handle the raw YUYV stream), my Raspberry Pi processor usage is significantly down.
The GStreamer pipeline here uses ~43% CPU (load average hovers around 0.50), but with mjpg_streamer handling the raw YUYV stream, I was seeing the mjpg_streamer process hover around 110% CPU.
To add some fun context: I'm using this to "combine" two different webcams from my 3D printer into one stream that OctoPrint can use.
Thanks for the investigation and write up, interesting fix. Only those two lines needed changing?
Did you try printing formatIn before you change it to see if it is JPEG or MJPEG? I don't think we set it to anything different than the device gives us.
Sorry for a probably stupid question: how can I make it work too? I'm trying to utilize an old iPhone + DroidCam and I have exactly the same issue. I can use VLC and view the stream from my /dev/video0 on OctoPi, but can't make it work on OctoPrint 0.18.0.
Hope you can help me. Thanks in advance
Luca
@tombo9999 try building mjpg-streamer from my commit: https://github.com/crossan007/mjpg-streamer/commit/a1380b1cf010c78389994502e94d8441026c7b97
That basically "hard codes" the parser to use V4L2_PIX_FMT_JPEG instead of the format that the v4l2loopback device is "claiming to have".
It's not a proper fix, as it may break other mjpg-streamer uses, but it should work for the v4l2loopback use case.
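Roughly (an untested sketch; it assumes the usual mjpg-streamer build dependencies such as cmake and the libjpeg headers are already installed):
git clone https://github.com/crossan007/mjpg-streamer.git
cd mjpg-streamer
git checkout a1380b1cf010c78389994502e94d8441026c7b97
cd mjpg-streamer-experimental
make && sudo make install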
You are a beast. After 3 days of looking and trying I finally found a way to stream full color with an OV13850 MIPI camera on an Orange Pi 4 LTS with mjpg-streamer for OctoPrint!
sudo modprobe v4l2loopback video_nr="10" max_buffers="8" exclusive_caps="1"
gst-launch-1.0 rkisp device=/dev/video6 io-mode=1 ! video/x-raw,format=NV12,width=640,height=480,framerate=30/1 ! videoconvert ! avenc_mjpeg ! v4l2sink device=/dev/video10
Problem: gst-launch-1.0 rkisp consumes 94% cpu :(
Any idea on how to lower the consumption?
@christianrodher I'm afraid I'm not sure how to reduce CPU consumption. I was experiencing this issue as well.
I kind of gave up on this project, since the CPU usage for my intended purpose (streaming a "GStreamer composited view" of two USB cameras from my 3D printers) was too high and caused things to become unstable.
If I recall correctly, the expensive piece here was the MJPEG encoding - maybe GStreamer wasn't able to utilize hardware acceleration, or the device I was using didn't support it, or maybe something completely different.
Video stuff is still kind of mysterious to me, so it's a lot of guess and check.
Thank you... I found one that has low consumption... but its http server doesn't have ?action=stream and ?action=snapshot like mjpg-streamer, so I can't use it correctly with OctoPrint. But with this one my CPU consumption is just 40%, maybe you can make it work for your needs: https://github.com/jungin500/gst-httpd-1.0. I don't know much about coding, so I don't know how to add ?action=stream and ?action=snapshot to it :(
Given the following: a v4l2loopback device created from https://github.com/umlaeute/v4l2loopback (commit aba3067f81b343f4e80c588de895c1aeb0da4b76), and a GStreamer pipeline which should be emitting image/jpeg (Motion-JPEG) video, I should be able to start an mjpg_streamer instance against it.
However, mjpg_streamer fails to open the v4l2 source. This seems to be specific to the MJPEG encoding, as a similar setup with the YUYV format seems to work (however, that setup causes mjpg_streamer to do the JPEG encoding, and I would like GStreamer to do the encoding).
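For concreteness, a rough sketch of this setup based on the commands quoted earlier in the thread (the modprobe options are an assumption, not the exact line from the original report):
# create the loopback device (assumed options)
sudo modprobe v4l2loopback video_nr=42 exclusive_caps=1
# feed MJPEG into it from GStreamer
gst-launch-1.0 videotestsrc ! "video/x-raw, width=640, height=480" ! avenc_mjpeg ! v4l2sink device=/dev/video42
# attempt to serve it with mjpg-streamer; this is the step that fails with "Unable to set format"
./mjpg_streamer -i "input_uvc.so -d /dev/video42" -o "output_http.so -w ./www-octopi"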