jacksonliam / mjpg-streamer

Fork of http://sourceforge.net/projects/mjpg-streamer/

libcamera or h.264 tcp Stream Support #336

Open Serverfrog opened 2 years ago

Serverfrog commented 2 years ago

As I saw in #324, support for the raspicam under Raspbian Bullseye is not possible. The official Raspberry Pi HQ Camera (IMX477) also doesn't expose MJPG or any other format under the v4l2 driver that ffmpeg or mjpg-streamer could use. It is, however, possible to start libcamera-vid so that it serves an H.264 stream over TCP.

So I think an input_stream plugin could be quite useful, if libcamera isn't used directly.

jacksonliam commented 2 years ago

It would be difficult to support this, because the whole codebase is written around handling individual JPEG frames. The plugin would have to decode the H.264 stream and then re-encode the frames as JPEGs.
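
For illustration of that decode/re-encode step only (nothing mjpg-streamer provides today): an external tool like ffmpeg can already consume the TCP H.264 stream that libcamera-vid serves and transcode it to MJPEG, which gives a feel for the CPU cost involved. The address, quality value, and output name below are placeholders.

ffmpeg -i tcp://127.0.0.1:1234 -c:v mjpeg -q:v 5 -f mjpeg restream.mjpg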

Are you sure the RPi cam doesn't support JPEG (or at least YUV frames) via the v4l2 driver? I would be surprised if the camera model made any difference, because IIRC the encoding is handled by the GPU in the Pi, not by the camera module itself. And I thought it was reported working for the other cameras (though I haven't tried Bullseye myself).

If you have any links to info about v4l2 not providing mjpeg/yuv for the (HQ) RPi cam I'd be interested.

While a libcamera plugin might be nice, I think I'd rather spend my time working on something to stream H.264 from Pi cameras, especially since nearly all browsers support H.264 now.
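
(For reference, the formats a given V4L2 node advertises can be listed with v4l2-ctl from the v4l-utils package, which is how the listing later in this thread was produced; the device path is just an example.)

v4l2-ctl --list-formats-ext -d /dev/video0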

Serverfrog commented 2 years ago

The reason I was asking about libcamera is that I don't see an easier way to get images from the camera into mjpg-streamer than going directly through its API. The alternative would be to start `libcamera-vid -t 0 --inline --listen -o tcp://0.0.0.0:1234` and restream that. Also, as I'm very new to the topic of camera interfaces and codecs, I can't say for sure that it isn't supported, but I'm happy to provide any information that helps. Below I append everything v4l2 reports with unicam loaded (`modprobe bcm2835-unicam`).

From https://www.raspberrypi.com/documentation/accessories/camera.html#v4l2-drivers

/dev/videoX | Default Action
-- | --
video10 | Video decode.
video11 | Video encode.
video12 | Simple ISP. Can perform conversion and resizing between RGB/YUV formats, and also Bayer to RGB/YUV conversion.
video13 | Input to fully programmable ISP.
video14 | High resolution output from fully programmable ISP.
video15 | Low result output from fully programmable ISP.
video16 | Image statistics from fully programmable ISP.
video19 | HEVC Decode

All Devices reported by v4l2

bcm2835-codec-decode (platform:bcm2835-codec):
        /dev/video10
        /dev/video11
        /dev/video12
        /dev/video18
        /dev/media2

bcm2835-isp (platform:bcm2835-isp):
        /dev/video13
        /dev/video14
        /dev/video15
        /dev/video16
        /dev/media1

unicam (platform:fe801000.csi):
        /dev/video0
        /dev/video1
        /dev/media0

Here are all the formats that v4l2 reports for each /dev/video* node:

root@octopi:/home/pi# ls -d /dev/* | grep video | xargs -L1 -d '\n' sh -c 'echo $0 && v4l2-ctl --list-formats-ext -d $0'
/dev/video0
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

        [0]: 'pRCC' (12-bit Bayer RGRG/GBGB Packed)
                Size: Discrete 4056x3040
                Size: Discrete 2028x1520
                Size: Discrete 2028x1080
        [1]: 'RG12' (12-bit Bayer RGRG/GBGB)
                Size: Discrete 4056x3040
                Size: Discrete 2028x1520
                Size: Discrete 2028x1080
        [2]: 'pRAA' (10-bit Bayer RGRG/GBGB Packed)
                Size: Discrete 1332x990
        [3]: 'RG10' (10-bit Bayer RGRG/GBGB)
                Size: Discrete 1332x990
/dev/video1
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

/dev/video10
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture Multiplanar

        [0]: 'YU12' (Planar YUV 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [1]: 'YV12' (Planar YVU 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [2]: 'NV12' (Y/CbCr 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [3]: 'NV21' (Y/CrCb 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [4]: 'RGBP' (16-bit RGB 5-6-5)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
/dev/video11
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture Multiplanar

        [0]: 'H264' (H.264, compressed)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
/dev/video12
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture Multiplanar

        [0]: 'YUYV' (YUYV 4:2:2)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [1]: 'YVYU' (YVYU 4:2:2)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [2]: 'VYUY' (VYUY 4:2:2)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [3]: 'UYVY' (UYVY 4:2:2)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [4]: 'YU12' (Planar YUV 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [5]: 'YV12' (Planar YVU 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [6]: 'RGB3' (24-bit RGB 8-8-8)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [7]: 'BGR3' (24-bit BGR 8-8-8)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [8]: 'BGR4' (32-bit BGRA/X 8-8-8-8)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [9]: 'RGBP' (16-bit RGB 5-6-5)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [10]: 'NV12' (Y/CbCr 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [11]: 'NV21' (Y/CrCb 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
/dev/video13
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

/dev/video14
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

        [0]: 'YUYV' (YUYV 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [1]: 'YVYU' (YVYU 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [2]: 'VYUY' (VYUY 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [3]: 'UYVY' (UYVY 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [4]: 'YU12' (Planar YUV 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [5]: 'YV12' (Planar YVU 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [6]: 'RGB3' (24-bit RGB 8-8-8)
                Size: Stepwise 64x64 - 16384x16384 with step 1/1
        [7]: 'BGR3' (24-bit BGR 8-8-8)
                Size: Stepwise 64x64 - 16384x16384 with step 1/1
        [8]: 'XB24' (32-bit RGBX 8-8-8-8)
                Size: Stepwise 64x64 - 16384x16384 with step 1/1
        [9]: 'XR24' (32-bit BGRX 8-8-8-8)
                Size: Stepwise 64x64 - 16384x16384 with step 1/1
        [10]: 'RGBP' (16-bit RGB 5-6-5)
                Size: Stepwise 64x64 - 16384x16384 with step 1/1
        [11]: 'NV12' (Y/CbCr 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [12]: 'NV21' (Y/CrCb 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
/dev/video15
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

        [0]: 'YUYV' (YUYV 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [1]: 'YVYU' (YVYU 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [2]: 'VYUY' (VYUY 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [3]: 'UYVY' (UYVY 4:2:2)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [4]: 'YU12' (Planar YUV 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [5]: 'YV12' (Planar YVU 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [6]: 'NV12' (Y/CbCr 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
        [7]: 'NV21' (Y/CrCb 4:2:0)
                Size: Stepwise 64x64 - 16384x16384 with step 2/2
/dev/video16
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture

/dev/video18
ioctl: VIDIOC_ENUM_FMT
        Type: Video Capture Multiplanar

        [0]: 'YU12' (Planar YUV 4:2:0)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
        [1]: 'RGBP' (16-bit RGB 5-6-5)
                Size: Stepwise 32x32 - 1920x1920 with step 2/2
jacksonliam commented 2 years ago

Hmmm yes that doesn't appear to do what we want.

It looks like there's potentially a compatibility library for libcamera; could you try something like:

sudo apt-get install libcamera-dev
export LD_PRELOAD=/usr/lib/arm-linux-gnueabihf/v4l2-compat.so

Serverfrog commented 2 years ago

I needed to use `export LD_PRELOAD=/usr/lib/aarch64-linux-gnu/v4l2-compat.so`, since this is the 64-bit OS rather than the default 32-bit ARM one. The result was:

root@octopi:/opt/mjpg-streamer/mjpg-streamer-experimental# ./mjpg_streamer -o "output_http.so -w ./www" -i "input_uvc.so"
MJPG Streamer Version: git rev: 310b29f4a94c46652b20c4b7b6e5cf24e532af39
 i: Using V4L2 device.: /dev/video0
 i: Desired Resolution: 640 x 480
 i: Frames Per Second.: -1
 i: Format............: JPEG
 i: TV-Norm...........: DEFAULT
[0:53:16.712664159] [2884]  INFO Camera camera_manager.cpp:293 libcamera v0.0.0+3156-f4070274
[0:53:16.734518797] [2885]  WARN CameraSensorProperties camera_sensor_properties.cpp:141 No static properties available for 'imx477'
[0:53:16.734598518] [2885]  WARN CameraSensorProperties camera_sensor_properties.cpp:143 Please consider updating the camera sensor properties database
Error opening device /dev/video0: unable to query device.
Init v4L2 failed !! exit fatal
 i: init_VideoIn failed
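
(One way to narrow down whether the failure is in mjpg-streamer or in the compat shim itself would be to run another plain V4L2 client under the same preload, e.g. v4l2-ctl; whether the shim supports that tool is itself unverified, but a matching failure would point at the compat layer rather than mjpg-streamer.)

LD_PRELOAD=/usr/lib/aarch64-linux-gnu/v4l2-compat.so v4l2-ctl --list-formats-ext -d /dev/video0
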
jacksonliam commented 2 years ago

Hmmm, thanks for trying. Did you try /dev/video1 too? I'm not sure which CSI lane(s) map to which /dev/video node; it might be that the Pi is using the other one!


Serverfrog commented 2 years ago

/dev/video1 is the same. None of the interfaces work: either the device can't be queried, or the node itself isn't meant for capture.

bugsyb commented 2 years ago

I'd be keen to get that working too. The first avenue I picked was trying to recompile with input_raspicam, but on reflection that would mean relying on the legacy Raspberry Pi interface, which is going to be dropped some time soon. I'm also on Raspberry Pi OS Bullseye, 64-bit.

All I'm after is streaming video into gstreamer (locally) and then splitting it to two sinks, one at a low frame rate and the other at 25 fps, processed in separate threads.

Is mjpg-streamer a good avenue for this, or would something else be recommended? Maybe picking up H.264 from libcamera directly and processing it with gstreamer would be better? But then I'd want to stream it again with mjpg-streamer :) using a gstreamer sink as the input. So perhaps another type of input plugin is needed, and perhaps for @Serverfrog piping through gstreamer is a workaround (I'm not saying it is possible at all, or a good idea; this is more of a question).
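
(A rough, untested sketch of that split-rate idea, assuming libcamera's GStreamer element, libcamerasrc, is installed; the caps, rates, and fakesink placeholders are purely illustrative and not something proposed in this thread.)

gst-launch-1.0 libcamerasrc ! video/x-raw,width=1280,height=720 ! tee name=t \
  t. ! queue ! videoconvert ! fakesink \
  t. ! queue ! videorate drop-only=true ! video/x-raw,framerate=5/1 ! videoconvert ! fakesink

In a real setup the two fakesink placeholders would be replaced by the full-rate and low-rate processing branches.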

bugsyb commented 2 years ago

@jacksonliam - found this: https://datasheets.raspberrypi.com/camera/raspberry-pi-camera-guide.pdf

> 3 Driver Framework
>
> Our Unicam kernel driver follows the standard Video for Linux 2 (V4L2) API (see https://linuxtv.org/downloads/v4l-dvb-apis/uapi/v4l/v4l2.html). It is responsible for:
>
> 1. Interfacing with the V4L2 sensor subdevice driver to perform standard operations (start/stop streaming, set/get modes, etc.)
> 2. Providing two output nodes: Bayer image data on the /dev/video0 node and sensor embedded data on the /dev/video1 node.
>
> After getting a bayer frame buffer from the Unicam driver, we pass it to our ISP driver. Again, this driver follows the standard V4L2 API. This driver is responsible for:
>
> 1. Exposing a set of V4L2 “controls” used to configure the pipeline blocks within the ISP hardware.
> 2. Taking the bayer input framebuffer, together with an appropriate set of output buffers and configuration, and generating an output in RGB or YUV format.
> 3. Generating some statistics used by our Control Algorithms (more on this later).
> 4. Providing one input node on the /dev/video13 node for the bayer input framebuffer.
> 5. Providing three output nodes: two image outputs on the /dev/video14 and /dev/video15 nodes, and statistics output on the /dev/video16 node.
jacksonliam commented 2 years ago

Yes I've seen that.

See if you can figure out a way to get one of those nodes to output MJPG or even YUV.

Serverfrog commented 2 years ago

One workaround that works now is to fall back to the legacy camera system they re-added two days ago. I tested with that and everything went fine, but as Raspbian keeps telling us: it's a legacy system and it will be removed.

jacksonliam commented 2 years ago

> One workaround that works now is to fall back to the legacy camera system they re-added two days ago. I tested with that and everything went fine, but as Raspbian keeps telling us: it's a legacy system and it will be removed.

Sounds promising, do you have a link to that?

Serverfrog commented 2 years ago

I found out via the release notes (https://downloads.raspberrypi.org/raspios_armhf/release_notes.txt). I just ran raspi-config, enabled the Legacy Camera option, and rebooted.

Gusser93 commented 2 years ago

For me the legacy system is not working; maybe the Sony IMX519 in my module is not supported. According to the Raspberry Pi documentation, the libcamera-vid app supports MJPEG and YUV420 encoding as well as network streaming. Could this be used?
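
(For context, a documented invocation of that would look something like the line below; the resolution and port are placeholders, and this is not something mjpg-streamer itself consumes today.)

libcamera-vid -t 0 --codec mjpeg --width 1280 --height 720 --listen -o tcp://0.0.0.0:8888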

Serverfrog commented 2 years ago

I suspect the autofocus camera wouldn't work with the legacy stack because it didn't exist before libcamera became public.

jacksonliam commented 2 years ago

Relevant links: https://forums.raspberrypi.com/viewtopic.php?t=322076 https://forums.raspberrypi.com/viewtopic.php?t=273018

> For me the legacy system is not working; maybe the Sony IMX519 in my module is not supported. According to the Raspberry Pi documentation, the libcamera-vid app supports MJPEG and YUV420 encoding as well as network streaming. Could this be used?

Have you tried using the override "media-controller=0" when loading the dtoverlay (e.g. dtoverlay=imx219,media-controller=0) to revert to the video-node-centric mode of operation? That may get us at least YUV support from the /dev/videoX nodes.
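
(For the HQ camera discussed earlier, the equivalent /boot/config.txt change would presumably look like the snippet below; this assumes the imx477 overlay accepts the same media-controller parameter, which hasn't been verified in this thread.)

camera_auto_detect=0
dtoverlay=imx477,media-controller=0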

kbingham commented 2 years ago

You can indeed use the legacy stack to keep using the supported official RPi cameras, but it won't support the IMX519 or any other new camera that comes out for the RPi.

kbingham commented 2 years ago

To support libcamera in mjpg-streamer I see a couple of paths: either update mjpg-streamer to accept linking against a C++ library and implement a libcamera plugin for mjpg-streamer (probably my preferred option; only that specific plugin has to be compiled with a C++ compiler, the rest of the code base can stay C), or implement a gstreamer-based plugin using libcamerasrc, which already provides a C API.

@jacksonliam What would be your preference on those two routes?
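
(As an illustration of the gstreamer route only: a test pipeline of the kind an input plugin built on the GStreamer C API might wrap. The element chain here is an assumption, not anything agreed in this thread.)

gst-launch-1.0 libcamerasrc ! video/x-raw,width=1280,height=720 ! videoconvert ! jpegenc ! multifilesink location=/tmp/frame-%05d.jpg

An actual mjpg-streamer input plugin would presumably build a similar pipeline via gst_parse_launch() and swap the multifilesink for an appsink, so it can pull the encoded JPEG buffers through the C API.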

jacksonliam commented 2 years ago

I'd accept contributions of either, really, as long as it didn't affect the current code base building for non-RPi platforms. I saw the libcamera library seems to support MJPG, but I wasn't entirely sure whether that encoding is done by the GPU or the CPU.

I'm secretly hoping that the RPI cam v4l2 stack gets reworked to provide a device which behaves like a normal webcam though.

I'm not sure why Raspberry Pi has effectively gone backwards on v4l2 compatibility, to be honest. Surely all the other applications that use webcams no longer work with RPi cameras now?

kbingham commented 2 years ago

From my perspective, I can say they won't be going back, I'm afraid. It's simply not possible to support an ever-expanding range of cameras in their closed-source proprietary stack (the legacy stack).

Cameras are complicated, and driving them requires a lot of supporting code. Bringing this into open source, on top of the V4L2 framework, requires extra code. That's why we've built libcamera (note: I'm on the libcamera team, I don't work at RPi), so that every application doesn't have to do all of that itself.

You could, if you prefer, choose not to use libcamera: open the raw Bayer sensor node, capture Bayer images, send them into the ISP device (another video node) to convert them, and implement your own algorithms and control loops to adapt the exposure and gains. But... all of that has already been written for you (and is open source) by Raspberry Pi and exists in libcamera.

kbingham commented 2 years ago

The bug in the v4l2 compatibility layer there is interesting though. I had a quick look, and it's worth some more investigation to fix.

kbingham commented 2 years ago

It looks like Arducam have beaten me to implementing this: they already have a libcamera plugin for mjpg-streamer written and working at https://github.com/ArduCAM/mjpg-streamer. I've just tested it here and it's streaming. It probably needs more work to refine it and add more options, but it looks like a good starting point!

jacksonliam commented 2 years ago

> It looks like Arducam have beaten me to implementing this: they already have a libcamera plugin for mjpg-streamer written and working at https://github.com/ArduCAM/mjpg-streamer. I've just tested it here and it's streaming. It probably needs more work to refine it and add more options, but it looks like a good starting point!

That's cool, I'm glad someone tackled that. They seem to grab the frames as V4L2_PIX_FMT_RGB24 and convert them to JPEG via libjpeg on the CPU, though. It would be cool to see it use the ISP as input_raspicam did, as this won't be too great on a Pi 1 or Pi Zero.

kbingham commented 2 years ago

It's also probably more efficient to capture YUV for encoding with libjpeg.

I'm not sure whether the new drivers support/expose JPEG. I don't know why it's gone, but I don't see their drivers reporting MJPEG as any of the capture formats.

jacksonliam commented 2 years ago

Yes, probably. Though RGB-to-YUV conversion is probably fairly quick compared to the actual JPEG encoding.

There is an MJPEG example program in the RPi repos, but I've no idea whether it uses the ISP. I couldn't find the source for the JPEG encoding function declared in their JPEG header.

kbingham commented 2 years ago

Seems like they're just using libjpeg ...

https://github.com/raspberrypi/libcamera-apps/blob/main/encoder/mjpeg_encoder.cpp

kbingham commented 2 years ago

If someone wants to add H.264 support, then on an RPi there is hardware encode for that, with an example at https://github.com/raspberrypi/libcamera-apps/blob/main/encoder/h264_encoder.cpp

alexfornuto commented 1 year ago

I'm sorry to ask here, but ArduCAM's fork doesn't have an issue tracker. They have a raspicam plugin, but make doesn't create a .so file for it. Does a particular flag need to be added to include the plugin in the build?

I'm sure the answer is obvious to those who program; to those who just want to use OSS, it may not be.

EDIT: To clarify, when building it just skips the input_raspicam dir:

[ 40%] Building C object plugins/input_ptp2/CMakeFiles/input_ptp2.dir/input_ptp2.c.o
[ 44%] Linking C shared library input_ptp2.so
make[3]: Leaving directory '/home/alex/repos/ArduCAM/mjpg-streamer/mjpg-streamer-experimental/_build'
[ 44%] Built target input_ptp2
make[3]: Entering directory '/home/alex/repos/ArduCAM/mjpg-streamer/mjpg-streamer-experimental/_build'
make[3]: Leaving directory '/home/alex/repos/ArduCAM/mjpg-streamer/mjpg-streamer-experimental/_build'
make[3]: Entering directory '/home/alex/repos/ArduCAM/mjpg-streamer/mjpg-streamer-experimental/_build'
[ 48%] Building C object plugins/input_uvc/CMakeFiles/input_uvc.dir/dynctrl.c.o
[ 52%] Building C object plugins/input_uvc/CMakeFiles/input_uvc.dir/input_uvc.c.o

EDIT2: It looks like it depends on /opt/vc/include, which was removed in Buster?
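
(A quick way to confirm that theory on a given system is to check whether the legacy Broadcom userland headers that input_raspicam depends on are present at all, e.g.:)

ls /opt/vc/include/bcm_host.h

If that path doesn't exist, the build skipping the plugin is expected, since the plugin's configure step looks for the /opt/vc tree mentioned above.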