Open fphammerle opened 2 years ago
ustreamer works fine when I connect a TC358743 board instead of the camera.
@fphammerle
A simple fix for now is to enable Legacy Camera, assuming you are on Raspbian Bullseye:
sudo raspi-config
3 Interface Options > I1 Legacy Camera > Yes
sudo reboot
and voila it works (I hope xD, this is what I had to do)
My assumption is that ustreamer may rely on the older camera stack, and since the author is busy with PiKVM, the code hasn't been updated for the newer libcamera stack. Hopefully someone picks it up and is able to look into it. I might give it a shot once I learn more about how the new system is implemented.
Thanks @benjaminjacobreji for your suggestion to revert to the legacy stack!
If possible, I would prefer keeping libcamera, as I also use libcamera-vid --post-process-file ... from time to time.
I guess I'll wait for a fix, thanks!
I assume this is related to the recent (late 2021) introduction of the Media Controller API in newer Linux kernels and Raspbian Bullseye. Support for the Media Controller API was enabled for several camera modules including OV5647 (Pi Camera v1, end of life) and IMX219 (Pi Camera v2). It's not enabled for TC358743 though.
See the forum post "bcm2835-unicam and the Media Controller API - this is now live".
On a fresh install of Raspbian Bullseye, you'll likely have camera_auto_detect=1 in /boot/config.txt. In this case, the appropriate device tree overlay will be loaded automatically for supported camera modules and the Media Controller API will be enabled.
If you have, say, an IMX219 module, you can disable support for the Media Controller API by loading the appropriate overlay and adding media-controller=0:
camera_auto_detect=0
start_x=1
dtoverlay=imx219,media-controller=0
I ran into the same issue as you with:
sudo rpi-update
ustreamer
built from the https://github.com/PiKVM/uStreamer/tree/m2m branch (not sure it matters, but that's what I tested with)

My V4L2 settings after a fresh boot:
That the Media Controller API is available can be seen in the Device Caps value 0x25200001, in which the V4L2_CAP_IO_MC (0x20000000) flag is set. You also see a /dev/media3 node along with the /dev/video0 node for the unicam device.
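Checking for that flag amounts to a single bitwise test on the Device Caps value reported by v4l2-ctl; a small sketch (the helper function is my own, the flag value is from the Linux UAPI headers):

```python
# V4L2_CAP_IO_MC as defined in the Linux UAPI videodev2.h
V4L2_CAP_IO_MC = 0x20000000

def uses_media_controller(device_caps: int) -> bool:
    """True if the V4L2 'Device Caps' value has V4L2_CAP_IO_MC set,
    i.e. the node is configured via the Media Controller API."""
    return bool(device_caps & V4L2_CAP_IO_MC)

print(uses_media_controller(0x25200001))  # the value reported above -> True
```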
Below is the topology for the Image Signal Processor (ISP).
The topology associated with the camera sensor module.
If I just run ./ustreamer in this setup, it reports:
-- ERROR [99.029 stream] -- Can't start capturing: Invalid argument
In the kernel log (journalctl -t kernel --since '3 min ago'), the following lines are logged:
Mar 23 18:30:59 raspberry kernel: unicam 3f801000.csi: Wrong width or height 640x480 (remote pad set to 3280x2464)
Mar 23 18:30:59 raspberry kernel: unicam 3f801000.csi: Failed to start media pipeline: -22
The first problem can be corrected by updating the size on the V4L2 subdev.
# Display details for the sensor V4L2 subdev
$ media-ctl --print-topology --device /dev/media3
...
Device topology
- entity 1: imx219 10-0010 (2 pads, 2 links)
type V4L2 subdev subtype Sensor flags 0
device node name /dev/v4l-subdev0
pad0: Source
[fmt:SRGGB10_1X10/3280x2464 field:none colorspace:raw xfer:none ycbcr:601 quantization:full-range
crop.bounds:(8,8)/3280x2464
crop:(8,8)/3280x2464]
-> "unicam-image":0 [ENABLED,IMMUTABLE]
pad1: Source
[fmt:unknown/16384x1 field:none
crop.bounds:(8,8)/3280x2464
crop:(8,8)/3280x2464]
-> "unicam-embedded":0 [ENABLED,IMMUTABLE]
...
# See `media-ctl -h` for the syntax (pretty much: `'<entity>:<pad> <v4l2 properties>'`)
$ media-ctl --device 3 --set-v4l2 '"imx219 10-0010":0 [fmt:SRGGB10_1X10/640x480]'
# Verify
$ media-ctl -p -d 3
...
Device topology
- entity 1: imx219 10-0010 (2 pads, 2 links)
type V4L2 subdev subtype Sensor flags 0
device node name /dev/v4l-subdev0
pad0: Source
[fmt:SRGGB10_1X10/640x480 field:none colorspace:raw xfer:none ycbcr:601 quantization:full-range
crop.bounds:(8,8)/3280x2464
crop:(1008,760)/1280x960]
-> "unicam-image":0 [ENABLED,IMMUTABLE]
pad1: Source
[fmt:unknown/16384x1 field:none
crop.bounds:(8,8)/3280x2464
crop:(1008,760)/1280x960]
-> "unicam-embedded":0 [ENABLED,IMMUTABLE]
...
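When scripting this for several sensors or resolutions, the --set-v4l2 argument string can be assembled programmatically. A small sketch (the helper is hypothetical; the syntax follows `'<entity>:<pad> <v4l2 properties>'` from media-ctl -h as quoted above):

```python
def set_v4l2_arg(entity: str, pad: int, code: str, width: int, height: int) -> str:
    """Build a media-ctl --set-v4l2 argument: '"<entity>":<pad> [fmt:<code>/<w>x<h>]'."""
    return f'"{entity}":{pad} [fmt:{code}/{width}x{height}]'

# Reproduces the command used above for the IMX219 sensor subdev
print(set_v4l2_arg("imx219 10-0010", 0, "SRGGB10_1X10", 640, 480))
```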
Unfortunately, this is not enough: ustreamer still fails with Can't start capturing: Invalid argument and the kernel still logs Failed to start media pipeline: -22 (Invalid argument). It's not very helpful, but it indicates that one or more things aren't set up properly.
I've tried to disable use of the Media Controller API by updating /boot/config.txt with:
start_x=1
camera_auto_detect=0
dtoverlay=imx219,media-controller=0
After a reboot, I see:
$ v4l2-ctl --list-formats --device /dev/video0
ioctl: VIDIOC_ENUM_FMT
Type: Video Capture
[0]: 'pRAA' (10-bit Bayer RGRG/GBGB Packed)
[1]: 'RG10' (10-bit Bayer RGRG/GBGB)
[2]: 'RGGB' (8-bit Bayer RGRG/GBGB)
None of these pixel formats are among the ones supported by ustreamer, so it fails after proposing YUYV but getting pRAA from the driver.
I attempted to circumvent this problem by adding definitions for these formats to the source code, but all I ended up with after running ./ustreamer -s :: --log-level 3 -c m2m-video (or -c M2M-MJPEG) is a green/black weave pattern on the video stream.
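For reference, 'pRAA' is V4L2's 10-bit packed Bayer layout (SRGGB10P): four pixels in five bytes, with the fifth byte carrying the two low bits of each pixel. A sketch of the unpacking any such format support would need (my own illustration, not code from ustreamer):

```python
def unpack_sbggr10p(row: bytes) -> list[int]:
    """Unpack V4L2 10-bit packed Bayer ('pRAA'/SRGGB10P): every 5 bytes
    hold 4 pixels; bytes 0-3 are the high 8 bits, byte 4 carries the
    low-bit pairs (pixel i's low bits at bits 2i..2i+1)."""
    out = []
    for i in range(0, len(row) - len(row) % 5, 5):
        b0, b1, b2, b3, lows = row[i:i + 5]
        for j, hi in enumerate((b0, b1, b2, b3)):
            out.append((hi << 2) | ((lows >> (2 * j)) & 0x3))
    return out

# Four saturated pixels (0x3FF) pack to five 0xFF bytes
print(unpack_sbggr10p(bytes([0xFF] * 5)))  # [1023, 1023, 1023, 1023]
```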
It fails the same way on the imx519 (Arducam 16MP):
-- DEBUG [1226.812 stream] -- Starting device capturing ...
-- ERROR [1226.812 stream] -- Can't start capturing: Invalid argument
-- DEBUG [1226.812 stream] -- Releasing device buffers ...
[pid 879] ioctl(8, VIDIOC_STREAMON, [V4L2_BUF_TYPE_VIDEO_CAPTURE]) = -1 EINVAL (Invalid argument)
Hello,
I just tried with the latest code from the master branch (ustreamer 5.3):
- apt update && apt dist-upgrade -y && reboot (kernel 5.15.30)
- rpi-update
- camera_auto_detect=1 in /boot/config.txt

Using the examples in the README, I tried:
# Prevent kernel message "Wrong width or height 640x480 (remote pad set to 3280x2464)"
media-ctl --device 3 --set-v4l2 '"imx219 10-0010":1 [fmt:SRGGB10_1X10/640x480]'
# From the "Usage" section but with a lower resolution
./ustreamer --host :: -m jpeg --device-timeout=5 --buffers=3 -r 640x480
# From the "Raspberry Pi Camera Example" section
./ustreamer --format=uyvy --encoder=m2m-image --workers=3 --persistent --dv-timings --drop-same-frames=30
Both of these commands failed with:
-- ERROR [1616.438 stream] -- Can't start capturing: Invalid argument
With dmesg -T I see:
unicam 3f801000.csi: Failed to start media pipeline: -22
@mdevaev, do you have any input on what's necessary to be able to use ustreamer for streaming video from MIPI CSI-2 sensors under Raspbian Bullseye with the modern camera stack (unicam)?
(Comments in this thread suggested it's possible to work around the problem by enabling the legacy camera stack; I did not try this. Personally, I'd rather spend time on getting things to work with the modern camera stack to future-proof things and ease setup on a default install.)
Thanks for your time and keep up the good work with PiKVM!
OK. I have been playing with this to use the new bcm2835-isp. In general, libcamera converts pRAA into YUYV using the isp and DMA transfers. It is not hard, but ustreamer is not very flexible to extend and inject into. I hacked it somehow, adding an m2m isp stage. It kind of works, and breaks in a number of places, but should support imx219 and imx519.
You can see my branch here: https://github.com/ayufan-research/ustreamer. Compile it and then try to run it. I have trouble with higher resolutions. It seems to work on 5.10 and 5.15. On 5.15 it does work with the M2M-JPEG codec to further accelerate transfer.
This is not efficient, nor finished, and should only be seen as a demonstration of how it might be done.
ustreamer --host :: -r 1920x1080 -m pRAA -w 1 -b 1 --debug --no-log-colors -d /dev/video0
Nice work!
I just tested this with a Raspberry Pi Zero 2W and later a Zero W v1.1, both running kernel 5.15.32.
media-ctl --device 3 --set-v4l2 '"imx219 10-0010":0 [fmt:SRGGB10_1X10/640x480]'
./ustreamer --host :: -r 640x480 -m pRAA -w 1 -b 1 --debug --no-log-colors -d /dev/video0
This might be completely coincidental... but the Zero 2W appears to have fried within about 15 seconds of running these commands, before I managed to open the browser to test the stream. I lost SSH access and smelled a very hot PCB. After powering it off, I let it cool for 10 minutes and then tried to power it up again, but it immediately becomes very hot and doesn't light any LEDs. Not sure what happened here -- it had been running idle for a few hours already on a clean desk, and I only accessed it over SSH.
Anyway, I moved the SD card and the camera over to a Raspberry Pi Zero W v1.1 and re-ran the commands. If I move my hand in front of the camera, I see a dark grey silhouette of it on a darker/black background on the streaming page. Is that expected? (libcamera-still -o still.jpg produces an OK picture though.)
I also note that if I run libcamera-still -o still.jpg, then a subsequent ./ustreamer ... fails with Can't start capturing: Invalid argument and the kernel reports unicam 20801000.csi: Failed to start media pipeline: -22. It seems some settings are left in a different (unexpected) state after running libcamera-still than after a fresh boot.
BTW, what Raspberry Pi hardware did you test with?
same issue
Sorry for the late answer.
I'm using ustreamer with some USB webcams and a CSI-HDMI bridge, so I don't know how to make it work with a regular CSI camera; I don't have any. I'll try to get one, but I can't promise that it will happen soon.
Thanks a lot for looking into this issue, @mdevaev !
This might be completely incidental.. but the Zero 2W appears to have fried within about 15 seconds after running these commands, before I managed to open the browser to test out the stream. I lost SSH access and smelled a very hot PCB. After powering it off, I let it cool for 10 minutes and then tried to power it up again but it immediately becomes very hot and doesn't light any LEDs. Not sure what happened here -- it had been running idle for a few hours already on a clean desk, and I only accessed it over SSH.
I did not have such problems.
Anyway, I moved the SD card and the camera over to a Raspberry Pi Zero W v1.1 and re-ran the commands. If I move my hand in front of the camera, I see a dark grey silhouette of it on a darker/black background on the streaming page. Is that expected? (libcamera-still -o still.jpg produces an OK picture though.)
This is a problem of bcm2835-isp not being configured. libcamera reads sensor parameters from /usr/share/libcamera/ipa/raspberrypi/imx519.json (or another file, depending on the sensor used) and configures /dev/video13 (you can see the available controls with v4l2-ctl -d /dev/video13 -L, or the ISP params only with: v4l2-ctl -d /dev/video13 -C red_balance -C colour_correction_matrix -C black_level -C green_equalisation -C gamma -C denoise -C sharpen -C defective_pixel_correction -C colour_denoise).
Once you configure this, the above https://github.com/ayufan-research/ustreamer starts to work.
In general, all those DSLR-type sensors (from Sony and others) return only an 8-bit or 10-bit packed Bayer format (once you hook into this stream, the latency is simply awesome), which reflects exactly how CMOS camera sensors are built. Once you have this data, you need to manually calculate exposure time and analogue and digital gains. This can be done by the ISP. The Raspberry Pi has bcm2835-isp (present in the system under /dev/video12-15), which can take this raw data, apply all corrections (colour, gamma, etc.) and output YUYV/YU12/NV12, which can then be used as a source for other encoders (JPEG or H264). The ISP is able to output two resolutions at the same time, so it can effectively perform a single capture and have a high-res and a low-res YUYV stream handled concurrently.
Why am I writing about this? The ISP is software controlled. Trying to re-implement libcamera, which manages the ISP, is a tedious task given all the aspects that need to be covered. Previously we would have UVC or another type of sensor with an integrated ISP returning YUYV, where we would configure brightness, contrast, focus, etc. This means that without libcamera it is fairly hard to implement automated brightness control or auto focus. The https://github.com/ayufan-research/ustreamer fork uses the ISP directly, and this is why you saw a mostly black image: the ISP was not configured. It can be configured one-off by running libcamera-still (with any parameters you might want).
About ustreamer: this can be supported today using a trick:
a. comment out the VIDIOC_S_INPUT call: https://github.com/pikvm/ustreamer/blob/7ceb8a3da5a2aab92de9a5bebe294a122f01bb34/src/ustreamer/device.c#L421
b. use v4l2-ctl to manually feed the ISP with a stream of data: v4l2-ctl --device /dev/video0 --out-device /dev/video13 --stream-mmap --stream-out-dmabuf (it will copy the RG10 data into the ISP)
c. run ustreamer --device /dev/video14 (it will expose YUYV, or whatever you configure it to)
d. this will work, but with pretty terrible latency
Exactly this behavior is implemented in https://github.com/ayufan-research/ustreamer: the camera data is fed into the ISP, and that is fed into JPEG. The implementation is pretty clumsy, since the ustreamer architecture does not make it easy to create multi-stage pipelines for data processing or to hook in libcamera. For example, even though the fork hooks the ISP, it does not offer a YUYV stream for H264 support, and there's no easy way around this limitation except maybe rewriting device to be libcamera centric (which internally uses the ISP).
The integrated kernel data pipelining (as done with v4l2-ctl ...) might land at some point in the Linux kernel: https://lore.kernel.org/linux-arm-kernel/Yd8fJd2SASGkhOjm@pendragon.ideasonboard.com/t/. This change uses media-controller routing capabilities to hook into available v4l2 devices and route data streams between them, producing a sink with the desired formats. I'm unsure if this got merged, and if so, when it filters down to Raspberry Pi OS. However, this too will have the same limitation: libcamera needs to control the ISP to provide auto brightness and auto focus control.
As an exercise I tried to do it, and thus (sorry for slightly hijacking this thread with my own project) over the last few days created https://github.com/ayufan-research/camera-streamer. After trying the first iteration in the ustreamer codebase (with its lack of flexible multi-stage pipelines), it felt easier to start from scratch with a project that is v4l2 hardware centric. I wrote a v4l2-only pipeline, but at some point figured out the above and saw that not using libcamera is in general not a wise idea. Using libcamera comes with a latency impact (but you get good control of exposure and focus), as I measured in my project compared to direct ISP-mode. Using direct ISP-mode I get far better performance and latency than anything I have tested so far (including my branch of https://github.com/ayufan-research/ustreamer). I got pretty far, to the point that I started using the linked project daily on my Voron 0 with an Arducam 16MP for MJPEG and H264.
It appears I might have taken the wrong approach with https://github.com/ayufan-research/ustreamer by implementing an ISP step instead of rewriting device to be libcamera centric. Using libcamera is not really that hard, and it offers the same access to mmaped and dma-enabled buffers: https://github.com/ayufan-research/camera-streamer/tree/master/device/libcamera.
@ayufan thank you for the research. It seems ustreamer is not the best choice for the IMX219 and similar sensors. ustreamer's main purpose is HDMI capture. When I wrote ustreamer, libcamera did not exist on the RPi, so the V4L2 encoders have no flexible support right now. Actually, I don't mind adding support for the ISP and other things if it doesn't spoil latency, etc.
@mdevaev Adding ISP support is doable, but it is not the best idea :) If you want to support cameras, consider making device support two modes of operation:
- v4l2 (as today, for minimal latency)
- libcamera (for maximum compatibility) -> the output of libcamera will be the capture buffer with dma and mmap, so no later stages will need to be changed in ustreamer

The latency difference, according to my tests, is due to the way buffers are enqueued by libcamera and how this introduces an additional processing delay. Depending on how you queue (on which slope) and when you enqueue a buffer, you might achieve significantly better latency, as shown in the ISP-mode. libcamera can still achieve 120fps, it is just slightly slower :). This does make a difference for how fluid the motion feels.
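That buffer-ageing effect can be sketched as a toy model (my own illustration; the assumption is that each extra queued buffer adds one frame interval to the age of the frame you finally serve):

```python
def worst_case_age_ms(fps: float, queued_buffers: int, processing_ms: float) -> float:
    """Rough age of a served frame: each buffer queued ahead of it adds one
    frame interval, plus the per-frame processing time. Toy model only."""
    return queued_buffers * (1000.0 / fps) + processing_ms

# At 10 fps, three queued buffers already cost 300 ms before processing starts
print(worst_case_age_ms(10, 3, 57.5))  # 357.5
```

This is why the penalty grows when the FPS is set lower than the sensor can provide: the frame interval (and thus the cost per queued buffer) gets longer.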
This is especially visible when you have the FPS set lower than the sensor can provide. Now, having too many buffers enqueued does increase latency. Running this very dumb test https://www.youtube.com/watch?v=e8ZtPSIfWPc:
- 2328x1748@30fps: libcamera and my direct ISP are comparable, losing about 9 frames (at 60Hz)
- 2328x1748@10fps: libcamera loses about 25 frames, direct ISP loses about 11 frames
- I did not test, but I wonder how this compares to ustreamer ;)

2328x1748@30fps:
# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=63/0, processing_ms=101.1, frame_ms=33.1
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=64/0, processing_ms=99.2, frame_ms=31.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=65/0, processing_ms=99.6, frame_ms=34.8
# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=30 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=32/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=33/0, processing_ms=49.7, frame_ms=33.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=34/0, processing_ms=49.7, frame_ms=33.4
2328x1748@10fps:
# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=585/0, processing_ms=155.3, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=586/0, processing_ms=155.5, frame_ms=100.2
# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=10 -camera-width=2328 -camera-height=1748 -camera-high_res_factor=1.5 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=260/0, processing_ms=57.5, frame_ms=99.7
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=261/0, processing_ms=57.6, frame_ms=100.0
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=262/0, processing_ms=58.0, frame_ms=100.4
1280x720@120fps for Arducam 16MP:
# libcamera
$ ./camera_stream -camera-path=/base/soc/i2c0mux/i2c@1/imx519@1a -camera-type=libcamera -camera-format=YUYV -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=139/0, processing_ms=20.1, frame_ms=7.9
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=140/0, processing_ms=20.6, frame_ms=8.8
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=141/0, processing_ms=19.8, frame_ms=8.1
# direct ISP-mode
$ ./camera_stream -camera-path=/dev/video0 -camera-format=RG10 -camera-fps=120 -camera-width=1280 -camera-height=720 -log-filter=buffer_lock
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=157/0, processing_ms=18.5, frame_ms=8.4
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=158/0, processing_ms=18.5, frame_ms=8.3
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf2 (refs=2), frame=159/0, processing_ms=18.5, frame_ms=8.3
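The processing_ms figures above can be aggregated straight from the buffer_lock log lines; a throwaway parser (my own, matching the log format shown):

```python
import re
import statistics

def mean_processing_ms(log: str) -> float:
    """Average the processing_ms= values from camera_stream buffer_lock lines."""
    vals = [float(v) for v in re.findall(r"processing_ms=([\d.]+)", log)]
    return statistics.mean(vals)

log = """\
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf0 (refs=2), frame=157/0, processing_ms=18.5, frame_ms=8.4
device/buffer_lock.c: http_jpeg: Captured buffer JPEG:capture:mplane:buf1 (refs=2), frame=158/0, processing_ms=18.5, frame_ms=8.3
"""
print(mean_processing_ms(log))  # 18.5
```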
I'm using your latest camera-streamer for my Voron (3D printer) with an Arducam imx477 and a CM4. Just wanted to say THANK YOU!!!!!
camera-streamer is amazing; ustreamer didn't work for me either. I'm using as dt-overlay: dtoverlay=imx477,media-controller=1
It works absolutely perfectly with the new media-controller.
I'm streaming at 1080p@60fps, it's almost completely latency free, takes no CPU usage, and I'm overall extremely happy!
Your fork of ustreamer sadly doesn't work for me. Same error as mainline ustreamer.
However, please continue the work on camera-streamer, it's perfect, I love it, and thank you so much for that!
Cheers
I've got a camera and experimented with it. Too much work would be needed for native support, but Raspberry Pi offers "libcamerify", which works fine with uStreamer. On Raspberry Pi OS, install libcamera-tools and run as follows: libcamerify ./ustreamer -r 1920x1080 --encoder=m2m-image
I have Bullseye on a Pi4 and an RPi Camera v2.
bristolfish@Mycodo:~ $ sudo apt install libcamera-tools
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
libcamera-tools is already the newest version (0.1.0+rpt20231122-1).
0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded.
bristolfish@Mycodo:~ $ libcamera-hello
[5:33:33.478314526] [61051] INFO Camera camera_manager.cpp:284 libcamera v0.1.0+147-057299d0-dirty (2024-01-24T08:51:15+00:00)
[5:33:33.556571064] [61057] WARN RPiSdn sdn.cpp:39 Using legacy SDN tuning - please consider moving SDN inside rpi.denoise
[5:33:33.558631749] [61057] WARN RPI vc4.cpp:398 Mismatch between Unicam and CamHelper for embedded data usage!
[5:33:33.559594611] [61057] INFO RPI vc4.cpp:452 Registered camera /base/soc/i2c0mux/i2c@1/imx219@10 to Unicam device /dev/media4 and ISP device /dev/media0
[5:33:33.559697147] [61057] INFO RPI pipeline_base.cpp:1167 Using configuration file '/usr/share/libcamera/pipeline/rpi/vc4/rpi_apps.yaml'
[5:33:33.560713675] [61057] WARN V4L2 v4l2_pixelformat.cpp:338 Unsupported V4L2 pixel format H264
Preview window unavailable
Mode selection for 1640:1232:12:P
    SRGGB10_CSI2P,640x480/0 - Score: 4504.81
    SRGGB10_CSI2P,1640x1232/0 - Score: 1000
    SRGGB10_CSI2P,1920x1080/0 - Score: 1541.48
    SRGGB10_CSI2P,3280x2464/0 - Score: 1718
    SRGGB8,640x480/0 - Score: 5504.81
    SRGGB8,1640x1232/0 - Score: 2000
    SRGGB8,1920x1080/0 - Score: 2541.48
    SRGGB8,3280x2464/0 - Score: 2718
Stream configuration adjusted
[5:33:33.563140599] [61051] INFO Camera camera.cpp:1183 configuring streams: (0) 1640x1232-YUV420 (1) 1640x1232-SBGGR10_CSI2P
[5:33:33.563608039] [61057] INFO RPI vc4.cpp:616 Sensor: /base/soc/i2c0mux/i2c@1/imx219@10 - Selected sensor format: 1640x1232-SBGGR10_1X10 - Selected unicam format: 1640x1232-pBAA
^C
bristolfish@Mycodo:~ $ libcamerify ustreamer -r 1920x1080 --encoder=m2m-image
ERROR: ld.so: object '/usr/lib/aarch64-linux-gnu/libcamera/v4l2-compat.so' from LD_PRELOAD cannot be preloaded (cannot open shared object file): ignored.
Unknown encoder type: m2m-image; available: CPU, HW, NOOP
Install libcamera-v4l2 as well; they split the package.
Thanks, that made part of the error go away, but I'm left with this one
sudo apt-get install libcamera-v4l2
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
The following NEW packages will be installed:
libcamera-v4l2
0 upgraded, 1 newly installed, 0 to remove and 50 not upgraded.
Need to get 42.4 kB of archives.
After this operation, 199 kB of additional disk space will be used.
Get:1 http://archive.raspberrypi.com/debian bookworm/main arm64 libcamera-v4l2 arm64 0.1.0+rpt20231122-1 [42.4 kB]
Fetched 42.4 kB in 0s (168 kB/s)
Selecting previously unselected package libcamera-v4l2:arm64.
(Reading database ... 165940 files and directories currently installed.)
Preparing to unpack .../libcamera-v4l2_0.1.0+rpt20231122-1_arm64.deb ...
Unpacking libcamera-v4l2:arm64 (0.1.0+rpt20231122-1) ...
Setting up libcamera-v4l2:arm64 (0.1.0+rpt20231122-1) ...
bristolfish@Mycodo:~ $ libcamerify ustreamer -r 1920x1080 --encoder=m2m-image
Unknown encoder type: m2m-image; available: CPU, HW, NOOP
Is it ustreamer from the Debian repo? Show ustreamer --version.
bristolfish@Mycodo:~ $ ustreamer --version
4.9
That is a very old release. Uninstall it and build from git if you want to use the m2m encoder.
Thanks. I'll try the Docker version I think
It is better to use native build. The list of dependencies is in the README, the build will take a few seconds.
@samuk Sup?
I do want to end up running it in Docker; the underlying OS will be https://github.com/balena-os. I'll have a go at native first though, if that's your recommendation.
What's up with Docker?
Docker requires port forwarding and passing through the video device. There are no problems with it, but I usually recommend the native launch because it's easier.
Is it working?
Hi,
I get "Unable to start capturing: Invalid argument" on a Raspberry Pi 2B running Raspberry Pi OS Bullseye:
I installed ustreamer from "deb http://raspbian.raspberrypi.org/raspbian/ bullseye":
I tried upgrading to ustreamer v4.11 (source build), but the error persists. According to strace, the error occurs in ioctl(8, VIDIOC_STREAMON, [V4L2_BUF_TYPE_VIDEO_CAPTURE]). libcamera-vid works fine.
Any ideas how to solve this issue?
Thank you!