Closed: BrodyStone21 closed this issue 2 years ago
> because I thought since both cameras were running mjpeg streams, it would be the same process.

There is a separate process created for each path defined for each camera.
For the RPi camera I would try using the recommended mjpeg args in the docs: https://docs.frigate.video/configuration/camera_specific#mjpeg-cameras
I'm not sure how relevant this is, but if the above doesn't work then https://patchwork.ffmpeg.org/project/ffmpeg/patch/1485019650-30152-1-git-send-email-pkoshevoy@gmail.com/ could give some clues. It seems the decoder doesn't support 4:4:4 chroma subsampling, and your ffprobe output indicated yuvj444p.
Maybe play around with the RPi camera streaming settings and see if something there makes a difference
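For reference, a minimal sketch of what a camera entry using the docs' recommended MJPEG approach might look like. The camera name, address, and the exact args here are illustrative only, not taken from this thread or verified against the linked docs page:

```yaml
# Hypothetical Frigate camera entry for an MJPEG-over-HTTP source.
# See the camera_specific docs page for the actual recommended args.
cameras:
  pi_cam:
    ffmpeg:
      input_args: -avoid_negative_ts make_zero -fflags nobuffer -use_wallclock_as_timestamps 1
      inputs:
        - path: http://192.168.1.102:8000/video
          roles:
            - detect
```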
Thanks for the reply. Unfortunately, I'm already using the recommended input/output args for playing mjpeg streams; see my config.
I'll try messing around some more with it. My original plan was to try and use rtsp with the pi instead of mjpeg, but finding a solution proved to be pretty difficult with the migration from the legacy cam to libcamera.
> Thanks for the reply. Unfortunately, I'm already using the recommended input/output args for playing mjpeg streams; see my config.

You do not have `-use_wallclock_as_timestamps 1` set, which is recommended.
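For anyone following along, that flag belongs in the camera's ffmpeg input args. A sketch only, with the other args elided:

```yaml
ffmpeg:
  input_args: -use_wallclock_as_timestamps 1  # plus the other recommended mjpeg args
```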
Oops, forgot to mention I removed that for testing. I obviously forgot to put it back in.
I put it back in and the issue persists. Any other ideas?
Can you run ffprobe on the camera that works? Would be helpful to maybe confirm for sure what the differences between them are
Sure thing
```
ffprobe version 5.0.1-Jellyfin Copyright (c) 2007-2022 the FFmpeg developers
  built with gcc 10 (Debian 10.2.1-6)
  configuration: --prefix=/usr/lib/jellyfin-ffmpeg --target-os=linux --extra-libs=-lfftw3f --extra-version=Jellyfin --disable-doc --disable-ffplay --disable-ptx-compression --disable-shared --disable-libxcb --disable-sdl2 --disable-xlib --enable-lto --enable-gpl --enable-version3 --enable-static --enable-gmp --enable-gnutls --enable-chromaprint --enable-libdrm --enable-libass --enable-libfreetype --enable-libfribidi --enable-libfontconfig --enable-libbluray --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libdav1d --enable-libwebp --enable-libvpx --enable-libx264 --enable-libx265 --enable-libzvbi --enable-libzimg --enable-libfdk-aac --arch=amd64 --enable-libshaderc --enable-libplacebo --enable-vulkan --enable-opencl --enable-vaapi --enable-amf --enable-libmfx --enable-ffnvcodec --enable-cuda --enable-cuda-llvm --enable-cuvid --enable-nvdec --enable-nvenc
  libavutil      57. 17.100 / 57. 17.100
  libavcodec     59. 18.100 / 59. 18.100
  libavformat    59. 16.100 / 59. 16.100
  libavdevice    59.  4.100 / 59.  4.100
  libavfilter     8. 24.100 /  8. 24.100
  libswscale      6.  4.100 /  6.  4.100
  libswresample   4.  3.100 /  4.  3.100
  libpostproc    56.  3.100 / 56.  3.100
Input #0, mpjpeg, from 'http://192.168.1.101:8000/video':
  Duration: N/A, bitrate: N/A
  Stream #0:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 1920x1080 [SAR 1:1 DAR 16:9], 25 tbr, 25 tbn
```
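The relevant difference is buried in the `Stream` line. A quick standalone sketch (not part of Frigate) for pulling the pixel-format token out of ffprobe's text output, useful when comparing two cameras:

```python
import re
from typing import Optional

def pixel_format(stream_line: str) -> Optional[str]:
    """Extract the pix_fmt token (e.g. yuvj420p) from an ffprobe Stream line."""
    # Match "Video: <codec>", an optional profile like "(Baseline)",
    # then the pixel-format token that follows the comma.
    m = re.search(r"Video:\s*\w+(?:\s*\([^)]*\))?,\s*([a-z0-9]+)", stream_line)
    return m.group(1) if m else None

working = ("Stream #0:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), "
           "1920x1080 [SAR 1:1 DAR 16:9], 25 tbr, 25 tbn")
print(pixel_format(working))  # -> yuvj420p
```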
Yeah, looks like this one is yuv420 and not yuv444; seems that is the issue.
Something interesting here: https://github.com/raspberrypi/picamera2/pull/161
If unable to set it there, you may need to look into the cuvid args and see if you can set the pixel format to yuv444 on the NVIDIA side.
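If that route is explored, the GPU decoder is selected through Frigate's hwaccel args. A hypothetical fragment (the decoder name is stock ffmpeg's cuvid MJPEG decoder, not something verified against this setup):

```yaml
ffmpeg:
  # Hypothetical: decode MJPEG on the GPU via NVDEC/cuvid.
  # Consumer NVDEC generally cannot decode 4:4:4 JPEG, which is likely
  # why the yuvj444p stream fails while the yuvj420p one works.
  hwaccel_args: -c:v mjpeg_cuvid
```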
I'll ask there, thank you. This problem is proving to be much more difficult than I thought haha.
I'm sure that once I get it figured out, it'll be able to help a lot of people using frigate with Raspberry Pis.
UPDATE!!
Finally got the Raspberry Pi to encode its stream in YUV420 instead of YUV444. I asked on the raspberrypi/picamera2 repo and was helped there. The main developer has now created a pull request to make the JpegEncoder module use YUV420 by default, but it can also be manually specified.
This request will go into the "next" branch for now, but will eventually make its way into the final release. In the meantime, you can reference this PR, or just use the JpegEncoder module from the next branch.
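For anyone landing here later, the fix on the Pi side boils down to asking libcamera for a 4:2:0 main stream before handing frames to JpegEncoder. A sketch under the assumption of the picamera2 API around the time of that PR (names may have shifted since); the hardware-dependent part is guarded so it only runs on an actual Pi:

```python
from typing import Tuple

def make_stream_config(size: Tuple[int, int]) -> dict:
    """Stream settings requesting 4:2:0 output from libcamera, so the
    resulting JPEGs are yuvj420p (NVDEC-decodable) rather than yuvj444p."""
    return {"main": {"size": size, "format": "YUV420"}}

if __name__ == "__main__":
    # Runs only on a Raspberry Pi with picamera2 installed.
    from picamera2 import Picamera2
    from picamera2.encoders import JpegEncoder

    picam2 = Picamera2()
    config = picam2.create_video_configuration(**make_stream_config((1920, 1080)))
    picam2.configure(config)
    # Writing to a file here for simplicity; an HTTP streaming output
    # (as in the mjpeg_server.py example) would be used instead.
    picam2.start_recording(JpegEncoder(), "stream.mjpeg")
```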
Cool, glad it is working now 👍
Describe the problem you are having
Hey guys, I've recently set up frigate and gotten hardware transcoding working. However, it's only working on one of my cameras.
I have two cameras, one is a Google Pixel 3 running IP Webcam downloaded from the Google Play Store. It's streaming its contents using mjpeg over http.
The second camera is a Raspberry Pi 4 running 32-bit Bullseye with the official v2 camera module. It is configured to use libcamera (what Raspberry Pi OS has started migrating to) instead of the legacy camera option, and I'm using a modified version of raspberrypi/picamera2/examples/mjpeg_server.py script to stream the camera feed over http.
I'm on Ubuntu 21.04 LTS headless with the latest version of nvidia-driver-515 installed.
Hardware acceleration for the first camera works just fine. However, I cannot get hardware acceleration for the second camera to work. I must be misunderstanding something, because I thought since both cameras were running mjpeg streams, it would be the same process.
I'm trying to get this set up correctly as a test, before I commit to buying more raspberry pis and starting a home project.
Below is my docker run command, but I've had issues with both docker-compose and docker run.
Version
0.11.0-rc1
Frigate config file
Relevant log output
FFprobe output from your camera
Frigate stats
Operating system
Debian
Install method
Docker CLI
Coral version
CPU (no coral)
Network connection
Mixed
Camera make and model
Raspberry Pi 4 (32-bit Bullseye) v2 Camera Module
Any other information that may be helpful
I can open my web interface endpoint, and it works fine. I see both cameras. However, the Raspberry Pi stream is green. After clicking on it, it shows the camera as white.