Closed: rourke750 closed this issue 4 years ago
The Coral isn't a general compute device that you can offload processing to. It can only run specialized tensorflow lite models. You would have to build in motion detection and everything else into the model somehow.
Thank you @blakeblackshear! Actually, I was referring to feeding every captured frame (after scaling) directly to the Coral for detection with the TensorFlow Lite model, without needing motion estimation and object tracking to obtain regions. I totally agree with the way it is implemented; I was only thinking about Raspberry Pi performance.
Earlier versions worked that way. In my experience, the motion detection and object tracking are a very small percentage of the CPU use in comparison to decoding the video stream and the bare minimum processing, even on a Pi. I'm not sure it would reduce the CPU usage a meaningful amount.
@kpine, will you update your version to the latest release? It would be nice to be able to use the new "clips" function...
Thanks!
Now that the tflite runtime is out for python 3.8, I can move away from the plasma store and use the built-in shared memory. That should make it much easier to have an official RPi image.
Until then, I have built v0.6.0 from commit 309c0dc and it seems to be working. I don't plan on updating it for minor releases, but I may update it for major ones; I made it because I wanted the clip recording myself.
This is unofficial and unsupported, so please use at your own risk. I recommend using docker-compose rather than docker run; that way you can have two image lines in the file and switch between them quickly by commenting one out.
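The docker-compose approach described above could look something like this. A minimal sketch: the service name, volumes, and config path are placeholders I've made up, and only the image tags come from this thread.

```yaml
# Hypothetical docker-compose.yml sketch: keep both image lines and
# comment/uncomment one to switch between builds quickly.
version: "3"
services:
  frigate:
    # image: blakeblackshear/frigate:stable    # official image
    image: kpine/frigate-raspberrypi:0.5.2     # unofficial RPi build
    restart: unless-stopped
    volumes:
      - ./config.yml:/config/config.yml:ro     # placeholder path
```

Switching back to the official image is then a matter of moving the comment and running `docker-compose up -d` again.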
Raspberry Pi support has been merged to the dev branch. It would be great if some of you could test and help identify issues. I have been testing with the latest version of Raspbian and Docker.
32bit image

```
docker pull blakeblackshear/frigate:dev-34c7697-armv7hf
```

```yaml
hwaccel_args:
  - -c:v
  - h264_mmal
```
64bit image

```
docker pull blakeblackshear/frigate:dev-34c7697-arm64
```

```yaml
hwaccel_args:
  - -c:v
  - h264_v4l2m2m
```
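If I recall the config layout of this era correctly, the `hwaccel_args` above go under the `ffmpeg` section of the frigate config. A hedged sketch only; the camera name and RTSP URL are placeholders:

```yaml
# Hedged sketch of where hwaccel_args fit; camera name and URL are
# placeholders, not values from this thread.
ffmpeg:
  hwaccel_args:
    - -c:v
    - h264_mmal        # 32bit image; use h264_v4l2m2m for the 64bit image
cameras:
  back:                # placeholder camera name
    ffmpeg:
      input: rtsp://user:pass@camera-ip:554/stream   # placeholder URL
```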
Hi Blake
Awesome, thanks for this :)
I've tried it out on a Raspberry Pi 4 but get an error.
```
Linux frigate-pi 4.19.75-v7l+ #1270 SMP Tue Sep 24 18:51:41 BST 2019 armv7l GNU/Linux
Docker version 19.03.5, build 633a0ea
```
Image blakeblackshear/frigate:dev-34c7697-armv7hf
```
Creating ffmpeg process...
ffmpeg -hide_banner -loglevel info -c:v h264_mmal -avoid_negative_ts make_zero -fflags nobuffer -flags low_delay -strict experimental -fflags +genpts+discardcorrupt -r 7 -use_wallclock_as_timestamps 1 -i rtmp://xxx:1935/bcs/channel0_sub.bcs?channel=0&stream=0&user=admin&password=xxx -f rawvideo -pix_fmt rgb24 pipe:
Input #0, flv, from 'rtmp://xxx:1935/bcs/channel0_sub.bcs?channel=0&stream=0&user=admin&password=xxx':
  Metadata:
    displayWidth  : 640
    displayHeight : 480
  Duration: 00:00:00.00, start: 1602166636.443000, bitrate: N/A
    Stream #0:0: Data: none
    Stream #0:1: Audio: aac, 16000 Hz, mono, fltp
    Stream #0:2: Video: h264, yuv420p(progressive), 640x480, 6 fps, 1k tbr, 1k tbn
mmalipc: mmal_vc_init_fd: could not open vchiq service
[h264_mmal @ 0x19fd9b0] Cannot initialize MMAL VC driver!
Stream mapping:
  Stream #0:2 -> #0:0 (h264 (h264_mmal) -> rawvideo (native))
Error while opening decoder for input stream #0:2 : Function not implemented
front: ffmpeg sent a broken frame. something is wrong.
front: ffmpeg process is not running. exiting capture thread...
Creating ffmpeg process...
```
Can you try without the hwaccel_args? I only tested the 64bit image on the RPi4 and the 32bit image on a RPi3. Getting hwaccell to work was way more challenging than I expected, so it may be possible that I will need a 32bit specific build for the RPi4.
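For what it's worth, the "could not open vchiq service" line usually means the container cannot see the host's VideoCore device node. A hedged compose fragment that passes it through, assuming `/dev/vchiq` actually exists on the host:

```yaml
# Hedged sketch: MMAL decoding talks to the VideoCore GPU via /dev/vchiq,
# so the device has to be mapped into the container for h264_mmal to work.
services:
  frigate:
    image: blakeblackshear/frigate:dev-34c7697-armv7hf
    devices:
      - /dev/vchiq:/dev/vchiq
```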
I could have sworn I tried that but I got a different error - perhaps the error below based on my terminal history
```
[h264_v4l2m2m @ 0x143d860] Could not find a valid device
[h264_v4l2m2m @ 0x143d860] can't configure decoder
Stream mapping:
  Stream #0:1 -> #0:0 (h264 (h264_v4l2m2m) -> rawvideo (native))
Error while opening decoder for input stream #0:1 : Invalid argument
```
Anyway I've just tried it again (so without the hwaccel_args) and it works perfectly :), so I must have done something wrong before
I couldn't get the 64-bit image to run though - it gave the 'exec format error'.
You will need the 64bit Raspbian OS to run the 64bit image; the exec format error is expected. I will need to flash another SD card to test 32bit on the RPi4. I did notice that running the 64bit image resulted in faster inference speeds: I was seeing ~15ms on a RPi4 with the 32bit OS and ~11ms on a RPi4 with the 64bit OS.
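A quick way to tell which image variant matches your OS. Note this is a heuristic: `uname -m` reports the kernel architecture, and on a Pi a 64-bit kernel can run a 32-bit userland.

```shell
# Print which frigate image variant matches this OS.
# uname -m reports the kernel architecture: armv7l means a 32-bit
# kernel, aarch64 means 64-bit.
arch=$(uname -m)
case "$arch" in
  aarch64) echo "64-bit: use the arm64 image" ;;
  armv7l)  echo "32-bit: use the armv7hf image" ;;
  *)       echo "other architecture: $arch" ;;
esac
```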
Thanks - guessed that was the case.
Wow nice - I look forward to installing it once the lite version is released
I imagine a lot of people, at least for now, will end up with the 32-bit version even on their RPi4 because the 64-bit version of Raspberry Pi OS (replaces Raspbian) is still in beta. The downloads page doesn't mention a 64-bit version so you have to hunt it down intentionally.
My RPi4 is using a 32-bit version of Hypriot, and the docker image I was building is also 32-bit (cross compiled on a x86_64 VM via docker buildx). I've planned to re-image it with a 64-bit version and will certainly do so when frigate supports it. I have only heard that the 64-bit OS improves performance on the Pi 4.
I don't think Hypriot offers a 64-bit version yet. BalenaOS and Ubuntu do and DietPi has a beta version.
I'm a fan of Hypriot too. It's what I use for everything else. I started with Raspbian because I wanted to eliminate as many variables as possible for getting ffmpeg hwaccel to work.
FYI there is a 'lite' version of Raspberry PI OS 64 here: https://downloads.raspberrypi.org/raspios_lite_arm64/images/
I am also running Frigate at home on a Pi4 running 32-bit; however, it is running as part of a k3s cluster. I've been running the kpine/frigate-raspberrypi:0.5.2 image for a while now without any issues.
I ordered another RPi4 for testing 32bit builds.
Working on an update where I convert pixel formats; initial testing shows CPU usage dropped by 50% on average for a 1080p 5fps stream on the RPi 4.
@blakeblackshear sounds great! I am starting to test in my K3S cluster of RPi4s. Until now, I was using kpine/frigate-raspberrypi:0.5.2 (32bit). Let me ask a few questions in advance to help debug:
- How do I activate debug traces to see them in the container log? I only see the initial trace.
There is no logging level configuration. You should be able to see the logs with `kubectl logs -f <pod>`.
- I don't see the MQTT messages for `frigate/<camera_name>/events/start` & `end`, at least in version 0.5.2. Is it something new in 0.6.x?
Correct. That was added in 0.6.x.
- Should I change from `-pix_fmt rgb24` to `yuv420`?
Not yet. That won't work until 0.7.x.
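The motivation for the yuv420 change is frame size: rgb24 packs 3 bytes per pixel, while yuv420p averages 1.5, so the raw pipe between ffmpeg and frigate carries half as much data per frame. A quick back-of-the-envelope check, using the 640x480 sub-stream resolution from earlier in the thread:

```shell
# Bytes per raw frame for the two pipe pixel formats at 640x480.
W=640; H=480
rgb24=$((W * H * 3))         # 3 bytes per pixel
yuv420p=$((W * H * 3 / 2))   # 1.5 bytes per pixel on average
echo "rgb24=${rgb24} yuv420p=${yuv420p}"
# prints: rgb24=921600 yuv420p=460800
```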
Latest rpi images are up with all the additional optimizations. I am close to the next RC.
Raspberry Pi 3/4 (32bit): `blakeblackshear/frigate:dev-8ce6bf1-armv7hf`
Raspberry Pi 4 (64bit): `blakeblackshear/frigate:dev-8ce6bf1-arm64`
I'm trying out 0.7.0-rc1-arm64 on a RockPi Model B running Ubuntu 18.04.4 and I have 3 1080p cameras and a USB Edge TPU.
Impressions after 30 mins, it appears to be working brilliantly.
- CPU (system) is down from 20% to 10%
- CPU (user) is down from 60-70% to 30-40%
- System load is down from 5-6 to 3-4
- CPU temperature is down from 70c to 55c
Inference speed is a bit odd: according to Home Assistant, the previous inference speed was 280-350ms and now it's 20-22ms. Not sure if something wasn't reporting correctly before.
I've even managed to take out `-filter:v fps=fps=5` and it still uses fewer resources than before.
The reduction in CPU load is about what I would expect based on the changes I made. Not sure why your inference speeds would be so much lower, but I did remove pyarrow from the process of handing frames back and forth. Does the RockPi Model B support the same hardware acceleration for decoding as the RPi4?
I don't think they use the same hardware acceleration, Pi uses OpenMax and Rock uses rkmpp, I did try to compile a version of ffmpeg myself but hit the limits of my ability.
@blakeblackshear, what settings do you recommend to make the most of a Raspberry Pi 4 (32bit), i.e. blakeblackshear/frigate:0.7.0-rc1-armv7hf?
Don't specify output_args and use the default.
Thank you very much; I'm just gathering test figures now. I have also checked that the HEVC hw decoder is available in your Docker build:
```
# ffmpeg -decoders | grep hevc
 VFS..D hevc
 V..... hevc_v4l2m2m (codec hevc)
```
But it is not usable on the Pi, right? BTW, what 64-bit OS image do you use for the Pi 4?
Some feedback from my side, I've upgraded one of my Pi4s (Coral) to the 64bit image @mzac (https://downloads.raspberrypi.org/raspios_lite_arm64/images/) referenced and installed blakeblackshear/frigate:dev-8ce6bf1-arm64.
I'm still configuring everything, but so far I have 24h runtime without any crashes: 2 x 1920x1080 @ 25fps source streams, CPU at 34% load, inference at 16ms, default config settings. Once I've moved everything over to my dev Pi4, I'll start doing some more detailed testing and tweaking.
@blakeblackshear: Thanks for the efforts you put into Frigate, very much appreciated!
Has anyone gotten the latest version working with a Raspberry Pi 4? I am stuck on the OpenCV stuff, as there aren't any packages for Arch.