blakeblackshear / frigate

NVR with realtime local object detection for IP cameras
https://frigate.video
MIT License

Improve cache maintenance #570

Closed blakeblackshear closed 3 years ago

blakeblackshear commented 3 years ago

Instead of just using max_seconds for cache limiting, also look at the available storage at /tmp/cache and proactively clear the cache to prevent it from filling up entirely.
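A minimal sketch of that idea, with the decision logic separated from the filesystem calls so it is easy to test. The function names and the watermark value are hypothetical, not Frigate's actual implementation:

```python
import shutil

# Hypothetical watermark; Frigate's real threshold/config key may differ.
WATERMARK_PCT = 90

def bytes_to_free(total, used, watermark_pct=WATERMARK_PCT):
    """How many bytes must be freed to bring usage back under the watermark."""
    target = total * watermark_pct // 100
    return max(0, used - target)

def pick_victims(files, needed):
    """Choose the oldest cached segments until `needed` bytes are covered.

    `files` is a list of (mtime, size, path) tuples; returns paths to delete.
    """
    victims, freed = [], 0
    for _mtime, size, path in sorted(files):
        if freed >= needed:
            break
        victims.append(path)
        freed += size
    return victims

def maintain_cache(cache_dir, files):
    """Proactively clear the cache when the filesystem nears the watermark."""
    total, used, _free = shutil.disk_usage(cache_dir)
    return pick_victims(files, bytes_to_free(total, used))
```

Keeping the "how much to free" and "which files" decisions as pure functions means the policy can be unit-tested without a real tmpfs.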

cybertza commented 3 years ago

Awesome. It would also be worth addressing this in the interim via the log if usage reaches a 90% watermark on /tmp/cache and/or SHM, and perhaps even adding a startup check with a log entry for /tmp/cache writability and free space. Awesome work though.

Ysbrand commented 3 years ago

Not sure, but it looks like the cache cleanup is not working. Currently running HA-Addon 1.5 (0.8.0 Release Candidate 3):

clips:
  max_seconds: 60
  tmpfs_cache_size: 256m
-rw-r--r-- 1 root root 3386954 Jan 21 13:55 Voordeur-20210121135543.mp4
-rw-r--r-- 1 root root 1558587 Jan 21 13:55 Voordeur-20210121135551.mp4
-rw-r--r-- 1 root root 2413829 Jan 21 13:56 Voordeur-20210121135558.mp4
-rw-r--r-- 1 root root 3204001 Jan 21 13:56 Voordeur-20210121135608.mp4
-rw-r--r-- 1 root root 2614125 Jan 21 13:56 Voordeur-20210121135621.mp4
-rw-r--r-- 1 root root 2323783 Jan 21 13:56 Voordeur-20210121135631.mp4
-rw-r--r-- 1 root root 2549327 Jan 21 13:56 Voordeur-20210121135641.mp4
-rw-r--r-- 1 root root 1660734 Jan 21 13:56 Voordeur-20210121135651.mp4
-rw-r--r-- 1 root root 3328346 Jan 21 13:57 Voordeur-20210121135658.mp4
-rw-r--r-- 1 root root 1619142 Jan 21 13:57 Voordeur-20210121135711.mp4
-rw-r--r-- 1 root root 2540720 Jan 21 13:57 Voordeur-20210121135718.mp4
-rw-r--r-- 1 root root 2546460 Jan 21 13:57 Voordeur-20210121135728.mp4
-rw-r--r-- 1 root root 3494911 Jan 21 13:57 Voordeur-20210121135738.mp4
-rw-r--r-- 1 root root 2459607 Jan 21 13:58 Voordeur-20210121135751.mp4
-rw-r--r-- 1 root root 2439002 Jan 21 13:58 Voordeur-20210121135801.mp4
-rw-r--r-- 1 root root 1620688 Jan 21 13:58 Voordeur-20210121135811.mp4
-rw-r--r-- 1 root root 2581544 Jan 21 13:58 Voordeur-20210121135818.mp4
-rw-r--r-- 1 root root 2538984 Jan 21 13:58 Voordeur-20210121135828.mp4
-rw-r--r-- 1 root root 2581731 Jan 21 13:58 Voordeur-20210121135838.mp4
-rw-r--r-- 1 root root 3346215 Jan 21 13:59 Voordeur-20210121135848.mp4
-rw-r--r-- 1 root root 2550758 Jan 21 13:59 Voordeur-20210121135901.mp4
-rw-r--r-- 1 root root 1483757 Jan 21 13:59 Voordeur-20210121135911.mp4
-rw-r--r-- 1 root root 2528616 Jan 21 13:59 Voordeur-20210121135918.mp4
-rw-r--r-- 1 root root 3504434 Jan 21 13:59 Voordeur-20210121135928.mp4
-rw-r--r-- 1 root root 2510272 Jan 21 13:59 Voordeur-20210121135941.mp4
-rw-r--r-- 1 root root 2562013 Jan 21 14:00 Voordeur-20210121135951.mp4
-rw-r--r-- 1 root root 2538035 Jan 21 14:00 Voordeur-20210121140001.mp4
-rw-r--r-- 1 root root 2369760 Jan 21 14:00 Voordeur-20210121140011.mp4
-rw-r--r-- 1 root root 2540978 Jan 21 14:00 Voordeur-20210121140021.mp4
-rw-r--r-- 1 root root 2448570 Jan 21 14:00 Voordeur-20210121140031.mp4
-rw-r--r-- 1 root root 2489359 Jan 21 14:00 Voordeur-20210121140041.mp4
-rw-r--r-- 1 root root 2538801 Jan 21 14:01 Voordeur-20210121140051.mp4
-rw-r--r-- 1 root root 2573918 Jan 21 14:01 Voordeur-20210121140101.mp4
-rw-r--r-- 1 root root 1698697 Jan 21 14:01 Voordeur-20210121140111.mp4
-rw-r--r-- 1 root root 3408830 Jan 21 14:01 Voordeur-20210121140118.mp4
-rw-r--r-- 1 root root 1683854 Jan 21 14:01 Voordeur-20210121140131.mp4
-rw-r--r-- 1 root root 2598506 Jan 21 14:01 Voordeur-20210121140138.mp4
-rw-r--r-- 1 root root 2489882 Jan 21 14:01 Voordeur-20210121140148.mp4
drwxrwxrwt 2 root root     820 Jan 21 14:01 .
-rw-r--r-- 1 root root 2359344 Jan 21 14:02 Voordeur-20210121140158.mp4

It's my understanding that clips are supposed to be cleaned up after 60 seconds with the current setting (and this is how it behaved in the betas).

UPDATE: Looks like it didn't like the idea of 60 seconds retention. I've changed the value to 300 seconds and now it happily deletes files from the cache.

UPDATE 2: A few hours later, my cache is full again with files that are more than an hour old.

Ysbrand commented 3 years ago

It runs fine for some hours, cleaning up files well within the 5 minutes, then suddenly the cleanups stop happening and the cache fills up (and eventually HA stops working as it runs out of other resources).

blakeblackshear commented 3 years ago

Do you have anything in your logs? It sounds like the cache cleanup thread is getting stuck. The cache is cleaned up sooner than the max seconds value when there are no ongoing events.

Ysbrand commented 3 years ago

I've tried to find logs in my docker, but can't find them. Any pointers? (I'm using the HA addon.)

Looks like they are all pipes, but where to?

lrwx------ 1 root root 64 Jan 22 13:59 0 -> /dev/null
l-wx------ 1 root root 64 Jan 22 13:59 1 -> 'pipe:[103700]'
l-wx------ 1 root root 64 Jan 22 13:59 2 -> 'pipe:[103701]'
lr-x------ 1 root root 64 Jan 22 13:59 3 -> 'pipe:[103760]'
l-wx------ 1 root root 64 Jan 22 13:59 4 -> 'pipe:[103760]'
lr-x------ 1 root root 64 Jan 22 13:59 5 -> 'pipe:[103764]'
l-wx------ 1 root root 64 Jan 22 13:59 6 -> 'pipe:[103764]'
lr-x------ 1 root root 64 Jan 22 13:59 7 -> /dev/null
l-wx------ 1 root root 64 Jan 22 13:59 8 -> 'pipe:[103768]'
lr-x------ 1 root root 64 Jan 22 13:59 9 -> 'pipe:[103769]'
blakeblackshear commented 3 years ago

There are no logs in the container itself. If you are using the addon, you should be able to see the logs in the log tab for the addon under Supervisor in Home Assistant.

Ysbrand commented 3 years ago

I was aware of these logs, but I have not seen anything special. Also, as these logs do not contain any timestamps, it's kind of hard to determine whether I'm looking at old information (and how old it is).

Ysbrand commented 3 years ago

I think this is the issue:

Exception in thread event_cleanup:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
    cursor.execute(sql, params or ())
sqlite3.OperationalError: database is locked
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/opt/frigate/frigate/events.py", line 297, in run
    self.expire('clips')
  File "/opt/frigate/frigate/events.py", line 281, in expire
    update_query.execute()
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1898, in inner
    return method(self, database, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1969, in execute
    return self._execute(database)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2465, in _execute
    cursor = database.execute(self)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3142, in execute
    return self.execute_sql(sql, params, commit=commit)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3136, in execute_sql
    self.commit()
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2902, in __exit__
    reraise(new_type, new_type(exc_value, *exc_args), traceback)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 185, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
    cursor.execute(sql, params or ())
peewee.OperationalError: database is locked
Exception in thread event_processor:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
    cursor.execute(sql, params or ())
sqlite3.OperationalError: database is locked
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/opt/frigate/frigate/events.py", line 181, in run
    Event.create(
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 6338, in create
    inst.save(force_insert=True)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 6548, in save
    pk = self.insert(**field_dict).execute()
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1898, in inner
    return method(self, database, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1969, in execute
    return self._execute(database)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2730, in _execute
    return super(Insert, self)._execute(database)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2465, in _execute
    cursor = database.execute(self)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3142, in execute
    return self.execute_sql(sql, params, commit=commit)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3136, in execute_sql
    self.commit()
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2902, in __exit__
    reraise(new_type, new_type(exc_value, *exc_args), traceback)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 185, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
    cursor.execute(sql, params or ())
peewee.OperationalError: database is locked

Not sure if a process table is of any help:

root@ccab4aaf-frigate-beta:/tmp/cache# ps aux --sort -rss
USER         PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root           6  2.0  4.8 1481124 188700 ?      Sl   09:10   7:06 python3 -u -m frigate
root          37  2.7  1.9 556236 76784 ?        Sl   09:10   9:37 frigate.detector.coral
root          40  4.6  1.9 677148 75040 ?        Sl   09:10  16:10 frigate.process:Voordeur
root          42  2.1  1.8 747024 72268 ?        Sl   09:10   7:23 python3 -u -m frigate
root          31  0.0  1.5 400232 61472 ?        S    09:10   0:01 frigate.logger
root        8205  9.4  0.7 461168 29072 ?        Ssl  13:48   6:24 ffmpeg -hide_banner -loglevel error -avoid_negative_ts make_zero -ff
root          50  1.9  0.4  59292 19156 ?        Ss   09:10   6:39 ffmpeg -hide_banner -loglevel error -avoid_negative_ts make_zero -ff
root          36  0.1  0.2  13444 10052 ?        S    09:10   0:26 /usr/bin/python3 -c from multiprocessing.resource_tracker import mai
nobody        20  0.0  0.0  16336  3836 ?        S    09:10   0:05 nginx: worker process
root        8998  0.0  0.0   3864  3028 pts/0    Ss   14:14   0:00 /bin/bash
root       10047  0.0  0.0   5456  2400 pts/0    R+   14:56   0:00 ps aux --sort -rss
root          19  0.0  0.0  15864  1140 ?        Ss   09:10   0:00 nginx: master process /usr/sbin/nginx
root           1  0.0  0.0    784     4 ?        Ss   09:10   0:00 /sbin/docker-init -- /run.sh
cybertza commented 3 years ago
Filesystem      Size  Used Avail Use% Mounted on
/dev/loop2       40G  6.7G   34G  17% /
tmpfs            64M     0   64M   0% /dev
tmpfs            16G     0   16G   0% /sys/fs/cgroup
shm             1.0G   40M  985M   4% /dev/shm
shfs             11T  2.4T  8.7T  22% /clips
rootfs           16G  1.2G   15G   8% /cache
/dev/loop2       40G  6.7G   34G  17% /etc/hosts
tmpfs            16G   16G     0 100% /tmp/cache
tmpfs            16G   12K   16G   1% /proc/driver/nvidia
tmpfs            16G  4.0K   16G   1% /etc/nvidia/nvidia-application-profiles-rc.d
devtmpfs         16G     0   16G   0% /dev/nvidia0
Tasks:  37 total,   1 running,  35 sleeping,   0 stopped,   1 zombie
%Cpu(s): 15.0 us, 16.5 sy,  0.0 ni, 68.3 id,  0.2 wa,  0.0 hi,  0.2 si,  0.0 st
MiB Mem :  31982.4 total,   5014.2 free,   2262.3 used,  24705.9 buff/cache
MiB Swap:      0.0 total,      0.0 free,      0.0 used.  11708.7 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU  %MEM     TIME+ COMMAND                                               
    1 root      20   0 2167120 546932  41692 S   0.0   1.7   9:59.79 python3                                               
   18 root      20   0   16416   3220     12 S   0.0   0.0   0:00.00 nginx                                                 
   19 nobody    20   0   21916  11760   2956 S   0.0   0.0   4:16.47 nginx                                                 
   28 root      20   0  606828  62460   5768 S   0.0   0.2   0:08.02 frigate.logger                                        
   33 root      20   0   19556  15964   6056 S   0.0   0.0   1:04.83 python3                                               
   34 root      20   0  852244 112616  22376 S   0.0   0.3  48:13.72 frigate.detecto                                       
   40 root      20   0 1015220 115312  15960 S   0.0   0.4   4:20.67 frigate.process                                       
   41 root      20   0 1004536 104384  15896 S   0.0   0.3   2:05.60 frigate.process                                       
   42 root      20   0 1005932 106108  15960 S   0.0   0.3   5:19.03 frigate.process                                       
   43 root      20   0 1003284 103416  15960 S   0.0   0.3   5:46.65 frigate.process                                       
   44 root      20   0 1011716 112120  15960 S   0.0   0.3   7:51.33 frigate.process                                       
   45 root      20   0 1002464 102444  15768 S   0.0   0.3   1:27.97 frigate.process                                       
   46 root      20   0  870232  98340  12936 S   0.0   0.3   1:18.86 frigate.process                                       
   47 root      20   0 1009100 109412  15960 S   0.0   0.3   8:48.82 frigate.process                                       
   48 root      20   0 1003616 103596  15960 S   0.0   0.3   6:10.97 frigate.process                                       
   49 root      20   0 1003248 103188  15832 S   0.0   0.3   0:08.47 frigate.process                                       
   50 root      20   0 1005348 105156  15896 S   0.0   0.3   4:07.48 frigate.process                                       
   51 root      20   0 1023576 123680  15960 S   0.0   0.4   8:03.70 frigate.process                                       
   52 root      20   0 1003776 104008  15960 S   0.0   0.3   7:04.67 frigate.process                                       
   53 root      20   0  999880  95328   7424 S   0.0   0.3   1:59.60 python3                                               
   56 root      20   0  999880  97072   7488 S   0.0   0.3   3:50.47 python3                                               
   60 root      20   0  999880  97096   7488 S   0.0   0.3   3:57.20 python3                                               
   62 root      20   0  999880  97160   7552 S   0.0   0.3   3:54.94 python3                                               
   64 root      20   0  999880  97116   7552 S   0.0   0.3   3:55.33 python3                                               
   66 root      20   0  999880  97120   7552 S   0.0   0.3   3:47.47 python3                                               
   67 root      20   0  999880  97128   7552 S   0.0   0.3   3:49.20 python3                                               
   71 root      20   0  999880  97500   7868 S   0.0   0.3   3:56.84 python3                                               
   83 root      20   0  999880  95464   7868 S   0.0   0.3   3:57.44 python3                                               
   86 root      20   0 1066012 101636   7684 S   0.0   0.3   0:26.72 python3                                               
   91 root      20   0  999880  97156   7552 S   0.0   0.3   3:43.17 python3                                               
   94 root      20   0  999880  97208   7552 S   0.0   0.3   3:52.34 python3                                               
# ps aux | grep 'Z'
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root      7502  1.4  0.0      0     0 ?        Zs   19:09   0:00 [ffmpeg] <defunct>
root      7546  0.0  0.0   3312   676 pts/1    R+   19:09   0:00 grep Z

Oldest file and current time; after that, a ton of 0-byte files:

# find /tmp/cache -type f -printf '%T+ %p\n' | sort | head -n 1
2021-01-22+13:32:54.3743130160 /tmp/cache/Garage-20210122133242.mp4
# date
Fri Jan 22 19:12:47 SAST 2021

If I `rm *` in the folder then everything returns to operational again, but I'm not sure for how long.

I can enable the logging if you like, since I am able to reliably repeat this on the nvidia build.

cybertza commented 3 years ago

Clearing the folder does not seem to restore the process.

Cameras and detection work, but the folder grows again, and 16 GB should be more than ample for it to catch up if it had an issue.

cybertza commented 3 years ago

So I'm trying this from inside the docker to see if it will work:

watch -n 300 -x find /tmp/cache -mmin +10 -type f -exec rm -fv {} \;

So my clips should expire after 300 seconds, and technically I'm making sure there is no leftover that's 10 minutes old.
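For reference, the same age-based expiry as the `find -mmin` workaround, sketched in Python (the function name is made up; this is not Frigate code):

```python
import os
import time

def expire_old_files(cache_dir, max_age_min):
    """Delete regular files in cache_dir whose mtime is older than
    max_age_min minutes; returns the names removed. Equivalent to
    `find cache_dir -mmin +N -type f -delete`.
    """
    cutoff = time.time() - max_age_min * 60
    removed = []
    for entry in os.scandir(cache_dir):
        if entry.is_file() and entry.stat().st_mtime < cutoff:
            os.remove(entry.path)
            removed.append(entry.name)
    return sorted(removed)
```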

Ysbrand commented 3 years ago

I tried more or less the same, but clips were no longer captured as the process was broken.

cd /tmp/cache; while true; do find . -mmin +5 -delete; sleep 60; done &

blakeblackshear commented 3 years ago

@Ysbrand the thread is crashing because of the 'database is locked' error. Others have reported this when they have the database stored on a network drive or another slow disk. What are you using for your clips directory?
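One possible mitigation, assuming the root cause is writer contention on SQLite: raise the busy timeout so threads wait for the lock instead of immediately raising 'database is locked'. A sketch with the stdlib `sqlite3` module (Frigate itself uses peewee, where roughly the same effect would come from the connection's `timeout`/`busy_timeout` settings; treat the exact wiring here as an assumption):

```python
import sqlite3

def open_db(path, busy_timeout_s=60):
    """Open a SQLite connection that waits up to busy_timeout_s for a lock
    instead of failing immediately with 'database is locked'."""
    conn = sqlite3.connect(path, timeout=busy_timeout_s)
    # WAL lets readers proceed while one writer holds the lock.
    conn.execute("PRAGMA journal_mode=wal")
    return conn
```

Note that WAL mode generally does not work on network filesystems, which lines up with the observation that network-mounted or slow clips directories trigger this error.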

cybertza commented 3 years ago

OK, so here is what I have done until this is resolved programmatically, bearing in mind I run Unraid, so some things may be useless for you:

Install NerdPack (https://raw.githubusercontent.com/dmacias72/unRAID-NerdPack/master/plugin/NerdPack.plg) under the Apps install screen (this is so that I don't have to keep a terminal open).

~~On my share drive, in the config folder, I added a file called:~~

~~cache.sh~~

```
#! /bin/sh
watch -n 300 -x find /tmp/cache -mmin +15 -type f -exec rm -fv {} \;
```

I started screen in the Unraid web terminal on the server as root.

Then I opened a terminal into my docker:

~~docker exec -it frigate /bin/bash~~

and ran the following command

~~watch -n 60 -x find /tmp/cache -mmin +12 -type f -exec rm -fv {} \;~~

I do want you to note that this needs to be done right after a reboot, i.e. before the drive has been at 100% once. I also feel that the time needs to exceed the max record time, but I haven't tested this.

This command runs every minute and deletes all files older than 15 minutes.

While I'm typing this, I guess I could have just passed the command directly to the docker as a screen session...

I assume that would look something like this, in a host terminal:

screen
docker exec -it frigate watch -n 60 -x find /tmp/cache -mmin +15 -type f -exec rm -fv {} \;

I would still do this inside a screen though.

cybertza commented 3 years ago

> @Ysbrand the thread is crashing because of the database is locked error. Others have reported this when they have the database stored on a network drive or other slow disk. What are you using for your clips directory?

From what I understand, his OS runs out of space eventually, and that then crashes the DB.

blakeblackshear commented 3 years ago

That would make sense too. Let me run a simulation and make sure the cache is adhering to the max_seconds setting.

cybertza commented 3 years ago

[screenshot] I definitely run out of /tmp/cache.

blakeblackshear commented 3 years ago

Can you post your config so I can make mine as similar as possible?

cybertza commented 3 years ago

My max_seconds is set to 300 and, as shown above, sometimes I have older files in the folder.

cybertza commented 3 years ago

Build is nvidia x64 on Unraid docker:

ffmpeg:
  hwaccel_args:
    - -c:v 
    - h264_cuvid
# Optional: detectors configuration
# USB Coral devices will be auto detected with CPU fallback
#detectors:
#  # Required: name of the detector
#  coral:
#    # Required: type of the detector
#    # Valid values are 'edgetpu' (requires device property below) and 'cpu'.
#    type: edgetpu
#    # Optional: device name as defined here: https://coral.ai/docs/edgetpu/multiple-edgetpu/#using-the-tensorflow-lite-python-api
#    device: usb
clips:
  # Optional: Maximum length of time to retain video during long events. (default: shown below)
  # NOTE: If an object is being tracked for longer than this amount of time, the cache
  #       will begin to expire and the resulting clip will be the last x seconds of the event.
  #enabled: True
  max_seconds: 300
  # Optional: Retention settings for clips (default: shown below)
  tmpfs_cache_size: 16384m
  retain:
    # Required: Default retention days (default: shown below)
    default: 60
    # Optional: Per object retention days
#    objects:
#      person: 60

objects:
  # Optional: list of objects to track from labelmap.txt (default: shown below)
  track:
    - person
    - bicycle
    - car
    - motorcycle
#    - gun
#    - airplane
    - bus
#    - train
    - truck
#    - boat
#    - traffic light
#    - fire hydrant
#    - stop sign
#    - parking meter
#    - bench
#    - bird
#    - cat
    - dog
#    - horse
#    - sheep
#    - cow
#    - elephant
#    - bear
#    - zebra
#    - giraffe
    - backpack
#    - umbrella
    - handbag
#    - tie
#    - suitcase
#    - frisbee
#    - skis
#    - snowboard
#    - sports ball
#    - kite
#    - baseball bat
#    - baseball glove
#    - skateboard
#    - surfboard
#    - tennis racket
#    - bottle
#    - wine glass
#    - cup
#    - fork
#    - knife
#    - spoon
#    - bowl
#    - banana
#    - apple
#    - sandwich
#    - orange
#    - broccoli
#    - carrot
#    - hot dog
#    - pizza
#    - donut
#    - cake
#    - chair
#    - couch
#    - potted plant
#    - bed
#    - dining table
#    - toilet
#    - tv
#    - laptop
#    - mouse
#    - remote
#    - keyboard
#    - cell phone
#    - microwave
#    - oven
#    - toaster
#    - sink
#    - refrigerator
#    - book
#    - clock
#    - vase
#    - scissors
#    - teddy bear
#    - hair drier
#    - toothbrush
  # Optional: filters to reduce false positives for specific object types
# filters:
#   person:
#     # Optional: minimum width*height of the bounding box for the detected object (default: 0)
#     min_area: 5000
#     # Optional: maximum width*height of the bounding box for the detected object (default: 24000000)
#     max_area: 100000
#     # Optional: minimum score for the object to initiate tracking (default: shown below)
#     min_score: 0.5
#     # Optional: minimum decimal percentage for tracked object's computed score to be considered a true positive (default: shown below)
#     threshold: 0.85      
mqtt:
  host: 172.16.103.9
  topic_prefix: frigate
  # Optional: client id (default: shown below)
  # WARNING: must be unique if you are running multiple instances
  client_id: frigate
  # Optional: user  
cameras:
  intercom:
    ffmpeg:
      inputs:
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.230:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record            
    height: 720
    width: 1280
    fps: 5
    clips:
      # Required: enables clips for the camera (default: shown below)
      enabled: True
      pre_capture: 5
      # Optional: Number of seconds before the event to include in the clips (default: shown below)
      #pre_capture: 5
      # Optional: Number of seconds after the event to include in the clips (default: shown below)
      #post_capture: 5
#      # Optional: Objects to save clips for. (default: all tracked objects)
#      objects:
#        - person
#      # Optional: Camera override for retention settings (default: global values)
#      retain:
#        # Required: Default retention days (default: shown below)
#        default: 10
#        # Optional: Per object retention days
#        objects:
#          person: 15  
  front_gate_left:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.215:554/profile1
          roles:        
            - detect
            - rtmp
            - clips
##            - record    
    height: 1080
    width: 1920
    fps: 5
    clips:
      enabled: True 
      pre_capture: 5
    motion:
      mask:
      - 1920,485,1781,366,1636,268,1504,300,1561,394,1442,457,1361,353,1235,295,1138,180,1031,180,844,182,836,147,763,132,651,121,623,161,0,165,0,0,1920,0
  front_gate_Right:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.210:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record         
    height: 1080
    width: 1920
    fps: 5     
    clips:
      enabled: True 
      pre_capture: 5      
  Ridge_Right_Right:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.211:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record  
    height: 1080
    width: 1920
    fps: 5     
    clips:
      enabled: True 
      pre_capture: 5      
  Front_Yard_Left:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.219:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record     
    height: 1080
    width: 1920
    fps: 5   
    clips:
      enabled: True 
      pre_capture: 5      
  Ridge_Right_Left:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.212:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record    
    height: 1080
    width: 1920
    fps: 5    
    clips:
      enabled: True  
      pre_capture: 5      
  Garage:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.207:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record    
    height: 1080
    width: 1920
    fps: 5    
    clips:
      enabled: True
      pre_capture: 5      

  Ridge_Left_Left:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.208:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record    
    height: 1080
    width: 1920
    fps: 5  
    clips:
      enabled: True 
      pre_capture: 5      
  Ridge_Left_Right:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.204:554/live.sdp
          roles:
            - detect
            - rtmp
            - clips
#            - record 
    height: 1080
    width: 1920
    fps: 5
    clips:
      enabled: True 
      pre_capture: 5      
  Rooms:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.209:554/profile2
          roles:
            - detect
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.209:554/profile1
          roles:    
            - rtmp
            - clips
#            - record 
    height: 576
    width: 704
    fps: 5
    clips:
      enabled: True 
      pre_capture: 5
  B09:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.203:554/live.sdp
          roles:
            - detect
            - rtmp
            - clips
#            - record 
    height: 1080
    width: 1920
    fps: 5
    clips:
      enabled: True 
      pre_capture: 5      
  Parking:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.220:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record         
    height: 1080
    width: 1920
    fps: 5
    clips:
      enabled: True 
      pre_capture: 5      
  B11:
    ffmpeg:
      inputs: 
        - path: rtsp://admin:{FRIGATE_RTSP_PASSWORD}@172.16.103.221:554/profile1
          roles:
            - detect
            - rtmp
            - clips
#            - record         
    height: 1080
    width: 1920
    fps: 5
    clips:
      enabled: True 
      pre_capture: 5      
    #    
cybertza commented 3 years ago

and that command:

docker exec -it frigate watch -n 60 -x find /tmp/cache -mmin +15 -type f -exec rm -fv {} \;

Definitely deletes files, so old files are lingering somewhere.

Unless I have completely and grossly underestimated how much data I store in cache?

Either way, it would definitely be great if a warning were generated when disk usage on /tmp/cache rises above a watermark like 90% or so.
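Such a warning could look something like this. The watchdog logger name and the watermark are assumptions, not an existing Frigate feature:

```python
import logging
import shutil

# Hypothetical logger name; Frigate's actual logger hierarchy may differ.
logger = logging.getLogger("frigate.cache_watchdog")

def usage_pct(total, used):
    """Percentage of the filesystem in use (0-100)."""
    return 100 * used // total if total else 0

def warn_if_nearly_full(path="/tmp/cache", watermark=90):
    """Log a warning once cache usage crosses the watermark; returns usage."""
    total, used, _free = shutil.disk_usage(path)
    pct = usage_pct(total, used)
    if pct >= watermark:
        logger.warning("%s is %d%% full (watermark %d%%)", path, pct, watermark)
    return pct
```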

blakeblackshear commented 3 years ago

Can you post the output of ls -lah /tmp/cache so I can see how large each file is? You do have a lot of cameras.

cybertza commented 3 years ago

This is also useful in another screen, I guess, while you test:

docker exec -it frigate watch -c -d -n 5 "date && df -h /tmp/cache"
cybertza commented 3 years ago
-rw-r--r-- 1 root root 4.4M Jan 23 01:34 Ridge_Right_Left-20210123013405.mp4
-rw-r--r-- 1 root root 4.3M Jan 23 01:34 Ridge_Right_Left-20210123013415.mp4
-rw-r--r-- 1 root root 4.3M Jan 23 01:34 Ridge_Right_Left-20210123013425.mp4
-rw-r--r-- 1 root root 4.3M Jan 23 01:34 Ridge_Right_Left-20210123013435.mp4
-rw-r--r-- 1 root root 4.3M Jan 23 01:34 Ridge_Right_Left-20210123013445.mp4
-rw-r--r-- 1 root root 4.3M Jan 23 01:35 Ridge_Right_Left-20210123013455.mp4
-rw-r--r-- 1 root root 4.3M Jan 23 01:35 Ridge_Right_Left-20210123013505.mp4
-rw-r--r-- 1 root root 4.3M Jan 23 01:35 Ridge_Right_Left-20210123013515.mp4
-rw-r--r-- 1 root root 3.1M Jan 23 01:35 Ridge_Right_Left-20210123013525.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 Ridge_Right_Right-20210123013355.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 Ridge_Right_Right-20210123013405.mp4
-rw-r--r-- 1 root root 4.9M Jan 23 01:34 Ridge_Right_Right-20210123013415.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 Ridge_Right_Right-20210123013425.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 Ridge_Right_Right-20210123013435.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 Ridge_Right_Right-20210123013445.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:35 Ridge_Right_Right-20210123013455.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:35 Ridge_Right_Right-20210123013505.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:35 Ridge_Right_Right-20210123013515.mp4
-rw-r--r-- 1 root root 3.6M Jan 23 01:35 Ridge_Right_Right-20210123013525.mp4
-rw-r--r-- 1 root root 4.6M Jan 23 01:34 Rooms-20210123013355.mp4
-rw-r--r-- 1 root root 3.1M Jan 23 01:34 Rooms-20210123013407.mp4
-rw-r--r-- 1 root root 4.5M Jan 23 01:34 Rooms-20210123013415.mp4
-rw-r--r-- 1 root root 3.0M Jan 23 01:34 Rooms-20210123013427.mp4
-rw-r--r-- 1 root root 4.5M Jan 23 01:34 Rooms-20210123013435.mp4
-rw-r--r-- 1 root root 3.1M Jan 23 01:34 Rooms-20210123013447.mp4
-rw-r--r-- 1 root root 4.6M Jan 23 01:35 Rooms-20210123013455.mp4
-rw-r--r-- 1 root root 3.1M Jan 23 01:35 Rooms-20210123013507.mp4
-rw-r--r-- 1 root root 4.5M Jan 23 01:35 Rooms-20210123013515.mp4
-rw-r--r-- 1 root root 2.1M Jan 23 01:35 Rooms-20210123013527.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:34 front_gate_Right-20210123013353.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:34 front_gate_Right-20210123013403.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:34 front_gate_Right-20210123013414.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:34 front_gate_Right-20210123013424.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:34 front_gate_Right-20210123013434.mp4
-rw-r--r-- 1 root root 4.0M Jan 23 01:34 front_gate_Right-20210123013445.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:35 front_gate_Right-20210123013453.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:35 front_gate_Right-20210123013504.mp4
-rw-r--r-- 1 root root 5.0M Jan 23 01:35 front_gate_Right-20210123013514.mp4
-rw-r--r-- 1 root root 3.6M Jan 23 01:35 front_gate_Right-20210123013524.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 front_gate_left-20210123013355.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 front_gate_left-20210123013405.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 front_gate_left-20210123013415.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 front_gate_left-20210123013425.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 front_gate_left-20210123013435.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:34 front_gate_left-20210123013445.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:35 front_gate_left-20210123013455.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:35 front_gate_left-20210123013505.mp4
-rw-r--r-- 1 root root 4.8M Jan 23 01:35 front_gate_left-20210123013515.mp4
-rw-r--r-- 1 root root 3.6M Jan 23 01:35 front_gate_left-20210123013525.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:34 intercom-20210123013355.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:34 intercom-20210123013405.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:34 intercom-20210123013415.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:34 intercom-20210123013425.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:34 intercom-20210123013435.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:34 intercom-20210123013445.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:35 intercom-20210123013455.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:35 intercom-20210123013505.mp4
-rw-r--r-- 1 root root 2.3M Jan 23 01:35 intercom-20210123013515.mp4
-rw-r--r-- 1 root root 1.6M Jan 23 01:35 intercom-20210123013525.mp4
cybertza commented 3 years ago
-rw-r--r-- 1 root root 4.3M 2021-01-23 01:37:05.237024199 +0200 Ridge_Right_Left-20210123013655.mp4
-rw-r--r-- 1 root root 4.2M 2021-01-23 01:37:15.230696310 +0200 Ridge_Right_Left-20210123013705.mp4
-rw-r--r-- 1 root root 4.2M 2021-01-23 01:37:25.231368199 +0200 Ridge_Right_Left-20210123013715.mp4
-rw-r--r-- 1 root root 4.2M 2021-01-23 01:37:35.234040027 +0200 Ridge_Right_Left-20210123013725.mp4
-rw-r--r-- 1 root root 4.2M 2021-01-23 01:37:45.231712026 +0200 Ridge_Right_Left-20210123013735.mp4
-rw-r--r-- 1 root root 4.3M 2021-01-23 01:37:55.231383965 +0200 Ridge_Right_Left-20210123013745.mp4
-rw-r--r-- 1 root root 4.3M 2021-01-23 01:38:05.233055844 +0200 Ridge_Right_Left-20210123013755.mp4
-rw-r--r-- 1 root root 4.3M 2021-01-23 01:38:15.236727664 +0200 Ridge_Right_Left-20210123013805.mp4
-rw-r--r-- 1 root root 3.1M 2021-01-23 01:38:22.079503184 +0200 Ridge_Right_Left-20210123013815.mp4
-rw-r--r-- 1 root root 4.9M 2021-01-23 01:36:55.283350782 +0200 Ridge_Right_Right-20210123013645.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:05.285022624 +0200 Ridge_Right_Right-20210123013655.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:15.284694539 +0200 Ridge_Right_Right-20210123013705.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:25.285366427 +0200 Ridge_Right_Right-20210123013715.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:35.286038321 +0200 Ridge_Right_Right-20210123013725.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:45.283710320 +0200 Ridge_Right_Right-20210123013735.mp4
-rw-r--r-- 1 root root 4.9M 2021-01-23 01:37:55.287382128 +0200 Ridge_Right_Right-20210123013745.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:38:05.284054171 +0200 Ridge_Right_Right-20210123013755.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:38:15.287725991 +0200 Ridge_Right_Right-20210123013805.mp4
-rw-r--r-- 1 root root 3.3M 2021-01-23 01:38:21.550520537 +0200 Ridge_Right_Right-20210123013815.mp4
-rw-r--r-- 1 root root 3.1M 2021-01-23 01:36:55.173354391 +0200 Rooms-20210123013647.mp4
-rw-r--r-- 1 root root 4.5M 2021-01-23 01:37:07.170960747 +0200 Rooms-20210123013655.mp4
-rw-r--r-- 1 root root 3.0M 2021-01-23 01:37:15.152698869 +0200 Rooms-20210123013707.mp4
-rw-r--r-- 1 root root 4.6M 2021-01-23 01:37:27.188303994 +0200 Rooms-20210123013715.mp4
-rw-r--r-- 1 root root 3.0M 2021-01-23 01:37:35.143043012 +0200 Rooms-20210123013727.mp4
-rw-r--r-- 1 root root 4.6M 2021-01-23 01:37:47.158648808 +0200 Rooms-20210123013735.mp4
-rw-r--r-- 1 root root 3.0M 2021-01-23 01:37:55.131387245 +0200 Rooms-20210123013747.mp4
-rw-r--r-- 1 root root 4.5M 2021-01-23 01:38:07.178992006 +0200 Rooms-20210123013755.mp4
-rw-r--r-- 1 root root 3.1M 2021-01-23 01:38:15.139730846 +0200 Rooms-20210123013807.mp4
-rw-r--r-- 1 root root 2.6M 2021-01-23 01:38:21.707515387 +0200 Rooms-20210123013815.mp4
-rw-r--r-- 1 root root 5.0M 2021-01-23 01:36:54.296383165 +0200 front_gate_Right-20210123013643.mp4
-rw-r--r-- 1 root root 5.0M 2021-01-23 01:37:04.697041916 +0200 front_gate_Right-20210123013654.mp4
-rw-r--r-- 1 root root 5.0M 2021-01-23 01:37:15.095700740 +0200 front_gate_Right-20210123013704.mp4
-rw-r--r-- 1 root root 4.0M 2021-01-23 01:37:23.415427777 +0200 front_gate_Right-20210123013715.mp4
-rw-r--r-- 1 root root 5.0M 2021-01-23 01:37:33.818086482 +0200 front_gate_Right-20210123013723.mp4
-rw-r--r-- 1 root root 5.0M 2021-01-23 01:37:44.216745324 +0200 front_gate_Right-20210123013733.mp4
-rw-r--r-- 1 root root 5.0M 2021-01-23 01:37:54.617404108 +0200 front_gate_Right-20210123013744.mp4
-rw-r--r-- 1 root root 5.0M 2021-01-23 01:38:05.017062930 +0200 front_gate_Right-20210123013754.mp4
-rw-r--r-- 1 root root 4.0M 2021-01-23 01:38:13.335790026 +0200 front_gate_Right-20210123013805.mp4
-rw-r--r-- 1 root root 4.3M 2021-01-23 01:38:21.866510171 +0200 front_gate_Right-20210123013813.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:36:55.293350454 +0200 front_gate_left-20210123013645.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:05.296022263 +0200 front_gate_left-20210123013655.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:15.286694473 +0200 front_gate_left-20210123013705.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:25.286366394 +0200 front_gate_left-20210123013715.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:35.287038288 +0200 front_gate_left-20210123013725.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:45.286710221 +0200 front_gate_left-20210123013735.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:37:55.286382160 +0200 front_gate_left-20210123013745.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:38:05.287054073 +0200 front_gate_left-20210123013755.mp4
-rw-r--r-- 1 root root 4.8M 2021-01-23 01:38:15.289725926 +0200 front_gate_left-20210123013805.mp4
-rw-r--r-- 1 root root 3.3M 2021-01-23 01:38:21.571519849 +0200 front_gate_left-20210123013815.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:36:55.142355408 +0200 intercom-20210123013645.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:37:05.159026758 +0200 intercom-20210123013655.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:37:15.148699001 +0200 intercom-20210123013705.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:37:25.145371020 +0200 intercom-20210123013715.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:37:35.160042455 +0200 intercom-20210123013725.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:37:45.149714716 +0200 intercom-20210123013735.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:37:55.143386852 +0200 intercom-20210123013745.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:38:05.153058469 +0200 intercom-20210123013755.mp4
-rw-r--r-- 1 root root 2.3M 2021-01-23 01:38:15.150730485 +0200 intercom-20210123013805.mp4
-rw-r--r-- 1 root root 1.6M 2021-01-23 01:38:21.835511188 +0200 intercom-20210123013815.mp4
blakeblackshear commented 3 years ago

Those are 10-second segments. You have a max_seconds of 300 (plus a 90 second buffer). Assuming 5 MB per 10 seconds, you get 5 MB × 39 = 195 MB per camera needed in cache space. With 13 cameras, that's ~2.5 GB needed. It looks like you have a 16 GB tmpfs volume mounted at /tmp/cache (wow, that's a lot of RAM), so you should be fine. There must be an exception in the logs related to the thread that cleans up the cache files. Mine are being cleaned up just fine in testing with a similar config.
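The arithmetic above can be written out as a quick sanity check (the 5 MB per 10 seconds figure is the assumption from this thread, not a Frigate constant):

```python
# Back-of-the-envelope sizing for /tmp/cache.
# Assumes 10-second segments at ~5 MB each (numbers from this thread).
def cache_needed_mb(max_seconds=300, buffer_seconds=90,
                    segment_seconds=10, segment_mb=5, cameras=13):
    segments_per_camera = (max_seconds + buffer_seconds) // segment_seconds
    return segments_per_camera * segment_mb * cameras

print(cache_needed_mb())  # 39 segments * 5 MB * 13 cameras = 2535 MB (~2.5 GB)
```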

cybertza commented 3 years ago

So mine is working at this second, but like I said it's random. How do I enable the correct log to be able to see what happens?

Where is the code located for the cleaner?

cybertza commented 3 years ago

Yeah, that's kind of how I have been testing, but it does max out somehow, and then it starts writing 0-byte files and so on.

blakeblackshear commented 3 years ago

The very end of this block is where the files are deleted. I think I'm going to add some debug logging and push up an RC4.

cybertza commented 3 years ago

Cool, would you be able to create a Docker image for that as well, or another way for me to test on Unraid?

cybertza commented 3 years ago

So the things I notice: a sudden increase in /tmp/cache usage, after which 0-size files get written, and then ffmpeg starts producing errors. So it may be good to also check that a file being loaded is not 0 size.

Referring to this: `lvalue and rvalue have different structures`

cybertza commented 3 years ago

As an idea, while you're looking at this code it may be nice to set a watermark warning for the folder, e.g. less than 10% free or whatever, or report cache free space to MQTT or status.
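A minimal version of that watermark check can be sketched with the standard library (the path and the 10% threshold are just the examples from this comment):

```python
import shutil

# Warn when free space in the cache volume drops below a watermark.
# Path and threshold are example values, not Frigate settings.
def cache_watermark_ok(path="/tmp/cache", min_free_fraction=0.10):
    usage = shutil.disk_usage(path)  # named tuple: total, used, free (bytes)
    return usage.free / usage.total >= min_free_fraction
```

A caller could log a warning (or publish to MQTT) whenever this returns False.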

For the process:

An elegant way may be to first delete all files that are older than x, and also 0-byte files.

x = 90 sec (or pre-record) + max clip length + 60 seconds (for safety, which is really rather long in the scheme of things)

Technically the threshold could be 150 seconds longer than the max clip length, because anything older should already have been offloaded.

The logic behind this is that if a process gets stuck, perhaps it doesn't take everything else down with it.
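The deletion pass described above could look roughly like this (a hypothetical `prune_cache` helper, not Frigate's actual cleanup code; the thresholds are the numbers proposed in this comment):

```python
import os
import time

PRE_RECORD = 90   # seconds of pre-event buffer
MAX_CLIP = 300    # max clip length from the config
SAFETY = 60       # extra margin
MAX_AGE = PRE_RECORD + MAX_CLIP + SAFETY  # 450 seconds

def prune_cache(cache_dir="/tmp/cache", max_age=MAX_AGE, now=None):
    """Delete cache files older than max_age seconds, plus 0-byte files."""
    now = now if now is not None else time.time()
    removed = []
    for name in os.listdir(cache_dir):
        path = os.path.join(cache_dir, name)
        if not os.path.isfile(path):
            continue
        st = os.stat(path)
        # Stale segments should already have been offloaded; 0-byte files
        # are the broken writes observed when the volume fills up.
        if st.st_size == 0 or now - st.st_mtime > max_age:
            os.remove(path)
            removed.append(name)
    return removed
```

Running something like this before rebuilding the file list would keep one stuck event from holding the whole cache hostage.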

cybertza commented 3 years ago

I guess, basically do this before the rest, and then rebuild the list? https://github.com/blakeblackshear/frigate/blob/c6044ba9a1ca40830df63b14b746dc8961e5ecde/frigate/events.py#L90

cybertza commented 3 years ago

Could this also under some unique conditions this could cause an issue? https://github.com/blakeblackshear/frigate/blob/c6044ba9a1ca40830df63b14b746dc8961e5ecde/frigate/events.py#L85

I don't know at this moment how often this code gets called, but it would not be impossible for different cameras to keep this flag high. For instance, if there is movement across all the cameras at the same time (say cam 1 has motion from 12:00 to 12:04, and cam 2 has motion from 12:03 to 12:06), could they cyclically keep locking the cameras up?

cybertza commented 3 years ago

Just for some feedback: I haven't had an overgrowth of /tmp/cache for a while now, but it was very reproducible earlier.

Every 5.0s: date && df -h /tmp/cache                                  57a91bf992de: Sat Jan 23 02:35:26 2021

Sat Jan 23 02:35:26 SAST 2021
Filesystem      Size  Used Avail Use% Mounted on
tmpfs            16G  485M   16G   3% /tmp/cache
blakeblackshear commented 3 years ago

Could this also under some unique conditions this could cause an issue? https://github.com/blakeblackshear/frigate/blob/c6044ba9a1ca40830df63b14b746dc8961e5ecde/frigate/events.py#L85

I don't know at this moment how often this code gets called, but it would not be impossible for different cameras to keep this flag high. For instance, if there is movement across all the cameras at the same time (say cam 1 has motion from 12:00 to 12:04, and cam 2 has motion from 12:03 to 12:06), could they cyclically keep locking the cameras up?

That's why max_seconds is a global setting and not a camera-specific one. I could break that out by camera, but that doesn't explain why your 16 GB cache volume would be filling up.

Ysbrand commented 3 years ago

@Ysbrand the thread is crashing because of the "database is locked" error. Others have reported this when they have the database stored on a network drive or other slow disk. What are you using for your clips directory?

No, my database and my media files are on a local filesystem inside HA. The disk is an SSD.

Ysbrand commented 3 years ago

Updated to RC 4 but still the same issue:

 * Starting nginx nginx
   ...done.
frigate.app                    INFO    : Creating directory: /tmp/cache
frigate.app                    INFO    : Creating tmpfs of size 512m
frigate.app                    WARNING : Camera Voordeur has rtmp enabled, but rtmp is not assigned to an input.
Starting migrations
peewee_migrate                 INFO    : Starting migrations
There is nothing to migrate
peewee_migrate                 INFO    : There is nothing to migrate
detector.coral                 INFO    : Starting detection process: 37
frigate.app                    INFO    : Camera processor started for Voordeur: 40
frigate.edgetpu                INFO    : Attempting to load TPU as usb
frigate.app                    INFO    : Capture process started for Voordeur: 41
frigate.edgetpu                INFO    : TPU found
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x55970c9c10] moov atom not found
/media/frigate/recordings/Voordeur-20210122224003.mp4: Invalid data found when processing input
frigate.record                 INFO    : bad file: Voordeur-20210122224003.mp4
watchdog.Voordeur              INFO    : No frames received from Voordeur in 20 seconds. Exiting ffmpeg...
watchdog.Voordeur              INFO    : Waiting for ffmpeg to exit gracefully...
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. read of closed file
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. read of closed file
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. read of closed file
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. read of closed file
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. read of closed file
frigate.video                  INFO    : Voordeur: ffmpeg process is not running. exiting capture thread...
watchdog.Voordeur              INFO    : No frames received from Voordeur in 20 seconds. Exiting ffmpeg...
watchdog.Voordeur              INFO    : Waiting for ffmpeg to exit gracefully...
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg process is not running. exiting capture thread...
watchdog.Voordeur              INFO    : No frames received from Voordeur in 20 seconds. Exiting ffmpeg...
watchdog.Voordeur              INFO    : Waiting for ffmpeg to exit gracefully...
Exception in thread event_processor:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
    cursor.execute(sql, params or ())
sqlite3.OperationalError: database is locked
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/opt/frigate/frigate/events.py", line 188, in run
    Event.create(
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 6338, in create
frigate.video                  INFO    : Voordeur: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
frigate.video                  INFO    : Voordeur: ffmpeg process is not running. exiting capture thread...
    inst.save(force_insert=True)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 6548, in save
    pk = self.insert(**field_dict).execute()
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1898, in inner
    return method(self, database, *args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1969, in execute
    return self._execute(database)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2730, in _execute
    return super(Insert, self)._execute(database)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2465, in _execute
    cursor = database.execute(self)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3142, in execute
    return self.execute_sql(sql, params, commit=commit)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3136, in execute_sql
    self.commit()
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2902, in __exit__
    reraise(new_type, new_type(exc_value, *exc_args), traceback)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 185, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
    cursor.execute(sql, params or ())
peewee.OperationalError: database is locked
-rw-r--r-- 1 root root 2530523 Jan 23 08:39 Voordeur-20210123083933.mp4
-rw-r--r-- 1 root root 2515425 Jan 23 08:39 Voordeur-20210123083943.mp4
-rw-r--r-- 1 root root 2494974 Jan 23 08:40 Voordeur-20210123083953.mp4
-rw-r--r-- 1 root root 2557094 Jan 23 08:40 Voordeur-20210123084003.mp4
SNIP
-rw-r--r-- 1 root root 2513165 Jan 23 08:48 Voordeur-20210123084823.mp4
-rw-r--r-- 1 root root 2524698 Jan 23 08:48 Voordeur-20210123084833.mp4
drwxrwxrwt 2 root root    1160 Jan 23 08:48 .
-rw-r--r-- 1 root root  262192 Jan 23 08:48 Voordeur-20210123084843.mp4

root@ccab4aaf-frigate-beta:/tmp/cache# df -h
Filesystem      Size  Used Avail Use% Mounted on
overlay         220G   37G  172G  18% /
tmpfs            64M     0   64M   0% /dev
tmpfs           1.9G     0  1.9G   0% /sys/fs/cgroup
shm              64M  1.2M   63M   2% /dev/shm
/dev/root       103M  103M     0 100% /usr/sbin/docker-init
/dev/sda8       220G   37G  172G  18% /data
tmpfs           512M   73M  440M  15% /tmp/cache
root@ccab4aaf-frigate-beta:/tmp/cache# du -sk .
75628	.

Is there any way we can check the current or last query that locked the database?

@blakeblackshear : if it helps I can send you a copy of the database file.
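For what it's worth, a common generic mitigation for sqlite3 "database is locked" errors with concurrent writer threads is enabling WAL journal mode and a busy timeout. This is plain SQLite usage, not necessarily the change Frigate ended up making:

```python
import sqlite3

def open_db(path):
    # timeout: wait up to 5 s for a lock instead of raising immediately
    conn = sqlite3.connect(path, timeout=5.0)
    # WAL mode lets readers proceed while a writer holds the database
    conn.execute("PRAGMA journal_mode=wal;")
    return conn
```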
cybertza commented 3 years ago

Please supply your `df -h` from the host when you update.

Ysbrand commented 3 years ago

Please supply your `df -h` from the host when you update.

Done

cybertza commented 3 years ago

@blakeblackshear

Seems i found the moment the Cache starts growing:

bwlGYhYsD5QNxB4+bGeKid2g8j/9k=', True, False])
2021-01-23T08:22:43.662149002Z peewee                         DEBUG   : ('SELECT "t1"."id", "t1"."label", "t1"."camera", "t1"."start_time", "t1"."end_time", "t1"."top_score", "t1"."false_positive", "t1"."zones", "t1"."thumbnail", "t1"."has_clip", "t1"."has_snapshot" FROM "event" AS "t1" WHERE ((("t1"."camera" = ?) AND ("t1"."start_time" < ?)) AND ("t1"."label" = ?))', ['Rooms', 1610526163.66087, 'car'])
2021-01-23T08:22:43.845123220Z peewee                         DEBUG   : ('UPDATE "event" SET "has_clip" = ? WHERE ((("event"."camera" = ?) AND ("event"."start_time" < ?)) AND ("event"."label" = ?))', [False, 'Rooms', 1610526163.66087, 'car'])
2021-01-23T08:22:43.846438008Z Exception in thread event_cleanup:
2021-01-23T08:22:43.846911516Z Traceback (most recent call last):
2021-01-23T08:22:43.847127032Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
2021-01-23T08:22:43.848788303Z     cursor.execute(sql, params or ())
2021-01-23T08:22:43.848806141Z sqlite3.OperationalError: database is locked
2021-01-23T08:22:43.848811259Z 
2021-01-23T08:22:43.848815106Z During handling of the above exception, another exception occurred:
2021-01-23T08:22:43.848818699Z 
2021-01-23T08:22:43.848822098Z Traceback (most recent call last):
2021-01-23T08:22:43.848826270Z   File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
2021-01-23T08:22:43.849587243Z     self.run()
2021-01-23T08:22:43.849600706Z   File "/opt/frigate/frigate/events.py", line 304, in run
2021-01-23T08:22:43.849703422Z     self.expire('clips')
2021-01-23T08:22:43.849713510Z   File "/opt/frigate/frigate/events.py", line 288, in expire
2021-01-23T08:22:43.849880541Z     update_query.execute()
2021-01-23T08:22:43.849890215Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1898, in inner
2021-01-23T08:22:43.850244709Z     return method(self, database, *args, **kwargs)
2021-01-23T08:22:43.850255009Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1969, in execute
2021-01-23T08:22:43.850720765Z     return self._execute(database)
2021-01-23T08:22:43.850733495Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2465, in _execute
2021-01-23T08:22:43.854252647Z     cursor = database.execute(self)
2021-01-23T08:22:43.854280887Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3142, in execute
2021-01-23T08:22:43.854752113Z     return self.execute_sql(sql, params, commit=commit)
2021-01-23T08:22:43.854766250Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3136, in execute_sql
2021-01-23T08:22:43.855161640Z     self.commit()
2021-01-23T08:22:43.855172547Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2902, in __exit__
2021-01-23T08:22:43.855544292Z     reraise(new_type, new_type(exc_value, *exc_args), traceback)
2021-01-23T08:22:43.855555373Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 185, in reraise
2021-01-23T08:22:43.855599047Z     raise value.with_traceback(tb)
2021-01-23T08:22:43.855609264Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
2021-01-23T08:22:43.855927109Z     cursor.execute(sql, params or ())
2021-01-23T08:22:43.855937687Z peewee.OperationalError: database is locked
2021-01-23T08:22:48.576577961Z Exception in thread event_processor:
2021-01-23T08:22:48.576607841Z Traceback (most recent call last):
2021-01-23T08:22:48.576612823Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
2021-01-23T08:22:48.576974168Z     cursor.execute(sql, params or ())
2021-01-23T08:22:48.576984553Z sqlite3.OperationalError: database is locked
2021-01-23T08:22:48.576988413Z 
2021-01-23T08:22:48.576993687Z During handling of the above exception, another exception occurred:
2021-01-23T08:22:48.576997396Z 
2021-01-23T08:22:48.577000605Z Traceback (most recent call last):
2021-01-23T08:22:48.577008156Z   File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
2021-01-23T08:22:48.577136749Z     self.run()
2021-01-23T08:22:48.577152201Z   File "/opt/frigate/frigate/events.py", line 188, in run
2021-01-23T08:22:48.577185155Z     Event.create(
2021-01-23T08:22:48.577193252Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 6338, in create
2021-01-23T08:22:48.577848897Z     inst.save(force_insert=True)
2021-01-23T08:22:48.577861769Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 6548, in save
2021-01-23T08:22:48.578490867Z     pk = self.insert(**field_dict).execute()
2021-01-23T08:22:48.578501147Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1898, in inner
2021-01-23T08:22:48.578712579Z     return method(self, database, *args, **kwargs)
2021-01-23T08:22:48.578736336Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 1969, in execute
2021-01-23T08:22:48.579005138Z     return self._execute(database)
2021-01-23T08:22:48.579015532Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2730, in _execute
2021-01-23T08:22:48.579328361Z     return super(Insert, self)._execute(database)
2021-01-23T08:22:48.579343511Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2465, in _execute
2021-01-23T08:22:48.579672351Z     cursor = database.execute(self)
2021-01-23T08:22:48.579683015Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3142, in execute
2021-01-23T08:22:48.580195289Z     return self.execute_sql(sql, params, commit=commit)
2021-01-23T08:22:48.580482620Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3136, in execute_sql
2021-01-23T08:22:48.581077074Z     self.commit()
2021-01-23T08:22:48.581100883Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 2902, in __exit__
2021-01-23T08:22:48.581542853Z     reraise(new_type, new_type(exc_value, *exc_args), traceback)
2021-01-23T08:22:48.581564190Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 185, in reraise
2021-01-23T08:22:48.581643095Z     raise value.with_traceback(tb)
2021-01-23T08:22:48.581651097Z   File "/usr/local/lib/python3.8/dist-packages/peewee.py", line 3129, in execute_sql
2021-01-23T08:22:48.582020365Z     cursor.execute(sql, params or ())
2021-01-23T08:22:48.582030937Z peewee.OperationalError: database is locked

It seems quite a few other processes also stop producing debug log output, and then I just get INFO logs from there.

It also seems that I had multiple cameras trigger at the same time just before that.


But the bottom line is that my cache was sitting at around 4% with no issues, and now it's growing. I don't see exactly why this happened, or how I can test further for you.

cybertza commented 3 years ago

Before that, the cache was clearing nicely, with log messages.

blakeblackshear commented 3 years ago

I know what to try next. Looks like an RC5 is going to be needed.

cybertza commented 3 years ago

I’m keen to test 😄, at least we're narrowing it down. Peewee, no peewee no more 🥺

blakeblackshear commented 3 years ago

@cybertza give RC5 a try: https://github.com/blakeblackshear/frigate/releases/tag/v0.8.0-rc5

cybertza commented 3 years ago

Started running it now, will keep you posted. Thanks for the update.

cybertza commented 3 years ago

Hey @blakeblackshear, it seems like it's been stable for 24 hours.

Ysbrand commented 3 years ago

I have not seen any occurrence of cache fill up after upgrading to RC5.

MickPBduece commented 3 years ago

I have installed the latest Frigate RC (HA add-on) on Debian 64-bit on a Pi 4 and am having an issue with the performance tuning options. When I add them, I see the lvalue/rvalue error in the log.

I am not certain how to add the logger commands, but I can provide this and any other info of help.

Thanks

[s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] done.
[services.d] starting services
[services.d] done.
[2021-09-24 15:51:37] frigate.app INFO : Starting Frigate (0.9.0-a943ac1)
[2021-09-24 15:51:37] frigate.app INFO : Creating directory: /tmp/cache
Starting migrations
[2021-09-24 15:51:37] peewee_migrate INFO : Starting migrations
There is nothing to migrate
[2021-09-24 15:51:37] peewee_migrate INFO : There is nothing to migrate
[2021-09-24 15:51:37] frigate.mqtt INFO : MQTT connected
[2021-09-24 15:51:37] detector.coral INFO : Starting detection process: 214
[2021-09-24 15:51:37] frigate.app INFO : Output process started: 216
[2021-09-24 15:51:37] frigate.app INFO : Camera processor started for front_door: 219
[2021-09-24 15:51:37] ws4py INFO : Using epoll
[2021-09-24 15:51:37] frigate.app INFO : Camera processor started for back: 222
[2021-09-24 15:51:37] frigate.app INFO : Capture process started for front_door: 225
[2021-09-24 15:51:37] frigate.app INFO : Capture process started for back: 228
[2021-09-24 15:51:37] ws4py INFO : Using epoll
[2021-09-24 15:51:37] frigate.edgetpu INFO : Attempting to load TPU as usb
[2021-09-24 15:51:39] frigate.edgetpu INFO : TPU found
[2021-09-24 15:51:42] frigate.video INFO : back: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
[2021-09-24 15:51:42] frigate.video INFO : back: ffmpeg process is not running. exiting capture thread...
[2021-09-24 15:51:42] frigate.video INFO : front_door: ffmpeg sent a broken frame. memoryview assignment: lvalue and rvalue have different structures
[2021-09-24 15:51:42] frigate.video INFO : front_door: ffmpeg process is not running. exiting capture thread...
[2021-09-24 15:51:50] ws4py INFO : Managing websocket [Local => 127.0.0.1:5002 | Remote => 127.0.0.1:57858]
[2021-09-24 15:51:57] ws4py INFO : Terminating websocket [Local => 127.0.0.1:5002 | Remote => 127.0.0.1:57858]
[2021-09-24 15:51:57] watchdog.front_door ERROR : FFMPEG process crashed unexpectedly for front_door.
[2021-09-24 15:51:57] watchdog.front_door ERROR : The following ffmpeg logs include the last 100 lines prior to exit.
[2021-09-24 15:51:57] watchdog.front_door ERROR : You may have invalid args defined for this camera.
[2021-09-24 15:51:57] ffmpeg.front_door.detect ERROR : Guessed Channel Layout for Input Stream #0.1 : mono
[2021-09-24 15:51:57] ffmpeg.front_door.detect ERROR : [h264_v4l2m2m @ 0x1927230] Could not find a valid device
[2021-09-24 15:51:57] ffmpeg.front_door.detect ERROR : [h264_v4l2m2m @ 0x1927230] can't configure decoder
[2021-09-24 15:51:57] ffmpeg.front_door.detect ERROR : Error while opening decoder for input stream #0:0 : Invalid argument
[2021-09-24 15:51:57] watchdog.back ERROR : FFMPEG process crashed unexpectedly for back.
[2021-09-24 15:51:57] watchdog.back ERROR : The following ffmpeg logs include the last 100 lines prior to exit.
[2021-09-24 15:51:57] watchdog.back ERROR : You may have invalid args defined for this camera.
[2021-09-24 15:51:57] ffmpeg.back.detect ERROR : [h264_v4l2m2m @ 0xeae580] Could not find a valid device
[2021-09-24 15:51:57] ffmpeg.back.detect ERROR : [h264_v4l2m2m @ 0xeae580] can't configure decoder
[2021-09-24 15:51:57] ffmpeg.back.detect ERROR : Error while opening decoder for input stream #0:0 : Invalid argument