blakeblackshear / frigate

NVR with realtime local object detection for IP cameras
https://frigate.video
MIT License

[Support]: Not moving car #2621

Closed. Pascal66 closed this issue 2 years ago.

Pascal66 commented 2 years ago

Describe the problem you are having

Avoid detection of stationary objects

Version

Debug 0.10.0-db1255a

Frigate config file

detect:
      width: 1920 
      height: 1080 
      fps: 9
      max_disappeared: 60
      stationary_interval: 90

Relevant log output

not relevant

FFprobe output from your camera

not relevant

Frigate stats

not relevant

Operating system

HassOS

Install method

HassOS Addon

Coral version

USB

Network connection

Wired

Camera make and model

other

Any other information that may be helpful

[image attached]

raintonr commented 2 years ago

See here: https://docs.frigate.video/guides/stationary_objects

However... I don't think this will work here, because I want detection of other objects (people, dogs, etc.) in a driveway where a car is usually parked.

Scenarios:

  1. Our car may be parked in the driveway, which I would like to be ignored, but if a person walks up the driveway past the car I want to record that.
  2. We may leave home in the car, so I would like a recording when another car enters this zone (a recording when we actually return would be OK too).
  3. When our camera switches between day and night mode and the car happens to be in the driveway, the object detection notices it and triggers a recording even though there are no new or moved objects in the frame.

I think point 3 is the key: recordings should only be triggered by new or moved objects. Surely it would be easy(ish) to compare the array of currently detected objects and prevent a new event from triggering if those objects are the same (similar, based on a threshold) as the ones in the last event triggered on this camera?

blakeblackshear commented 2 years ago

However... I don't think this will work here, because I want detection of other objects (people, dogs, etc.) in a driveway where a car is usually parked

I am not sure why you would think that. You can target specific objects with zones.
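
For example, something along these lines (the zone name and coordinates here are only placeholders):

    zones:
      driveway:
        # placeholder coordinates: draw this polygon around your own driveway
        coordinates: 0,1080,700,400,1300,400,1920,1080
        objects:
          - car

An object type that isn't listed for a zone can't activate that zone, so each zone can be scoped to the objects you care about in that area.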

Surely it would be easy(ish) to compare the array of currently detected objects and prevent a new event from triggering if those objects are the same (similar, based on a threshold) as the ones in the last event triggered on this camera?

Not at all. This is so much more complicated than you are imagining.

blakeblackshear commented 2 years ago

@Pascal66 can you post an image of your entire camera frame so I can make some suggestions?

Pascal66 commented 2 years ago

[images attached]

blakeblackshear commented 2 years ago

You should try following the guide for stationary objects by creating two zones for your driveway and requiring both for events/notifications: https://docs.frigate.video/guides/stationary_objects/
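
Very roughly, something like this (the zone names and coordinates are placeholders; the guide explains how to position them and how to require both zones for your events/notifications):

    zones:
      driveway_parking:
        # placeholder: the area where the car normally sits
        coordinates: 0,1080,500,600,900,1080
      driveway_entrance:
        # placeholder: the area an arriving or leaving car has to cross
        coordinates: 900,1080,1300,600,1920,1080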

raintonr commented 2 years ago

I am not sure why you would think that. You can target specific objects with zones.

I've read the stationary objects guide, but that is not suitable here because we have a camera looking down the entire length of the driveway, and it isn't angled in a way that allows a separate 'driveway entrance' zone.

Another possible solution might be to turn zones/masks on and off dynamically (via the API/UI). Without that, in this example, one can either always have the car detected whenever any other motion occurs (well, whenever our car is parked there) or never detect it. I have chosen the latter for now, with a car mask that ignores the area in which our parked car is usually seen. In fact, such a mask is likely the solution for the OP, but code to ignore previously detected objects until they disappear and come back seems a far better solution.

blakeblackshear commented 2 years ago

This could easily be broken into two zones, like this: [annotated image attached]

Pascal66 commented 2 years ago

If I do:

camera:
  record:
    events:
      required_zones:
        - entrance

This can't work, as it's a condition applied to every other event that doesn't need to use the entrance zone (someone could jump over the enclosure wall into the garden, or come from the back of the house). After testing, it also doesn't work if someone steals the car (or the car simply drives away from its parking spot), as it's only a 'one-directional' condition.

Then again, I may well be wrong about all the configuration parts: max_disappeared, stationary_interval, required_zones, restricting zones, etc.
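
For reference, here is where I think these options belong (please correct me if I've got this wrong):

# under the camera (or globally):
detect:
  max_disappeared: 60      # frames without a detection before the object is considered gone
  stationary_interval: 90  # how often (in frames) detection re-runs on a stationary object
record:
  events:
    required_zones:        # only keep event recordings for objects that entered these zones
      - entrance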

raintonr commented 2 years ago

As I said earlier, I solved a similar problem by using a mask that filters out 'car' only in the area where it is usually parked.

If you do this, then when the car moves into your driveway 'entrance' zone (yes, you can use that approach as proposed by @blakeblackshear - I just can't set up such an entrance zone due to the angle of our camera), it will be detected there. If a person is present in the 'car' mask (or anywhere else, for that matter), they will still be detected, because... well... a person is not a car.

blakeblackshear commented 2 years ago

This can't work, as it's a condition applied to every other event that doesn't need to use the entrance zone (someone could jump over the enclosure wall into the garden, or come from the back of the house). After testing, it also doesn't work if someone steals the car (or the car simply drives away from its parking spot), as it's only a 'one-directional' condition.

This is easily handled by using a zone for the other areas where you want detection. I can show you how it's done, but you never provided your config file.

Pascal66 commented 2 years ago

I'm trying the @raintonr approach for 1 or 2 days, and I'll go back to the required_zones method afterwards. But why not use the stationary object handling alone for this? I.e. use the mask method's coordinates and treat it as a stationary object once it has been detected as a car for 5 or 10 frames?

Pascal66 commented 2 years ago

The mask solution doesn't work for me: ~1 event every 5 minutes. Back to required_zones.

blakeblackshear commented 2 years ago

Again, if you provide your config so I can see the existing zones, I can make a suggestion.

Pascal66 commented 2 years ago

# frigate.yml
detectors:
#  first_cpu:
#    type: cpu
    # Optional: num_threads value passed to the tflite.Interpreter (default: shown below)
    # This value is only used for CPU types
#    num_threads: 3
#  second_cpu:
#    type: cpu
    # Optional: num_threads value passed to the tflite.Interpreter (default: shown below)
    # This value is only used for CPU types
#    num_threads: 1
#  third_cpu:
#    type: cpu
    # Optional: num_threads value passed to the tflite.Interpreter (default: shown below)
    # This value is only used for CPU types
#    num_threads: 1
#  coral_pci:
#    type: edgetpu
#    device: pci
    # Optional: num_threads value passed to the tflite.Interpreter (default: shown below)
    # This value is only used for CPU types
#    num_threads: 3
  coral_usb:
    type: edgetpu
    device: usb
    # Optional: num_threads value passed to the tflite.Interpreter (default: shown below)
    # This value is only used for CPU types
    num_threads: 3
#  cpu2:
#    type: cpu
#  cpu3:
#    type: cpu
#detectors:
#  coral:
#    type: edgetpu
#    device: usb:0
#  coral2:
#    type: edgetpu
#    device: usb:1
#  coral3:
#    type: edgetpu
#    device: usb:2
#  coral4:
#    type: edgetpu
#    device: usb:3
#  coral5:
#    type: edgetpu
#    device: usb:4
logger:
  default: info
# v0.9b
birdseye:
  enabled: False

mqtt:
  # Required: host name
  host: core-mosquitto
  # Optional: port (default: shown below)
  port: 1883
  # Optional: topic prefix (default: shown below)
  # WARNING: must be unique if you are running multiple instances
  topic_prefix: frigate
  # Optional: client id (default: shown below)
  # WARNING: must be unique if you are running multiple instances
  client_id: frigate
  # Optional: user
  user: user
  # Optional: password
  # NOTE: Environment variables that begin with 'FRIGATE_' may be referenced in {}.
  #       eg. password: '{FRIGATE_MQTT_PASSWORD}'
  password: passwd
  # Optional: interval in seconds for publishing stats (default: shown below)
  stats_interval: 30

ffmpeg:
  global_args:
#    - -an
#    - -dn
#    - -vf
#    - -hide_banner
#    - -loglevel
#    - info
    - -threads 
    - '2'
  input_args:
    - -strict
    - experimental
#    - -fflags
#    - low_delay
#    - -fflags
#    - nobuffer
#    - -avoid_negative_ts
#    - make_zero
#    - -fflags
#    - +genpts+discardcorrupt
#    - -vsync
#    - drop
#    - -rtsp_transport
#    - tcp
#    - -stimeout
#    - '15000000'
#    - -use_wallclock_as_timestamps
#    - '1'

cameras:
  # Name of your camera
  front_door:
    ffmpeg:
      hwaccel_args: 
#rpi
        #- -c:v
        #- h264_v4l2m2m
#intel
        - -hwaccel
        - auto
        #- qsv
        #- -qsv_device
        #- /dev/dri/renderD128

      inputs:
#        - path: rtsp://admin:admin@192.168.27.147:554/mode=real&idc=1&ids=2
#          roles:
#            - detect
#            - rtmp
        - path: rtsp://admin:admin@192.168.27.147:554/mode=real&idc=1&ids=1
          roles:
#            - detect
# v0.9b
            - record
# v0.8
#            - clips
#            - snapshots
    objects:
      filters:
        person:
          # Optional: minimum width*height of the bounding box for the detected object (default: 0)
#low res
#          min_area:  2000
#          max_area: 30000
          # Optional: maximum width*height of the bounding box for the detected object (default: 24000000)
#high res
          min_area:  25000
          max_area: 150000
          # Optional: minimum score for the object to initiate tracking (default: shown below)
          min_score: 0.68
          # Optional: minimum decimal percentage for tracked object's computed score to be considered a true positive 
          threshold: 0.70
        car:
          # Optional: minimum width*height of the bounding box for the detected object (default: 0)
#high res
          min_area: 35000
          max_area: 100000
#          mask:
#            - 403,678,273,721,191,263,369,242
      track:
        - person
        - bicycle
        - car
        #- motorcycle
        #- airplane
        #- bus
        #- train
        #- boat
        #- traffic light
        #- fire hydrant
        #- stop sign
        #- parking meter
        #- bench
        - bird
        - cat
        #- dog
        #- horse
        #- sheep
        #- cow
        #- elephant
        #- bear
        #- zebra
        #- giraffe
        #- backpack
        #- umbrella
        #- handbag
        #- tie
        #- suitcase
        #- frisbee
        #- skis
        #- snowboard
        #- sports ball
        #- kite
        #- baseball bat
        #- baseball glove
        #- skateboard
        #- surfboard
        #- tennis racket
        #- bottle
        #- wine glass
        #- cup
        #- fork
        #- knife
        #- spoon
        #- bowl
        #- banana
        #- apple
        #- sandwich
        #- orange
        #- broccoli
        #- carrot
        #- hot dog
        #- pizza
        #- donut
        #- cake
        #- chair
        #- couch
        #- potted plant
        #- bed
        #- dining table
        #- toilet
        #- tv
        #- laptop
        #- mouse
        #- remote
        #- keyboard
        #- cell phone
        #- microwave
        #- oven
        #- toaster
        #- sink
        #- refrigerator
        #- book
        #- clock
        #- vase
        #- scissors
        #- teddy bear
        #- hair drier
        #- toothbrush
# v0.9b
    detect:
      width: 1920 #640 
      height: 1080 #360 
      fps: 9
      max_disappeared: 60
      stationary_interval: 90

    rtmp:
      enabled: False 
    record:
      enabled: True
      retain:
        days: 1
        mode: all #motion
#      bounding_box: True
      events:
        required_zones:
          - portail
#        enabled: True
#        max_seconds: 15
        pre_capture: 5
        post_capture: 5
        retain:
          default: 1
          mode: all #active_objects
          objects:
            person: 3
            cat: 1
            car: 1
            bird: 1
            bicycle: 1
    snapshots:
  # Optional: Enable writing jpg snapshot to /media/frigate/clips (default: shown below)
  # This value can be set via MQTT and will be updated in startup based on retained value
      enabled: True
  # Optional: print a timestamp on the snapshots (default: shown below)
#      timestamp: False
  # Optional: draw bounding box on the snapshots (default: shown below)
      bounding_box: True
  # Optional: crop the snapshot (default: shown below)
#      crop: False
  # Optional: height to resize the snapshot to (default: original size)
#      height: 175
  # Optional: jpeg encode quality (default: shown below)
      quality: 95
  # Optional: Restrict snapshots to objects that entered any of the listed zones (default: no required zones)
#      required_zones: []
  # Optional: Camera override for retention settings (default: global values)
      retain:
    # Required: Default retention days (default: shown below)
        default: 1
    # Optional: Per object retention days
        objects:
          person: 3
    motion:
      mask:
#high res
        - 556,145,569,384,1021,302,1030,0,0,0,0,334,0,621,0,838,0,1080,403,1080,350,950,223,492,188,229
        - 1612,199,1629,0,1402,0,1153,0,1013,0,1009,239,1016,389,1156,374,1543,283,1517,186
        - 1920,1080,1409,1080,1482,883,1584,422,1608,192,1622,0,1920,0
#low res
#         - 0,0,0,360,133,360,109,309,94,233,74,138,61,73,174,50,181,136,269,136,337,134,339,0
#         - 460,360,503,241,528,114,446,110,382,135,312,137,306,0,640,0,640,360
    zones:  
      portail:
        coordinates: 365,198,531,155,540,297,203,343,188,242
#        objects:
#          - car
      allee:
        coordinates: 438,1080,396,1080,256,670,201,355,374,325,557,301,482,732
      entree:    
        coordinates: 545,438,459,1080,1113,1080,1151,394
      zone_jardin_porte_fenetre:
        coordinates: 1178,733,1397,908,1533,667,1601,189,1523,184,1502,304,1317,356
# v0.9b
    mqtt:
  # Optional: Enable publishing snapshot via mqtt for camera (default: shown below)
  # NOTE: Only applies to publishing image data to MQTT via 'frigate/<camera_name>/<object_name>/snapshot'.
  # All other messages will still be published.
      enabled: True
  # Optional: print a timestamp on the snapshots (default: shown below)
      timestamp: False
  # Optional: draw bounding box on the snapshots (default: shown below)
      bounding_box: True
  # Optional: crop the snapshot (default: shown below)
      crop: True
  # Optional: height to resize the snapshot to (default: shown below)
      height: 480
  # Optional: jpeg encode quality (default: shown below)
      quality: 95
  # Optional: Restrict mqtt messages to objects that entered any of the listed zones (default: no required zones)
      required_zones: []

raintonr commented 2 years ago

That doesn't look right. This works here (the car is ignored inside the masked area):

cameras:
  mycam:
    # Other config stuff...
    objects:
      filters:
        car:
          mask:
            - 206,298,255,454,101,478,99,313

blakeblackshear commented 2 years ago

Here is what I would recommend:

    zones:  
      portail: # i assumed this was the entrance to your driveway
        coordinates: 365,198,531,155,540,297,203,343,188,242
        objects:
         - car
      entire_frame:
        coordinates: 0,0,1920,0,1920,1080,0,1080
        objects:
          - person
          - bicycle
          - bird
          - cat
    snapshots:
      required_zones:
        - portail
        - entire_frame

This will only save snapshots for cars that activate the portail zone, but it will save snapshots of all other objects regardless of where they are in the frame, since the entire_frame zone is active for any location. You can add additional object-specific zones as desired.

When your parked car is detected in its parking spot, it will not be saved because it never enters the portail zone unless it's arriving or leaving. The car object is not able to activate the entire_frame zone, so it should never have either of the required zones.

Keep in mind that presence in a zone is based on the bottom center of the tracked object's bounding box. It doesn't matter how much of the bounding box overlaps with the zone.
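
If you also want recordings (not just snapshots) limited the same way, the same list should work under record.events in your config, along these lines (untested sketch based on your existing settings):

    record:
      events:
        required_zones:
          - portail
          - entire_frame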

raintonr commented 2 years ago

Keep in mind that presence in a zone is based on the bottom center of the tracked object's bounding box. It doesn't matter how much of the bounding box overlaps with the zone.

Now this is super useful info! Could you please add that to the doco (unless I missed it?)?

NickM-27 commented 2 years ago

It is in the docs on this page: https://docs.frigate.video/configuration/masks/

But it took me a while to find; it's pertinent information, so it would be good to call it out specifically.

Pascal66 commented 2 years ago

Here is what I would recommend:

    zones:  
      portail: # i assumed this was the entrance to your driveway
        coordinates: 365,198,531,155,540,297,203,343,188,242
        objects:
         - car
      entire_frame:
        coordinates: 0,0,1920,0,1920,1080,0,1080
        objects:
          - person
          - bicycle
          - bird
          - cat
    snapshots:
      required_zones:
        - portail
        - entire_frame

This will only save snapshots for cars that activate the portail zone, but it will save snapshots of all other objects regardless of where they are in the frame, since the entire_frame zone is active for any location. You can add additional object-specific zones as desired.

When your parked car is detected in its parking spot, it will not be saved because it never enters the portail zone unless it's arriving or leaving. The car object is not able to activate the entire_frame zone, so it should never have either of the required zones.

Keep in mind that presence in a zone is based on the bottom center of the tracked object's bounding box. It doesn't matter how much of the bounding box overlaps with the zone.

Despite my own reasoning, your logic seems to work better than just a mask. So I'm waiting until tomorrow for a real-world case (which I can reproduce myself). As far as I can see for now (it takes 1 or 2 days to be sure), your example works better. I'll confirm in a day.

Pascal66 commented 2 years ago

It's OK for now (sometimes the car now gets detected as a person, but I'll go with your solution). Thank you.