blakeblackshear / frigate

NVR with realtime local object detection for IP cameras
https://frigate.video
MIT License
17.66k stars 1.62k forks

Raspberry pi #109

Closed rourke750 closed 3 years ago

rourke750 commented 4 years ago

Has anyone gotten the latest version working with a Raspberry Pi 4? I am stuck on the OpenCV stuff as there aren't any packages for arch.

kpine commented 4 years ago

opencv-python-headless is provided by piwheels. https://www.piwheels.org/project/opencv-python-headless/

pip3 install --extra-index-url https://www.piwheels.org/simple opencv-python-headless==4.1.0.25

There's currently a bug with newer versions; 4.1.0.25 works without any workarounds.

kpine commented 4 years ago

Some other gotchas with the 0.5.0 update.

You'll need to install the ARM version of the Tensorflow Lite wheel.

The SharedArray python package was added as a dependency, but there's no binary wheel for it, nor are there any Ubuntu/Debian packages from what I can find. It has a dependency on numpy development files, so it will need to be built from scratch.

kpine commented 4 years ago

There's also no wheel for pyarrow.

kpine commented 4 years ago

Hmm, well I've been doing a little more research and this is not boding well for Raspberry Pi users. Hope I'm wrong... :fearful:

The PyArrow python module is now being used in Frigate 0.5.0. As far as I can tell, there is no official support for pyarrow on ARM platforms [1]. It is sounding like it does not build at all for armv7 [2], so if you're using a Raspbian-based OS like I currently am, which only supports 32-bit ARM, you are out of luck.

Otherwise, if you move to an aarch64-based OS, you might be able to compile it yourself [3]. I saw some mention of conda-forge, but I don't see any ARM packages.

[1] https://github.com/apache/incubator-superset/issues/8688
[2] https://issues.apache.org/jira/browse/ARROW-7042
[3] https://gist.github.com/heavyinfo/04e1326bb9bed9cecb19c2d603c8d521

blakeblackshear commented 4 years ago

I can drop the SharedArray dependency, but PyArrow is essential to the way I am sharing frames across multiple processes. I don't think there is any way around it. It might be possible to drop it when Python 3.9 comes out and Tensorflow lite supports it. Honestly, Python is such a PITA when it comes to multiprocessing that I considered rewriting it in Go.

kpine commented 4 years ago

I was able to install SharedArray 3.0.0 without issue; there doesn't seem to be much difference between it and 3.0.2 (I also submitted a bug to piwheels).

What's in Python 3.9 that would let you drop PyArrow?

blakeblackshear commented 4 years ago

Mainly the new pickle 5 protocol and the way it lets you share memory across multiple processes. At the moment, I am jumping through a bunch of hoops to use a separate process for each camera and share a single process for the Coral across all cameras. There are a lot of efforts in the community to address the limitations of the GIL in Python when it comes to distributed processing.
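For reference, the stdlib `multiprocessing.shared_memory` module (added in Python 3.8 alongside pickle protocol 5) illustrates the kind of zero-copy frame sharing being discussed here; a minimal sketch, with illustrative frame dimensions rather than Frigate's actual values:

```python
from multiprocessing import shared_memory

# "Producer" side: allocate a named block sized for one raw RGB frame.
# (Frame dimensions are illustrative, not Frigate's actual values.)
WIDTH, HEIGHT, CHANNELS = 640, 480, 3
shm = shared_memory.SharedMemory(create=True, size=WIDTH * HEIGHT * CHANNELS)
shm.buf[:4] = bytes([10, 20, 30, 40])  # pretend these are pixel bytes

# "Consumer" side: attach to the same block by name -- no copy is made.
view = shared_memory.SharedMemory(name=shm.name)
first_four = list(view.buf[:4])
print(first_four)  # [10, 20, 30, 40]

view.close()
shm.close()
shm.unlink()  # free the segment once all readers are done
```

In a real setup the consumer would be a separate process that receives only the block's name, which is what makes this attractive for passing frames between camera processes.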

zewelor commented 4 years ago

If pickle 5 is the main blocker forcing the use of Arrow, maybe the backport https://github.com/pitrou/pickle5-backport could simplify the setup?
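The out-of-band buffer mechanism that backport provides (PEP 574, in the stdlib from Python 3.8) can be sketched with plain `pickle`; the frame bytes below are illustrative:

```python
import pickle

# Protocol 5 lets large buffers travel out-of-band instead of being
# copied into the pickle stream -- the mechanism that could let frames
# pass between processes without PyArrow's Plasma store.
frame = bytearray(16)  # stand-in for raw frame bytes
frame[0] = 255

buffers = []
payload = pickle.dumps(pickle.PickleBuffer(frame), protocol=5,
                       buffer_callback=buffers.append)

# The payload itself is tiny; the pixel data stayed in `buffers`.
restored = pickle.loads(payload, buffers=buffers)
roundtrip_ok = bytes(restored) == bytes(frame)
print(roundtrip_ok)  # True
```

The out-of-band `buffers` would be transported separately (e.g. through shared memory), which is where the copy savings come from.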

blakeblackshear commented 4 years ago

Possibly. PyArrow is doing a lot of the legwork for managing shared memory, and it would require substantial changes to recreate that functionality. I would want to take a serious look at porting this project to Go before going down that path.

kpine commented 4 years ago

Related question, do you still need to store the frame data in a SharedArray if you are using Plasma?

SharedArray is working in piwheels again, so it's not an issue at all to install on a Raspberry Pi. Just curious if it's still necessary.

blakeblackshear commented 4 years ago

I don't. It is just leftover from before I switched. I will drop it in a future release.

kpine commented 4 years ago

I was able to build a custom pyarrow wheel from the instructions above for my RPi4. I haven't tried frigate yet, but benchmark.py worked (see #111). Hopefully I'll get a chance to try tonight.

kpine commented 4 years ago

So far so good, although it's night time so I haven't gotten to test much. I did stick my head out the front door and it detected me, so I can claim some sort of victory at least. Here are the current debug stats:

{
  "coral": {
    "detection_queue": 0,
    "detection_start": 0,
    "fps": 0,
    "inference_speed": 17.47
  },
  "front_door": {
    "detection_fps": 0,
    "fps": 5.1,
    "skipped_fps": 0
  },
  "plasma_store_rc": null,
  "tracked_objects_queue": 0
}

Nothing unusual in the container logs. CPU usage looks roughly the same as 0.4.0 for me. I only have one camera.
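Those stats lend themselves to simple scripting; below is a hypothetical helper (the key names match the 0.5.0 debug output shown above, everything else is illustrative) that pulls out the numbers worth watching:

```python
import json

def summarize_stats(raw: str) -> dict:
    """Extract Coral inference speed and per-camera fps from the
    debug stats JSON. Only the key names are taken from Frigate's
    0.5.0 output; this helper itself is illustrative."""
    stats = json.loads(raw)
    cameras = {
        name: info["fps"]
        for name, info in stats.items()
        if isinstance(info, dict) and "fps" in info and name != "coral"
    }
    return {
        "inference_ms": stats["coral"]["inference_speed"],
        "camera_fps": cameras,
    }

sample = """{
  "coral": {"detection_queue": 0, "detection_start": 0, "fps": 0, "inference_speed": 17.47},
  "front_door": {"detection_fps": 0, "fps": 5.1, "skipped_fps": 0},
  "plasma_store_rc": null,
  "tracked_objects_queue": 0
}"""
print(summarize_stats(sample))
# {'inference_ms': 17.47, 'camera_fps': {'front_door': 5.1}}
```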

I've uploaded my Dockerfiles for RPi builds to my fork, based on 0.5.0. Dockerfile.rpi-pyarrow will build the pyarrow python wheel and Dockerfile.rpi will build the Frigate image. The wheel binary needs to be copied into the dist subdirectory of the root project directory.

I've also got a Frigate test image built kpine/frigate-raspberrypi:0.5.0-test, as well as an image that contains the wheel only, kpine/raspberrypi-pyarrow-plasma:0.16.0.

mr-onion-2 commented 4 years ago

Hi @kpine

Thanks for this. I feel like a bit of a noob for asking, but please can you advise how you're meant to use the pyarrow image? When I run it, it just exits after a couple of seconds with no sign of the dist files anywhere.

rourke750 commented 4 years ago

If you don't mind waiting an hour or two I can give you a single dockerfile to run that will do everything for you.

mr-onion-2 commented 4 years ago

Awesome thanks :)

kpine commented 4 years ago

@mr-onion-2 You can create a container and copy the dist directory into the frigate project directory.

$ cd ~/src/frigate
$ docker create --name pyarrow kpine/raspberrypi-pyarrow-plasma:0.16.0  # or your own image name
7c28c656658efcd79c84ec52e4de50cedfa57f2c9363b8c63cb683495fb07e37
$ docker cp pyarrow:/dist/ .
$ ls dist
pyarrow-0.16.0-cp37-cp37m-linux_armv7l.whl

mr-onion-2 commented 4 years ago

Thanks - worked perfect :)

mambatronics commented 4 years ago

Hi @kpine, if I may ask, is there any solution for building pyarrow on a Raspberry Pi 3B+? I've been installing the Apache Beam SDK to connect my RPi to the Google Cloud Platform (with my virtual environment activated). I'm a bit of a beginner with Debian and I'm encountering an error: ERROR: Could not build wheels for pyarrow which use PEP 517 and cannot be installed directly

Buzztiger commented 4 years ago

Have you tried just using the Docker image @kpine provided (kpine/frigate-raspberrypi:0.5.0-test)?

Works fine on my Pi4. Just follow the original tutorial but change the Docker image in the docker-compose config file to the one from kpine.

mambatronics commented 4 years ago

Hi @Buzztiger thank you so much for this.

I've read some Docker documentation and watched some tutorials, but most of them use Ubuntu as the example for running images. Can I ask how to run and install the docker image that @kpine provided?

I apologize if this is such a noob question but I can't seem to build pyarrow in my RPi. Thank you for your time and understanding.

scstraus commented 4 years ago

What OS are you running on your Pi?

mambatronics commented 4 years ago

@scstraus I've got Raspbian GNU/Linux 10 (buster) installed as my OS. I did the same commands as @kpine advised:

$ mkdir pyarrow_kpine
$ cd pyarrow_kpine
$ docker create --name pyarrow kpine/raspberrypi-pyarrow-plasma:0.16.0
$ docker cp pyarrow:/dist/ .
$ ls dist
pyarrow-0.16.0-cp37-cp37m-linux_armv7l.whl

I can't seem to figure out how to run and what to do with it as I am just a beginner in these systems.

Buzztiger commented 4 years ago

> Hi @Buzztiger thank you so much for this.
>
> I've read some Docker documentation and watched some tutorials, but most of them use Ubuntu as the example for running images. Can I ask how to run and install the docker image that @kpine provided?
>
> I apologize if this is such a noob question but I can't seem to build pyarrow in my RPi. Thank you for your time and understanding.

Mini Tutorial:

So I assume we are starting off with a blank raspbian image (Raspbian Buster Lite) on a Pi4 with a coral stick attached. Then just follow one of the various existing tutorials for installing docker and docker-compose on a Pi4 (e.g. https://dev.to/rohansawant/installing-docker-and-docker-compose-on-the-raspberry-pi-in-5-simple-steps-3mgl)

Now for the dry dock for the frigate.

Note: we're using @kpine's modified version for the Pi4 with the latest updates from @blakeblackshear. It's still crashing after a while, so we'll add a cronjob to restart it frequently.

1) Create directories

mkdir /code
mkdir /code/02-Frigate
mkdir /code/02-Frigate/config

2) Create config files

cd /code/02-Frigate
/code/02-Frigate# nano docker-compose.yml

paste the following into this file for the configuration of the docker image:

frigate:
    container_name: frigate
    restart: unless-stopped
    privileged: true
    shm_size: '2g' # should work for 5-7 cameras
    image: kpine/frigate-raspberrypi:latest
    volumes:
      - /dev/bus/usb:/dev/bus/usb
      - /etc/localtime:/etc/localtime:ro
      - /code/02-Frigate/config:/config
    ports:
      - "5000:5000"

Config file for the frigate. You will need to adapt this to your own needs, e.g. URLs to the cam streams etc.

/code/02-Frigate/config# nano config.yml  # start from config.example.yml

Now we pull the image and do a first start

/code/02-Frigate# docker-compose up

This will take a while but once everything is up you should see lines like:

Starting frigate ... done
Attaching to frigate
frigate | On connect called

ctrl-c to stop as we want to run this in the background.

/code/02-Frigate# docker start frigate

This should fire up the frigate in the background. Now as mentioned we'll add a cronjob to restart it due to the current bug.

/code/02-Frigate# crontab -e

Note that the first time you run this it will ask you which editor you prefer. I like nano.

add the following line, which restarts frigate every hour.

0 */1 * * * docker restart frigate

This should do it. Configure Home Assistant as @blakeblackshear described it here.

Fezile01 commented 4 years ago

Hi @Buzztiger

Thank you for the mini tutorial.

I am using a Raspberry 3B

When following the tutorial, is pyarrow-0.16.0-cp37-cp37m-linux_armv7l.whl already included in the kpine/frigate-raspberrypi:latest (kpine/frigate-raspberrypi:0.5.1-rc4)?

Buzztiger commented 4 years ago

@Fezile01 yes it's already included afaik. At least on my Pi4 this runs out of the box without additional steps.

Buzztiger commented 4 years ago

@kpine, @blakeblackshear just to confirm that the recent changes have solved the crash issues for me too. My Pi4 + coral is running stable now for several days. Thank you for all your efforts.

andreasfrosig commented 4 years ago

@Buzztiger thanks for the guide! Are you using multiple cameras? My setup seems to have an issue with distinguishing between the cameras for the snapshots and /{camera} feeds. They are jumping (seemingly randomly) between the actual camera rtsp feeds as soon as some object is detected.

Buzztiger commented 4 years ago

Yes, I saw your issue. I have currently two cams connected and no issues with swapped channels. I'll hook up the rest of the cams this weekend (4 in total then) and check if I can replicate your problem.

Ubique88 commented 4 years ago

Thanks @Buzztiger for the mini guide. I have run into an issue when I try to compose up.

My docker-compose.yml looks like:

version: "3.8"

services:
  frigate:
    container_name: frigate
    restart: unless-stopped
    privileged: true
    shm_size: '512m' # should work for 5-7 cameras
    image: kpine/frigate-raspberrypi:latest
    ports:
      - "5000:5000"
    volumes:
      - /dev/bus/usb:/dev/bus/usb
      - /etc/localtime:/etc/localtime:ro
      - /code/02-Frigate/config:/config

I am not sure if I am missing anything in the config.yml?

But I am getting the error

pi@Pi4:/code/02-Frigate $ docker-compose up
Creating frigate ... done
Attaching to frigate
frigate    | Traceback (most recent call last):
frigate    |   File "detect_objects.py", line 345, in <module>
frigate    |     main()
frigate    |   File "detect_objects.py", line 150, in main
frigate    |     client.connect(MQTT_HOST, MQTT_PORT, 60)
frigate    |   File "/usr/lib/python3/dist-packages/paho/mqtt/client.py", line 839, in connect
frigate    |     return self.reconnect()
frigate    |   File "/usr/lib/python3/dist-packages/paho/mqtt/client.py", line 962, in reconnect
frigate    |     sock = socket.create_connection((self._host, self._port), source_address=(self._bind_address, 0))
frigate    |   File "/usr/lib/python3.7/socket.py", line 707, in create_connection
frigate    |     for res in getaddrinfo(host, port, 0, SOCK_STREAM):
frigate    |   File "/usr/lib/python3.7/socket.py", line 748, in getaddrinfo
frigate    |     for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
frigate    | socket.gaierror: [Errno -2] Name or service not known
frigate exited with code 1
Buzztiger commented 4 years ago

That looks like some mqtt connection issue. Can you post your config file as well?

Ubique88 commented 4 years ago

Sorry for the delay, we had a rare sunny weekend here in the UK so I sat in the garden drinking beer. Here is my config file; as I said, I am not sure if I just have this wrong or misunderstood the example one:

web_port: 5000

mqtt:
  host: mqtt.server.com
  topic_prefix: frigate
  # client_id: frigate # Optional -- set to override default client id of 'frigate' if running multiple instances
  # user: username # Optional
  #################
  ## Environment variables that begin with 'FRIGATE_' may be referenced in {}.
  ##   password: '{FRIGATE_MQTT_PASSWORD}'
  #################
  # password: password # Optional

#################
# Default ffmpeg args. Optional and can be overwritten per camera.
# Should work with most RTSP cameras that send h264 video
# Built from the properties below with:
# "ffmpeg" + global_args + input_args + "-i" + input + output_args
#################
# ffmpeg:
#   global_args:
#     - -hide_banner
#     - -loglevel
#     - panic
#   hwaccel_args: []
#   input_args:
#     - -avoid_negative_ts
#     - make_zero
#     - -fflags
#     - nobuffer
#     - -flags
#     - low_delay
#     - -strict
#     - experimental
#     - -fflags
#     - +genpts+discardcorrupt
#     - -vsync
#     - drop
#     - -rtsp_transport
#     - tcp
#     - -stimeout
#     - '5000000'
#     - -use_wallclock_as_timestamps
#     - '1'
#   output_args:
#     - -f
#     - rawvideo
#     - -pix_fmt
#     - rgb24

####################
# Global object configuration. Applies to all cameras
# unless overridden at the camera levels.
# Keys must be valid labels. By default, the model uses coco (https://dl.google.com/coral/canned_models/coco_labels.txt).
# All labels from the model are reported over MQTT. These values are used to filter out false positives.
# min_area (optional): minimum width*height of the bounding box for the detected person
# max_area (optional): maximum width*height of the bounding box for the detected person
# threshold (optional): The minimum decimal percentage (50% hit = 0.5) for the confidence from tensorflow
####################
objects:
  track:
    - person
    - car
    - truck
    - cat
    - dog
    - bird
  filters:
    person:
      min_area: 5000
      max_area: 100000
      threshold: 0.5

cameras:
  back:
    ffmpeg:
      ################
      # Source passed to ffmpeg after the -i parameter. Supports anything compatible with OpenCV and FFmpeg.
      # Environment variables that begin with 'FRIGATE_' may be referenced in {}
      ################
      input: rtsp://admin:mypassword@192.168.121.2:554/cam/realmonitor?channel=2&subtype=0
      #################
      # These values will override default values for just this camera
      #################
      # global_args: []
      # hwaccel_args: []
      # input_args: []
      # output_args: []

    ################
    ## Optionally specify the resolution of the video feed. Frigate will try to auto detect if not specified
    ################
    # height: 1280
    # width: 720

    ################
    ## Optional mask. Must be the same aspect ratio as your video feed.
    ## 
    ## The mask works by looking at the bottom center of the bounding box for the detected
    ## person in the image. If that pixel in the mask is a black pixel, it ignores it as a
    ## false positive. In my mask, the grass and driveway visible from my backdoor camera 
    ## are white. The garage doors, sky, and trees (anywhere it would be impossible for a 
    ## person to stand) are black.
    ## 
    ## Masked areas are also ignored for motion detection.
    ################
    # mask: back-mask.bmp

    ################
    # Allows you to limit the framerate within frigate for cameras that do not support
    # custom framerates. A value of 1 tells frigate to look at every frame, 2 every 2nd frame, 
    # 3 every 3rd frame, etc.
    ################
    take_frame: 1

    ################
    # The expected framerate for the camera. Frigate will try and ensure it maintains this framerate
    # by dropping frames as necessary. Setting this lower than the actual framerate will allow frigate
    # to process every frame at the expense of realtime processing.
    ################
    fps: 5

    ################
    # Configuration for the snapshots in the debug view and mqtt
    ################
    snapshots:
      show_timestamp: True

    ################
    # Camera level object config. This config is merged with the global config above.
    ################
    objects:
      track:
        - person
      filters:
        person:
          min_area: 5000
          max_area: 100000
          threshold: 0.5
Buzztiger commented 4 years ago

So your MQTT server address mqtt.server.com does not exist. I think that's the default value from the template. You need to fill in the address of your own MQTT server.

Ubique88 commented 4 years ago

Sorry, that was an old config (I have too many versions going on, need to tidy it up). This is the error:

pi@Pi4:/code/02-Frigate $ docker-compose up
Creating frigate ... done
Attaching to frigate
frigate    | Traceback (most recent call last):
frigate    |   File "detect_objects.py", line 26, in <module>
frigate    |     CONFIG = yaml.safe_load(f)
frigate    |   File "/usr/lib/python3/dist-packages/yaml/__init__.py", line 94, in safe_load
frigate    |     return load(stream, SafeLoader)
frigate    |   File "/usr/lib/python3/dist-packages/yaml/__init__.py", line 72, in load
frigate    |     return loader.get_single_data()
frigate    |   File "/usr/lib/python3/dist-packages/yaml/constructor.py", line 35, in get_single_data
frigate    |     node = self.get_single_node()
frigate    |   File "/usr/lib/python3/dist-packages/yaml/composer.py", line 36, in get_single_node
frigate    |     document = self.compose_document()
frigate    |   File "/usr/lib/python3/dist-packages/yaml/composer.py", line 55, in compose_document
frigate    |     node = self.compose_node(None, None)
frigate    |   File "/usr/lib/python3/dist-packages/yaml/composer.py", line 84, in compose_node
frigate    |     node = self.compose_mapping_node(anchor)
frigate    |   File "/usr/lib/python3/dist-packages/yaml/composer.py", line 133, in compose_mapping_node
frigate    |     item_value = self.compose_node(node, item_key)
frigate    |   File "/usr/lib/python3/dist-packages/yaml/composer.py", line 84, in compose_node
frigate    |     node = self.compose_mapping_node(anchor)
frigate    |   File "/usr/lib/python3/dist-packages/yaml/composer.py", line 127, in compose_mapping_node
frigate    |     while not self.check_event(MappingEndEvent):
frigate    |   File "/usr/lib/python3/dist-packages/yaml/parser.py", line 98, in check_event
frigate    |     self.current_event = self.state()
frigate    |   File "/usr/lib/python3/dist-packages/yaml/parser.py", line 439, in parse_block_mapping_key
frigate    |     "expected <block end>, but found %r" % token.id, token.start_mark)
frigate    | yaml.parser.ParserError: while parsing a block mapping
frigate    |   in "/config/config.yml", line 4, column 3
frigate    | expected <block end>, but found '<block mapping start>'
frigate    |   in "/config/config.yml", line 7, column 4
frigate exited with code 1

this is the config

web_port: 5000

mqtt:
  host: 127.0.0.1
  topic_prefix: frigate
  # client_id: frigate # Optional -- set to override default client id of 'frigate' if running multiple instances
   user: *username
  #################
  ## Environment variables that begin with 'FRIGATE_' may be referenced in {}.
  ##   password: '{FRIGATE_MQTT_PASSWORD}'
  #################
   password: *password

#################
# Default ffmpeg args. Optional and can be overwritten per camera.
# Should work with most RTSP cameras that send h264 video
# Built from the properties below with:
# "ffmpeg" + global_args + input_args + "-i" + input + output_args
#################
# ffmpeg:
#   global_args:
#     - -hide_banner
#     - -loglevel
#     - panic
#   hwaccel_args: []
#   input_args:
#     - -avoid_negative_ts
#     - make_zero
#     - -fflags
#     - nobuffer
#     - -flags
#     - low_delay
#     - -strict
#     - experimental
#     - -fflags
#     - +genpts+discardcorrupt
#     - -vsync
#     - drop
#     - -rtsp_transport
#     - tcp
#     - -stimeout
#     - '5000000'
#     - -use_wallclock_as_timestamps
#     - '1'
#   output_args:
#     - -f
#     - rawvideo
#     - -pix_fmt
#     - rgb24

####################
# Global object configuration. Applies to all cameras
# unless overridden at the camera levels.
# Keys must be valid labels. By default, the model uses coco (https://dl.google.com/coral/canned_models/coco_labels.txt).
# All labels from the model are reported over MQTT. These values are used to filter out false positives.
# min_area (optional): minimum width*height of the bounding box for the detected person
# max_area (optional): maximum width*height of the bounding box for the detected person
# threshold (optional): The minimum decimal percentage (50% hit = 0.5) for the confidence from tensorflow
####################
objects:
  track:
    - person
    - car
    - truck
    - cat
    - dog
    - bird
  filters:
    person:
      min_area: 5000
      max_area: 100000
      threshold: 0.5

cameras:
  back:
    ffmpeg:
      ################
      # Source passed to ffmpeg after the -i parameter. Supports anything compatible with OpenCV and FFmpeg.
      # Environment variables that begin with 'FRIGATE_' may be referenced in {}
      ################
      input: rtsp://admin:*password@192.168.121.2:554/cam/realmonitor?channel=2&subtype=0
      #################
      # These values will override default values for just this camera
      #################
      # global_args: []
      # hwaccel_args: []
      # input_args: []
      # output_args: []

    ################
    ## Optionally specify the resolution of the video feed. Frigate will try to auto detect if not specified
    ################
    # height: 1280
    # width: 720

    ################
    ## Optional mask. Must be the same aspect ratio as your video feed.
    ## 
    ## The mask works by looking at the bottom center of the bounding box for the detected
    ## person in the image. If that pixel in the mask is a black pixel, it ignores it as a
    ## false positive. In my mask, the grass and driveway visible from my backdoor camera 
    ## are white. The garage doors, sky, and trees (anywhere it would be impossible for a 
    ## person to stand) are black.
    ## 
    ## Masked areas are also ignored for motion detection.
    ################
    # mask: back-mask.bmp

    ################
    # Allows you to limit the framerate within frigate for cameras that do not support
    # custom framerates. A value of 1 tells frigate to look at every frame, 2 every 2nd frame, 
    # 3 every 3rd frame, etc.
    ################
    take_frame: 1

    ################
    # The expected framerate for the camera. Frigate will try and ensure it maintains this framerate
    # by dropping frames as necessary. Setting this lower than the actual framerate will allow frigate
    # to process every frame at the expense of realtime processing.
    ################
    fps: 5

    ################
    # Configuration for the snapshots in the debug view and mqtt
    ################
    snapshots:
      show_timestamp: True

    ################
    # Camera level object config. This config is merged with the global config above.
    ################
    objects:
      track:
        - person
      filters:
        person:
          min_area: 5000
          max_area: 100000
          threshold: 0.5
Buzztiger commented 4 years ago

You have a wrongly formatted config file. Let me explain:

If you read through your error message carefully you can spot the following lines:

frigate | "expected <block end>, but found %r" % token.id, token.start_mark)
frigate | yaml.parser.ParserError: while parsing a block mapping
frigate |   in "/config/config.yml", line 4, column 3
frigate | expected <block end>, but found '<block mapping start>'
frigate |   in "/config/config.yml", line 7, column 4

The Python script complains that it cannot parse your config file (YAML), in particular line 4, column 3 and line 7, column 4. Check the indentation of those lines. I think at least the mqtt user line is indented one space too far to the right.
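For reference, a correctly indented mqtt section would look like the following (the credentials are placeholders; the original post redacted them with asterisks, and the env-var password form comes from the commented defaults in the config):

```yaml
mqtt:
  host: 127.0.0.1
  topic_prefix: frigate
  # 'user' and 'password' must line up with the other keys (two spaces);
  # the extra leading space in the pasted config is what breaks the block mapping
  user: username
  password: '{FRIGATE_MQTT_PASSWORD}'
```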

Ubique88 commented 4 years ago

What an amateur, I completely missed that.

Thank you for all your help, it is up and working great now.

I only need to work out why the stream framerate is so low (about 7 fps), must be an internal network bottleneck (I'm looking at you, powerline).

Thanks again.

Hogster commented 4 years ago

Hi,

Thanks @kpine for the great work making it work on a Pi 4 and @Buzztiger for the great tutorial.

I have it working on my new Pi with a Coral USB and currently have one camera for testing purposes and will add more in the near future. Currently I have an average "frigate_coral_inference" around 30 ms.

It works, however, I'm confused as to what the optimal settings for the Pi would be. It would be great if we could share experiences with different camera/frigate settings, and if someone could explain how the different settings such as resolution, fps, take_frame etc. impact accuracy and speed.

With the camera substream at a resolution of 740x480 and 5 fps, the system seems fast, however it has trouble identifying objects that are a bit further away.

With the highest resolution, 3840x2160 at 30 fps, the system does not work at all. I used the default config.yml with take_frame: 5, and when I look at the live feed of myself I become "invisible" and the system does not react at all.

Currently I'm using 1080p at 10 fps on both the camera and the config, with a take_frame of 10, which seems relatively stable and fast. However, I would like to be able to use the max resolution, since I would like to watch the camera feed at max resolution.

blakeblackshear commented 4 years ago

With the highest resolution 3840x2160 and 30 fps the system does not work at all.

I wouldn't expect the Pi to be able to handle a 4k 30fps video feed. It just doesn't have the processing power for that.

However I would like to be able to use max resolution since I would like to watch the camera feed on max resolution.

The debug endpoint for viewing the video feed from frigate is not supposed to be a video feed you watch constantly. It is for debugging your configuration. It takes a good bit of processing power to re-encode the frames and send them to the browser. All that processing time is using cycles that frigate needs to process the video frames and will slow everything else down. Use something else to view the 4k stream.

someone could explain what the different settings such as resolution, fps, take_frame etc. impacts accuracy and speed

resolution: Ideally, the resolution should be set so that the smallest person you would want to detect is about 300px tall. The tensorflow models are trained on 300x300 pixel images, so frigate is going to resize every area it looks at to that size anyway. If you are analyzing a video feed with too high of a resolution, you will see no improvement in accuracy and a decrease in speed in exchange for the extra CPU it takes to process the extra pixels.

fps: Keep in mind that this is a setting on the camera. The fps config setting was removed in 0.5.1. When frigate is running, keep an eye on the skipped_fps for the camera. For every skipped frame, your CPU is wasting time decoding frames that frigate is throwing away. You should lower the fps on your camera to avoid skipped frames. A higher fps will result in better object tracking because the objects have moved a smaller distance between frames. Too high of an fps will result in wasted time and slower speeds. I run 5 fps on my 8 1080p cameras with an Intel NUC.

take_frame: This is a way to tell frigate to skip frames for cameras that cannot adjust fps. If you have it set to 10 and your camera is 10 fps, it will grab every 10th frame, lowering your effective rate to 1 fps.
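The take_frame behavior described above can be sketched in a few lines of Python (a simplified illustration, not frigate's actual code; the names are made up):

```python
# Illustrative sketch of take_frame: keep only every Nth decoded frame,
# which divides the effective processing rate by N.
def effective_fps(camera_fps, take_frame):
    """Frames per second frigate actually processes."""
    return camera_fps / take_frame

frames = range(100)  # stand-in for 100 decoded frames
take_frame = 10
kept = [f for f in frames if f % take_frame == 0]

print(len(kept))                      # 10 of the 100 frames are kept
print(effective_fps(10, take_frame))  # a 10 fps camera drops to 1.0 fps processed
```

Note that the skipped frames are still decoded by ffmpeg, which is why lowering the fps on the camera itself is preferred when possible.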

Hogster commented 4 years ago

Thanks for your reply @blakeblackshear!

The debug endpoint for viewing the video feed from frigate is not supposed to be a video feed you watch constantly. It is for debugging your configuration.

My problem is that the max resolution for my camera's substream is 740x480, which is too low, so I can't use that for Frigate. And when Frigate detects a person while, e.g., the alarm is on, I want to start the stream on my Fire TV (the camera stream, not the frigate debug endpoint), and it would be nice to use the 4K resolution. I guess I need to find a camera with a more adjustable substream.

Ideally, the resolution should be set so that the smallest person you would want to detect is about 300px tall.

OK, so if I understand correctly, if I use 1080p the smallest person (optimally) should be roughly a third of the frame height (300/1080 ≈ 28%)?
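As a quick sanity check on that arithmetic (illustrative only; the 300px figure is blakeblackshear's rule of thumb for the smallest person you would want to detect):

```python
# Fraction of the frame height a 300px-tall person occupies at common resolutions.
MODEL_INPUT = 300  # the SSD models are trained on 300x300 inputs

for height in (480, 720, 1080, 2160):
    frac = MODEL_INPUT / height
    print(f"{height}p: person should fill ~{frac:.0%} of the frame height")
```

At 1080p that works out to about 28% of the frame height, and at 4K only about 14%, which is part of why higher resolutions mostly add decoding cost rather than accuracy.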

Again, thanks for your reply!

Would still be great to hear from others about what has been working well...

blakeblackshear commented 4 years ago

You could try resizing the 4k stream with ffmpeg with additional output parameters, but it still might put too much load on the pi. I think I saw that someone got hardware acceleration working on the pi for decoding with ffmpeg (not sure where). That would help reduce CPU usage.
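A hypothetical per-camera override along those lines, using the output_args format from the commented defaults earlier in this thread (the 1280x720 scale target is an arbitrary example, and this may still be too heavy for a Pi without hardware decoding):

```yaml
cameras:
  back:
    ffmpeg:
      input: rtsp://...
      # Override only the output args: keep the default rawvideo/rgb24 output,
      # but downscale the 4K stream before frigate analyzes it.
      output_args:
        - -vf
        - scale=1280:720
        - -f
        - rawvideo
        - -pix_fmt
        - rgb24
```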

Ubique88 commented 4 years ago

My Pi 4 struggles at 1080p; it starts to drop frames and lag. I have mine running at about 15 fps with no dropped frames at 720p. Detection seems to work best at that and with a bitrate of 2048. Any lower and it will miss people on the other side of the road. With these settings the CPU runs at about 70%, so 4K will be a push unless you can get hardware acceleration working.

scstraus commented 4 years ago

Would still be great to hear from others about what has been working well...

I'm not on a Pi so I have more processor to play with, but I've found 1080p 5fps to be the sweet spot where people get the resolution they need (though there are cases where it still goes under the 300 pixel size to detect something, it doesn't seem to have a very big effect on the recognition quality). I think 4k is probably overkill generally. If I wanted to optimize my CPU usage further, I'd reduce frame rate as even something like 2fps is enough to recognize people in just about any real world case. Heck, my first try at tensorflow only managed 1 frame every 10 seconds on the Pi CPU and I still thought that was pretty cool ;-)

So, depends a bit on your goals, but I'm happy with 1080p 5 fps.

Hogster commented 4 years ago

I think I saw that someone got hardware acceleration working on the pi for decoding with ffmpeg

@blakeblackshear, I'll have a look at that. I'm a hobbyist (aka noob) trying to get my mind around all this, but if you don't try...

@Ubique88 I tried 720p and it seems to run much smoother, however I run at 5 fps since it seems to be enough; as @scstraus mentioned, probably 2 fps would be enough. My nice 4K stream for Fire TV is ruined anyway :)

Hogster commented 4 years ago

@Buzztiger (or anyone else who can help) I don't want to sound too stupid, but how would I go about updating to the latest release when it was installed through your excellent guide? I have never used Docker before.

@blakeblackshear is there a way to train the model with e.g. a robotic lawn mower or anything other that one would find interesting to track?

mattheys commented 4 years ago

@kpine Hi, I tried to use your image to build an aarch64 version of pyarrow by doing docker buildx build -f Dockerfile.rpi-pyarrow --platform linux/arm64 . but, because you are using FROM balenalib/raspberrypi3-debian:buster-build as builder, it's still building an armv7l version. I checked this by running bash in that image and doing a uname -a, which gave: Linux 3fe99e56a34c 4.19.76-linuxkit #1 SMP Tue May 26 11:42:35 UTC 2020 armv7l GNU/Linux

Can you recommend an image that I can use to build pyarrow? Also, is there a reason to use only 1 core to build with? I'm using Docker Desktop on Windows on a Xeon workstation, so it would be good to build this faster if I can.

ARG MAKE_JOBS=1
ENV CMAKE_BUILD_PARALLEL_LEVEL=${MAKE_JOBS}

Ideally I would like to build a multiarch image that can be used for multiple architectures.

kpine commented 4 years ago

For aarch64 I would just use the regular Ubuntu or Debian base images. For PyArrow I would try to install with conda-forge first, since they already build aarch64.

Also is there a reason to only use 1 core to build with?

No... I defaulted it to 1 because that works best for me. If you want more then set the MAKE_JOBS build arg to whatever you want (that's the point after all), or edit the Dockerfile since you'll be doing that anyways.
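A hypothetical invocation combining the two points above (the Dockerfile name and platform flag come from mattheys' earlier command; MAKE_JOBS is the build arg shown in the Dockerfile snippet):

```shell
# Build with as many parallel jobs as the host has cores.
# MAKE_JOBS feeds CMAKE_BUILD_PARALLEL_LEVEL inside the Dockerfile.
docker buildx build \
  -f Dockerfile.rpi-pyarrow \
  --platform linux/arm64 \
  --build-arg MAKE_JOBS="$(nproc)" \
  .
```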

kpine commented 4 years ago

Looks like Balena also has RPi4 aarch64 base images now, so you should be able to just swap that in if you want.

mattheys commented 4 years ago

Thanks for the help. I gave up trying to compile for aarch64 and just ran your armv7l image instead; it seems to work OK. Inference speed is 30 ms, which might not be the best, but I only have 2 cameras at the moment.

After pyarrow, which I managed to build, I needed to compile numpy, which I did, then scipy, which got complicated and slow, so I gave up there. I also couldn't find an anaconda client for aarch64; all seemed to be x86 and x86_64.

kpine commented 4 years ago

Sounds like you'd have to use miniforge, which looks to be a version of miniconda that supports aarch64 and is specific for conda-forge.

I'm not sure what errors you ran into, but the Dockerfiles already install the distribution versions of numpy and scipy. Those already support aarch64, so I wouldn't expect compiling them to be necessary. Conda-forge also has those available for install if the distribution versions are too old.

On my RPi4 armhf install I'm getting inference times of about 22 ms, for 1 camera (not sure if the number of cameras matters?).

If I ever get around to updating my RPi4 to aarch64 I might revisit this.

masantiago commented 4 years ago

@blakeblackshear, considering the Coral with a Raspberry Pi, I am wondering if it would be more efficient to delegate the whole detection to the Coral instead of doing motion detection and region merging with object tracking on the CPU. Using an SSD model such as MobileNetV2, or YOLO, it would probably save the Raspberry Pi work while still being accurate at a good fps. What do you think? In future versions, which detection steps to compute could be made configurable. It is only an idea, though; I have not tested the bottlenecks yet, and you may have a better experimental opinion that makes the idea moot.