Feniks consists of six services:
Runs natively on Linux.
Uses FFmpeg to handle video, audio, snapshot, probing and streaming.
Uses Redis as the main NoSQL database and message broker.
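A minimal sketch of the message-broker side: services serialize events to JSON and publish them on a Redis channel. The channel name, event fields, and function names here are assumptions for illustration, not Feniks's actual schema; `client` is anything with a redis-py compatible `publish()`.

```python
import json

# Illustrative channel name, not Feniks's actual channel.
EVENT_CHANNEL = "events"

def encode_event(camera_id: str, kind: str) -> str:
    # Events are serialized to JSON before PUBLISH.
    return json.dumps({"camera": camera_id, "kind": kind})

def publish_event(client, camera_id: str, kind: str) -> None:
    # `client` can be a redis.Redis instance; any subscriber on
    # EVENT_CHANNEL receives the message immediately.
    client.publish(EVENT_CHANNEL, encode_event(camera_id, kind))
```

Decoupling producers (detectors, recorders) from consumers (UI, notifiers) through pub/sub is what lets the node services run and restart independently.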
MongoDB or SQLite is used as the AI events and query database.
The web application supports multiple users and node servers.
The UI app was developed with Vue 3 & Quasar. It uses gridstack.js to provide highly customizable (resizable, draggable) video players.
It has a built-in watchdog mechanism that monitors all processes and recovers them automatically.
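The core of such a watchdog can be sketched in a few lines: poll each child process and restart any that has exited. This is a simplified illustration under assumed names (`is_alive`, `watch_once`), not Feniks's actual watchdog.

```python
import subprocess
import sys

def is_alive(proc: subprocess.Popen) -> bool:
    # poll() returns None while the child process is still running.
    return proc.poll() is None

def watch_once(procs: dict[str, subprocess.Popen],
               cmds: dict[str, list[str]]) -> list[str]:
    # One watchdog pass: restart every service whose process has died
    # and return the names of the restarted services.
    restarted = []
    for name, proc in procs.items():
        if not is_alive(proc):
            procs[name] = subprocess.Popen(cmds[name])
            restarted.append(name)
    return restarted
```

A real watchdog would call `watch_once` on a timer, add back-off so a crash-looping service does not spin, and log every restart.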
All running node services can be viewed in the UI app, and all services can be started or stopped from the services page.
The services are tested on an x86 workstation, a Dell Intel x86 laptop, a Raspberry Pi (ARM64) and an Nvidia Jetson Nano.
It supports multiple storage locations; any storage can be assigned to any camera.
Broken/failed connections are shown on the information page. If a stream fails more than once within a given time window, a notification is sent to the receivers (users) by the cloud provider service.
Supported stream types are shown below:

| Stream Type | Hardware Demand | Latency | Compatibility |
| --- | --- | --- | --- |
| FLV | * | * | **** |
| HLS | * | ***** | ***** |
| WebSockets | ***** | * | ***** |
| WebRTC | * | * | **** |
Supported video codecs for both streaming and recording are:
Supported audio codecs for both streaming and recording are:
Supported Video Container Formats:
Supported Media Servers:
All AI events (object detection, face recognition, plate recognition) can be queried by date, time, camera, label and score. All of these fields are indexed and saved as denormalized entities to provide the best read performance, even for big data.
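For the SQLite backend, the pattern above amounts to a flat (denormalized) event table with a covering index on the query fields. The table name, columns, and sample data below are illustrative assumptions, not Feniks's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One flat row per event: no joins needed at query time.
conn.execute("""
    CREATE TABLE ai_events (
        ts TEXT, camera TEXT, label TEXT, score REAL
    )""")
# Index the filter fields so queries never scan the whole table.
conn.execute("CREATE INDEX idx_events ON ai_events (camera, label, ts, score)")
conn.executemany(
    "INSERT INTO ai_events VALUES (?, ?, ?, ?)",
    [("2024-01-01T10:00", "cam1", "person", 0.91),
     ("2024-01-01T11:00", "cam1", "car", 0.55),
     ("2024-01-02T09:00", "cam2", "person", 0.97)])

# Query by camera, label and minimum score, ordered by time.
rows = conn.execute(
    "SELECT ts, score FROM ai_events "
    "WHERE camera = ? AND label = ? AND score >= ? ORDER BY ts",
    ("cam1", "person", 0.5)).fetchall()
```

Because every queryable field lives on the event row itself, reads stay fast as the event table grows; the same shape maps directly onto a MongoDB collection with a compound index.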
It can store detected objects in the cloud. Currently supported cloud providers are:
Feniks supports three different motion detection methods:
Re-streaming via a media server to reduce the number of connections to your camera.
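The idea behind re-streaming is that FFmpeg holds the camera's single RTSP connection and pushes it once to the media server, which then fans the stream out to all viewers. A minimal sketch of such a command, with assumed URLs and without transcoding:

```python
def build_restream_cmd(camera_url: str, server_url: str) -> list[str]:
    # Copy codecs (no re-encode) and push the camera's single RTSP
    # connection to the media server's ingest endpoint.
    return [
        "ffmpeg", "-rtsp_transport", "tcp",  # TCP avoids UDP packet loss
        "-i", camera_url,
        "-c", "copy",                        # no transcoding, minimal CPU
        "-f", "flv", server_url,             # e.g. an RTMP ingest URL
    ]

# Hypothetical usage; the URLs are placeholders, not real endpoints:
cmd = build_restream_cmd("rtsp://camera/stream", "rtmp://server/live/cam1")
```

With `-c copy` the hardware demand stays near zero, since the camera's H.264/H.265 bitstream is forwarded untouched.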