Blue Candle is a streamlined, Windows-compatible object detection service designed for integration with Blue Iris and Agent DVR. At its core, Blue Candle leverages the power and simplicity of YOLOv8 models, ensuring efficient and accurate detection. The service is dockerless, making it easy to deploy in a variety of environments.
Written in Rust, Blue Candle promises high performance and reliability. It uses Axum for its web framework, ensuring a robust and scalable web interface. For machine learning operations, Candle is employed as the backend.
The YOLOv8 model implementation in Blue Candle is based on the examples found in Candle, which are in turn based on the tinygrad implementation.
Our goal with Blue Candle is to provide an accessible, user-friendly, and efficient object detection solution that can seamlessly integrate with your existing home automation setup.
1. Download the Release: get the latest release from the project's GitHub releases page.
2. Check CUDA Compatibility: verify whether your machine has a CUDA-capable GPU.
3. Choose the Correct Release: pick the CPU or CUDA build that matches your system.
4. Open Command Prompt: navigate to the directory containing the downloaded executable.
Blue Candle help:
Run the following command to get started with Blue Candle:
```shell
blue_candle --help
```
This command will display the usage and options for Blue Candle.
Simple Blue Candle start:
Run the following command to start Blue Candle:
```shell
blue_candle
```
This command will start Blue Candle on the default port 32168, and you should see output similar to the following:
```
20XX-XX-03T21:33:24.318128Z INFO blue_candle::detector: Detector is initialized for GPU
20XX-XX-03T21:33:24.608722Z INFO blue_candle::detector: Test detection
20XX-XX-03T21:33:25.090863Z INFO blue_candle: Server inference startup test, processing time: 481.8493ms, inference time: 119.6684ms
20XX-XX-03T21:33:25.091261Z INFO blue_candle: Starting server, listening on 0.0.0.0:32168
20XX-XX-03T21:34:36.102087Z INFO blue_candle: Request time 369.0577ms, processing time: 362.837ms, inference time: 80.253ms
```
- Labels Filter: `--labels "label1 label2"` to filter results to include only specified labels.
- Port Selection: `--port [PORT_NUMBER]` to set the port for HTTP requests. Default is 32168.
- CPU Mode: `--cpu` to force the application to use the CPU instead of the GPU.
- Confidence Threshold: `--confidence-threshold [VALUE]` to set the confidence threshold for model predictions. Default is 0.25.
- Non-Maximum Suppression Threshold: `--nms-threshold [VALUE]` to set the NMS threshold. Default is 0.45.
- Legend Size in Images: `--legend-size [SIZE]` to set the font size of the legend in saved images. Set to 0 to disable.
- Model Path: `--model-path "./models"` to specify the directory to download other YOLOv8 models to.
- Model Selection: `--model "/path/to/model.safetensors"` to specify the path to the model weights file.
- Image Path: `--image-path "/path/to/save/images"` to specify where to save processed images.
- Test Image: `--image "/path/to/test.jpg"` to run object detection on a specific image.

Example:

```shell
blue_candle --port 1337 --labels "person car" --model "/path/to/model.safetensors" --image "/path/to/test.jpg"
```
This command runs Blue Candle on port 1337, filtering results to include only "person" and "car" labels, using a specified model, and processing a specific test image.
Remember to replace the file paths and options in the example commands with those that are applicable to your setup.
The following command was executed to perform a test object detection:
```shell
blue_candle --test
```
Output from the command:
```
20XX-XX-03T21:43:43.483778Z INFO blue_candle::detector: Detector is initialized for GPU
20XX-XX-03T21:43:43.785153Z INFO blue_candle::detector: Test detection
20XX-XX-03T21:43:45.255355Z INFO blue_candle: Tested image in 1.4702015s, processing time: 479.6861ms, inference time: 117.6626ms
```
The image below is the result of the object detection test:
To build from source, check that Rust is installed, then clone the repository:

```shell
rustc --version
git clone https://github.com/xnorpx/blue-candle
cd blue-candle
```

- `cargo run --release -- --help` to list help
- `cargo run --release -- --test` to run the self test
- `cargo run --release` to run the server
- `cargo run --release --features cuda -- --test` to run the self test with CUDA support
The test client is a tool designed for benchmarking and testing the object detection service of Blue Candle. Before using it, you need to build the client, and then you can run it with various options.
Build the test client with:

```shell
cargo build --bin test_ob_service
```

This compiles the test client. The resulting binary is `test_ob_service.exe` (or `test_ob_service` on non-Windows) in the `target/debug` directory. Run `test_ob_service --help` to view usage and options.

Options:

- `-o` or `--origin [URL]` (default: `http://127.0.0.1:32168`)
- `--min-confidence [VALUE]` (default: 0.6)
- `-i` or `--image [IMAGE_PATH]` for optional image input
- `--image-ob [IMAGE_PATH]` to save the image with bounding boxes
- `-n` or `--number-of-requests [NUMBER]` (default: 1)
- `-i` or `--interval [MILLISECONDS]` (default: 1000)
- `-h` or `--help`, `-V` or `--version`

Example:

```shell
test_ob_service -o http://127.0.0.1:32168 --min-confidence 0.7 --image "/path/to/test_image.jpg" --number-of-requests 5 --interval 500
```
This command sets the service origin, minimum confidence level, test image path, number of requests, and interval between requests.
This project is dual-licensed under both the Apache License 2.0 and the MIT License, offering flexibility and broad compatibility with various licensing requirements for most of the code. However, specific components related to the YOLOv8 model are licensed under the GNU Affero General Public License (AGPL) due to their dependency on Ultralytics YOLOv8, which is AGPL-licensed.
By using, modifying, or distributing any part of this project, you agree to comply with the terms of the respective licenses for each component.
Logo generated by OpenAI's DALL-E, an AI image generation tool.