xnorpx / blue-candle

Object detection service for Blue Iris

New Install - Transitioning from CPAI to BC troubleshooting. #167

Open DeFlanko opened 2 weeks ago

DeFlanko commented 2 weeks ago

Same Issue as reported here: #161

I just downloaded it and attempted to run it without parameters.

2024-11-10T21:05:18.220716Z  INFO blue_candle: Starting Blue Candle object detection service
2024-11-10T21:05:18.220839Z  INFO blue_candle::system_info: CPU | GenuineIntel | 13th Gen Intel(R) Core(TM) i7-13700 | 16 Cores | 24 Logical Cores
2024-11-10T21:05:18.254264Z  INFO blue_candle::system_info: Cuda GPU: [0] | NVIDIA GeForce RTX 3070 | CC 8.6 | 5888 Cores | 4881/8192 MB
2024-11-10T21:05:18.255475Z  INFO blue_candle::detector: Detector is initialized for GPU
2024-11-10T21:05:18.447301Z  INFO blue_candle::detector: Test detection
2024-11-10T21:05:18.605267Z  INFO blue_candle: Server inference startup test, processing time: 157.8745ms, inference time: 72.9488ms
2024-11-10T21:05:18.605459Z  INFO blue_candle: Starting server, listening on 0.0.0.0:32168
Error: An attempt was made to access a socket in a way forbidden by its access permissions. (os error 10013)

Environment variables: (screenshot attached)

Note: CodeProject.AI (CPAI) is still running -- I was hoping to flip over to Blue Candle (BC) momentarily to test it out, then flip back as needed, for a smoother transition for Blue Iris (BI).

DeFlanko commented 2 weeks ago

Ahh, is this why? (screenshot attached)

DeFlanko commented 2 weeks ago
2024-11-10T21:19:52.228038Z  INFO blue_candle: Starting Blue Candle object detection service
2024-11-10T21:19:52.228196Z  INFO blue_candle::system_info: CPU | GenuineIntel | 13th Gen Intel(R) Core(TM) i7-13700 | 16 Cores | 24 Logical Cores
2024-11-10T21:19:52.265956Z  INFO blue_candle::system_info: Cuda GPU: [0] | NVIDIA GeForce RTX 3070 | CC 8.6 | 5888 Cores | 4947/8192 MB
2024-11-10T21:19:52.267272Z  INFO blue_candle::detector: Detector is initialized for GPU
2024-11-10T21:19:52.579108Z  INFO blue_candle::detector: Test detection
2024-11-10T21:19:52.733863Z  INFO blue_candle: Server inference startup test, processing time: 154.6567ms, inference time: 69.5835ms
2024-11-10T21:19:52.734054Z  INFO blue_candle: Starting server, listening on 0.0.0.0:32169

OK, so because CPAI and BC use the same port (32168) by default, there's a conflict...
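If you want to confirm a clash like this before launching, one option is to try binding the port yourself: if the bind fails, something else (here CPAI) already owns it. A minimal stdlib-only Python sketch, not part of Blue Candle:

```python
import socket

def port_is_free(port: int, host: str = "0.0.0.0") -> bool:
    """Return True if we can bind the TCP port, i.e. no other service owns it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
        except OSError:  # e.g. WinSock 10048 (in use) or 10013 (access forbidden)
            return False
        return True

if __name__ == "__main__":
    # Will report False while CPAI (or another BC instance) is listening on 32168
    print("32168 free?", port_is_free(32168))
```

Note that Windows can also refuse a bind with error 10013 when the port falls in an excluded port range, so a failed bind isn't always another application; `netstat -ano` will show whether a process actually holds the port.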

xnorpx commented 2 weeks ago

@DeFlanko Welcome to Blue Candle! Yes, if you are still running CPAI there will be a conflict; stop it, or change the port as you did. Maybe I should catch that error and print a helpful message about changing the port or stopping other applications that are using the same port.

DeFlanko commented 2 weeks ago

So I decided to stop CPAI for a moment, run BC on the default port 32168, and re-analyze some clips I had to see if it just "works"...

C:\Users\defla>"C:\Program Files\Blue Candle (Ai)\blue_candle-0.8.0-win-cuda-12-CC-86\blue_candle.exe" --port 32168
2024-11-10T21:31:26.900511Z  INFO blue_candle: Starting Blue Candle object detection service
2024-11-10T21:31:26.900670Z  INFO blue_candle::system_info: CPU | GenuineIntel | 13th Gen Intel(R) Core(TM) i7-13700 | 16 Cores | 24 Logical Cores
2024-11-10T21:31:26.938446Z  INFO blue_candle::system_info: Cuda GPU: [0] | NVIDIA GeForce RTX 3070 | CC 8.6 | 5888 Cores | 1052/8192 MB
2024-11-10T21:31:26.939806Z  INFO blue_candle::detector: Detector is initialized for GPU
2024-11-10T21:31:27.213466Z  INFO blue_candle::detector: Test detection
2024-11-10T21:31:27.369697Z  INFO blue_candle: Server inference startup test, processing time: 156.0953ms, inference time: 69.7755ms
2024-11-10T21:31:27.369854Z  INFO blue_candle: Starting server, listening on 0.0.0.0:32168
2024-11-10T21:32:27.997231Z  INFO blue_candle::server_stats: Stats: Total Req: 65, Max Req/Sec: 4, Min Inference : 4.8557ms, Max Inference: 11.884ms, Min Processing: 29.5169ms, Max Processing: 380.1632ms, Min Request: 47.9118ms, Max Request: 399.2897ms
2024-11-10T21:33:35.585710Z  INFO blue_candle::server_stats: Stats: Total Req: 155, Max Req/Sec: 5, Min Inference : 4.4041ms, Max Inference: 15.4212ms, Min Processing: 25.4712ms, Max Processing: 380.84ms, Min Request: 31.3857ms, Max Request: 399.2897ms
2024-11-10T21:34:40.948590Z  INFO blue_candle::server_stats: Stats: Total Req: 159, Max Req/Sec: 1, Min Inference : 4.4041ms, Max Inference: 15.4212ms, Min Processing: 25.4712ms, Max Processing: 380.84ms, Min Request: 31.3857ms, Max Request: 399.2897ms
2024-11-10T21:35:44.556195Z  INFO blue_candle::server_stats: Stats: Total Req: 163, Max Req/Sec: 1, Min Inference : 4.4041ms, Max Inference: 15.4212ms, Min Processing: 25.4712ms, Max Processing: 383.3255ms, Min Request: 31.3857ms, Max Request: 410.9926ms
2024-11-10T21:36:51.715765Z  INFO blue_candle::server_stats: Stats: Total Req: 165, Max Req/Sec: 1, Min Inference : 4.4041ms, Max Inference: 15.4212ms, Min Processing: 25.4712ms, Max Processing: 383.3255ms, Min Request: 31.3857ms, Max Request: 410.9926ms
2024-11-10T21:38:03.485893Z  INFO blue_candle::server_stats: Stats: Total Req: 199, Max Req/Sec: 4, Min Inference : 4.4041ms, Max Inference: 15.4212ms, Min Processing: 25.4712ms, Max Processing: 401.1368ms, Min Request: 31.3857ms, Max Request: 430.5489ms

I replayed some clips and noticed the same orange boxes in BI... Does this mean it's working?
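Another way to sanity-check the server outside Blue Iris is to post a test image to it directly, since Blue Candle emulates the CPAI detection API that BI talks to. A stdlib-only Python sketch; the `/v1/vision/detection` path and the `image` form field are assumed from the CPAI API, so check against BC's own docs:

```python
import json
import urllib.request
import uuid

def build_multipart(field: str, filename: str, data: bytes):
    """Build a minimal multipart/form-data body for a single file field."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).encode() + data + f"\r\n--{boundary}--\r\n".encode()
    return body, f"multipart/form-data; boundary={boundary}"

def detect(image_path: str, host: str = "127.0.0.1", port: int = 32168) -> dict:
    """POST an image to a CPAI-style detection endpoint and return the JSON reply."""
    with open(image_path, "rb") as f:
        body, content_type = build_multipart("image", "frame.jpg", f.read())
    req = urllib.request.Request(
        f"http://{host}:{port}/v1/vision/detection",
        data=body,
        headers={"Content-Type": content_type},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If the returned JSON lists predictions with labels and bounding boxes, the service is doing real work, independent of what BI draws on screen.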


If so... how (or can I even) use the models that CPAI was using?


DeFlanko commented 2 weeks ago

I assume not, for now...


DeFlanko commented 2 weeks ago

Can these be used? https://www.reddit.com/r/BlueIris/comments/1fm6kc7/yolov8_model_person_vehicle_and_animal/

https://github.com/nmbgeek/ipcam-yolo-models/

https://github.com/MikeLud/CodeProject.AI-Custom-IPcam-Models/tree/main/YOLOv8%20Models/Custom%20Models

xnorpx commented 1 week ago

I added a FAQ here:

https://github.com/xnorpx/blue-candle/discussions/168

There is a discussion there of how to use custom models, but I have yet to implement support for it:

https://github.com/xnorpx/blue-candle/issues/50