codeproject / CodeProject.AI-Server

CodeProject.AI Server is a self-contained service that software developers can include in, and distribute with, their applications to augment them with the power of AI.

Q&A: Is Google Coral working in Linux/Docker? #74

Closed Dvalin21 closed 2 months ago

Dvalin21 commented 8 months ago

I've seen that Google Coral is working in Windows 10 and Server, but I haven't seen it working in Linux/Docker. Also, is it possible to use a Google Coral for facial recognition as well, or just for detection? Thanks.

SeanECP commented 8 months ago

Google Coral should work just fine on the latest version 2.3.2 on Linux/Docker https://hub.docker.com/r/codeproject/ai-server/tags

ArlindoFNeto commented 8 months ago

Google Coral should work just fine on the latest version 2.3.2 on Linux/Docker https://hub.docker.com/r/codeproject/ai-server/tags

Why is the Docker version not the same as the CodeProject.AI-Server version? Is version 2.3.2 related only to the Docker container? The latest version of CodeProject.AI-Server is 2.2.4.

Dvalin21 commented 8 months ago

So I believe I have it working. Currently I have Coral up and running, but not before I got this error and then had to restart it.

11:36:30:objectdetection_coral_adapter.py: Edge TPU detected
11:36:30:ObjectDetection (Coral):  [RuntimeError] : Traceback (most recent call last):
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral_adapter.py", line 104, in do_detection
    result = do_detect(opts, img, score_threshold)
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral.py", line 191, in do_detect
    interpreter.invoke()
  File "/app/modules/ObjectDetectionCoral/bin/linux/python39/venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 941, in invoke
    self._interpreter.Invoke()
RuntimeError: Encountered an unresolved custom op. Did you miss a custom op or delegate?Node number 11 (EdgeTpuDelegateForCustomOp) failed to invoke.

rafaelmussi commented 8 months ago

Google Coral should work just fine on the latest version 2.3.2 on Linux/Docker https://hub.docker.com/r/codeproject/ai-server/tags

@SeanECP, I'm running version 2.3.2-Alpha using the latest tag, and I'm having the same issue as @Dvalin21.

13:59:39:objectdetection_coral_adapter.py: INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
13:59:39:objectdetection_coral_adapter.py: Edge TPU detected
13:59:39:objectdetection_coral_adapter.py: Timeout connecting to the server
14:01:16:ObjectDetection (Coral):  [RuntimeError] : Traceback (most recent call last):
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral_adapter.py", line 104, in do_detection
    result = do_detect(opts, img, score_threshold)
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral.py", line 191, in do_detect
    interpreter.invoke()
  File "/app/modules/ObjectDetectionCoral/bin/linux/python39/venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 941, in invoke
    self._interpreter.Invoke()
RuntimeError: Encountered an unresolved custom op. Did you miss a custom op or delegate?Node number 11 (EdgeTpuDelegateForCustomOp) failed to invoke.

Coral Module Info:

Module Path:   <root>/modules/ObjectDetectionCoral
AutoStart:     False
Queue:         objectdetection_queue
Platforms:     windows,linux,linux-arm64,macos,macos-arm64
GPU Libraries: installed if available
GPU Enabled:   enabled
Parallelism:   1
Accelerator:   
Half Precis.:  enable
Runtime:       python3.9
Runtime Loc:   Local
FilePath:      objectdetection_coral_adapter.py
Pre installed: False
Start pause:   1 sec
LogVerbosity:  
Valid:         True
Environment Variables
   MODELS_DIR = <root>/modules/ObjectDetectionCoral/assets
   MODEL_SIZE = Medium

System Info:

System:           Docker
Operating System: Linux (Linux 6.5.0-10-generic #10-Ubuntu SMP PREEMPT_DYNAMIC Fri Oct 13 13:49:38 UTC 2023)
CPUs:             Common KVM processor
                  1 CPU x 4 cores. 4 logical processors (x64)
System RAM:       4 GiB
Target:           Linux
BuildConfig:      Release
Execution Env:    Docker
Runtime Env:      Production
.NET framework:   .NET 7.0.13
Video adapter info:
System GPU info:
  GPU 3D Usage       0%
  GPU RAM Usage      0
Global Environment variables:
  CPAI_APPROOTPATH = <root>
  CPAI_PORT        = 32168

SeanECP commented 8 months ago

So I believe I have it working. Currently I have Coral up and running, but not before I got this error and then had to restart it.

In case my answer does not help, can you please share your System Info tab?

When launching the container, are you ensuring you add --privileged -v /dev/bus/usb:/dev/bus/usb to the docker command? Also, is there anything else you can share about your setup, such as whether you are using unRAID? And what happens if you try plugging it into a different USB port?
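
For reference, a bare-bones launch command along those lines would look something like this; the port mapping (32168 is the default CPAI_PORT) and the image tag are only examples, not a confirmed working configuration:

# Sketch only: adjust the image tag and port mapping for your setup
docker run -d -p 32168:32168 --privileged \
  -v /dev/bus/usb:/dev/bus/usb \
  codeproject/ai-server:latest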

SeanECP commented 8 months ago

@SeanECP, I'm running version 2.3.2-Alpha using the latest tag, and I'm having the same issue as @Dvalin21.

Thanks so much for the info. When launching the container, are you ensuring you add --privileged -v /dev/bus/usb:/dev/bus/usb to the docker command? Also, is there anything else you can share about your setup, such as whether you are using unRAID? And what happens if you try plugging it into a different USB port?

rafaelmussi commented 8 months ago

@SeanECP, I'm running version 2.3.2-Alpha using the latest tag, and I'm having the same issue as @Dvalin21.

Thanks so much for the info. When launching the container, are you ensuring you add --privileged -v /dev/bus/usb:/dev/bus/usb to the docker command? Also, is there anything else you can share about your setup, such as whether you are using unRAID? And what happens if you try plugging it into a different USB port?

Thank you for the prompt reply. Yes, I added --privileged -v /dev/bus/usb:/dev/bus/usb to the docker command, and I also tried docker-compose.yml, adding /dev/bus/usb both as a volume and as a device. All give the same results.
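
For clarity, the compose attempt was roughly along these lines; the service name, image tag and port mapping below are illustrative rather than my exact file:

# Sketch only: Coral USB passthrough via docker-compose
services:
  codeproject-ai:
    image: codeproject/ai-server:latest
    privileged: true
    devices:
      - /dev/bus/usb:/dev/bus/usb   # Coral USB accelerator
    ports:
      - "32168:32168"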

I'm running Docker inside a Proxmox VM with Ubuntu 23.10, passing through the whole USB port. I was able to get the Coral working (using the pycoral/examples/classify_image.py example) on the Ubuntu VM, so I know the USB passthrough is working from Proxmox to the VM.
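
For anyone wanting to run the same host-side check, it is essentially the classify_image.py test from the Coral getting-started guide; the model, label and image files below are the sample data that guide uses, so treat the exact names as approximate:

# Verify the Edge TPU on the host (outside Docker) with the pycoral example
git clone https://github.com/google-coral/pycoral.git
cd pycoral
bash examples/install_requirements.sh classify_image.py
python3 examples/classify_image.py \
  --model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
  --labels test_data/inat_bird_labels.txt \
  --input test_data/parrot.jpg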

If you need any other information, just let me know.

Thank you for all your help.

Dvalin21 commented 8 months ago

@SeanECP, I'm running version 2.3.2-Alpha using the latest tag, and I'm having the same issue as @Dvalin21.

Thanks so much for the info. When launching the container, are you ensuring you add --privileged -v /dev/bus/usb:/dev/bus/usb to the docker command? Also, is there anything else you can share about your setup, such as whether you are using unRAID? And what happens if you try plugging it into a different USB port?

You know, I was wondering if I needed to do that. Launch as privileged.

Dvalin21 commented 8 months ago

@SeanECP, I'm running version 2.3.2-Alpha using the latest tag, and I'm having the same issue as @Dvalin21.

Thanks so much for the info. When launching the container, are you ensuring you add --privileged -v /dev/bus/usb:/dev/bus/usb to the docker command? Also, is there anything else you can share about your setup, such as whether you are using unRAID? And what happens if you try plugging it into a different USB port?

Thank you for the prompt reply. Yes, I added --privileged -v /dev/bus/usb:/dev/bus/usb to the docker command, and I also tried docker-compose.yml, adding /dev/bus/usb both as a volume and as a device. All give the same results.

I'm running Docker inside a Proxmox VM with Ubuntu 23.10, passing through the whole USB port. I was able to get the Coral working (using the pycoral/examples/classify_image.py example) on the Ubuntu VM, so I know the USB passthrough is working from Proxmox to the VM.

If you need any other information, just let me know.

Thank you for all your help.

Somewhat off topic, and this will help me decide if I need to make a separate issue: are you getting 1002ms inference with your Coral? That's like 1.002 seconds, which is SUPER slow.

SeanECP commented 8 months ago

I'm running Docker inside a Proxmox VM with Ubuntu 23.10, passing through the whole USB port. I was able to get the Coral working (using the pycoral/examples/classify_image.py example) on the Ubuntu VM, so I know the USB passthrough is working from Proxmox to the VM.

Thanks so much for your response. Would you mind doing a quick write-up of your steps? We'd love to include it in the docs.

SeanECP commented 8 months ago

Somewhat off topic, and this will help me decide if I need to make a separate issue: are you getting 1002ms inference with your Coral? That's like 1.002 seconds, which is SUPER slow.

Some have reported that changing the model size to Small helps. Are you using Blue Iris?

rafaelmussi commented 8 months ago

I'm running Docker inside a Proxmox VM with Ubuntu 23.10, passing through the whole USB port. I was able to get the Coral working (using the pycoral/examples/classify_image.py example) on the Ubuntu VM, so I know the USB passthrough is working from Proxmox to the VM.

Thanks so much for your response. Would you mind doing a quick write-up of your steps? We'd love to include it in the docs.

It's not working yet.

Here is the Trace log:

14:10:04:Starting /app...onCoral/bin/linux/python39/venv/bin/python3 "/app...ionCoral/objectdetection_coral_adapter.py"
14:10:04:
14:10:04:Attempting to start ObjectDetectionCoral with /app/modules/ObjectDetectionCoral/bin/linux/python39/venv/bin/python3 "/app/modules/ObjectDetectionCoral/objectdetection_coral_adapter.py"
14:10:04:Module 'ObjectDetection (Coral)' 1.6.2 (ID: ObjectDetectionCoral)
14:10:04:Module Path:   /app/modules/ObjectDetectionCoral
14:10:04:AutoStart:     True
14:10:04:Queue:         objectdetection_queue
14:10:04:Platforms:     windows,linux,linux-arm64,macos,macos-arm64
14:10:04:GPU Libraries: installed if available
14:10:04:GPU Enabled:   enabled
14:10:04:Parallelism:   1
14:10:04:Accelerator:
14:10:04:Half Precis.:  enable
14:10:04:Runtime:       python3.9
14:10:04:Runtime Loc:   Local
14:10:04:FilePath:      objectdetection_coral_adapter.py
14:10:04:Pre installed: False
14:10:04:Start pause:   1 sec
14:10:04:LogVerbosity:
14:10:04:Valid:         True
14:10:04:Environment Variables
14:10:04:MODELS_DIR = %CURRENT_MODULE_PATH%/assets
14:10:04:MODEL_SIZE = Medium
14:10:04:
14:10:04:Started ObjectDetection (Coral) module
14:10:11:objectdetection_coral_adapter.py: INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
14:10:11:objectdetection_coral_adapter.py: CPAI_MODULE_REQUIRED_MB not found. Setting to default 0
14:10:11:objectdetection_coral_adapter.py: NUM_THREADS not found. Setting to default 1
14:10:11:objectdetection_coral_adapter.py: MIN_CONFIDENCE not found. Setting to default 0.5
14:10:11:objectdetection_coral_adapter.py: MODULE_PATH:    /app/modules/ObjectDetectionCoral
14:10:11:objectdetection_coral_adapter.py: MODELS_DIR:     /app/modules/ObjectDetectionCoral/assets
14:10:11:objectdetection_coral_adapter.py: Edge TPU detected
14:10:11:objectdetection_coral_adapter.py: Timeout connecting to the server
14:10:11:objectdetection_coral_adapter.py: ObjectDetection (Coral) started.ObjectDetection (Coral): ObjectDetection (Coral) started.
14:10:11:objectdetection_coral_adapter.py: MODEL_SIZE:     medium
14:10:11:objectdetection_coral_adapter.py: CPU_MODEL_NAME: efficientdet_lite3_512_ptq.tflite
14:10:11:objectdetection_coral_adapter.py: TPU_MODEL_NAME: efficientdet_lite3_512_ptq_edgetpu.tflite
14:10:11:objectdetection_coral_adapter.py: Input details: {'name': 'serving_default_images:0', 'index': 0, 'shape': array([  1, 512, 512,   3], dtype=int32), 'shape_signature': array([  1, 512, 512,   3], dtype=int32), 'dtype': , 'quantization': (0.0078125, 127), 'quantization_parameters': {'scales': array([0.0078125], dtype=float32), 'zero_points': array([127], dtype=int32), 'quantized_dimension': 0}, 'sparsity_parameters': {}}
14:10:11:objectdetection_coral_adapter.py: Output details: {'name': 'StatefulPartitionedCall:31', 'index': 23, 'shape': array([ 1, 25,  4], dtype=int32), 'shape_signature': array([ 1, 25,  4], dtype=int32), 'dtype': , 'quantization': (0.0, 0), 'quantization_parameters': {'scales': array([], dtype=float32), 'zero_points': array([], dtype=int32), 'quantized_dimension': 0}, 'sparsity_parameters': {}}
14:10:11:ObjectDetection (Coral): ObjectDetection (Coral) started.
14:10:55:Client request 'detect' in queue 'objectdetection_queue' (...ee3761)
14:10:55:Request 'detect' dequeued from 'objectdetection_queue' (...ee3761)
14:10:55:ObjectDetection (Coral): Retrieved objectdetection_queue command
14:10:55:ObjectDetection (Coral):  [RuntimeError] : Traceback (most recent call last):
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral_adapter.py", line 104, in do_detection
    result = do_detect(opts, img, score_threshold)
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral.py", line 191, in do_detect
    interpreter.invoke()
  File "/app/modules/ObjectDetectionCoral/bin/linux/python39/venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 941, in invoke
    self._interpreter.Invoke()
RuntimeError: Encountered an unresolved custom op. Did you miss a custom op or delegate?Node number 11 (EdgeTpuDelegateForCustomOp) failed to invoke.
14:10:55:ObjectDetection (Coral): Rec'd request for ObjectDetection (Coral) command 'detect' (...ee3761) took 261ms
14:10:55:Response received (...ee3761)
14:11:04:Client request 'detect' in queue 'objectdetection_queue' (...14e30a)
14:11:04:Request 'detect' dequeued from 'objectdetection_queue' (...14e30a)
14:11:04:ObjectDetection (Coral): Retrieved objectdetection_queue command
14:11:05:ObjectDetection (Coral): Rec'd request for ObjectDetection (Coral) command 'detect' (...14e30a) took 1131ms
14:11:05:Response received (...14e30a): The interpreter is in use. Please try again later

SeanECP commented 8 months ago

It's not working yet.

Docker -> Proxmox -> USB adds a level of complexity. It could be that the Docker-to-Proxmox layer is the source of the issue. In the past, other Proxmox users have reported that increasing the amount of RAM from 4 GB to 8 GB helped. Another user recommended the following, though I am not certain it actually helped: https://www.codeproject.com/Messages/5961228/Re-v2-1-11Beta-I-M-2-Coral-in-Docker

ministryofsillywalks commented 8 months ago

Same issue for me. It installs without an issue and the Coral TPU is detected; however, once I try to detect something I get this error:

15:13:25:ObjectDetection (Coral):  [RuntimeError] : Traceback (most recent call last):
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral_adapter.py", line 104, in do_detection
    result = do_detect(opts, img, score_threshold)
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral.py", line 191, in do_detect
    interpreter.invoke()
  File "/app/modules/ObjectDetectionCoral/bin/linux/python39/venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 941, in invoke
    self._interpreter.Invoke()
RuntimeError: Encountered an unresolved custom op. Did you miss a custom op or delegate?Node number 11 (EdgeTpuDelegateForCustomOp) failed to invoke.

Running the 2.3.2 Alpha on Unraid

SeanECP commented 7 months ago

Same issue for me. It installs without an issue and the Coral TPU is detected; however, once I try to detect something I get this error:

A user made an Unraid guide, which may be of assistance: https://www.codeproject.com/AI/docs/faq/coral.html

WackyWRZ commented 7 months ago

Adding that I am also having the same issue getting the Coral working in a Docker container (2.3.2 Alpha / latest), with the same error referenced above. I am running the M.2 version and passing /dev/apex_0 through. I've tried multiple versions, followed the unRAID-specific instructions, and run the Docker container in privileged mode. Detection through Blue Iris shows "nothing detected".

The ONLY way to get it working is to roll the Docker image back to 2.1.11 and follow the instructions in the unRAID guide... This works, but only after attempting to install the module, then uninstalling and reinstalling it due to PIP errors (as referenced in the guide).
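
For reference, the M.2 passthrough itself is just a device mapping; a minimal sketch of what I'm running looks roughly like this (image tag and port are examples, and it assumes the host already has the gasket/apex driver loaded so /dev/apex_0 exists):

# Sketch only: M.2/PCIe Coral passed through as a device
docker run -d -p 32168:32168 \
  --device /dev/apex_0:/dev/apex_0 \
  codeproject/ai-server:latest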

11:20:27:Started ObjectDetection (Coral) module
11:20:28:objectdetection_coral_adapter.py: INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
11:20:28:objectdetection_coral_adapter.py: Edge TPU detected
11:21:55:ObjectDetection (Coral):  [RuntimeError] : Traceback (most recent call last):
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral_adapter.py", line 104, in do_detection
    result = do_detect(opts, img, score_threshold)
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral.py", line 191, in do_detect
    interpreter.invoke()
  File "/app/modules/ObjectDetectionCoral/bin/linux/python39/venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 941, in invoke
    self._interpreter.Invoke()
RuntimeError: Encountered an unresolved custom op. Did you miss a custom op or delegate?Node number 11 (EdgeTpuDelegateForCustomOp) failed to invoke.

ministryofsillywalks commented 6 months ago

Just installed the latest version (2.3.4) and Coral inside Docker is still not working (same error as above). Downgraded to 2.1.11 and all is working fine. @ChrisMaunder, could you please have a look at this?

rafaelmussi commented 6 months ago

Just installed the latest version (2.3.4) and Coral inside Docker is still not working (same error as above). Downgraded to 2.1.11 and all is working fine. @ChrisMaunder, could you please have a look at this?

Confirmed here as well: Downgrading to 2.1.11 works fine (coral usb, proxmox 7, vm ubuntu 23.10)

SeanECP commented 6 months ago

Just installed the latest version (2.3.4) and Coral inside Docker is still not working (same error as above). Downgraded to 2.1.11 and all is working fine. @ChrisMaunder, could you please have a look at this?

Which docker image are you using? CPU, GPU or RPi?

SeanECP commented 6 months ago

Confirmed here as well: Downgrading to 2.1.11 works fine (coral usb, proxmox 7, vm ubuntu 23.10)

Which docker image are you using? CPU, GPU or RPi?

rafaelmussi commented 6 months ago

Confirmed here as well: Downgrading to 2.1.11 works fine (coral usb, proxmox 7, vm ubuntu 23.10)

Which docker image are you using? CPU, GPU or RPi?

image: codeproject/ai-server:2.1.11

ministryofsillywalks commented 6 months ago

Just installed the latest version (2.3.4) and Coral inside Docker is still not working (same error as above). Downgraded to 2.1.11 and all is working fine. @ChrisMaunder, could you please have a look at this?

Which docker image are you using? CPU, GPU or RPi?

In my case codeproject/ai-server:2.1.11 (this works) codeproject/ai-server:latest (this doesn't)

So I think CPU

ChrisMaunder commented 6 months ago

@ministryofsillywalks If you are able, it would be hugely helpful if you could open a terminal into the container (or connect to it via Docker Desktop or Visual Studio Code) and edit the file /modules/ObjectDetectionCoral/objectdetection_coral.py. Lines 50-53 have

# For Linux we have installed the pycoral libs via apt-get, not PIP in the venv,
# So make sure the interpreter can find the coral libraries
# if platform.system() == "Linux":
#    sys.path.insert(0, "/usr/lib/python3.9/site-packages/")

Uncomment those lines so they read

# For Linux we have installed the pycoral libs via apt-get, not PIP in the venv,
# So make sure the interpreter can find the coral libraries
if platform.system() == "Linux":
    sys.path.insert(0, "/usr/lib/python3.9/site-packages/")

and let me know if that solves the issue.

I'm not that hopeful, but it's the main change between version 1.3 of the Coral module (for server 2.1.11) and the current version of the Coral module.

Dvalin21 commented 6 months ago

Somewhat off topic, and this will help me decide if I need to make a separate issue: are you getting 1002ms inference with your Coral? That's like 1.002 seconds, which is SUPER slow.

Some have reported that changing the model size to Small helps. Are you using Blue Iris?

Sorry, just now seeing this, but no I'm not.

ministryofsillywalks commented 6 months ago

# For Linux we have installed the pycoral libs via apt-get, not PIP in the venv,
# So make sure the interpreter can find the coral libraries
if platform.system() == "Linux":
    sys.path.insert(0, "/usr/lib/python3.9/site-packages/")

Unfortunately, that didn't seem to work. After editing, the container starts up fine and the Coral is shown as working, but I get this error:

19:02:26:Started ObjectDetection (Coral) module
19:02:28:objectdetection_coral_adapter.py: INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
19:02:28:objectdetection_coral_adapter.py: Edge TPU detected
19:02:28:ObjectDetection (Coral):  [RuntimeError] : Traceback (most recent call last):
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral_adapter.py", line 104, in do_detection
    result = do_detect(opts, img, score_threshold)
  File "/app/modules/ObjectDetectionCoral/objectdetection_coral.py", line 191, in do_detect
    interpreter.invoke()
  File "/app/modules/ObjectDetectionCoral/bin/linux/python39/venv/lib/python3.9/site-packages/tflite_runtime/interpreter.py", line 941, in invoke
    self._interpreter.Invoke()
RuntimeError: Encountered an unresolved custom op. Did you miss a custom op or delegate?Node number 11 (EdgeTpuDelegateForCustomOp) failed to invoke.

Detections just fail as nothing shows up.

Also, later this error pops up:

19:08:47:Started ObjectDetection (Coral) module
19:08:48:objectdetection_coral_adapter.py: F port/default/port_from_tf/statusor.cc:38] Attempting to fetch value instead of handling error Failed precondition: Could not map pages : 8 (Device or resource busy)
19:08:48:Module ObjectDetectionCoral has shutdown

ChrisMaunder commented 2 months ago

Try the latest version please.