bitsy-ai / rpi-object-tracking

Object tracking tutorial using TensorFlow / TensorFlow Lite, Raspberry Pi, Pi Camera, and a Pimoroni Pan-Tilt Hat.
https://medium.com/@grepLeigh/real-time-object-tracking-with-tensorflow-raspberry-pi-and-pan-tilt-hat-2aeaef47e134
MIT License

RuntimeError: #51

Open g30ba1 opened 3 years ago

g30ba1 commented 3 years ago

Description

I was trying to run the detect application, but I get an error.

Note: the detect and track commands run with no error when I don't pass --edge-tpu.

What I Did

$ rpi-deep-pantilt detect --edge-tpu

Traceback (most recent call last):
  File "/home/pi/.virtualenvs/dl4rpi/bin/rpi-deep-pantilt", line 8, in <module>
    sys.exit(main())
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 107, in main
    cli()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 52, in detect
    model = SSDMobileNet_V3_Coco_EdgeTPU_Quant()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/detect/ssd_mobilenet_v3_coco.py", line 56, in __init__
    self.tflite_interpreter.allocate_tensors()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter.py", line 244, in allocate_tensors
    return self._interpreter.AllocateTensors()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 106, in AllocateTensors
    return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
RuntimeError: Internal: Unsupported data type in custom op handler: 0Node number 2 (EdgeTpuDelegateForCustomOp) failed to prepare.

I think the problem could be caused by an incompatibility between the packages listed above.

Would you mind taking a look?

Thank you in advance!

leigh-johnson commented 3 years ago

Hey @g30ba1, thank you for reporting this! Can you try this combination of dependencies and let me know how it goes?

Run pip install <wheel link> for each of these in a new virtual environment (rough sequence sketched after the list below). :crossed_fingers:

  1. tflite_runtime-2.5.0 https://github.com/google-coral/pycoral/releases/download/release-frogfish/tflite_runtime-2.5.0-cp37-cp37m-linux_armv7l.whl

  2. tensorflow-2.4.0-rc2 - no official build for this yet. I cross-compiled this wheel myself: https://github.com/bitsy-ai/tensorflow-arm-bin/releases/download/v2.4.0-rc2/tensorflow-2.4.0rc2-cp37-none-linux_armv7l.whl

  3. rpi-deep-pantilt-2.0.0rc0 https://pypi.org/project/rpi-deep-pantilt/2.0.0rc0/#files
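
Roughly, the install sequence would look something like this (the virtualenv name/path here is just an example):

$ python3 -m venv ~/.virtualenvs/rpi-deep-pantilt-2 && source ~/.virtualenvs/rpi-deep-pantilt-2/bin/activate
$ pip install https://github.com/google-coral/pycoral/releases/download/release-frogfish/tflite_runtime-2.5.0-cp37-cp37m-linux_armv7l.whl
$ pip install https://github.com/bitsy-ai/tensorflow-arm-bin/releases/download/v2.4.0-rc2/tensorflow-2.4.0rc2-cp37-none-linux_armv7l.whl
$ pip install rpi-deep-pantilt==2.0.0rc0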

g30ba1 commented 3 years ago

Hi again Leigh, I've created a new virtualenv following your instructions:

rpi-deep-pantilt version: 2.0.0rc0
Python version: 3.7.3
TensorFlow version: 2.4.0rc2
edgetpu version: 2.15.0
tflite-runtime version: 2.5.0
Operating System: Raspbian 10 (buster)

$ python
Python 3.7.3 (default, Jul 25 2020, 13:03:44) 
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import edgetpu
>>> edgetpu.__version__
'2.15.0'
>>> 

Unfortunately, when I run the detect command with --edge-tpu, I get the error:

$ rpi-deep-pantilt detect --edge-tpu
Downloading data from https://github.com/leigh-johnson/rpi-deep-pantilt/releases/download/v1.1.1/ssdlite_mobilenet_edgetpu_coco_quant.tar.gz
81715200/81713258 [==============================] - 89s 1us/step
WARNING:root:Loading Coral tflite_runtime for Edge TPU
Traceback (most recent call last):
  File "/home/pi/.virtualenvs/RDP/bin/rpi-deep-pantilt", line 8, in <module>
    sys.exit(main())
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 250, in main
    cli()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 117, in detect
    run_stationary_detect(labels, predictor_cls, rotation)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/camera.py", line 84, in run_stationary_detect
    model = model_cls()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/pretrained/api_v2/ssd_mobilenet_v3_coco.py", line 118, in __init__
    edge_tpu=True,
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 176, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 149, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 86, in __init__
    self.tflite_interpreter.allocate_tensors()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/tflite_runtime/interpreter.py", line 259, in allocate_tensors
    return self._interpreter.AllocateTensors()
RuntimeError: Internal: Unsupported data type in custom op handler: 53699600Node number 2 (EdgeTpuDelegateForCustomOp) failed to prepare.

When I run the detect command alone, I get the error:

$ rpi-deep-pantilt detect
Traceback (most recent call last):
  File "/home/pi/.virtualenvs/RDP/bin/rpi-deep-pantilt", line 8, in <module>
    sys.exit(main())
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 250, in main
    cli()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/cli.py", line 117, in detect
    run_stationary_detect(labels, predictor_cls, rotation)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/camera.py", line 84, in run_stationary_detect
    model = model_cls()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/pretrained/api_v2/ssd_mobilenet_v3_coco.py", line 230, in __init__
    input_type=input_type,
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 176, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 149, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 83, in __init__
    model_path=self.model_path,
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/tensorflow/lite/python/interpreter.py", line 207, in __init__
    custom_op_registerers_by_func))
ValueError: Mmap of '/home/pi/.keras/models/ssd_mobilenet_v3_small_coco_2019_08_14/model_postprocessed_quantized_128_uint8.tflite' failed.

If I try to run the track command alone:

$ rpi-deep-pantilt track
Process Process-2:
Traceback (most recent call last):
  File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/camera.py", line 28, in run_pantilt_detect
    model = model_cls()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/pretrained/api_v2/ssd_mobilenet_v3_coco.py", line 230, in __init__
    input_type=input_type,
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 176, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 149, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 83, in __init__
    model_path=self.model_path,
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/tensorflow/lite/python/interpreter.py", line 207, in __init__
    custom_op_registerers_by_func))
ValueError: Mmap of '/home/pi/.keras/models/ssd_mobilenet_v3_small_coco_2019_08_14/model_postprocessed_quantized_128_uint8.tflite' failed.

Trying to run rpi-deep-pantilt track --edge-tpu

$ rpi-deep-pantilt track --edge-tpu
WARNING:root:Loading Coral tflite_runtime for Edge TPU
Process Process-2:
Traceback (most recent call last):
  File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/camera.py", line 28, in run_pantilt_detect
    model = model_cls()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/pretrained/api_v2/ssd_mobilenet_v3_coco.py", line 118, in __init__
    edge_tpu=True,
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 176, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 149, in __init__
    super().__init__(*args, **kwargs)
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/rpi_deep_pantilt/detect/custom/base_predictors.py", line 86, in __init__
    self.tflite_interpreter.allocate_tensors()
  File "/home/pi/.virtualenvs/RDP/lib/python3.7/site-packages/tflite_runtime/interpreter.py", line 259, in allocate_tensors
    return self._interpreter.AllocateTensors()
RuntimeError: Internal: Unsupported data type in custom op handler: 25057520Node number 2 (EdgeTpuDelegateForCustomOp) failed to prepare.

As you can see, I can't run a single command now 😩

I hope the traceback information is useful to you.

Note: I'm able to run the rpi-deep-pantilt test pantilt and rpi-deep-pantilt test camera commands.

Martin2kid commented 3 years ago

I'm a beginner, but I remember I had a similar problem a while ago and solved it by running (this was posted on Towards Data Science but not on GitHub):

$ sudo apt-get update && sudo apt-get install libedgetpu1-std

Then I restarted the Pi 4 (unplugged the Coral & re-plugged it).

Also note that I tried to install rpi-deep-pantilt onto another Pi 4 a few weeks ago; this community wheel was not accessible/installable via pip, and I had to clone the SD card from a working Pi 4:

$ pip install https://github.com/leigh-johnson/Tensorflow-bin/releases/download/v2.2.0/tensorflow-2.2.0-cp37-cp37m-linux_armv7l.whl

Just in case: if you created and activated the virtualenv (by running "source .venv/bin/activate"), I would expect your prompt to look like (.venv) pi@raspberrypi:~/rpi-deep-pantilt $

Hope that helps

g30ba1 commented 3 years ago

Hi Martin, well, I checked my edgetpu packages on the terminal and this is the output:

$ dpkg -l | grep edgetpu
ii  edgetpu-examples              15.0  all    Example code for Edge TPU Python API
ii  libedgetpu1-legacy-max:armhf  15.0  armhf  Support library for Edge TPU
rc  libedgetpu1-max:armhf         14.1  armhf  Support library for Edge TPU
ii  python3-edgetpu               15.0  armhf  Edge TPU Python API

As you can see, I've updated my edgetpu packages (I'm not using the legacy versions anymore)

Looks like I'm going to need to downgrade the edgetpu packages 😩

Note that I'm not using the new pycoral API. (I'm pretty sure many of us were not aware that the Edge TPU Python API was deprecated some months ago, and I don't know if that could be a problem too.)
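
Probably something like this is what I'll need to run (assuming the packages come from Coral's apt repository, where, as far as I understand, the std and max runtime packages replace each other):

$ sudo apt-get update
$ sudo apt-get install libedgetpu1-std

and then unplug and re-plug the Coral, as Martin suggested.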

leigh-johnson commented 3 years ago

I'll try out the pycoral API in a 2.0.0 rc branch, thanks for mentioning it! The release of this API is news to me as well. I want to test and cut a 2.0.0 release before the end of 2020.

Martin2kid commented 3 years ago

Jorge,

Looking at my working setup: I have two Pi 4s running rpi-deep-pantilt, one using the standard Pimoroni servo hat like you see in Leigh's post, and another using a large brushless gimbal (see https://youtu.be/Ce-c9StqzsE).

My working setup (following Leigh's Towards Data Science & GitHub instructions) is as follows:

Pi 4 with the 8-20-2020 OS release (most recent), 8 GB (4 GB works too)
rpi_deep_pantilt 1.2.1
tflite-runtime 2.1.0
TensorFlow 2.2.0
libedgetpu1-std:armhf 14.1

As you can see, your setup somehow doesn't match Leigh's post (maybe something has changed that I don't know about), but I know that combination works very well, except that when you move out of the FOV for a few seconds and come back, it freezes.

g30ba1 commented 3 years ago

Sounds great!

If you can integrate the most recent versions of the main packages involved (TF 2.x, pycoral, tflite-runtime 2.5.0, etc.) in one of the rpi-deep-pantilt 2.0.0 branches, that would be AWESOME.

Let me know if I can help you by trying other combinations of dependencies.

Martin2kid commented 3 years ago

Jorge,

I followed her GitHub post, then noticed that $ rpi-deep-pantilt detect --edge-tpu wouldn't work, just like you did. So I went back to the other post on Towards Data Science, noticed the last line, $ sudo apt-get update && sudo apt-get install libedgetpu1-std, ran it, and it worked. As you can see, you have a different version & a different combination.

Martin2kid commented 3 years ago

> I'll try out the pycoral API in a 2.0.0 rc branch, thanks for mentioning it! The release of this API is news to me as well. I want to test and cut a 2.0.0 release before the end of 2020.

Looking forward to it; hopefully this will solve the FOV freeze issue (move out of camera view for a second, come back into view, and it won't resume face tracking).

g30ba1 commented 3 years ago

> Jorge,
>
> I followed her GitHub post, then noticed that $ rpi-deep-pantilt detect --edge-tpu wouldn't work, just like you did. So I went back to the other post on Towards Data Science, noticed the last line, $ sudo apt-get update && sudo apt-get install libedgetpu1-std, ran it, and it worked. As you can see, you have a different version & a different combination.

I'm aware of the difference between using libedgetpu1-std (your case) and libedgetpu1-max (mine), but I'm using my Raspberry Pi 4B to run other ML projects and I don't want to change the version of the Edge TPU runtime (for now).

I'm going to need to clone my microSD card and make the change (to libedgetpu1-std) on the clone.

It's amazing how many variables can affect our projects, right 😅?

Martin2kid commented 3 years ago

Sorry Jorge, this newbie just tried to help without giving it much thought 😁

leigh-johnson commented 3 years ago

> test and cut a 2.0.0 release before the end of 2020.
>
> Looking forward to it; hopefully this will solve the FOV freeze issue (move out of camera view for a second, come back into view, and it won't resume face tracking).

:crossed_fingers: Hopefully! Thanks for your patience @Martin2kid, I know you're itching to test this out. The last things on my todo list for that release (besides trying pycoral) are...

> It's amazing how many variables can affect our projects, right?

Ugh, tell me about it! The Ansible playbooks I mentioned will help with this. My day job is mostly MLOps these days, so I skipped past the boring best practices to build something quick and fun. As more people come to depend on this code for biz/research though, it needs to be a bit more reliable. :+1:

Martin2kid commented 3 years ago

👏 I hear you all. I'm still researching FOC brushless motor control, so hopefully it'll let rpi-deep-pantilt track objects through 360 degrees continuously and in silence. I already see the limitation of using PWM (only up to 180 degrees) for this -- this newbie needs to study I2C & SPI, ugh...

g30ba1 commented 3 years ago

> Sorry Jorge, this newbie just tried to help without giving it much thought 😁

Don't say that, Martin; I appreciate any word/advice/tip from all the people involved in this project. In the end, all we want is to contribute to the repository and help Leigh release the best code possible.

By the way, that FOC brushless motor control sounds very interesting; do you have a repository or blog about it?

Martin2kid commented 3 years ago

Jorge,

Thank you for your kindness!

I started looking into FOC controllers because of their almost-zero mechanical latency, and to eliminate the meshing-gear noise typically associated with conventional mechanical gear reduction (such as the Pimoroni hat servo motor noise). Here are a few links you may want to explore.

SimpleBGC32 supports PWM, I2C, and serial input; only the extended version supports 2- to 3-axis closed-loop position control together with a magnetic or rotary encoder. It's not open source. https://www.basecamelectronics.com/simplebgc32bit/ Reliable hardware vendor: https://shop.iflight-rc.com/index.php?route=product/category&path=34_48 This product is rather pricey, but I used it anyway because it's the only one I could use with PWM output and closed-loop position control.

Storm32 supports PWM, I2C, and serial input; only T-STorM32 supports 2- to 3-axis closed-loop position control and servo motor mode together with a magnetic encoder. It's open source. T-STorM32 is not readily available yet, and I'm hesitant to order through a small external vendor located in Russia, although it's supposed to have a servo motor mode.

Good: These two products are IMU-sensor-based FOC controllers and are great for applications that require 3-axis stabilization, as they were originally designed to stabilize video cameras mounted on drones and are currently widely used as stabilizers in cinema and professional videography. They both have PID controllers.

Bad: Since both of the above are IMU-sensor-based controllers, they develop drift, and an absolute positioning reference is required to counter that drift (sensor offset), which in turn requires magnetometer or GPS input. Even with that additional requirement, it won't be a 100% absolute reference point, as a magnetometer is susceptible to interference from magnetic fields and nearby iron sources, and even GPS has an offset when stationary. Their PID controllers are based on physical force orientation and are not capable of compensating for software-side issues such as vision-processing latency.

Better alternatives for absolute closed-loop positioning FOC controllers:

Pablo (https://www.youtube.com/watch?v=MKNkZOja7-s) has a 2-channel control board, and it's open source. Good: small 2-channel board with high-power motor output. Bad: no PWM input example yet (but I'm sure someone good with Arduino can write this code without a problem; it's just me waiting for Pablo to put out a PWM example). David from the SimpleFOC community indicated to me that this board & Pablo's code heat up the motor more, but I haven't noticed that yet.

Justine (https://youtu.be/OZvjfbpXpro) also has a 2-channel control board, and it's open source. Good: supports PWM, I2C, ROS, and has example code. Bad: pricey, 16-bit, too big.

SimpleFOC (https://www.simplefoc.com) has a 1-channel stackable control board, and it's open source. Good: Arduino library support, good price point. Bad: only 1 channel, the stackable design gets too tall when multiple boards are stacked, and no PWM input support yet (no such Arduino library source).

g30ba1 commented 3 years ago


Thank you very much for the info.

I'm gonna check the links 😈

Martin2kid commented 3 years ago

Jorge,

You're welcome! I'd very much appreciate it if you could look into Pablo's code, implement PWM input, and share it with us, or modify his code for dual closed-loop position control using PWM (newbie wish list), if you like.

I did leave a message on his YouTube, but he replied (perhaps assuming I'm capable of it?) that it's there already.

g30ba1 commented 3 years ago

> Jorge,
>
> I followed her GitHub post, then noticed that $ rpi-deep-pantilt detect --edge-tpu wouldn't work, just like you did. So I went back to the other post on Towards Data Science, noticed the last line, $ sudo apt-get update && sudo apt-get install libedgetpu1-std, ran it, and it worked. As you can see, you have a different version & a different combination.

Hi Leigh, I've cloned my Raspberry Pi microSD card, and I'm modifying my packages to meet the rpi-deep-pantilt requirements.

But I have a problem: if I try to install the libedgetpu1-std library using sudo apt-get install, the package libedgetpu1-legacy-std is the one that gets installed. (The original libedgetpu1-std is not available anymore.)

I don't know if the packages are the same. Well, after that, I ran the rpi-deep-pantilt track --edge-tpu command, but I get this error:

rpi-deep-pantilt track --edge-tpu
INFO: Initialized TensorFlow Lite runtime.
Process Process-2:
Traceback (most recent call last):
  File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/control/manager.py", line 44, in run_detect
    model = SSDMobileNet_V3_Coco_EdgeTPU_Quant()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/rpi_deep_pantilt/detect/ssd_mobilenet_v3_coco.py", line 56, in __init__
    self.tflite_interpreter.allocate_tensors()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter.py", line 244, in allocate_tensors
    return self._interpreter.AllocateTensors()
  File "/home/pi/.virtualenvs/dl4rpi/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 106, in AllocateTensors
    return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
RuntimeError: Internal: Unsupported data type in custom op handler: 0Node number 2 (EdgeTpuDelegateForCustomOp) failed to prepare.

Do you have the original package "libedgetpu1-std"?

Martin2kid commented 3 years ago

Jorge,

I do remember I had a very similar issue in February 2020; see this specific communication at https://github.com/google-coral/edgetpu/issues/44, which was also mentioned in a closed issue.

I believe Leigh fixed this in 1.2.1 (the issue & problem you see was based on 1.0.0), and that's how I was able to get mine working with the Coral stick (Edge TPU) in 1.2.1 after a long wait.

It's been almost three quarters of a year now and my memory seems to be short nowadays, but I believe I started with a fresh install of Buster & RPi, verified the SMBus symlinks were working first, then made sure the Coral stick was fully inserted into the USB 3.0 port (blue) -- and of course restarted after the libedgetpu1-std install; if not, it'll produce an error.

I dimly remember that the installation folder structure was such that the .venv folder was within rpi-deep-pantilt, and the SMBus symlinks were created within rpi-deep-pantilt in the subfolder "/home/pi/rpi-deep-pantilt/.venv/lib/python3.7/site-packages".

Mistakes that I made in the past were: installing Google's newer Coral version, and installing libedgetpu from elsewhere.

Cheers!

Martin2kid commented 3 years ago

Jorge,

Please see issue #52; he now has his rpi_deep_pantilt working, and I hope yours is working as well. My guess is that "$ sudo apt-get update && sudo apt-get install libedgetpu1-std" and the other steps don't always work link-wise, which pretty much explains the troubles I had and why it took me multiple attempts over various time frames.

g30ba1 commented 3 years ago

Hi everybody! First of all, I have to say thank you for all your support.

Well, now the good news: by reading the threads provided by Martin, I found my problem. Due to a non-standard use of the tflite runtime's modules/classes (by the Coral people), the error originated in the way the Interpreter class was being called in the script "ssd_mobilenet_v3_coco.py", so I modified lines 52-57 into this:

        self.tflite_interpreter = Interpreter(
            model_path=self.model_path,
            experimental_delegates=[
                load_delegate(self.EDGETPU_SHARED_LIB)
            ]
        )   

It's very important to add these two lines at the beginning of the script:

from tflite_runtime.interpreter import load_delegate
from tflite_runtime.interpreter import Interpreter   
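
For context, here is the same pattern as a minimal standalone sketch (the model path and delegate library name below are placeholders I'm using for illustration; rpi-deep-pantilt resolves its own model_path and EDGETPU_SHARED_LIB internally):

from tflite_runtime.interpreter import Interpreter, load_delegate

# Shared library installed by the libedgetpu1-* packages on Linux.
EDGETPU_SHARED_LIB = 'libedgetpu.so.1'

# Placeholder path to a .tflite model compiled for the Edge TPU.
MODEL_PATH = 'ssdlite_mobilenet_edgetpu_coco_quant.tflite'

tflite_interpreter = Interpreter(
    model_path=MODEL_PATH,
    experimental_delegates=[load_delegate(EDGETPU_SHARED_LIB)]
)
tflite_interpreter.allocate_tensors()
print(tflite_interpreter.get_input_details())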

After these changes, the program runs GREAT!!

By typing:

rpi-deep-pantilt track --loglevel=DEBUG --edge-tpu

I'm getting 55 FPS! (Well, my RPi 4B is overclocked to 1.95 GHz too, but it is IMPRESSIVE!)

What a badass program Leigh 😎! And thank you very much Martin 😁.

Have a nice weekend.

Martin2kid commented 3 years ago

Jorge,

I'm so glad it is working!

And did you notice pan & tilt motion jitter once in a while? Was it matching the speed of object movement on the display, or was it random? (I had to ramp up the power supply to 5.4 V & 5 A to minimize that behavior; it still twitches once in a while.)

Thank you for the update!

kindofausername commented 3 years ago

> I'll try out the pycoral API in a 2.0.0 rc branch, thanks for mentioning it! The release of this API is news to me as well. I want to test and cut a 2.0.0 release before the end of 2020.
>
> Looking forward to it; hopefully this will solve the FOV freeze issue (move out of camera view for a second, come back into view, and it won't resume face tracking).

I think this issue is because of the PID controller. I mean: I realized that the duration of the freeze (until tracking continues) is proportional to the time the object is out of view. In my setup, if my face is outside the FOV for e.g. one second, the "freeze" will last one second. I think it would help if we reset the PID values to their defaults when the object leaves the FOV?!
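
Something along these lines is what I have in mind (just a sketch; the class and names here are made up, not rpi-deep-pantilt's actual controller):

import time

class SimplePID:
    """Toy PID controller, only to illustrate resetting state when the target is lost."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.reset()

    def reset(self):
        # Dropping the accumulated integral and the last error means the controller
        # starts fresh when the object re-enters the FOV, instead of replaying the
        # correction it built up while the object was out of view.
        self.integral = 0.0
        self.last_error = 0.0
        self.last_time = None

    def update(self, error):
        now = time.monotonic()
        dt = (now - self.last_time) if self.last_time is not None else 0.0
        self.integral += error * dt
        derivative = (error - self.last_error) / dt if dt > 0 else 0.0
        self.last_error, self.last_time = error, now
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Pseudo-usage in the tracking loop:
#     if detection is None:          # object left the FOV
#         pan_pid.reset()
#         tilt_pid.reset()
#     else:
#         pan_output = pan_pid.update(pan_error)
#         tilt_output = tilt_pid.update(tilt_error)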

Martin2kid commented 3 years ago

arminf82,

Good point, and it will probably be a lot of work. I thought about disabling the PID controller, since my motors (FOC-controlled brushless) are much faster than servo motors, but I could not figure out what code to change.

g30ba1 commented 3 years ago

> Jorge,
>
> I'm so glad it is working!
>
> And did you notice pan & tilt motion jitter once in a while? Was it matching the speed of object movement on the display, or was it random? (I had to ramp up the power supply to 5.4 V & 5 A to minimize that behavior; it still twitches once in a while.)
>
> Thank you for the update!

If I'm getting this right, it doesn't matter how much current you can supply to the RPi 4B, because the motors get their current through the Pan-Tilt HAT, so the misbehavior will be the same, right?

We would need to separate the current supply to the motors by using a separate driver (board).

I'm currently researching that.

Martin2kid commented 3 years ago

Jorge,

It is a common issue that the Pi 4 generates low-voltage warnings to begin with.

I noticed the Pi 4 triggers a low-voltage warning when I plug in a USB stick like the Movidius and a USB cam, even without using the Pan & Tilt hat, and I also noticed that warning (with jitter & twitches) when running rpi-deep-pantilt with that hat on my 8 GB Pi 4.

I used this programmable BEC (https://www.hobbytown.com/castle-creations-10-amp-adjustable-bec-cse010-0004-00/p18210) and it fixed much of the jitter.

I believe the hat & 2 servos draw power from the Pi 4, and that can strain processing, causing unstable output with jitters.

You can drive the 2 servos using an isolated power supply connection to the servos: only connect the ground & signal wires to the hat and connect the positive wire to a dedicated supply -- but the Pi's and the servos' grounds must be shared. (I didn't have to use this method because the CC BEC took care of my jitter problem.) My other brushless-FOC-driven pan & tilt uses an isolated power supply to the brushless motor controller, and the results are about the same, except for much faster speed and smooth, fluid-like motion in complete silence.

Martin2kid commented 3 years ago

Finally!

I was able to build closed-loop motor position control hardware that can cover 360 degrees & beyond by using an AS5048A magnetic sensor with PWM & SPI interfaces and a slip ring supporting up to 5 A.

This can also work with more recent versions of the AS**** products with either PWM, I2C, or SPI, and possibly CAN bus as well.

But I need software-side support that will zero out at 360 degrees & continue, or restart positional detection from 1 degree, over either PWM, I2C, or SPI input.

(photo attachment)

This would help with the wiring-tangle issue & the limitation beyond 180 degrees, not to mention extremely fast positioning control in total silence.
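
For the zero-out part, I imagine something like this on the software side (a generic sketch, not tied to any specific controller; read_encoder_deg() is a placeholder for whatever reads the AS5048A over PWM/I2C/SPI):

def unwrap_step(prev_deg, new_deg):
    """Shortest signed change between two wrapped 0-360 degree readings."""
    return (new_deg - prev_deg + 180.0) % 360.0 - 180.0

# Pseudo-usage: accumulate a continuous angle that crosses the 0/360 boundary smoothly.
# total_deg = 0.0
# prev = read_encoder_deg()
# while True:
#     cur = read_encoder_deg()
#     total_deg += unwrap_step(prev, cur)
#     prev = cur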

leigh-johnson commented 3 years ago

Whoa, amazing @Martin2kid. I'm brimming with questions. What's the materials cost of the prototype? Does 360 degrees of freedom span 1 axis (panning) or 2? (Pan+tilt).

My goal is to finish the 2.0 update over the holiday break, which starts Wednesday 12/23 for me. Looking forward to hacking on this project again! It was last holiday break that I published 1.0. 🥳

I've also ordered a few mid-range Arducam pan-tilt-zoom modules, to test upgrades/replacements for the Pimoroni hat.
https://www.arducam.com/product/arducam-12mp-pan-tilt-zoom-ptz-camera-fo-raspberry-pi-and-jetson-nano/

Martin2kid commented 3 years ago

Leigh,

It's not really a prototype I made; rather, I was able to find exactly what I was looking for after almost 2 years of random searching. This motor is a 22-pole motor & costs $39.00, from here: https://shop.iflight-rc.com/index.php?route=product/product&path=34_44&product_id=1347

And the magnetic encoder costs $18.00, from the same store: https://shop.iflight-rc.com/index.php?route=product/product&path=34_61&product_id=262 The main & most important factor is that they used a ring magnet, allowing wiring to pass through while still maintaining sensor readability with a tight gap over the ring magnet, and they also provide PWM & SPI cable connectors.

A 12.5 mm slip ring costs $9.00, also from the same store.

360-degree rotation requires a slip ring, and I think it would normally apply only to the pan axis, unless the application requires such freedom on both pan & tilt axes, such as an acrobatic drone or fighter UAV.

The biggest hurdle, however, is the brushless FOC motor controller. iFlight is one of the distributors of the SimpleBGC32 controller board, which is a licensed closed-source system, and they are not cheap (about $200 for the extended controller board that provides 2- to 3-axis closed-loop absolute positional control supporting a magnetic encoder with a resolution of 0.05 degrees, but still subject to IMU drift).

On the other hand, the SimpleFOC project hacked the much cheaper Storm32 BGC board (average cost of less than $20), and it's an open-source project; I'm still investigating and waiting for a PWM implementation from them.

Please see this post; https://community.simplefoc.com/t/code-adaptation-for-strom32-bgc-v1-31-3-x-bldc-motors-with-simplefoclibrary/21

https://community.simplefoc.com/t/pwm-input-360-position-control-with-as5048a-pwm-encoder/166

My current working pan & tilt system is pretty expensive, using larger motors with a 42-pole count, the SimpleBGC32 extended controller, a customized slip ring, magnetometer correction, and a payload capacity up to a typical laptop's weight (approx. $800).

I'm seriously looking forward to the 2.0 update! But don't force yourself; we all need to spend more time with our loved ones.

Anyway, I've looked into that P&T camera (I also looked into it in the past). I really liked the idea, but I did not like the slow motorized zoom control, with noise & significant latency caused by the higher gear ratio, and the very limited available resources... Thinking of a brushless FOC controller for that zoom motor? Or perhaps one of the ever-increasing number of USB cams with digital focus?

I'm wondering, would you be interested in hardware co-development and possibly capitalizing on your development, or perhaps reaching out to the FOC development community?

Martin2kid commented 3 years ago

Leigh,

This video of a spherical actuator: https://youtu.be/GlhRBdnXqBM

It pretty much explains and best describes where I'm heading with 3-axis FOC motor control: imagine a camera mounted on top of this actuator, with an additional zoom function controlled by another brushless motor, comprising 9-DOF object tracking just like a human eye, and compact enough to put into a cell-phone-like device.

This particular design is not really smooth enough for vision-based applications (it would require more linear positioning control, perhaps with a higher pole count), but it provides a good example.

I'm currently experimenting with a hand-wound coil over a ping-pong ball, and I've realized that I need a much more precise method of winding, or perhaps a 3D-printed motor sphere of some kind.