UlysseCoteAllard / MyoArmbandDataset


How to realize smooth control of a robotic arm using the discrete output of the neural network in real time? #4

Closed wangwuqi closed 5 years ago

wangwuqi commented 5 years ago

I have seen your video on YouTube; it's really fantastic. I also do some work with the Myo, and use it to control a Raspberry Pi car. The network works well on offline data, but when performing real-time recognition, how can I realize smooth control with the discrete output of the NN?

UlysseCoteAllard commented 5 years ago

Hi, the way we did it is to obtain a new classification every 260 ms (this could be reduced using sliding windows; on my laptop, pre-processing + classification takes around 25 ms). Then we actuate the robotic arm/prosthetic accordingly (depending on the prediction).

So, for example, if the ConvNet predicts "close hand", the robotic arm closes and stays closed as long as no other gesture is detected.
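For illustration, a minimal sketch of that kind of control loop in Python. The functions `get_emg_window`, `classify`, and `send_arm_command` are hypothetical placeholders for the Myo data stream, the trained ConvNet, and the robotic-arm interface; they are not functions from this repository.

```python
import time

WINDOW_MS = 260  # classification interval described above


def control_loop(get_emg_window, classify, send_arm_command):
    """Classify a new EMG window every WINDOW_MS and hold the last gesture.

    get_emg_window, classify, and send_arm_command are hypothetical
    callbacks standing in for the Myo stream, the trained ConvNet, and
    the robotic arm, respectively.
    """
    current_gesture = None
    while True:
        window = get_emg_window()      # raw EMG samples for one window
        gesture = classify(window)     # discrete ConvNet prediction

        # Only send a new command when the prediction changes; otherwise
        # the arm keeps executing the previous command (e.g. stays closed).
        if gesture != current_gesture:
            send_arm_command(gesture)
            current_gesture = gesture

        time.sleep(WINDOW_MS / 1000.0)
```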

I hope this is clear and do not hesitate to ask any other questions.

Have a great day.

wangwuqi commented 5 years ago

Thank you very much, I also do it like this. But sometimes the NN outputs an unwanted result when it processes data that is not one of the pre-defined gestures, or it simply misclassifies the sliding-window data, and then the control is incorrect. Your video shows perfect control with 100% classification accuracy. It's really amazing!
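One common way to smooth out isolated spurious predictions like these (not a method confirmed in this thread) is to actuate only on the majority vote over the last few window classifications. A minimal sketch, assuming string gesture labels:

```python
from collections import Counter, deque


class MajorityVoteFilter:
    """Return the most frequent prediction among the last `size` windows.

    Smooths isolated misclassifications at the cost of a small extra
    latency (roughly size * window length).
    """

    def __init__(self, size=5):
        self.history = deque(maxlen=size)

    def update(self, prediction):
        self.history.append(prediction)
        # most_common(1) returns [(label, count)]; take the label
        return Counter(self.history).most_common(1)[0][0]


# Usage: feed each new ConvNet output through the filter before actuating.
vote = MajorityVoteFilter(size=5)
smoothed = [vote.update(p) for p in ["rest", "close", "rest", "rest", "rest"]]
# The single "close" misclassification is voted out: smoothed[-1] == "rest"
```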