This project is no longer maintained and has been archived, because Myo production and sales officially ended on Oct 12, 2018. Feel free to fork it and make your own changes if needed.
Thanks to everyone for their valuable feedback.
Gesture recognition using a Myo armband via a neural network (TensorFlow library).
Library | Version |
---|---|
Python | ^3.5 |
TensorFlow | ^1.1.0 |
NumPy | ^1.12.0 |
sklearn | ^0.18.1 |
myo-python | ^0.2.2 |
You can use your own scripts for collecting EMG data from the Myo armband, but you need to feed the network a 64-value array with data from every sensor.
By default, myo-python returns an 8-value array (one value per sensor) for each reading.
Each output is a 2-value array: [datetime, [EMG DATA]].
The 64-value array is simply 8 consecutive outputs from the armband flattened into a one-dimensional array.
So you only need to collect 8 readings of a gesture from the armband (if you read data 10 times per second, this is not a problem); a minimal sketch is shown below.
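For illustration, here is a minimal sketch of building one 64-value sample from 8 consecutive outputs in the [datetime, [EMG DATA]] form described above (the `build_sample` helper and the dummy data are illustrative, not part of the repo):

```python
import datetime

import numpy as np

def build_sample(outputs):
    """Flatten 8 consecutive outputs of the form [datetime, [8 EMG values]]
    into the 64-value vector the network expects."""
    emg_only = [emg for _, emg in outputs]  # drop the timestamps
    return np.asarray(emg_only, dtype=np.float32).reshape(64)

# Dummy data standing in for 8 consecutive reads from the armband (8 sensors each).
fake_outputs = [[datetime.datetime.now(), list(range(i, i + 8))] for i in range(8)]
print(build_sample(fake_outputs).shape)  # -> (64,)
```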
The repo contains a dataset I collected from a Myo armband. The dataset contains only 5 gestures:
- Ok (1)
- Fist (2)
- Like (3)
- Rock (4)
- Spock (5)
```
python3 train.py
```
75k iterations take about 20 min on a GTX 960 or about 2 h on an i3-6100.
Accuracy after ~75k iterations (98.75%):
Loss after ~75k iterations (1.28):
```
python3 predict.py
```
You must have the Myo SDK installed. The script will return a number (0-5) which represents a gesture (0 - relaxed arm).
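If you want a readable label instead of the raw class index, a small lookup like the following works (the `GESTURES` dict and `describe()` helper are illustrative, not part of the repo):

```python
# Hypothetical mapping from the class index returned by the predictor to a gesture name.
GESTURES = {0: "Relax", 1: "Ok", 2: "Fist", 3: "Like", 4: "Rock", 5: "Spock"}

def describe(prediction):
    return GESTURES.get(prediction, "Unknown")

print(describe(4))  # -> Rock
```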
```
python3 predict_train_dataset.py
```
Example output:
```
Accuracy on Test-Set: 98.27% (19235 / 19573)
[2438    5    9    6    4   20]  (0) Relax
[   4 2652   45    1    3    9]  (1) Ok
[   8   44 4989    1    1    9]  (2) Fist
[   8    2    2 4152   28   13]  (3) Like
[   2    5    6   27 1839    1]  (4) Rock
[  14   22   13   21    5 3165]  (5) Spock
  (0)  (1)  (2)  (3)  (4)  (5)
```
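A table of this kind can be reproduced with scikit-learn's metrics; below is a minimal sketch where `y_true` and `y_pred` stand in for the real dataset labels and the network's predicted classes (0-5):

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix

# Placeholder labels and predictions; replace with the real dataset labels
# and the classes predicted by the trained network.
y_true = np.array([0, 1, 2, 3, 4, 5, 5, 2])
y_pred = np.array([0, 1, 2, 3, 4, 5, 0, 2])

correct = int((y_true == y_pred).sum())
print("Accuracy on Test-Set: {:.2%} ({} / {})".format(
    accuracy_score(y_true, y_pred), correct, len(y_true)))
print(confusion_matrix(y_true, y_pred, labels=list(range(6))))
```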
I know that making predictions on the training dataset is wrong, but I did not have time to build a separate testing dataset.
Fully connected 1 (528 neurons) |
---|
ReLU |
Fully connected 2 (786 neurons) |
ReLU |
Fully connected 3 (1248 neurons) |
ReLU |
Dropout |
Softmax_linear |
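A rough TensorFlow 1.x sketch of that layer stack, written with `tf.layers` for brevity (the layer names, dropout rate, and placeholders are assumptions; see train.py for the actual implementation):

```python
import tensorflow as tf

def build_network(x, is_training):
    """Sketch of the layer stack from the table above: three fully connected
    layers with ReLU, dropout, and a linear output layer for 6 classes."""
    fc1 = tf.layers.dense(x, 528, activation=tf.nn.relu, name="fc1")
    fc2 = tf.layers.dense(fc1, 786, activation=tf.nn.relu, name="fc2")
    fc3 = tf.layers.dense(fc2, 1248, activation=tf.nn.relu, name="fc3")
    drop = tf.layers.dropout(fc3, rate=0.5, training=is_training, name="dropout")
    return tf.layers.dense(drop, 6, name="softmax_linear")  # 6 classes: relax + 5 gestures

x = tf.placeholder(tf.float32, [None, 64])  # one flattened 64-value EMG sample per row
is_training = tf.placeholder(tf.bool)
logits = build_network(x, is_training)
probabilities = tf.nn.softmax(logits)
```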