
GO MYO


🏗️ Introduction

GO MYO: Online Training & Prediction of Hand Gestures.

Designed by @amluckydave

🚀 In Progress

This project aims to realize real-time hand gesture prediction: the GO MYO-embedded model can output the target gesture before signal collection is complete.

The core classification algorithm code is NOT provided at present.
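To make the idea concrete, here is a minimal, purely illustrative sketch of early prediction over a growing EMG window. The window length, confidence threshold, and the `classifier` object (with a `predict_proba` method) are assumptions for illustration, not the project's actual model.

```python
import numpy as np

# Illustrative only: a classifier that accepts variable-length EMG windows
# and exposes predict_proba() is assumed; the real GO MYO model is not public.
WINDOW = 200        # assumed number of 8-channel EMG samples per full gesture
THRESHOLD = 0.90    # assumed confidence needed to emit an early prediction

def predict_early(stream, classifier):
    """Classify a gesture from a growing window of streamed EMG samples.

    `stream` yields one 8-value EMG reading at a time; `classifier.predict_proba`
    is assumed to take a (1, n, 8) array and return class probabilities.
    """
    buffer = []
    for sample in stream:
        buffer.append(sample)
        window = np.asarray(buffer, dtype="float32")[None, ...]  # shape (1, n, 8)
        probs = classifier.predict_proba(window)[0]
        if probs.max() >= THRESHOLD or len(buffer) >= WINDOW:
            # Emit the label as soon as confidence is high enough, i.e.
            # possibly before the gesture (and its recording) has finished.
            return int(probs.argmax())
```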

📌 Training

The first step toward prediction is to train your individual classifier: follow the GUI notes to save your raw gesture EMG data in the default folder.
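For reference, the sketch below shows roughly what one recording amounts to underneath the GUI, assuming the myo-python 1.0 API. The folder layout and file name are hypothetical; in practice, keep using the GUI and its default folder.

```python
import csv
import myo  # myo-python bindings; requires the Thalmic Myo SDK

class EmgRecorder(myo.DeviceListener):
    """Collect raw 8-channel EMG readings while the hub is running."""
    def __init__(self):
        self.samples = []

    def on_connected(self, event):
        event.device.stream_emg(True)       # enable raw EMG streaming

    def on_emg(self, event):
        self.samples.append(event.emg)      # one reading = 8 integers

myo.init()                                  # assumes the Myo SDK library is discoverable
hub = myo.Hub()
recorder = EmgRecorder()
hub.run(recorder.on_event, 2000)            # record for roughly 2 seconds

# Hypothetical default folder and naming scheme; match whatever the GUI expects.
with open("data/gesture_01_take_01.csv", "w", newline="") as f:
    csv.writer(f).writerows(recorder.samples)
```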

Once all 12 preset gestures are recorded and saved properly, press the "Train" button to train the recognition model. The trained classifier will be saved in the same folder.

Then you can move on to prediction. If you have already trained your classifier, you can skip training and initialize prediction directly.
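As a rough sketch of the training step, assuming the recordings can be loaded as fixed-length 8-channel windows, a small Keras classifier could be trained and saved as "CL.h5" like this. The file layout and network are illustrative, not the actual GO MYO model.

```python
import glob
import numpy as np
from tensorflow import keras

NUM_GESTURES = 12
WINDOW = 200       # assumed samples kept per recording
CHANNELS = 8       # Myo armband EMG channels

# Hypothetical layout: data/gesture_<label>_take_<n>.csv, one EMG reading per row.
X, y = [], []
for label in range(NUM_GESTURES):
    for path in glob.glob(f"data/gesture_{label:02d}_take_*.csv"):
        emg = np.loadtxt(path, delimiter=",")
        X.append(emg[:WINDOW])              # truncate to a fixed window length
        y.append(label)
X = np.asarray(X, dtype="float32")
y = np.asarray(y)

# Small 1-D CNN as a stand-in classifier; the actual GO MYO model may differ.
model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW, CHANNELS)),
    keras.layers.Conv1D(32, 5, activation="relu"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=20, batch_size=16, validation_split=0.2)

model.save("CL.h5")   # the file name the prediction step loads
```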

📝 Predicting

Initialize the prediction model with the trained data (weights, default options, etc.) saved as "CL.h5". Then connect the MYO armband and start predicting. To evaluate real-time performance, check the LCD module, which shows the duration in milliseconds between the beginning of the gesture and the generation of the result.
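A minimal sketch of what the prediction loop might look like, assuming the myo-python 1.0 API and a Keras model saved as "CL.h5". The window length and the timing logic mirror the idea behind the LCD readout (milliseconds from gesture onset to result) but are illustrative only.

```python
import time
import numpy as np
import myo
from tensorflow import keras

WINDOW = 200                                   # assumed samples per prediction window
model = keras.models.load_model("CL.h5")       # trained weights from the Training step

class Predictor(myo.DeviceListener):
    def __init__(self):
        self.buffer, self.start = [], None

    def on_connected(self, event):
        event.device.stream_emg(True)          # enable raw EMG streaming

    def on_emg(self, event):
        if self.start is None:
            self.start = time.perf_counter()   # gesture / signal collection begins
        self.buffer.append(event.emg)
        if len(self.buffer) >= WINDOW:
            window = np.asarray(self.buffer, dtype="float32")[None, ...]
            gesture = int(model.predict(window, verbose=0).argmax())
            latency_ms = (time.perf_counter() - self.start) * 1000
            print(f"gesture {gesture}  latency {latency_ms:.0f} ms")
            self.buffer, self.start = [], None

myo.init()                                     # assumes the Myo SDK library is discoverable
hub = myo.Hub()
listener = Predictor()
while hub.run(listener.on_event, 100):
    pass
```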

Reference

% Python bindings for the Myo SDK: @NiklasRosenstein | myo-python

% You can check the media files of the real-time experiment (rawVideo.mp4, ba0.gif, etc.).