
Authors
Farshid AMIRABDOLLAHIAN, Michael WALTERS, Rory HEFFERNAN, Sarah FLETCHER, Phil WEBB
Abstract
With technological advances in sensing human motion, and its potential to drive and control mechanical interfaces remotely, a multitude of input mechanisms can be used to link actions between the human and the robot. In this study we explored the feasibility of using the human arm's myoelectric signals to identify a number of gestures automatically. We trained a k-nearest neighbours (kNN) classifier in a machine-learning pipeline to recognise gestures, achieving an accuracy of around 65%. This indicates potential feasibility while highlighting areas for improvement in both the accuracy and the utility/usability of such approaches.
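The classification rule the abstract refers to can be sketched minimally: a kNN classifier assigns a gesture label by majority vote among the k training samples nearest to a query feature vector. The sketch below uses only stdlib Python and synthetic data; the channel count, feature choice (per-channel mean activation), gesture labels, and distance metric are illustrative assumptions, not details taken from the study.

```python
import random
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training samples.

    `train` is a list of (feature_vector, label) pairs; squared Euclidean
    distance is used for ranking (monotone in Euclidean distance).
    """
    ranked = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), label)
        for x, label in train
    )
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical toy data: 4-channel myoelectric feature vectors (e.g. mean
# activation per channel) for two illustrative gestures, "fist" and "open".
random.seed(0)
train = (
    [([random.gauss(0.8, 0.1) for _ in range(4)], "fist") for _ in range(20)]
    + [([random.gauss(0.2, 0.1) for _ in range(4)], "open") for _ in range(20)]
)

query = [0.75, 0.85, 0.80, 0.70]   # resembles the "fist" cluster
print(knn_predict(train, query, k=5))
```

In practice, reported accuracies like the 65% in this study would come from evaluating such a classifier on held-out gesture recordings rather than on the training data.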