
Foot Gestures Recognition for Controlling a 3D-Printed Prosthetic Hand

Project ID: 5874-1-21
Year: 2021
Student/s: Sammy Apsel, Da-el Klang
Supervisor/s: Shunit Polinsky

Today's prosthetic hands are commonly based on reading an EMG signal from the stump area. These solutions aren't suitable for all amputees, since they are expensive, tend to suffer from noise, and can cause phantom pain due to the stimulation of atrophied muscles.
This work addresses foot-gesture recognition for controlling a 3D-printed prosthetic hand. The goal is to build a lightweight, user-friendly system through which the user can control the prosthesis. Specifically, our solution is based on two inertial sensors: one placed on the center of the foot and the other on the shank.
These sensors are connected to a small, inexpensive computer, and we use machine learning algorithms to train models that classify six different foot gestures. Ultimately, the goal is to minimize false positives while maximizing true positives.
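To illustrate the kind of pipeline described above, here is a minimal sketch in which windowed statistics from the two IMU streams feed an off-the-shelf classifier for the six gestures. The window length, channel count, feature set, and the RandomForest model are illustrative assumptions, not details taken from the project.

```python
# Minimal sketch (not the project's actual code): windowed features from two IMUs
# feeding a generic scikit-learn classifier for six foot gestures.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

N_GESTURES = 6    # six foot gestures, as described in the project
WINDOW = 50       # samples per window (assumed)
N_CHANNELS = 12   # 2 IMUs x (3-axis accel + 3-axis gyro), an assumption

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel statistics over one window of IMU samples."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.min(axis=0),
                           window.max(axis=0)])

# Placeholder data standing in for recorded foot/shank IMU windows.
rng = np.random.default_rng(0)
raw_windows = rng.normal(size=(600, WINDOW, N_CHANNELS))
labels = rng.integers(0, N_GESTURES, size=600)

X = np.stack([extract_features(w) for w in raw_windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Per-gesture false and true positives can be read off the confusion matrix,
# matching the stated goal of minimizing false positives.
print(confusion_matrix(y_test, clf.predict(X_test)))
```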

Poster: Foot Gestures Recognition for Controlling a 3D-Printed Prosthetic Hand
Collaborators:
Haifa-3D