Hand prostheses with embedded control supporting multiple gesture types can reach high prices, and they offer limited flexibility and few patient-specific adjustments. With the introduction of 3D printing to this problem, many open-source prosthesis designs have emerged, most of them supporting a single type of hand movement with limited control of the hand. In this research we aim to develop an accessible 3D-printed hand with a control mechanism that is reliable, portable, intuitive, and operates in real time. The prosthesis used was an upgraded version of an open-source hand from e-NABLE, modified to be powered by three servo motors rather than by the movement of the hand stump. The offline classification algorithm was developed in MATLAB; it was designed to meet the real-time (less than 250 ms) and reliability requirements while taking the low sampling rate into account. Different features and machine learning methods were tested to maximize accuracy, and different recording protocols were tested to make the learning process short but efficient. The real-time implementation of the algorithm was done in Python on an Intel Edison. The sensor used was the MYO armband, connected to the Edison via Bluetooth.
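To illustrate the kind of real-time constraint described above, the following is a minimal Python sketch of a sliding-window EMG classification step. It is purely illustrative and not the authors' actual pipeline: the feature (mean absolute value per channel) and the nearest-centroid classifier are common low-cost choices assumed here, and the centroids and window data are toy values. The Myo armband streams 8 EMG channels at 200 Hz, so a 250 ms latency budget limits the window to roughly 50 samples.

```python
# Hypothetical sketch of one classification step for a sliding EMG window.
# Names, parameters, and centroids are illustrative, not the paper's code.
import random
import statistics

FS = 200          # Myo EMG sampling rate (Hz)
WINDOW = 40       # 40 samples = 200 ms, under the 250 ms latency budget
CHANNELS = 8      # Myo armband EMG channels

def mav_features(window):
    """Mean absolute value per channel -- a common low-cost EMG feature."""
    return [statistics.fmean(abs(s) for s in ch) for ch in window]

def nearest_centroid(features, centroids):
    """Classify by squared Euclidean distance to per-gesture centroids."""
    def dist(gesture):
        return sum((f - m) ** 2 for f, m in zip(features, centroids[gesture]))
    return min(centroids, key=dist)

# Toy "trained" centroids for two gestures (purely illustrative values).
centroids = {"open": [5.0] * CHANNELS, "close": [20.0] * CHANNELS}

# Simulated 200 ms window of high-activation EMG noise.
random.seed(0)
window = [[random.gauss(0, 25) for _ in range(WINDOW)] for _ in range(CHANNELS)]

print(nearest_centroid(mav_features(window), centroids))
```

In an online loop, each new window would be featurized and classified like this before the next window arrives, keeping end-to-end latency within the stated 250 ms budget.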