Cell phones are with us almost anywhere and at any time.
We do everything with them: we run with them, navigate with them while driving, and communicate with the world through them. However, we still have to tell them what we want to do, and it is sometimes inconvenient to operate them while we are busy performing other activities.
It would be much easier if the device could recognize the activity we are performing and automatically carry out pre-defined actions according to our preferences, without us having to explicitly ask for it at that moment.
In this project we developed a system that is the first step toward such a scenario: identifying the user's activity in real time.
Our project targets devices running the Android operating system. Using the device's accelerometer, we classify what activity the user is performing at that moment.
We currently distinguish between four activities: running, walking, using stairs (going up or down, without distinguishing between the two), and driving.
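As a rough sketch of how the raw acceleration data can be collected on Android, the following Activity registers a SensorEventListener on the accelerometer and receives x/y/z samples as they arrive. The class name, the sampling rate (SENSOR_DELAY_GAME), and the buffering comment are illustrative assumptions, not the project's actual code.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class AccelerometerReader extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private Sensor accelerometer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // SENSOR_DELAY_GAME (~50 Hz) is a common choice for activity recognition (assumed here).
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Stop listening when the app is not in the foreground to save battery.
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Acceleration along x, y, z in m/s^2 (gravity included).
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // Samples would be buffered into fixed-size windows and passed to the classifier.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```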
The system relies on an existing database, or on one recorded by the user, and uses classifiers that implement machine-learning algorithms to identify the activity currently being performed.
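To illustrate the classification step, the minimal sketch below extracts simple statistical features from a window of accelerometer samples and labels the window by comparing it against previously recorded, labelled examples. The feature set (mean and standard deviation of the acceleration magnitude) and the 1-nearest-neighbour rule are assumptions for illustration only; the project's actual features and classifiers may differ.

```java
import java.util.ArrayList;
import java.util.List;

public class ActivityClassifier {

    /** A labelled training example: feature vector plus activity name. */
    public static class Example {
        final double[] features;
        final String label; // e.g. "RUNNING", "WALKING", "STAIRS", "DRIVING"

        Example(double[] features, String label) {
            this.features = features;
            this.label = label;
        }
    }

    private final List<Example> trainingSet = new ArrayList<>();

    /** Adds a labelled window (from an existing database or one recorded by the user). */
    public void addExample(double[] features, String label) {
        trainingSet.add(new Example(features, label));
    }

    /** Mean and standard deviation of the acceleration magnitude over one window of x/y/z samples. */
    public static double[] extractFeatures(float[][] window) {
        double sum = 0, sumSq = 0;
        for (float[] sample : window) {
            double magnitude = Math.sqrt(sample[0] * sample[0]
                    + sample[1] * sample[1] + sample[2] * sample[2]);
            sum += magnitude;
            sumSq += magnitude * magnitude;
        }
        double mean = sum / window.length;
        double variance = sumSq / window.length - mean * mean;
        return new double[] { mean, Math.sqrt(Math.max(variance, 0)) };
    }

    /** Returns the label of the closest training example (1-nearest-neighbour rule). */
    public String classify(double[] features) {
        String best = "UNKNOWN";
        double bestDistance = Double.MAX_VALUE;
        for (Example example : trainingSet) {
            double distance = 0;
            for (int i = 0; i < features.length; i++) {
                double diff = features[i] - example.features[i];
                distance += diff * diff;
            }
            if (distance < bestDistance) {
                bestDistance = distance;
                best = example.label;
            }
        }
        return best;
    }
}
```

In this sketch, windows coming from the accelerometer listener would be passed through extractFeatures and then classify, producing one activity label per window in real time.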