Human Activity Recognition

This page provides tools, data sets, and results for the automatic recognition of human activities (e.g., walking) as well as activities of daily living (e.g., watering plants).


Our aim is to develop robust activity recognition methods based on mobile device sensors that produce high-quality results in real-world settings. This section provides our apps, including source code and documentation. We provide a comprehensive data collector app and are currently developing a HAR app for everyday life.

Data Sets

The data set covers the acceleration, GPS, gyroscope, light, magnetic field, and sound level data of the activities climbing stairs down and up, jumping, lying, standing, sitting, running/jogging, and walking of fifteen subjects. For each activity, the on-body positions chest, forearm, head, shin, thigh, upper arm, and waist were simultaneously recorded.
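As a sketch of how the labels above might be handled in code, the following maps the eight activities and seven on-body positions listed in the data set description to integer IDs; the encoding itself is an assumption, a common first step before training, not part of the published data format.

```python
# Activities and on-body positions taken from the data set description.
# The integer encoding is illustrative, not the data set's own scheme.
ACTIVITIES = ["climbing stairs down", "climbing stairs up", "jumping",
              "lying", "standing", "sitting", "running/jogging", "walking"]
POSITIONS = ["chest", "forearm", "head", "shin", "thigh", "upper arm", "waist"]

activity_to_id = {a: i for i, a in enumerate(ACTIVITIES)}
position_to_id = {p: i for i, p in enumerate(POSITIONS)}
```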


For processing the recorded sensor data, we developed useful frameworks. Our frameworks are available, free to use, and can be modified or enhanced without further requests. Currently, we focus on deriving useful information from accelerometer data, i.e., segmenting the data stream and computing meaningful features.
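The two steps named above, segmenting the stream and computing features, can be sketched as follows. This is a minimal illustration, not the frameworks' actual API: the window size, overlap, and feature set are assumptions.

```python
import numpy as np

def sliding_windows(signal, window_size, overlap=0.5):
    """Segment a 1-D sensor stream into fixed-size, overlapping windows."""
    step = int(window_size * (1 - overlap))
    return [signal[i:i + window_size]
            for i in range(0, len(signal) - window_size + 1, step)]

def window_features(window):
    """Compute simple time-domain features for one window."""
    w = np.asarray(window, dtype=float)
    return {
        "mean": w.mean(),
        "std": w.std(),
        "min": w.min(),
        "max": w.max(),
        "energy": np.sum(w ** 2) / len(w),
    }

# Example: 5 s of synthetic 50 Hz accelerometer data,
# segmented into 2 s windows with 50 % overlap.
stream = np.sin(np.linspace(0, 10 * np.pi, 250))
windows = sliding_windows(stream, window_size=100, overlap=0.5)
features = [window_features(w) for w in windows]
```

Each window then yields one feature vector, which can serve as input to a classifier.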