Robust Activity Recognition using Wearable IMU Sensors
Yashaswini Prathivadi, Terrell R. Bennett, Roozbeh Jafari

Citation
Yashaswini Prathivadi, Terrell R. Bennett, Roozbeh Jafari. "Robust Activity Recognition using Wearable IMU Sensors". Sensors 2014, IEEE, 2, November, 2014.

Abstract
In this paper, an orientation transformation (OT) algorithm is presented that increases the effectiveness of performing activity recognition using body sensor networks (BSNs). One of the main limitations of current recognition systems is the requirement of maintaining a known, or original, orientation of the sensor on the body. The proposed OT algorithm overcomes this limitation by transforming the sensor data into the original orientation framework such that orientation-dependent recognition algorithms can still be used to perform activity recognition irrespective of the sensor's orientation on the body. The approach is tested on an orientation-dependent activity recognition system based on dynamic time warping (DTW). The DTW algorithm is used to detect the activities after the data is transformed by OT. The precision and recall of activity recognition for five subjects and five movements were observed to range from 74% to 100% and from 83% to 100%, respectively. The correlation coefficient between the transformed data and the data from the original orientation is above 0.94 on axes with well-defined patterns.
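For readers unfamiliar with the two building blocks named in the abstract, the sketch below illustrates, in Python, the general idea of re-expressing IMU samples in a reference orientation frame and then matching the result against movement templates with dynamic time warping. It is a minimal illustration under assumed rotation-matrix conventions, not the authors' implementation; all function and variable names are hypothetical.

# Minimal sketch (not the paper's implementation) of orientation
# re-mapping followed by DTW-based template matching.
import numpy as np

def transform_to_reference(samples, r_current, r_reference):
    """Re-express 3-axis samples recorded in the current sensor frame
    in the original ("reference") sensor frame.

    samples     : (N, 3) array of accelerometer or gyroscope readings
    r_current   : (3, 3) rotation matrix, current sensor frame -> world frame
    r_reference : (3, 3) rotation matrix, reference sensor frame -> world frame
    """
    relative = r_reference.T @ r_current   # current frame -> reference frame
    return samples @ relative.T            # rotate each row vector

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def recognize(sample_seq, templates):
    """Return the label of the template closest to sample_seq under DTW."""
    return min(templates, key=lambda label: dtw_distance(sample_seq, templates[label]))

For example, given a dictionary of per-axis movement templates such as {"sit-to-stand": ..., "kick": ...}, recognize() would return the label whose template has the smallest DTW distance to the transformed signal; the actual system evaluated in the paper detects five movements across five subjects.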

Citation formats  
  • HTML
    Yashaswini Prathivadi, Terrell R. Bennett, Roozbeh Jafari.
    <a href="http://www.terraswarm.org/pubs/340.html">Robust
    Activity Recognition using Wearable IMU Sensors</a>,
    Sensors 2014, IEEE, 2, November, 2014.
  • Plain text
    Yashaswini Prathivadi, Terrell R. Bennett, Roozbeh Jafari.
    "Robust Activity Recognition using Wearable IMU
    Sensors". Sensors 2014, IEEE, 2, November, 2014.
  • BibTeX
    @inproceedings{PrathivadiBennettJafari14_RobustActivityRecognitionUsingWearableIMUSensors,
        author = {Yashaswini Prathivadi and Terrell R. Bennett and
                  Roozbeh Jafari},
        title = {Robust Activity Recognition using Wearable IMU
                  Sensors},
        booktitle = {Sensors 2014},
        organization = {IEEE},
        day = {2},
        month = {November},
        year = {2014},
        abstract = {In this paper, an orientation transformation (OT)
                  algorithm is presented that increases the
                  effectiveness of performing activity recognition
                  using body sensor networks (BSNs). One of the main
                  limitations of current recognition systems is the
                  requirement of maintaining a known, or original,
                  orientation of the sensor on the body. The
                  proposed OT algorithm overcomes this limitation by
                  transforming the sensor data into the original
                  orientation framework such that
                  orientation-dependent recognition algorithms can
                  still be used to perform activity recognition
                  irrespective of the sensor's orientation on the
                  body. The approach is tested on an
                  orientation-dependent activity recognition system
                  based on dynamic time warping (DTW). The DTW
                  algorithm is used to detect the activities after
                  the data is transformed by OT. The precision and
                  recall of activity recognition for five subjects
                  and five movements were observed to range from 74%
                  to 100% and from 83% to 100%, respectively. The
                  correlation coefficient between the transformed
                  data and the data from the original orientation is
                  above 0.94 on axes with well-defined patterns.},
        URL = {http://terraswarm.org/pubs/340.html}
    }
    

Posted by Barb Hoversten on 28 Jul 2014.

Notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright.