Zero-Effort Camera-Assisted Calibration Techniques for Wearable Motion Sensors
Jian Wu, Roozbeh Jafari

Citation
Jian Wu, Roozbeh Jafari. "Zero-Effort Camera-Assisted Calibration Techniques for Wearable Motion Sensors". International Conference on Cyber-Physical Systems (ACM/IEEE ICCPS), 14 April 2014.

Abstract
Activity recognition using wearable motion sensors, also known as inertial measurement units (IMUs), plays an important role in wellness and healthcare monitoring applications. The design of activity recognition algorithms typically requires information about the placement and orientation of the IMUs on the body, and the signal processing is often designed, fitted and trained to work with a known orientation of the sensors. The signal processing techniques and the placement of the sensors that capture physical movements are viewed as the cyber and physical systems, respectively. These two systems must remain tightly coupled to ensure the correct functionality of the activity recognition algorithms, making this an ideal setting for cyber-physical systems (CPS). If the sensors are accidentally moved or displaced, it is important to identify the new location and orientation so that the signal processing algorithms can be adjusted. This step, often called calibration or recalibration, requires extra effort on the part of the user to either perform a set of known movements or enter information about the placement of the sensors manually. In this paper, we propose a camera-assisted calibration approach that requires no extra effort from the user. Calibration is performed whenever the user appears in front of the camera and performs an arbitrary activity of their choice (e.g., walking in front of the camera). Our approach constructs simulated IMU orientation readings, leveraging information acquired by a camera system. A two-step search algorithm is proposed to calibrate the sensor orientation by searching for the minimum rotation distance between the simulated and the real IMU readings. The novelty of our proposed methodology is in enabling seamless calibration techniques for wearable computers. Our experimental results show that the orientation of the sensors can be acquired with root mean square errors (RMSE) of 5.59 and 10.73 degrees across 6 daily movements when the sensor is placed on the thigh and the arm, respectively.
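
To make the two-step search concrete, the following is a minimal sketch of a coarse-to-fine grid search for the fixed sensor-mounting rotation that minimizes the mean rotation distance between camera-derived (simulated) and measured IMU orientations. The quaternion conventions, the step sizes, the offset-on-the-right mounting model, and all function names here are illustrative assumptions, not the authors' published implementation.

    # Hypothetical coarse-to-fine search for a fixed mounting rotation;
    # conventions and step sizes are assumptions for illustration only.
    import numpy as np

    def euler_to_quat(roll, pitch, yaw):
        # Z-Y-X Euler angles (radians) to a unit quaternion (w, x, y, z).
        cr, sr = np.cos(roll / 2), np.sin(roll / 2)
        cp, sp = np.cos(pitch / 2), np.sin(pitch / 2)
        cy, sy = np.cos(yaw / 2), np.sin(yaw / 2)
        return np.array([cy*cp*cr + sy*sp*sr,
                         cy*cp*sr - sy*sp*cr,
                         cy*sp*cr + sy*cp*sr,
                         sy*cp*cr - cy*sp*sr])

    def quat_mul(a, b):
        # Hamilton product of two quaternions (w, x, y, z).
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def rotation_distance(qa, qb):
        # Angle (radians) of the relative rotation between two orientations.
        return 2.0 * np.arccos(np.clip(abs(np.dot(qa, qb)), 0.0, 1.0))

    def mean_distance(offset, q_sim, q_imu):
        # Mean rotation distance after applying a candidate mounting offset,
        # modeled here as a right-multiplied sensor-to-segment rotation.
        return np.mean([rotation_distance(qs, quat_mul(qi, offset))
                        for qs, qi in zip(q_sim, q_imu)])

    def two_step_search(q_sim, q_imu, coarse_step=30.0, fine_step=5.0):
        # Step 1: coarse grid over Euler angles; step 2: fine grid around
        # the best coarse candidate. Returns the best offset quaternion and
        # the residual mean rotation distance in degrees.
        def grid_search(center, half_range, step):
            best, best_cost = center, np.inf
            deltas = np.arange(-half_range, half_range + step, step)
            for dr in deltas:
                for dp in deltas:
                    for dy in deltas:
                        cand = (center[0] + dr, center[1] + dp, center[2] + dy)
                        q = euler_to_quat(*np.radians(cand))
                        cost = mean_distance(q, q_sim, q_imu)
                        if cost < best_cost:
                            best, best_cost = cand, cost
            return best, best_cost

        coarse, _ = grid_search((0.0, 0.0, 0.0), 180.0, coarse_step)
        fine, cost = grid_search(coarse, coarse_step, fine_step)
        return euler_to_quat(*np.radians(fine)), np.degrees(cost)

Given two time-synchronized sequences of unit quaternions, one simulated from the camera and one measured by the IMU, two_step_search returns an estimate of the mounting rotation and the residual alignment error in degrees.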

Citation formats  
  • HTML
    Jian Wu, Roozbeh Jafari. <a
    href="http://www.terraswarm.org/pubs/233.html">Zero-Effort
    Camera-Assisted Calibration Techniques for Wearable Motion
    Sensors</a>, International Conference on Cyber-Physical
    Systems (ACM/IEEE ICCPS), 14 April 2014.
  • Plain text
    Jian Wu, Roozbeh Jafari. "Zero-Effort Camera-Assisted
    Calibration Techniques for Wearable Motion Sensors".
    International Conference on Cyber-Physical Systems (ACM/IEEE
    ICCPS), 14 April 2014.
  • BibTeX
    @inproceedings{WuJafari14_ZeroEffortCameraAssistedCalibrationTechniquesForWearable,
        author = {Jian Wu and Roozbeh Jafari},
        title = {Zero-Effort Camera-Assisted Calibration Techniques
                  for Wearable Motion Sensors},
        booktitle = {International Conference on Cyber-Physical Systems
                  (ACM/IEEE ICCPS)},
        day = {14},
        month = {April},
        year = {2014},
        abstract = {Activity recognition using wearable motion
                  sensors, also known as inertial measurement units
                  (IMUs), plays an important role in wellness and
                  healthcare monitoring applications. The design of
                  activity recognition algorithms typically requires
                  information about the placement and orientation of
                  the IMUs on the body, and the signal processing is
                  often designed, fitted and trained to work with a
                  known orientation of the sensors. The signal
                  processing techniques and the placement of the
                  sensors that capture physical movements are viewed
                  as the cyber and physical systems, respectively.
                  These two systems must remain tightly coupled to
                  ensure the correct functionality of the activity
                  recognition algorithms, making this an ideal
                  setting for cyber-physical systems (CPS). If the
                  sensors are accidentally moved or displaced, it is
                  important to identify the new location and
                  orientation so that the signal processing
                  algorithms can be adjusted. This step, often
                  called calibration or recalibration, requires
                  extra effort on the part of the user to either
                  perform a set of known movements or enter
                  information about the placement of the sensors
                  manually. In this paper, we propose a
                  camera-assisted calibration approach that requires
                  no extra effort from the user. Calibration is
                  performed whenever the user appears in front of
                  the camera and performs an arbitrary activity of
                  their choice (e.g., walking in front of the
                  camera). Our approach constructs simulated IMU
                  orientation readings, leveraging information
                  acquired by a camera system. A two-step search
                  algorithm is proposed to calibrate the sensor
                  orientation by searching for the minimum rotation
                  distance between the simulated and the real IMU
                  readings. The novelty of our proposed methodology
                  is in enabling seamless calibration techniques for
                  wearable computers. Our experimental results show
                  that the orientation of the sensors can be
                  acquired with root mean square errors (RMSE) of
                  5.59 and 10.73 degrees across 6 daily movements
                  when the sensor is placed on the thigh and the
                  arm, respectively.},
        URL = {http://terraswarm.org/pubs/233.html}
    }
    

Posted by Barb Hoversten on 11 Dec 2013.

Notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright.