MotionSynthesis Toolset (MoST): An Open Source Tool and Dataset for Human Motion Data Synthesis and Validation
Terrell R. Bennett, Hunter C. Massey, Jian Wu, Syed Ali Hasnain, Roozbeh Jafari

Citation
Terrell R. Bennett, Hunter C. Massey, Jian Wu, Syed Ali Hasnain, Roozbeh Jafari. "MotionSynthesis Toolset (MoST): An Open Source Tool and Dataset for Human Motion Data Synthesis and Validation". IEEE Sensors Journal, May 2016.

Abstract
Access to experimental data during the development of algorithms and techniques for wearable computing devices and body sensor networks (BSNs) allows faster validation and refinement of those algorithms. The MotionSynthesis Toolset (MoST) is an open source toolset and database built to assist in data collection and data sharing, and to enable collaborative review and validation of datasets. The tools can generate a sequence of movements and synthesize a data stream based on the data stored in the database. The community can add new movements to both the database and the tools. The tools also allow visualization and validation of the movements and data with video and signal waveforms. The dataset includes more than twenty subjects, with multiple repetitions of each movement per subject to increase data diversity.
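The abstract's workflow of generating a movement sequence and synthesizing a data stream from stored recordings can be illustrated with a toy sketch. All names, data, and functions below are hypothetical illustrations of the idea, not the actual MoST API:

```python
import random

# Hypothetical illustration of movement-sequence synthesis in the spirit
# of MoST: a "database" maps movement names to recorded sensor samples,
# and a synthesized stream is built by concatenating the recordings for a
# chosen sequence of movements. Names and values here are invented.

MOVEMENT_DB = {
    "sit_to_stand": [0.1, 0.4, 0.9, 0.5],   # toy accelerometer samples
    "walk":         [0.2, 0.3, 0.2, 0.3],
    "stand_to_sit": [0.5, 0.9, 0.4, 0.1],
}

def generate_sequence(db, length, rng=None):
    """Pick a random sequence of movement names from the database."""
    rng = rng or random.Random(0)
    return [rng.choice(sorted(db)) for _ in range(length)]

def synthesize_stream(db, sequence):
    """Concatenate recordings; return (samples, per-sample labels)."""
    samples, labels = [], []
    for name in sequence:
        samples.extend(db[name])
        labels.extend([name] * len(db[name]))
    return samples, labels

seq = generate_sequence(MOVEMENT_DB, 3)
stream, labels = synthesize_stream(MOVEMENT_DB, seq)
```

The per-sample labels are what make such synthesized streams useful for validation: an algorithm's segmentation output can be compared directly against the known movement boundaries.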

Electronic downloads

Internal. This publication has been marked by the author for TerraSwarm-only distribution, so electronic downloads are not available without logging in.
Citation formats  
  • HTML
    Terrell R. Bennett, Hunter C. Massey, Jian Wu, Syed Ali
    Hasnain, Roozbeh Jafari. <a
    href="http://www.terraswarm.org/pubs/785.html"
    >MotionSynthesis Toolset (MoST): An Open Source Tool and
    Dataset for Human Motion Data Synthesis and
    Validation</a>, <i>IEEE Sensors
    Journal</i>, May 2016.
  • Plain text
    Terrell R. Bennett, Hunter C. Massey, Jian Wu, Syed Ali
    Hasnain, Roozbeh Jafari. "MotionSynthesis Toolset
    (MoST): An Open Source Tool and Dataset for Human Motion
    Data Synthesis and Validation". IEEE Sensors
    Journal, May 2016.
  • BibTeX
    @article{BennettMasseyWuHasnainJafari16_MotionSynthesisToolsetMoSTOpenSourceToolDatasetFor,
        author = {Terrell R. Bennett and Hunter C. Massey and Jian
                  Wu and Syed Ali Hasnain and Roozbeh Jafari},
        title = {MotionSynthesis Toolset (MoST): An Open Source
                  Tool and Dataset for Human Motion Data Synthesis
                  and Validation},
        journal = {IEEE Sensors Journal},
        month = {May},
        year = {2016},
        abstract = {Access to experimental data in the development of
                  algorithms and techniques for wearable computing
                  devices and body sensor networks (BSNs) allows
                  faster validation and refinement of algorithms.
                  The MotionSynthesis Toolset (MoST) is an open
                  source toolset and database built to assist in
                  data collection, data sharing and allow
                  collaboration in review and validation of
                  datasets. The tools can generate a sequence of
                  movements and synthesize a data stream based on
                  the data stored in the database. New movements can
                  be added to the database and the tools by the
                  community. The tools also allow visualization and
                  validation of the movements and data with video
                  and signal waveforms. The dataset has more than
                  twenty subjects and multiple repetitions of the
                  movements from each subject to increase data
                  diversity.},
        URL = {http://terraswarm.org/pubs/785.html}
    }
    

Posted by Elizabeth Coyne on 3 May 2016.

Notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright.