Stochastic Variational Inference for Hidden Markov Models
Nick Foti, Jason Xu, Dillon Laird, Emily B. Fox

Citation
Nick Foti, Jason Xu, Dillon Laird, Emily B. Fox. "Stochastic Variational Inference for Hidden Markov Models". Neural Information Processing Systems (NIPS), 8 December 2014.

Abstract
Variational inference algorithms have proven successful for Bayesian analysis in large data settings, with recent advances using stochastic variational inference (SVI). However, such methods have largely been studied in independent or exchangeable data settings. We develop an SVI algorithm to learn the parameters of hidden Markov models (HMMs) in a time-dependent data setting. The challenge in applying stochastic optimization in this setting arises from dependencies in the chain, which must be broken to consider minibatches of observations. We propose an algorithm that harnesses the memory decay of the chain to adaptively bound errors arising from edge effects. We demonstrate the effectiveness of our algorithm on synthetic experiments and a large genomics dataset where a batch algorithm is computationally infeasible.
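
To make the abstract's buffering idea concrete, the following is a minimal sketch (in Python with NumPy/SciPy) of one stochastic update of the kind described: sample a subchain of L observations, pad it with B buffer observations on each side so that forward-backward edge effects decay before reaching the retained region, and rescale the resulting statistics to the full chain. The function names, the fixed buffer length B, the uniform subchain initialization, and the update rule in the closing comment are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np
    from scipy.special import logsumexp

    def forward_backward(log_lik, log_A, log_pi):
        # Smoothed state marginals for one (buffered) subchain via the
        # standard forward-backward recursions in log space.
        T, K = log_lik.shape
        alpha = np.zeros((T, K))
        beta = np.zeros((T, K))
        alpha[0] = log_pi + log_lik[0]
        for t in range(1, T):
            alpha[t] = log_lik[t] + logsumexp(alpha[t - 1][:, None] + log_A, axis=0)
        for t in range(T - 2, -1, -1):
            beta[t] = logsumexp(log_A + (log_lik[t + 1] + beta[t + 1])[None, :], axis=1)
        log_post = alpha + beta
        return np.exp(log_post - logsumexp(log_post, axis=1, keepdims=True))

    def buffered_subchain_stats(y, u, L, B, log_A, log_pi, loglik_fn):
        # Marginals for the L observations starting at index u, computed on
        # a subchain padded with up to B observations per side. Breaking the
        # chain introduces error at the subchain ends; the buffers absorb it
        # and are then discarded.
        lo, hi = max(0, u - B), min(len(y), u + L + B)
        gamma = forward_backward(loglik_fn(y[lo:hi]), log_A, log_pi)
        return gamma[u - lo : u - lo + L]

    # Toy usage on a simulated 2-state HMM with unit-variance Gaussian
    # emissions (all parameter values here are hypothetical).
    rng = np.random.default_rng(0)
    K, T, L, B = 2, 10_000, 50, 10
    A = np.array([[0.95, 0.05], [0.10, 0.90]])
    mu = np.array([-1.0, 2.0])
    z = np.zeros(T, dtype=int)
    for t in range(1, T):
        z[t] = rng.choice(K, p=A[z[t - 1]])
    y = rng.normal(mu[z], 1.0)

    loglik_fn = lambda x: -0.5 * (x[:, None] - mu[None, :]) ** 2
    u = int(rng.integers(0, T - L))
    gamma = buffered_subchain_stats(y, u, L, B, np.log(A),
                                    np.log(np.full(K, 1.0 / K)), loglik_fn)
    # The retained marginals give noisy sufficient statistics; rescaled by
    # T / L, they would drive a stochastic natural-gradient step such as
    #   lam <- (1 - rho) * lam + rho * (prior + (T / L) * s_hat).
    s_hat = gamma.sum(axis=0)

A fixed B here stands in for the paper's adaptive choice, which bounds the edge-effect error using the memory decay of the chain.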

Citation formats  
  • HTML
    Nick Foti, Jason Xu, Dillon Laird, Emily B. Fox.
    <a href="http://www.terraswarm.org/pubs/423.html">Stochastic
    Variational Inference for Hidden Markov Models</a>,
    Neural Information Processing Systems (NIPS), 8 December 2014.
  • Plain text
    Nick Foti, Jason Xu, Dillon Laird, Emily B. Fox.
    "Stochastic Variational Inference for Hidden Markov
    Models". Neural Information Processing Systems (NIPS),
    8 December 2014.
  • BibTeX
    @inproceedings{FotiXuLairdFox14_StochasticVariationalInferenceForHiddenMarkovModels,
        author    = {Nick Foti and Jason Xu and Dillon Laird and Emily B. Fox},
        title     = {Stochastic Variational Inference for Hidden Markov Models},
        booktitle = {Neural Information Processing Systems (NIPS)},
        day       = {8},
        month     = {December},
        year      = {2014},
        abstract  = {Variational inference algorithms have proven successful
                     for Bayesian analysis in large data settings, with
                     recent advances using stochastic variational inference
                     (SVI). However, such methods have largely been studied
                     in independent or exchangeable data settings. We develop
                     an SVI algorithm to learn the parameters of hidden
                     Markov models (HMMs) in a time-dependent data setting.
                     The challenge in applying stochastic optimization in
                     this setting arises from dependencies in the chain,
                     which must be broken to consider minibatches of
                     observations. We propose an algorithm that harnesses the
                     memory decay of the chain to adaptively bound errors
                     arising from edge effects. We demonstrate the
                     effectiveness of our algorithm on synthetic experiments
                     and a large genomics dataset where a batch algorithm is
                     computationally infeasible.},
        URL       = {http://terraswarm.org/pubs/423.html}
    }
    

Posted by Barb Hoversten on 3 Nov 2014.

Notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright.