Probabilistic Inference by Hashing and Optimization
Stefano Ermon

Citation
Stefano Ermon. "Probabilistic Inference by Hashing and Optimization". Talk or presentation, 25, January, 2016.

Abstract
Statistical inference in high-dimensional probabilistic models (i.e., with many variables) is one of the central problems of statistical machine learning and stochastic decision making. To date, only a handful of distinct methods have been developed, most notably Markov chain Monte Carlo (MCMC) sampling, decomposition, and variational methods. In this talk, I will introduce a fundamentally new approach based on random projections and combinatorial optimization. Our approach provides provable guarantees on accuracy and outperforms traditional methods in a range of domains, in particular those involving combinations of probabilistic and causal dependencies (such as those coming from physical laws) among the variables. This allows for a tighter integration between inductive and deductive reasoning, and offers a range of new modeling opportunities.
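
The abstract describes the approach only at a high level. As a rough, illustrative sketch (not the speaker's implementation), the Python snippet below shows the core hashing-and-optimization idea: to estimate a partition function over binary configurations, repeatedly add random parity (XOR) constraints, which roughly halve the feasible space each time, and ask an optimization oracle for the heaviest surviving configuration; combining these constrained optima yields an estimate with the flavor of the provable guarantees mentioned above. The brute-force oracle, the tiny pairwise model, and all names here are assumptions for demonstration; in practice the oracle would be an integer programming or MaxSAT solver.

    # Illustrative sketch only: hashing (random parity constraints) plus an
    # optimization oracle to estimate a partition function. The brute-force
    # oracle and the small pairwise model are assumptions for demonstration.
    import itertools
    import math
    import random
    import statistics

    def weight(x, pairwise):
        # Unnormalized weight of a binary configuration under a small pairwise model.
        energy = sum(w * (1 if x[i] == x[j] else -1) for (i, j), w in pairwise.items())
        return math.exp(energy)

    def constrained_max(n, pairwise, parity_rows):
        # Optimization oracle (brute force here): max weight over x in {0,1}^n
        # satisfying every parity constraint (a, b): sum(a_k * x_k) mod 2 == b.
        best = 0.0
        for x in itertools.product((0, 1), repeat=n):
            if all(sum(ak * xk for ak, xk in zip(a, x)) % 2 == b for a, b in parity_rows):
                best = max(best, weight(x, pairwise))
        return best

    def estimate_log_partition(n, pairwise, repeats=5):
        # For i = 0..n random parity constraints, take the median constrained
        # optimum over several trials and combine them into an estimate of Z.
        total = 0.0
        for i in range(n + 1):
            trials = []
            for _ in range(repeats):
                rows = [([random.randint(0, 1) for _ in range(n)], random.randint(0, 1))
                        for _ in range(i)]
                trials.append(constrained_max(n, pairwise, rows))
            m_i = statistics.median(trials)
            total += m_i if i == 0 else m_i * 2 ** (i - 1)
        return math.log(total) if total > 0 else float("-inf")

    if __name__ == "__main__":
        # Tiny 4-variable model so the exact answer can be checked by enumeration.
        n = 4
        pairwise = {(0, 1): 0.5, (1, 2): -0.3, (2, 3): 0.8}
        exact = math.log(sum(weight(x, pairwise)
                             for x in itertools.product((0, 1), repeat=n)))
        print("exact log Z:    ", round(exact, 3))
        print("estimated log Z:", round(estimate_log_partition(n, pairwise), 3))

On an example this small the estimate typically lands close to the exact value; the appeal of the approach is that each step reduces to an optimization query, which modern combinatorial solvers can answer at scales where exact summation is hopeless.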

Citation formats
  • HTML
    Stefano Ermon. <a href="http://chess.eecs.berkeley.edu/pubs/1168.html"><i>Probabilistic Inference by Hashing and Optimization</i></a>, Talk or presentation, 25 January 2016.
  • Plain text
    Stefano Ermon. "Probabilistic Inference by Hashing and Optimization". Talk or presentation, 25 January 2016.
  • BibTeX
    @misc{Ermon16_ProbabilisticInferenceByHashingOptimization,
        author = {Stefano Ermon},
        title = {Probabilistic Inference by Hashing and Optimization},
        howpublished = {Talk or presentation},
        day = {25},
        month = {January},
        year = {2016},
        abstract = {Statistical inference in high-dimensional
                  probabilistic models (i.e., with many variables)
                  is one of the central problems of statistical
                  machine learning and stochastic decision making.
                  To date, only a handful of distinct methods have
                  been developed, most notably Markov chain Monte
                  Carlo (MCMC) sampling, decomposition, and
                  variational methods. In this talk, I will
                  introduce a fundamentally new approach based on
                  random projections and combinatorial
                  optimization. Our approach provides provable
                  guarantees on accuracy and outperforms
                  traditional methods in a range of domains, in
                  particular those involving combinations of
                  probabilistic and causal dependencies (such as
                  those coming from physical laws) among the
                  variables. This allows for a tighter integration
                  between inductive and deductive reasoning, and
                  offers a range of new modeling opportunities.},
        URL = {http://chess.eecs.berkeley.edu/pubs/1168.html}
    }
    

Posted by Dorsa Sadigh on 17 Feb 2016.
For additional information, see the Publications FAQ or contact webmaster at chess eecs berkeley edu.

Notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright.

©2002-2018 Chess