About
Deterministic (variational) techniques are used throughout machine learning to approximate Bayesian inference for continuous- and hybrid-variable problems. In contrast to the discrete-variable case, surprisingly little is known about their convergence, approximation quality, numerical stability, specific biases, and the differential strengths and weaknesses of the known methods.
In this workshop, we aim to highlight important open problems and to gather ideas for how to address them. The target audience includes practitioners, who can provide insight into and analysis of problems with particular methods or comparative studies of several methods, as well as theoreticians interested in characterizing the hardness of continuous distributions or in proving relevant properties of established methods. We especially welcome contributions from statistics (Markov chain Monte Carlo), information geometry, optimal filtering, and other related fields, provided they make an effort to bridge the gap to variational techniques.
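To make the setting concrete, here is a minimal sketch of the kind of deterministic approximation the workshop is about: fitting a Gaussian variational distribution q(x) = N(m, s^2) to an unnormalized target density by stochastic gradient ascent on the ELBO with the reparameterization trick. The target density, step size, and batch size below are illustrative choices, not taken from any of the talks; the target is itself a Gaussian N(3, 2^2) so that the optimum is known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative unnormalized target log-density: a Gaussian N(3, 2^2),
# chosen so the optimal variational parameters are known in closed form.
def dlogp_dx(x):
    return -(x - 3.0) / 4.0  # derivative of -0.5*((x-3)/2)^2

# Variational family q(x) = N(m, exp(log_s)^2).
# Maximize ELBO(m, log_s) = E_q[log p~(x)] + H[q] via Monte Carlo
# pathwise gradients: x = m + s * eps with eps ~ N(0, 1).
m, log_s = 0.0, 0.0
lr = 0.05
for step in range(2000):
    eps = rng.standard_normal(64)
    s = np.exp(log_s)
    x = m + s * eps                                   # reparameterized samples
    g = dlogp_dx(x)
    grad_m = g.mean()                                 # pathwise gradient w.r.t. m
    grad_log_s = (g * eps * s).mean() + 1.0           # + d entropy / d log_s
    m += lr * grad_m
    log_s += lr * grad_log_s

print(m, np.exp(log_s))  # should approach the target's mean 3 and std 2
```

For this conjugate toy problem the fixed point can be verified analytically (the gradients vanish exactly at m = 3, s = 2); the interest of the workshop's questions is precisely that such guarantees are rare for the non-Gaussian, multimodal targets these methods face in practice.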
Videos
Bounds on the Bethe Free Energy for Gaussian Networks (Feb 1, 2008)
Approximation and Inference using Latent Variable Sparse Linear Models (Feb 1, 2008)
Message-Passing Algorithms for GMRFs and Non-Linear Optimization (Feb 1, 2008)
Approximating the Partition Function by Deleting and then Correcting for Model E... (Dec 31, 2007)
A Completed Information Projection Interpretation of Expectation Propagation (Dec 31, 2007)
Improving on Expectation Propagation (Dec 31, 2007)
Perturbative Corrections to Expectation Consistent Approximate Inference (Dec 31, 2007)
Variational Optimisation by Marginal Matching (Dec 31, 2007)
Introduction to the Workshop (Dec 31, 2007)
Infer.NET - Practical Implementation Issues and a Comparison of Approximation Te... (Dec 31, 2007)
Large-scale Bayesian Inference for Collaborative Filtering (Dec 31, 2007)
