NIPS Workshop on Approximate Bayesian Inference in Continuous/Hybrid Models, Whistler 2007
Deterministic (variational) techniques are used throughout machine learning to approximate Bayesian inference for continuous- and hybrid-variable problems. In contrast to discrete-variable approximations, surprisingly little is known about their convergence, quality of approximation, numerical stability, specific biases, and the differential strengths and weaknesses of known methods.
In this workshop, we aim to highlight important open problems and to gather ideas for how to address them. The target audience includes practitioners who can provide insight into and analysis of problems with particular methods, or comparative studies of several methods, as well as theoreticians interested in characterizing the hardness of continuous distributions or in proving relevant properties of established methods. We especially welcome contributions from statistics (Markov chain Monte Carlo), information geometry, optimal filtering, and other related fields that make an effort to bridge the gap towards variational techniques.
Talks:
- Introduction to the Workshop
- Message-Passing Algorithms for GMRFs and Non-Linear Optimization
- Bounds on the Bethe Free Energy for Gaussian Networks
- Approximation and Inference using Latent Variable Sparse Linear Models
- Improving on Expectation Propagation
- Variational Optimisation by Marginal Matching
- A Completed Information Projection Interpretation of Expectation Propagation
- Approximating the Partition Function by Deleting and then Correcting for Model E...
- Perturbative Corrections to Expectation Consistent Approximate Inference
- Infer.NET - Practical Implementation Issues and a Comparison of Approximation Te...
- Large-scale Bayesian Inference for Collaborative Filtering
