"Non MC methods" usually refer to deterministic methods that provides some quick approximation of the posterior distribution. Historically, a proeminent non-MC method is the well-known Laplace's approximation, but other methods such as Variational Bayes and Expectation-Propagations have emerged reccently in Machine Learning. All these methods are introduced very clearly in e.g. Chris Bishop's book (not available on-line). One issue with these fast approximations is that it is often not easy to assess the approximation error.

## Laplace's approximation

Note that the term "Laplace's approximation" may refer to slightly different things; in machine learning, the most common meaning is that of a way to compute a Gaussian distribution that approximates the posterior distribution, while in Statistics it usually refers to a way to approximate certain posterior moments; see the classical paper of Tierney and Kadane (1986) for the latter. Of course in both cases it amounts to do some Taylor expansion of the log posterior density around the mode.

- Wikipedia has an entry on LA as a general method for approximating integrals, but with few details specific to Statistics.
- David MacKay's book has a short chapter on LA.
- See also the slides by Ryan Guerra and Sargur Srihari.
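The ML version described above can be sketched in a few lines: find the posterior mode, take the second derivative of the log density there, and read off a Gaussian approximation. The Gamma "posterior" below is purely illustrative (not from any of the references), chosen because its mode and curvature are known exactly:

```python
# Laplace's approximation (ML sense), minimal sketch: fit a Gaussian to a
# posterior via a second-order Taylor expansion of the log density at its mode.
# Toy target: an (unnormalised) Gamma(alpha, beta) density, for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

alpha, beta = 5.0, 2.0  # illustrative hyperparameters

def log_post(x):
    # unnormalised Gamma(alpha, beta) log density
    return (alpha - 1.0) * np.log(x) - beta * x

# 1. Find the mode by maximising the log posterior.
res = minimize_scalar(lambda x: -log_post(x), bounds=(1e-6, 20.0), method="bounded")
mode = res.x

# 2. Second derivative of log_post at the mode (central finite differences).
h = 1e-4
d2 = (log_post(mode + h) - 2.0 * log_post(mode) + log_post(mode - h)) / h**2

# 3. Gaussian approximation N(mode, -1/d2).
var = -1.0 / d2
print(mode, var)  # exact values here: mode = (alpha-1)/beta = 2, var = (alpha-1)/beta**2 = 1
```

For this target the Laplace mean and variance happen to match the exact mode and curvature-based variance; for skewed posteriors the Gaussian fit can of course be poor, which is the approximation-error issue mentioned above.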

## INLA

INLA (Integrated Nested Laplace Approximation) is a turbo-charged version of LA that has shown excellent performance (i.e. very small approximation error) when applied to models based on a latent GMRF (Gaussian Markov random field). The main paper on INLA is here, but one may also get started by browsing the r-inla package web-site, which includes in particular an introduction to INLA and to the models it can handle.

## Variational Bayes

- Wikipedia has a rather long entry on VB.
- A tutorial by Fox and Roberts can be found here.
- The following paper is not a review, but it explains VB in a very nice way, which may be more approachable for Statisticians not familiar with ML terminology: Faes, C., Ormerod, J.T. and Wand, M.P. (2011); preprint, paper in JASA.
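To give a flavour of what VB does, here is a minimal coordinate-ascent (CAVI) sketch on an assumed toy target (not taken from the references above): a correlated bivariate Gaussian approximated by a factorised q(x1)q(x2). With precision matrix Lam, the optimal factors are Gaussian with means m_i - Lam_ij (E[x_j] - m_j) / Lam_ii and variances 1/Lam_ii:

```python
# Mean-field VB (CAVI) sketch on a toy target: approximate a correlated
# bivariate Gaussian N(m, Sigma) by a factorised q(x1) q(x2), iterating the
# closed-form coordinate updates until convergence.
import numpy as np

m = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
Lam = np.linalg.inv(Sigma)  # precision matrix of the target

mu = np.zeros(2)  # variational means, arbitrary initialisation
for _ in range(50):  # CAVI sweeps
    mu[0] = m[0] - Lam[0, 1] / Lam[0, 0] * (mu[1] - m[1])
    mu[1] = m[1] - Lam[1, 0] / Lam[1, 1] * (mu[0] - m[0])

var = 1.0 / np.diag(Lam)  # variances of the optimal factors
print(mu)   # converges to the exact means (0, 1)
print(var)  # 0.36 < 1: the factorised approximation underestimates variances
```

The example also illustrates a well-known caveat: the means are recovered exactly, but the factorised form cannot represent the correlation and so understates the marginal variances (here 0.36 instead of 1).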

## Expectation Propagation

EP (Minka, 2004) is a relatively recent non-MC method; a good introduction is the paper Expectation Propagation for Exponential Families by Matthias Seeger. Thomas Minka maintains useful web pages on his own EP papers and a roadmap of research using EP.
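The cavity / tilted-distribution / moment-matching structure of EP can be sketched on an assumed toy 1D model (not taken from the papers above): a posterior proportional to N(x; 0, 1) times a probit factor Phi(x), with the non-Gaussian factor replaced by a Gaussian "site". With a single approximate site the cavity is just the prior and EP reduces to one moment-matching step, but the update has the general form. Moments of the tilted distribution are computed by grid quadrature to keep the sketch self-contained:

```python
# EP sketch, toy 1D model: p(x) propto N(x; 0, 1) * Phi(x), where Phi is the
# standard normal CDF. The Phi factor gets a Gaussian site, fitted by matching
# the moments of the tilted distribution (cavity times exact factor).
import numpy as np
from scipy.stats import norm

xs = np.linspace(-10.0, 10.0, 20001)  # quadrature grid

def moments(w):
    # mean and variance of a density given by (unnormalised) grid weights
    w = w / w.sum()
    mean = (w * xs).sum()
    var = (w * (xs - mean) ** 2).sum()
    return mean, var

# Sites in natural parameters (precision r, precision-times-mean b).
r0, b0 = 1.0, 0.0        # exact Gaussian site: the N(0, 1) prior
r_site, b_site = 0.0, 0.0  # site for Phi(x), initialised flat

for _ in range(5):  # EP loop (one site, so it converges immediately)
    # 1. Cavity: global approximation with the site removed (= prior here).
    r_cav, b_cav = r0, b0
    cavity = np.exp(-0.5 * r_cav * xs**2 + b_cav * xs)
    # 2. Tilted distribution: cavity times the exact factor Phi(x).
    mean, var = moments(cavity * norm.cdf(xs))
    # 3. Moment match the global approximation, then divide out the cavity.
    r_site, b_site = 1.0 / var - r_cav, mean / var - b_cav

print(mean, var)  # exact moments: 1/sqrt(pi) ~ 0.564, 1 - 1/pi ~ 0.682
```

For this target the tilted moments are available in closed form (it is a skew normal), so the quadrature result can be checked against 1/sqrt(pi) and 1 - 1/pi; in real EP applications each site update uses such closed-form or quadrature moments of a low-dimensional tilted distribution.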

*Nicolas Chopin*