Abstract: Filtering--estimating the state of a partially observable Markov process from a sequence of observations--is one of the most widely studied problems in control theory, AI, and computational statistics. Exact computation of the posterior distribution is generally intractable for large discrete systems and for nonlinear continuous systems, so a good deal of effort has gone into developing robust approximation algorithms. This paper describes a simple stochastic approximation algorithm for filtering called decayed MCMC. The algorithm applies Markov chain Monte Carlo sampling to the space of state trajectories using a proposal distribution that favours flips of more recent state variables. The formal analysis of the algorithm involves a generalization of standard coupling arguments for MCMC convergence. We prove that for any ergodic underlying Markov process, the convergence time of decayed MCMC with inverse-polynomial decay remains bounded as the length of the observation sequence grows. We show experimentally that decayed MCMC is at least competitive with other approximation algorithms such as particle filtering.
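The core idea in the abstract--MCMC over the trajectory space with a proposal that preferentially revisits recent time steps--can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: it runs Metropolis-Hastings single-site flips on a binary-state HMM, choosing the time index to flip with probability proportional to 1/(T - t)^alpha, an inverse-polynomial decay that concentrates updates on recent variables. The HMM parameters, the decay exponent `alpha`, and the function name `decayed_mcmc_filter` are all hypothetical choices for this example.

```python
import random

def decayed_mcmc_filter(obs, trans, emit, prior, n_sweeps=2000, alpha=1.5, seed=0):
    """Sketch of decayed MCMC filtering for a binary-state HMM.

    obs:   list of observations in {0, 1}
    trans: trans[i][j] = P(x_{t+1} = j | x_t = i)
    emit:  emit[i][o]  = P(obs = o | state = i)
    prior: prior[i]    = P(x_1 = i)
    Returns a Monte Carlo estimate of P(x_T = 1 | obs).
    """
    rng = random.Random(seed)
    T = len(obs)
    x = [rng.randrange(2) for _ in range(T)]  # arbitrary initial trajectory

    # Inverse-polynomial decay: time t is proposed with probability
    # proportional to 1/(T - t)^alpha, so recent steps are flipped most often.
    weights = [1.0 / (T - t) ** alpha for t in range(T)]
    total = sum(weights)
    cum, s = [], 0.0
    for w in weights:
        s += w / total
        cum.append(s)

    def local_score(t, v):
        # Unnormalized posterior factors that touch x_t = v.
        p = emit[v][obs[t]]
        p *= prior[v] if t == 0 else trans[x[t - 1]][v]
        if t + 1 < T:
            p *= trans[v][x[t + 1]]
        return p

    count = 0
    for _ in range(n_sweeps):
        u = rng.random()
        t = next(i for i, c in enumerate(cum) if u <= c)  # inverse-CDF sample
        v = 1 - x[t]  # propose flipping x_t; the proposal is symmetric,
        # so the Metropolis acceptance ratio is just the ratio of local scores.
        a = local_score(t, v) / max(local_score(t, x[t]), 1e-300)
        if rng.random() < a:
            x[t] = v
        count += x[-1]  # accumulate the indicator for the filtering query
    return count / n_sweeps
```

For instance, with a sticky two-state chain and moderately reliable observations that are all 1, the estimate of P(x_T = 1 | obs) comes out well above 0.5, as the posterior would dictate. The decayed proposal is what the paper's analysis exploits: because recent variables are resampled at a rate that does not vanish as T grows, the convergence time per filtering step stays bounded.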
Appeared in: Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI-02), Edmonton, Alberta: Morgan Kaufmann, 2002.