
213 nips-2007-Variational Inference for Diffusion Processes


Source: pdf

Author: Cédric Archambeau, Manfred Opper, Yuan Shen, Dan Cornford, John S. Shawe-Taylor

Abstract: Diffusion processes are a family of continuous-time, continuous-state stochastic processes that are in general only partially observed. The joint estimation of the forcing parameters and the system noise (volatility) in these dynamical systems is a crucial, but non-trivial task, especially when the system is nonlinear and multimodal. We propose a variational treatment of diffusion processes, which allows us to compute type II maximum likelihood estimates of the parameters by simple gradient techniques and which is computationally less demanding than most MCMC approaches. We also show how a cheap estimate of the posterior over the parameters can be constructed based on the variational free energy.
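The diffusion processes in the abstract are typically discretized for simulation with the Euler-Maruyama scheme (see [13]). As a minimal, hedged sketch of the kind of system considered, the snippet below simulates a one-dimensional diffusion dX = f(X) dt + sqrt(Σ) dW with a double-well drift, a standard nonlinear, bimodal test case; the function names and parameter values are illustrative choices, not taken from the paper.

```python
import numpy as np

def euler_maruyama(drift, x0, sigma2, dt, n_steps, rng):
    """Simulate a 1-D diffusion dX = drift(X) dt + sqrt(sigma2) dW
    on a fixed time grid using the Euler-Maruyama scheme."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    # Pre-sample the Brownian increments: Normal(0, sigma2 * dt).
    noise = rng.normal(scale=np.sqrt(sigma2 * dt), size=n_steps)
    for k in range(n_steps):
        x[k + 1] = x[k] + drift(x[k]) * dt + noise[k]
    return x

# Double-well drift: two stable states at x = -1 and x = +1,
# so sample paths switch between modes under the noise.
drift = lambda x: 4.0 * x * (1.0 - x**2)
path = euler_maruyama(drift, x0=0.0, sigma2=0.5, dt=0.01,
                      n_steps=1000, rng=np.random.default_rng(0))
```

Inference then amounts to recovering the drift parameters and the noise level sigma2 from a sparsely observed `path`; the variational approach of the paper fits a Gaussian process approximation to the posterior over paths rather than sampling them by MCMC.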


reference text

[1] F. J. Alexander, G. L. Eyink, and J. M. Restrepo. Accelerated Monte Carlo for optimal estimation of time series. Journal of Statistical Physics, 119:1331–1345, 2005.

[2] J. D. Annan, J. C. Hargreaves, N. R. Edwards, and R. Marsh. Parameter estimation in an intermediate complexity earth system model using an ensemble Kalman filter. Ocean Modelling, 8:135–154, 2005.

[3] A. Apte, M. Hairer, A. Stuart, and J. Voss. Sampling the posterior: An approach to non-Gaussian data assimilation. Physica D, 230:50–64, 2007.

[4] C. Archambeau, D. Cornford, M. Opper, and J. Shawe-Taylor. Gaussian process approximation of stochastic differential equations. Journal of Machine Learning Research: Workshop and Conference Proceedings, 1:1–16, 2007.

[5] D. Barber. Expectation correction for smoothed inference in switching linear dynamical systems. Journal of Machine Learning Research, 7:2515–2540, 2006.

[6] A. Beskos, O. Papaspiliopoulos, G. Roberts, and P. Fearnhead. Exact and computationally efficient likelihood-based estimation for discretely observed diffusion processes (with discussion). Journal of the Royal Statistical Society B, 68(3):333–382, 2006.

[7] C. M. Bishop. Pattern Recognition and Machine Learning. Springer, New York, 2006.

[8] D. Crisan and T. Lyons. A particle approximation of the solution of the Kushner-Stratonovitch equation. Probability Theory and Related Fields, 115(4):549–578, 1999.

[9] A. P. Dempster, N. M. Laird, and D. B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society B, 39(1):1–38, 1977.

[10] G. L. Eyink, J. L. Restrepo, and F. J. Alexander. A mean field approximation in data assimilation for nonlinear dynamics. Physica D, 194:347–368, 2004.

[11] A. Golightly and D. J. Wilkinson. Bayesian inference for nonlinear multivariate diffusion models observed with error. Computational Statistics and Data Analysis, 2007. Accepted.

[12] A. H. Jazwinski. Stochastic Processes and Filtering Theory. Academic Press, New York, 1970.

[13] P. E. Kloeden and E. Platen. Numerical Solution of Stochastic Differential Equations. Springer, Berlin, 1999.

[14] H. Lappalainen and J. W. Miskin. Ensemble learning. In M. Girolami, editor, Advances in Independent Component Analysis, pages 76–92. Springer-Verlag, 2000.

[15] R. N. Miller, M. Ghil, and F. Gauthiez. Advanced data assimilation in strongly nonlinear dynamical systems. Journal of the Atmospheric Sciences, 51:1037–1056, 1994.

[16] J. Nocedal and S. J. Wright. Numerical Optimization. Springer, 2000.

[17] G. Roberts and O. Stramer. On inference for partially observed non-linear diffusion models using the Metropolis-Hastings algorithm. Biometrika, 88:603–621, 2001.