nips2002-73-reference: knowledge-graph by maker-knowledge-mining
Source: pdf
Author: David Barber
Abstract: The application of latent/hidden variable Dynamic Bayesian Networks is constrained by the complexity of marginalising over the latent variables. For this reason, inference is typically kept tractable by restricting models to small latent dimensions or to Gaussian latent conditional tables that depend linearly on past states. We suggest an alternative approach in which the latent variables are modelled using deterministic conditional probability tables. This specialisation gives tractable inference even for highly complex non-linear/non-Gaussian visible conditional probability tables, and so allows rich latent dynamics whilst retaining the benefits of a tractable probabilistic model.
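To make the tractability claim concrete, the following is a minimal sketch of the argument, using notation assumed here rather than taken from the abstract: v_{1:T} for the visible variables, h_{1:T} for the latent variables, and f for the deterministic transition (initial states h_0 and v_0 taken as fixed). In a general DBN the likelihood requires marginalising over all latent trajectories,

    p(v_{1:T}) = \sum_{h_{1:T}} \prod_{t=1}^{T} p(v_t \mid v_{t-1}, h_t) \, p(h_t \mid v_{t-1}, h_{t-1}),

which is intractable for large or continuous latent spaces. If instead the latent conditional probability table is deterministic,

    p(h_t \mid v_{t-1}, h_{t-1}) = \delta\big( h_t - f(v_{t-1}, h_{t-1}) \big),

the marginalisation collapses onto the single trajectory generated by the recursion h_t = f(v_{t-1}, h_{t-1}), leaving

    p(v_{1:T}) = \prod_{t=1}^{T} p(v_t \mid v_{t-1}, h_t),

a product over visible conditional tables alone, however non-linear f or non-Gaussian those tables may be.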
[1] C.M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 1995.
[2] H.A. Bourlard and N. Morgan, Connectionist Speech Recognition: A Hybrid Approach, Kluwer, 1994.
[3] A. Doucet, N. de Freitas, and N.J. Gordon, Sequential Monte Carlo Methods in Practice, Springer, 2001.
[4] J. Hertz, A. Krogh, and R. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley, 1991.
[5] M.I. Jordan (ed.), Learning in Graphical Models, MIT Press, 1998.
[6] J.F. Kolen and S.C. Kremer, Dynamic Recurrent Networks, IEEE Press, 2001.
[7] A. Krogh and S.K. Riis, Hidden Neural Networks, Neural Computation 11 (1999), 541–563.
[8] M. Kudo, J. Toyama, and M. Shimbo, Multidimensional Curve Classification Using Passing-Through Regions, Pattern Recognition Letters 20 (1999), no. 11–13, 1103–1111.
[9] L.R. Rabiner and B.H. Juang, An Introduction to Hidden Markov Models, IEEE ASSP Magazine 3 (1986), no. 1, 4–16.
[10] M. West and J. Harrison, Bayesian Forecasting and Dynamic Models, Springer, 1999.