
234 nips-2008-The Infinite Factorial Hidden Markov Model


Source: pdf

Author: Jurgen Van Gael, Yee W. Teh, Zoubin Ghahramani

Abstract: We introduce a new probability distribution over a potentially infinite number of binary Markov chains which we call the Markov Indian buffet process. This process extends the IBP to allow temporal dependencies in the hidden variables. We use this stochastic process to build a nonparametric extension of the factorial hidden Markov model. After constructing an inference scheme which combines slice sampling and dynamic programming we demonstrate how the infinite factorial hidden Markov model can be used for blind source separation.
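The generative process the abstract describes can be illustrated with a finite approximation: M independent binary Markov chains, where the probability of a chain switching on shrinks as M grows, so only finitely many chains are ever active. This is a rough sketch, not the paper's exact construction; the Beta priors and the parameter names `alpha`, `gamma`, `delta` follow common conventions for such models and are assumptions here.

```python
import numpy as np

def sample_mibp_finite(T, M, alpha=1.0, gamma=1.0, delta=1.0, rng=None):
    """Finite-M sketch of a Markov IBP-style prior: M binary Markov
    chains of length T, all starting in state 0. Parameter names and
    priors are illustrative assumptions, not the paper's definitions."""
    rng = np.random.default_rng() if rng is None else rng
    # Per-chain transition probabilities: switching on is rare for large M,
    # while an active chain persists with a chain-specific probability.
    a = rng.beta(alpha / M, 1.0, size=M)   # P(0 -> 1)
    b = rng.beta(gamma, delta, size=M)     # P(1 -> 1)
    S = np.zeros((T, M), dtype=int)
    prev = np.zeros(M, dtype=int)
    for t in range(T):
        p_on = np.where(prev == 1, b, a)   # temporal dependency per chain
        prev = (rng.random(M) < p_on).astype(int)
        S[t] = prev
    return S
```

Taking M to infinity in a construction like this is what yields a distribution over an unbounded number of chains, of which only finitely many are active in any finite window.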


reference text

[1] L. R. Rabiner, “A tutorial on hidden Markov models and selected applications in speech recognition,” Proceedings of the IEEE, vol. 77, pp. 257–286, 1989.

[2] Z. Ghahramani and M. I. Jordan, “Factorial hidden Markov models,” Machine Learning, vol. 29, pp. 245–273, 1997.

[3] P. Wang and Q. Ji, “Multi-view face tracking with factorial and switching HMM,” in Proceedings of the Seventh IEEE Workshops on Application of Computer Vision, pp. 401–406, IEEE Computer Society, 2005.

[4] B. Logan and P. Moreno, “Factorial HMMs for acoustic modeling,” 1998.

[5] K. Duh, “Joint labeling of multiple sequences: A factorial HMM approach,” in 43rd Annual Meeting of the Association for Computational Linguistics (ACL) - Student Research Workshop, 2005.

[6] T. L. Griffiths and Z. Ghahramani, “Infinite latent feature models and the Indian buffet process,” Advances in Neural Information Processing Systems, vol. 18, pp. 475–482, 2006.

[7] R. M. Neal, “Bayesian mixture modeling,” Maximum Entropy and Bayesian Methods, 1992.

[8] Y. W. Teh, D. Görür, and Z. Ghahramani, “Stick-breaking construction for the Indian buffet process,” Proceedings of the International Conference on Artificial Intelligence and Statistics, vol. 11, 2007.

[9] A. Hyvärinen and E. Oja, “Independent component analysis: Algorithms and applications,” Neural Networks, vol. 13, pp. 411–430, 2000.

[10] D. Knowles and Z. Ghahramani, “Infinite sparse factor analysis and infinite independent components analysis,” Lecture Notes in Computer Science, vol. 4666, p. 381, 2007.

[11] S. L. Scott, “Bayesian methods for hidden Markov models: Recursive computing in the 21st century,” Journal of the American Statistical Association, vol. 97, pp. 337–351, Mar. 2002.

[12] J. Van Gael, Y. Saatci, Y. W. Teh, and Z. Ghahramani, “Beam sampling for the infinite hidden Markov model,” in The 25th International Conference on Machine Learning, vol. 25, (Helsinki), 2008.

[13] M. J. Beal, Z. Ghahramani, and C. E. Rasmussen, “The infinite hidden Markov model,” Advances in Neural Information Processing Systems, vol. 14, pp. 577–584, 2002.

[14] Z. Ghahramani, T. L. Griffiths, and P. Sollich, “Bayesian nonparametric latent feature models,” Bayesian Statistics, vol. 8, 2007.