
62 nips-2005: Efficient Estimation of OOMs


Source: pdf

Author: Herbert Jaeger, Mingjie Zhao, Andreas Kolling

Abstract: A standard method for obtaining stochastic models of symbolic time series is to train state-emitting hidden Markov models (SE-HMMs) with the Baum-Welch algorithm. Based on observable operator models (OOMs), a number of novel learning algorithms for similar purposes have been developed in the last few months: (1, 2) two versions of an "efficiency sharpening" (ES) algorithm, which iteratively improves the statistical efficiency of a sequence of OOM estimators, and (3) a constrained gradient-descent maximum-likelihood estimator for transition-emitting HMMs (TE-HMMs). We give an overview of these algorithms and compare them with SE-HMM/EM learning on synthetic and real-life data.
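To make the OOM formalism referred to in the abstract concrete, here is a minimal sketch (not code from the paper) of the defining OOM computation: a sequence probability is obtained by applying one observable operator per symbol to a starting state and then evaluating with a linear functional. The example operators, states, and symbols below are illustrative assumptions, constructed from a small HMM so that the resulting OOM is valid.

```python
import numpy as np

# Sketch of the core OOM computation. An OOM assigns a probability to a
# symbol sequence a_1 ... a_n via
#     P(a_1 ... a_n) = sigma @ tau[a_n] @ ... @ tau[a_1] @ w0,
# with one observable operator tau_a per symbol, a starting state w0,
# and an evaluation functional sigma.

def sequence_probability(tau, w0, sigma, seq):
    """Apply the observable operators in order, then evaluate with sigma."""
    w = w0
    for a in seq:
        w = tau[a] @ w
    return float(sigma @ w)

# Example operators (hypothetical values) derived from a 2-state HMM: any
# HMM yields a valid OOM by setting tau_a = T @ diag(P(emit a | state)).
T = np.array([[0.7, 0.4],
              [0.3, 0.6]])             # column-stochastic transition matrix
E = {0: np.diag([0.9, 0.2]),          # P(symbol 0 | state 1), P(symbol 0 | state 2)
     1: np.diag([0.1, 0.8])}          # P(symbol 1 | state 1), P(symbol 1 | state 2)
tau = {a: T @ E[a] for a in (0, 1)}   # observable operators
w0 = np.array([0.5, 0.5])             # starting state distribution
sigma = np.ones(2)                    # evaluation functional 1^T
```

Because the operators sum to the transition matrix (sum_a tau_a = T) and sigma @ T = sigma, the probabilities of all sequences of a fixed length sum to 1, as required of a proper OOM.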


Reference text

[1] M. L. Littman, R. S. Sutton, and S. Singh. Predictive representations of state. In Advances in Neural Information Processing Systems 14 (Proc. NIPS 01), pages 1555–1561, 2001. http://www.eecs.umich.edu/~baveja/Papers/psr.pdf.

[2] H. Jaeger, M. Zhao, K. Kretzschmar, T. Oberstein, D. Popovici, and A. Kolling. Learning observable operator models via the ES algorithm. In S. Haykin, J. Principe, T. Sejnowski, and J. McWhirter, editors, New Directions in Statistical Signal Processing: from Systems to Brains, chapter 20. MIT Press, to appear in 2005.

[3] H. Xue and V. Govindaraju. Stochastic models combining discrete symbols and continuous attributes in handwriting recognition. In Proc. DAS 2002, 2002.

[4] R. Edwards, J. J. McDonald, and M. J. Tsatsomeros. On matrices with common invariant cones with applications in neural and gene networks. Linear Algebra and its Applications, in press, 2004 (online version). http://www.math.wsu.edu/math/faculty/tsat/files/emt.pdf.

[5] K. Kretzschmar. Learning symbol sequences with Observable Operator Models. GMD Report 161, Fraunhofer Institute AIS, 2003. http://omk.sourceforge.net/files/OomLearn.pdf.