Source: pdf
Author: Herbert Jaeger
Abstract: Echo state networks (ESN) are a novel approach to recurrent neural network training. An ESN consists of a large, fixed, recurrent "reservoir" network, from which the desired output is obtained by training suitable output connection weights.
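The abstract's core idea (a large fixed random recurrent reservoir whose linear output weights are the only trained parameters) can be illustrated with a minimal NumPy sketch. The reservoir size, spectral-radius scaling, and toy delay task below are illustrative assumptions, not the paper's experimental setup.

    import numpy as np

    # Minimal echo state network sketch: a large, fixed, random recurrent
    # "reservoir" is driven by the input; only the linear readout is trained.
    rng = np.random.default_rng(0)
    n_in, n_res = 1, 200                               # illustrative sizes
    W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))  # fixed input weights
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))    # fixed recurrent weights
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius < 1 (echo state heuristic)

    def run_reservoir(u):
        # Collect reservoir states for an input sequence u of shape (T, n_in).
        x, states = np.zeros(n_res), []
        for u_t in u:
            x = np.tanh(W_in @ u_t + W @ x)
            states.append(x.copy())
        return np.array(states)

    # Toy task: reproduce the input signal delayed by 5 steps.
    T, washout = 2000, 100
    u = rng.uniform(-1, 1, size=(T, n_in))
    y = np.roll(u[:, 0], 5)
    X, Y = run_reservoir(u)[washout:], y[washout:]
    W_out, *_ = np.linalg.lstsq(X, Y, rcond=None)      # readout trained by least squares
    print("train MSE:", np.mean((X @ W_out - Y) ** 2))

Determining the output weights is thus a linear regression problem rather than a gradient-based recurrent training procedure, which is the main computational attraction of the approach.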
[1] A.F. Atiya and A.G. Parlos. New results on recurrent network training: Unifying the algorithms and accelerating convergence. IEEE Trans. Neural Networks, 11(3):697-709, 2000.
[2] B. Farhang-Boroujeny. Adaptive Filters: Theory and Applications. Wiley, 1998.
[3] L.A. Feldkamp, D.V. Prokhorov, C.F. Eagen, and F. Yuan. Enhanced multistream Kalman filter training for recurrent neural networks. In J.A.K. Suykens and J. Vandewalle, editors, Nonlinear Modeling: Advanced Black-Box Techniques, pages 29-54. Kluwer, 1998.
[4] J. Hertzberg, H. Jaeger, and F. Schönherr. Learning to ground fact symbols in behavior-based robots. In F. van Harmelen, editor, Proc. 15th Europ. Conf. on Art. Int. (ECAI 02), pages 708-712. IOS Press, Amsterdam, 2002.
[5] H. Jaeger. The "echo state" approach to analysing and training recurrent neural networks. GMD-Report 148, GMD - German National Research Institute for Computer Science, 2001.
[6] H. Jaeger. Short term memory in echo state networks. GMD-Report 152, GMD - German National Research Institute for Computer Science, 2002. http://www.gmd.de/People/Herbert.Jaeger/Publications.html.
[7] H. Jaeger. Tutorial on training recurrent neural networks, covering BPPT, RTRL, EKF and the echo state network approach. GMD Report 159, Fraunhofer Institute AIS, 2002.
[8] W. Maass, T. Natschläger, and H. Markram. Real-time computing without stable states: A new framework for neural computation based on perturbations. http://www.cis.tugraz.at/igi/maass/psfiles/LSM-vl06.pdf, 2002.
[9] W. Maass, T. Natschläger, and H. Markram. A model for real-time computation in generic neural microcircuits. In S. Becker, S. Thrun, and K. Obermayer, editors, Advances in Neural Information Processing Systems 15 (Proc. NIPS 2002). MIT Press, 2002.