
165 nips-2001-Scaling Laws and Local Minima in Hebbian ICA


Source: pdf

Author: Magnus Rattray, Gleb Basalyga

Abstract: We study the dynamics of a Hebbian ICA algorithm extracting a single non-Gaussian component from a high-dimensional Gaussian background. For both on-line and batch learning we find that a surprisingly large number of examples are required to avoid trapping in a sub-optimal state close to the initial conditions. To extract a skewed signal at least O(N^2) examples are required for N-dimensional data, and O(N^3) examples are required to extract a symmetrical signal with non-zero kurtosis.
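
The setting can be made concrete with a minimal sketch of a one-unit Hebbian/projection-pursuit update of the Hyvärinen–Oja type: the weight vector is nudged along x * g(w.x) and renormalised after each example. The learning rate, nonlinearity, and toy data generation below are illustrative assumptions, not the specific rule or parameters analysed in the paper.

```python
import numpy as np

def hebbian_ica_one_unit(X, eta=0.01, nonlinearity=np.tanh, seed=0):
    """One-unit Hebbian ICA sketch: w <- w + eta * x * g(w.x), then renormalise.

    X: (num_examples, N) data matrix.
    eta and the nonlinearity g are illustrative choices, not the ones
    analysed in the paper.
    """
    rng = np.random.default_rng(seed)
    num_examples, N = X.shape
    w = rng.normal(size=N)
    w /= np.linalg.norm(w)                # start from a random unit vector
    for x in X:
        y = w @ x                         # projection of the current example
        w += eta * x * nonlinearity(y)    # Hebbian update
        w /= np.linalg.norm(w)            # constrain w to the unit sphere
    return w

# Toy usage: one skewed source hidden in an N-dimensional Gaussian background.
if __name__ == "__main__":
    N, num_examples = 100, 200_000
    rng = np.random.default_rng(1)
    source = rng.exponential(size=num_examples) - 1.0   # skewed, zero-mean signal
    background = rng.normal(size=(num_examples, N))
    direction = np.zeros(N)
    direction[0] = 1.0                                   # true source direction
    X = background + np.outer(source, direction)
    w = hebbian_ica_one_unit(X)
    print("overlap with true direction:", abs(w @ direction))
```

With too few examples relative to N, the overlap printed at the end tends to stay near its small initial value, which is the trapping effect near the initial conditions that the scaling laws quantify.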


reference text

S-I Amari, A Cichocki, and H H Yang. In D S Touretzky, M C Mozer, and M E Hasselmo, editors, Neural Information Processing Systems 8, pages 757–763. MIT Press, Cambridge MA, 1996.

A J Bell and T J Sejnowski. Neural Computation, 7:1129–1159, 1995.

M Biehl. Europhys. Lett., 25:391–396, 1994.

M Biehl and H Schwarze. J. Phys. A, 28:643–656, 1995.

J-F Cardoso and B Laheld. IEEE Trans. on Signal Processing, 44:3017–3030, 1996.

C W Gardiner. Handbook of Stochastic Methods. Springer-Verlag, New York, 1985.

A Hyvärinen. Neural Computing Surveys, 2:94–128, 1999.

A Hyvärinen and E Oja. Signal Processing, 64:301–313, 1998.

M Rattray. Neural Computation, 14, 2002 (in press).

D Saad, editor. On-line Learning in Neural Networks. Cambridge University Press, 1998.

D Saad and S A Solla. Phys. Rev. Lett., 74:4337–4340, 1995.

K Y M Wong, S Li, and P Luo. In S A Solla, T K Leen, and K-R Müller, editors, Neural Information Processing Systems 12. MIT Press, Cambridge MA, 2000.