jmlr2006-36-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Gilles Blanchard, Motoaki Kawanabe, Masashi Sugiyama, Vladimir Spokoiny, Klaus-Robert Müller
Abstract: Finding non-Gaussian components of high-dimensional data is an important preprocessing step for efficient information processing. This article proposes a new linear method to identify the “non-Gaussian subspace” within a very general semi-parametric framework. Our proposed method, called NGCA (non-Gaussian component analysis), is based on a linear operator which, to any arbitrary nonlinear (smooth) function, associates a vector belonging to the low-dimensional non-Gaussian target subspace, up to an estimation error. By applying this operator to a family of different nonlinear functions, one obtains a family of different vectors lying in a vicinity of the target space. As a final step, the target space itself is estimated by applying PCA to this family of vectors. We show that this procedure is consistent in the sense that the estimation error tends to zero at a parametric rate, uniformly over the family. Numerical examples demonstrate the usefulness of our method.
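To make the pipeline described in the abstract concrete, below is a minimal sketch, not the authors' reference implementation. It assumes whitened data and realizes the linear operator via a Stein-identity construction: for a smooth test function h, the empirical vector (1/n) Σ_i [z_i h(z_i) − ∇h(z_i)] is approximately zero along Gaussian directions, so it lies near the non-Gaussian target subspace up to estimation error. The function family (h_w(z) = tanh(wᵀz) for random unit vectors w) and the names ngca_sketch and n_funcs are illustrative assumptions, not taken from the paper.

import numpy as np

def ngca_sketch(X, m, n_funcs=200, seed=0):
    """Illustrative NGCA-style subspace estimate (sketch, not reference code).

    X       : (n, d) data matrix with full-rank covariance
    m       : assumed dimension of the non-Gaussian target subspace
    n_funcs : size of the nonlinear test-function family (illustrative)

    Returns an (m, d) orthonormal basis of the estimated subspace,
    expressed in the whitened coordinate system.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Whiten: zero mean, identity covariance (the operator below assumes this).
    Xc = X - X.mean(axis=0)
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    Z = (Xc @ eigvec) / np.sqrt(eigval)  # divide each column by sqrt(eigenvalue)

    # For each test function h_w(z) = tanh(w'z), compute the empirical vector
    #   beta(h) = mean_i[ z_i h(z_i) ] - mean_i[ grad h(z_i) ],
    # which vanishes along Gaussian directions (Stein's identity) and hence
    # lies in a vicinity of the non-Gaussian subspace.
    vectors = np.empty((n_funcs, d))
    for k in range(n_funcs):
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        t = np.tanh(Z @ w)  # h(z_i) for all samples, shape (n,)
        # grad h_w(z) = (1 - tanh^2(w'z)) w, averaged over samples
        vectors[k] = (Z * t[:, None]).mean(axis=0) - (1.0 - t**2).mean() * w

    # Final step: PCA on the family of vectors; the top-m right singular
    # vectors span the estimated non-Gaussian subspace.
    _, _, Vt = np.linalg.svd(vectors, full_matrices=False)
    return Vt[:m]

Note that the paper's procedure is richer than this sketch: the vectors are normalized and thresholded by norm to discard uninformative functions, and the function family is chosen more carefully; the fixed tanh family and uncentered SVD here are simplifications for illustration.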
M. Belkin and P. Niyogi. Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation, 15(6):1373–1396, 2003.
C. M. Bishop, M. Svensén, and C. K. I. Williams. GTM: The generative topographic mapping. Neural Computation, 10(1):215–234, 1998.
C. M. Bishop and G. D. James. Analysis of multiphase flow using dual-energy gamma densitometry and neural networks. Nuclear Instruments and Methods in Physics Research, A327:580–593, 1993.
P. Comon. Independent component analysis, a new concept? Signal Processing, 36:287–314, 1994.
T. F. Cox and M. A. A. Cox. Multidimensional Scaling. Chapman & Hall, London, 2001.
L. Devroye, L. Györfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition. Springer, 1996.
B. Efron. Bootstrap methods: Another look at the jackknife. The Annals of Statistics, 7(1):1–26, 1979.
J. H. Friedman and J. W. Tukey. A projection pursuit algorithm for exploratory data analysis. IEEE Transactions on Computers, 23(9):881–890, 1975.
S. Harmeling, A. Ziehe, M. Kawanabe, and K.-R. Müller. Kernel-based nonlinear blind source separation. Neural Computation, 15(5):1089–1124, 2003.
P. J. Huber. Projection pursuit. The Annals of Statistics, 13:435–475, 1985.
A. Hyvärinen. Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks, 10(3):626–634, 1999.
A. Hyvärinen, J. Karhunen, and E. Oja. Independent Component Analysis. Wiley, 2001.
M. C. Jones and R. Sibson. What is projection pursuit? Journal of the Royal Statistical Society, Series A, 150:1–36, 1987.
T. Lange, V. Roth, M. L. Braun, and J. M. Buhmann. Stability-based validation of clustering solutions. Neural Computation, 16(6):1299–1323, 2004.
C. McDiarmid. On the method of bounded differences. Surveys in Combinatorics, London Mathematical Society Lecture Note Series 141:148–188, 1989.
F. Meinecke, A. Ziehe, M. Kawanabe, and K.-R. Müller. A resampling approach to estimate the stability of one-dimensional or multidimensional independent components. IEEE Transactions on Biomedical Engineering, 49:1514–1525, 2002.
S. Roweis and L. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500):2323–2326, 2000.
B. Schölkopf, A. J. Smola, and K.-R. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5):1299–1319, 1998.
J. B. Tenenbaum, V. de Silva, and J. C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290(5500):2319–2323, 2000.