
149 nips-2006-Nonnegative Sparse PCA


Source: pdf

Authors: Ron Zass, Amnon Shashua

Abstract: We describe a nonnegative variant of the "Sparse PCA" problem. The goal is to create a low-dimensional representation from a collection of points that, on the one hand, maximizes the variance of the projected points and, on the other, uses only a subset of the original coordinates, thereby creating a sparse representation. What distinguishes our problem from other Sparse PCA formulations is that the projection involves only nonnegative weights of the original coordinates, a desirable quality in various fields, including economics, bioinformatics, and computer vision. Nonnegativity also contributes to sparseness, as it enforces a partitioning of the original coordinates among the new axes. We describe a simple yet efficient iterative coordinate-descent scheme that converges to a local optimum of our optimization criterion, giving good results on large real-world datasets.
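
The abstract describes the method only at a high level, so the following is a minimal illustrative sketch of the general idea, not the authors' exact coordinate-descent NSPCA update: a power iteration that, at each step, soft-thresholds and clips the weight vector to encourage sparsity and enforce nonnegativity before renormalizing. The function name nonneg_sparse_pc and the penalty weight lam are assumptions introduced here for illustration.

    import numpy as np

    def nonneg_sparse_pc(X, lam=0.1, n_iter=200, tol=1e-8, seed=0):
        # Sketch of one nonnegative sparse principal component via a
        # soft-thresholded, nonnegativity-projected power iteration.
        # NOTE: an assumed illustration, not the paper's NSPCA algorithm.
        X = X - X.mean(axis=0)               # center the data
        C = X.T @ X / len(X)                 # sample covariance matrix
        rng = np.random.default_rng(seed)
        w = np.abs(rng.standard_normal(C.shape[0]))
        w /= np.linalg.norm(w)               # random nonnegative unit start vector
        for _ in range(n_iter):
            v = C @ w                        # power-iteration step toward max variance
            v = np.maximum(v - lam, 0.0)     # soft-threshold (sparsity) and clip (nonnegativity)
            norm = np.linalg.norm(v)
            if norm == 0.0:                  # lam too large: every coordinate was zeroed out
                break
            v /= norm                        # project back onto the unit sphere
            if np.linalg.norm(v - w) < tol:  # converged to a fixed point
                w = v
                break
            w = v
        return w

Larger values of lam zero out more coordinates of the returned component, while lam = 0 reduces the loop to an ordinary power iteration restricted to nonnegative weights.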


reference text

[1] Liviu Badea and Doina Tilivea. Sparse factorizations of gene expression guided by binding data. In Pacific Symposium on Biocomputing, 2005.

[2] Alexandre d'Aspremont, Laurent El Ghaoui, Michael I. Jordan, and Gert R. G. Lanckriet. A direct formulation for sparse PCA using semidefinite programming. In Proceedings of the Conference on Neural Information Processing Systems (NIPS), 2004.

[3] C. A. Floudas and V. Visweswaran. Quadratic optimization. In Handbook of global optimization, pages 217–269. Kluwer Acad. Publ., Dordrecht, 1995.

[4] Matthias Heiler and Christoph Schnörr. Learning non-negative sparse image codes by convex programming. In Proceedings of the 10th IEEE International Conference on Computer Vision (ICCV), 2005.

[5] Patrik O. Hoyer. Non-negative sparse coding. In Proceedings of the 12th IEEE Workshop on Neural Networks for Signal Processing, pages 557–565, 2002.

[6] Patrik O. Hoyer. Non-negative matrix factorization with sparseness constraints. Journal of Machine Learning Research, 5:1457–1469, 2004.

[7] Ravi Jagannathan and Tongshu Ma. Risk reduction in large portfolios: Why imposing the wrong constraints helps. Journal of Finance, 58(4):1651–1684, August 2003.

[8] Ian T. Jolliffe. Rotation of principal components: Choice of normalization constraints. Journal of Applied Statistics, 22(1):29–35, 1995.

[9] Ian T. Jolliffe, Nickolay T. Trendafilov, and Mudassir Uddin. A modified principal component technique based on the LASSO. Journal of Computational and Graphical Statistics, 12(3):531–547, September 2003.

[10] D. D. Lee and H. S. Seung. Learning the parts of objects by non-negative matrix factorization. Nature, 401(6755):788–791, October 1999.

[11] S. Li, X. Hou, H. Zhang, and Q. Cheng. Learning spatially localized, parts-based representation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2001.

[12] Baback Moghaddam, Yair Weiss, and Shai Avidan. Spectral bounds for sparse PCA: Exact and greedy algorithms. In Proceedings of the Conference on Neural Information Processing Systems (NIPS), 2005.

[13] Beresford N. Parlett. The symmetric eigenvalue problem. Prentice-Hall, Inc., Upper Saddle River, NJ, USA, 1980.

[14] H. Zou, T. Hastie, and R. Tibshirani. Sparse principal component analysis. Technical report, Statistics Department, Stanford University, 2004.