Title: Graph-Laplacian PCA: Closed-Form Solution and Robustness (CVPR 2013, paper 191)
Source: pdf
Author: Bo Jiang, Chris Ding, Bin Luo, Jin Tang
Abstract: Principal Component Analysis (PCA) is widely used to learn a low-dimensional representation. In many applications, both vector data X and graph data W are available. Laplacian embedding is widely used for embedding graph data. We propose a graph-Laplacian PCA (gLPCA) to learn a low-dimensional representation of X that incorporates the graph structure encoded in W. This model has several advantages: (1) it is a data representation model; (2) it has a compact closed-form solution and can be computed efficiently; (3) it is capable of removing corruptions. Extensive experiments on 8 datasets show promising results on image reconstruction and significant improvements on clustering and classification.
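Since the abstract advertises a compact closed-form solution, the following is a minimal sketch of the natural formulation it implies: minimize ||X - U Q^T||_F^2 + alpha * tr(Q^T L Q) subject to Q^T Q = I, where X holds one sample per column and L = D - W is the graph Laplacian. With U eliminated (U = X Q), the minimizer Q consists of the k eigenvectors of alpha*L - X^T X with smallest eigenvalues. This unnormalized objective, the heat-kernel affinity in the demo, and all names (glpca, alpha, k) are my own assumptions for illustration, not taken from the paper.

```python
import numpy as np

def glpca(X, W, k, alpha):
    """Sketch of a graph-Laplacian PCA closed-form solution (hypothetical API).

    X     : (p, n) data matrix, one sample per column
    W     : (n, n) symmetric graph affinity matrix
    k     : target dimensionality
    alpha : trade-off between PCA reconstruction and Laplacian embedding
    """
    # Graph Laplacian L = D - W, with D the diagonal degree matrix.
    D = np.diag(W.sum(axis=1))
    L = D - W

    # Minimizing ||X - U Q^T||_F^2 + alpha * tr(Q^T L Q) with Q^T Q = I
    # reduces to the k smallest eigenvectors of G = alpha * L - X^T X.
    G = alpha * L - X.T @ X
    eigvals, eigvecs = np.linalg.eigh(G)   # eigenvalues in ascending order
    Q = eigvecs[:, :k]                     # (n, k) low-dimensional embedding

    # With Q fixed, the optimal basis is U = X Q, so X is approximated by U Q^T.
    U = X @ Q
    return U, Q

if __name__ == "__main__":
    # Toy usage: 50 random samples in R^20 with a dense heat-kernel affinity.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 50))
    dists = np.linalg.norm(X[:, :, None] - X[:, None, :], axis=0)
    W = np.exp(-dists**2)
    np.fill_diagonal(W, 0.0)
    U, Q = glpca(X, W, k=5, alpha=1.0)
    print("reconstruction error:", np.linalg.norm(X - U @ Q.T))
```

Setting alpha = 0 in this sketch recovers ordinary PCA on centered data, while large alpha pushes Q toward a pure Laplacian embedding of W, which matches the interpolation the abstract describes.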
[1] M. Belkin and P. Niyogi. Laplacian eigenmaps and spectral techniques for embedding and clustering. In NIPS 2001.
[2] P. Chan, M. Schlag, and J. Zien. Spectral k-way ratio-cut partitioning and clustering. IEEE Trans. CAD-Integrated Circuits and Systems, 13: 1088–1096, 1994.
[3] M. Collins, S. Dasgupta, and R. Schapire. A generalization of principal component analysis to the exponential family. In NIPS 2001.
[4] C. Ding and X. He. K-means clustering via principal component analysis. In ICML 2004.
[5] C. Ding, T. Li, and M. I. Jordan. Convex and semi-nonnegative matrix factorizations. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(1):45–55, 2010.
[6] C. Ding, D. Zhou, X. He, and H. Zha. R1-PCA: Rotational invariant L1-norm principal component analysis for robust subspace factorization. In ICML 2006.
[7] M. Gu, H. Zha, C. Ding, X. He, and H. Simon. Spectral relaxation models and structure analysis for k-way graph clustering and bi-clustering. Penn State Univ Tech Report CSE01-007, 2001.
[8] K. M. Hall. An r-dimensional quadratic placement algorithm. Management Science, 17:219–229, 1971.
[9] X. He and P. Niyogi. Locality preserving projections. In NIPS 2003.
[10] I. Jolliffe. Principal Component Analysis. Springer, 2nd edition, 2002.
[11] D. Luo, C. Ding, and H. Huang. Towards structural sparsity: An explicit L2/L0 approach. In ICDM 2010.
[12] S. T. Roweis and L. K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290:2323–2326, 2000.
[13] J. Shi and J. Malik. Normalized cuts and image segmentation. IEEE Trans. on Pattern Analysis and Machine Intelligence, 22:888–905, 2000.
[14] J. B. Tenenbaum, V. de Silva, and J. C. Langford. A global geometric framework for nonlinear dimensionality reduction. Science, 290:2319–2323, 2000.
[15] Z. Zhang and H. Zha. Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM J. Scientific Computing, 26:313–338, 2004.