
59 nips-2004-Efficient Kernel Discriminant Analysis via QR Decomposition


Source: pdf

Author: Tao Xiong, Jieping Ye, Qi Li, Ravi Janardan, Vladimir Cherkassky

Abstract: Linear Discriminant Analysis (LDA) is a well-known method for feature extraction and dimension reduction. It has been used widely in many applications such as face recognition. Recently, a novel LDA algorithm based on QR decomposition, namely LDA/QR, has been proposed, which is competitive with other LDA algorithms in terms of classification accuracy but has much lower time and space costs. However, LDA/QR is based on a linear projection, which may not be suitable for data with nonlinear structure. This paper first proposes an algorithm called KDA/QR, which extends the LDA/QR algorithm to nonlinear data by using the kernel operator. Then an efficient approximation of KDA/QR, called AKDA/QR, is proposed. Experiments on face image data show that the classification accuracy of both KDA/QR and AKDA/QR is competitive with that of Generalized Discriminant Analysis (GDA), a general kernel discriminant analysis algorithm, while AKDA/QR has much lower time and space costs.
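
The abstract summarizes the approach only at a high level, so the NumPy sketch below illustrates the two-stage LDA/QR idea it builds on: QR-decompose the class-centroid matrix, then solve a small discriminant eigenproblem in the resulting subspace. The function name, regularization constant, and eigensolver details are assumptions made for illustration, not the paper's exact KDA/QR or AKDA/QR procedure.

import numpy as np

def lda_qr(X, y, reg=1e-6):
    """Sketch of an LDA/QR-style two-stage projection.

    Stage 1: QR-decompose the class-centroid matrix.
    Stage 2: solve a small discriminant eigenproblem in that subspace.
    (Names, regularization, and details are illustrative assumptions,
    not the paper's exact procedure.)
    """
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    classes = np.unique(y)
    # Class-centroid matrix C: one column per class (d x k).
    C = np.column_stack([X[y == c].mean(axis=0) for c in classes])
    # Stage 1: economic QR gives an orthonormal basis Q (d x k)
    # spanning the class centroids.
    Q, _ = np.linalg.qr(C)
    # Stage 2: between/within scatter of the data projected onto Q,
    # i.e. the small matrices Q^T Sb Q and Q^T Sw Q.
    Z = X @ Q
    mu = Z.mean(axis=0)
    k = Q.shape[1]
    Sb, Sw = np.zeros((k, k)), np.zeros((k, k))
    for c in classes:
        Zc = Z[y == c]
        diff = (Zc.mean(axis=0) - mu)[:, None]
        Sb += Zc.shape[0] * diff @ diff.T
        Sw += (Zc - Zc.mean(axis=0)).T @ (Zc - Zc.mean(axis=0))
    # Small (k x k) eigenproblem; regularize Sw for numerical stability.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + reg * np.eye(k), Sb))
    order = np.argsort(-evals.real)
    # Final d x k transformation; new samples are reduced via X_new @ G.
    return Q @ evecs.real[:, order]

The kernel extensions mentioned in the abstract (KDA/QR and AKDA/QR) would apply the same two-stage scheme after mapping the data through a kernel; that mapping and the approximation step are not reconstructed here.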


reference text

[1] G. Baudat and F. Anouar. Generalized discriminant analysis using a kernel approach. Neural Computation, 12(10):2385–2404, 2000.

[2] P. N. Belhumeur, J. P. Hespanha, and D. J. Kriegman. Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE TPAMI, 19(7):711–720, 1997.

[3] K. Fukunaga. Introduction to Statistical Pattern Recognition. Academic Press, San Diego, California, USA, 1990.

[4] G. H. Golub and C. F. Van Loan. Matrix Computations. The Johns Hopkins University Press, Baltimore, MD, USA, third edition, 1996.

[5] Q. Liu, R. Huang, H. Lu, and S. Ma. Kernel-based optimized feature vectors selection and discriminant analysis for face recognition. In ICPR Proceedings, pages 362–365, 2002.

[6] S. Mika, G. Rätsch, and K.-R. Müller. A mathematical programming approach to the kernel Fisher algorithm. In NIPS Proceedings, pages 591–597, 2001.

[7] S. Mika, G. Rätsch, J. Weston, B. Schölkopf, and K.-R. Müller. Fisher discriminant analysis with kernels. In IEEE Neural Networks for Signal Processing Workshop, pages 41–48, 1999.

[8] S. Mika, A. J. Smola, and B. Schölkopf. An improved training algorithm for kernel Fisher discriminants. In AISTATS Proceedings, pages 98–104, 2001.

[9] B. Schölkopf and A. Smola. Learning with Kernels: Support Vector Machines, Regularization, Optimization and Beyond. MIT Press, 2002.

[10] B. Schölkopf, A. Smola, and K. Müller. Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation, 10(5):1299–1319, 1998.

[11] J. Ye and Q. Li. LDA/QR: An efficient and effective dimension reduction algorithm and its theoretical foundation. Pattern Recognition, pages 851–854, 2004.

[12] J. Ye, Q. Li, H. Xiong, H. Park, R. Janardan, and V. Kumar. IDR/QR: An incremental dimension reduction algorithm via QR decomposition. In ACM SIGKDD Proceedings, pages 364–373, 2004.

[13] W. Zheng, L. Zhao, and C. Zou. A modified algorithm for generalized discriminant analysis. Neural Computation, 16(6):1283–1297, 2004.