
179 nips-2011-Multilinear Subspace Regression: An Orthogonal Tensor Decomposition Approach


Source: pdf

Author: Qibin Zhao, Cesar F. Caiafa, Danilo P. Mandic, Liqing Zhang, Tonio Ball, Andreas Schulze-Bonhage, Andrzej S. Cichocki

Abstract: A multilinear subspace regression model based on so-called latent variable decomposition is introduced. Unlike standard regression methods, which typically employ matrix (2D) data representations followed by vector subspace transformations, the proposed approach uses tensor subspace transformations to model common latent variables across both the independent and dependent data. The proposed approach aims to maximize the correlation between the latent variables so derived and is shown to be suitable for the prediction of multidimensional dependent data from multidimensional independent data; for the estimation of the latent variables, an algorithm based on the Multilinear Singular Value Decomposition (MSVD) of a specially defined cross-covariance tensor is introduced. It is then shown that, in this way, the existing Partial Least Squares (PLS) and N-way PLS regression algorithms can be unified within the same framework. Simulations on benchmark synthetic data confirm the advantages of the proposed approach in terms of predictive ability and robustness, especially for small sample sizes. The potential of the proposed technique is further illustrated on a real-world task: the decoding of human intracranial electrocorticogram (ECoG) from a simultaneously recorded scalp electroencephalogram (EEG).
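
Below is a minimal, illustrative sketch (not the authors' implementation) of the idea described in the abstract: forming a cross-covariance tensor between a matrix-valued predictor and a vector-valued response, taking its multilinear SVD (mode-wise leading singular vectors), and using the resulting loadings to extract maximally correlated latent variables. The shapes, variable names, and the simple rank-one extraction are assumptions made purely for illustration.

```python
# Sketch only: latent-variable extraction via a multilinear SVD of a
# cross-covariance tensor. Assumed setup, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: N samples of a matrix-valued predictor X (I1 x I2)
# and a vector-valued response Y (J1,)
N, I1, I2, J1 = 50, 8, 6, 4
X = rng.standard_normal((N, I1, I2))
Y = rng.standard_normal((N, J1))

# Centre both blocks over samples
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)

# Cross-covariance tensor C (I1 x I2 x J1): sum over samples of x_n outer y_n
C = np.einsum('nij,nk->ijk', Xc, Yc)

def leading_mode_vector(T, mode):
    """Leading left singular vector of the mode-n unfolding of tensor T."""
    unfolded = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
    U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
    return U[:, 0]

# One loading vector per mode (a single rank-one "component")
w1 = leading_mode_vector(C, 0)   # loadings for mode 1 of X
w2 = leading_mode_vector(C, 1)   # loadings for mode 2 of X
q  = leading_mode_vector(C, 2)   # loadings for Y

# Latent variables: project each sample onto the multilinear subspace
t = np.einsum('nij,i,j->n', Xc, w1, w2)   # latent scores for X
u = Yc @ q                                 # latent scores for Y

# Simple inner regression between latent variables, then prediction of Y
b = (t @ u) / (t @ t)
Y_hat = np.outer(b * t, q) + Y.mean(axis=0)
print("correlation of latent variables:", np.corrcoef(t, u)[0, 1])
```

Extracting further components would proceed by deflating Xc and Yc with the fitted rank-one terms and repeating the procedure, analogously to how standard PLS iterates; that step is omitted here for brevity.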


reference text

[1] L. Wolf, H. Jhuang, and T. Hazan. Modeling appearances with low-rank SVM. In IEEE Conference on Computer Vision and Pattern Recognition, pages 1–6. IEEE, 2007.

[2] H. Pirsiavash, D. Ramanan, and C. Fowlkes. Bilinear classifiers for visual recognition. In Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, and A. Culotta, editors, Advances in Neural Information Processing Systems 22, pages 1482–1490. 2009.

[3] H. Lu, K.N. Plataniotis, and A.N. Venetsanopoulos. MPCA: Multilinear principal component analysis of tensor objects. IEEE Transactions on Neural Networks, 19(1):18–39, 2008.

[4] S. Yan, D. Xu, Q. Yang, L. Zhang, X. Tang, and H.J. Zhang. Multilinear discriminant analysis for face recognition. IEEE Transactions on Image Processing, 16(1):212–220, 2007.

[5] D. Tao, X. Li, X. Wu, and S.J. Maybank. General tensor discriminant analysis and Gabor features for gait recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(10):1700–1715, 2007.

[6] A.K. Smilde and H.A.L. Kiers. Multiway covariates regression models. Journal of Chemometrics, 13(1):31–48, 1999.

[7] X. He, D. Cai, and P. Niyogi. Tensor subspace analysis. Advances in Neural Information Processing Systems, 18:499, 2006.

[8] T.G. Kolda and B.W. Bader. Tensor decompositions and applications. SIAM Review, 51(3):455–500, 2009.

[9] A. Cichocki, R. Zdunek, A. H. Phan, and S. I. Amari. Nonnegative Matrix and Tensor Factorizations. John Wiley & Sons, 2009.

[10] E. Acar, D.M. Dunlavy, T.G. Kolda, and M. Mørup. Scalable tensor factorizations for incomplete data. Chemometrics and Intelligent Laboratory Systems, 2010.

[11] R. Bro, R.A. Harshman, N.D. Sidiropoulos, and M.E. Lundy. Modeling multi-way data with linearly dependent loadings. Journal of Chemometrics, 23(7-8):324–340, 2009.

[12] S. Wold, M. Sjöström, and L. Eriksson. PLS-regression: A basic tool of chemometrics. Chemometrics and Intelligent Laboratory Systems, 58:109–130, 2001.

[13] H. Wold. Soft modeling by latent variables: The nonlinear iterative partial least squares approach. In Perspectives in Probability and Statistics: Papers in Honour of M. S. Bartlett, pages 520–540, 1975.

[14] A. Krishnan, L.J. Williams, A.R. McIntosh, and H. Abdi. Partial least squares (PLS) methods for neuroimaging: A tutorial and review. NeuroImage, 56(2):455–475, 2011.

[15] H. Abdi. Partial least squares regression and projection on latent structure regression (PLS Regression). Wiley Interdisciplinary Reviews: Computational Statistics, 2(1):97–106, 2010.

[16] R. Rosipal and N. Krämer. Overview and recent advances in partial least squares. In Subspace, Latent Structure and Feature Selection, volume 3940 of Lecture Notes in Computer Science, pages 34–51. Springer, 2006.

[17] R. Bro. Multiway calibration. Multilinear PLS. Journal of Chemometrics, 10(1):47–61, 1996.

[18] L. De Lathauwer. Decompositions of a higher-order tensor in block terms - Part II: Definitions and uniqueness. SIAM J. Matrix Anal. Appl, 30(3):1033–1066, 2008.

[19] L. De Lathauwer, B. De Moor, and J. Vandewalle. A multilinear singular value decomposition. SIAM Journal on Matrix Analysis and Applications, 21(4):1253–1278, 2000.

[20] M. Velliste, S. Perel, M.C. Spalding, A.S. Whitford, and A.B. Schwartz. Cortical control of a prosthetic arm for self-feeding. Nature, 453(7198):1098–1101, 2008.

[21] Z.C. Chao, Y. Nagasaka, and N. Fujii. Long-term asynchronous decoding of arm motion using electrocorticographic signals in monkeys. Frontiers in Neuroengineering, 3(3), 2010.

[22] T. Pistohl, T. Ball, A. Schulze-Bonhage, A. Aertsen, and C. Mehring. Prediction of arm movement trajectories from ECoG-recordings in humans. Journal of Neuroscience Methods, 167(1):105–114, 2008.

[23] T.J. Bradberry, R.J. Gentili, and J.L. Contreras-Vidal. Reconstructing three-dimensional hand movements from noninvasive electroencephalographic signals. The Journal of Neuroscience, 30(9):3432, 2010.