
357 iccv-2013-Robust Matrix Factorization with Unknown Noise


Source: pdf

Authors: Deyu Meng, Fernando De la Torre

Abstract: Many problems in computer vision can be posed as recovering a low-dimensional subspace from high-dimensional visual data. Factorization approaches to low-rank subspace estimation minimize a loss function between an observed measurement matrix and a bilinear factorization. The most popular loss functions are the L2 and L1 losses: L2 is optimal for Gaussian noise, while L1 is optimal for Laplacian-distributed noise. However, real data is often corrupted by an unknown noise distribution that is unlikely to be purely Gaussian or Laplacian. To address this problem, this paper proposes a low-rank matrix factorization problem with a Mixture of Gaussians (MoG) noise model. The MoG model is a universal approximator for any continuous distribution, and hence is able to model a wider range of noise distributions. The parameters of the MoG model can be estimated with a maximum likelihood method, while the subspace is computed with standard approaches. We illustrate the benefits of our approach in extensive synthetic and real-world experiments including structure from motion, face modeling and background subtraction.
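To make the approach described in the abstract concrete, below is a minimal, hypothetical sketch (not the authors' released code) of low-rank matrix factorization under a Mixture-of-Gaussians noise model fitted by maximum likelihood with EM: the E-step computes the responsibility of each Gaussian component for each residual entry, and the M-step updates the mixture parameters and then the factors via entry-weighted least squares. All function and variable names (mog_lrmf, n_components, etc.) are illustrative assumptions; the paper's handling of missing entries and its specific subspace computation are omitted.

```python
import numpy as np

def mog_lrmf(X, rank, n_components=3, n_iters=50, seed=0):
    """Illustrative sketch: factor X ~ U @ V.T under MoG noise via EM."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    pi = np.full(n_components, 1.0 / n_components)     # mixing proportions
    sigma2 = np.linspace(0.1, 1.0, n_components)       # component variances

    for _ in range(n_iters):
        E = X - U @ V.T                                 # residual matrix
        # E-step: responsibilities R[i, j, k] of component k for residual (i, j).
        logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * sigma2)
                - 0.5 * E[:, :, None] ** 2 / sigma2)
        logp -= logp.max(axis=2, keepdims=True)
        R = np.exp(logp)
        R /= R.sum(axis=2, keepdims=True)

        # M-step (mixture): closed-form updates of pi and sigma2.
        Nk = R.sum(axis=(0, 1))
        pi = Nk / (m * n)
        sigma2 = np.maximum((R * E[:, :, None] ** 2).sum(axis=(0, 1))
                            / np.maximum(Nk, 1e-12), 1e-8)

        # M-step (factors): each entry gets weight sum_k R[i,j,k] / sigma2[k],
        # reducing the problem to a weighted-L2 factorization solved row by row.
        W = (R / sigma2).sum(axis=2)
        reg = 1e-8 * np.eye(rank)
        for i in range(m):                              # update U with V fixed
            A = V.T * W[i]
            U[i] = np.linalg.solve(A @ V + reg, A @ X[i])
        for j in range(n):                              # update V with U fixed
            A = U.T * W[:, j]
            V[j] = np.linalg.solve(A @ U + reg, A @ X[:, j])
    return U, V, pi, sigma2

# Toy usage: rank-2 data with small Gaussian noise plus sparse outliers.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    U0, V0 = rng.standard_normal((40, 2)), rng.standard_normal((30, 2))
    X = U0 @ V0.T + 0.05 * rng.standard_normal((40, 30))
    mask = rng.random((40, 30)) < 0.1
    X[mask] += 5 * rng.standard_normal(mask.sum())      # sparse outliers
    U, V, pi, sigma2 = mog_lrmf(X, rank=2)
    err = np.linalg.norm(U @ V.T - U0 @ V0.T) / np.linalg.norm(U0 @ V0.T)
    print("relative reconstruction error:", err)
```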


reference text

[1] P. M. Q. Aguiar, J. M. F. Xavier, and M. Stosic. Spectrally optimal factorization of incomplete matrices. In CVPR, 2008. 2

[2] D. Andrews and C. Mallows. Scale mixtures of normal distributions. Journal of the Royal Statistical Society, Series B, 36(1):99–102, 1974. 2, 3

[3] C. Archambeau, N. Delannay, and M. Verleysen. Robust probabilistic projections. In ICML, 2006. 2

[4] D. J. Bartholomew. Latent Variable Models and Factor Analysis. Charles Griffin, 1987. 2

[5] R. Basri and D. W. Jacobs. Lambertian reflection and linear subspaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, 25:218–233, 2003. 5

[6] A. Buchanan and A. Fitzgibbon. Damped Newton algorithms for matrix factorization with missing data. In CVPR, 2005. 2, 4, 6

[7] E. Candès, X. D. Li, Y. Ma, and J. Wright. Robust principal component analysis? Journal of the ACM, 58, 2011. 6

[8] E. Candès and J. Romberg. l1-MAGIC: recovery of sparse signals via convex programming. Technical Report, California Institute of Technology, 2005. 4

[9] P. Chen. Optimization algorithms on subspaces: Revisiting missing data problem in low-rank matrix. International Journal of Computer Vision, 80:125–142, 2008. 1, 2

[10] F. De la Torre. A least-squares framework for component analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(6):1041–1055, 2012. 7

[11] F. De la Torre and M. J. Black. A framework for robust subspace learning. International Journal of Computer Vision, 54:117–142, 2003. 1, 2, 4, 6, 7

[12] A. P. Dempster, N. M. Laird, and D. B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 39(1):1–38, 1977. 3, 4

[13] C. Ding, D. Zhou, X. F. He, and H. Y. Zha. R1-PCA: Rotational invariant l1-norm principal component analysis for robust subspace factorization. In ICML, 2006. 1

[14] A. Eriksson and A. van den Hengel. Efficient computation of robust low-rank matrix approximations in the presence of missing data using the l1 norm. In CVPR, 2010. 1, 2, 4, 5, 6

[15] K. R. Gabriel and S. Zamir. Lower rank approximation of matrices by least squares with any choice of weights. Technometrics, 21:489–498, 1979. 2

[16] A. S. Georghiades, P. N. Belhumeur, and D. J. Kriegman. From few to many: Illumination cone models for face recognition under variable lighting and pose. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23:643–660, 2001. 2, 5

[17] G. H. Golub and C. F. van Loan. Matrix Computations. Johns Hopkins University Press, 1989. 5, 7

[18] H. Ji, C. Q. Liu, Z. W. Shen, and Y. H. Xu. Robust video denoising using low rank matrix completion. In CVPR, 2010. 1

[19] C. Julià, F. Lumbreras, and A. D. Sappa. A factorization-based approach to photometric stereo. International Journal of Imaging Systems and Technology, 21:115–119, 2011. 1

[20] Q. F. Ke and T. Kanade. Robust l1-norm factorization in the presence of outliers and missing data by alternative convex programming. In CVPR, 2005. 1, 2, 4, 5, 6

[21] J. J. Koenderink and A. J. van Doorn. The generic bilinear calibration-estimation problem. International Journal of Computer Vision, 23(3):217–234, 1997. 2

[22] N. Kwak. Principal component analysis based on l1-norm maximization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30:1672–1680, 2008. 1, 2, 5, 7

[23] B. Lakshminarayanan, G. Bouchard, and C. Archambeau. Robust Bayesian matrix factorisation. In AISTATS, 2011. 2

[24] L. Y. Li, W. M. Huang, I. Gu, and Q. Tian. Statistical modeling of complex backgrounds for foreground object detection. IEEE Transactions on Image Processing, 13(11):1459–1472, 2004. 6, 7

[25] V. Maz’ya and G. Schmidt. On approximate approximations using Gaussian kernels. IMA Journal of Numerical Analysis, 16(1):13–29, 1996. 2, 3

[26] G. J. McLachlan and K. E. Basford. Mixture Models: Inference and Applications to Clustering. Marcel Dekker, 1988. 2

[27] D. Y. Meng, Z. B. Xu, L. Zhang, and J. Zhao. A cyclic weighted median method for l1 low-rank matrix factorization with missing entries. In AAAI, 2013. 4, 6

[28] K. Mitra, S. Sheorey, and R. Chellappa. Large-scale matrix factorization with missing data under additional constraints. In NIPS, 2010. 1, 2

[Figure 3: from top to bottom, original Lobby frames (Frames 386–402) and the absolute errors computed by the different methods (MoG, IRLS, SVD, RegL1ALM, PCAL1, CWM, NN-Robust PCA); the moving-object region in Frame 402 is enlarged for better visualization.]
[Figure 4: from left to right, original frames and the backgrounds reconstructed by the different methods.]

[29] B. Moghaddam and A. Pentland. Probabilistic visual learning for object representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19:696–710, 1997. 2

[30] J. Nakamura. Image Sensors and Signal Processing for Digital Still Cameras. CRC Press, 2005. 2

[31] T. Okatani and K. Deguchi. On the Wiberg algorithm for matrix factorization in the presence of missing components. International Journal of Computer Vision, 72:329–337, 2007. 1, 2

[32] T. Okatani, T. Yoshida, and K. Deguchi. Efficient algorithm for low-rank matrix factorization with missing components and performance comparison of latest algorithms. In ICCV, 2011. 1

[33] S. Roweis. EM algorithms for PCA and SPCA. In NIPS, 1998. 2

[34] A. Shashua. On photometric issues in 3D visual recognition from a single 2D image. International Journal of Computer Vision, 21:99–122, 1997. 2

[35] N. Srebro and T. Jaakkola. Weighted low-rank approximations. In ICML, 2003. 1, 2, 4, 6

[36] P. Sturm. Algorithms for plane-based pose estimation. In CVPR, 2000. 1

[37] M. E. Tipping and C. M. Bishop. Mixtures of probabilistic principal component analysers. Neural Computation, 11:443–482, 1999. 2

[38] M. E. Tipping and C. M. Bishop. Probabilistic principal component analysis. Journal of the Royal Statistical Society, Series B, 61:611–622, 1999. 2

[39] C. Tomasi and T. Kanade. Shape and motion from image streams under orthography: a factorization method. International Journal of Computer Vision, 9:137–154, 1992. 1

[40] M. Turk and A. Pentland. Eigenfaces for recognition. Journal of Cognitive Neuroscience, 3:71–86, 1991. 1

[41] R. Vidal, R. Tron, and R. Hartley. Multiframe motion segmentation with missing data using power factorization and GPCA. International Journal of Computer Vision, 79:85–105, 2008. 1

[42] N. Y. Wang, T. S. Yao, J. D. Wang, and D. Y. Yeung. A probabilistic approach to robust matrix factorization. In ECCV, 2012. 2

[43] J. Wright, Y. G. Peng, Y. Ma, A. Ganesh, and S. Rao. Robust principal component analysis: Exact recovery of corrupted low-rank matrices by convex optimization. In NIPS, 2009. 1, 2, 4, 5, 7

[44] K. Zhao and Z. Y. Zhang. Successively alternate least square for low-rank matrix factorization with bounded missing data. Computer Vision and Image Understanding, 114:1084–1096, 2010. 1, 2

[45] Y. Q. Zheng, G. C. Liu, S. Sugimoto, S. C. Yan, and M. Okutomi. Practical low-rank matrix approximation under robust l1-norm. In CVPR, 2012. 2, 4, 5, 6, 7