185 nips-2009-Orthogonal Matching Pursuit From Noisy Random Measurements: A New Analysis


Source: pdf

Author: Sundeep Rangan, Alyson K. Fletcher

Abstract: A well-known analysis of Tropp and Gilbert shows that orthogonal matching pursuit (OMP) can recover a k-sparse n-dimensional real vector from m = 4k log(n) noise-free linear measurements obtained through a random Gaussian measurement matrix with a probability that approaches one as n → ∞. This work strengthens this result by showing that a smaller number of measurements, m = 2k log(n − k), is in fact sufficient for asymptotic recovery. More generally, when the sparsity level satisfies kmin ≤ k ≤ kmax but is unknown, m = 2kmax log(n − kmin) measurements are sufficient. Furthermore, this number of measurements also suffices for detection of the sparsity pattern (support) of the vector in the presence of measurement errors, provided the signal-to-noise ratio (SNR) scales to infinity. The scaling m = 2k log(n − k) exactly matches the number of measurements required by the more complex lasso method for signal recovery under a similar SNR scaling.
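To make the recovery setting in the abstract concrete, the sketch below runs OMP on noise-free measurements y = Ax with a random Gaussian matrix A, using m = ⌈2k log(n − k)⌉ measurements as in the paper's scaling. This is an illustrative NumPy sketch, not the authors' implementation; the dimensions n = 256 and k = 8, the column normalization of A, and the random seed are assumptions chosen only for the example.

import numpy as np

def omp(A, y, k):
    # Greedy OMP: in each of k iterations, add the column of A most correlated
    # with the current residual, then refit by least squares on the chosen columns.
    m, n = A.shape
    support = []
    residual = y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat, sorted(support)

# Illustrative dimensions (assumed, not taken from the paper's experiments).
rng = np.random.default_rng(0)
n, k = 256, 8
m = int(np.ceil(2 * k * np.log(n - k)))          # the paper's sufficient scaling
A = rng.standard_normal((m, n)) / np.sqrt(m)     # random Gaussian measurement matrix
x = np.zeros(n)
true_support = rng.choice(n, size=k, replace=False)
x[true_support] = rng.standard_normal(k)
y = A @ x                                        # noise-free measurements
x_hat, est_support = omp(A, y, k)
print("support recovered:", est_support == sorted(true_support.tolist()))

In the noisy case discussed in the abstract, y would be Ax plus additive noise, and support detection succeeds asymptotically only when the SNR grows without bound.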


reference text

[1] S. Mallat. A Wavelet Tour of Signal Processing. Academic Press, second edition, 1999.

[2] A. Miller. Subset Selection in Regression. Number 95 in Monographs on Statistics and Applied Probability. Chapman & Hall/CRC, New York, second edition, 2002.

[3] E. J. Candès, J. Romberg, and T. Tao. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inform. Theory, 52(2):489–509, February 2006.

[4] D. L. Donoho. Compressed sensing. IEEE Trans. Inform. Theory, 52(4):1289–1306, April 2006.

[5] E. J. Candès and T. Tao. Near-optimal signal recovery from random projections: Universal encoding strategies? IEEE Trans. Inform. Theory, 52(12):5406–5425, December 2006.

[6] B. K. Natarajan. Sparse approximate solutions to linear systems. SIAM J. Computing, 24(2):227–234, April 1995.

[7] S. Chen, S. A. Billings, and W. Luo. Orthogonal least squares methods and their application to non-linear system identification. Int. J. Control, 50(5):1873–1896, November 1989.

[8] Y. C. Pati, R. Rezaiifar, and P. S. Krishnaprasad. Orthogonal matching pursuit: Recursive function approximation with applications to wavelet decomposition. In Conf. Rec. 27th Asilomar Conf. Sig., Sys., & Comput., volume 1, pages 40–44, Pacific Grove, CA, November 1993.

[9] G. Davis, S. Mallat, and Z. Zhang. Adaptive time-frequency decomposition. Optical Eng., 37(7):2183–2191, July 1994.

[10] J. A. Tropp and A. C. Gilbert. Signal recovery from random measurements via orthogonal matching pursuit. IEEE Trans. Inform. Theory, 53(12):4655–4666, December 2007.

[11] J. A. Tropp and A. C. Gilbert. Signal recovery from random measurements via orthogonal matching pursuit: The Gaussian case. Appl. Comput. Math. 2007-01, California Inst. of Tech., August 2007.

[12] J. A. Tropp. Greed is good: Algorithmic results for sparse approximation. IEEE Trans. Inform. Theory, 50(10):2231–2242, October 2004.

[13] M. J. Wainwright. Sharp thresholds for high-dimensional and noisy recovery of sparsity. Technical report, Univ. of California, Berkeley, Dept. of Statistics, May 2006. arXiv:math.ST/0605740v1, 30 May 2006.

[14] M. J. Wainwright. Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (lasso). IEEE Trans. Inform. Theory, 55(5):2183–2202, May 2009.

[15] D. L. Donoho and J. Tanner. Counting faces of randomly-projected polytopes when the projection radically lowers dimension. J. Amer. Math. Soc., 22(1):1–53, January 2009.

[16] D. Needell and J. A. Tropp. CoSaMP: Iterative signal recovery from incomplete and inaccurate samples. Appl. Comput. Harm. Anal., 26(3):301–321, July 2008.

[17] W. Dai and O. Milenkovic. Subspace pursuit for compressive sensing: Closing the gap between performance and complexity. arXiv:0803.0811v1 [cs.NA], March 2008.

[18] D. L. Donoho, Y. Tsaig, I. Drori, and J. L. Starck. Sparse solution of underdetermined linear equations by stagewise orthogonal matching pursuit. preprint, March 2006.

[19] D. Needell and R. Vershynin. Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit. Found. Comput. Math., 9(3):317–334, June 2008.

[20] M. J. Wainwright. Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting. Technical Report 725, Univ. of California, Berkeley, Dept. of Statistics, January 2007.

[21] A. K. Fletcher, S. Rangan, and V. K. Goyal. Necessary and sufficient conditions for sparsity pattern recovery. IEEE Trans. Inform. Theory, 55(12), December 2009. To appear. Original submission available online [33].

[22] R. Tibshirani. Regression shrinkage and selection via the lasso. J. Royal Stat. Soc., Ser. B, 58(1):267–288, 1996.

[23] S. S. Chen, D. L. Donoho, and M. A. Saunders. Atomic decomposition by basis pursuit. SIAM J. Sci. Comp., 20(1):33–61, 1999.

[24] M. Akçakaya and V. Tarokh. Noisy compressive sampling limits in linear and sublinear regimes. In Proc. Conf. on Inform. Sci. & Sys., pages 1–4, Princeton, NJ, March 2008.

[25] M. Akçakaya and V. Tarokh. Shannon theoretic limits on noisy compressive sampling. arXiv:0711.0366v1 [cs.IT], November 2007.

[26] G. Reeves. Sparse signal sampling using noisy linear projections. Technical Report UCB/EECS-2008-3, Univ. of California, Berkeley, Dept. of Elec. Eng. and Comp. Sci., January 2008.

[27] S. Aeron, M. Zhao, and V. Saligrama. On sensing capacity of sensor networks for the class of linear observation, fixed SNR models. arXiv:0704.3434v3 [cs.IT], June 2007.

[28] A. K. Fletcher and S. Rangan. Sparse support recovery from random measurements with orthogonal matching pursuit. Manuscript available online at http://www.eecs.berkeley.edu/~alyson/Publications/FletcherRangan OMP.pdf, October 2009.

[29] A. K. Fletcher, S. Rangan, and V. K. Goyal. On–off random access channels: A compressed sensing framework. arXiv:0903.1022v1 [cs.IT], March 2009.

[30] Y. Jin and B. Rao. Performance limits of matching pursuit algorithms. In Proc. IEEE Int. Symp. Inform. Th., pages 2444–2448, Toronto, Canada, June 2008.

[31] D. Wipf and B. Rao. Comparing the effects of different weight distributions on finding sparse representations. In Proc. Neural Information Process. Syst., Vancouver, Canada, December 2006.

[32] I. Karatzas and S. E. Shreve. Brownian Motion and Stochastic Calculus. Springer-Verlag, New York, NY, 2nd edition, 1991.

[33] A. K. Fletcher, S. Rangan, and V. K. Goyal. Necessary and sufficient conditions on sparsity pattern recovery. arXiv:0804.1839v1 [cs.IT], April 2008.