NIPS 2008, paper 198: reference list
Authors: Sundeep Rangan, Vivek Goyal, Alyson K. Fletcher
Abstract: This paper addresses the problem of sparsity pattern detection for unknown k-sparse n-dimensional signals observed through m noisy, random linear measurements. Sparsity pattern recovery arises in a number of settings including statistical model selection, pattern detection, and image acquisition. The main results in this paper are necessary and sufficient conditions for asymptotically-reliable sparsity pattern recovery in terms of the dimensions m, n and k as well as the signal-to-noise ratio (SNR) and the minimum-to-average ratio (MAR) of the nonzero entries of the signal. We show that m > 2k log(n − k)/(SNR · MAR) is necessary for any algorithm to succeed, regardless of complexity; this matches a previous sufficient condition for maximum likelihood estimation within a constant factor under certain scalings of k, SNR and MAR with n. We also show a sufficient condition for a computationally-trivial thresholding algorithm that is larger than the previous expression by only a factor of 4(1 + SNR) and larger than the requirement for lasso by only a factor of 4/MAR. This provides insight on the precise value and limitations of convex programming-based algorithms.
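As a rough illustration of how the abstract's scaling laws compare, the bounds can be evaluated numerically. This is a sketch only: the parameter values below are illustrative and not taken from the paper's experiments, and the expressions omit lower-order terms.

```python
import math

def necessary_bound(k, n, snr, mar):
    # Necessary for ANY algorithm, regardless of complexity:
    # m > 2k log(n - k) / (SNR * MAR)
    return 2 * k * math.log(n - k) / (snr * mar)

def thresholding_bound(k, n, snr, mar):
    # Sufficient condition for the computationally-trivial thresholding
    # algorithm: larger than the necessary bound by a factor of 4(1 + SNR).
    return 4 * (1 + snr) * necessary_bound(k, n, snr, mar)

def thresholding_vs_lasso_factor(mar):
    # Thresholding needs about 4/MAR times more measurements than lasso,
    # so signals with weak minimum entries penalize thresholding most.
    return 4 / mar

# Hypothetical example values: n = 1000, k = 20, SNR = 10, MAR = 0.5
n, k, snr, mar = 1000, 20, 10.0, 0.5
print(f"necessary:    m > {necessary_bound(k, n, snr, mar):.1f}")
print(f"thresholding: m > {thresholding_bound(k, n, snr, mar):.1f}")
print(f"thresholding/lasso gap: {thresholding_vs_lasso_factor(mar):.1f}x")
```

Note how the gap between the universal lower bound and the thresholding bound grows linearly in SNR, while the gap to lasso depends only on MAR.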
[1] M. Lewicki. Efficient coding of natural sounds. Nature Neuroscience, 5:356–363, 2002.
[2] B. A. Olshausen and D. J. Field. Sparse coding of sensory inputs. Curr. Op. in Neurobiology, 14:481–487, 2004.
[3] C. J. Rozell, D. H. Johnson, R. G. Baraniuk, and B. A. Olshausen. Sparse coding via thresholding and local competition in neural circuits. Neural Computation, 2008. In press.
[4] B. K. Natarajan. Sparse approximate solutions to linear systems. SIAM J. Computing, 24(2):227–234, April 1995.
[5] S. G. Mallat and Z. Zhang. Matching pursuits with time-frequency dictionaries. IEEE Trans. Signal Process., 41(12):3397–3415, Dec. 1993.
[6] S. S. Chen, D. L. Donoho, and M. A. Saunders. Atomic decomposition by basis pursuit. SIAM J. Sci. Comp., 20(1):33–61, 1999.
[7] R. Tibshirani. Regression shrinkage and selection via the lasso. J. Royal Stat. Soc., Ser. B, 58(1):267–288, 1996.
[8] D. L. Donoho, M. Elad, and V. N. Temlyakov. Stable recovery of sparse overcomplete representations in the presence of noise. IEEE Trans. Inform. Theory, 52(1):6–18, Jan. 2006.
[9] J. A. Tropp. Greed is good: Algorithmic results for sparse approximation. IEEE Trans. Inform. Theory, 50(10):2231–2242, Oct. 2004.
[10] J. A. Tropp. Just relax: Convex programming methods for identifying sparse signals in noise. IEEE Trans. Inform. Theory, 52(3):1030–1051, March 2006.
[11] E. J. Candès, J. Romberg, and T. Tao. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inform. Theory, 52(2):489–509, Feb. 2006.
[12] D. L. Donoho. Compressed sensing. IEEE Trans. Inform. Theory, 52(4):1289–1306, April 2006.
[13] E. J. Candès and T. Tao. Near-optimal signal recovery from random projections: Universal encoding strategies? IEEE Trans. Inform. Theory, 52(12):5406–5425, Dec. 2006.
[14] M. J. Wainwright. Sharp thresholds for high-dimensional and noisy recovery of sparsity. arXiv:math.ST/0605740v1, May 2006.
[15] S. Sarvotham, D. Baron, and R. G. Baraniuk. Measurements vs. bits: Compressed sensing meets information theory. In Proc. 44th Ann. Allerton Conf. on Commun., Control and Comp., Monticello, IL, Sept. 2006.
[16] A. K. Fletcher, S. Rangan, and V. K. Goyal. Rate-distortion bounds for sparse approximation. In IEEE Statist. Sig. Process. Workshop, pages 254–258, Madison, WI, Aug. 2007.
[17] M. J. Wainwright. Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting. Tech. Report 725, Univ. of California, Berkeley, Dept. of Stat., Jan. 2007.
[18] V. K. Goyal, A. K. Fletcher, and S. Rangan. Compressive sampling and lossy compression. IEEE Sig. Process. Mag., 25(2):48–56, March 2008.
[19] G. Reeves. Sparse signal sampling using noisy linear projections. Tech. Report UCB/EECS-2008-3, Univ. of California, Berkeley, Dept. of Elec. Eng. and Comp. Sci., Jan. 2008.
[20] M. Akçakaya and V. Tarokh. Shannon theoretic limits on noisy compressive sampling. arXiv:0711.0366v1 [cs.IT], Nov. 2007.
[21] M. Akçakaya and V. Tarokh. Noisy compressive sampling limits in linear and sublinear regimes. In Proc. Conf. on Inform. Sci. & Sys., Princeton, NJ, March 2008.
[22] J. Haupt and R. Nowak. Signal reconstruction from noisy random projections. IEEE Trans. Inform. Theory, 52(9):4036–4048, Sept. 2006.
[23] A. K. Fletcher, S. Rangan, and V. K. Goyal. Necessary and sufficient conditions on sparsity pattern recovery. arXiv:0804.1839v1 [cs.IT], April 2008.
[24] H. Rauhut, K. Schnass, and P. Vandergheynst. Compressed sensing and redundant dictionaries. IEEE Trans. Inform. Theory, 54(5):2210–2219, May 2008.
[25] S. Aeron, M. Zhao, and V. Saligrama. On sensing capacity of sensor networks for the class of linear observation, fixed SNR models. arXiv:0704.3434v3 [cs.IT]., June 2007.