nips nips2011 nips2011-276 nips2011-276-reference knowledge-graph by maker-knowledge-mining

276 nips-2011-Structured sparse coding via lateral inhibition


Source: pdf

Author: Arthur D. Szlam, Karol Gregor, Yann LeCun

Abstract: This work describes a conceptually simple method for structured sparse coding and dictionary design. Supposing a dictionary with K atoms, we introduce a structure as a set of penalties or interactions between every pair of atoms. We describe modifications of standard sparse coding algorithms for inference in this setting, and describe experiments showing that these algorithms are efficient. We show that interesting dictionaries can be learned for interactions that encode tree structures or locally connected structures. Finally, we show that our framework allows us to learn the values of the interactions from the data, rather than having them pre-specified.
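The abstract describes inference with pairwise interaction penalties between dictionary atoms. A minimal sketch of that idea, not the paper's exact algorithm: coordinate descent on an energy of the form 0.5*||x - Dz||^2 + lam*||z||_1 + |z|^T S |z|, where a symmetric, zero-diagonal matrix S penalizes co-activation of atom pairs ("lateral inhibition"). The function and variable names below are illustrative, and the penalty form is an assumption based on the abstract.

```python
import numpy as np

def soft_threshold(v, t):
    """Scalar soft-thresholding operator: shrink v toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lateral_inhibition_coding(x, D, S, lam=0.1, n_iter=100):
    """Coordinate descent for the (assumed) energy
        0.5*||x - D z||^2 + lam*||z||_1 + |z|^T S |z|,
    with S symmetric, nonnegative, zero on the diagonal.
    An active atom j raises the effective l1 weight on every
    atom i with S[i, j] > 0, suppressing co-activation."""
    K = D.shape[1]
    z = np.zeros(K)
    residual = x.astype(float).copy()      # maintains x - D z
    col_norms = (D ** 2).sum(axis=0)
    for _ in range(n_iter):
        for i in range(K):
            # remove atom i's current contribution from the residual
            residual += D[:, i] * z[i]
            rho = D[:, i] @ residual
            # effective threshold grows with active inhibiting neighbors
            # (factor 2 from the symmetric quadratic |z|^T S |z|)
            t = lam + 2.0 * (S[i] @ np.abs(z))
            z[i] = soft_threshold(rho, t) / col_norms[i]
            residual -= D[:, i] * z[i]
    return z
```

With S = 0 this reduces to plain lasso-style coordinate descent; with a strong inhibitory link between two atoms, whichever atom activates first suppresses the other, which is the winner-take-all behavior the lateral-inhibition structure is meant to induce.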


reference text

Ackley, D., Hinton, G., and Sejnowski, T. (1985). A learning algorithm for Boltzmann machines. Cognitive Science, 9(1):147–169.

Aharon, M., Elad, M., and Bruckstein, A. (2006). K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Transactions on Signal Processing, 54(11):4311–4322.

Baraniuk, R. G., Cevher, V., Duarte, M. F., and Hegde, C. (2009). Model-based compressive sensing.

Beck, A. and Teboulle, M. (2009). A fast iterative shrinkage-thresholding algorithm with application to wavelet-based image deblurring. In ICASSP'09, pages 693–696.

Druckmann, S. and Chklovskii, D. (2010). Over-complete representations on recurrent neural networks can support persistent percepts.

Garrigues, P. and Olshausen, B. (2008). Learning horizontal connections in a sparse coding model of natural images. Advances in Neural Information Processing Systems, 20:505–512.

Garrigues, P. and Olshausen, B. (2010). Group sparse coding with a Laplacian scale mixture prior. In Lafferty, J., Williams, C. K. I., Shawe-Taylor, J., Zemel, R., and Culotta, A., editors, Advances in Neural Information Processing Systems 23, pages 676–684.

Gregor, K. and LeCun, Y. (2010). Emergence of complex-like cells in a temporal product network with local receptive fields. arXiv preprint arXiv:1006.0448.

Hopfield, J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences of the United States of America, 79(8):2554.

Huang, J., Zhang, T., and Metaxas, D. N. (2009). Learning with structured sparsity. In ICML, page 53.

Hyvarinen, A. and Hoyer, P. (2001). A two-layer sparse coding model learns simple and complex cell receptive fields and topography from natural images. Vision Research, 41(18):2413–2423.

Jacob, L., Obozinski, G., and Vert, J.-P. (2009). Group lasso with overlap and graph lasso. In Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09), pages 433–440, New York, NY, USA. ACM.

Jenatton, R., Mairal, J., Obozinski, G., and Bach, F. (2010). Proximal methods for sparse hierarchical dictionary learning. In International Conference on Machine Learning (ICML).

Kavukcuoglu, K., Ranzato, M., Fergus, R., and LeCun, Y. (2009). Learning invariant features through topographic filter maps. In Proc. International Conference on Computer Vision and Pattern Recognition (CVPR'09). IEEE.

Kim, S. and Xing, E. P. (2010). Tree-guided group lasso for multi-task regression with structured sparsity. In ICML, pages 543–550.

Li, Y. and Osher, S. (2009). Coordinate descent optimization for l1 minimization with application to compressed sensing; a greedy algorithm. Inverse Problems and Imaging, 3(3):487–503.

Olshausen, B. and Field, D. (1996). Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature, 381(6583):607–609.

Wu, T. T. and Lange, K. (2008). Coordinate descent algorithms for lasso penalized regression. Annals of Applied Statistics, 2:224.