nips nips2011 nips2011-183 nips2011-183-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Alyson K. Fletcher, Sundeep Rangan, Lav R. Varshney, Aniruddha Bhargava
Abstract: Many functional descriptions of spiking neurons assume a cascade structure in which inputs pass through an initial linear filtering stage that produces a low-dimensional signal driving subsequent nonlinear stages. This paper presents a novel and systematic parameter estimation procedure for such models and applies the method to two neural estimation problems: (i) compressed-sensing-based neural mapping from multi-neuron excitation, and (ii) estimation of neural receptive fields in sensory neurons. The proposed estimation algorithm models the neurons via a graphical model and then estimates the parameters in the model using a recently developed generalized approximate message passing (GAMP) method, which is based on Gaussian approximations of loopy belief propagation. In the neural connectivity problem, the GAMP-based method is computationally efficient, models the sparsity more exactly, can incorporate nonlinearities in the output, and significantly outperforms previous compressed-sensing methods. For receptive field estimation, the GAMP method can also exploit inherent structured sparsity in the linear weights. The method is validated on estimation of linear-nonlinear-Poisson (LNP) cascade models for receptive fields of salamander retinal ganglion cells.
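The abstract describes a linear-nonlinear-Poisson (LNP) cascade: a stimulus passes through a sparse linear filter, the filtered signal goes through a static nonlinearity, and spikes are drawn from a Poisson process. The following is a minimal forward-simulation sketch of such a cascade in Python/NumPy, not the authors' implementation; the exponential nonlinearity, the problem sizes, the variable names (filter w, stimulus X), and the spike-triggered-average baseline at the end are illustrative assumptions. The paper's estimator would replace that baseline with GAMP inference on a graphical model of the cascade.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative dimensions: T stimulus frames, each of dimension d.
    T, d = 5000, 40

    # Sparse linear receptive field: only a few nonzero weights,
    # mirroring the sparsity the estimation procedure exploits.
    w = np.zeros(d)
    support = rng.choice(d, size=5, replace=False)
    w[support] = rng.normal(size=5)

    # White-noise stimulus and the low-dimensional signal produced
    # by the initial linear filtering stage of the cascade.
    X = rng.normal(size=(T, d))
    u = X @ w

    # Static nonlinearity (exponential here, a common choice) maps the
    # filtered signal to a non-negative firing rate; spike counts are
    # then drawn from a Poisson distribution -- the "LNP" cascade.
    rate = np.exp(0.5 * u - 1.0)
    spikes = rng.poisson(rate)

    # Simple spike-triggered-average estimate of w, shown only as a
    # baseline; the paper's GAMP approach instead runs Gaussian
    # approximations of loopy belief propagation on a graphical model.
    w_sta = X.T @ spikes / spikes.sum()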
[1] Peter Dayan and L. F. Abbott. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. MIT Press, 2001.
[2] Odelia Schwartz, Jonathan W. Pillow, Nicole C. Rust, and Eero P. Simoncelli. Spike-triggered neural characterization. J. Vis., 6(4):13, July 2006.
[3] Liam Paninski, Jonathan W. Pillow, and Eero P. Simoncelli. Maximum Likelihood Estimation of a Stochastic Integrate-and-Fire Neural Encoding Model. Neural Computation, 16(12):2533–2561, December 2004.
[4] Tao Hu and Dmitri B. Chklovskii. Reconstruction of sparse circuits using multi-neuronal excitation (RESCUME). In Yoshua Bengio, Dale Schuurmans, John Lafferty, Chris Williams, and Aron Culotta, editors, Advances in Neural Information Processing Systems 22, pages 790–798. MIT Press, Cambridge, MA, 2009.
[5] James R. Anderson, Bryan W. Jones, Carl B. Watt, Margaret V. Shaw, Jia-Hui Yang, David DeMill, James S. Lauritzen, Yanhua Lin, Kevin D. Rapp, David Mastronarde, Pavel Koshevoy, Bradley Grimm, Tolga Tasdizen, Ross Whitaker, and Robert E. Marc. Exploring the retinal connectome. Mol. Vis., 17:355–379, February 2011.
[6] Elad Ganmor, Ronen Segev, and Elad Schneidman. The architecture of functional interaction networks in the retina. J. Neurosci., 31(8):3044–3054, February 2011.
[7] Lav R. Varshney, Per Jesper Sjöström, and Dmitri B. Chklovskii. Optimal information storage in noisy synapses under resource constraints. Neuron, 52(3):409–423, November 2006.
[8] E. J. Candès, J. Romberg, and T. Tao. Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information. IEEE Trans. Inform. Theory, 52(2):489–509, February 2006.
[9] D. L. Donoho. Compressed sensing. IEEE Trans. Inform. Theory, 52(4):1289–1306, April 2006.
[10] E. J. Candès and T. Tao. Near-optimal signal recovery from random projections: Universal encoding strategies? IEEE Trans. Inform. Theory, 52(12):5406–5425, December 2006.
[11] S. Rangan. Generalized Approximate Message Passing for Estimation with Random Linear Mixing. arXiv:1010.5141 [cs.IT], October 2010.
[12] S. Rangan, A. K. Fletcher, V. K. Goyal, and P. Schniter. Hybrid Approximate Message Passing with Applications to Group Sparsity. arXiv, 2011.
[13] D. Guo and C.-C. Wang. Random sparse linear systems observed via arbitrary channels: A decoupling principle. In Proc. IEEE Int. Symp. Inform. Th., pages 946–950, Nice, France, June 2007.
[14] David L. Donoho, Arian Maleki, and Andrea Montanari. Message-passing algorithms for compressed sensing. PNAS, 106(45):18914–18919, September 2009.
[15] David H. Hubel. Eye, Brain, and Vision. W. H. Freeman, 2nd edition, 1995.
[16] Toshihiko Hosoya, Stephen A. Baccus, and Markus Meister. Dynamic predictive coding by the retina. Nature, 436(7047):71–77, July 2005.
[17] E. J. Chichilnisky. A simple white noise analysis of neuronal light responses. Network: Computation in Neural Systems, 12:199–213, 2001.
[18] L. Paninski. Convergence properties of some spike-triggered analysis techniques. Network: Computation in Neural Systems, 14:437–464, 2003.
[19] S. Bakin. Adaptive regression and model selection in data mining problems. PhD thesis, Australian National University, Canberra, 1999.
[20] M. Yuan and Y. Lin. Model selection and estimation in regression with grouped variables. J. Royal Statist. Soc., 68:49–67, 2006.
[21] Lukas Meier, Sara van de Geer, and Peter Bühlmann. The group lasso for logistic regression. J. Royal Statist. Soc., 70:53–71, 2008.
[22] Aurélie C. Lozano, Grzegorz Swirszcz, and Naoki Abe. Group orthogonal matching pursuit for variable selection and prediction. In Proc. NIPS, Vancouver, Canada, December 2008.
[23] C. M. Bishop. Pattern Recognition and Machine Learning. Information Science and Statistics. Springer, New York, NY, 2006.
[24] Markus Meister, Jerome Pine, and Denis A. Baylor. Multi-neuronal signals from the retina: acquisition and analysis. J. Neurosci. Methods, 51(1):95–106, January 1994.
[25] Joaquin Rapela, Jerry M. Mendel, and Norberto M. Grzywacz. Estimating nonlinear receptive fields from natural images. J. Vis., 6(4):11, May 2006.
[26] D. Needell and J. A. Tropp. CoSaMP: Iterative signal recovery from incomplete and inaccurate samples. Appl. Comput. Harm. Anal., 26(3):301–321, May 2009.
[27] W. Dai and O. Milenkovic. Subspace pursuit for compressive sensing signal reconstruction. IEEE Trans. Inform. Theory, 55(5):2230–2249, May 2009.
[28] Dmitri B. Chklovskii, Bartlett W. Mel, and Karel Svoboda. Cortical rewiring and information storage. Nature, 431(7010):782–788, October 2004.
[29] Tai Sing Lee and David Mumford. Hierarchical Bayesian inference in the visual cortex. J. Opt. Soc. Am. A, 20(7):1434–1448, July 2003.
[30] Karl Friston. The free-energy principle: a unified brain theory? Nat. Rev. Neurosci., 11(2):127–138, February 2010.
[31] Guy Isely, Christopher J. Hillar, and Friedrich T. Sommer. Deciphering subsampled data: Adaptive compressive sampling as a principle of brain communication. In J. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R. S. Zemel, and A. Culotta, editors, Advances in Neural Information Processing Systems 23, pages 910–918. MIT Press, Cambridge, MA, 2010.