nips2011-75 — reference knowledge graph by maker-knowledge-mining (source: PDF)
Author: Biljana Petreska, Byron M. Yu, John P. Cunningham, Gopal Santhanam, Stephen I. Ryu, Krishna V. Shenoy, Maneesh Sahani
Abstract: Simultaneous recordings of many neurons embedded within a recurrently-connected cortical network may provide concurrent views into the dynamical processes of that network, and thus its computational function. In principle, these dynamics might be identified by purely unsupervised, statistical means. Here, we show that a Hidden Switching Linear Dynamical Systems (HSLDS) model—in which multiple linear dynamical laws approximate a nonlinear and potentially non-stationary dynamical process—is able to distinguish different dynamical regimes within single-trial motor cortical activity associated with the preparation and initiation of hand movements. The regimes are identified without reference to behavioural or experimental epochs, but transitions between them nonetheless correlate strongly with external events whose timing may vary from trial to trial. The HSLDS model also performs better than recent comparable models in predicting the firing rate of an isolated neuron based on the firing rates of others, suggesting that it captures more of the “shared variance” of the data. Thus, the method is able to trace the dynamical processes underlying the coordinated evolution of network activity in a way that appears to reflect its computational role.
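The generative structure the abstract describes—a discrete Markov switch variable selecting among linear dynamical laws that drive a low-dimensional latent state, which in turn drives observed population activity—can be sketched as follows. This is a minimal illustrative simulation, not the paper's model or fitted parameters; all dimensions, dynamics matrices, and noise scales are assumptions chosen for the example.

```python
# Minimal generative sketch of a switching linear dynamical system (SLDS).
# Dimensions, dynamics matrices, and noise levels are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

K, D, N, T = 2, 3, 10, 100   # regimes, latent dim, neurons, time steps

# Per-regime linear dynamics A_k and a shared observation matrix C.
A = [np.eye(D) * 0.99, np.eye(D) * 0.90]       # two linear dynamical laws
C = rng.normal(size=(N, D))                    # latent state -> firing rates
Pi = np.array([[0.95, 0.05],                   # Markov transition matrix
               [0.05, 0.95]])                  # for the switch variable

s = 0                        # discrete regime s_t
x = np.zeros(D)              # continuous latent state x_t
S, Y = [], []
for t in range(T):
    s = rng.choice(K, p=Pi[s])                     # sample next regime
    x = A[s] @ x + rng.normal(scale=0.1, size=D)   # linear law for regime s
    y = C @ x + rng.normal(scale=0.5, size=N)      # noisy observed activity
    S.append(s)
    Y.append(y)

Y = np.array(Y)              # (T, N): simulated population recording
```

In the unsupervised setting of the paper, only `Y` would be observed; the regime sequence `S` and latent trajectory `x` are what inference (e.g. approximate EM for switching LDS models, refs [12, 13, 19]) must recover.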
[1] A. C. Smith and E. N. Brown. Estimating a state-space model from point process observations. Neural Computation, 15(5):965–991, 2003.
[2] M. Stopfer, V. Jayaraman, and G. Laurent. Intensity versus identity coding in an olfactory system. Neuron, 39:991–1004, 2003.
[3] S. L. Brown, J. Joseph, and M. Stopfer. Encoding a temporally structured stimulus with a temporally structured neural representation. Nature Neuroscience, 8(11):1568–1576, 2005.
[4] R. Levi, R. Varona, Y. I. Arshavsky, M. I. Rabinovich, and A. I. Selverston. The role of sensory network dynamics in generating a motor program. Journal of Neuroscience, 25(42):9807–9815, 2005.
[5] O. Mazor and G. Laurent. Transient dynamics versus fixed points in odor representations by locust antennal lobe projection neurons. Neuron, 48:661–673, 2005.
[6] B. M. Broome, V. Jayaraman, and G. Laurent. Encoding and decoding of overlapping odor sequences. Neuron, 51:467–482, 2006.
[7] M. M. Churchland, B. M. Yu, M. Sahani, and K. V. Shenoy. Techniques for extracting single-trial activity patterns from large-scale neural recordings. Current Opinion in Neurobiology, 17(5):609–618, 2007.
[8] B. M. Yu, J. P. Cunningham, G. Santhanam, S. I. Ryu, K. V. Shenoy, and M. Sahani. Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity. Journal of Neurophysiology, 102:614–635, 2009.
[9] Y. Bar-Shalom and X.-R. Li. Estimation and Tracking: Principles, Techniques and Software. Artech House, Norwood, MA, 1998.
[10] B. Mesot and D. Barber. Switching linear dynamical systems for noise robust speech recognition. IEEE Transactions on Audio, Speech, and Language Processing, 15(6):1850–1858, 2007.
[11] W. Wu, M. J. Black, D. Mumford, Y. Gao, E. Bienenstock, and J. P. Donoghue. Modeling and decoding motor cortical activity using a switching Kalman filter. IEEE Transactions on Biomedical Engineering, 51(6):933–942, 2004.
[12] D. Barber. Bayesian Reasoning and Machine Learning. Cambridge University Press, 2011. In press.
[13] K. P. Murphy. Switching Kalman filters. Technical Report 98-10, Compaq Cambridge Research Lab, 1998.
[14] B. M. Yu, A. Afshar, G. Santhanam, S. I. Ryu, K. V. Shenoy, and M. Sahani. Extracting dynamical structure embedded in neural activity. In Y. Weiss, B. Schölkopf, and J. Platt, editors, Advances in Neural Information Processing Systems 18, pages 1545–1552. MIT Press, Cambridge, MA, 2006.
[15] M. West and J. Harrison. Bayesian Forecasting and Dynamic Models. Springer, 1999.
[16] D. L. Alspach and H. W. Sorenson. Nonlinear Bayesian estimation using Gaussian sum approximations. IEEE Transactions on Automatic Control, 17(4):439–448, 1972.
[17] X. Boyen and D. Koller. Tractable inference for complex stochastic processes. In Proceedings of the 14th Conference on Uncertainty in Artificial Intelligence (UAI), pages 33–42. Morgan Kaufmann, 1998.
[18] T. Minka. A Family of Algorithms for Approximate Bayesian Inference. PhD Thesis, MIT Media Lab, 2001.
[19] D. Barber. Expectation correction for smoothed inference in switching linear dynamical systems. Journal of Machine Learning Research, 7:2515–2540, 2006.
[20] A. J. Viterbi. Error bounds for convolutional codes and an asymptotically optimum decoding algorithm. IEEE Transactions on Information Theory, IT-13:260–267, 1967.
[21] N. A. Thacker and P. A. Bromiley. The effects of a square root transform on a Poisson distributed quantity. Technical Report 2001-010, University of Manchester, 2001.
[22] A. P. Georgopoulos, A. B. Schwartz, and R. E. Kettner. Neuronal population coding of movement direction. Science, 233:1416–1419, 1986.
[23] W. Erlhagen and G. Schöner. Dynamic field theory of movement preparation. Psychological Review, 109:545–572, 2002.