nips nips2001 nips2001-82 knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Xiaohui Xie, Martin A. Giese
Abstract: Asymmetric lateral connections are one possible mechanism that can account for the direction selectivity of cortical neurons. We present a mathematical analysis for a class of these models. Contrasting with earlier theoretical work that has relied on methods from linear systems theory, we study the network’s nonlinear dynamic properties that arise when the threshold nonlinearity of the neurons is taken into account. We show that such networks have stimulus-locked traveling pulse solutions that are appropriate for modeling the responses of direction selective cortical neurons. In addition, our analysis shows that outside a certain regime of stimulus speeds the stability of these solutions breaks down, giving rise to another class of solutions that are characterized by specific spatiotemporal periodicity. This predicts that if direction selectivity in the cortex is mainly achieved by asymmetric lateral connections, lurching activity waves might be observable in ensembles of direction selective cortical neurons within appropriate regimes of the stimulus speed.
Reference: text
sentIndex sentText sentNum sentScore
1 Generating velocity tuning by asymmetric recurrent connections Xiaohui Xie and Martin A. [sent-1, score-0.378]
2 edu Abstract Asymmetric lateral connections are one possible mechanism that can account for the direction selectivity of cortical neurons. [sent-5, score-0.556]
3 Contrasting with earlier theoretical work that has relied on methods from linear systems theory, we study the network’s nonlinear dynamic properties that arise when the threshold nonlinearity of the neurons is taken into account. [sent-7, score-0.4]
4 We show that such networks have stimulus-locked traveling pulse solutions that are appropriate for modeling the responses of direction selective cortical neurons. [sent-8, score-1.228]
5 In addition, our analysis shows that outside a certain regime of stimulus speeds the stability of these solutions breaks down, giving rise to another class of solutions that are characterized by specific spatiotemporal periodicity. [sent-9, score-0.836]
6 This predicts that if direction selectivity in the cortex is mainly achieved by asymmetric lateral connections, lurching activity waves might be observable in ensembles of direction selective cortical neurons within appropriate regimes of the stimulus speed. [sent-10, score-1.658]
7 1 Introduction Classical models for the direction selectivity in the primary visual cortex have assumed feed-forward mechanisms, like multiplication or gating of afferent thalamo-cortical inputs (e. [sent-11, score-0.426]
8 [1, 2, 3]), or linear spatio-temporal filtering followed by a nonlinear operation (e. [sent-13, score-0.125]
9 The existence of strong lateral connectivity has motivated modeling studies, which have shown that the properties of direction selective cortical neurons can also be accurately reproduced by recurrent neural network models with asymmetric lateral excitatory or inhibitory connections [6, 7]. [sent-16, score-0.908]
10 Since these biophysically detailed models are not accessible for mathematical analysis, more simplified models that are amenable to such analysis have been proposed. [sent-17, score-0.187]
11 Such analysis was based on methods from linear systems theory by neglecting the nonlinear properties of the neurons [6, 8, 9]. [sent-18, score-0.289]
12 The nonlinear dynamic phenomena resulting from the interplay between the recurrent connectivity and the nonlinear threshold characteristics of the neurons have not been tractable in this theoretical framework. [sent-19, score-0.554]
13 In this paper we present a mathematical analysis that takes the nonlinear behavior of the individual neurons into account. [sent-20, score-0.322]
14 We present the result of the analysis of such networks for two types of threshold nonlinearities, for which closed-form analytical solutions of the network dynamics can be derived. [sent-21, score-0.491]
15 We show that such nonlinear networks have a class of form-stable solutions, in the following referred to as stimulus-locked traveling pulses, which are suitable for modeling the activity of direction selective neurons. [sent-22, score-0.865]
16 Contrary to networks with linear neurons, the stability of the traveling pulse solutions in the nonlinear network can break down, giving rise to another class of solutions (lurching activity waves) that is characterized by spatio-temporal periodicity. [sent-23, score-1.571]
17 Our mathematical analysis and simulations showed that recurrent models with biologically realistic degrees of direction selectivity typically also show transitions between traveling pulse and lurching solutions. [sent-24, score-1.48]
18 2 Basic model Dynamic neural fields have been proposed to model the average behavior of large ensembles of neurons [10, 11, 12]. [sent-25, score-0.175]
19 The scalar neural activity distribution characterizes the average activity at each time of an ensemble of functionally similar neurons that code for a position, which can be any abstract stimulus parameter. [sent-26, score-0.584]
20 By the continuous approximation of biophysically discrete neuronal dynamics it is in some cases possible to treat the nonlinear neural dynamics analytically. [sent-27, score-0.5]
21 The field dynamics of the neural activation variable $u(x,t)$ is described by: $\tau\,\partial_t u(x,t) = -u(x,t) + \int w(x-x')\, f(u(x',t))\, dx' + s(x,t)$ (1) [sent-28, score-0.294]
22 This dynamics is essentially a leaky integrator with a total input on the right-hand side, which includes a feedforward input term $s(x,t)$ and a feedback term that integrates the recurrent contributions from other laterally connected neurons. [sent-29, score-0.244]
23 The interaction kernel characterizes the average synaptic connection strength between the neurons coding one position and the neurons coding another. [sent-30, score-0.416]
24 The activation function is monotonically increasing, and it introduces the nonlinearity that makes the network dynamics difficult to analyze. [sent-32, score-0.214]
25 With a moving stimulus at constant velocity, it is often convenient to transform from the static coordinate to the moving frame by a change of variable. [sent-33, score-0.713]
26 Therefore the traveling pulse solution driven by the moving stimulus can be found by solving Eq. [sent-37, score-1.187]
27 (3), and the stability of the traveling pulse can be studied by perturbing the stationary solution in Eq. [sent-38, score-1.197]
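The moving-frame transform invoked here can be written out explicitly. The following is a hedged reconstruction in my own notation ($\xi = x - vt$ for the moving coordinate, $v$ the stimulus velocity, and $\tau$, $w$, $f$, $s$ as in a standard neural field model), not a verbatim equation from the paper:

```latex
% change of variable \xi = x - vt, so \partial_t\big|_x = \partial_t\big|_\xi - v\,\partial_\xi
\tau\,\partial_t u(\xi,t)
  = -\,u(\xi,t) + \tau v\,\partial_\xi u(\xi,t)
  + \int w(\xi-\xi')\, f\big(u(\xi',t)\big)\, d\xi' + s(\xi)
```

A stimulus-locked traveling pulse is then a stationary solution of this equation, obtained by setting $\partial_t u = 0$.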
28 For this purpose we consider only one-dimensional neural fields and assume that the nonlinear activation function is either a step function or a linear threshold function. [sent-44, score-0.342]
29 3 Step activation function We first consider a step activation function, which equals one when its argument exceeds the threshold and is zero otherwise. [sent-45, score-0.256]
30 This form of activation function approximates activities of neurons, which, by saturation, are either active or inactive. [sent-46, score-0.155]
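Both activation functions named above are easy to write down; a minimal sketch (the threshold parameter `theta` and the vectorized NumPy form are my choices, not the paper's notation):

```python
import numpy as np

def step_activation(u, theta=0.0):
    """Step activation: active (1) above the threshold, inactive (0) below."""
    return (u > theta).astype(float)

def linear_threshold(u, theta=0.0):
    """Linear-threshold (rectified linear) activation: zero below the
    threshold, linear above it."""
    return np.maximum(u - theta, 0.0)

print(step_activation(np.array([-0.5, 0.5])))        # 0 below, 1 above
print(linear_threshold(np.array([-0.5, 2.0]), 0.5))  # 0 below, u - theta above
```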
31 For the one-dimensional case, we assume that only a single stationary excited regime exists, located between two boundary points. [sent-47, score-0.283]
32 Only neurons inside this regime contribute to the integral, and accordingly Eq. [sent-48, score-0.281]
33 In the moving frame, the spatial shape of the stationary solution obeys the ordinary differential equation $u(\xi) - \tau v\, u'(\xi) = \int_{\xi_1}^{\xi_2} w(\xi - \xi')\, d\xi' + s(\xi)$, where $\xi_1$ and $\xi_2$ denote the boundaries of the excited regime. [sent-50, score-0.221]
34 The solution of the above equation can be found by treating the boundaries of the excited regime as fixed parameters, and solving Eq. [sent-53, score-0.131]
35 1 Stability of the traveling pulse solution The stability of the traveling pulse solution can be analyzed by perturbing the stationary solution in the moving coordinate system. [sent-60, score-2.327]
36 The traveling pulse solution is asymptotically stable only if the real parts of all eigenvalues are negative. [sent-68, score-1.002]
37 2 Simulation results of the step activation function model We use an asymmetric example interaction kernel (its closed-form expression was garbled in extraction) [sent-70, score-0.128]
38 and numerically simulate the dynamics, comparing the simulation results with the above mathematical analysis. [sent-73, score-0.332]
39 The stimulus used is a moving bar with constant width and amplitude. [sent-74, score-0.33]
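A field driven by such a moving bar can be sketched numerically. The kernel shape, the bar width and amplitude, and all parameter values below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

def simulate_field(n=128, dt=0.05, tau=1.0, steps=400, v=0.5):
    """Euler integration of the leaky-integrator field dynamics
    tau * du/dt = -u + K f(u) + s on a periodic 1D domain,
    driven by a moving bar of constant width and amplitude."""
    x = np.linspace(-np.pi, np.pi, n, endpoint=False)
    dx = x[1] - x[0]
    # asymmetric interaction kernel: shifted local excitation plus
    # uniform inhibition (the shift breaks left/right symmetry)
    d = x[:, None] - x[None, :]
    d = (d + np.pi) % (2 * np.pi) - np.pi            # periodic distance
    K = (2.0 * np.exp(-(d - 0.3) ** 2 / 0.5) - 0.5) * dx
    f = lambda u: np.maximum(u, 0.0)                 # threshold nonlinearity
    u = np.zeros(n)
    for k in range(steps):
        # moving-bar stimulus: amplitude 1 inside the bar, 0 outside
        pos = (x - v * k * dt + np.pi) % (2 * np.pi) - np.pi
        s = (np.abs(pos) < 0.3).astype(float)
        u += dt / tau * (-u + K @ f(u) + s)
    return u

profile = simulate_field()
print(profile.max())   # peak of the stimulus-driven activity profile
```

Plotting `u` over time for different `v` would reproduce, qualitatively, the kind of speed-dependent behavior described in the panels below.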
40 Panel (a) shows the speed tuning curve, plotted as the peak activity of the traveling pulse as a function of the stimulus velocity. [sent-77, score-1.341]
41 Panel (b) shows the maximum real part of the eigenvalues obtained from Eq. [sent-79, score-0.145]
42 For small and large stimulus velocities the maximum of the real parts of the eigenvalues becomes positive, indicating a loss of stability of the form-stable solution. [sent-81, score-0.55]
43 To verify this result we calculated the variability of the peak activity over time in simulation. [sent-82, score-0.237]
44 Panel (c) shows the average variability as a function of the stimulus velocity. [sent-83, score-0.223]
45 At the velocities for which the eigenvalues indicate a loss of stability the variability of the amplitudes suddenly increases, consistent with our interpretation as a loss of the form stability of the solution. [sent-84, score-0.686]
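The variability diagnostic used here amounts to tracking the pulse peak over time; a sketch (the array layout and the two toy activity histories are my assumptions):

```python
import numpy as np

def peak_variability(u_history):
    """Standard deviation over time of the pulse's peak activity.
    Near zero for a form-stable stimulus-locked pulse, clearly
    positive for a lurching wave.  u_history is a (time, space)
    array of activity snapshots."""
    return u_history.max(axis=1).std()

locked = np.tile(np.array([0.0, 1.0, 0.2, 0.0]), (50, 1))  # constant peak
lurching = locked.copy()
lurching[::2] *= 0.5                                       # peak alternates
print(peak_variability(locked), peak_variability(lurching))
```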
46 Panel (e) shows the propagation of the form-stable traveling pulse. [sent-86, score-0.417]
47 Panel (d) shows the solution that arises when stability is lost. [sent-87, score-0.337]
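The lurching solution shown in panel (d) is, presumably, spatio-temporally periodic in the standard sense; a hedged reconstruction of the periodicity relation, with $T$ (temporal period) and $d$ (spatial shift) as my notation for the two unnamed constants:

```latex
u(\xi,\, t + T) \;=\; u(\xi - d,\, t)
```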
48 This solution is characterized by a spatio-temporal periodicity that is defined in the moving coordinate system by a relation with two constants that depend on the network dynamics. [sent-88, score-0.383]
49 4 Linear threshold activation function In this case, the activation function is taken to be zero below the threshold and linear above it. [sent-94, score-0.345]
50 Cortical neurons typically operate far below the saturation level. [sent-95, score-0.171]
51 The linear threshold activation function is thus more suitable to capture the properties of real neurons while still permitting a relatively simple theoretical analysis. [sent-96, score-0.436]
52 We consider a ring network with periodic boundary conditions. [sent-97, score-0.17]
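On a ring with periodic boundary conditions the recurrent term is a circular convolution, which can be evaluated with the FFT; a sketch (the function name and the sampling scheme are mine):

```python
import numpy as np

def ring_recurrent_input(kernel, rate):
    """Recurrent input in a ring network with periodic boundary
    conditions: the circular convolution of the interaction kernel
    with the firing-rate profile, evaluated via the FFT.  The kernel
    is sampled at the same ring positions as the rates."""
    return np.real(np.fft.ifft(np.fft.fft(kernel) * np.fft.fft(rate)))

identity = np.zeros(8)
identity[0] = 1.0                        # delta kernel: pure self-coupling
rates = np.arange(8.0)
print(ring_recurrent_input(identity, rates))  # reproduces the rate profile
```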
53 The dynamics is given by Eq. (11) (the expression was garbled in extraction). [sent-98, score-0.166]
54 We chose this form because it simplifies the mathematical analysis of ring networks. [sent-101, score-0.115]
55 Again, we consider a moving stimulus with velocity and analyze the network in the moving frame. [sent-102, score-0.675]
56 1 General solutions and stability analysis Because the activation function has linear threshold characteristics, the system is linear inside the excited regime, where the total input is positive. [sent-104, score-0.819]
57 One approach to solve this dynamics is therefore to find the solutions to the differential equation assuming the boundaries of the excited regime are given. [sent-105, score-0.557]
58 The conditions at the boundaries lead to a set of self-consistent equations for the solutions to satisfy, from which the boundaries can be determined. [sent-106, score-0.221]
59 The stationary solution in moving coordinates can then be written in terms of a diagonal matrix (the closed-form expression was garbled in extraction). [sent-111, score-0.336]
60 The above solution has to satisfy two boundary vector conditions, from which the remaining unknowns can be determined. [sent-113, score-0.143]
61 Stability of this traveling pulse solution can be analyzed by linear perturbation. [sent-114, score-0.885]
62 Note that the perturbed boundary points do not contribute to the linearized perturbation dynamics, since the total input of the stationary solution in the moving frame vanishes at these points. [sent-115, score-0.817]
63 Therefore, the linearized perturbation dynamics can be fully characterized by the perturbed Fourier modes with fixed boundaries. [sent-117, score-0.385]
64 Hence, the stability of the traveling pulse solution is determined by the eigenvalues of the matrix governing the linearized dynamics. [sent-118, score-1.157]
65 If the largest real part of these eigenvalues is negative, then the stimulus-locked traveling pulse is stable. [sent-119, score-1.122]
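This eigenvalue criterion is straightforward to evaluate numerically once a linearization is in hand. The sketch below builds a generic linearization for a discretized linear-threshold network; the matrix construction is my illustration, not the paper's analytical derivation:

```python
import numpy as np

def max_real_eigenvalue(W, u_star, tau=1.0):
    """Largest real part of the eigenvalues of tau*du/dt = -u + W f(u)
    linearized around a stationary profile u_star.  For a linear
    threshold f the local gain is 1 where u_star is excited and 0
    elsewhere.  W and u_star are generic stand-ins for the quantities
    the paper derives analytically."""
    gain = (u_star > 0).astype(float)
    J = (-np.eye(len(u_star)) + W * gain[None, :]) / tau
    return np.linalg.eigvals(J).real.max()

# weak uniform excitation: stable; strong uniform excitation: unstable
print(max_real_eigenvalue(0.1 * np.ones((4, 4)), np.ones(4)))
print(max_real_eigenvalue(0.9 * np.ones((4, 4)), np.ones(4)))
```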
66 2 Simplified linear threshold network The general solution introduced above requires the solution of an equation system. [sent-121, score-0.416]
67 Next we consider a special, simple model for which an exact solution can be found, with only two Fourier components in the interaction kernel and the input. [sent-123, score-0.211]
68 For this model a closed-form solution and stability analysis are presented, which at the same time provide insight into some rather general properties of linear threshold networks. [sent-124, score-0.453]
69 The interaction kernel and feedforward input are assumed to have the form of Eq. (13) (the expression was garbled in extraction). [sent-125, score-0.111]
70 This network was used by Hansel and Sompolinsky as a model of cortical orientation selectivity [14]. [sent-127, score-0.409]
71 Different from their network, however, we consider here an asymmetric interaction kernel and a form-constant moving stimulus. [sent-128, score-0.529]
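An asymmetric interaction kernel with only two Fourier components, in the Hansel-Sompolinsky spirit, can be sketched as follows; all numerical values, and the choice of a phase shift to produce the asymmetry, are my assumptions:

```python
import numpy as np

def interaction_kernel(theta, J0=-1.0, J1=2.0, phi=0.3):
    """Kernel with only a DC term and a first Fourier harmonic; the
    phase shift phi breaks the left/right symmetry of the coupling.
    All numbers here are illustrative, not the paper's values."""
    return J0 + J1 * np.cos(theta - phi)

theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
coef = np.fft.fft(interaction_kernel(theta)) / theta.size
print(abs(coef[0]), abs(coef[1]), abs(coef[2]))  # only modes 0 and 1 survive
```

Because only two Fourier modes are present, the network state can be summarized by two order parameters plus a phase, which is what makes the closed-form analysis tractable.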
72 In terms of these two order parameters plus the phase variable, the stimulus-locked traveling pulse solution and its stability conditions can be expressed analytically. [sent-132, score-1.066]
73 Similar to the results of the step function model, panel (A) shows the speed tuning curve, plotted as the values of the order parameters as a function of the stimulus velocity. [sent-136, score-0.505]
74 Panel (B) shows the largest real part of the eigenvalues of a stability matrix that can be obtained by linearizing the order parameter dynamics around the stationary solution. [sent-137, score-0.646]
75 Panel (C) shows the average variations as a function of the stimulus velocity. [sent-138, score-0.184]
76 The space-time evolution of the form-stable traveling pulse is shown in panel (E); the form-unstable lurching wave is shown in panel (D). [sent-139, score-1.432]
77 Thus we found that the lurching wave solution type arises very robustly for both types of threshold functions when the network achieves substantially direction-selective behavior. [sent-140, score-0.785]
78 5 Conclusion We have presented different methods for an analysis of the nonlinear dynamics of simple recurrent neural models for the direction selectivity of cortical neurons. [sent-142, score-0.814]
79 Compared to earlier works, we have taken into account the essentially nonlinear effects that are introduced by the nonlinear threshold characteristics of the cortical neurons. [sent-143, score-0.435]
80 The key result of our work is that such networks have a class of form-stable traveling pulse solutions that behave similarly to the solutions of linear spatio-temporal filtering models within a certain regime of stimulus speeds. [sent-144, score-1.335]
81 By the essential nonlinearity of the network, however, bifurcations can arise for which the traveling pulse solutions become unstable. [sent-145, score-0.919]
82 We observed that in this case a new class of spatio-temporally periodic solutions (”lurching activity waves”) arises. [sent-146, score-0.261]
83 Since we found this solution type very frequently for networks with substantial direction selectivity, our analysis predicts that such "lurching behavior" might be observable in visual cortex areas if, in fact, the direction selectivity is essentially based on asymmetric lateral connectivity. [sent-147, score-1.146]
84 The synaptic veto mechanism: does it underlie direction and orientation selectivity in the visual cortex? [sent-153, score-0.365]
85 Figure 1 (axes: Peak Activity and Variation vs. Velocity; TIME vs. SPACE in panels d, e, D, E): Traveling pulse solution and its stability in two classes of models. [sent-179, score-0.649]
86 The step activation function model is shown on the left, while the linear threshold model is drawn on the right. [sent-180, score-0.245]
87 Panels (a) and (A) show the velocity tuning curves of the traveling pulse, in terms of its peak activity in (a) and of the order parameters in (A). [sent-181, score-1.127]
88 Panels (b) and (B) plot the largest real parts of the eigenvalues of a stability matrix obtained from the linearized perturbation dynamics around the stationary solution. [sent-183, score-0.749]
89 Outside a certain range of stimulus velocities the largest real part of the eigenvalues becomes positive, indicating a loss of stability of the form-stable solution. [sent-184, score-0.677]
90 A nonzero variance signifies a loss of stability for traveling pulse solutions, which is consistent with the eigenvalue analysis in Panels (b) and (B). [sent-186, score-1.028]
91 A color coded plot of spatialtemporal evolution of the activity is shown in panels (d) and (e), and in (D) and (E). [sent-187, score-0.219]
92 Panel (e) and (E) show the propagation of the form-stable peak over time; panel (d) and (D) show the lurching activity wave that arises when stability is lost. [sent-188, score-0.89]
93 The stimulus is a moving bar with constant width and amplitude. [sent-190, score-0.33]
94 The parameter values used in the linear threshold model were lost in extraction. [sent-191, score-0.117]
95 Modeling direction selectivity of simple cells in striate visual cortex within the framework of the canonical microcircuit. [sent-209, score-0.426]
96 Model circuit of spiking neurons generating directional selectivity in simple cells. [sent-215, score-0.381]
97 Analysis of direction selectivity arising from recurrent cortical interactions. [sent-220, score-0.524]
98 An architectural hypothesis for direction selectivity in the visual cortex: the role of spatially asymmetric intracortical inhibition. [sent-226, score-0.483]
99 A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue. [sent-229, score-0.381]
100 Effects of delay on the type and velocity of travelling pulses in neuronal networks with spatially decaying connectivity. [sent-248, score-0.289]
wordName wordTfidf (topN-words)
[('traveling', 0.417), ('pulse', 0.34), ('selectivity', 0.218), ('stability', 0.209), ('lurching', 0.206), ('qy', 0.201), ('stimulus', 0.184), ('panel', 0.182), ('dynamics', 0.166), ('moving', 0.146), ('neurons', 0.137), ('velocity', 0.131), ('activation', 0.128), ('cortical', 0.123), ('activity', 0.116), ('solutions', 0.113), ('regime', 0.112), ('direction', 0.105), ('solution', 0.1), ('nonlinear', 0.097), ('eigenvalues', 0.091), ('fourier', 0.09), ('stationary', 0.09), ('threshold', 0.089), ('asymmetric', 0.088), ('peak', 0.082), ('excited', 0.081), ('recurrent', 0.078), ('waves', 0.077), ('perturbed', 0.075), ('interaction', 0.072), ('lateral', 0.07), ('velocities', 0.068), ('network', 0.068), ('wave', 0.067), ('panels', 0.065), ('selective', 0.064), ('cortex', 0.061), ('mathematical', 0.061), ('linearized', 0.061), ('soc', 0.061), ('opt', 0.057), ('boundaries', 0.054), ('real', 0.054), ('reads', 0.054), ('bingen', 0.051), ('giese', 0.051), ('koch', 0.051), ('frame', 0.05), ('nonlinearity', 0.049), ('hansel', 0.045), ('biol', 0.045), ('cybern', 0.045), ('simpli', 0.043), ('boundary', 0.043), ('visual', 0.042), ('characterized', 0.042), ('tuning', 0.041), ('perturbation', 0.041), ('perturbing', 0.041), ('connections', 0.04), ('kernel', 0.039), ('variability', 0.039), ('modeling', 0.038), ('biophysically', 0.038), ('pulses', 0.038), ('ensembles', 0.038), ('evolution', 0.038), ('largest', 0.036), ('spatiotemporal', 0.036), ('loss', 0.035), ('saturation', 0.034), ('transforming', 0.034), ('neuronal', 0.033), ('simulation', 0.033), ('inside', 0.032), ('periodic', 0.032), ('equation', 0.031), ('characterizes', 0.031), ('nervous', 0.031), ('curve', 0.03), ('spatially', 0.03), ('lines', 0.03), ('characteristics', 0.029), ('substantial', 0.029), ('static', 0.029), ('type', 0.029), ('linear', 0.028), ('biologically', 0.028), ('ltering', 0.028), ('networks', 0.028), ('arises', 0.028), ('activities', 0.027), ('connectivity', 0.027), ('rd', 0.027), ('ring', 0.027), 
('coordinate', 0.027), ('analysis', 0.027), ('spiking', 0.026), ('observable', 0.026)]
simIndex simValue paperId paperTitle
same-paper 1 1.0000008 82 nips-2001-Generating velocity tuning by asymmetric recurrent connections
Author: Xiaohui Xie, Martin A. Giese
Abstract: Asymmetric lateral connections are one possible mechanism that can account for the direction selectivity of cortical neurons. We present a mathematical analysis for a class of these models. Contrasting with earlier theoretical work that has relied on methods from linear systems theory, we study the network’s nonlinear dynamic properties that arise when the threshold nonlinearity of the neurons is taken into account. We show that such networks have stimulus-locked traveling pulse solutions that are appropriate for modeling the responses of direction selective cortical neurons. In addition, our analysis shows that outside a certain regime of stimulus speeds the stability of these solutions breaks down, giving rise to another class of solutions that are characterized by specific spatiotemporal periodicity. This predicts that if direction selectivity in the cortex is mainly achieved by asymmetric lateral connections, lurching activity waves might be observable in ensembles of direction selective cortical neurons within appropriate regimes of the stimulus speed.
2 0.1909226 23 nips-2001-A theory of neural integration in the head-direction system
Author: Richard Hahnloser, Xiaohui Xie, H. S. Seung
Abstract: Integration in the head-direction system is a computation by which horizontal angular head velocity signals from the vestibular nuclei are integrated to yield a neural representation of head direction. In the thalamus, the postsubiculum and the mammillary nuclei, the head-direction representation has the form of a place code: neurons have a preferred head direction in which their firing is maximal [Blair and Sharp, 1995, Blair et al., 1998, ?]. Integration is a difficult computation, given that head-velocities can vary over a large range. Previous models of the head-direction system relied on the assumption that the integration is achieved in a firing-rate-based attractor network with a ring structure. In order to correctly integrate head-velocity signals during high-speed head rotations, very fast synaptic dynamics had to be assumed. Here we address the question whether integration in the head-direction system is possible with slow synapses, for example excitatory NMDA and inhibitory GABA(B) type synapses. For neural networks with such slow synapses, rate-based dynamics are a good approximation of spiking neurons [Ermentrout, 1994]. We find that correct integration during high-speed head rotations imposes strong constraints on possible network architectures.
3 0.15955989 37 nips-2001-Associative memory in realistic neuronal networks
Author: Peter E. Latham
Abstract: Almost two decades ago , Hopfield [1] showed that networks of highly reduced model neurons can exhibit multiple attracting fixed points, thus providing a substrate for associative memory. It is still not clear, however, whether realistic neuronal networks can support multiple attractors. The main difficulty is that neuronal networks in vivo exhibit a stable background state at low firing rate, typically a few Hz. Embedding attractor is easy; doing so without destabilizing the background is not. Previous work [2, 3] focused on the sparse coding limit, in which a vanishingly small number of neurons are involved in any memory. Here we investigate the case in which the number of neurons involved in a memory scales with the number of neurons in the network. In contrast to the sparse coding limit, we find that multiple attractors can co-exist robustly with a stable background state. Mean field theory is used to understand how the behavior of the network scales with its parameters, and simulations with analog neurons are presented. One of the most important features of the nervous system is its ability to perform associative memory. It is generally believed that associative memory is implemented using attractor networks - experimental studies point in that direction [4- 7], and there are virtually no competing theoretical models. Perhaps surprisingly, however, it is still an open theoretical question whether attractors can exist in realistic neuronal networks. The
4 0.15761307 141 nips-2001-Orientation-Selective aVLSI Spiking Neurons
Author: Shih-Chii Liu, Jörg Kramer, Giacomo Indiveri, Tobi Delbrück, Rodney J. Douglas
Abstract: We describe a programmable multi-chip VLSI neuronal system that can be used for exploring spike-based information processing models. The system consists of a silicon retina, a PIC microcontroller, and a transceiver chip whose integrate-and-fire neurons are connected in a soft winner-take-all architecture. The circuit on this multi-neuron chip approximates a cortical microcircuit. The neurons can be configured for different computational properties by the virtual connections of a selected set of pixels on the silicon retina. The virtual wiring between the different chips is effected by an event-driven communication protocol that uses asynchronous digital pulses, similar to spikes in a neuronal system. We used the multi-chip spike-based system to synthesize orientation-tuned neurons using both a feedforward model and a feedback model. The performance of our analog hardware spiking model matched the experimental observations and digital simulations of continuous-valued neurons. The multi-chip VLSI system has advantages over computer neuronal models in that it is real-time, and the computational time does not scale with the size of the neuronal network.
5 0.13592425 72 nips-2001-Exact differential equation population dynamics for integrate-and-fire neurons
Author: Julian Eggert, Berthold Bäuml
Abstract: Mesoscopical, mathematical descriptions of dynamics of populations of spiking neurons are getting increasingly important for the understanding of large-scale processes in the brain using simulations. In our previous work, integral equation formulations for population dynamics have been derived for a special type of spiking neurons. For Integrate- and- Fire type neurons , these formulations were only approximately correct. Here, we derive a mathematically compact, exact population dynamics formulation for Integrate- and- Fire type neurons. It can be shown quantitatively in simulations that the numerical correspondence with microscopically modeled neuronal populations is excellent. 1 Introduction and motivation The goal of the population dynamics approach is to model the time course of the collective activity of entire populations of functionally and dynamically similar neurons in a compact way, using a higher descriptionallevel than that of single neurons and spikes. The usual observable at the level of neuronal populations is the populationaveraged instantaneous firing rate A(t), with A(t)6.t being the number of neurons in the population that release a spike in an interval [t, t+6.t). Population dynamics are formulated in such a way, that they match quantitatively the time course of a given A(t), either gained experimentally or by microscopical, detailed simulation. At least three main reasons can be formulated which underline the importance of the population dynamics approach for computational neuroscience. First, it enables the simulation of extensive networks involving a massive number of neurons and connections, which is typically the case when dealing with biologically realistic functional models that go beyond the single neuron level. Second, it increases the analytical understanding of large-scale neuronal dynamics , opening the way towards better control and predictive capabilities when dealing with large networks. 
Third, it enables a systematic embedding of the numerous neuronal models operating at different descriptional scales into a generalized theoretic framework, explaining the relationships, dependencies and derivations of the respective models. Early efforts on population dynamics approaches date back as early as 1972, to the work of Wilson and Cowan [8] and Knight [4], which laid the basis for all current population-averaged graded-response models (see e.g. [6] for modeling work using these models). More recently, population-based approaches for spiking neurons were developed, mainly by Gerstner [3, 2] and Knight [5]. In our own previous work [1], we have developed a theoretical framework which enables to systematize and simulate a wide range of models for population-based dynamics. It was shown that the equations of the framework produce results that agree quantitatively well with detailed simulations using spiking neurons, so that they can be used for realistic simulations involving networks with large numbers of spiking neurons. Nevertheless, for neuronal populations composed of Integrate-and-Fire (I&F;) neurons, this framework was only correct in an approximation. In this paper, we derive the exact population dynamics formulation for I&F; neurons. This is achieved by reducing the I&F; population dynamics to a point process and by taking advantage of the particular properties of I&F; neurons. 2 2.1 Background: Integrate-and-Fire dynamics Differential form We start with the standard Integrate- and- Fire (I&F;) model in form of the wellknown differential equation [7] (1) which describes the dynamics of the membrane potential Vi of a neuron i that is modeled as a single compartment with RC circuit characteristics. The membrane relaxation time is in this case T = RC with R being the membrane resistance and C the membrane capacitance. The resting potential v R est is the stationary potential that is approached in the no-input case. 
The input arriving from other neurons is described in form of a current ji. In addition to eq. (1), which describes the integrate part of the I&F; model, the neuronal dynamics are completed by a nonlinear step. Every time the membrane potential Vi reaches a fixed threshold () from below, Vi is lowered by a fixed amount Ll > 0, and from the new value of the membrane potential integration according to eq. (1) starts again. if Vi(t) = () (from below) . (2) At the same time, it is said that the release of a spike occurred (i.e., the neuron fired), and the time ti = t of this singular event is stored. Here ti indicates the time of the most recent spike. Storing all the last firing times , we gain the sequence of spikes {t{} (spike ordering index j, neuronal index i). 2.2 Integral form Now we look at the single neuron in a neuronal compound. We assume that the input current contribution ji from presynaptic spiking neurons can be described using the presynaptic spike times tf, a response-function ~ and a connection weight W¡ . ',J ji(t) = Wi ,j ~(t - tf) (3) l: l: j f Integrating the I&F; equation (1) beginning at the last spiking time tT, which determines the initial condition by Vi(ti) = vi(ti - 0) - 6., where vi(ti - 0) is the membrane potential just before the neuron spikes, we get 1 Vi(t) = v Rest + fj(t - t:) + l: Wi ,j l: a(t - t:; t - tf) , j - Vi(t:)) e- S / T (4) f with the refractory function fj(s) = - (v Rest (5) and the alpha-function r ds
6 0.13526903 73 nips-2001-Eye movements and the maturation of cortical orientation selectivity
7 0.13352412 150 nips-2001-Probabilistic Inference of Hand Motion from Neural Activity in Motor Cortex
8 0.10954561 87 nips-2001-Group Redundancy Measures Reveal Redundancy Reduction in the Auditory Pathway
9 0.10015196 111 nips-2001-Learning Lateral Interactions for Feature Binding and Sensory Segmentation
10 0.096127957 131 nips-2001-Neural Implementation of Bayesian Inference in Population Codes
11 0.094863333 10 nips-2001-A Hierarchical Model of Complex Cells in Visual Cortex for the Binocular Perception of Motion-in-Depth
12 0.089696981 49 nips-2001-Circuits for VLSI Implementation of Temporally Asymmetric Hebbian Learning
13 0.089319512 57 nips-2001-Correlation Codes in Neuronal Populations
14 0.086433962 65 nips-2001-Effective Size of Receptive Fields of Inferior Temporal Visual Cortex Neurons in Natural Scenes
15 0.085294046 174 nips-2001-Spike timing and the coding of naturalistic sounds in a central auditory area of songbirds
16 0.082148738 160 nips-2001-Reinforcement Learning and Time Perception -- a Model of Animal Experiments
17 0.081522748 124 nips-2001-Modeling the Modulatory Effect of Attention on Human Spatial Vision
18 0.076310724 48 nips-2001-Characterizing Neural Gain Control using Spike-triggered Covariance
19 0.074887916 181 nips-2001-The Emergence of Multiple Movement Units in the Presence of Noise and Feedback Delay
20 0.073364221 103 nips-2001-Kernel Feature Spaces and Nonlinear Blind Source Separation
topicId topicWeight
[(0, -0.199), (1, -0.251), (2, -0.148), (3, 0.019), (4, 0.12), (5, 0.044), (6, 0.003), (7, 0.027), (8, 0.057), (9, 0.049), (10, -0.047), (11, 0.087), (12, -0.043), (13, -0.019), (14, 0.042), (15, 0.027), (16, -0.143), (17, -0.016), (18, 0.025), (19, -0.033), (20, 0.007), (21, 0.041), (22, 0.108), (23, 0.011), (24, -0.024), (25, 0.129), (26, 0.047), (27, -0.18), (28, 0.074), (29, 0.029), (30, -0.135), (31, -0.038), (32, -0.041), (33, -0.03), (34, 0.05), (35, -0.013), (36, 0.089), (37, -0.105), (38, 0.027), (39, -0.074), (40, -0.021), (41, -0.014), (42, -0.0), (43, 0.049), (44, -0.03), (45, 0.111), (46, 0.087), (47, -0.082), (48, -0.09), (49, 0.16)]
simIndex simValue paperId paperTitle
same-paper 1 0.97138143 82 nips-2001-Generating velocity tuning by asymmetric recurrent connections
Author: Xiaohui Xie, Martin A. Giese
Abstract: Asymmetric lateral connections are one possible mechanism that can account for the direction selectivity of cortical neurons. We present a mathematical analysis for a class of these models. Contrasting with earlier theoretical work that has relied on methods from linear systems theory, we study the network’s nonlinear dynamic properties that arise when the threshold nonlinearity of the neurons is taken into account. We show that such networks have stimulus-locked traveling pulse solutions that are appropriate for modeling the responses of direction selective cortical neurons. In addition, our analysis shows that outside a certain regime of stimulus speeds the stability of these solutions breaks down, giving rise to another class of solutions that are characterized by a specific spatiotemporal periodicity. This predicts that if direction selectivity in the cortex is mainly achieved by asymmetric lateral connections, lurching activity waves might be observable in ensembles of direction selective cortical neurons within appropriate regimes of the stimulus speed.
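A minimal rate-model sketch of the mechanism this abstract describes: a ring of threshold-linear neurons with an asymmetrically shifted lateral kernel, driven by a moving localized stimulus, produces a stimulus-locked traveling activity pulse. All names and parameter values below are illustrative assumptions, not the paper's actual model or parameters.

```python
import numpy as np

def traveling_pulse_demo(n=100, shift=3, g=0.05, steps=400, dt=0.1, speed=1.0):
    """Ring of threshold-linear rate neurons with an asymmetric Gaussian
    lateral kernel (illustrative sketch, not the authors' exact model).
    A Gaussian stimulus moves along the ring; the function returns the
    position of the network activity peak at every time step."""
    x = np.arange(n)
    # circular distances, shifted by `shift` to make the kernel asymmetric
    d = (x[:, None] - x[None, :] - shift + n // 2) % n - n // 2
    W = np.exp(-d**2 / (2 * 4.0**2)) - 0.05   # local excitation, broad inhibition
    u = np.zeros(n)
    peaks = []
    for t in range(steps):
        c = speed * t * dt                     # current stimulus position
        ds = (x - c + n // 2) % n - n // 2
        stim = np.exp(-ds**2 / (2 * 2.0**2))   # moving Gaussian stimulus
        rate = np.maximum(u, 0.0)              # threshold nonlinearity
        u += dt * (-u + g * (W @ rate) + stim)
        peaks.append(int(np.argmax(u)))
    return peaks
```

The activity peak follows the stimulus, i.e., the network supports a stimulus-locked traveling pulse for this stimulus speed.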
2 0.81865168 23 nips-2001-A theory of neural integration in the head-direction system
Author: Richard Hahnloser, Xiaohui Xie, H. S. Seung
Abstract: Integration in the head-direction system is a computation by which horizontal angular head velocity signals from the vestibular nuclei are integrated to yield a neural representation of head direction. In the thalamus, the postsubiculum and the mammillary nuclei, the head-direction representation has the form of a place code: neurons have a preferred head direction in which their firing is maximal [Blair and Sharp, 1995, Blair et al., 1998, ?]. Integration is a difficult computation, given that head-velocities can vary over a large range. Previous models of the head-direction system relied on the assumption that the integration is achieved in a firing-rate-based attractor network with a ring structure. In order to correctly integrate head-velocity signals during high-speed head rotations, very fast synaptic dynamics had to be assumed. Here we address the question whether integration in the head-direction system is possible with slow synapses, for example excitatory NMDA and inhibitory GABA(B) type synapses. For neural networks with such slow synapses, rate-based dynamics are a good approximation of spiking neurons [Ermentrout, 1994]. We find that correct integration during high-speed head rotations imposes strong constraints on possible network architectures.
3 0.58191365 57 nips-2001-Correlation Codes in Neuronal Populations
Author: Maoz Shamir, Haim Sompolinsky
Abstract: Population codes often rely on the tuning of the mean responses to the stimulus parameters. However, this information can be greatly suppressed by long range correlations. Here we study the efficiency of coding information in the second order statistics of the population responses. We show that the Fisher Information of this system grows linearly with the size of the system. We propose a bilinear readout model for extracting information from correlation codes, and evaluate its performance in discrimination and estimation tasks. It is shown that the main source of information in this system is the stimulus dependence of the variances of the single neuron responses.
4 0.57459879 141 nips-2001-Orientation-Selective aVLSI Spiking Neurons
Author: Shih-Chii Liu, Jörg Kramer, Giacomo Indiveri, Tobi Delbrück, Rodney J. Douglas
Abstract: We describe a programmable multi-chip VLSI neuronal system that can be used for exploring spike-based information processing models. The system consists of a silicon retina, a PIC microcontroller, and a transceiver chip whose integrate-and-fire neurons are connected in a soft winner-take-all architecture. The circuit on this multi-neuron chip approximates a cortical microcircuit. The neurons can be configured for different computational properties by the virtual connections of a selected set of pixels on the silicon retina. The virtual wiring between the different chips is effected by an event-driven communication protocol that uses asynchronous digital pulses, similar to spikes in a neuronal system. We used the multi-chip spike-based system to synthesize orientation-tuned neurons using both a feedforward model and a feedback model. The performance of our analog hardware spiking model matched the experimental observations and digital simulations of continuous-valued neurons. The multi-chip VLSI system has advantages over computer neuronal models in that it is real-time, and the computational time does not scale with the size of the neuronal network.
5 0.56014824 150 nips-2001-Probabilistic Inference of Hand Motion from Neural Activity in Motor Cortex
Author: Yun Gao, Michael J. Black, Elie Bienenstock, Shy Shoham, John P. Donoghue
Abstract: Statistical learning and probabilistic inference techniques are used to infer the hand position of a subject from multi-electrode recordings of neural activity in motor cortex. First, an array of electrodes provides training data of neural firing conditioned on hand kinematics. We learn a nonparametric representation of this firing activity using a Bayesian model and rigorously compare it with previous models using cross-validation. Second, we infer a posterior probability distribution over hand motion conditioned on a sequence of neural test data using Bayesian inference. The learned firing models of multiple cells are used to define a nonGaussian likelihood term which is combined with a prior probability for the kinematics. A particle filtering method is used to represent, update, and propagate the posterior distribution over time. The approach is compared with traditional linear filtering methods; the results suggest that it may be appropriate for neural prosthetic applications.
6 0.55021656 73 nips-2001-Eye movements and the maturation of cortical orientation selectivity
7 0.49920326 72 nips-2001-Exact differential equation population dynamics for integrate-and-fire neurons
8 0.46210301 131 nips-2001-Neural Implementation of Bayesian Inference in Population Codes
9 0.45238826 124 nips-2001-Modeling the Modulatory Effect of Attention on Human Spatial Vision
10 0.43821716 166 nips-2001-Self-regulation Mechanism of Temporally Asymmetric Hebbian Plasticity
11 0.43641165 48 nips-2001-Characterizing Neural Gain Control using Spike-triggered Covariance
12 0.42719576 37 nips-2001-Associative memory in realistic neuronal networks
13 0.41684484 87 nips-2001-Group Redundancy Measures Reveal Redundancy Reduction in the Auditory Pathway
14 0.40749684 160 nips-2001-Reinforcement Learning and Time Perception -- a Model of Animal Experiments
15 0.39565471 19 nips-2001-A Rotation and Translation Invariant Discrete Saliency Network
16 0.37787375 26 nips-2001-Active Portfolio-Management based on Error Correction Neural Networks
17 0.36544055 111 nips-2001-Learning Lateral Interactions for Feature Binding and Sensory Segmentation
18 0.35090029 52 nips-2001-Computing Time Lower Bounds for Recurrent Sigmoidal Neural Networks
19 0.33931151 10 nips-2001-A Hierarchical Model of Complex Cells in Visual Cortex for the Binocular Perception of Motion-in-Depth
20 0.33823436 165 nips-2001-Scaling Laws and Local Minima in Hebbian ICA
topicId topicWeight
[(14, 0.034), (17, 0.016), (19, 0.013), (27, 0.118), (30, 0.536), (38, 0.034), (59, 0.014), (72, 0.031), (79, 0.032), (83, 0.012), (91, 0.09)]
simIndex simValue paperId paperTitle
Author: Takashi Morie, Tomohiro Matsuura, Makoto Nagata, Atsushi Iwata
Abstract: This paper describes a clustering algorithm for vector quantizers using a “stochastic association model”. It offers a new, simple and powerful softmax adaptation rule. The adaptation process is the same as in the on-line K-means clustering method except that random fluctuation is added in the distortion error evaluation process. Simulation results demonstrate that the new algorithm can achieve adaptation as efficient as that of the “neural gas” algorithm, which is reported to be one of the most efficient clustering methods. The key is to add uncorrelated random fluctuation in the similarity evaluation process for each reference vector. For hardware implementation of this process, we propose a nanostructure whose operation is described by a single-electron circuit. It positively uses fluctuation in quantum mechanical tunneling processes.
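The adaptation rule described above (on-line K-means with uncorrelated random fluctuation added when evaluating each reference vector's distortion) can be sketched as follows; the annealing schedule, noise amplitude, and the name `stochastic_online_kmeans` are assumptions made for illustration, not details from the paper.

```python
import numpy as np

def stochastic_online_kmeans(data, k=3, eta=0.05, noise=0.5, epochs=20, seed=0):
    """On-line K-means where uncorrelated random fluctuation is added to
    each reference vector's distortion before selecting the winner, as in
    the abstract above (a softmax-like selection rule)."""
    rng = np.random.default_rng(seed)
    refs = data[rng.choice(len(data), k, replace=False)].astype(float)
    for epoch in range(epochs):
        amp = noise * (1 - epoch / epochs)        # anneal the fluctuation
        for x in data[rng.permutation(len(data))]:
            dist = np.sum((refs - x) ** 2, axis=1)
            dist += amp * rng.standard_normal(k)  # uncorrelated noise per reference
            w = np.argmin(dist)                   # winner under noisy distortion
            refs[w] += eta * (x - refs[w])        # standard on-line K-means update
    return refs
```

After adaptation, the reference vectors quantize the data far better than a single centroid would, which is the basic behavior the noisy selection rule is meant to preserve while improving adaptation efficiency.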
same-paper 2 0.97430873 82 nips-2001-Generating velocity tuning by asymmetric recurrent connections
Author: Xiaohui Xie, Martin A. Giese
Abstract: Asymmetric lateral connections are one possible mechanism that can account for the direction selectivity of cortical neurons. We present a mathematical analysis for a class of these models. Contrasting with earlier theoretical work that has relied on methods from linear systems theory, we study the network’s nonlinear dynamic properties that arise when the threshold nonlinearity of the neurons is taken into account. We show that such networks have stimulus-locked traveling pulse solutions that are appropriate for modeling the responses of direction selective cortical neurons. In addition, our analysis shows that outside a certain regime of stimulus speeds the stability of these solutions breaks down, giving rise to another class of solutions that are characterized by a specific spatiotemporal periodicity. This predicts that if direction selectivity in the cortex is mainly achieved by asymmetric lateral connections, lurching activity waves might be observable in ensembles of direction selective cortical neurons within appropriate regimes of the stimulus speed.
3 0.97243285 173 nips-2001-Speech Recognition with Missing Data using Recurrent Neural Nets
Author: S. Parveen, P. Green
Abstract: In the ‘missing data’ approach to improving the robustness of automatic speech recognition to added noise, an initial process identifies spectraltemporal regions which are dominated by the speech source. The remaining regions are considered to be ‘missing’. In this paper we develop a connectionist approach to the problem of adapting speech recognition to the missing data case, using Recurrent Neural Networks. In contrast to methods based on Hidden Markov Models, RNNs allow us to make use of long-term time constraints and to make the problems of classification with incomplete data and imputing missing values interact. We report encouraging results on an isolated digit recognition task.
4 0.95847881 159 nips-2001-Reducing multiclass to binary by coupling probability estimates
Author: B. Zadrozny
Abstract: This paper presents a method for obtaining class membership probability estimates for multiclass classification problems by coupling the probability estimates produced by binary classifiers. This is an extension for arbitrary code matrices of a method due to Hastie and Tibshirani for pairwise coupling of probability estimates. Experimental results with Boosted Naive Bayes show that our method produces calibrated class membership probability estimates, while having similar classification accuracy as loss-based decoding, a method for obtaining the most likely class that does not generate probability estimates.
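For illustration, a sketch of iterative pairwise coupling in the spirit of the Hastie-Tibshirani method that this abstract extends to arbitrary code matrices; equal pairwise sample counts are assumed for simplicity, and the function name `couple_pairwise` is hypothetical.

```python
import numpy as np

def couple_pairwise(r, iters=300):
    """Couple pairwise class-probability estimates into a single class
    membership distribution.  r[i, j] is the binary classifier's estimate
    of P(class i | class i or j); equal pairwise sample counts assumed."""
    k = r.shape[0]
    p = np.full(k, 1.0 / k)                      # start from a uniform guess
    for _ in range(iters):
        for i in range(k):
            mu = p[i] / (p[i] + np.delete(p, i))  # model's implied pairwise estimates
            num = np.delete(r[i], i).sum()        # observed pairwise estimates
            p[i] *= num / mu.sum()                # multiplicative update for p_i
            p /= p.sum()                          # renormalize to a distribution
    return p
```

When the pairwise estimates are mutually consistent, the iteration recovers the underlying class membership probabilities exactly.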
5 0.93184561 163 nips-2001-Risk Sensitive Particle Filters
Author: Sebastian Thrun, John Langford, Vandi Verma
Abstract: We propose a new particle filter that incorporates a model of costs when generating particles. The approach is motivated by the observation that the costs of accidentally not tracking hypotheses might be significant in some areas of state space, and next to irrelevant in others. By incorporating a cost model into particle filtering, states that are more critical to the system performance are more likely to be tracked. Automatic calculation of the cost model is implemented using an MDP value function calculation that estimates the value of tracking a particular state. Experiments in two mobile robot domains illustrate the appropriateness of the approach.
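A toy sketch of the idea: one bootstrap particle-filter step whose importance weights are modulated by a state-dependent risk term, so that high-cost regions of state space retain more particles. The 1-D Gaussian model and the hand-made `risk` callable are stand-ins; in the paper the cost model comes from an MDP value-function calculation.

```python
import numpy as np

def risk_sensitive_pf_step(particles, weights, observation, risk,
                           obs_std=1.0, proc_std=0.5, rng=None):
    """One particle-filter step for a toy 1-D model.  The importance
    weights combine the observation likelihood with a state-dependent
    risk function before resampling."""
    rng = rng or np.random.default_rng(0)
    # propagate through a random-walk motion model
    particles = particles + proc_std * rng.standard_normal(len(particles))
    # Gaussian observation likelihood, modulated by the risk of each state
    lik = np.exp(-0.5 * ((observation - particles) / obs_std) ** 2)
    weights = weights * lik * risk(particles)
    weights /= weights.sum()
    # systematic resampling from the risk-weighted distribution
    n = len(weights)
    idx = np.searchsorted(np.cumsum(weights),
                          (rng.random() + np.arange(n)) / n)
    idx = np.minimum(idx, n - 1)
    return particles[idx], np.full(n, 1.0 / n)
```

With a uniform risk the step reduces to a plain bootstrap filter; a risk function that grows with the state value shifts the resampled particle cloud toward the higher-cost region, as intended.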
6 0.92609036 151 nips-2001-Probabilistic principles in unsupervised learning of visual structure: human data and a model
7 0.80814302 65 nips-2001-Effective Size of Receptive Fields of Inferior Temporal Visual Cortex Neurons in Natural Scenes
8 0.79782271 149 nips-2001-Probabilistic Abstraction Hierarchies
9 0.78802955 73 nips-2001-Eye movements and the maturation of cortical orientation selectivity
10 0.74682498 102 nips-2001-KLD-Sampling: Adaptive Particle Filters
11 0.71034127 63 nips-2001-Dynamic Time-Alignment Kernel in Support Vector Machine
12 0.67981762 46 nips-2001-Categorization by Learning and Combining Object Parts
13 0.6743558 77 nips-2001-Fast and Robust Classification using Asymmetric AdaBoost and a Detector Cascade
14 0.66727823 60 nips-2001-Discriminative Direction for Kernel Classifiers
15 0.66680086 52 nips-2001-Computing Time Lower Bounds for Recurrent Sigmoidal Neural Networks
16 0.66422129 20 nips-2001-A Sequence Kernel and its Application to Speaker Recognition
17 0.65713376 176 nips-2001-Stochastic Mixed-Signal VLSI Architecture for High-Dimensional Kernel Machines
18 0.64690435 34 nips-2001-Analog Soft-Pattern-Matching Classifier using Floating-Gate MOS Technology
19 0.6374436 116 nips-2001-Linking Motor Learning to Function Approximation: Learning in an Unlearnable Force Field
20 0.63538259 162 nips-2001-Relative Density Nets: A New Way to Combine Backpropagation with HMM's