nips nips2010 nips2010-34 knowledge-graph by maker-knowledge-mining
Source: pdf
Author: K. Wong, He Wang, Si Wu, Chi Fung
Abstract: Neuronal connection weights exhibit short-term depression (STD). The present study investigates the impact of STD on the dynamics of a continuous attractor neural network (CANN) and its potential roles in neural information processing. We find that the network with STD can generate both static and traveling bumps, and STD enhances the performance of the network in tracking external inputs. In particular, we find that STD endows the network with slow-decaying plateau behaviors: a network initially stimulated to an active state decays to silence very slowly, on the time scale of STD rather than that of neural signaling. We argue that this provides a mechanism for neural systems to hold short-term memory easily and shut off persistent activities naturally.
Reference: text
sentIndex sentText sentNum sentScore
1 Abstract Neuronal connection weights exhibit short-term depression (STD). [sent-12, score-0.257]
2 The present study investigates the impact of STD on the dynamics of a continuous attractor neural network (CANN) and its potential roles in neural information processing. [sent-13, score-0.319]
3 We find that the network with STD can generate both static and traveling bumps, and STD enhances the performance of the network in tracking external inputs. [sent-14, score-0.68]
4 In particular, we find that STD endows the network with slow-decaying plateau behaviors: a network initially stimulated to an active state decays to silence very slowly, on the time scale of STD rather than that of neural signaling. [sent-15, score-0.74]
5 We argue that this provides a mechanism for neural systems to hold short-term memory easily and shut off persistent activities naturally. [sent-16, score-0.175]
6 The network structure is the key factor that determines how a network responds to external inputs, and hence the computations implemented by the neural system. [sent-18, score-0.429]
7 A predominant type of STP is short-term depression (STD), which decreases the connection efficacy when a pre-synaptic neuron fires. [sent-25, score-0.285]
8 For instance, it was found that STD can achieve gain control in regulating neural responses to external inputs, realizing Weber’s law [2, 3]. [sent-30, score-0.17]
9 We analyze the dynamics of a CANN with STD included, and find that apart from the static bump states, the network can also hold moving bump solutions. [sent-41, score-1.268]
10 In particular, we find that with STD, the network can have slow-decaying plateau states; that is, a network stimulated to an active state by a transient input decays to silence very slowly, on the time scale of STD rather than that of neural signaling. [sent-43, score-0.682]
11 It implies that STD can provide a mechanism for neural systems to generate short-term memory and shut off activities naturally. [sent-45, score-0.175]
12 We also find that STD retains the neutral stability of the CANN, and enhances the tracking performance of the network to external inputs. [sent-46, score-0.355]
13 For example, the stimulus may represent the direction of motion, the orientation, or a general continuous feature of objects extracted by the neural system. [sent-48, score-0.308]
14 Let u(x, t) be the synaptic input at time t to the neurons whose preferred stimulus is x. [sent-49, score-0.362]
15 The dynamics is particularly convenient to analyze in the limit that the interaction range a is much less than the stimulus range L, so that we can effectively take x ∈ (−∞, ∞). [sent-53, score-0.242]
16 The dynamics of u(x, t) is determined by the external input Iext (x, t), the network input from other neurons, and its own relaxation. [sent-54, score-0.333]
17 It increases with the synaptic input, but saturates in the presence of a global activity-dependent inhibition. [sent-58, score-0.183]
18 Its dynamics is given by $\tau_d \frac{\partial p(x,t)}{\partial t} = 1 - p(x,t) - p(x,t)\,\tau_d\,\beta\,r(x,t)$, (2) where τd is the time constant for synaptic depression, and the parameter β controls the depression effect due to neural firing. [sent-66, score-0.553]
19 The network dynamics is governed by two time scales. [sent-67, score-0.22]
20 The interplay between the fast and slow dynamics causes the network to exhibit interesting dynamical behaviors. [sent-71, score-0.293]
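To make the model above concrete, here is a minimal numerical sketch of the coupled u and p dynamics on a ring. It assumes the divisive-inhibition rate function and a Gaussian coupling J(x, x') = J0/(√(2π)a) exp[−(x − x')²/(2a²)], which is our reading of the model described in the text; the parameter values, the critical-inhibition estimate kc, and the Euler scheme are all illustrative and may need tuning to land in the bump phase.

```python
# Minimal sketch (not the authors' code) of the CANN with STD:
# tau_s du/dt = -u + rho * integral(J p r dx') + I_ext, together with
# Eq. (2): tau_d dp/dt = 1 - p - tau_d * beta * p * r.
import numpy as np

N, L = 256, 2 * np.pi                  # neurons on a ring of length L
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = L / N
rho = N / L                            # neuron density
a, J0 = 0.5, 1.0                       # interaction range (a << L), coupling
tau_s, tau_d = 1.0, 50.0               # fast synaptic vs. slow STD time scale
kc = rho * J0**2 / (8 * np.sqrt(2 * np.pi) * a)   # rough critical inhibition
k, beta = 0.5 * kc, 0.005              # global inhibition, STD strength

d = (x[:, None] - x[None, :] + L / 2) % L - L / 2  # wrapped pairwise distances
J = J0 / (np.sqrt(2 * np.pi) * a) * np.exp(-d**2 / (2 * a**2))

def rate(u):
    """Divisive inhibition: r = [u]+^2 / (1 + k rho sum([u]+^2) dx)."""
    u2 = np.maximum(u, 0.0) ** 2
    return u2 / (1.0 + k * rho * u2.sum() * dx)

def step(u, p, i_ext, dt=0.05):
    """One forward-Euler step of the coupled fast u and slow p dynamics."""
    r = rate(u)
    u_new = u + dt / tau_s * (-u + rho * dx * (J @ (p * r)) + i_ext)
    p_new = p + dt / tau_d * (1.0 - p - tau_d * beta * p * r)
    return u_new, p_new

u, p = np.zeros(N), np.ones(N)
stim = 0.5 * np.exp(-x**2 / (4 * a**2))   # transient stimulus at x = 0
for _ in range(2000):                      # drive the network to a bump
    u, p = step(u, p, stim)
for _ in range(10000):                     # free evolution on the slow scale
    u, p = step(u, p, np.zeros(N))
print(f"bump height {u.max():.3f}, peak depression {1 - p.min():.3f}")
```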
21 Figure 1: The neural response profile tracks the change of position of the external stimulus from z0 = 0 to 1. [sent-86, score-0.309]
22 2.1 Dynamics of CANN without Dynamical Synapses. It is instructive to first consider the network dynamics when no dynamical synapses are included. [sent-95, score-0.295]
23 In this case, the network can support a continuous family of stationary states when the global inhibition is not too strong. [sent-98, score-0.218]
24 This implies that the Gaussian bumps are able to track changes in the position of the external stimuli by continuously shifting their position, with other distortion modes affecting the tracking process only in the transients. [sent-106, score-0.522]
25 As illustrated in Figs. 1 and 2, when an external stimulus with a Gaussian profile is initially centered at z = 0, it pins the center of a Gaussian neuronal response at the same position. [sent-108, score-0.334]
26 The bump moves towards the new stimulus position, and catches up with the stimulus change after a delay. [sent-111, score-0.621]
27 3 Dynamics of CANN with Synaptic Depression. For clarity, we will first summarize the main results on the network dynamics due to STD, and then present the theoretical analysis in Sec. [sent-113, score-0.22]
28 Apart from the static bump state, the network also supports moving bump states. [sent-117, score-1.157]
29 To construct a phase diagram mapping these behaviors, we first consider how the global inhibition k and the synaptic depression β scale with other parameters. [sent-118, score-0.637]
30 The phase diagram obtained by numerical solutions to the network dynamics is shown in Fig. 3. [sent-123, score-0.401]
31 Figure 3: Phase diagram of the network states. [sent-126, score-0.184]
32 We first note that the synaptic depression and the global inhibition play the same role in reducing the amplitude of the bump states. [sent-143, score-0.848]
33 Hence we see that the silent state with u(x, t) = 0 is the only stable state when either k or β is large. [sent-146, score-0.237]
34 When STD is weak, the network behaves similarly to CANNs without STD; that is, the static bump state is present up to k near 1. [sent-147, score-0.711]
35 However, when β increases, a state with the bump spontaneously moving at a constant velocity comes into existence. [sent-148, score-0.566]
36 Such moving states have been predicted in CANNs [12, 13], and can be associated with traveling wave behaviors widely observed in the neocortex [18]. [sent-149, score-0.28]
37 At an intermediate range of β, both the static and moving states coexist, and the final state of the network depends on the initial condition. [sent-150, score-0.531]
38 When β increases further, only the moving state is present. [sent-151, score-0.207]
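As a sketch of how such a phase diagram can be mapped numerically, the snippet below seeds a bump, lets the network run freely, and classifies the outcome by its final amplitude and center drift. It reuses x, a, kc, step and the module-level k and beta from the simulation sketch above; the grid values and thresholds are illustrative guesses. Since the static and moving states can coexist, repeating the sweep from different initial conditions (e.g. a slightly asymmetric seed) is needed to expose the bistable band.

```python
# Classify the long-time state at each (k, beta): silent, static, or moving.
import numpy as np

def bump_center(u):
    # population-vector estimate of the bump position on the ring
    return np.angle(np.sum(np.maximum(u, 0.0) * np.exp(1j * x)))

def classify(k_val, beta_val, T=20000, dt=0.05):
    global k, beta                      # rate() and step() read these globals
    k, beta = k_val, beta_val
    u, p = np.exp(-x**2 / (4 * a**2)), np.ones_like(x)   # seed a bump
    c_mid = 0.0
    for t in range(T):
        u, p = step(u, p, np.zeros_like(x), dt)
        if t == T // 2:
            c_mid = bump_center(u)
    if u.max() < 1e-3:
        return "silent"
    drift = abs(np.angle(np.exp(1j * (bump_center(u) - c_mid))))
    return "moving" if drift > 0.1 else "static"

for kv in (0.3 * kc, 0.6 * kc, 0.9 * kc):
    print([classify(kv, bv) for bv in (0.0, 0.002, 0.01, 0.05)])
```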
39 3.2 The Plateau Behavior. The network dynamics displays a very interesting behavior in the parameter regime where the static bump solution just loses its stability. [sent-153, score-0.832]
40 In this regime, an initially activated network state decays very slowly to silence, on the time scale of τd. [sent-154, score-0.271]
41 Hence, although the bump state eventually decays to the silent state, it goes through a plateau region of slowly decaying amplitude, as shown in Fig. 4. [sent-155, score-0.847]
42 Figure 4: Magnitudes of rescaled neuronal input ρJ0 u(x, t) and synaptic depression 1 − p(x, t) at (k, β) = (0. [sent-162, score-0.477]
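The plateau lifetime itself can be measured directly in such a simulation: stimulate the network to a steady state, remove the input at t = 0, and time how long the bump height stays above a threshold. This sketch reuses x, a, kc, tau_d, step and the globals k and beta from above; the operating points and the threshold are illustrative.

```python
# Time how long the bump outlives the stimulus; near the static/silent
# boundary this should be of order tau_d, deep in the silent phase of
# order tau_s.
import numpy as np

def plateau_lifetime(k_val, beta_val, amp=0.5, thresh=0.05, dt=0.05):
    global k, beta
    k, beta = k_val, beta_val
    u, p = np.zeros_like(x), np.ones_like(x)
    stim = amp * np.exp(-x**2 / (4 * a**2))
    for _ in range(4000):                        # settle under the stimulus
        u, p = step(u, p, stim, dt)
    t = 0.0
    while u.max() > thresh and t < 100 * tau_d:  # free decay after t = 0
        u, p = step(u, p, np.zeros_like(x), dt)
        t += dt
    return t

# illustrative: just inside the silent phase vs. deep inside it
print(plateau_lifetime(1.05 * kc, 0.01), plateau_lifetime(2.0 * kc, 0.01))
```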
43 Compared with networks without STD, we find that the bump shifts to the new position faster. [sent-175, score-0.394]
44 However, when β is too strong, the bump tends to overshoot the target before eventually approaching it. [sent-178, score-0.359]
45 We observe that the profile of the bump remains effectively Gaussian in the presence of synaptic depression. [sent-213, score-0.542]
46 On the other hand, there is a considerable distortion of the profile of the synaptic depression, when STD is strong. [sent-214, score-0.221]
47 Yet, to the lowest order approximation, let us approximate the profile of the synaptic depression to be a Gaussian as well, which is valid when STD is weak, as shown in Fig. 3. [sent-215, score-0.412]
48 By considering the steady state solution of u and p0 and their stability against fluctuations of u and p0, we find that stable solutions exist when $\beta \le \frac{p_0\left(1-\sqrt{2/3}\,p_0\right)}{4\left(1-\sqrt{4/7}\,p_0\right)^2}\left[1+\frac{\tau_s}{\tau_d\left(1-\sqrt{2/3}\,p_0\right)}\right]$, (10) where p0 is the steady state solution of Eqs. (8) and (9). [sent-229, score-0.308]
49 Unfortunately, this line is not easily observed in numerical solutions since the static bump is unstable against fluctuations that are asymmetric with respect to its central position. [sent-233, score-0.647]
50 Although the bump is stable against symmetric fluctuations, asymmetric fluctuations can displace its position and eventually convert it to a moving bump. [sent-234, score-0.577]
51 As shown in Fig. 7(b), the profile of a moving bump is characterized by a lag of the synaptic depression behind the moving bump. [sent-237, score-1.065]
52 This is because neurons tend to be less active in locations of low values of p(x, t), causing the bump to move away from locations of strong synaptic depression. [sent-238, score-0.621]
53 In turn, the region of synaptic depression tends to follow the bump. [sent-239, score-0.457]
54 However, if the time scale of synaptic depression is large, the recovery of the synaptically depressed region is slowed down, and cannot catch up with the bump motion. [sent-240, score-1.092]
55 To incorporate asymmetry into the moving state, we propose the following ansatz: $u(x,t) = u_0(t)\exp\left[-\frac{(x-vt)^2}{4a^2}\right]$, (11) and $p(x,t) = 1 - p_0(t)\exp\left[-\frac{(x-vt)^2}{2a^2}\right] + p_1(t)\exp\left[-\frac{(x-vt)^2}{2a^2}\right]\frac{x-vt}{a}$. (12) [sent-242, score-0.334]
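The ansatz parameters can be read off from a full simulation by projecting the profiles onto these Gaussian modes around the tracked bump center. The sketch below is one hypothetical way to do this (reusing x, a and L from the simulation sketch); u0, p0 and p1 are least-squares projections onto mutually orthogonal even and odd modes, and v follows from a finite difference of successive center positions. The sign conventions assume the reconstruction of Eqs. (11)-(12) given above.

```python
# Project simulated profiles onto the modes of Eqs. (11)-(12):
# u ~ u0 * g_u and 1 - p ~ p0 * g0 - p1 * g0 * xi / a around the center z.
import numpy as np

def fit_ansatz(u, p, z):
    xi = (x - z + L / 2) % L - L / 2        # ring coordinate centered on z
    g_u = np.exp(-xi**2 / (4 * a**2))       # bump mode of u
    g0 = np.exp(-xi**2 / (2 * a**2))        # symmetric depression mode
    g1 = g0 * xi / a                        # asymmetric (lagging) mode
    u0 = np.dot(u, g_u) / np.dot(g_u, g_u)
    q = 1.0 - p                             # depression profile
    p0 = np.dot(q, g0) / np.dot(g0, g0)     # g0 and g1 are orthogonal
    p1 = -np.dot(q, g1) / np.dot(g1, g1)    # minus sign matches Eq. (12)
    return u0, p0, p1

# velocity estimate from two snapshots separated by dt:
#   v = np.angle(np.exp(1j * (bump_center(u2) - bump_center(u1)))) / dt
```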
56 As shown in Fig. 3, the boundary of this region effectively coincides with the numerical solution of the line separating the static and moving phases. [sent-248, score-0.408]
57 Note that when τd /τs increases, the static phase shrinks. [sent-249, score-0.256]
58 This is because the recovery of the synaptically depressed region is slowed down, making it harder to catch up with changes in the bump motion. [sent-250, score-0.68]
59 Figure 7: Neuronal input u(x, t) and the STD coefficient p(x, t) in (a) the static state at (k, β) = (0. [sent-273, score-0.243]
60 An alternative way to obtain Eq. (13) is to consider the instability of the static bump, which is obtained by setting v and p1 to zero in Eqs. (11) and (12). [sent-280, score-0.21]
61 Considering the instability of the static bump against the asymmetric fluctuations in p1 and vt, we again arrive at Eq. (13). [sent-282, score-0.605]
62 This shows that as soon as the moving bump comes into existence, the static bump becomes unstable. [sent-284, score-1.048]
63 This also implies that in the entire region where the static and moving bumps coexist, the static bump is unstable to asymmetric fluctuations. [sent-285, score-1.165]
64 As we shall see, this metastatic behavior is also the cause of the enhanced tracking performance. [sent-288, score-0.255]
65 4.3 The Plateau Behavior. To illustrate the plateau behavior, we select a point in the marginally unstable regime of the silent phase, that is, in the vicinity of the static phase. [sent-290, score-0.555]
66 As shown in Fig. 8, the nullclines of u and p0 (du/dt = 0 and dp0/dt = 0, respectively) do not have any intersections, in contrast to the static phase where the bump state exists. [sent-292, score-0.704]
67 Yet, they are still close enough to create a region with very slow dynamics near the apex of the u-nullcline at $(u, p_0) = \left[(8/k)^{1/2}, (\sqrt{7}/4)(1-k)\right]$. [sent-293, score-0.22]
68 Due to the much faster dynamics of u, trajectories starting from a wide range of initial conditions converge rapidly, in a time of the order τs , to a common trajectory in the close neighborhood of the u-nullcline. [sent-301, score-0.197]
69 This gives rise to the plateau region of u which can survive for a duration of the order τd . [sent-304, score-0.234]
70 The plateau ends after the trajectory has passed the slow region near the apex of the u-nullcline. [sent-305, score-0.333]
71 This dynamics is in clear contrast with trajectory D, in which the bump height decays to zero in a time of the order τs . [sent-306, score-0.545]
72 The trajectories then rely mainly on the dynamics of p0 to carry them out of this slow region, and hence plateaus of lifetimes of the order τd are created. [sent-310, score-0.306]
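These trajectories can be reproduced empirically by logging the bump height and the depression depth along full-network runs started from different initial amplitudes, a crude stand-in for the (u, p0) plane of the reduced analysis. The sketch reuses x, a, kc, step and the globals k and beta; the operating point is an illustrative guess just inside the silent phase.

```python
# Log reduced (bump height, depression depth) trajectories; runs passing
# near the slow region should linger there for a time of order tau_d.
import numpy as np

k, beta = 1.05 * kc, 0.01                 # illustrative silent-phase point
trajectories = []
for amp in (0.3, 1.0, 3.0, 10.0):         # spread of initial bump heights
    u = amp * np.exp(-x**2 / (4 * a**2))
    p = np.ones_like(x)
    traj = []
    for t in range(40000):
        u, p = step(u, p, np.zeros_like(x))
        if t % 200 == 0:
            traj.append((u.max(), 1.0 - p.min()))
    trajectories.append(np.array(traj))
    print(f"start height {amp}: final height {u.max():.4f}")
```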
73 Figure 8: Trajectories of network dynamics starting from various initial conditions at (k, β) = (0. [sent-337, score-0.22]
74 Figure 9: Contours of plateau lifetimes in the space of k and β. [sent-344, score-0.273]
75 Following similar arguments, the plateau behavior also exists in the stable region of the static states. [sent-349, score-0.457]
76 This happens when the initial condition of the network lies outside the basin of attraction of the static states, but it is still in the vicinity of the basin boundary. [sent-350, score-0.346]
77 When one goes deeper into the silent phase, the gap between the u- and p0-nullclines broadens, and the dynamics in between speeds up. [sent-351, score-0.333]
78 Hence plateau lifetimes are longest near the phase boundary between the bump and silent states, and become shorter when one goes deeper into the silent phase. [sent-352, score-0.939]
79 This is confirmed by the contours of plateau lifetimes in the phase diagram shown in Fig. 9. [sent-353, score-0.421]
80 The initial condition is uniformly set by introducing an external stimulus Iext(x|z0) = αu0 exp[−x²/(4a²)] to the right hand side of Eq. [sent-355, score-0.271]
81 After the network has reached a steady state, the stimulus is removed at t = 0, leaving the network to relax. [sent-357, score-0.443]
82 4.4 The Tracking Behavior. To study the tracking behavior, we add the external stimulus Iext(x|z0) = αu0 exp[−(x − z0)²/(4a²)] to the right hand side of Eq. (11). [sent-359, score-0.405]
83 Here z0 is the position of the stimulus, which is abruptly changed at t = 0. [sent-360, score-0.199]
84 Solving the network dynamics with the ansatz in Eqs. (11) and (12), the solution reproduces the qualitative features due to the presence of synaptic depression, namely, the faster response at weak β and the overshooting at stronger β. [sent-362, score-0.233]
85 As remarked previously, this is due to the metastatic behavior of the bumps, which speeds up their departure from the static state when a small push is exerted. [sent-363, score-0.401]
86 However, when describing the overshooting of the tracking process, the quantitative agreement between the numerical solution and the ansatz in Eqs. [sent-364, score-0.226]
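The tracking experiment translates directly into simulation: settle the bump under a stimulus at z0 = 0, jump the stimulus to z0 = 1 at t = 0, and record the bump center over time. In the sketch below (reusing x, a, L, kc, step, bump_center and the globals k and beta from the earlier snippets; α and the β values are illustrative), weak STD should speed up the approach relative to β = 0, and strong STD should produce the overshoot described above.

```python
# Compare tracking of an abrupt stimulus jump for different STD strengths.
import numpy as np

def track(beta_val, z1=1.0, alpha=0.3, dt=0.05, T=20000):
    global k, beta
    k, beta = 0.5 * kc, beta_val            # static-phase inhibition (illustrative)
    def stim(z):
        dz = (x - z + L / 2) % L - L / 2    # wrapped distance to the stimulus
        return alpha * np.exp(-dz**2 / (4 * a**2))
    u, p = np.zeros_like(x), np.ones_like(x)
    for _ in range(T):
        u, p = step(u, p, stim(0.0), dt)    # settle at z0 = 0
    centers = []
    for _ in range(T):
        u, p = step(u, p, stim(z1), dt)     # stimulus jumps at t = 0
        centers.append(bump_center(u))
    return np.array(centers)

for b in (0.0, 0.002, 0.02):
    c = track(b)
    print(f"beta={b}: max center excursion {c.max():.3f} (overshoot if > 1.0)")
```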
87 5 Conclusions and Discussions. In this work, we have investigated the impact of STD on the dynamics of a CANN, and found that the network can support both static and moving bumps. [sent-369, score-0.55]
88 Static bumps exist only when the synaptic depression is sufficiently weak. [sent-370, score-0.588]
89 A consequence of synaptic depression is that it places static bumps in a metastatic state, so that their response to changing stimuli is sped up, enhancing the tracking performance. [sent-371, score-0.954]
90 We conjecture that moving bump states may be associated with traveling wave behaviors widely observed in the neocortex. [sent-372, score-0.639]
91 When the network is initially stimulated to an active state by an external input, it will decay to silence very slowly after the input is removed. [sent-374, score-0.492]
92 The duration of the plateau is on the time scale of STD rather than that of neural signaling, and it provides a way for the network to hold the stimulus information for up to hundreds of milliseconds, if the network operates in the parameter regime in which the bumps are marginally unstable. [sent-375, score-0.774]
93 In a CANN without STD, an active state of the network decays to silence exponentially fast or persists forever, depending on the initial activity level of the network. [sent-377, score-0.324]
94 Indeed, how to shut off the activity of a CANN has been a challenging issue that has received wide attention in theoretical neuroscience, with solutions suggesting that a strong external input, either in the form of inhibition or excitation, must be applied (see, e.g. [sent-378, score-0.24]
95 Here, we show that STD provides a mechanism for closing down network activities naturally and within a desirable duration. [sent-381, score-0.174]
96 It describes the phase diagram of the static and moving phases, the plateau behavior, and provides insights into the metastatic nature of the bumps and its relation to the enhanced tracking performance. [sent-383, score-1.058]
97 However, higher order perturbation analysis is required to yield more accurate descriptions of results such as the overshooting in the tracking process (Fig. [sent-385, score-0.184]
98 Mongillo et al. [20] showed that STF provides a way for the network to encode the information of external inputs in the facilitated connection weights, and it has the advantage of not having to recruit persistent neural firing, and hence is economically efficient. [sent-393, score-0.28]
99 In terms of information transmission, prolonged neural firing is preferable in the early information pathways, so that the stimulus information can be conveyed to higher cortical areas through neuronal interactions. [sent-400, score-0.286]
100 It is our goal in future work to explore the joint impact of STD and STF on the dynamics of neuronal networks. [sent-402, score-0.176]
wordName wordTfidf (topN-words)
[('std', 0.529), ('bump', 0.359), ('depression', 0.229), ('canns', 0.217), ('plateau', 0.189), ('static', 0.183), ('synaptic', 0.183), ('bumps', 0.176), ('moving', 0.147), ('cann', 0.132), ('stimulus', 0.131), ('silent', 0.117), ('stf', 0.117), ('external', 0.113), ('dynamics', 0.111), ('network', 0.109), ('tracking', 0.099), ('steady', 0.094), ('lifetimes', 0.084), ('metastatic', 0.084), ('silence', 0.084), ('vt', 0.082), ('tsodyks', 0.081), ('inhibition', 0.077), ('diagram', 0.075), ('phase', 0.073), ('behaviors', 0.068), ('neuronal', 0.065), ('state', 0.06), ('ring', 0.055), ('markram', 0.054), ('signaling', 0.054), ('trajectories', 0.051), ('pro', 0.051), ('milliseconds', 0.05), ('mongillo', 0.05), ('overshooting', 0.05), ('shut', 0.05), ('stp', 0.05), ('neurons', 0.048), ('le', 0.047), ('kc', 0.046), ('region', 0.045), ('ansatz', 0.044), ('uctuations', 0.044), ('dynamical', 0.042), ('behavior', 0.04), ('decays', 0.04), ('attractor', 0.039), ('distortion', 0.038), ('slowly', 0.037), ('fung', 0.036), ('unstable', 0.036), ('activities', 0.036), ('asymmetric', 0.036), ('trajectory', 0.035), ('perturbation', 0.035), ('position', 0.035), ('exp', 0.035), ('enhances', 0.034), ('abruptly', 0.033), ('apex', 0.033), ('coexist', 0.033), ('depressed', 0.033), ('endows', 0.033), ('prolonged', 0.033), ('slowed', 0.033), ('substrate', 0.033), ('synapses', 0.033), ('stimulated', 0.033), ('traveling', 0.033), ('numerical', 0.033), ('states', 0.032), ('symbols', 0.032), ('kong', 0.032), ('enhanced', 0.032), ('active', 0.031), ('dx', 0.031), ('slow', 0.031), ('china', 0.031), ('memory', 0.03), ('regime', 0.03), ('hong', 0.03), ('neural', 0.03), ('nullclines', 0.029), ('plateaus', 0.029), ('iext', 0.029), ('gaussian', 0.029), ('mechanism', 0.029), ('connection', 0.028), ('neuron', 0.028), ('realizing', 0.027), ('basin', 0.027), ('catch', 0.027), ('ext', 0.027), ('instability', 0.027), ('cortical', 0.027), ('modes', 0.026), ('initially', 0.025), ('working', 0.025)]
simIndex simValue paperId paperTitle
same-paper 1 0.99999958 34 nips-2010-Attractor Dynamics with Synaptic Depression
Author: K. Wong, He Wang, Si Wu, Chi Fung
Abstract: Neuronal connection weights exhibit short-term depression (STD). The present study investigates the impact of STD on the dynamics of a continuous attractor neural network (CANN) and its potential roles in neural information processing. We find that the network with STD can generate both static and traveling bumps, and STD enhances the performance of the network in tracking external inputs. In particular, we find that STD endows the network with slow-decaying plateau behaviors, namely, the network being initially stimulated to an active state will decay to silence very slowly in the time scale of STD rather than that of neural signaling. We argue that this provides a mechanism for neural systems to hold short-term memory easily and shut off persistent activities naturally.
2 0.13185707 127 nips-2010-Inferring Stimulus Selectivity from the Spatial Structure of Neural Network Dynamics
Author: Kanaka Rajan, L Abbott, Haim Sompolinsky
Abstract: How are the spatial patterns of spontaneous and evoked population responses related? We study the impact of connectivity on the spatial pattern of fluctuations in the input-generated response, by comparing the distribution of evoked and intrinsically generated activity across the different units of a neural network. We develop a complementary approach to principal component analysis in which separate high-variance directions are derived for each input condition. We analyze subspace angles to compute the difference between the shapes of trajectories corresponding to different network states, and the orientation of the low-dimensional subspaces that driven trajectories occupy within the full space of neuronal activity. In addition to revealing how the spatiotemporal structure of spontaneous activity affects input-evoked responses, these methods can be used to infer input selectivity induced by network dynamics from experimentally accessible measures of spontaneous activity (e.g. from voltage- or calcium-sensitive optical imaging experiments). We conclude that the absence of a detailed spatial map of afferent inputs and cortical connectivity does not limit our ability to design spatially extended stimuli that evoke strong responses. 1 1 Motivation Stimulus selectivity in neural networks was historically measured directly from input-driven responses [1], and only later were similar selectivity patterns observed in spontaneous activity across the cortical surface [2, 3]. We argue that it is possible to work in the reverse order, and show that analyzing the distribution of spontaneous activity across the different units in the network can inform us about the selectivity of evoked responses to stimulus features, even when no apparent sensory map exists. Sensory-evoked responses are typically divided into a signal component generated by the stimulus and a noise component corresponding to ongoing activity that is not directly related to the stimulus. Subsequent effort focuses on understanding how the signal depends on properties of the stimulus, while the remaining, irregular part of the response is treated as additive noise. The distinction between external stochastic processes and the noise generated deterministically as a function of intrinsic recurrence has been previously studied in chaotic neural networks [4]. It has also been suggested that internally generated noise is not additive and can be more sensitive to the frequency and amplitude of the input, compared to the signal component of the response [5 - 8]. In this paper, we demonstrate that the interaction between deterministic intrinsic noise and the spatial properties of the external stimulus is also complex and nonlinear. We study the impact of network connectivity on the spatial pattern of input-driven responses by comparing the structure of evoked and spontaneous activity, and show how the unique signature of these dynamics determines the selectivity of networks to spatial features of the stimuli driving them. 2 Model description In this section, we describe the network model and the methods we use to analyze its dynamics. Subsequent sections explore how the spatial patterns of spontaneous and evoked responses are related in terms of the distribution of the activity across the network. Finally, we show how the stimulus selectivity of the network can be inferred from its spontaneous activity patterns. 
2.1 Network elements We build a firing rate model of N interconnected units characterized by a statistical description of the underlying circuitry (as N → ∞, the system “self averages” making the description independent of a specific network architecture, see also [11, 12]). Each unit is characterized by an activation variable xi ∀ i = 1, 2, . . . N , and a nonlinear response function ri which relates to xi through ri = R0 + φ(xi ) where, R0 tanh x for x ≤ 0 R0 φ(x) = (1) x (Rmax − R0 ) tanh otherwise. Rmax −R0 Eq. 1 allows us to independently set the maximum firing rate Rmax and the background rate R0 to biologically reasonable values, while retaining a maximum gradient at x = 0 to guarantee the smoothness of the transition to chaos [4]. We introduce a recurrent weight matrix with element Jij equivalent to the strength of the synapse from unit j → unit i. The individual weights are chosen independently and randomly from a Gaus2 sian distribution with mean and variance given by [Jij ]J = 0 and Jij J = g 2 /N , where square brackets are ensemble averages [9 - 11,13]. The control parameter g which scales as the variance of the synaptic weights, is particularly important in determining whether or not the network produces spontaneous activity with non-trivial dynamics (Specifically, g = 0 corresponds to a completely uncoupled network and a network with g = 1 generates non-trivial spontaneous activity [4, 9, 10]). The activation variable for each unit xi is therefore determined by the relation, N τr dxi = −xi + g Jij rj + Ii , dt j=1 with the time scale of the network set by the single-neuron time constant τr of 10 ms. 2 (2) The amplitude I of an oscillatory external input of frequency f , is always the same for each unit, but in some examples shown in this paper, we introduce a neuron-specific phase factor θi , chosen randomly from a uniform distribution between 0 and 2π, such that Ii = I cos(2πf t + θi ) ∀ i = 1, 2, . . . N. (3) In visually responsive neurons, this mimics a population of simple cells driven by a drifting grating of temporal frequency f , with the different phases arising from offsets in spatial receptive field locations. The randomly assigned phases in our model ensure that the spatial pattern of input is not correlated with the pattern of recurrent connectivity. In our selectivity analysis however (Fig. 3), we replace the random phases with spatial input patterns that are aligned with network connectivity. 2.2 PCA redux Principal component analysis (PCA) has been applied profitably to neuronal recordings (see for example [14]) but these analyses often plot activity trajectories corresponding to different network states using the fixed principal component coordinates derived from combined activities under all stimulus conditions. Our analysis offers a complementary approach whereby separate principal components are derived for each stimulus condition, and the resulting principal angles reveal not only the difference between the shapes of trajectories corresponding to different network states, but also the orientation of the low-dimensional subspaces these trajectories occupy within the full N -dimensional space of neuronal activity. The instantaneous network state can be described by a point in an N -dimensional space with coordinates equal to the firing rates of the N units. Over time, the network activity traverses a trajectory in this N -dimensional space and PCA can be used to delineate the subspace in which this trajectory lies. 
The analysis is done by diagonalizing the equal-time cross-correlation matrix of network firing rates given by, Dij = (ri (t) − ri )(rj (t) − rj ) , (4) where
3 0.1165459 253 nips-2010-Spike timing-dependent plasticity as dynamic filter
Author: Joscha Schmiedt, Christian Albers, Klaus Pawelzik
Abstract: When stimulated with complex action potential sequences synapses exhibit spike timing-dependent plasticity (STDP) with modulated pre- and postsynaptic contributions to long-term synaptic modifications. In order to investigate the functional consequences of these contribution dynamics (CD) we propose a minimal model formulated in terms of differential equations. We find that our model reproduces data from to recent experimental studies with a small number of biophysically interpretable parameters. The model allows to investigate the susceptibility of STDP to arbitrary time courses of pre- and postsynaptic activities, i.e. its nonlinear filter properties. We demonstrate this for the simple example of small periodic modulations of pre- and postsynaptic firing rates for which our model can be solved. It predicts synaptic strengthening for synchronous rate modulations. Modifications are dominant in the theta frequency range, a result which underlines the well known relevance of theta activities in hippocampus and cortex for learning. We also find emphasis of specific baseline spike rates and suppression for high background rates. The latter suggests a mechanism of network activity regulation inherent in STDP. Furthermore, our novel formulation provides a general framework for investigating the joint dynamics of neuronal activity and the CD of STDP in both spike-based as well as rate-based neuronal network models. 1
4 0.1008034 21 nips-2010-Accounting for network effects in neuronal responses using L1 regularized point process models
Author: Ryan Kelly, Matthew Smith, Robert Kass, Tai S. Lee
Abstract: Activity of a neuron, even in the early sensory areas, is not simply a function of its local receptive field or tuning properties, but depends on global context of the stimulus, as well as the neural context. This suggests the activity of the surrounding neurons and global brain states can exert considerable influence on the activity of a neuron. In this paper we implemented an L1 regularized point process model to assess the contribution of multiple factors to the firing rate of many individual units recorded simultaneously from V1 with a 96-electrode “Utah” array. We found that the spikes of surrounding neurons indeed provide strong predictions of a neuron’s response, in addition to the neuron’s receptive field transfer function. We also found that the same spikes could be accounted for with the local field potentials, a surrogate measure of global network states. This work shows that accounting for network fluctuations can improve estimates of single trial firing rate and stimulus-response transfer functions. 1
5 0.092927873 200 nips-2010-Over-complete representations on recurrent neural networks can support persistent percepts
Author: Shaul Druckmann, Dmitri B. Chklovskii
Abstract: A striking aspect of cortical neural networks is the divergence of a relatively small number of input channels from the peripheral sensory apparatus into a large number of cortical neurons, an over-complete representation strategy. Cortical neurons are then connected by a sparse network of lateral synapses. Here we propose that such architecture may increase the persistence of the representation of an incoming stimulus, or a percept. We demonstrate that for a family of networks in which the receptive field of each neuron is re-expressed by its outgoing connections, a represented percept can remain constant despite changing activity. We term this choice of connectivity REceptive FIeld REcombination (REFIRE) networks. The sparse REFIRE network may serve as a high-dimensional integrator and a biologically plausible model of the local cortical circuit. 1
6 0.091479868 68 nips-2010-Effects of Synaptic Weight Diffusion on Learning in Decision Making Networks
7 0.084801048 161 nips-2010-Linear readout from a neural population with partial correlation data
8 0.081973322 238 nips-2010-Short-term memory in neuronal networks through dynamical compressed sensing
9 0.077088721 16 nips-2010-A VLSI Implementation of the Adaptive Exponential Integrate-and-Fire Neuron Model
10 0.068858363 268 nips-2010-The Neural Costs of Optimal Control
11 0.066898763 119 nips-2010-Implicit encoding of prior probabilities in optimal neural populations
12 0.063255705 162 nips-2010-Link Discovery using Graph Feature Tracking
13 0.05813235 252 nips-2010-SpikeAnts, a spiking neuron network modelling the emergence of organization in a complex system
14 0.05122688 91 nips-2010-Fast detection of multiple change-points shared by many signals using group LARS
15 0.049994908 81 nips-2010-Evaluating neuronal codes for inference using Fisher information
16 0.048153605 167 nips-2010-Mixture of time-warped trajectory models for movement decoding
17 0.046256412 148 nips-2010-Learning Networks of Stochastic Differential Equations
18 0.044646807 263 nips-2010-Switching state space model for simultaneously estimating state transitions and nonstationary firing rates
19 0.044361975 96 nips-2010-Fractionally Predictive Spiking Neurons
20 0.043930925 171 nips-2010-Movement extraction by detecting dynamics switches and repetitions
topicId topicWeight
[(0, 0.113), (1, 0.009), (2, -0.131), (3, 0.154), (4, 0.047), (5, 0.077), (6, -0.048), (7, 0.02), (8, -0.001), (9, -0.007), (10, 0.026), (11, -0.008), (12, 0.016), (13, 0.032), (14, 0.023), (15, 0.002), (16, -0.038), (17, -0.03), (18, -0.02), (19, 0.101), (20, 0.012), (21, 0.024), (22, 0.083), (23, 0.036), (24, 0.047), (25, 0.021), (26, 0.015), (27, 0.049), (28, -0.006), (29, -0.113), (30, 0.013), (31, -0.052), (32, 0.049), (33, -0.024), (34, -0.117), (35, -0.042), (36, -0.09), (37, 0.095), (38, -0.019), (39, 0.036), (40, -0.069), (41, 0.01), (42, 0.055), (43, -0.048), (44, -0.026), (45, -0.097), (46, 0.024), (47, 0.04), (48, 0.062), (49, 0.024)]
simIndex simValue paperId paperTitle
same-paper 1 0.96459711 34 nips-2010-Attractor Dynamics with Synaptic Depression
Author: K. Wong, He Wang, Si Wu, Chi Fung
Abstract: Neuronal connection weights exhibit short-term depression (STD). The present study investigates the impact of STD on the dynamics of a continuous attractor neural network (CANN) and its potential roles in neural information processing. We find that the network with STD can generate both static and traveling bumps, and STD enhances the performance of the network in tracking external inputs. In particular, we find that STD endows the network with slow-decaying plateau behaviors, namely, the network being initially stimulated to an active state will decay to silence very slowly in the time scale of STD rather than that of neural signaling. We argue that this provides a mechanism for neural systems to hold short-term memory easily and shut off persistent activities naturally.
2 0.80718499 127 nips-2010-Inferring Stimulus Selectivity from the Spatial Structure of Neural Network Dynamics
Author: Kanaka Rajan, L Abbott, Haim Sompolinsky
Abstract: How are the spatial patterns of spontaneous and evoked population responses related? We study the impact of connectivity on the spatial pattern of fluctuations in the input-generated response, by comparing the distribution of evoked and intrinsically generated activity across the different units of a neural network. We develop a complementary approach to principal component analysis in which separate high-variance directions are derived for each input condition. We analyze subspace angles to compute the difference between the shapes of trajectories corresponding to different network states, and the orientation of the low-dimensional subspaces that driven trajectories occupy within the full space of neuronal activity. In addition to revealing how the spatiotemporal structure of spontaneous activity affects input-evoked responses, these methods can be used to infer input selectivity induced by network dynamics from experimentally accessible measures of spontaneous activity (e.g. from voltage- or calcium-sensitive optical imaging experiments). We conclude that the absence of a detailed spatial map of afferent inputs and cortical connectivity does not limit our ability to design spatially extended stimuli that evoke strong responses. 1 1 Motivation Stimulus selectivity in neural networks was historically measured directly from input-driven responses [1], and only later were similar selectivity patterns observed in spontaneous activity across the cortical surface [2, 3]. We argue that it is possible to work in the reverse order, and show that analyzing the distribution of spontaneous activity across the different units in the network can inform us about the selectivity of evoked responses to stimulus features, even when no apparent sensory map exists. Sensory-evoked responses are typically divided into a signal component generated by the stimulus and a noise component corresponding to ongoing activity that is not directly related to the stimulus. Subsequent effort focuses on understanding how the signal depends on properties of the stimulus, while the remaining, irregular part of the response is treated as additive noise. The distinction between external stochastic processes and the noise generated deterministically as a function of intrinsic recurrence has been previously studied in chaotic neural networks [4]. It has also been suggested that internally generated noise is not additive and can be more sensitive to the frequency and amplitude of the input, compared to the signal component of the response [5 - 8]. In this paper, we demonstrate that the interaction between deterministic intrinsic noise and the spatial properties of the external stimulus is also complex and nonlinear. We study the impact of network connectivity on the spatial pattern of input-driven responses by comparing the structure of evoked and spontaneous activity, and show how the unique signature of these dynamics determines the selectivity of networks to spatial features of the stimuli driving them. 2 Model description In this section, we describe the network model and the methods we use to analyze its dynamics. Subsequent sections explore how the spatial patterns of spontaneous and evoked responses are related in terms of the distribution of the activity across the network. Finally, we show how the stimulus selectivity of the network can be inferred from its spontaneous activity patterns. 
2.1 Network elements We build a firing rate model of N interconnected units characterized by a statistical description of the underlying circuitry (as N → ∞, the system “self averages” making the description independent of a specific network architecture, see also [11, 12]). Each unit is characterized by an activation variable xi ∀ i = 1, 2, . . . N , and a nonlinear response function ri which relates to xi through ri = R0 + φ(xi ) where, R0 tanh x for x ≤ 0 R0 φ(x) = (1) x (Rmax − R0 ) tanh otherwise. Rmax −R0 Eq. 1 allows us to independently set the maximum firing rate Rmax and the background rate R0 to biologically reasonable values, while retaining a maximum gradient at x = 0 to guarantee the smoothness of the transition to chaos [4]. We introduce a recurrent weight matrix with element Jij equivalent to the strength of the synapse from unit j → unit i. The individual weights are chosen independently and randomly from a Gaus2 sian distribution with mean and variance given by [Jij ]J = 0 and Jij J = g 2 /N , where square brackets are ensemble averages [9 - 11,13]. The control parameter g which scales as the variance of the synaptic weights, is particularly important in determining whether or not the network produces spontaneous activity with non-trivial dynamics (Specifically, g = 0 corresponds to a completely uncoupled network and a network with g = 1 generates non-trivial spontaneous activity [4, 9, 10]). The activation variable for each unit xi is therefore determined by the relation, N τr dxi = −xi + g Jij rj + Ii , dt j=1 with the time scale of the network set by the single-neuron time constant τr of 10 ms. 2 (2) The amplitude I of an oscillatory external input of frequency f , is always the same for each unit, but in some examples shown in this paper, we introduce a neuron-specific phase factor θi , chosen randomly from a uniform distribution between 0 and 2π, such that Ii = I cos(2πf t + θi ) ∀ i = 1, 2, . . . N. (3) In visually responsive neurons, this mimics a population of simple cells driven by a drifting grating of temporal frequency f , with the different phases arising from offsets in spatial receptive field locations. The randomly assigned phases in our model ensure that the spatial pattern of input is not correlated with the pattern of recurrent connectivity. In our selectivity analysis however (Fig. 3), we replace the random phases with spatial input patterns that are aligned with network connectivity. 2.2 PCA redux Principal component analysis (PCA) has been applied profitably to neuronal recordings (see for example [14]) but these analyses often plot activity trajectories corresponding to different network states using the fixed principal component coordinates derived from combined activities under all stimulus conditions. Our analysis offers a complementary approach whereby separate principal components are derived for each stimulus condition, and the resulting principal angles reveal not only the difference between the shapes of trajectories corresponding to different network states, but also the orientation of the low-dimensional subspaces these trajectories occupy within the full N -dimensional space of neuronal activity. The instantaneous network state can be described by a point in an N -dimensional space with coordinates equal to the firing rates of the N units. Over time, the network activity traverses a trajectory in this N -dimensional space and PCA can be used to delineate the subspace in which this trajectory lies. 
The analysis is done by diagonalizing the equal-time cross-correlation matrix of network firing rates given by, Dij = (ri (t) − ri )(rj (t) − rj ) , (4) where
3 0.6741783 200 nips-2010-Over-complete representations on recurrent neural networks can support persistent percepts
Author: Shaul Druckmann, Dmitri B. Chklovskii
Abstract: A striking aspect of cortical neural networks is the divergence of a relatively small number of input channels from the peripheral sensory apparatus into a large number of cortical neurons, an over-complete representation strategy. Cortical neurons are then connected by a sparse network of lateral synapses. Here we propose that such architecture may increase the persistence of the representation of an incoming stimulus, or a percept. We demonstrate that for a family of networks in which the receptive field of each neuron is re-expressed by its outgoing connections, a represented percept can remain constant despite changing activity. We term this choice of connectivity REceptive FIeld REcombination (REFIRE) networks. The sparse REFIRE network may serve as a high-dimensional integrator and a biologically plausible model of the local cortical circuit. 1
4 0.6517933 21 nips-2010-Accounting for network effects in neuronal responses using L1 regularized point process models
Author: Ryan Kelly, Matthew Smith, Robert Kass, Tai S. Lee
Abstract: Activity of a neuron, even in the early sensory areas, is not simply a function of its local receptive field or tuning properties, but depends on global context of the stimulus, as well as the neural context. This suggests the activity of the surrounding neurons and global brain states can exert considerable influence on the activity of a neuron. In this paper we implemented an L1 regularized point process model to assess the contribution of multiple factors to the firing rate of many individual units recorded simultaneously from V1 with a 96-electrode “Utah” array. We found that the spikes of surrounding neurons indeed provide strong predictions of a neuron’s response, in addition to the neuron’s receptive field transfer function. We also found that the same spikes could be accounted for with the local field potentials, a surrogate measure of global network states. This work shows that accounting for network fluctuations can improve estimates of single trial firing rate and stimulus-response transfer functions. 1
5 0.64110976 68 nips-2010-Effects of Synaptic Weight Diffusion on Learning in Decision Making Networks
Author: Kentaro Katahira, Kazuo Okanoya, Masato Okada
Abstract: When animals repeatedly choose actions from multiple alternatives, they can allocate their choices stochastically depending on past actions and outcomes. It is commonly assumed that this ability is achieved by modifications in synaptic weights related to decision making. Choice behavior has been empirically found to follow Herrnstein’s matching law. Loewenstein & Seung (2006) demonstrated that matching behavior is a steady state of learning in neural networks if the synaptic weights change proportionally to the covariance between reward and neural activities. However, their proof did not take into account the change in entire synaptic distributions. In this study, we show that matching behavior is not necessarily a steady state of the covariance-based learning rule when the synaptic strength is sufficiently strong so that the fluctuations in input from individual sensory neurons influence the net input to output neurons. This is caused by the increasing variance in the input potential due to the diffusion of synaptic weights. This effect causes an undermatching phenomenon, which has been observed in many behavioral experiments. We suggest that the synaptic diffusion effects provide a robust neural mechanism for stochastic choice behavior.
6 0.62374961 252 nips-2010-SpikeAnts, a spiking neuron network modelling the emergence of organization in a complex system
7 0.60014862 253 nips-2010-Spike timing-dependent plasticity as dynamic filter
8 0.56846648 161 nips-2010-Linear readout from a neural population with partial correlation data
9 0.56747508 238 nips-2010-Short-term memory in neuronal networks through dynamical compressed sensing
10 0.49613094 157 nips-2010-Learning to localise sounds with spiking neural networks
11 0.48502755 16 nips-2010-A VLSI Implementation of the Adaptive Exponential Integrate-and-Fire Neuron Model
12 0.45508525 81 nips-2010-Evaluating neuronal codes for inference using Fisher information
13 0.43698469 119 nips-2010-Implicit encoding of prior probabilities in optimal neural populations
14 0.42922121 190 nips-2010-On the Convexity of Latent Social Network Inference
15 0.40678251 111 nips-2010-Hallucinations in Charles Bonnet Syndrome Induced by Homeostasis: a Deep Boltzmann Machine Model
16 0.39243501 57 nips-2010-Decoding Ipsilateral Finger Movements from ECoG Signals in Humans
17 0.39228344 19 nips-2010-A rational decision making framework for inhibitory control
18 0.38489681 263 nips-2010-Switching state space model for simultaneously estimating state transitions and nonstationary firing rates
19 0.3707338 117 nips-2010-Identifying graph-structured activation patterns in networks
20 0.36565781 17 nips-2010-A biologically plausible network for the computation of orientation dominance
topicId topicWeight
[(13, 0.015), (27, 0.071), (30, 0.031), (35, 0.015), (45, 0.109), (50, 0.036), (52, 0.021), (60, 0.02), (77, 0.57), (90, 0.024)]
simIndex simValue paperId paperTitle
same-paper 1 0.92267114 34 nips-2010-Attractor Dynamics with Synaptic Depression
Author: K. Wong, He Wang, Si Wu, Chi Fung
Abstract: Neuronal connection weights exhibit short-term depression (STD). The present study investigates the impact of STD on the dynamics of a continuous attractor neural network (CANN) and its potential roles in neural information processing. We find that the network with STD can generate both static and traveling bumps, and STD enhances the performance of the network in tracking external inputs. In particular, we find that STD endows the network with slow-decaying plateau behaviors, namely, the network being initially stimulated to an active state will decay to silence very slowly in the time scale of STD rather than that of neural signaling. We argue that this provides a mechanism for neural systems to hold short-term memory easily and shut off persistent activities naturally.
2 0.89922547 16 nips-2010-A VLSI Implementation of the Adaptive Exponential Integrate-and-Fire Neuron Model
Author: Sebastian Millner, Andreas Grübl, Karlheinz Meier, Johannes Schemmel, Marc-olivier Schwartz
Abstract: We describe an accelerated hardware neuron being capable of emulating the adaptive exponential integrate-and-fire neuron model. Firing patterns of the membrane stimulated by a step current are analyzed in transistor level simulations and in silicon on a prototype chip. The neuron is destined to be the hardware neuron of a highly integrated wafer-scale system reaching out for new computational paradigms and opening new experimentation possibilities. As the neuron is dedicated as a universal device for neuroscientific experiments, the focus lays on parameterizability and reproduction of the analytical model. 1
3 0.81345147 230 nips-2010-Robust Clustering as Ensembles of Affinity Relations
Author: Hairong Liu, Longin J. Latecki, Shuicheng Yan
Abstract: In this paper, we regard clustering as ensembles of k-ary affinity relations and clusters correspond to subsets of objects with maximal average affinity relations. The average affinity relation of a cluster is relaxed and well approximated by a constrained homogenous function. We present an efficient procedure to solve this optimization problem, and show that the underlying clusters can be robustly revealed by using priors systematically constructed from the data. Our method can automatically select some points to form clusters, leaving other points un-grouped; thus it is inherently robust to large numbers of outliers, which has seriously limited the applicability of classical methods. Our method also provides a unified solution to clustering from k-ary affinity relations with k ≥ 2, that is, it applies to both graph-based and hypergraph-based clustering problems. Both theoretical analysis and experimental results show the superiority of our method over classical solutions to the clustering problem, especially when there exists a large number of outliers.
4 0.7946074 15 nips-2010-A Theory of Multiclass Boosting
Author: Indraneel Mukherjee, Robert E. Schapire
Abstract: Boosting combines weak classifiers to form highly accurate predictors. Although the case of binary classification is well understood, in the multiclass setting, the “correct” requirements on the weak classifier, or the notion of the most efficient boosting algorithms are missing. In this paper, we create a broad and general framework, within which we make precise and identify the optimal requirements on the weak-classifier, as well as design the most effective, in a certain sense, boosting algorithms that assume such requirements. 1
5 0.71580368 142 nips-2010-Learning Bounds for Importance Weighting
Author: Corinna Cortes, Yishay Mansour, Mehryar Mohri
Abstract: This paper presents an analysis of importance weighting for learning from finite samples and gives a series of theoretical and algorithmic results. We point out simple cases where importance weighting can fail, which suggests the need for an analysis of the properties of this technique. We then give both upper and lower bounds for generalization with bounded importance weights and, more significantly, give learning guarantees for the more common case of unbounded importance weights under the weak assumption that the second moment is bounded, a condition related to the R´ nyi divergence of the training and test distributions. e These results are based on a series of novel and general bounds we derive for unbounded loss functions, which are of independent interest. We use these bounds to guide the definition of an alternative reweighting algorithm and report the results of experiments demonstrating its benefits. Finally, we analyze the properties of normalized importance weights which are also commonly used.
6 0.66296619 234 nips-2010-Segmentation as Maximum-Weight Independent Set
7 0.60158163 68 nips-2010-Effects of Synaptic Weight Diffusion on Learning in Decision Making Networks
8 0.55932719 252 nips-2010-SpikeAnts, a spiking neuron network modelling the emergence of organization in a complex system
10 0.53766149 10 nips-2010-A Novel Kernel for Learning a Neuron Model from Spike Train Data
11 0.5339343 8 nips-2010-A Log-Domain Implementation of the Diffusion Network in Very Large Scale Integration
12 0.53260046 253 nips-2010-Spike timing-dependent plasticity as dynamic filter
13 0.51563704 127 nips-2010-Inferring Stimulus Selectivity from the Spatial Structure of Neural Network Dynamics
14 0.50989902 200 nips-2010-Over-complete representations on recurrent neural networks can support persistent percepts
15 0.47797057 238 nips-2010-Short-term memory in neuronal networks through dynamical compressed sensing
16 0.44112062 117 nips-2010-Identifying graph-structured activation patterns in networks
17 0.43809786 115 nips-2010-Identifying Dendritic Processing
18 0.42890376 96 nips-2010-Fractionally Predictive Spiking Neurons
19 0.42202583 21 nips-2010-Accounting for network effects in neuronal responses using L1 regularized point process models
20 0.42066541 17 nips-2010-A biologically plausible network for the computation of orientation dominance