nips nips2002 nips2002-186 knowledge-graph by maker-knowledge-mining
Source: pdf
Author: R. J. Vogelstein, Francesco Tenore, Ralf Philipp, Miriam S. Adlerstein, David H. Goldberg, Gert Cauwenberghs
Abstract: Address-event representation (AER), originally proposed as a means to communicate sparse neural events between neuromorphic chips, has proven efficient in implementing large-scale networks with arbitrary, configurable synaptic connectivity. In this work, we further extend the functionality of AER to implement arbitrary, configurable synaptic plasticity in the address domain. As proof of concept, we implement a biologically inspired form of spike timing-dependent plasticity (STDP) based on relative timing of events in an AER framework. Experimental results from an analog VLSI integrate-and-fire network demonstrate address domain learning in a task that requires neurons to group correlated inputs.
Reference: text
sentIndex sentText sentNum sentScore
1 Abstract: Address-event representation (AER), originally proposed as a means to communicate sparse neural events between neuromorphic chips, has proven efficient in implementing large-scale networks with arbitrary, configurable synaptic connectivity. [sent-5, score-0.654]
2 In this work, we further extend the functionality of AER to implement arbitrary, configurable synaptic plasticity in the address domain. [sent-6, score-0.652]
3 As proof of concept, we implement a biologically inspired form of spike timing-dependent plasticity (STDP) based on relative timing of events in an AER framework. [sent-7, score-0.567]
4 Experimental results from an analog VLSI integrate-and-fire network demonstrate address domain learning in a task that requires neurons to group correlated inputs. [sent-8, score-0.53]
5 Developments in neuromorphic engineering and address-event representation (AER) have provided an infrastructure suitable for emulating large-scale neural systems in silicon, e.g. [sent-10, score-0.21]
6 Although adaptation and learning have been an integral part of neuromorphic engineering since its inception [1], only recently have implemented systems begun to incorporate them using biological models of synaptic plasticity. [sent-13, score-0.512]
7 A variety of learning rules have been realized in neuromorphic hardware [4, 5]. [sent-14, score-0.225]
8 These systems usually employ circuitry incorporated into the individual cells, imposing constraints on the nature of inputs and outputs of the implemented algorithm. [sent-15, score-0.053]
9 AER-based systems are inherently scalable, and because the encoding and decoding of events is performed at the periphery, learning algorithms can be arbitrarily complex without increasing the size of repeating neural units. [sent-18, score-0.223]
10 Furthermore, AER makes no assumptions about the signals repre- [sent-19, score-0.092]
[Figure 1: Address-event representation.]
11 Sender events are encoded into an address, sent over the bus, and decoded. [sent-20, score-0.333]
12 Handshaking signals REQ and ACK are required to ensure that only one cell pair is communicating at a time. [sent-21, score-0.125]
13 sented as spikes, so learning can address any measure of cellular activity. [sent-23, score-0.174]
14 (e.g., [6]), but recently, the possibility of modifying synapses based on the timing of action potentials has been explored in both the neuroscience [7, 8] and neuromorphic engineering disciplines [9]–[11]. [sent-27, score-0.253]
15 We propose that AER-based neuromorphic systems are ideally suited to implement learning rules founded on this notion of spike-timing dependent plasticity (STDP). [sent-29, score-0.346]
16 In the following sections, we describe an implementation of one biologically plausible STDP learning rule and demonstrate that table-based synaptic connectivity can be extended to table-based synaptic plasticity in a scalable and reconfigurable neuromorphic AER architecture. [sent-30, score-0.979]
17 2 Address-domain architecture
Address-event representation is a communication protocol that uses time-multiplexing to emulate extensive connectivity [12] (Fig. 1). [sent-31, score-0.114]
18 In an AER system, one array of neurons encodes its activity in the form of spikes that are transmitted to another array of neurons. [sent-33, score-0.335]
19 The “brute force” approach to communicating these signals would be to use one wire for each pair of neurons, requiring N wires for N cell pairs. [sent-34, score-0.158]
20 However, an AER system identifies the location of a spiking cell and encodes this as an address, which is then sent across a shared data bus. [sent-35, score-0.158]
21 The receiving array decodes the address and routes it to the appropriate cell, reconstructing the sender’s activity. [sent-36, score-0.235]
22 Handshaking signals REQ and ACK are required to ensure that only one cell pair is using the data bus at a time. [sent-37, score-0.136]
23 Two pieces of information uniquely identify a spike: its location, which is explicitly encoded as an address, and the time that it occurs, which need not be explicitly encoded because the events are communicated in real-time. [sent-39, score-0.316]
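Since timing is implicit, an address-event reduces to a bare address on the shared bus. The sketch below illustrates the idea in Python; the row-major address arithmetic and the deque standing in for the bus are assumptions for illustration, not the authors' hardware.

```python
# Minimal sketch (not the authors' code): a spike on the bus is nothing but
# the sender's address; its timing is implicit in when it appears.
from collections import deque

bus = deque()                      # shared data bus, one event at a time

def encode(row, col, cols=4):
    """Encode a spiking cell's (row, col) location as a single address."""
    return row * cols + col

def decode(address, cols=4):
    """Receiver side: route the address back to a cell location."""
    return divmod(address, cols)

bus.append(encode(1, 2))           # cell (1, 2) spikes "now"
print(decode(bus.popleft()))       # (1, 2) -- arrival time was the spike time
```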
24 In its original formulation, AER implements a one-to-one connection topology, which is appropriate for emulating the optic and auditory nerves [12, 13]. [sent-41, score-0.065]
25 To create more complex neural circuits, convergent and divergent connectivity is required. [sent-42, score-0.066]
26 Several authors have discussed and implemented methods of enhancing the connectivity of AER systems to this end [14]–[16]. [sent-43, score-0.119]
27 These methods call for a memory-based projective field mapping that enables routing an address-event to multiple receiver locations. [sent-44, score-0.155]
28 The enhanced architecture enables continuous-valued synaptic weights by means of graded (probabilistic or deterministic) transmission of address-events. [sent-49, score-0.307]
29 This architecture employs a look-up table (LUT), an integrate-and-fire address-event transceiver (IFAT), and some additional support circuitry. [sent-50, score-0.118]
30 Each row in the table corresponds to a single synaptic connection—it contains information about the sender location, the receiver location, the connection polarity (excitatory or inhibitory), and the connection magnitude. [sent-53, score-0.801]
31 When a spike is sent to the system, the sender address is used as an index into the LUT and a signal activates the event generator (EG) circuit. [sent-54, score-0.655]
32 The EG scrolls through all the table entries corresponding to synaptic connections from the sending neuron. [sent-55, score-0.424]
33 For each synapse, the receiver address and the spike polarity are sent to the IFAT, and the EG initiates as many spikes as are specified in the weight magnitude field. [sent-56, score-0.657]
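The LUT scan just described can be sketched as follows; the dictionary layout and field names are assumptions standing in for the RAM row format, not the actual memory map.

```python
# Sketch of table-based routing: each LUT row holds (receiver, polarity,
# magnitude), and the event generator (EG) emits `magnitude` events per synapse.
lut = {
    5: [(17, +1, 3), (18, -1, 1)],  # sender 5 -> excites 17 (x3), inhibits 18 (x1)
}

def event_generator(sender_address):
    """Scroll through all LUT entries for the sender; yield one event per unit weight."""
    for receiver, polarity, magnitude in lut.get(sender_address, []):
        for _ in range(magnitude):       # weight realized as repeated events
            yield (receiver, polarity)

print(list(event_generator(5)))
# [(17, 1), (17, 1), (17, 1), (18, -1)]
```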
34 Events received by the IFAT are temporally and spatially integrated by analog circuitry. [sent-57, score-0.12]
35 Each integrate-and-fire cell receives excitatory and inhibitory inputs that increment or decrement the potential stored on an internal capacitance. [sent-58, score-0.072]
36 When this potential exceeds a given threshold, the cell generates an output event and broadcasts its address to the AE arbiter. [sent-59, score-0.306]
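A software caricature of one such cell follows, with made-up digital parameters standing in for the analog threshold and charge increments.

```python
# Sketch of an integrate-and-fire cell of the kind the IFAT implements.
class IFCell:
    def __init__(self, threshold=10, step=1):
        self.v = 0                 # potential stored on internal capacitance
        self.threshold = threshold
        self.step = step           # increment/decrement per event

    def receive(self, polarity):
        self.v += polarity * self.step
        if self.v >= self.threshold:
            self.v = 0             # reset after firing
            return True            # broadcast address to the AE arbiter
        return False

cell = IFCell(threshold=3)
print([cell.receive(+1) for _ in range(4)])   # [False, False, True, False]
```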
37 The physical location of neurons in the array is inconsequential as connections are routed through the LUT, which is implemented in random-access memory (RAM) outside of the chip. [sent-60, score-0.383]
38 An interesting feature of the IFAT is that it is insensitive to the timescale over which events occur. [sent-61, score-0.223]
39 Effects of leakage current in real neurons are emulated by regularly sending inhibitory events to all of the cells in the array. [sent-63, score-0.415]
40 Modulating the timing of the “global decay events” allows us to dynamically warp the time axis. [sent-64, score-0.115]
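A sketch of this decay mechanism, assuming discretized potentials; on the chip the same effect is achieved by broadcasting inhibitory address-events to every cell, and stretching or compressing the interval between broadcasts warps the effective time axis.

```python
# Emulate leakage by decrementing every cell's potential once per decay event.
def global_decay(potentials, decay=1):
    return [max(0, v - decay) for v in potentials]

v = [5, 2, 0, 7]
v = global_decay(v)       # one "unit of time" T elapses
print(v)                  # [4, 1, 0, 6]
```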
41 We have designed and implemented a prototype system that uses the IFAT infrastructure to implement massively connected, reconfigurable neural networks. [sent-65, score-0.18]
42 It consists of a custom VLSI IFAT chip with a 1024-neuron array, a RAM that stores the look-up table, and a microcontroller unit (MCU) that realizes the event generator. [sent-68, score-0.164]
43 The elements are an integrate-and-fire array transceiver (IFAT) chip, a random-access memory (RAM) look-up table, and a microcontroller unit (MCU). [sent-71, score-0.131]
44 Input events are routed by the RAM look-up table, and integrated by the IFAT chip. [sent-73, score-0.322]
45 Events emitted by the IFAT are sent to the look-up table, where they are routed back to the IFAT. [sent-75, score-0.145]
46 Biologically, the synaptic weight w is determined by the product of three physical mechanisms: w = npq, (1) where n is the number of quantal neurotransmitter release sites, p is the probability of synaptic release per site, and q is the measure of the postsynaptic effect of the synapse. [sent-77, score-0.516]
47 Many early neural network models held n and p constant and attributed all of the variability in the weight to q. [sent-78, score-0.079]
48 Our architecture is capable of varying all three components: n by sending multiple events to the same receiver location, p by probabilistically routing the events (as in [17]), and q by varying the size of the potential increments and decrements in the IFAT cells. [sent-79, score-0.688]
49 In the experiments described in this paper, the transmission of address-events is deterministic, and the weight is controlled by varying the number of events per synapse, corresponding to a variation in n. [sent-80, score-0.272]
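The decomposition in Eq. (1) maps onto three knobs, sketched below with illustrative values; as stated above, only n is varied in the experiments reported here.

```python
# Sketch of the three weight components: the expected postsynaptic effect of
# one presynaptic spike is w = n * p * q. Values are illustrative assumptions.
import random

def deliver(n, p, q):
    """Send n events, each routed with probability p, each of size q."""
    return sum(q for _ in range(n) if random.random() < p)

random.seed(0)
print(deliver(n=5, p=1.0, q=1))   # deterministic routing, as in the experiments: 5
print(deliver(n=5, p=0.5, q=2))   # stochastic; expected value is n*p*q = 5
```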
50 3 Address-domain learning
The AER architecture lends itself to implementations of synaptic plasticity, since information about presynaptic and postsynaptic activity is readily available and the contents of the synaptic weight fields in RAM are easily modifiable “on the fly.” [sent-81, score-1.15]
51 As in biological systems, synapses can be dynamically created and pruned by inserting or deleting entries in the LUT. [sent-82, score-0.091]
52 As with address-domain connectivity, address-domain plasticity has the advantage that the constituents of the implemented learning rule are not constrained to be local in space or time. [sent-83, score-0.609]
53 Basic forms of Hebbian learning can be implemented with no overhead in the address domain. [sent-85, score-0.227]
54 When a presynaptic event, routed by the LUT through the IFAT, elicits a postsynaptic event, the synaptic strength between the two neurons is simply updated by incrementing the data field of the LUT entry at the active address location. [sent-86, score-1.139]
55 A similar strategy can be adopted for other learning rules of the incremental outer-product type, such as delta-rule or backpropagation supervised learning. [sent-87, score-0.051]
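A sketch of the zero-overhead Hebbian case described above; the (sender, receiver) key and unit increment are assumptions standing in for the RAM address arithmetic.

```python
# When a routed presynaptic event elicits a postsynaptic event, bump the
# weight field of the active LUT entry, modifying RAM "on the fly".
lut = {(5, 17): 3}            # (sender, receiver) -> weight magnitude

def hebbian_update(sender, receiver, postsynaptic_fired, lr=1):
    if postsynaptic_fired:
        lut[(sender, receiver)] += lr

hebbian_update(5, 17, postsynaptic_fired=True)
print(lut[(5, 17)])           # 4
```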
56 Non-local learning rules require control of the LUT address space to implement spatial and/or temporal dependencies. [sent-88, score-0.266]
57 (a) Synaptic updates ∆w as a function of the relative timing of presynaptic and postsynaptic events, with asymmetric windows of anti-causal and causal regimes, τ− > τ+. [sent-90, score-0.58]
58 (b) Address-domain implementation using presynaptic (top) and postsynaptic (bottom) event queues of window lengths τ+ and τ−. [sent-91, score-0.524]
59 4 Spike timing-dependent plasticity
Learning rules based on STDP specify changes in synaptic strength depending on the time interval between each pair of presynaptic and postsynaptic events. [sent-93, score-0.956]
60 “Causal” postsynaptic events that follow presynaptic action potentials (APs) within a short time window potentiate the synaptic strength, while “anti-causal” presynaptic events that follow postsynaptic APs within a short window depress it. [sent-94, score-1.858]
61 The amount of strengthening or weakening is dependent on the exact time of the event within the causal or anti-causal regime, as illustrated in Fig. 4(a). [sent-95, score-0.205]
62 The weight update has the form

    ∆w = −η[τ− − (tpre − tpost)]   if 0 ≤ tpre − tpost ≤ τ−
         η[τ+ + (tpre − tpost)]    if −τ+ ≤ tpre − tpost ≤ 0
         0                         otherwise                   (2)

where tpre and tpost denote the time stamps of presynaptic and postsynaptic events. [sent-97, score-1.41]
63 For stable learning, the time windows of causal and anti-causal regimes τ + and τ− are subject to the constraint τ+ < τ− . [sent-98, score-0.115]
64 For more general functional forms of STDP ∆w(tpre − tpost), the area under the synaptic modification curve in the anti-causal regime must be greater than that in the causal regime to ensure convergence of the synaptic strengths [7]. [sent-99, score-1.029]
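Equation (2) transcribes directly into code; the parameter values below are placeholders chosen to respect the stability constraint τ+ < τ−.

```python
# Direct transcription of the piecewise STDP rule in Eq. (2).
def delta_w(t_pre, t_post, eta=1.0, tau_plus=5, tau_minus=8):
    dt = t_pre - t_post
    if 0 <= dt <= tau_minus:          # anti-causal: pre after post -> depress
        return -eta * (tau_minus - dt)
    if -tau_plus <= dt <= 0:          # causal: pre before post -> potentiate
        return eta * (tau_plus + dt)
    return 0.0

print(delta_w(t_pre=10, t_post=12))   # causal, dt = -2 -> +3.0
print(delta_w(t_pre=12, t_post=10))   # anti-causal, dt = 2 -> -6.0
```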
65 The STDP synaptic modification rule (2) is implemented in the address domain by augmenting the AER architecture with two event queues, one each for presynaptic and postsynaptic events, shown in Figure 4(b). [sent-100, score-1.108]
66 Each time a presynaptic event is generated, the sender’s address is entered into a queue with an associated value of τ+. [sent-101, score-0.577]
67 All values in the queue are decremented every time a global decay event is observed, marking one unit of time T . [sent-102, score-0.292]
68 When a postsynaptic event occurs, the system updates the synaptic strengths in the LUT according to the values stored in the queue. [sent-110, score-0.377]
69 Anti-causal events require an equivalent set of operations, matching each incoming presynaptic spike with a second queue of postsynaptic events. [sent-111, score-0.854]
70 In this case, entries in the queue are initialized with a value of τ− and decremented after every interval of time T between decay events, corresponding to the decrease in strength to be applied to the synapse connecting the presynaptic/postsynaptic pair. [sent-112, score-0.273]
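A sketch of the causal half of this queue mechanism follows (the anti-causal half mirrors it with a postsynaptic queue initialized to τ−); the data structures are assumptions for illustration, not the actual MCU firmware.

```python
# Presynaptic addresses enter a queue with value TAU_PLUS; every global decay
# event decrements all entries (one unit of time T); a postsynaptic spike
# potentiates each still-queued synapse by its remaining value.
TAU_PLUS, ETA = 3, 1

pre_queue = []          # [sender_address, remaining_value] pairs
weights = {(0, 9): 8}   # LUT weight fields

def on_presynaptic(addr):
    pre_queue.append([addr, TAU_PLUS])

def on_decay_event():   # one unit of time T elapses
    for entry in pre_queue:
        entry[1] -= 1
    pre_queue[:] = [e for e in pre_queue if e[1] > 0]

def on_postsynaptic(post_addr):
    for addr, value in pre_queue:   # causal pairings still inside the window
        weights[(addr, post_addr)] += ETA * value

on_presynaptic(0)
on_decay_event()                    # tpre - tpost = -1 unit of T
on_postsynaptic(9)
print(weights[(0, 9)])              # 8 + eta*(TAU_PLUS - 1) = 10
```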
71 We have chosen a particularly simple form of the synaptic modification function (2) as proof of principle in the experiments. [sent-113, score-0.307]
72 More general functions can be implemented by a table that maps time bins in the history of the queue to specified values of ∆w(nT ), with positive values of n indexing the postsynaptic queue, and negative values indexing the presynaptic queue. [sent-114, score-0.61]
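The table-driven generalization just described can be sketched as a signed-bin lookup; the particular ∆w values below are invented for illustration.

```python
# Map signed time bins n (in units of T) to update values delta_w(n*T):
# negative n indexes the presynaptic queue (causal), positive n the
# postsynaptic queue (anti-causal).
dw_table = {-3: 2, -2: 4, -1: 6,       # causal bins: potentiate
             1: -5, 2: -3, 3: -1}      # anti-causal bins: depress

def update_for_bin(n):
    return dw_table.get(n, 0)

print(update_for_bin(-1), update_for_bin(2), update_for_bin(7))  # 6 -3 0
```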
73 5 Experimental results
We have implemented a Hebbian spike timing-based learning rule on a network of 21 neurons using the IFAT system (Fig. [sent-115, score-0.315]
74 Each of the 20 neurons in the input layer is driven by an externally supplied, randomly generated list of events. [sent-117, score-0.171]
75 Sufficiently high levels of input cause these neurons to produce spikes that subsequently drive the output layer. [sent-118, score-0.173]
76 All events are communicated over the address-event bus and are monitored by a workstation communicating with the MCU and RAM. [sent-119, score-0.414]
77 We have shown that this can be accomplished in hardware in the address domain by presenting the network with stimulus patterns containing a set of correlated inputs and a set of uncorrelated inputs: neurons x1 , … [sent-121, score-0.503]
78 Thus, over a sufficiently long period of time each neuron in the input layer will receive the same amount of activation, but the correlated group will fire synchronous spikes more frequently than any other combination of neurons. [sent-129, score-0.17]
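A sketch of such a stimulus, under assumed group sizes and rates (the paper's exact split of the 20 inputs is not reproduced here): every neuron receives the same average rate, but the correlated subgroup shares its event times.

```python
# Each input neuron fires at the same average rate, but the correlated
# subgroup draws a single shared coin per time step, so its members fire
# synchronously. Not the actual event lists used on-chip.
import random
random.seed(1)

N = 20
correlated = set(range(10))          # assumed split: first 10 inputs correlated
events = []                          # (time_step, input_neuron)
for t in range(1000):
    group_fires = random.random() < 0.1   # one draw shared by the whole group
    for i in range(N):
        fires = group_fires if i in correlated else (random.random() < 0.1)
        if fires:
            events.append((t, i))
```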
79 In the implemented learning rule (2), causal activity results in synaptic strengthening and anti-causal activity results in synaptic weakening. [sent-130, score-0.864]
80 All synapses connecting the input and output layers are equally likely to be active during an anti-causal regime. [sent-135, score-0.092]
81 However, the increase in average contribution to the postsynaptic membrane potential for the correlated group of neurons renders this population slightly more likely to be active during the causal regime than any single member of the uncorrelated group. [sent-136, score-0.6]
82 Therefore, the synaptic strengths for this group of neurons will increase with respect to the uncorrelated group, further augmenting their likelihood of causing a postsynaptic spike. [sent-137, score-0.78]
83 Over time, this positive feedback results in a random but stable distribution of synaptic strengths in which the correlated neurons’ synapses form the strongest connections and the remaining neurons are distributed around an equilibrium value for weak connections. [sent-138, score-0.656]
84 An example of a typical distribution of synaptic strengths recorded after 200,000 events have been processed by the input layer is shown in Fig. [sent-140, score-0.646]
85 For the data shown, synapses driving the input layer were fixed at the maximum strength (+31), the rate of decay was −4 per unit of time, and the plastic synapses between the input and output layers were all initialized to +8. [sent-142, score-0.319]
86 Because the events sent from the workstation to the input layer are randomly generated, fluctuations in the strengths of individual synapses occur continually throughout the operation of the system. [sent-143, score-0.531]
87 Thus, the final distribution of synaptic weights is different each time, but a pattern can be clearly discerned from the average value of synaptic weights after 20 separate trials of 200,000 events each, as shown in Fig. [sent-144, score-0.837]
88 The system is robust to changes in various parameters of the spike timing-based learning algorithm as well as to modifications in the number of correlated, uncorrelated, and total neurons (data not shown). [sent-146, score-0.232]
89 It also converges to a similar distribution regardless of the initial values of the synaptic strengths (with the constraint that the net activity must be larger than the rate of decay of the voltage stored on the membrane capacitance of the output neuron). [sent-147, score-0.466]
90 6 Conclusion
We have demonstrated that the address domain provides an efficient representation to implement synaptic plasticity that depends on the relative timing of events. [sent-148, score-0.757]
91 Unlike dedicated hardware implementations of learning functions embedded into the connectivity, the address domain implementation allows for learning rules with interactions that are not constrained in space and time. [sent-149, score-0.314]
92 The IFAT architecture can be augmented to include sensory input, physical nearest-neighbor connectivity between neurons, and more realistic biological models of neural computation. [sent-151, score-0.142]
93 Unlike a purely digital implementation or software emulation, the AER framework preserves the continuous nature of the timing of events. [sent-153, score-0.066]
94 Whatley, “A pulse-coded communications infrastructure for neuromorphic systems,” in Pulsed Neural Networks (W. [sent-163, score-0.173]
95 Abbott, “Competitive Hebbian learning through spike-timing-dependent synaptic plasticity,” Nature Neuroscience, vol.
96 Turrigiano, “Stable Hebbian learning from spike timing-dependent plasticity,” Journal of Neuroscience, vol. [sent-208, score-0.107]
97 Mahowald, “Spike based normalizing Hebbian learning in an analog VLSI artificial neuron,” in Learning On Silicon (G. [sent-214, score-0.086]
98 Boahen, “Point-to-point connectivity between neuromorphic chips using address events,” IEEE Trans. [sent-251, score-0.364]
99 Koch, “Multi-chip neuromorphic motion processing,” in Proceedings 20th Anniversary Conference on Advanced Research in VLSI (D. [sent-259, score-0.124]
100 Andreou, “Probabilistic synaptic weighting in a reconfigurable network of VLSI integrate-and-fire neurons,” Neural Networks, vol. [sent-280, score-0.337]
wordName wordTfidf (topN-words)
[('ifat', 0.375), ('aer', 0.319), ('synaptic', 0.307), ('events', 0.223), ('postsynaptic', 0.209), ('sender', 0.206), ('presynaptic', 0.19), ('address', 0.174), ('stdp', 0.134), ('tpost', 0.13), ('lut', 0.13), ('plasticity', 0.13), ('neurons', 0.125), ('queue', 0.125), ('receiver', 0.125), ('neuromorphic', 0.124), ('ram', 0.118), ('mcu', 0.112), ('vlsi', 0.11), ('spike', 0.107), ('tpre', 0.104), ('hebbian', 0.099), ('event', 0.088), ('causal', 0.087), ('analog', 0.086), ('req', 0.082), ('sent', 0.08), ('polarity', 0.074), ('strengths', 0.07), ('synapse', 0.07), ('strength', 0.069), ('timing', 0.066), ('connectivity', 0.066), ('gurable', 0.065), ('ack', 0.065), ('routed', 0.065), ('regime', 0.064), ('synapses', 0.063), ('array', 0.061), ('recon', 0.06), ('bus', 0.06), ('implemented', 0.053), ('rules', 0.051), ('hardware', 0.05), ('decay', 0.049), ('communicating', 0.049), ('workstation', 0.049), ('infrastructure', 0.049), ('weight', 0.049), ('architecture', 0.048), ('spikes', 0.048), ('layer', 0.046), ('correlated', 0.046), ('pol', 0.045), ('scalable', 0.045), ('connections', 0.045), ('cell', 0.044), ('silicon', 0.044), ('circuits', 0.043), ('chip', 0.043), ('mahowald', 0.042), ('implement', 0.041), ('activity', 0.04), ('domain', 0.039), ('sending', 0.039), ('uncorrelated', 0.039), ('bayoumi', 0.037), ('emulating', 0.037), ('massively', 0.037), ('norwell', 0.037), ('queues', 0.037), ('transceiver', 0.037), ('cauwenberghs', 0.037), ('eg', 0.036), ('location', 0.034), ('integrated', 0.034), ('table', 0.033), ('handshaking', 0.033), ('board', 0.033), ('communicated', 0.033), ('encoder', 0.033), ('microcontroller', 0.033), ('wires', 0.033), ('signals', 0.032), ('encoded', 0.03), ('network', 0.03), ('strengthening', 0.03), ('decremented', 0.03), ('routing', 0.03), ('group', 0.03), ('enhanced', 0.029), ('layers', 0.029), ('inhibitory', 0.028), ('biological', 0.028), ('boahen', 0.028), ('regimes', 0.028), ('aps', 0.028), ('kluwer', 0.028), ('connection', 0.028), ('modi', 0.027)]
simIndex simValue paperId paperTitle
same-paper 1 0.99999946 186 nips-2002-Spike Timing-Dependent Plasticity in the Address Domain
Author: R. J. Vogelstein, Francesco Tenore, Ralf Philipp, Miriam S. Adlerstein, David H. Goldberg, Gert Cauwenberghs
Abstract: Address-event representation (AER), originally proposed as a means to communicate sparse neural events between neuromorphic chips, has proven efficient in implementing large-scale networks with arbitrary, configurable synaptic connectivity. In this work, we further extend the functionality of AER to implement arbitrary, configurable synaptic plasticity in the address domain. As proof of concept, we implement a biologically inspired form of spike timing-dependent plasticity (STDP) based on relative timing of events in an AER framework. Experimental results from an analog VLSI integrate-and-fire network demonstrate address domain learning in a task that requires neurons to group correlated inputs.
2 0.3222813 154 nips-2002-Neuromorphic Bistable VLSI Synapses with Spike-Timing-Dependent Plasticity
Author: Giacomo Indiveri
Abstract: We present analog neuromorphic circuits for implementing bistable synapses with spike-timing-dependent plasticity (STDP) properties. In these types of synapses, the short-term dynamics of the synaptic efficacies are governed by the relative timing of the pre- and post-synaptic spikes, while on long time scales the efficacies tend asymptotically to either a potentiated state or to a depressed one. We fabricated a prototype VLSI chip containing a network of integrate and fire neurons interconnected via bistable STDP synapses. Test results from this chip demonstrate the synapse’s STDP learning properties, and its long-term bistable characteristics.
3 0.30297831 102 nips-2002-Hidden Markov Model of Cortical Synaptic Plasticity: Derivation of the Learning Rule
Author: Michael Eisele, Kenneth D. Miller
Abstract: Cortical synaptic plasticity depends on the relative timing of pre- and postsynaptic spikes and also on the temporal pattern of presynaptic spikes and of postsynaptic spikes. We study the hypothesis that cortical synaptic plasticity does not associate individual spikes, but rather whole firing episodes, and depends only on when these episodes start and how long they last, but as little as possible on the timing of individual spikes. Here we present the mathematical background for such a study. Standard methods from hidden Markov models are used to define what “firing episodes” are. Estimating the probability of being in such an episode requires not only the knowledge of past spikes, but also of future spikes. We show how to construct a causal learning rule, which depends only on past spikes, but associates pre- and postsynaptic firing episodes as if it also knew future spikes. We also show that this learning rule agrees with some features of synaptic plasticity in superficial layers of rat visual cortex (Froemke and Dan, Nature 416:433, 2002).
4 0.2134041 180 nips-2002-Selectivity and Metaplasticity in a Unified Calcium-Dependent Model
Author: Luk Chong Yeung, Brian S. Blais, Leon N. Cooper, Harel Z. Shouval
Abstract: A unified, biophysically motivated Calcium-Dependent Learning model has been shown to account for various rate-based and spike time-dependent paradigms for inducing synaptic plasticity. Here, we investigate the properties of this model for a multi-synapse neuron that receives inputs with different spike-train statistics. In addition, we present a physiological form of metaplasticity, an activity-driven regulation mechanism, that is essential for the robustness of the model. A neuron thus implemented develops stable and selective receptive fields, given various input statistics
5 0.20089899 50 nips-2002-Circuit Model of Short-Term Synaptic Dynamics
Author: Shih-Chii Liu, Malte Boegershausen, Pascal Suter
Abstract: We describe a model of short-term synaptic depression that is derived from a silicon circuit implementation. The dynamics of this circuit model are similar to the dynamics of some present theoretical models of short-term depression except that the recovery dynamics of the variable describing the depression is nonlinear and it also depends on the presynaptic frequency. The equations describing the steady-state and transient responses of this synaptic model fit the experimental results obtained from a fabricated silicon network consisting of leaky integrate-and-fire neurons and different types of synapses. We also show experimental data demonstrating the possible computational roles of depression. One possible role of a depressing synapse is that the input can quickly bring the neuron up to threshold when the membrane potential is close to the resting potential.
6 0.18910135 200 nips-2002-Topographic Map Formation by Silicon Growth Cones
7 0.16257891 76 nips-2002-Dynamical Constraints on Computing with Spike Timing in the Cortex
8 0.15724145 129 nips-2002-Learning in Spiking Neural Assemblies
9 0.12792091 177 nips-2002-Retinal Processing Emulation in a Programmable 2-Layer Analog Array Processor CMOS Chip
10 0.12650904 51 nips-2002-Classifying Patterns of Visual Motion - a Neuromorphic Approach
11 0.12606822 171 nips-2002-Reconstructing Stimulus-Driven Neural Networks from Spike Times
12 0.1143726 11 nips-2002-A Model for Real-Time Computation in Generic Neural Microcircuits
13 0.10355357 66 nips-2002-Developing Topography and Ocular Dominance Using Two aVLSI Vision Sensors and a Neurotrophic Model of Plasticity
14 0.095599353 43 nips-2002-Binary Coding in Auditory Cortex
15 0.093626708 91 nips-2002-Field-Programmable Learning Arrays
16 0.079048082 198 nips-2002-Theory-Based Causal Inference
17 0.078071587 184 nips-2002-Spectro-Temporal Receptive Fields of Subthreshold Responses in Auditory Cortex
18 0.076903909 5 nips-2002-A Digital Antennal Lobe for Pattern Equalization: Analysis and Design
19 0.076656088 116 nips-2002-Interpreting Neural Response Variability as Monte Carlo Sampling of the Posterior
20 0.071286984 28 nips-2002-An Information Theoretic Approach to the Functional Classification of Neurons
topicId topicWeight
[(0, -0.171), (1, 0.303), (2, 0.028), (3, -0.146), (4, 0.08), (5, 0.337), (6, 0.199), (7, -0.034), (8, 0.002), (9, -0.081), (10, 0.073), (11, -0.05), (12, -0.003), (13, 0.045), (14, 0.104), (15, 0.035), (16, -0.039), (17, -0.086), (18, 0.023), (19, -0.058), (20, -0.056), (21, -0.08), (22, 0.068), (23, 0.046), (24, -0.048), (25, -0.01), (26, -0.058), (27, 0.021), (28, -0.048), (29, -0.164), (30, 0.063), (31, -0.036), (32, 0.078), (33, -0.046), (34, -0.146), (35, -0.028), (36, -0.034), (37, -0.066), (38, 0.052), (39, 0.003), (40, -0.033), (41, 0.087), (42, 0.054), (43, -0.028), (44, -0.016), (45, 0.038), (46, -0.01), (47, 0.029), (48, -0.049), (49, 0.0)]
simIndex simValue paperId paperTitle
same-paper 1 0.97013545 186 nips-2002-Spike Timing-Dependent Plasticity in the Address Domain
Author: R. J. Vogelstein, Francesco Tenore, Ralf Philipp, Miriam S. Adlerstein, David H. Goldberg, Gert Cauwenberghs
Abstract: Address-event representation (AER), originally proposed as a means to communicate sparse neural events between neuromorphic chips, has proven efficient in implementing large-scale networks with arbitrary, configurable synaptic connectivity. In this work, we further extend the functionality of AER to implement arbitrary, configurable synaptic plasticity in the address domain. As proof of concept, we implement a biologically inspired form of spike timing-dependent plasticity (STDP) based on relative timing of events in an AER framework. Experimental results from an analog VLSI integrate-and-fire network demonstrate address domain learning in a task that requires neurons to group correlated inputs.
2 0.82977003 154 nips-2002-Neuromorphic Bistable VLSI Synapses with Spike-Timing-Dependent Plasticity
Author: Giacomo Indiveri
Abstract: We present analog neuromorphic circuits for implementing bistable synapses with spike-timing-dependent plasticity (STDP) properties. In these types of synapses, the short-term dynamics of the synaptic efficacies are governed by the relative timing of the pre- and post-synaptic spikes, while on long time scales the efficacies tend asymptotically to either a potentiated state or to a depressed one. We fabricated a prototype VLSI chip containing a network of integrate and fire neurons interconnected via bistable STDP synapses. Test results from this chip demonstrate the synapse’s STDP learning properties, and its long-term bistable characteristics.
3 0.80681622 200 nips-2002-Topographic Map Formation by Silicon Growth Cones
Author: Brian Taba, Kwabena A. Boahen
Abstract: We describe a self-configuring neuromorphic chip that uses a model of activity-dependent axon remodeling to automatically wire topographic maps based solely on input correlations. Axons are guided by growth cones, which are modeled in analog VLSI for the first time. Growth cones migrate up neurotropin gradients, which are represented by charge diffusing in transistor channels. Virtual axons move by rerouting address-events. We refined an initially gross topographic projection by simulating retinal wave input. 1 Neuromorphic Systems Neuromorphic engineers are attempting to match the computational efficiency of biological systems by morphing neurocircuitry into silicon circuits [1]. One of the most detailed implementations to date is the silicon retina described in [2] . This chip comprises thirteen different cell types, each of which must be individually and painstakingly wired. While this circuit-level approach has been very successful in sensory systems, it is less helpful when modeling largely unelucidated and exceedingly plastic higher processing centers in cortex. Instead of an explicit blueprint for every cortical area, what is needed is a developmental rule that can wire complex circuits from minimal specifications. One candidate is the famous
4 0.78265953 180 nips-2002-Selectivity and Metaplasticity in a Unified Calcium-Dependent Model
Author: Luk Chong Yeung, Brian S. Blais, Leon N. Cooper, Harel Z. Shouval
Abstract: A unified, biophysically motivated Calcium-Dependent Learning model has been shown to account for various rate-based and spike time-dependent paradigms for inducing synaptic plasticity. Here, we investigate the properties of this model for a multi-synapse neuron that receives inputs with different spike-train statistics. In addition, we present a physiological form of metaplasticity, an activity-driven regulation mechanism, that is essential for the robustness of the model. A neuron thus implemented develops stable and selective receptive fields, given various input statistics
5 0.77555746 102 nips-2002-Hidden Markov Model of Cortical Synaptic Plasticity: Derivation of the Learning Rule
Author: Michael Eisele, Kenneth D. Miller
Abstract: Cortical synaptic plasticity depends on the relative timing of pre- and postsynaptic spikes and also on the temporal pattern of presynaptic spikes and of postsynaptic spikes. We study the hypothesis that cortical synaptic plasticity does not associate individual spikes, but rather whole firing episodes, and depends only on when these episodes start and how long they last, but as little as possible on the timing of individual spikes. Here we present the mathematical background for such a study. Standard methods from hidden Markov models are used to define what “firing episodes” are. Estimating the probability of being in such an episode requires not only the knowledge of past spikes, but also of future spikes. We show how to construct a causal learning rule, which depends only on past spikes, but associates pre- and postsynaptic firing episodes as if it also knew future spikes. We also show that this learning rule agrees with some features of synaptic plasticity in superficial layers of rat visual cortex (Froemke and Dan, Nature 416:433, 2002).
6 0.68055379 50 nips-2002-Circuit Model of Short-Term Synaptic Dynamics
8 0.43897259 11 nips-2002-A Model for Real-Time Computation in Generic Neural Microcircuits
9 0.43535891 76 nips-2002-Dynamical Constraints on Computing with Spike Timing in the Cortex
10 0.42956978 129 nips-2002-Learning in Spiking Neural Assemblies
11 0.42704177 177 nips-2002-Retinal Processing Emulation in a Programmable 2-Layer Analog Array Processor CMOS Chip
12 0.37637264 71 nips-2002-Dopamine Induced Bistability Enhances Signal Processing in Spiny Neurons
13 0.32570645 91 nips-2002-Field-Programmable Learning Arrays
14 0.31941342 171 nips-2002-Reconstructing Stimulus-Driven Neural Networks from Spike Times
15 0.30733743 160 nips-2002-Optoelectronic Implementation of a FitzHugh-Nagumo Neural Model
16 0.28461179 18 nips-2002-Adaptation and Unsupervised Learning
17 0.26346818 23 nips-2002-Adaptive Quantization and Density Estimation in Silicon
18 0.2628341 75 nips-2002-Dynamical Causal Learning
19 0.24808413 51 nips-2002-Classifying Patterns of Visual Motion - a Neuromorphic Approach
20 0.24283674 198 nips-2002-Theory-Based Causal Inference
topicId topicWeight
[(1, 0.013), (3, 0.042), (23, 0.019), (42, 0.041), (54, 0.067), (55, 0.033), (58, 0.3), (67, 0.04), (68, 0.064), (74, 0.072), (83, 0.077), (87, 0.013), (92, 0.018), (98, 0.109)]
simIndex simValue paperId paperTitle
same-paper 1 0.84078282 186 nips-2002-Spike Timing-Dependent Plasticity in the Address Domain
Author: R. J. Vogelstein, Francesco Tenore, Ralf Philipp, Miriam S. Adlerstein, David H. Goldberg, Gert Cauwenberghs
Abstract: Address-event representation (AER), originally proposed as a means to communicate sparse neural events between neuromorphic chips, has proven efficient in implementing large-scale networks with arbitrary, configurable synaptic connectivity. In this work, we further extend the functionality of AER to implement arbitrary, configurable synaptic plasticity in the address domain. As proof of concept, we implement a biologically inspired form of spike timing-dependent plasticity (STDP) based on relative timing of events in an AER framework. Experimental results from an analog VLSI integrate-and-fire network demonstrate address domain learning in a task that requires neurons to group correlated inputs.
2 0.71220982 61 nips-2002-Convergent Combinations of Reinforcement Learning with Linear Function Approximation
Author: Ralf Schoknecht, Artur Merke
Abstract: Convergence for iterative reinforcement learning algorithms like TD(O) depends on the sampling strategy for the transitions. However, in practical applications it is convenient to take transition data from arbitrary sources without losing convergence. In this paper we investigate the problem of repeated synchronous updates based on a fixed set of transitions. Our main theorem yields sufficient conditions of convergence for combinations of reinforcement learning algorithms and linear function approximation. This allows to analyse if a certain reinforcement learning algorithm and a certain function approximator are compatible. For the combination of the residual gradient algorithm with grid-based linear interpolation we show that there exists a universal constant learning rate such that the iteration converges independently of the concrete transition data. 1
3 0.60295284 161 nips-2002-PAC-Bayes & Margins
Author: John Langford, John Shawe-Taylor
Abstract: unkown-abstract
4 0.50082564 200 nips-2002-Topographic Map Formation by Silicon Growth Cones
Author: Brian Taba, Kwabena A. Boahen
Abstract: We describe a self-configuring neuromorphic chip that uses a model of activity-dependent axon remodeling to automatically wire topographic maps based solely on input correlations. Axons are guided by growth cones, which are modeled in analog VLSI for the first time. Growth cones migrate up neurotropin gradients, which are represented by charge diffusing in transistor channels. Virtual axons move by rerouting address-events. We refined an initially gross topographic projection by simulating retinal wave input. 1 Neuromorphic Systems Neuromorphic engineers are attempting to match the computational efficiency of biological systems by morphing neurocircuitry into silicon circuits [1]. One of the most detailed implementations to date is the silicon retina described in [2] . This chip comprises thirteen different cell types, each of which must be individually and painstakingly wired. While this circuit-level approach has been very successful in sensory systems, it is less helpful when modeling largely unelucidated and exceedingly plastic higher processing centers in cortex. Instead of an explicit blueprint for every cortical area, what is needed is a developmental rule that can wire complex circuits from minimal specifications. One candidate is the famous
5 0.49782914 159 nips-2002-Optimality of Reinforcement Learning Algorithms with Linear Function Approximation
Author: Ralf Schoknecht
Abstract: There are several reinforcement learning algorithms that yield approximate solutions for the problem of policy evaluation when the value function is represented with a linear function approximator. In this paper we show that each of the solutions is optimal with respect to a specific objective function. Moreover, we characterise the different solutions as images of the optimal exact value function under different projection operations. The results presented here will be useful for comparing the algorithms in terms of the error they achieve relative to the error of the optimal approximate solution. 1
6 0.49201745 199 nips-2002-Timing and Partial Observability in the Dopamine System
7 0.48477399 154 nips-2002-Neuromorphic Bistable VLSI Synapses with Spike-Timing-Dependent Plasticity
8 0.47980058 168 nips-2002-Real-Time Monitoring of Complex Industrial Processes with Particle Filters
9 0.47875109 50 nips-2002-Circuit Model of Short-Term Synaptic Dynamics
10 0.46374071 11 nips-2002-A Model for Real-Time Computation in Generic Neural Microcircuits
11 0.46204036 23 nips-2002-Adaptive Quantization and Density Estimation in Silicon
12 0.45775175 130 nips-2002-Learning in Zero-Sum Team Markov Games Using Factored Value Functions
13 0.45379663 5 nips-2002-A Digital Antennal Lobe for Pattern Equalization: Analysis and Design
14 0.45170116 51 nips-2002-Classifying Patterns of Visual Motion - a Neuromorphic Approach
15 0.45113873 91 nips-2002-Field-Programmable Learning Arrays
16 0.44887999 43 nips-2002-Binary Coding in Auditory Cortex
17 0.44598025 62 nips-2002-Coulomb Classifiers: Generalizing Support Vector Machines via an Analogy to Electrostatic Systems
18 0.44531009 102 nips-2002-Hidden Markov Model of Cortical Synaptic Plasticity: Derivation of the Learning Rule
19 0.44502747 76 nips-2002-Dynamical Constraints on Computing with Spike Timing in the Cortex
20 0.4445869 148 nips-2002-Morton-Style Factorial Coding of Color in Primary Visual Cortex