NIPS 2000, paper 67: Homeostasis in a Silicon Integrate and Fire Neuron
Authors: Shih-Chii Liu, Bradley A. Minch
Abstract

In this work, we explore homeostasis in a silicon integrate-and-fire neuron. The neuron adapts its firing rate over long time periods, on the order of seconds or minutes, so that it returns to its spontaneous firing rate after a lasting perturbation. Homeostasis is implemented via two schemes. One scheme looks at the presynaptic activity and adapts the synaptic weight depending on the presynaptic spiking rate. The second scheme adapts the synaptic "threshold" depending on the neuron's activity. The threshold is lowered if the neuron's activity decreases over a long time and is increased for a prolonged increase in postsynaptic activity. The results shown here are measured from a chip fabricated in a 2-μm CMOS process.
1 Introduction

We explored long time-constant adaptation mechanisms in a simple integrate-and-fire silicon neuron. Many researchers have postulated constant adaptation mechanisms which, for example, preserve the firing rate of the neuron over long time intervals (Liu et al. 1998) or use the presynaptic spiking statistics to adapt the spiking rate of the neuron so that the distribution of this spiking rate is uniform (Stemmler and Koch 1999). Biological experiments have also shown such homeostasis: if the K or Na conductances of a cell are perturbed by adding antagonists, the cell returns to its original spiking rate within a couple of days.
This work differs from previous work that explores the adaptation of the firing threshold and the gain of the neuron through the regulation of Hodgkin-Huxley-like conductances (Shin and Koch 1999) and the regulation of the neuron in response to perturbations of its conductances (Simoni and DeWeerth 1999). Our neuron circuit is a simple integrate-and-fire neuron, and our adaptation mechanisms have time constants of seconds to minutes.

Figure 1: Schematic of the neuron circuit with long time-constant mechanisms for presynaptic adaptation.
We also describe adaptation of the synaptic weight to presynaptic spiking rates. This presynaptic adaptation models the contrast gain-control curves of cortical simple cells (Ohzawa et al.). We fabricated two different circuits in a 2-μm CMOS process.
One circuit implements presynaptic adaptation and the other implements postsynaptic adaptation. The long time-constant adaptation mechanisms use tunneling and injection to remove charge from and to add charge onto a floating gate (Diorio et al.). We added these mechanisms to a simple integrate-and-fire neuron circuit (Mead 1989).
This circuit (shown in Figure 1) takes an input current, I_epsc, which charges up the membrane voltage, V_m. When the membrane voltage exceeds a threshold, the output of the neuron, V_o, spikes. The spiking rate of the neuron, f_o, is determined by the input current: f_o = m I_epsc, where the slope m = 1/(C ΔV) is set by the membrane capacitance C and the voltage swing ΔV needed to reach threshold.
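The proportionality between input current and firing rate can be checked with a short simulation. This is an idealized integrate-and-fire model, not a transistor-level model of the chip; the capacitance and threshold values below are hypothetical round numbers chosen so that 100 pA maps to roughly 100 Hz.

```python
def if_rate(i_epsc, c=1e-12, v_th=1.0, dt=1e-5, t_total=1.0):
    """Spike rate of an ideal integrate-and-fire neuron driven by a
    constant input current i_epsc (amps) over t_total seconds."""
    vm, spikes = 0.0, 0
    for _ in range(int(t_total / dt)):
        vm += (i_epsc / c) * dt   # input current charges the membrane
        if vm >= v_th:            # threshold crossing: spike and reset
            vm = 0.0
            spikes += 1
    return spikes / t_total

# f_o = m * I_epsc with m = 1/(C * V_th): doubling the current doubles the rate.
r1 = if_rate(100e-12)   # ~100 Hz with these parameters
r2 = if_rate(200e-12)   # ~200 Hz
```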
2 Adaptation mechanisms in the silicon neuron circuit

To permit continuous operation with only positive-polarity bias voltages, we use two distinct mechanisms to modify the floating-gate charges in our neuron circuits. We use Fowler-Nordheim tunneling through high-quality gate oxide to remove electrons from the floating gates (Lenzlinger and Snow 1969).
Here, we apply a large voltage across the oxide, which reduces the width of the Si-SiO2 energy barrier to such an extent that electrons are likely to tunnel through the barrier. The tunneling current is given approximately by

    I_tun = I_t0 e^{-V_0 / V_ox},

where V_ox = V_tun - V_fg is the voltage across the tunneling oxide and I_t0 and V_0 are measurable device parameters. For the 400-Å oxides that are typical of a 2-μm CMOS process, a typical value of V_0 is 1000 V, and an oxide voltage of about 30 V is required to obtain an appreciable tunneling current.
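These numbers can be plugged into the tunneling expression directly. The sketch below assumes the quoted device value (V_0 = 1000 V for a 400-Å oxide) and an arbitrary prefactor I_t0; it only illustrates how steeply the current depends on the oxide voltage.

```python
import math

def tunneling_current(v_tun, v_fg, i_t0=1.0, v0=1000.0):
    """Fowler-Nordheim tunneling current I_tun = I_t0 * exp(-V0 / V_ox),
    with V_ox = V_tun - V_fg.  i_t0 is an arbitrary prefactor here."""
    v_ox = v_tun - v_fg
    return i_t0 * math.exp(-v0 / v_ox)

# Raising the oxide voltage from 25 V to 30 V multiplies the current by
# exp(1000/25 - 1000/30) = exp(6.67), i.e. several hundred fold.
ratio = tunneling_current(30.0, 0.0) / tunneling_current(25.0, 0.0)
```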
We use subthreshold channel hot-electron injection in an nMOS transistor (Diorio, Minch, and Hasler 1999) to add electrons to the floating gates. In this process, electrons in the channel of the nMOS transistor accelerate in the high electric field that exists in the depletion region near the drain, gaining enough energy to surmount the Si-SiO2 energy barrier (about 3.1 eV). To facilitate the hot-electron injection process, we locally increase the substrate doping density of the nMOS transistor using the p-base layer that is normally used to form the base of a vertical npn bipolar transistor. The p-base substrate implant simultaneously increases the electric field at the drain end of the channel and increases the nMOS transistor's threshold voltage from 0.8 V to about 6 V, permitting subthreshold operation at gate voltages that permit the collection of the injected electrons by the floating gate.
The hot-electron injection current is, to a good approximation, proportional to the channel current and exponential in the drain-to-channel voltage.

Figure 2: Adaptation curves of synaptic efficacy to presynaptic frequencies using the long time-constant adaptation mechanisms (x-axis: presynaptic frequency, 50-350 Hz).
3 Experimental results

We measured the transient and steady-state spiking rates of the neuron around four different steady-state presynaptic rates of 100 Hz, 150 Hz, 200 Hz, and 250 Hz. In these measurements, the drain of the p-base injection transistor was set at 4 V and the tunneling voltage was set at about 35 V. For each steady-state presynaptic rate, we presented step increases and decreases in the presynaptic rate of 15 Hz, 30 Hz, 45 Hz, and 60 Hz. The instantaneous postsynaptic rate is plotted along one of the four steep curves in Figure 2. After every change in the presynaptic rate, we returned the presynaptic rate to its steady-state value before we presented the next change.
The transient gain of the curves decreases for higher input spiking rates. We also recorded the dynamics of the adaptation mechanisms by measuring the spiking rate of the neuron when the presynaptic frequency was decreased at time t=0 from 350 Hz to 300 Hz, as shown in Figure 3. The system adapts back to the initial output frequency with a time constant of minutes.
These data show that the synaptic efficacy adapted to a higher weight value over time. The time constant of adaptation can be increased by increasing either the tunneling voltage or the p-base injector's drain voltage, V_d.
Figure 3: Temporal adaptation of the spiking rate of the neuron to a decrease in the presynaptic frequency from 350 Hz to 300 Hz (x-axis: time, 0-900 s).
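The recovery in Figure 3 can be mimicked by a purely behavioral model of the presynaptic scheme: a synaptic weight w that drifts slowly until the output rate w * f_pre returns to a set point. The set-point value, time constant, and update rule below are hypothetical; only the qualitative behavior (an immediate drop, then a minutes-scale recovery) follows the measurement.

```python
def adapt(f_pre_trace, f_target, w0, tau=60.0, dt=0.1):
    """Output-rate trace of a toy homeostatic synapse; tau is the slow
    adaptation time constant in seconds."""
    w, out = w0, []
    for f_pre in f_pre_trace:
        f_out = w * f_pre
        out.append(f_out)
        # slow drift of the weight toward the output set point
        w += (f_target / f_pre - w) * (dt / tau)
    return out

# Step the presynaptic rate down from 350 Hz to 300 Hz at t = 10 s:
trace = [350.0] * 100 + [300.0] * 6000
out = adapt(trace, f_target=100.0, w0=100.0 / 350.0)
# The output drops right after the step, then recovers toward 100 Hz.
```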
4 Postsynaptic adaptation

In the second mechanism, the neuron's spiking rate determines the synaptic "threshold". The schematic of this adaptation circuitry is shown in Figure 4. The floating-gate p-base transistor provides a quiescent input to the neuron so that the neuron fires at a quiescent rate. The tunneling mechanism is always turned on, so the neuron's spiking rate increases over time if the neuron does not spike. The injection mechanism, however, turns on when the neuron spikes.
The time constant of these mechanisms is on the order of seconds to minutes. An increase in the floating-gate voltage is equivalent to a decrease in the synaptic threshold. If the neuron's activity is high, the injection mechanism turns on, decreasing the floating-gate voltage and hence the input current to the neuron. These two opposing mechanisms ensure that the cell remains at a constant activity under steady-state conditions. In other words, the threshold of the neuron is modulated by its output spiking rate: the threshold continuously decreases, and each output spike increases it.
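The push-pull between always-on tunneling and spike-gated injection can be illustrated with a rate-based sketch. The exponential mapping from floating-gate voltage to firing rate and all parameter values below are hypothetical; the point is only that the equilibrium rate is set by where the two charge fluxes balance, independent of the initial perturbation.

```python
import math

def settle(v_fg0, t_total=600.0, dt=0.01, tun_rate=0.02, inj_step=0.001):
    """Final firing rate of a rate-based push-pull model.  Tunneling raises
    v_fg at a constant rate; injection lowers it in proportion to the
    firing rate, standing in for spike-gated injection events."""
    v_fg = v_fg0
    for _ in range(int(t_total / dt)):
        v_fg += tun_rate * dt                # tunneling: always on
        rate = math.exp(v_fg)                # input current sets firing rate
        v_fg -= inj_step * rate * dt         # injection: gated by spikes
    return math.exp(v_fg)

# Equilibrium where the fluxes balance: inj_step * rate = tun_rate, so the
# rate settles near tun_rate / inj_step = 20 Hz from either perturbation.
high = settle(3.4)   # perturbed above equilibrium
low = settle(2.6)    # perturbed below equilibrium
```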
4.1 Steady-state analysis

Equation 1 can be used to solve for the steady-state floating-gate voltage, V_fg0, leading to the following expression for the steady-state input current:

    I_in0 = I_0pb e^{κ V_fg0 / U_T} = I_m / (f_0 τ)^γ,

where I_m is a preconstant and γ is close to 1.
4.2 Transient analysis

When a positive step voltage is applied to V_in, the step change, ΔV, is coupled into the floating gate. The initial transient current is

    I_in = I_in0 e^{κ ΔV / U_T},
Figure 4: Schematic of the neuron circuit with long time-constant mechanisms for postsynaptic adaptation (membrane voltage V_m, adaptation circuitry, tunneling current I_tun, floating-gate voltage V_fg).
and the initial increase in the postsynaptic firing rate is

    f_0 + df_0 = f_0 e^{κ ΔV / U_T}.

If we assume that the step input is V_in = log(f_i) (where f_i is the firing rate of the presynaptic neuron), then the change in the floating-gate voltage is ΔV = df_i / f_i. We then solve for df_0:

    df_0 / f_0 = e^{κ ΔV / U_T} - 1 ≈ (κ / U_T) (df_i / f_i).    (5)

Equation 5 shows that the transient change in the neuron's spiking rate is proportional to the input contrast in the firing rate. With time, the floating-gate voltage adapts back to the steady-state condition, so the spiking rate returns to f_0.
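Equation 5 can be checked numerically. The values of κ and U_T below are typical subthreshold numbers (assumed, not the chip's measured parameters); the check shows that with a logarithmically encoded input, the transient fractional change in the output rate depends only on the input contrast df_i/f_i, not on the base rate.

```python
import math

KAPPA, U_T = 0.7, 0.025   # assumed subthreshold slope factor and thermal voltage

def transient_fraction(f_i, df_i, f_o=20.0):
    """Fractional change in output rate for a small presynaptic rate step.
    With V_in = log(f_i), the step couples dV = df_i / f_i onto the gate."""
    dv = df_i / f_i
    return (f_o * math.exp(KAPPA * dv / U_T) - f_o) / f_o

# The same 1% contrast gives the same fractional response at any base rate;
# for small contrast it approaches (KAPPA / U_T) * (df_i / f_i) = 0.28.
a = transient_fraction(100.0, 1.0)
b = transient_fraction(300.0, 3.0)
```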
4.3 Experimental results

In these experiments, we set the tunneling voltage, V_tun, to 28 V and the injection voltage to about 6 V. We coupled a step decrease of 0.2 V into the floating-gate voltage and then measured the output frequency of the neuron over a period of 10 minutes. The output of this experiment is shown in Figure 5. The frequency dropped from about 19 Hz to 13 Hz, but the circuit adapted after this initial perturbation, and the spiking rate of the neuron returned to about 19 Hz over 26 min.
A similar experiment was performed, but this time a step increase of 0.2 V was coupled into the floating-gate node (also shown in Figure 5). Initially, the neuron's rate increased from 20 Hz to 28 Hz, but over a period of minutes the firing rate returned to 20 Hz.
5 Conclusion

In this work, we show how long time-constant adaptation mechanisms can be added to a silicon integrate-and-fire neuron in a standard CMOS process. These homeostatic mechanisms can be combined with short time-constant depressing synapses on the same neuron to provide a range of adaptation mechanisms. The presynaptic adaptation mechanism can also account for the contrast gain-control curves of cortical simple cells.
Figure 5: Response of the silicon neuron to an increase and a decrease of a step input of 0.2 V coupled into the floating gate (x-axis: time, 0-1600 s). The curve shows that the adaptation time constant is on the order of 10 min.
References

Desai, N. S., Rutherford, L. C., and Turrigiano, G. G. (1999). Plasticity in the intrinsic excitability of cortical pyramidal neurons. Nature Neuroscience.

Liu, Z., Golowasch, J., Marder, E., and Abbott, L. F. (1998). A model neuron with activity-dependent conductances regulated by multiple calcium sensors. Journal of Neuroscience.

Shin, J., and Koch, C. (1999). Dynamic range and sensitivity adaptation in a silicon spiking neuron. IEEE Transactions on Neural Networks.

Stemmler, M., and Koch, C. (1999). How voltage-dependent conductances can adapt to maximize the information encoded by neuronal firing rate. Nature Neuroscience.