
209 nips-2008-Short-Term Depression in VLSI Stochastic Synapse


Source: pdf

Author: Peng Xu, Timothy K. Horiuchi, Pamela A. Abshire

Abstract: We report a compact realization of short-term depression (STD) in a VLSI stochastic synapse. The behavior of the circuit is based on a subtractive single release model of STD. Experimental results agree well with simulation and exhibit expected STD behavior: the transmitted spike train has negative autocorrelation and lower power spectral density at low frequencies which can remove redundancy in the input spike train, and the mean transmission probability is inversely proportional to the input spike rate which has been suggested as an automatic gain control mechanism in neural systems. The dynamic stochastic synapse could potentially be a powerful addition to existing deterministic VLSI spiking neural systems. 1

Reference: text


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 We report a compact realization of short-term depression (STD) in a VLSI stochastic synapse. [sent-3, score-0.298]

2 The behavior of the circuit is based on a subtractive single release model of STD. [sent-4, score-0.342]

3 The dynamic stochastic synapse could potentially be a powerful addition to existing deterministic VLSI spiking neural systems. [sent-6, score-0.495]

4 Synaptic transmission is a stochastic process by nature. [sent-8, score-0.329]

5 It has been observed that at central synapses transmission proceeds in an all-or-none fashion with a certain probability. [sent-10, score-0.3]

6 The synaptic weight has been modeled as R = npq [1], where n is the number of quantal release sites, p is the probability of release per site, and q is some measure of the postsynaptic effect. [sent-11, score-0.523]
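
As a concrete illustration of the R = npq model above, here is a minimal Monte Carlo sketch in Python; the values of n, p, and q are hypothetical, chosen only to show that the mean quantal response converges to npq.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: n release sites, release probability p per site,
# postsynaptic effect q per released quantum.
n, p, q = 5, 0.3, 0.8

# Each presynaptic spike triggers a binomial number of quantal releases.
trials = 100_000
releases = rng.binomial(n, p, size=trials)
mean_response = q * releases.mean()

print(f"simulated: {mean_response:.3f}  analytic n*p*q: {n * p * q:.3f}")
```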

7 The synapse undergoes constant changes in order to learn from and adapt to the ever-changing outside world. [sent-12, score-0.304]

8 The various synaptic plasticities differ in their triggering conditions, time spans, and involvement of pre- and postsynaptic activity. [sent-13, score-0.22]

9 Regulation of the vesicle release probability has been considered as the underlying mechanism for various synaptic plasticities [1–3]. [sent-14, score-0.4]

10 The stochastic nature of neural computation has been investigated, and benefits of stochastic computation such as energy efficiency, communication efficiency, and computational efficiency have been shown [4–6]. [sent-15, score-0.316]

11 A VLSI stochastic synapse could provide a useful hardware tool to investigate the stochastic nature of the synapse and also function as the basic computing unit for VLSI implementations of stochastic neural computation. [sent-17, score-1.115]

12 Although adaptive deterministic VLSI synapses have been extensively studied and developed for neurally inspired VLSI learning systems [8–13], stochastic synapses have been difficult to implement in VLSI because it is hard to properly harness the probabilistic behavior, normally provided by noise. [sent-18, score-0.416]

13 Although stochastic behavior in integrated circuits has been investigated in the context of random number generators (RNGs) [14], these circuits either are too complicated to use for a stochastic synapse or suffer from poor randomness. [sent-19, score-0.786]

14 Stochastic transmission was implemented in software using a lookup table and a pseudo-random number generator [15]. [sent-21, score-0.211]

15 Stochastic transition between potentiation and depression has been demonstrated in bistable synapses driven by stochastic spiking behavior at the network level for stochastic learning [16]. [sent-22, score-0.617]

16 Experimental results demonstrated true randomness as well as adjustable transmission probability. [sent-24, score-0.233]

17 The implementation with ∼15 transistors is compact given these added features, although far more compact deterministic synapses exist, with as few as five transistors. [sent-25, score-0.263]

18 We also proposed a method to implement plasticity and demonstrated an implementation of STD by modulating the probability of spike transmission. [sent-26, score-0.501]

19 In this paper we extend the subtractive single release model of STD to the VLSI stochastic synapse. [sent-28, score-0.339]

20 We describe a novel compact VLSI implementation of a stochastic synapse with STD and demonstrate extensive experimental results showing the agreement with both simulation and theory over a range of conditions and biases. [sent-30, score-0.574]

21 2 VLSI Stochastic Synapse and Plasticity. Figure 1: Schematic of the stochastic synapse with STD. [sent-31, score-0.462]

22 Previously we demonstrated a compact stochastic synapse circuit exhibiting true randomness and consuming very little power (10-44 µW). [sent-32, score-0.778]

23 When a presynaptic spike arrives, Vpre∼ goes low, and transistor M5 shuts off. [sent-37, score-0.409]

24 Vtran either goes high (with probability p) or stays low (with probability 1 − p) during an input spike, emulating stochastic transmission. [sent-42, score-0.284]

25 Fabrication mismatch in an uncompensated stochastic synapse circuit would likely permanently bias the circuit to one solution. [sent-43, score-0.784]

26 By controlling the common-mode voltage of the floating gates, we operate the circuit such that hot-electron injection occurs only on the side where the output voltage is close to ground. [sent-45, score-0.489]

27 Over multiple clock cycles hot-electron injection works in negative feedback to equalize the floating gate voltages, bringing the circuit into stochastic operation. [sent-46, score-0.501]

28 The procedure can be halted to achieve a specific probability or allowed to reach equilibrium (50% transmission probability). [sent-47, score-0.235]

29 The transmission probability can be adjusted by changing the input offset or the floating gate charges. [sent-48, score-0.428]

30 The transmission probability follows p = 0.5 [1 + erf((v − µ)/(√2 δ))], where µ is the input offset voltage for p = 50%, δ is the standard deviation characterizing the spread of the probability tuning, and v = Vi− − Vi+ is the input offset voltage. [sent-51, score-0.402]
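
A minimal sketch of this erf tuning curve in Python (using scipy.special.erf); the µ and δ values are illustrative assumptions, not measurements from the chip.

```python
import numpy as np
from scipy.special import erf

def transmission_probability(v, mu=0.0, delta=0.01):
    """Gaussian-CDF tuning of the stochastic synapse:
    p = 0.5 * (1 + erf((v - mu) / (sqrt(2) * delta))),
    where v = Vi- - Vi+ is the input offset voltage."""
    return 0.5 * (1.0 + erf((v - mu) / (np.sqrt(2.0) * delta)))

# Sweep the input offset over +/- 5 sigma (hypothetical 10 mV spread).
v = np.linspace(-0.05, 0.05, 11)
print(np.round(transmission_probability(v), 3))
```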

31 Short-term depression is triggered by the transmitted input spikes Vtran to emulate the probability decrease because of vesicle depletion. [sent-54, score-0.421]

32 Short-term facilitation is triggered by the input spikes Vpre to emulate the probability increase because of presynaptic Ca2+ accumulation. [sent-55, score-0.292]

33 3 Short-Term Depression: Model and Simulation Although long-term plasticity has attracted much attention because of its apparent association with learning and memory, the functional role of short-term plasticity has only recently begun to be understood. [sent-58, score-0.18]

34 Recent evidence suggests that short-term synaptic plasticity is involved in many functions such as gain control [17], phase shift [18], coincidence detection, and network reconfiguration [19]. [sent-59, score-0.277]

35 It has also been shown that depressing stochastic synapses can increase information transmission efficiency by filtering out redundancy in presynaptic spike trains [5]. [sent-60, score-1.039]

36 Activity-dependent short-term changes in synaptic efficacy at the macroscopic level are determined by activity-dependent changes in vesicle release probability at the microscopic level. [sent-61, score-0.367]

37 Since there is a finite pool of vesicles, and released vesicles cannot be replenished immediately, a successful release triggered by one spike potentially reduces the probability of release triggered by the next spike. [sent-64, score-0.745]

38 We propose an STD model based on our VLSI stochastic synapse that closely emulates the simple subtractive single release model [5, 20]. [sent-65, score-0.643]

39 A presynaptic spike that is transmitted reduces the input offset voltage v at the VLSI stochastic synapse by ∆v, so that the transmission probability p(t) is reduced. [sent-66, score-1.332]

40 pss ≈ pmax/(1 + a∆v τd r) ≈ pmax/(a∆v τd r) ∝ 1/r (6). Therefore the steady-state mean probability is inversely proportional to the input spike rate when a∆v τd r ≫ 1. [sent-74, score-0.543]
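
The steady-state relation can be checked with a small simulation of the subtractive single release model on the erf tuning curve above; all parameters (∆v, τd, δ, the regular spike input) are assumptions chosen only to exhibit the 1/r scaling, not the fabricated circuit's values.

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(1)

def mean_probability(rate, dv=5e-3, tau_d=0.05, delta=0.01, n_spikes=20_000):
    """Subtractive single-release STD: every transmitted spike lowers the
    offset v by dv; between spikes v recovers toward 0 with time constant
    tau_d. Input spikes arrive regularly at `rate` Hz, as in the tests."""
    v, dt, transmitted = 0.0, 1.0 / rate, 0
    for _ in range(n_spikes):
        p = 0.5 * (1.0 + erf(v / (np.sqrt(2.0) * delta)))
        if rng.random() < p:        # stochastic transmission with probability p
            transmitted += 1
            v -= dv                 # depression on a successful release
        v *= np.exp(-dt / tau_d)    # exponential recovery toward equilibrium
    return transmitted / n_spikes

for r in (100, 200, 400, 800):      # mean p should fall roughly as 1/r
    print(r, round(mean_probability(r), 3))
```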

41 Figures 2(a) and 2(b) show that the mean probability is a linear function of the inverse of the input spike rate at various ∆v and τd for high input spike rates. [sent-83, score-0.807]

42 Figure 4 shows that the output spike train exhibits negative autocorrelation at small time intervals and lower power spectral density (PSD) at low frequencies. [sent-88, score-0.756]

43 Figure 2: Mean probability as a function of input spike rate from simulation. [sent-118, score-0.436]

44 Figure 4: Characterization of the output spike train from the simulation of the stochastic synapse with STD. [sent-137, score-0.892]

45 4 VLSI Implementation of Short-Term Depression. We implemented this model using the stochastic synapse circuit described above (see Fig. 1). [sent-139, score-0.623]

46 To change the transmission probability we only need to modulate one side of the input, in this case Vi− . [sent-142, score-0.204]

47 The resistor and capacitor provide for exponential recovery of the voltage to its equilibrium value. [sent-143, score-0.2]

48 The input Vi− is modulated by transistors M6 and M7 based on the result of the previous spike transmission. [sent-144, score-0.404]

49 Every time a spike is transmitted successfully, a pulse with height Vh and width Tp is generated at Vp . [sent-145, score-0.592]

50 This pulse discharges the capacitor with a small current determined by Vw and reduces Vi− by a small amount, thus decreasing the transmission probability. [sent-147, score-0.386]

51 The value of the tunable resistors is controlled by the gate voltage of the pFETs, Vr . [sent-148, score-0.272]

52 The update mechanism would then be driven by the presynaptic spike rather than the successfully transmitted spike. [sent-154, score-0.447]

53 The extra components on the left provide for future implementation of short-term facilitation and also symmetrize the stochastic synapse, improving its randomness. [sent-155, score-0.224]

54 The layout size of the stochastic synapse is 151. [sent-158, score-0.493]

55 As a proof of concept, the layout of the circuit is quite conservative. [sent-163, score-0.192]

56 The circuit uses a nominal power supply of 5 V for normal operation. [sent-165, score-0.282]

57 The differential pair comparator uses a separate power supply for hot-electron injection. [sent-166, score-0.219]

58 Each floating-gate pFET has a tunnelling structure, which is a source-drain connected pFET with its gate connected to the floating node. [sent-167, score-0.207]

59 A separate power supply provides the tunnelling voltage to the shorted source and drain (tunnelling node). [sent-168, score-0.323]

60 When the tunnelling voltage is high enough (∼14-15 V), electrons tunnel through the silicon dioxide, from the floating gate to the tunnelling node. [sent-169, score-0.479]

61 Alternatively, ultraviolet (UV) activated conductances may be used to remove electrons from the gate, avoiding the need for special power supplies. [sent-171, score-0.216]

62 We raise the power supply of the differential pair comparator to 5. [sent-174, score-0.219]

63 We use the negative feedback operation of hot-electron injection described above to automatically program the circuit into its stochastic regime. [sent-176, score-0.377]

64 During this procedure, STD is disabled, so that the probability at this operating point is the synaptic transmission probability without any dynamics. [sent-178, score-0.384]

65 We use a signal generator to generate pulse signals which serve as input spikes. [sent-180, score-0.265]

66 Although spike trains are better modeled by Poisson arrivals, the averaging behavior should be similar for deterministic spike trains, which make testing easier. [sent-181, score-0.754]

67 The power consumption of the STD block is much smaller than the stochastic synapse. [sent-183, score-0.217]

68 We collect output spikes from the depressing stochastic synapse at an input spike rate of 100 Hz. [sent-185, score-1.051]

69 We divide time into bins according to the input spike rate so that in each bin there is either 1 or 0 output spike. [sent-186, score-0.435]

70 In this way, we convert the output spike train into a bit sequence s(k). [sent-187, score-0.385]
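
A sketch of this binning and characterization pipeline in Python; the spike times below are a random placeholder standing in for chip recordings, and the Welch PSD estimate is one reasonable choice rather than the authors' exact method.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(2)

def spikes_to_bits(spike_times, rate, duration):
    """Bin output spikes at the input spike rate; each bin then holds
    0 or 1 output spike, giving the bit sequence s(k)."""
    edges = np.arange(0.0, duration + 0.5 / rate, 1.0 / rate)
    counts, _ = np.histogram(spike_times, edges)
    return (counts > 0).astype(float)

def autocorrelation(s, max_lag=50):
    """Sample autocorrelation of the zero-mean sequence; lag 0 is the variance."""
    x = s - s.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1 : len(x) + max_lag]
    return acf / len(x)

# Placeholder output spike train at a 100 Hz input rate over 100 s.
rate, duration = 100.0, 100.0
spike_times = np.sort(rng.uniform(0.0, duration, int(0.3 * rate * duration)))

s = spikes_to_bits(spike_times, rate, duration)
acf = autocorrelation(s)
freqs, psd = welch(s - s.mean(), fs=rate, nperseg=256)  # PSD over the 0-50 Hz band
print(acf[:3], psd[:3])
```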

71 Figure 5 shows the autocorrelation of the output spike trains at two different Vr. [sent-192, score-0.659]

72 Figure 6 shows the PSD of the output spike trains from the same data shown in Fig. 5. [sent-195, score-0.409]

73 The time constant of STD increases with Vr, so that the larger Vr is, the longer the period of negative autocorrelation and the lower the frequencies where power is reduced. [sent-198, score-0.309]

74 Notice that the autocorrelation and PSD for Vr = 1. [sent-200, score-0.25]

75 Normally redundant information is represented by positive autocorrelation in the time domain, which is characterized by power at low frequencies. [sent-203, score-0.309]

76 By reducing the low frequency component of the spike train, redundant information is suppressed and overall information transmission efficiency is improved. [sent-204, score-0.482]

77 If the negative autocorrelation of the synaptic dynamics matches the positive autocorrelation in the input spike train, the redundancy is cancelled and the output is uncorrelated [5]. [sent-205, score-1.089]

78 02 0 50 10 20 30 Intervals 40 50 Figure 5: Autocorrelation of output spike trains from the VLSI stochastic synapse with STD for an input spike rate of 100 Hz. [sent-221, score-1.274]

79 Autocorrelation at zero time represents the sequence variance, and negative autocorrelation at short time intervals indicates STD. [sent-222, score-0.312]

80 Figure 6: Power spectral density (PSD in dB versus frequency, 0-50 Hz) of output spike trains from the VLSI stochastic synapse with STD for an input spike rate of 100 Hz. [sent-225, score-1.274]

81 We collect output spikes in response to 10^4 input spikes at input spike rates from 100 Hz to 1000 Hz with 100 Hz intervals. [sent-227, score-0.571]

82 Figure 7(a) shows that the mean transmission probability is inversely proportional to the input spike rate for various pulse widths when the rate is high enough. [sent-229, score-0.844]

83 By scaling the probability with the input spike rate, the synapse tends to normalize the DC component of the input frequency and preserve the neuron's dynamic range, thus avoiding saturation due to fast-firing presynaptic neurons and retaining sensitivity to less frequently firing neurons [17]. [sent-231, score-0.833]

84 The slope of mean probability decreases as the pulse width increases. [sent-232, score-0.297]

85 Since the pulse width determines the discharging time of the capacitor at Vi−, the larger the pulse width, the larger ∆v is and the smaller the slope is. [sent-233, score-0.529]

86 The discharging current is approximately constant, thus ∆v is proportional to the pulse width. [sent-236, score-0.215]
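
As a worked example of this proportionality, here is a short sketch of the constant-current discharge relation ∆v = I·Tp/C; the current and capacitance values are assumptions, not taken from the paper.

```python
# Constant-current discharge of the capacitor at Vi-: dv = I * Tp / C.
I_w = 1e-9    # discharge current set by Vw (A); assumed value
C = 1e-12     # capacitance at the Vi- node (F); assumed value

for tp_us in (10, 20, 30, 40, 50):       # the pulse widths tested above
    dv = I_w * (tp_us * 1e-6) / C        # dv grows linearly with pulse width
    print(f"Tp = {tp_us} us -> dv = {dv * 1e3:.0f} mV")
```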

87 01 20 30 40 50 Pulse width (µs) 1/r (a) Mean probability as a function of input spike rate for pulse width Tp =10, 20, 30, 40, 50 µs. [sent-255, score-0.691]

88 Figure 7: Steady-state behavior of the VLSI stochastic synapse with STD for different pulse widths. [sent-262, score-0.627]

89 As Vr increases, the slope of the mean transmission probability as a linear function of 1/r decreases. [sent-264, score-0.258]

90 Figure 8(a) shows that a∆vτd is approximately an exponential function of Vr, indicating that the equivalent resistance R of the pFET is approximately exponential in its gate voltage Vr. [sent-267, score-0.243]

91 For Vw , the slope of mean transmission probability decreases as Vw increases. [sent-268, score-0.258]

92 Figure 8(b) shows that a∆vτd is approximately an exponential function of Vw, indicating that the discharging current from transistor M6 is approximately exponential in its gate voltage Vw. [sent-271, score-0.326]

93 6 Conclusion We designed and tested a VLSI stochastic synapse with short-term depression. [sent-305, score-0.462]

94 The behavior of the depressing synapse agrees with theoretical predictions and simulation. [sent-306, score-0.404]

95 It is a good candidate to bring randomness and rich dynamics into VLSI spiking neural systems, such as for rate-independent coincidence detection of Poisson spike trains. [sent-309, score-0.446]

96 However, the application of such dynamic stochastic synapses in large networks still remains a challenge. [sent-310, score-0.287]

97 Tsodyks, “An algorithm for modifying neurotransmitter release probability based on pre- and postsynaptic spike timing,” Neural Computation, vol. [sent-327, score-0.55]

98 Zador, “Dynamic stochastic synapses as computational units,” Neural Comput. [sent-333, score-0.287]

99 Abbott, “Redundancy reduction and sustained firing with stochastic depressing synapses,” J. [sent-343, score-0.258]

100 Douglas, “A VLSI array of low-power spiking neurons and bistable synapses with spike-timing dependent plasticity,” IEEE Trans. [sent-414, score-0.195]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('vlsi', 0.324), ('spike', 0.311), ('synapse', 0.304), ('autocorrelation', 0.25), ('vr', 0.212), ('vw', 0.2), ('transmission', 0.171), ('std', 0.166), ('pulse', 0.165), ('circuit', 0.161), ('stochastic', 0.158), ('oating', 0.15), ('synaptic', 0.147), ('hz', 0.147), ('release', 0.137), ('synapses', 0.129), ('gate', 0.124), ('voltage', 0.119), ('psd', 0.107), ('depression', 0.106), ('depressing', 0.1), ('pfet', 0.1), ('vo', 0.1), ('mv', 0.093), ('plasticity', 0.09), ('tunnelling', 0.083), ('circuits', 0.083), ('vi', 0.076), ('transmitted', 0.071), ('vicm', 0.067), ('vpre', 0.067), ('vtran', 0.067), ('trains', 0.066), ('presynaptic', 0.065), ('randomness', 0.062), ('supply', 0.062), ('vmax', 0.062), ('intervals', 0.062), ('input', 0.06), ('power', 0.059), ('ms', 0.059), ('injection', 0.058), ('spikes', 0.054), ('slope', 0.054), ('capacitor', 0.05), ('comparator', 0.05), ('discharging', 0.05), ('erf', 0.05), ('vesicle', 0.05), ('differential', 0.048), ('triggered', 0.047), ('width', 0.045), ('simulation', 0.045), ('subtractive', 0.044), ('train', 0.042), ('dotted', 0.041), ('postsynaptic', 0.04), ('generator', 0.04), ('inversely', 0.04), ('coincidence', 0.04), ('offset', 0.04), ('poisson', 0.04), ('redundancy', 0.039), ('tp', 0.037), ('pmax', 0.037), ('silicon', 0.037), ('abbott', 0.035), ('compact', 0.034), ('modulating', 0.034), ('arrivals', 0.033), ('bistable', 0.033), ('chicca', 0.033), ('dnp', 0.033), ('electron', 0.033), ('electrons', 0.033), ('facilitation', 0.033), ('ibias', 0.033), ('indiveri', 0.033), ('nelson', 0.033), ('pfets', 0.033), ('plasticities', 0.033), ('recon', 0.033), ('subthreshold', 0.033), ('transistor', 0.033), ('transistors', 0.033), ('vdd', 0.033), ('vesicles', 0.033), ('vg', 0.033), ('spiking', 0.033), ('implementation', 0.033), ('probability', 0.033), ('output', 0.032), ('rate', 0.032), ('equilibrium', 0.031), ('layout', 0.031), ('steady', 0.03), ('vh', 0.029), ('quantal', 0.029), ('neurotransmitter', 0.029), ('resistors', 0.029)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.99999988 209 nips-2008-Short-Term Depression in VLSI Stochastic Synapse

Author: Peng Xu, Timothy K. Horiuchi, Pamela A. Abshire

Abstract: We report a compact realization of short-term depression (STD) in a VLSI stochastic synapse. The behavior of the circuit is based on a subtractive single release model of STD. Experimental results agree well with simulation and exhibit expected STD behavior: the transmitted spike train has negative autocorrelation and lower power spectral density at low frequencies which can remove redundancy in the input spike train, and the mean transmission probability is inversely proportional to the input spike rate which has been suggested as an automatic gain control mechanism in neural systems. The dynamic stochastic synapse could potentially be a powerful addition to existing deterministic VLSI spiking neural systems. 1

2 0.25059673 220 nips-2008-Spike Feature Extraction Using Informative Samples

Author: Zhi Yang, Qi Zhao, Wentai Liu

Abstract: This paper presents a spike feature extraction algorithm that targets real-time spike sorting and facilitates miniaturized microchip implementation. The proposed algorithm has been evaluated on synthesized waveforms and experimentally recorded sequences. When compared with many spike sorting approaches our algorithm demonstrates improved speed, accuracy and allows unsupervised execution. A preliminary hardware implementation has been realized using an integrated microchip interfaced with a personal computer. 1

3 0.19489355 81 nips-2008-Extracting State Transition Dynamics from Multiple Spike Trains with Correlated Poisson HMM

Author: Kentaro Katahira, Jun Nishikawa, Kazuo Okanoya, Masato Okada

Abstract: Neural activity is non-stationary and varies across time. Hidden Markov Models (HMMs) have been used to track the state transition among quasi-stationary discrete neural states. Within this context, independent Poisson models have been used for the output distribution of HMMs; hence, the model is incapable of tracking the change in correlation without modulating the firing rate. To achieve this, we applied a multivariate Poisson distribution with correlation terms for the output distribution of HMMs. We formulated a Variational Bayes (VB) inference for the model. The VB could automatically determine the appropriate number of hidden states and correlation types while avoiding the overlearning problem. We developed an efficient algorithm for computing posteriors using the recursive relationship of a multivariate Poisson distribution. We demonstrated the performance of our method on synthetic data and a real spike train recorded from a songbird. 1

4 0.17486472 204 nips-2008-Self-organization using synaptic plasticity

Author: Vicençc Gómez, Andreas Kaltenbrunner, Vicente López, Hilbert J. Kappen

Abstract: Large networks of spiking neurons show abrupt changes in their collective dynamics resembling phase transitions studied in statistical physics. An example of this phenomenon is the transition from irregular, noise-driven dynamics to regular, self-sustained behavior observed in networks of integrate-and-fire neurons as the interaction strength between the neurons increases. In this work we show how a network of spiking neurons is able to self-organize towards a critical state for which the range of possible inter-spike-intervals (dynamic range) is maximized. Self-organization occurs via synaptic dynamics that we analytically derive. The resulting plasticity rule is defined locally so that global homeostasis near the critical state is achieved by local regulation of individual synapses. 1

5 0.16848053 137 nips-2008-Modeling Short-term Noise Dependence of Spike Counts in Macaque Prefrontal Cortex

Author: Arno Onken, Steffen Grünewälder, Matthias Munk, Klaus Obermayer

Abstract: Correlations between spike counts are often used to analyze neural coding. The noise is typically assumed to be Gaussian. Yet, this assumption is often inappropriate, especially for low spike counts. In this study, we present copulas as an alternative approach. With copulas it is possible to use arbitrary marginal distributions such as Poisson or negative binomial that are better suited for modeling noise distributions of spike counts. Furthermore, copulas place a wide range of dependence structures at the disposal and can be used to analyze higher order interactions. We develop a framework to analyze spike count data by means of copulas. Methods for parameter inference based on maximum likelihood estimates and for computation of mutual information are provided. We apply the method to our data recorded from macaque prefrontal cortex. The data analysis leads to three findings: (1) copula-based distributions provide significantly better fits than discretized multivariate normal distributions; (2) negative binomial margins fit the data significantly better than Poisson margins; and (3) the dependence structure carries 12% of the mutual information between stimuli and responses. 1

6 0.1516531 59 nips-2008-Dependent Dirichlet Process Spike Sorting

7 0.10976325 16 nips-2008-Adaptive Template Matching with Shift-Invariant Semi-NMF

8 0.10681402 38 nips-2008-Bio-inspired Real Time Sensory Map Realignment in a Robotic Barn Owl

9 0.098025411 89 nips-2008-Gates

10 0.088659793 230 nips-2008-Temporal Difference Based Actor Critic Learning - Convergence and Neural Implementation

11 0.087118603 58 nips-2008-Dependence of Orientation Tuning on Recurrent Excitation and Inhibition in a Network Model of V1

12 0.082889885 160 nips-2008-On Computational Power and the Order-Chaos Phase Transition in Reservoir Computing

13 0.081237912 43 nips-2008-Cell Assemblies in Large Sparse Inhibitory Networks of Biologically Realistic Spiking Neurons

14 0.070280246 166 nips-2008-On the asymptotic equivalence between differential Hebbian and temporal difference learning using a local third factor

15 0.06570331 78 nips-2008-Exact Convex Confidence-Weighted Learning

16 0.052482281 8 nips-2008-A general framework for investigating how far the decoding process in the brain can be simplified

17 0.050063845 90 nips-2008-Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity

18 0.048113231 25 nips-2008-An interior-point stochastic approximation method and an L1-regularized delta rule

19 0.042797197 96 nips-2008-Hebbian Learning of Bayes Optimal Decisions

20 0.041135941 27 nips-2008-Artificial Olfactory Brain for Mixture Identification


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, -0.122), (1, 0.109), (2, 0.226), (3, 0.252), (4, -0.195), (5, 0.022), (6, 0.042), (7, -0.09), (8, -0.112), (9, -0.035), (10, 0.014), (11, -0.006), (12, -0.003), (13, -0.053), (14, 0.026), (15, -0.056), (16, -0.004), (17, -0.044), (18, -0.031), (19, 0.005), (20, 0.044), (21, -0.014), (22, -0.066), (23, 0.053), (24, 0.009), (25, 0.027), (26, -0.024), (27, -0.065), (28, 0.032), (29, 0.029), (30, 0.045), (31, 0.006), (32, -0.067), (33, 0.02), (34, -0.202), (35, -0.004), (36, -0.034), (37, -0.005), (38, 0.026), (39, -0.022), (40, -0.004), (41, 0.042), (42, 0.012), (43, 0.042), (44, 0.037), (45, 0.005), (46, 0.055), (47, -0.066), (48, 0.036), (49, -0.02)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.97032076 209 nips-2008-Short-Term Depression in VLSI Stochastic Synapse

Author: Peng Xu, Timothy K. Horiuchi, Pamela A. Abshire

Abstract: We report a compact realization of short-term depression (STD) in a VLSI stochastic synapse. The behavior of the circuit is based on a subtractive single release model of STD. Experimental results agree well with simulation and exhibit expected STD behavior: the transmitted spike train has negative autocorrelation and lower power spectral density at low frequencies which can remove redundancy in the input spike train, and the mean transmission probability is inversely proportional to the input spike rate which has been suggested as an automatic gain control mechanism in neural systems. The dynamic stochastic synapse could potentially be a powerful addition to existing deterministic VLSI spiking neural systems. 1

2 0.83053392 220 nips-2008-Spike Feature Extraction Using Informative Samples

Author: Zhi Yang, Qi Zhao, Wentai Liu

Abstract: This paper presents a spike feature extraction algorithm that targets real-time spike sorting and facilitates miniaturized microchip implementation. The proposed algorithm has been evaluated on synthesized waveforms and experimentally recorded sequences. When compared with many spike sorting approaches our algorithm demonstrates improved speed, accuracy and allows unsupervised execution. A preliminary hardware implementation has been realized using an integrated microchip interfaced with a personal computer. 1

3 0.79374921 59 nips-2008-Dependent Dirichlet Process Spike Sorting

Author: Jan Gasthaus, Frank Wood, Dilan Gorur, Yee W. Teh

Abstract: In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle “appearance” and “disappearance” of neurons. Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground truth labeling is known. 1

4 0.70500726 204 nips-2008-Self-organization using synaptic plasticity

Author: Vicençc Gómez, Andreas Kaltenbrunner, Vicente López, Hilbert J. Kappen

Abstract: Large networks of spiking neurons show abrupt changes in their collective dynamics resembling phase transitions studied in statistical physics. An example of this phenomenon is the transition from irregular, noise-driven dynamics to regular, self-sustained behavior observed in networks of integrate-and-fire neurons as the interaction strength between the neurons increases. In this work we show how a network of spiking neurons is able to self-organize towards a critical state for which the range of possible inter-spike-intervals (dynamic range) is maximized. Self-organization occurs via synaptic dynamics that we analytically derive. The resulting plasticity rule is defined locally so that global homeostasis near the critical state is achieved by local regulation of individual synapses. 1

5 0.69349092 81 nips-2008-Extracting State Transition Dynamics from Multiple Spike Trains with Correlated Poisson HMM

Author: Kentaro Katahira, Jun Nishikawa, Kazuo Okanoya, Masato Okada

Abstract: Neural activity is non-stationary and varies across time. Hidden Markov Models (HMMs) have been used to track the state transition among quasi-stationary discrete neural states. Within this context, independent Poisson models have been used for the output distribution of HMMs; hence, the model is incapable of tracking the change in correlation without modulating the firing rate. To achieve this, we applied a multivariate Poisson distribution with correlation terms for the output distribution of HMMs. We formulated a Variational Bayes (VB) inference for the model. The VB could automatically determine the appropriate number of hidden states and correlation types while avoiding the overlearning problem. We developed an efficient algorithm for computing posteriors using the recursive relationship of a multivariate Poisson distribution. We demonstrated the performance of our method on synthetic data and a real spike train recorded from a songbird. 1

6 0.66803968 16 nips-2008-Adaptive Template Matching with Shift-Invariant Semi-NMF

7 0.60320741 38 nips-2008-Bio-inspired Real Time Sensory Map Realignment in a Robotic Barn Owl

8 0.50134182 137 nips-2008-Modeling Short-term Noise Dependence of Spike Counts in Macaque Prefrontal Cortex

9 0.48630857 43 nips-2008-Cell Assemblies in Large Sparse Inhibitory Networks of Biologically Realistic Spiking Neurons

10 0.41116807 58 nips-2008-Dependence of Orientation Tuning on Recurrent Excitation and Inhibition in a Network Model of V1

11 0.40625787 230 nips-2008-Temporal Difference Based Actor Critic Learning - Convergence and Neural Implementation

12 0.38559562 90 nips-2008-Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity

13 0.37694591 160 nips-2008-On Computational Power and the Order-Chaos Phase Transition in Reservoir Computing

14 0.36561027 27 nips-2008-Artificial Olfactory Brain for Mixture Identification

15 0.30382743 8 nips-2008-A general framework for investigating how far the decoding process in the brain can be simplified

16 0.30191073 166 nips-2008-On the asymptotic equivalence between differential Hebbian and temporal difference learning using a local third factor

17 0.2909919 110 nips-2008-Kernel-ARMA for Hand Tracking and Brain-Machine interfacing During 3D Motor Control

18 0.28281218 89 nips-2008-Gates

19 0.24371928 222 nips-2008-Stress, noradrenaline, and realistic prediction of mouse behaviour using reinforcement learning

20 0.23040721 78 nips-2008-Exact Convex Confidence-Weighted Learning


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(4, 0.027), (6, 0.062), (7, 0.04), (12, 0.014), (25, 0.031), (28, 0.086), (30, 0.415), (57, 0.06), (59, 0.014), (63, 0.048), (71, 0.052), (77, 0.031), (83, 0.037)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.78655732 209 nips-2008-Short-Term Depression in VLSI Stochastic Synapse

Author: Peng Xu, Timothy K. Horiuchi, Pamela A. Abshire

Abstract: We report a compact realization of short-term depression (STD) in a VLSI stochastic synapse. The behavior of the circuit is based on a subtractive single release model of STD. Experimental results agree well with simulation and exhibit expected STD behavior: the transmitted spike train has negative autocorrelation and lower power spectral density at low frequencies which can remove redundancy in the input spike train, and the mean transmission probability is inversely proportional to the input spike rate which has been suggested as an automatic gain control mechanism in neural systems. The dynamic stochastic synapse could potentially be a powerful addition to existing deterministic VLSI spiking neural systems. 1

2 0.57333487 147 nips-2008-Multiscale Random Fields with Application to Contour Grouping

Author: Longin J. Latecki, Chengen Lu, Marc Sobel, Xiang Bai

Abstract: We introduce a new interpretation of multiscale random fields (MSRFs) that admits efficient optimization in the framework of regular (single level) random fields (RFs). It is based on a new operator, called append, that combines sets of random variables (RVs) to single RVs. We assume that a MSRF can be decomposed into disjoint trees that link RVs at different pyramid levels. The append operator is then applied to map RVs in each tree structure to a single RV. We demonstrate the usefulness of the proposed approach on a challenging task involving grouping contours of target shapes in images. It provides a natural representation of multiscale contour models, which is needed in order to cope with unstable contour decompositions. The append operator allows us to find optimal image segment labels using the classical framework of relaxation labeling. Alternative methods like Markov Chain Monte Carlo (MCMC) could also be used.

3 0.31681868 62 nips-2008-Differentiable Sparse Coding

Author: J. A. Bagnell, David M. Bradley

Abstract: Prior work has shown that features which appear to be biologically plausible as well as empirically useful can be found by sparse coding with a prior such as a laplacian (L1 ) that promotes sparsity. We show how smoother priors can preserve the benefits of these sparse priors while adding stability to the Maximum A-Posteriori (MAP) estimate that makes it more useful for prediction problems. Additionally, we show how to calculate the derivative of the MAP estimate efficiently with implicit differentiation. One prior that can be differentiated this way is KL-regularization. We demonstrate its effectiveness on a wide variety of applications, and find that online optimization of the parameters of the KL-regularized model can significantly improve prediction performance. 1

4 0.31225368 75 nips-2008-Estimating vector fields using sparse basis field expansions

Author: Stefan Haufe, Vadim V. Nikulin, Andreas Ziehe, Klaus-Robert Müller, Guido Nolte

Abstract: We introduce a novel framework for estimating vector fields using sparse basis field expansions (S-FLEX). The notion of basis fields, which are an extension of scalar basis functions, arises naturally in our framework from a rotational invariance requirement. We consider a regression setting as well as inverse problems. All variants discussed lead to second-order cone programming formulations. While our framework is generally applicable to any type of vector field, we focus in this paper on applying it to solving the EEG/MEG inverse problem. It is shown that significantly more precise and neurophysiologically more plausible location and shape estimates of cerebral current sources from EEG/MEG measurements become possible with our method when comparing to the state-of-the-art. 1

5 0.3118099 27 nips-2008-Artificial Olfactory Brain for Mixture Identification

Author: Mehmet K. Muezzinoglu, Alexander Vergara, Ramon Huerta, Thomas Nowotny, Nikolai Rulkov, Henry Abarbanel, Allen Selverston, Mikhail Rabinovich

Abstract: The odor transduction process has a large time constant and is susceptible to various types of noise. Therefore, the olfactory code at the sensor/receptor level is in general a slow and highly variable indicator of the input odor in both natural and artificial situations. Insects overcome this problem by using a neuronal device in their Antennal Lobe (AL), which transforms the identity code of olfactory receptors to a spatio-temporal code. This transformation improves the decision of the Mushroom Bodies (MBs), the subsequent classifier, in both speed and accuracy. Here we propose a rate model based on two intrinsic mechanisms in the insect AL, namely integration and inhibition. Then we present a MB classifier model that resembles the sparse and random structure of insect MB. A local Hebbian learning procedure governs the plasticity in the model. These formulations not only help to understand the signal conditioning and classification methods of insect olfactory systems, but also can be leveraged in synthetic problems. Among them, we consider here the discrimination of odor mixtures from pure odors. We show on a set of records from metal-oxide gas sensors that the cascade of these two new models facilitates fast and accurate discrimination of even highly imbalanced mixtures from pure odors. 1

6 0.31131932 24 nips-2008-An improved estimator of Variance Explained in the presence of noise

7 0.31093565 204 nips-2008-Self-organization using synaptic plasticity

8 0.30960196 238 nips-2008-Theory of matching pursuit

9 0.30888969 131 nips-2008-MDPs with Non-Deterministic Policies

10 0.30837122 202 nips-2008-Robust Regression and Lasso

11 0.30819574 245 nips-2008-Unlabeled data: Now it helps, now it doesn't

12 0.30799758 96 nips-2008-Hebbian Learning of Bayes Optimal Decisions

13 0.30732262 85 nips-2008-Fast Rates for Regularized Objectives

14 0.30539036 78 nips-2008-Exact Convex Confidence-Weighted Learning

15 0.30502498 79 nips-2008-Exploring Large Feature Spaces with Hierarchical Multiple Kernel Learning

16 0.30471432 16 nips-2008-Adaptive Template Matching with Shift-Invariant Semi-NMF

17 0.30422783 162 nips-2008-On the Design of Loss Functions for Classification: theory, robustness to outliers, and SavageBoost

18 0.30369711 179 nips-2008-Phase transitions for high-dimensional joint support recovery

19 0.30341345 29 nips-2008-Automatic online tuning for fast Gaussian summation

20 0.30332652 194 nips-2008-Regularized Learning with Networks of Features