nips nips2006 nips2006-197 knowledge-graph by maker-knowledge-mining

197 nips-2006-Uncertainty, phase and oscillatory hippocampal recall


Source: pdf

Author: Máté Lengyel, Peter Dayan

Abstract: Many neural areas, notably, the hippocampus, show structured, dynamical, population behavior such as coordinated oscillations. It has long been observed that such oscillations provide a substrate for representing analog information in the firing phases of neurons relative to the underlying population rhythm. However, it has become increasingly clear that it is essential for neural populations to represent uncertainty about the information they capture, and the substantial recent work on neural codes for uncertainty has omitted any analysis of oscillatory systems. Here, we observe that, since neurons in an oscillatory network need not only fire once in each cycle (or even at all), uncertainty about the analog quantities each neuron represents by its firing phase might naturally be reported through the degree of concentration of the spikes that it fires. We apply this theory to memory in a model of oscillatory associative recall in hippocampal area CA3. Although it is not well treated in the literature, representing and manipulating uncertainty is fundamental to competent memory; our theory enables us to view CA3 as an effective uncertainty-aware retrieval system.

Reference: text


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 Uncertainty, phase and oscillatory hippocampal recall Máté Lengyel and Peter Dayan Gatsby Computational Neuroscience Unit University College London 17 Queen Square, London WC1N 3AR, United Kingdom {lmate,dayan}@gatsby. [sent-1, score-0.614]

2 It has long been observed that such oscillations provide a substrate for representing analog information in the firing phases of neurons relative to the underlying population rhythm. [sent-5, score-0.768]

3 However, it has become increasingly clear that it is essential for neural populations to represent uncertainty about the information they capture, and the substantial recent work on neural codes for uncertainty has omitted any analysis of oscillatory systems. [sent-6, score-0.39]

4 Here, we observe that, since neurons in an oscillatory network need not only fire once in each cycle (or even at all), uncertainty about the analog quantities each neuron represents by its firing phase might naturally be reported through the degree of concentration of the spikes that it fires. [sent-7, score-1.712]

5 We apply this theory to memory in a model of oscillatory associative recall in hippocampal area CA3. [sent-8, score-0.616]

6 Although it is not well treated in the literature, representing and manipulating uncertainty is fundamental to competent memory; our theory enables us to view CA3 as an effective uncertainty-aware retrieval system. [sent-9, score-0.337]

7 Most groups working on the theory of spiking oscillatory networks have considered only the second of these – this is true, for instance, of Hopfield’s work on olfactory representations [2] and Yoshioka’s [3] and Lengyel & Dayan’s work [4] on analog associative memories in CA3. [sent-11, score-0.433]

8 Since neurons really do fire more or less than one spike per cycle, and furthermore in a way that can be informationally rich [5, 6], this poses a key question as to what the other dimensions convey. [sent-12, score-0.448]

9 The number of spikes per cycle is an obvious analog of a conventional firing rate. [sent-13, score-0.407]

10 Single neurons can convey the certainty of a binary proposition by firing more or less strongly [10, 11]; a whole population can use firing rates to convey uncertainty about a collectively-coded analog quantity [12]. [sent-15, score-0.911]

11 However, if neurons can fire multiple spikes per cycle, then the degree to which the spikes are concentrated around a mean phase is an additional channel for representing information. [sent-16, score-1.15]

12 Concentration is not merely an abstract quantity; rather we can expect that the effect of the neuron on its postsynaptic partners will be strongly influenced by the burstiness of the spikes, an effect apparent, for instance, in the complex time-courses of short term synaptic dynamics. [sent-17, score-0.303]

13 Here, we suggest that concentration codes for the uncertainty about phase – highly concentrated spiking represents high certainty about the mean phase in the cycle. [sent-18, score-1.049]

14 One might wonder whether uncertainty is actually important for the cases of oscillatory processing that have been identified. [sent-19, score-0.245]

15 One key computation for spiking oscillatory networks is memory retrieval [3, 4]. [sent-20, score-0.534]

16 Although it is not often viewed this way, memory retrieval is a genuinely probabilistic task [13, 14], with the complete answer to a retrieval query not being a single memory pattern, but rather a distribution over memory patterns. [sent-21, score-0.961]

17 This is because at the time of the query the memory device only has access to incomplete information regarding the memory trace that needs to be recalled. [sent-22, score-0.522]

18 Most importantly, the way memory traces are stored in the synaptic weight matrix implies a lossy data compression algorithm, and therefore the original patterns cannot be decompressed at retrieval with absolute certainty. [sent-23, score-0.616]
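
To make the compression argument concrete, the sketch below superimposes several memories in a single weight matrix; the additive Hebbian rule with a cosine kernel on pairwise phase differences is an assumed, illustrative choice, not the paper's actual learning rule. Because every trace is summed into the same matrix, individual patterns interfere and cannot be decompressed exactly at recall.

    import numpy as np

    def store_patterns(X, Z, T=1.0):
        """Lossy storage sketch. X: (M, N) firing phases in [0, T);
        Z: (M, N) binary participation; T: oscillation period."""
        M, N = X.shape
        W = np.zeros((N, N))
        for m in range(M):
            dphi = X[m][:, None] - X[m][None, :]   # pairwise phase differences
            # assumed cosine kernel; only co-active pairs contribute
            W += np.cos(2.0 * np.pi * dphi / T) * np.outer(Z[m], Z[m])
        np.fill_diagonal(W, 0.0)                   # no self-connections
        return W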

19 In this paper, we first describe how oscillatory structures can use all three activity characteristics at their disposal to represent two pieces of information and two forms of uncertainty (Section 2). [sent-24, score-0.34]

20 We then suggest that this representational scheme is appropriate as a model of uncertainty-aware probabilistic recall in CA3. [sent-25, score-0.166]

21 We show in numerical simulations that the derived dynamics lead to competent memory retrieval, supplemented by uncertainty signals that are predictive of retrieval errors (Section 4). [sent-27, score-0.65]

22 2 Representation. Single cell: The heart of our proposal is a suggestion for how to interpret the activity of a single neuron in a single oscillatory cycle (such as a theta-cycle in the hippocampus) as representing a probability distribution. [sent-28, score-0.606]

23 We treat the implied distribution over the true phase x as being conditional on z. [sent-32, score-0.375]

24 However, if z = 1, then the distribution over x is a mixture of q(x), a uniform distribution on [0, T), and a narrow, quasi-delta distribution q⊥(x; φ) (of width much smaller than T) around the mean firing phase (φ) of the spikes. [sent-34, score-0.316]

25 The natural alternative is to consider an approximation in which neurons make independent contributions, with marginals as in equation 2. [sent-39, score-0.391]

26 A) A neuron’s firing times during a period [0, T) are described by three parameters: r, the number of spikes; φ, the mean phase of those spikes; and c, the phase concentration. [sent-41, score-0.632]

27 C) If z = 1, then φ and c jointly define a distribution over phase which is a mixture (weighted by γ(c)) of a distribution peaked at φ and a uniform distribution. [sent-43, score-0.316]
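
As a concrete illustration of this mixture code, the Python sketch below evaluates the implied density over phase for one neuron. The peaked component is modelled as a wrapped von Mises density, a stand-in for the paper's narrow quasi-delta q⊥; the squashing γ(c) = c/(1 + c) and the concentration scaling are assumptions made only for this example.

    import numpy as np

    def phase_density(x, phi, c, T=1.0):
        """Density over firing phase x in [0, T): a gamma(c)-weighted
        mixture of a component peaked at phi and a uniform distribution
        on [0, T) (cf. equation 2)."""
        gamma = c / (1.0 + c)                 # assumed squashing of c into [0, 1)
        theta = 2.0 * np.pi * (x - phi) / T   # phase difference on the circle
        kappa = 10.0 * c                      # hypothetical concentration scaling
        peaked = np.exp(kappa * np.cos(theta)) / (T * np.i0(kappa))
        return gamma * peaked + (1.0 - gamma) * (1.0 / T)

At c = 0 this reduces to the uniform density (maximal phase uncertainty); as c grows, probability mass concentrates around the mean phase φ.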

28 Dynamics: When the actual distribution P that the population has to represent lies outside the class of representable distributions Q in equation 3 with independent marginals, a key computational step is to find activity parameters φ, c, r for the neurons that make Q as close to P as possible. [sent-45, score-0.58]

29 1 We have thus suggested a general representational framework, in which the specification of a computational task amounts to defining a P distribution which the network should represent as best as possible. [sent-48, score-0.167]

30 Equation 5 then defines the dynamics of the interaction between the neurons that optimizes the network’s approximation. [sent-49, score-0.462]
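
A minimal numerical sketch of that optimization, assuming the divergence KL(Q||P) is available as a black-box function of the stacked activity parameters; the paper instead derives closed-form update rules for φ, c and r, so this is only a stand-in for the dynamics of equation 5.

    import numpy as np

    def descend_kl(kl, params, eta=0.05, steps=200, eps=1e-4):
        """Gradient flow on kl(params), with the gradient estimated by
        central finite differences rather than derived analytically."""
        params = np.asarray(params, dtype=float).copy()
        for _ in range(steps):
            grad = np.zeros_like(params)
            for i in range(params.size):
                bump = np.zeros_like(params)
                bump[i] = eps
                grad[i] = (kl(params + bump) - kl(params - bump)) / (2.0 * eps)
            params -= eta * grad   # move Q's parameters towards the best approximation of P
        return params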

31 3 CA3 memory: One of the most widely considered tasks that recurrent neural networks need to solve is that of autoassociative memory storage and retrieval. [sent-50, score-0.633]

32 Moreover, hippocampal area CA3, which is thought to play a key role in memory processing, exhibits oscillatory dynamics in which firing phases are known to play an important functional role. [sent-51, score-0.762]

33 We characterize the activity in CA3 neurons during recall as representing the probability distribution over memories being recalled. [sent-53, score-0.771]

34 Treating storage from a statistical perspective, we use Bayes rule to define a posterior distribution over the memory pattern implied by a noisy and partial cue. [sent-54, score-0.384]
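
For the binary rate component alone, the Bayes step is simple enough to sketch directly. The function below, with flip rates eta0 and eta1 (the cue inaccuracies that the simulations describe as known to the dynamics in the form of η0 and η1), returns P(z_i = 1 | cue); the paper's full posterior of equation 8 additionally conditions on the synaptic weight matrix, which is omitted here.

    import numpy as np

    def rate_posterior(z_cue, p_z=0.5, eta0=0.1, eta1=0.1):
        """P(z_i = 1 | noisy binary cue) per neuron: active neurons are
        mis-reported with probability eta1, silent ones with probability
        eta0, under a Bernoulli(p_z) prior on participation."""
        z_cue = np.asarray(z_cue)
        like1 = np.where(z_cue == 1, 1.0 - eta1, eta1)   # P(cue | z = 1)
        like0 = np.where(z_cue == 1, eta0, 1.0 - eta0)   # P(cue | z = 0)
        return like1 * p_z / (like1 * p_z + like0 * (1.0 - p_z))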

35 This distribution is represented approximately by the activities φi , ri , ci of the neurons in the network as in equation 3. [sent-55, score-0.521]

36 Recurrent dynamics among the neurons as in equation 5 find appropriate values of these parameters, and model network interactions during recall in CA3. [sent-56, score-0.696]

37 a firing phase xi ∈ [0, T), where T is the period of the population oscillation. [sent-59, score-0.383]

38 Posterior for memory recall: Following [14, 4], we characterize retrieval in terms of the posterior distribution over x, z given three sources of information: a recall cue (x̃, z̃), the synaptic weight matrix, and the prior over the memories. [sent-61, score-0.751]

39 Dynamics for memory recall: Plugging the posterior from equation 8 into the general dynamics equation 5 yields the neuronal update rules that will be appropriate for uncertainty-aware memory recall, and which we treat as a model of recurrent dynamics in CA3. [sent-67, score-1.028]

40 These dynamics generalize, and thus inherit, some of the characteristics of the purely phase-based network suggested in [4]. [sent-70, score-0.206]

41 This means that they also inherit the match with physiologically-measured phase response curves (PRCs) from in vitro CA3 neurons that were measured to test this suggestion [16]. [sent-71, score-0.825]

42 The key difference here is that we expect the magnitude (though not the shape) of the influence of a presynaptic neuron on the phase of a postsynaptic one to scale with its rate, for high concentration. [sent-72, score-0.577]

43 Preliminary in vitro results show that PRCs recorded in response to burst stimulation are not qualitatively different from PRCs induced by single spikes; however, it remains to be seen if their magnitude scales in the way implied by the dynamics here. [sent-73, score-0.343]

44 Time evolution of firing phases (left panels), concentrations (middle panels), and rates (right panels) of neurons that should (top row) or should not (bottom row) participate in the memory pattern being retrieved. [sent-80, score-0.963]

45 Note that firing phases in the top row are plotted as a difference from the stored firing phases so that φ = 0 means perfect retrieval. [sent-81, score-0.421]

46 Color code shows precision (blue: low, yellow: high) of the phase of the input to neurons, with red lines showing cells receiving incorrect input rate. [sent-82, score-0.352]

47 4 Simulations: Figure 2 shows the course of recall in the full network (with N = 100 neurons, and 10 stored patterns with pz = 0. [sent-83, score-0.448]

48 The top left panel shows that neurons that should fire in the memory trace (i.e. for which z = 1) quickly converge on their correct phase, and that this convergence usually takes a longer time for neurons receiving more uncertain input. [sent-86, score-1.006]

49 This is paralleled by the way their firing concentrations change (top middle panel): neurons with reliable input immediately increase their concentrations from the initial γ(c) = 0.5 value to γ(c) = 1, while for those having more unreliable input it takes a longer time to build up confidence about their firing phases (and by the time they become confident their phases are indeed correct). [sent-87, score-0.498] [sent-88, score-0.365]

51 Finally, since the firing rate input to the network is 90% correct, most neurons that should or should not fire do or do not fire, respectively, with maximal certainty about their rate (top and bottom right panels). [sent-90, score-0.672]

52 In particular, we may expect there to be a relationship between the actual error in the phase of firing of the neurons recalled by the memory, and the firing rates and concentrations (in the form of burst strengths) of the associated neurons themselves. [sent-92, score-1.434]

53 Here, we have sorted the neurons according to their burst strengths λ, and plotted histograms of errors in firing phase for each group. [sent-95, score-0.82]
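
A sketch of that analysis step (variable names hypothetical): neurons are split into groups by burst strength, and the spread of phase errors is summarized per group, here by the circular standard deviation rather than full histograms.

    import numpy as np

    def error_spread_by_burst(phase_err, burst, n_groups=4):
        """phase_err: per-neuron phase errors in radians; burst: per-neuron
        burst strengths lambda. Returns one circular standard deviation per
        group, ordered from weakest to strongest bursts."""
        order = np.argsort(burst)
        spreads = []
        for group in np.array_split(order, n_groups):
            R = np.abs(np.mean(np.exp(1j * phase_err[group])))  # resultant length
            spreads.append(np.sqrt(-2.0 * np.log(R)))           # circular std
        return spreads  # the theory predicts larger spread for weaker bursts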

54 The lower the burst strength, the more likely are large errors – at least to an approximation. [sent-96, score-0.148]

55 A similar relationship exists between recalled (analogue) and stored (binary) firing rates, where extreme values of the recalled firing rate indicate that the stored firing rate was 0 or 1 with higher certainty (Figure 3B). [sent-97, score-0.652]

56 He recorded neurons in hippocampal area CA1 (not CA3, although we may hope for some similar properties) whilst rats were shuttling on a linear track for food reward. [sent-99, score-0.461]

57 CA1 neurons have place fields – locations in the environment where they respond with spikes – and the phases of these spikes relative to the ongoing theta oscillation in the hippocampus are also known to convey information about location in space [5]. [sent-100, score-1.171]

58 To create the plot, we first selected epochs with high-quality and high-power theta activity in the hippocampus (to ensure that phase is well estimated). [sent-101, score-0.578]

59 We then computed the mean firing phase within the theta cycle, φ, of each neuron as a function of the location of the rat, separately for each visit to the same location. [sent-102, score-0.572]

60 We assumed that the ‘true’ phase x a neuron should recall at a given location is the average of these phases across different visits. [sent-103, score-0.759]
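
Both the per-visit mean phase φ and the across-visit ‘true’ phase x are circular averages; a plain arithmetic mean would be wrong for spikes straddling the cycle boundary. A minimal helper, assuming phases are given in [0, T):

    import numpy as np

    def circular_mean(phases, T):
        """Mean phase of within-cycle spike times, computed on the circle
        by averaging unit vectors and taking the resultant angle."""
        angles = 2.0 * np.pi * np.asarray(phases) / T
        mean_angle = np.angle(np.mean(np.exp(1j * angles)))
        return (mean_angle % (2.0 * np.pi)) * T / (2.0 * np.pi)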

61 [Figure 3 axes: stored firing rate (A); error in firing phase.] [sent-135, score-0.527]

62 Figure 3: Uncertainty signals are predictive of the error a cell is making both in simulation (A,B), and as recorded from behaving animals (C). [sent-140, score-0.432]

63 Burst strength signals overall uncertainty about, and thus predicts error in, the mean firing phase (A,C), while graded firing rates signal certainty about whether to fire or not (B). [sent-141, score-0.667]

64 We then evaluated the error a neuron was making at a given location on a given visit as the difference between its φ in that trial at that location and the ‘true’ phase x associated with that location. [sent-142, score-0.522]

65 This allowed us to compute statistics of the error in phase as a function of the burst strength. [sent-143, score-0.464]

66 The curves in the figure show that, as for the simulation, burst strength is at least partly inversely correlated with actual phase error, defined in terms of the overall activity in the population. [sent-144, score-0.643]

67 One further way to evaluate the memory is to compare it to two existing associative memories that have previously been studied, and can be seen as special cases. [sent-146, score-0.452]

68 On one hand, our memory adds the dimension of phase to the uncertainty-aware rate-based memory that Sommer & Dayan [14] studied. [sent-147, score-0.838]

69 This memory made a somewhat similar variational approximation, but, as for the meanfield Boltzmann machine [17], only involving r and ρ(r) and no phases. [sent-148, score-0.261]

70 On the other hand, the memory device can be seen as adding the dimension of rate to the phase-based memory that Lengyel & Dayan [4] treated. [sent-149, score-0.563]

71 Given these roots, we can follow the logic in figure 4 and compare the performance of our memory with these precursors in the cases for which they are designed. [sent-152, score-0.261]

72 For instance, to compare with the rate-based network, we construct memories which include phase information. [sent-153, score-0.45]

73 During recall, we present cues with relatively accurate rates, but relatively inaccurate phases, and evaluate the extent to which the network is perturbed by the presence of the phases (which, of course, it has to store in the single set of synaptic weights). [sent-154, score-0.397]

74 Here, a relatively small network (N = 100) was used, with memories that are dense (pz = 0. [sent-156, score-0.234]

75 Performance is evaluated by calculating the average error made in recalled firing rates. [sent-158, score-0.135]

76 In the figure, the two blue curves are for the full model (with the phase information in the input being relatively unreliable, its circular concentration parameter distributed uniformly between 0.1 and 10 across cells); the two yellow curves are for a network with only rates (which is similar to that described, but not simulated, by Sommer & Dayan [14]). [sent-159, score-0.502] [sent-160, score-0.228]

78 Exactly the same rate information is provided to all networks, and is 10% inaccurate (a degree known to the dynamics in the form of η0 and η1 ). [sent-161, score-0.147]

79 The two flat dashed lines show the performance in the case that there are no recurrent synaptic weights at all. [sent-162, score-0.205]

80 This shows that the phase information, and the existence of phase uncertainty and processing during recall, does not … [sent-165, score-0.749]

81 [Figure 4 legend: coded model; full model w/o learning; full model.] [sent-177, score-0.121]

82 Figure 4: Recall performance compared with a rate-only network (A) and a phase-only network (B); x-axes show the number of stored patterns (1 to 1000). [sent-187, score-0.44]

83 Given its small size, the network is quite competent as an auto-associator. [sent-191, score-0.177]

84 Figure 4B shows a similar comparison between this network and a network that only has to deal with uncertainty in firing phases but not in rates. [sent-192, score-0.486]

85 Again, its performance at recalling phase, given uncertain and noisy phase cues, but good rate-cues, is exactly on a par with the pure, phase-based network. [sent-193, score-0.404]

86 Further, the average errors are only modest, so the capacity of the network for storing analog phases is also impressive. [sent-194, score-0.388]

87 5 Discussion: We have considered an interpretation of the activities of neurons in oscillating structures such as area CA3 of the hippocampus as representing distributions over two underlying quantities, one binary and one analogue. [sent-195, score-0.591]

88 We also showed how this representational capacity can be used to excellent effect in the key, uncertainty-sensitive computation of memory recall, an operation in which CA3 is known to be involved. [sent-196, score-0.328]

89 The resulting network model of CA3 encompasses critical aspects of its physiological properties, notably information-bearing firing rates and phases. [sent-197, score-0.152]

90 Further, since it generalizes earlier theories of purely phase-based memories, this model is also consistent with the measured phase response curves of CA3 neurons, which characterize their actual dynamical interactions. [sent-198, score-0.385]

91 In vitro experiments along the lines of those carried out before [16], in which we have precise experimental control over pre- and post-synaptic activity, can be used to test these predictions. [sent-201, score-0.189]

92 Further, making the sort of assumptions that underlie figure 3C, we can use data from awake behaving rats to see if the gross statistics of the changes in the activity of the neurons fit the expectations licensed by the theory. [sent-202, score-0.516]

93 From a computational perspective, we have demonstrated that the network is a highly competent associative memory, correctly recalling both binary and analog information, along with certainty about it, and degrading gracefully in the face of overload. [sent-203, score-0.481]

94 In fact, compared with the representation of other analogue quantities (such as the orientation of a visually presented bar), analogue memory actually poses a particularly tough problem for the representation of uncertainty. [sent-204, score-0.461]

95 By contrast, for analogue memory, each neuron has an independent analogue value, and so the dimensionality of the distribution scales with the number of neurons involved. [sent-206, score-0.701]

96 This extra representational power comes from the ability of neurons to distribute their spikes within a cycle to indicate their uncertainty about phase (using the dimension of time in just the same way that distributional population codes [12] used the dimension of neural space). [sent-207, score-1.273]

97 This dimension for representing analogue uncertainty is coupled to that of the firing rate for representing binary uncertainty, since neurons have to fire multiple times in a cycle to have a measurable lack of concentration. [sent-208, score-0.83]

98 However, this coupling is exactly appropriate given the form of the distribution assumed in equation 2, since weakly firing neurons express only weak certainty about phase in any case. [sent-209, score-0.868]

99 In fact, it is conceivable that we could combine a different model for the firing rate uncertainty with this model for analogue uncertainty, if, for instance, it is found that neuronal firing rates covary in ways that are not anticipated from equation 2. [sent-210, score-0.33]

100 Exactly this seems to characterize the interaction between the hippocampus and the neocortex during both consolidation and retrieval [18, 19]. [sent-212, score-0.239]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('ring', 0.387), ('neurons', 0.356), ('phase', 0.316), ('memory', 0.261), ('spikes', 0.199), ('neuron', 0.175), ('phases', 0.169), ('burst', 0.148), ('recalled', 0.135), ('memories', 0.134), ('certainty', 0.134), ('oscillatory', 0.128), ('cycle', 0.123), ('hippocampus', 0.117), ('uncertainty', 0.117), ('dynamics', 0.106), ('zi', 0.101), ('network', 0.1), ('recall', 0.099), ('synaptic', 0.097), ('activity', 0.095), ('retrieval', 0.089), ('dayan', 0.087), ('analogue', 0.085), ('firing', 0.085), ('analog', 0.085), ('lengyel', 0.084), ('stored', 0.083), ('concentration', 0.083), ('competent', 0.077), ('recurrent', 0.072), ('pz', 0.071), ('concentrations', 0.071), ('hippocampal', 0.071), ('representational', 0.067), ('population', 0.067), ('spike', 0.065), ('coded', 0.061), ('prcs', 0.058), ('vitro', 0.058), ('associative', 0.057), ('representing', 0.054), ('rates', 0.052), ('zj', 0.051), ('convey', 0.05), ('sommer', 0.05), ('theta', 0.05), ('traces', 0.049), ('cue', 0.048), ('strength', 0.048), ('panels', 0.044), ('rate', 0.041), ('yellow', 0.04), ('wij', 0.04), ('storage', 0.039), ('francesco', 0.039), ('gure', 0.038), ('oscillations', 0.037), ('circular', 0.037), ('patterns', 0.037), ('curves', 0.036), ('lines', 0.036), ('equation', 0.035), ('re', 0.034), ('storing', 0.034), ('oscillating', 0.034), ('yoshioka', 0.034), ('rats', 0.034), ('wilson', 0.034), ('coordinated', 0.034), ('hop', 0.034), ('characterize', 0.033), ('uncertain', 0.033), ('implied', 0.031), ('cues', 0.031), ('oscillation', 0.031), ('visit', 0.031), ('suggestion', 0.031), ('postsynaptic', 0.031), ('behaving', 0.031), ('quantities', 0.03), ('full', 0.03), ('activities', 0.03), ('spiking', 0.029), ('recalling', 0.028), ('inherit', 0.028), ('presynaptic', 0.028), ('pattern', 0.028), ('codes', 0.028), ('treat', 0.028), ('course', 0.028), ('key', 0.027), ('neurosci', 0.027), ('unreliable', 0.027), ('exactly', 0.027), ('concentrated', 0.026), ('dt', 0.026), ('participate', 0.026), ('xi', 0.025), ('posterior', 0.025)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 1.0 197 nips-2006-Uncertainty, phase and oscillatory hippocampal recall

Author: Máté Lengyel, Peter Dayan

Abstract: Many neural areas, notably, the hippocampus, show structured, dynamical, population behavior such as coordinated oscillations. It has long been observed that such oscillations provide a substrate for representing analog information in the firing phases of neurons relative to the underlying population rhythm. However, it has become increasingly clear that it is essential for neural populations to represent uncertainty about the information they capture, and the substantial recent work on neural codes for uncertainty has omitted any analysis of oscillatory systems. Here, we observe that, since neurons in an oscillatory network need not only fire once in each cycle (or even at all), uncertainty about the analog quantities each neuron represents by its firing phase might naturally be reported through the degree of concentration of the spikes that it fires. We apply this theory to memory in a model of oscillatory associative recall in hippocampal area CA3. Although it is not well treated in the literature, representing and manipulating uncertainty is fundamental to competent memory; our theory enables us to view CA3 as an effective uncertainty-aware retrieval system.

2 0.44066143 187 nips-2006-Temporal Coding using the Response Properties of Spiking Neurons

Author: Thomas Voegtlin

Abstract: In biological neurons, the timing of a spike depends on the timing of synaptic currents, in a way that is classically described by the Phase Response Curve. This has implications for temporal coding: an action potential that arrives on a synapse has an implicit meaning, that depends on the position of the postsynaptic neuron on the firing cycle. Here we show that this implicit code can be used to perform computations. Using theta neurons, we derive a spike-timing dependent learning rule from an error criterion. We demonstrate how to train an auto-encoder neural network using this rule. 1

3 0.3220385 59 nips-2006-Context dependent amplification of both rate and event-correlation in a VLSI network of spiking neurons

Author: Elisabetta Chicca, Giacomo Indiveri, Rodney J. Douglas

Abstract: Cooperative competitive networks are believed to play a central role in cortical processing and have been shown to exhibit a wide set of useful computational properties. We propose a VLSI implementation of a spiking cooperative competitive network and show how it can perform context dependent computation both in the mean firing rate domain and in spike timing correlation space. In the mean rate case the network amplifies the activity of neurons belonging to the selected stimulus and suppresses the activity of neurons receiving weaker stimuli. In the event correlation case, the recurrent network amplifies with a higher gain the correlation between neurons which receive highly correlated inputs while leaving the mean firing rate unaltered. We describe the network architecture and present experimental data demonstrating its context dependent computation capabilities. 1

4 0.30316681 36 nips-2006-Attentional Processing on a Spike-Based VLSI Neural Network

Author: Yingxue Wang, Rodney J. Douglas, Shih-Chii Liu

Abstract: The neurons of the neocortex communicate by asynchronous events called action potentials (or ’spikes’). However, for simplicity of simulation, most models of processing by cortical neural networks have assumed that the activations of their neurons can be approximated by event rates rather than taking account of individual spikes. The obstacle to exploring the more detailed spike processing of these networks has been reduced considerably in recent years by the development of hybrid analog-digital Very-Large Scale Integrated (hVLSI) neural networks composed of spiking neurons that are able to operate in real-time. In this paper we describe such a hVLSI neural network that performs an interesting task of selective attentional processing that was previously described for a simulated ’pointer-map’ rate model by Hahnloser and colleagues. We found that most of the computational features of their rate model can be reproduced in the spiking implementation; but, that spike-based processing requires a modification of the original network architecture in order to memorize a previously attended target. 1

5 0.26905856 154 nips-2006-Optimal Change-Detection and Spiking Neurons

Author: Angela J. Yu

Abstract: Survival in a non-stationary, potentially adversarial environment requires animals to detect sensory changes rapidly yet accurately, two oft competing desiderata. Neurons subserving such detections are faced with the corresponding challenge to discern “real” changes in inputs as quickly as possible, while ignoring noisy fluctuations. Mathematically, this is an example of a change-detection problem that is actively researched in the controlled stochastic processes community. In this paper, we utilize sophisticated tools developed in that community to formalize an instantiation of the problem faced by the nervous system, and characterize the Bayes-optimal decision policy under certain assumptions. We will derive from this optimal strategy an information accumulation and decision process that remarkably resembles the dynamics of a leaky integrate-and-fire neuron. This correspondence suggests that neurons are optimized for tracking input changes, and sheds new light on the computational import of intracellular properties such as resting membrane potential, voltage-dependent conductance, and post-spike reset voltage. We also explore the influence that factors such as timing, uncertainty, neuromodulation, and reward should and do have on neuronal dynamics and sensitivity, as the optimal decision strategy depends critically on these factors. 1

6 0.23977651 99 nips-2006-Information Bottleneck Optimization and Independent Component Extraction with Spiking Neurons

7 0.18884088 145 nips-2006-Neurophysiological Evidence of Cooperative Mechanisms for Stereo Computation

8 0.15722302 18 nips-2006-A selective attention multi--chip system with dynamic synapses and spiking neurons

9 0.13215336 162 nips-2006-Predicting spike times from subthreshold dynamics of a neuron

10 0.1182742 148 nips-2006-Nonlinear physically-based models for decoding motor-cortical population activity

11 0.096822187 189 nips-2006-Temporal dynamics of information content carried by neurons in the primary visual cortex

12 0.08769469 17 nips-2006-A recipe for optimizing a time-histogram

13 0.08762458 192 nips-2006-Theory and Dynamics of Perceptual Bistability

14 0.086621813 190 nips-2006-The Neurodynamics of Belief Propagation on Binary Markov Random Fields

15 0.081276104 71 nips-2006-Effects of Stress and Genotype on Meta-parameter Dynamics in Reinforcement Learning

16 0.076328129 16 nips-2006-A Theory of Retinal Population Coding

17 0.06772352 143 nips-2006-Natural Actor-Critic for Road Traffic Optimisation

18 0.065920256 165 nips-2006-Real-time adaptive information-theoretic optimization of neurophysiology experiments

19 0.054041281 12 nips-2006-A Probabilistic Algorithm Integrating Source Localization and Noise Suppression of MEG and EEG data

20 0.04948185 51 nips-2006-Clustering Under Prior Knowledge with Application to Image Segmentation


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, -0.212), (1, -0.538), (2, 0.004), (3, 0.104), (4, 0.052), (5, 0.074), (6, 0.018), (7, 0.069), (8, 0.013), (9, -0.034), (10, -0.029), (11, 0.01), (12, -0.02), (13, 0.023), (14, 0.021), (15, 0.018), (16, 0.015), (17, -0.004), (18, -0.028), (19, 0.049), (20, -0.032), (21, 0.006), (22, 0.004), (23, 0.009), (24, 0.013), (25, 0.016), (26, -0.037), (27, 0.027), (28, -0.006), (29, -0.047), (30, -0.008), (31, -0.058), (32, 0.014), (33, -0.046), (34, 0.028), (35, -0.089), (36, -0.018), (37, -0.037), (38, -0.023), (39, 0.007), (40, -0.011), (41, 0.074), (42, -0.053), (43, 0.038), (44, 0.004), (45, -0.019), (46, 0.007), (47, -0.012), (48, 0.061), (49, -0.037)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.97787231 197 nips-2006-Uncertainty, phase and oscillatory hippocampal recall

Author: Máté Lengyel, Peter Dayan

Abstract: Many neural areas, notably, the hippocampus, show structured, dynamical, population behavior such as coordinated oscillations. It has long been observed that such oscillations provide a substrate for representing analog information in the firing phases of neurons relative to the underlying population rhythm. However, it has become increasingly clear that it is essential for neural populations to represent uncertainty about the information they capture, and the substantial recent work on neural codes for uncertainty has omitted any analysis of oscillatory systems. Here, we observe that, since neurons in an oscillatory network need not only fire once in each cycle (or even at all), uncertainty about the analog quantities each neuron represents by its firing phase might naturally be reported through the degree of concentration of the spikes that it fires. We apply this theory to memory in a model of oscillatory associative recall in hippocampal area CA3. Although it is not well treated in the literature, representing and manipulating uncertainty is fundamental to competent memory; our theory enables us to view CA3 as an effective uncertainty-aware retrieval system.

2 0.90694201 36 nips-2006-Attentional Processing on a Spike-Based VLSI Neural Network

Author: Yingxue Wang, Rodney J. Douglas, Shih-Chii Liu

Abstract: The neurons of the neocortex communicate by asynchronous events called action potentials (or ’spikes’). However, for simplicity of simulation, most models of processing by cortical neural networks have assumed that the activations of their neurons can be approximated by event rates rather than taking account of individual spikes. The obstacle to exploring the more detailed spike processing of these networks has been reduced considerably in recent years by the development of hybrid analog-digital Very-Large Scale Integrated (hVLSI) neural networks composed of spiking neurons that are able to operate in real-time. In this paper we describe such a hVLSI neural network that performs an interesting task of selective attentional processing that was previously described for a simulated ’pointer-map’ rate model by Hahnloser and colleagues. We found that most of the computational features of their rate model can be reproduced in the spiking implementation; but, that spike-based processing requires a modification of the original network architecture in order to memorize a previously attended target. 1

3 0.90372545 187 nips-2006-Temporal Coding using the Response Properties of Spiking Neurons

Author: Thomas Voegtlin

Abstract: In biological neurons, the timing of a spike depends on the timing of synaptic currents, in a way that is classically described by the Phase Response Curve. This has implications for temporal coding: an action potential that arrives on a synapse has an implicit meaning, that depends on the position of the postsynaptic neuron on the firing cycle. Here we show that this implicit code can be used to perform computations. Using theta neurons, we derive a spike-timing dependent learning rule from an error criterion. We demonstrate how to train an auto-encoder neural network using this rule. 1

4 0.90216041 59 nips-2006-Context dependent amplification of both rate and event-correlation in a VLSI network of spiking neurons

Author: Elisabetta Chicca, Giacomo Indiveri, Rodney J. Douglas

Abstract: Cooperative competitive networks are believed to play a central role in cortical processing and have been shown to exhibit a wide set of useful computational properties. We propose a VLSI implementation of a spiking cooperative competitive network and show how it can perform context dependent computation both in the mean firing rate domain and in spike timing correlation space. In the mean rate case the network amplifies the activity of neurons belonging to the selected stimulus and suppresses the activity of neurons receiving weaker stimuli. In the event correlation case, the recurrent network amplifies with a higher gain the correlation between neurons which receive highly correlated inputs while leaving the mean firing rate unaltered. We describe the network architecture and present experimental data demonstrating its context dependent computation capabilities. 1

5 0.83662713 18 nips-2006-A selective attention multi--chip system with dynamic synapses and spiking neurons

Author: Chiara Bartolozzi, Giacomo Indiveri

Abstract: Selective attention is the strategy used by biological sensory systems to solve the problem of limited parallel processing capacity: salient subregions of the input stimuli are serially processed, while non–salient regions are suppressed. We present an mixed mode analog/digital Very Large Scale Integration implementation of a building block for a multi–chip neuromorphic hardware model of selective attention. We describe the chip’s architecture and its behavior, when its is part of a multi–chip system with a spiking retina as input, and show how it can be used to implement in real-time flexible models of bottom-up attention. 1

6 0.78604615 99 nips-2006-Information Bottleneck Optimization and Independent Component Extraction with Spiking Neurons

7 0.65084046 154 nips-2006-Optimal Change-Detection and Spiking Neurons

8 0.58456171 145 nips-2006-Neurophysiological Evidence of Cooperative Mechanisms for Stereo Computation

9 0.57444614 162 nips-2006-Predicting spike times from subthreshold dynamics of a neuron

10 0.49350083 189 nips-2006-Temporal dynamics of information content carried by neurons in the primary visual cortex

11 0.45950031 190 nips-2006-The Neurodynamics of Belief Propagation on Binary Markov Random Fields

12 0.3885251 148 nips-2006-Nonlinear physically-based models for decoding motor-cortical population activity

13 0.36077535 107 nips-2006-Large Margin Multi-channel Analog-to-Digital Conversion with Applications to Neural Prosthesis

14 0.31805804 71 nips-2006-Effects of Stress and Genotype on Meta-parameter Dynamics in Reinforcement Learning

15 0.31577322 16 nips-2006-A Theory of Retinal Population Coding

16 0.30721381 29 nips-2006-An Information Theoretic Framework for Eukaryotic Gradient Sensing

17 0.29429901 114 nips-2006-Learning Time-Intensity Profiles of Human Activity using Non-Parametric Bayesian Models

18 0.2868588 192 nips-2006-Theory and Dynamics of Perceptual Bistability

19 0.2693966 1 nips-2006-A Bayesian Approach to Diffusion Models of Decision-Making and Response Time

20 0.26638737 113 nips-2006-Learning Structural Equation Models for fMRI


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(1, 0.06), (3, 0.012), (7, 0.053), (9, 0.476), (20, 0.012), (22, 0.032), (44, 0.055), (57, 0.04), (65, 0.027), (69, 0.018), (71, 0.084), (82, 0.02), (90, 0.013), (93, 0.019)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.94614989 197 nips-2006-Uncertainty, phase and oscillatory hippocampal recall

Author: Máté Lengyel, Peter Dayan

Abstract: Many neural areas, notably, the hippocampus, show structured, dynamical, population behavior such as coordinated oscillations. It has long been observed that such oscillations provide a substrate for representing analog information in the firing phases of neurons relative to the underlying population rhythm. However, it has become increasingly clear that it is essential for neural populations to represent uncertainty about the information they capture, and the substantial recent work on neural codes for uncertainty has omitted any analysis of oscillatory systems. Here, we observe that, since neurons in an oscillatory network need not only fire once in each cycle (or even at all), uncertainty about the analog quantities each neuron represents by its firing phase might naturally be reported through the degree of concentration of the spikes that it fires. We apply this theory to memory in a model of oscillatory associative recall in hippocampal area CA3. Although it is not well treated in the literature, representing and manipulating uncertainty is fundamental to competent memory; our theory enables us to view CA3 as an effective uncertainty-aware retrieval system.

2 0.92408895 173 nips-2006-Shifting, One-Inclusion Mistake Bounds and Tight Multiclass Expected Risk Bounds

Author: Benjamin I. Rubinstein, Peter L. Bartlett, J. H. Rubinstein

Abstract: Under the prediction model of learning, a prediction strategy is presented with an i.i.d. sample of n − 1 points in X and corresponding labels from a concept f ∈ F, and aims to minimize the worst-case probability of erring on an nth point. By exploiting the structure of F, Haussler et al. achieved a VC(F)/n bound for the natural one-inclusion prediction strategy, improving on bounds implied by PAC-type results by a O(log n) factor. The key data structure in their result is the natural subgraph of the hypercube—the one-inclusion graph; the key step is a d = VC(F) bound on one-inclusion graph density. The first main result of this n n−1 paper is a density bound of n ≤d−1 / ( ≤d ) < d, which positively resolves a conjecture of Kuzmin & Warmuth relating to their unlabeled Peeling compression scheme and also leads to an improved mistake bound for the randomized (deterministic) one-inclusion strategy for all d (for d ≈ Θ(n)). The proof uses a new form of VC-invariant shifting and a group-theoretic symmetrization. Our second main result is a k-class analogue of the d/n mistake bound, replacing the VC-dimension by the Pollard pseudo-dimension and the one-inclusion strategy by its natural hypergraph generalization. This bound on expected risk improves on known PAC-based results by a factor of O(log n) and is shown to be optimal up to a O(log k) factor. The combinatorial technique of shifting takes a central role in understanding the one-inclusion (hyper)graph and is a running theme throughout. 1

3 0.91969752 149 nips-2006-Nonnegative Sparse PCA

Author: Ron Zass, Amnon Shashua

Abstract: We describe a nonnegative variant of the ”Sparse PCA” problem. The goal is to create a low dimensional representation from a collection of points which on the one hand maximizes the variance of the projected points and on the other uses only parts of the original coordinates, and thereby creating a sparse representation. What distinguishes our problem from other Sparse PCA formulations is that the projection involves only nonnegative weights of the original coordinates — a desired quality in various fields, including economics, bioinformatics and computer vision. Adding nonnegativity contributes to sparseness, where it enforces a partitioning of the original coordinates among the new axes. We describe a simple yet efficient iterative coordinate-descent type of scheme which converges to a local optimum of our optimization criteria, giving good results on large real world datasets. 1

4 0.80970865 7 nips-2006-A Local Learning Approach for Clustering

Author: Mingrui Wu, Bernhard Schölkopf

Abstract: We present a local learning approach for clustering. The basic idea is that a good clustering result should have the property that the cluster label of each data point can be well predicted based on its neighboring data and their cluster labels, using current supervised learning methods. An optimization problem is formulated such that its solution has the above property. Relaxation and eigen-decomposition are applied to solve this optimization problem. We also briefly investigate the parameter selection issue and provide a simple parameter selection method for the proposed algorithm. Experimental results are provided to validate the effectiveness of the proposed approach. 1

5 0.56881446 167 nips-2006-Recursive ICA

Author: Honghao Shan, Lingyun Zhang, Garrison W. Cottrell

Abstract: Independent Component Analysis (ICA) is a popular method for extracting independent features from visual data. However, as a fundamentally linear technique, there is always nonlinear residual redundancy that is not captured by ICA. Hence there have been many attempts to try to create a hierarchical version of ICA, but so far none of the approaches have a natural way to apply them more than once. Here we show that there is a relatively simple technique that transforms the absolute values of the outputs of a previous application of ICA into a normal distribution, to which ICA maybe applied again. This results in a recursive ICA algorithm that may be applied any number of times in order to extract higher order structure from previous layers. 1

6 0.56200689 187 nips-2006-Temporal Coding using the Response Properties of Spiking Neurons

7 0.55342168 158 nips-2006-PG-means: learning the number of clusters in data

8 0.5392102 76 nips-2006-Emergence of conjunctive visual features by quadratic independent component analysis

9 0.51732326 80 nips-2006-Fundamental Limitations of Spectral Clustering

10 0.51196897 70 nips-2006-Doubly Stochastic Normalization for Spectral Clustering

11 0.50069922 83 nips-2006-Generalized Maximum Margin Clustering and Unsupervised Kernel Learning

12 0.48734677 59 nips-2006-Context dependent amplification of both rate and event-correlation in a VLSI network of spiking neurons

13 0.47948429 148 nips-2006-Nonlinear physically-based models for decoding motor-cortical population activity

14 0.47838449 107 nips-2006-Large Margin Multi-channel Analog-to-Digital Conversion with Applications to Neural Prosthesis

15 0.46718726 162 nips-2006-Predicting spike times from subthreshold dynamics of a neuron

16 0.46396822 127 nips-2006-MLLE: Modified Locally Linear Embedding Using Multiple Weights

17 0.45969844 18 nips-2006-A selective attention multi--chip system with dynamic synapses and spiking neurons

18 0.45892876 72 nips-2006-Efficient Learning of Sparse Representations with an Energy-Based Model

19 0.45836315 65 nips-2006-Denoising and Dimension Reduction in Feature Space

20 0.45818591 67 nips-2006-Differential Entropic Clustering of Multivariate Gaussians