nips nips2001 nips2001-96 knowledge-graph by maker-knowledge-mining

96 nips-2001-Information-Geometric Decomposition in Spike Analysis


Source: pdf

Author: Hiroyuki Nakahara, Shun-ichi Amari

Abstract: We present an information-geometric measure to systematically investigate neuronal firing patterns, taking account not only of the second-order but also of higher-order interactions. We begin with the case of two neurons for illustration and show how to test whether or not any pairwise correlation in one period is significantly different from that in the other period. In order to test such a hypothesis of different firing rates, the correlation term needs to be singled out 'orthogonally' to the firing rates, where the null hypothesis might not be of independent firing. This method is also shown to directly associate neural firing with behavior via their mutual information, which is decomposed into two types of information, conveyed by mean firing rate and coincident firing, respectively. Then, we show that these results, using the 'orthogonal' decomposition, are naturally extended to the case of three neurons and n neurons in general. 1

Reference: text


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 Information-Geometric Decomposition in Spike Analysis. Hiroyuki Nakahara; Shun-ichi Amari, Lab. ... [sent-2, score-0.156]

2 Abstract: We present an information-geometric measure to systematically investigate neuronal firing patterns, taking account not only of the second-order but also of higher-order interactions. [sent-6, score-1.078]

3 We begin with the case of two neurons for illustration and show how to test whether or not any pairwise correlation in one period is significantly different from that in the other period. [sent-7, score-0.679]

4 In order to test such a hypothesis of different firing rates, the correlation term needs to be singled out 'orthogonally' to the firing rates, where the null hypothesis might not be of independent firing. [sent-8, score-1.94]
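
For the two-neuron case this 'orthogonal' singling-out can be written explicitly. As a short sketch of the construction (notation lightly normalized; p_ij denotes Prob(x1 = i, x2 = j)), the firing rates and the interaction coordinate are

```latex
% Mixed coordinates for two binary neurons: marginal rates eta_1, eta_2 and the
% log-odds-ratio interaction theta_12, which is orthogonal to the rates.
\eta_1 = \Pr(x_1 = 1), \qquad
\eta_2 = \Pr(x_2 = 1), \qquad
\theta_{12} = \log \frac{p_{11}\, p_{00}}{p_{10}\, p_{01}}
```

Independent firing gives θ_12 = 0, so the null hypothesis of 'no change in pairwise interaction' can be tested on θ_12 alone without disturbing the rate estimates.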

5 This method is also shown to directly associate neural firing with behavior via their mutual information, which is decomposed into two types of information, conveyed by mean firing rate and coincident firing, respectively. [sent-9, score-2.265]

6 Then, we show that these results, using the 'orthogonal' decomposition, are naturally extended to the case of three neurons and n neurons in general. [sent-10, score-0.412]

7 1 Introduction Based on the theory of hierarchical structure and related invariant decomposition of interactions by information geometry [3], the present paper briefly summarizes methods useful for systematically analyzing a population of neural firing [9]. [sent-11, score-1.406]

8 Many studies have shown that the mean firing rate of a single neuron may carry significant information on sensory and motion signals. [sent-12, score-1.202]

9 Information conveyed by populational firing, however, may not be only an accumulation of mean firing rates. [sent-13, score-1.008]

10 ... coincident firing [13, 14] may also carry behavioral information. [sent-16, score-1.212]

11 One obvious step to investigate this issue is to single out a contribution by coincident firing between two neurons, i.e. ... [sent-17, score-1.247]

12 In general, however, it is not sufficient to test a pairwise correlation of neural firing, because there can be triplewise and higher correlations. [sent-20, score-0.42]

13 For example, three variables (neurons) are not independent in general even when they are pairwise independent. [sent-21, score-0.244]

14 We need to establish a systematic method of analysis, including these higher-order ... [sent-22, score-0.138]

15 We propose one approach, the information-geometric measure that uses the dual orthogonality of the natural and expectation parameters in exponential family distributions [4]. [sent-28, score-0.343]

16 We represent a neural firing pattern by a binary random vector x. [sent-29, score-0.821]

17 The probability distribution of firing patterns can be expanded by a log-linear model, where the set {p(x)} of all the probability distributions forms a (2^n − 1)-dimensional manifold S_n. [sent-30, score-1.026]
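
To make the two-neuron instance of this expansion concrete, here is a minimal Python sketch (the function name and example values are ours, not the paper's) that recovers the log-linear coordinates from a 2x2 joint probability table:

```python
import numpy as np

def loglinear_coords(p):
    """Log-linear (theta) coordinates of a two-neuron joint distribution.

    p[i, j] = Prob(x1 = i, x2 = j); entries positive and summing to one.
    Expansion: log p(x) = th1*x1 + th2*x2 + th12*x1*x2 - psi.
    """
    psi = -np.log(p[0, 0])                                    # normalization term
    th1 = np.log(p[1, 0] / p[0, 0])                           # rate-related terms
    th2 = np.log(p[0, 1] / p[0, 0])
    th12 = np.log(p[1, 1] * p[0, 0] / (p[1, 0] * p[0, 1]))    # pairwise interaction
    return th1, th2, th12, psi

# Independent neurons with rates 0.2 and 0.3: the interaction term vanishes.
p_indep = np.outer([0.8, 0.2], [0.7, 0.3])
print(loglinear_coords(p_indep))   # th12 ~ 0
```

An excess of coincident firing over the rate prediction shows up only in th12, which is the sense in which the correlation term is singled out 'orthogonally' to the firing rates.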


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('firing', 0.758), ('coincident', 0.303), ('pairwise', 0.175), ('neurons', 0.168), ('conveyed', 0.134), ('correlation', 0.124), ('decomposition', 0.108), ('systematically', 0.107), ('japan', 0.107), ('amari', 0.096), ('carry', 0.094), ('hirosawa', 0.088), ('nakahara', 0.088), ('researches', 0.088), ('orthogonality', 0.08), ('accumulation', 0.08), ('wako', 0.08), ('prob', 0.07), ('saitama', 0.07), ('hypothesis', 0.068), ('summarizes', 0.067), ('riken', 0.067), ('investigate', 0.064), ('ik', 0.061), ('null', 0.061), ('associate', 0.059), ('logp', 0.059), ('advanced', 0.057), ('behavioral', 0.057), ('expanded', 0.057), ('rates', 0.053), ('patterns', 0.053), ('systematic', 0.052), ('decomposed', 0.052), ('briefly', 0.049), ('sensory', 0.048), ('spike', 0.048), ('illustration', 0.048), ('motion', 0.048), ('il', 0.048), ('establish', 0.048), ('population', 0.047), ('manifold', 0.047), ('dual', 0.046), ('period', 0.045), ('neuronal', 0.044), ('begin', 0.044), ('analyzing', 0.043), ('log', 0.042), ('interactions', 0.041), ('expansion', 0.041), ('test', 0.041), ('geometry', 0.04), ('hierarchical', 0.039), ('mutual', 0.039), ('measure', 0.037), ('neuron', 0.037), ('sufficient', 0.037), ('neuroscience', 0.037), ('significant', 0.037), ('invariant', 0.037), ('mean', 0.036), ('distributions', 0.035), ('correlations', 0.035), ('obvious', 0.035), ('naturally', 0.034), ('issue', 0.034), ('needs', 0.034), ('significantly', 0.034), ('forms', 0.034), ('family', 0.034), ('rate', 0.033), ('brain', 0.032), ('contribution', 0.03), ('mathematical', 0.03), ('expectation', 0.029), ('independent', 0.028), ('exponential', 0.028), ('types', 0.028), ('institute', 0.027), ('structure', 0.027), ('account', 0.026), ('neural', 0.024), ('behavior', 0.023), ('taking', 0.023), ('single', 0.023), ('three', 0.021), ('extended', 0.021), ('binary', 0.021), ('propose', 0.02), ('including', 0.02), ('probabilities', 0.02), ('general', 0.02), ('present', 0.019), ('higher', 0.019), ('method', 0.018), ('pattern', 0.018), ('knowledge', 0.017), ('uses', 0.017), ('natural', 0.017)]
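
The simValue numbers in the similarity lists below are presumably cosine similarities between weighted word vectors of this kind. As a hedged reconstruction of such a pipeline (our toy version, not the site's actual mining code):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Toy tf-idf: docs is a list of token lists; returns one {word: weight} per doc."""
    n = len(docs)
    df = Counter(w for doc in docs for w in set(doc))   # document frequency
    return [{w: tf * math.log(n / df[w]) for w, tf in Counter(doc).items()}
            for doc in docs]

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in u if w in v)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

docs = [["firing", "coincident", "neurons"],
        ["firing", "neurons", "attractor"],
        ["kernel", "perceptron"]]
vecs = tfidf_vectors(docs)
print(cosine(vecs[0], vecs[1]), cosine(vecs[0], vecs[2]))   # related vs unrelated
```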

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.99999994 96 nips-2001-Information-Geometric Decomposition in Spike Analysis


2 0.37852025 37 nips-2001-Associative memory in realistic neuronal networks

Author: Peter E. Latham

Abstract: Almost two decades ago, Hopfield [1] showed that networks of highly reduced model neurons can exhibit multiple attracting fixed points, thus providing a substrate for associative memory. It is still not clear, however, whether realistic neuronal networks can support multiple attractors. The main difficulty is that neuronal networks in vivo exhibit a stable background state at low firing rate, typically a few Hz. Embedding attractors is easy; doing so without destabilizing the background is not. Previous work [2, 3] focused on the sparse coding limit, in which a vanishingly small number of neurons are involved in any memory. Here we investigate the case in which the number of neurons involved in a memory scales with the number of neurons in the network. In contrast to the sparse coding limit, we find that multiple attractors can co-exist robustly with a stable background state. Mean field theory is used to understand how the behavior of the network scales with its parameters, and simulations with analog neurons are presented. One of the most important features of the nervous system is its ability to perform associative memory. It is generally believed that associative memory is implemented using attractor networks; experimental studies point in that direction [4-7], and there are virtually no competing theoretical models. Perhaps surprisingly, however, it is still an open theoretical question whether attractors can exist in realistic neuronal networks.
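
For readers who want the reduced model this abstract starts from, here is a minimal Hopfield-style sketch (sizes and the synchronous update are illustrative choices of ours; this is the textbook construction, not Latham's biologically constrained network):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5
patterns = rng.choice([-1, 1], size=(P, N))        # stored memories

# Hebbian outer-product weights with zero self-coupling
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

# Recall from a corrupted cue
state = patterns[0].astype(float).copy()
flip = rng.choice(N, size=N // 5, replace=False)   # corrupt 20% of the bits
state[flip] *= -1
for _ in range(10):                                # synchronous update sweeps
    state = np.sign(W @ state)
    state[state == 0] = 1.0
print("overlap with stored pattern:", float(state @ patterns[0]) / N)  # close to 1
```

The paper's point is precisely that this idealization ignores the stable low-rate background state seen in vivo, which is what makes embedding attractors in realistic networks hard.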

3 0.21014108 197 nips-2001-Why Neuronal Dynamics Should Control Synaptic Learning Rules

Author: Jesper Tegnér, Ádám Kepecs

Abstract: Hebbian learning rules are generally formulated as static rules. Under changing conditions (e.g., neuromodulation, input statistics) most rules are sensitive to parameters. In particular, recent work has focused on two different formulations of spike-timing-dependent plasticity rules. Additive STDP [1] is remarkably versatile but also very fragile, whereas multiplicative STDP [2, 3] is more robust but lacks attractive features such as synaptic competition and rate stabilization. Here we address the problem of robustness in the additive STDP rule. We derive an adaptive control scheme, where the learning function is under fast dynamic control by postsynaptic activity to stabilize learning under a variety of conditions. Such a control scheme can be implemented using known biophysical mechanisms of synapses. We show that this adaptive rule makes the additive STDP more robust. Finally, we give an example of how metaplasticity of the adaptive rule can be used to guide STDP into different types of learning regimes. 1
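
Schematically, the additive STDP rule contrasted here applies fixed-amplitude exponential weight updates as a function of the pre/post spike-time difference, with hard weight bounds. A sketch with illustrative parameter values (not those of [1]):

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.005, a_minus=0.00525, tau_plus=20.0, tau_minus=20.0):
    """Additive STDP update for spike-time difference dt_ms = t_post - t_pre.

    Pre-before-post (dt_ms > 0) potentiates, post-before-pre depresses; the
    amplitudes do not depend on the current weight, hence 'additive'.
    """
    return np.where(dt_ms > 0,
                    a_plus * np.exp(-dt_ms / tau_plus),
                    -a_minus * np.exp(dt_ms / tau_minus))

w = 0.5
for dt in [5.0, -5.0, 15.0]:
    w = float(np.clip(w + stdp_dw(dt), 0.0, 1.0))   # hard bounds on the weight
print(w)
```

The fragility mentioned in the abstract is tied to this weight-independence of the update; the adaptive control scheme proposed there modulates the learning function to keep learning stable.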

4 0.13989533 142 nips-2001-Orientational and Geometric Determinants of Place and Head-direction

Author: Neil Burgess, Tom Hartley

Abstract: We present a model of the firing of place and head-direction cells in rat hippocampus. The model can predict the response of individual cells and populations to parametric manipulations of both geometric (e.g. O'Keefe & Burgess, 1996) and orientational (Fenton et al., 2000a) cues, extending a previous geometric model (Hartley et al., 2000). It provides a functional description of how these cells' spatial responses are derived from the rat's environment and makes easily testable quantitative predictions. Consideration of the phenomenon of remapping (Muller & Kubie, 1987; Bostock et al., 1991) indicates that the model may also be consistent with nonparametric changes in firing, and provides constraints for its future development. 1

5 0.12666202 72 nips-2001-Exact differential equation population dynamics for integrate-and-fire neurons

Author: Julian Eggert, Berthold Bäuml

Abstract: Mesoscopical, mathematical descriptions of dynamics of populations of spiking neurons are getting increasingly important for the understanding of large-scale processes in the brain using simulations. In our previous work, integral equation formulations for population dynamics have been derived for a special type of spiking neurons. For Integrate-and-Fire type neurons, these formulations were only approximately correct. Here, we derive a mathematically compact, exact population dynamics formulation for Integrate-and-Fire type neurons. It can be shown quantitatively in simulations that the numerical correspondence with microscopically modeled neuronal populations is excellent. 1 Introduction and motivation The goal of the population dynamics approach is to model the time course of the collective activity of entire populations of functionally and dynamically similar neurons in a compact way, using a higher descriptional level than that of single neurons and spikes. The usual observable at the level of neuronal populations is the population-averaged instantaneous firing rate A(t), with A(t)Δt being the number of neurons in the population that release a spike in an interval [t, t + Δt). Population dynamics are formulated in such a way that they match quantitatively the time course of a given A(t), either gained experimentally or by microscopical, detailed simulation. At least three main reasons can be formulated which underline the importance of the population dynamics approach for computational neuroscience. First, it enables the simulation of extensive networks involving a massive number of neurons and connections, which is typically the case when dealing with biologically realistic functional models that go beyond the single neuron level. Second, it increases the analytical understanding of large-scale neuronal dynamics, opening the way towards better control and predictive capabilities when dealing with large networks. Third, it enables a systematic embedding of the numerous neuronal models operating at different descriptional scales into a generalized theoretic framework, explaining the relationships, dependencies and derivations of the respective models. Early efforts on population dynamics approaches date back as early as 1972, to the work of Wilson and Cowan [8] and Knight [4], which laid the basis for all current population-averaged graded-response models (see e.g. [6] for modeling work using these models). More recently, population-based approaches for spiking neurons were developed, mainly by Gerstner [3, 2] and Knight [5]. In our own previous work [1], we have developed a theoretical framework which enables us to systematize and simulate a wide range of models for population-based dynamics. It was shown that the equations of the framework produce results that agree quantitatively well with detailed simulations using spiking neurons, so that they can be used for realistic simulations involving networks with large numbers of spiking neurons. Nevertheless, for neuronal populations composed of Integrate-and-Fire (I&F) neurons, this framework was only correct in an approximation. In this paper, we derive the exact population dynamics formulation for I&F neurons. This is achieved by reducing the I&F population dynamics to a point process and by taking advantage of the particular properties of I&F neurons.
2 Background: Integrate-and-Fire dynamics

2.1 Differential form

We start with the standard Integrate-and-Fire (I&F) model in form of the well-known differential equation [7]

    τ dv_i(t)/dt = −(v_i(t) − v_rest) + R j_i(t)    (1)

which describes the dynamics of the membrane potential v_i of a neuron i that is modeled as a single compartment with RC circuit characteristics. The membrane relaxation time is in this case τ = RC, with R being the membrane resistance and C the membrane capacitance. The resting potential v_rest is the stationary potential that is approached in the no-input case. The input arriving from other neurons is described in form of a current j_i. In addition to eq. (1), which describes the integrate part of the I&F model, the neuronal dynamics are completed by a nonlinear step. Every time the membrane potential v_i reaches a fixed threshold θ from below, v_i is lowered by a fixed amount Δ > 0, and integration according to eq. (1) starts again from the new value of the membrane potential:

    v_i(t) → v_i(t) − Δ    if v_i(t) = θ (from below).    (2)

At the same time, it is said that the release of a spike occurred (i.e., the neuron fired), and the time t_i = t of this singular event is stored. Here t_i indicates the time of the most recent spike. Storing all the last firing times, we gain the sequence of spikes {t_i^j} (spike ordering index j, neuronal index i).

2.2 Integral form

Now we look at the single neuron in a neuronal compound. We assume that the input current contribution j_i from presynaptic spiking neurons can be described using the presynaptic spike times t_j^f, a response function ε, and a connection weight w_{i,j}:

    j_i(t) = Σ_j Σ_f w_{i,j} ε(t − t_j^f)    (3)

Integrating the I&F equation (1) beginning at the last spiking time t_i, which determines the initial condition by v_i(t_i) = v_i(t_i − 0) − Δ, where v_i(t_i − 0) is the membrane potential just before the neuron spikes, we get

    v_i(t) = v_rest + η(t − t_i) + Σ_j w_{i,j} Σ_f α(t − t_i; t − t_j^f)    (4)

with the refractory function

    η(s) = −(v_rest − v_i(t_i)) e^(−s/τ)    (5)

and the alpha-function ...
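
To make eqs. (1) and (2) concrete, here is a minimal Euler simulation of a single I&F neuron under constant suprathreshold input. This is a sketch with illustrative parameter values, not the paper's population-level formulation:

```python
# Euler integration of eq. (1) with the reset rule of eq. (2).
tau, R, v_rest = 10.0, 1.0, 0.0   # membrane time constant (ms), resistance, rest
theta, delta = 1.0, 1.0           # threshold and reset amount
dt, T = 0.1, 200.0                # time step and duration (ms)
j = 1.5                           # constant suprathreshold input current

v, spikes = v_rest, []
for step in range(int(T / dt)):
    v += (dt / tau) * (-(v - v_rest) + R * j)   # eq. (1)
    if v >= theta:                              # threshold reached from below
        v -= delta                              # eq. (2): lower v by delta
        spikes.append(step * dt)                # store spike time t_i

print(f"{len(spikes)} spikes, mean rate {1000.0 * len(spikes) / T:.1f} Hz")
```

With these values the membrane charges toward R·j = 1.5, crosses θ = 1 after roughly τ·ln(3) ≈ 11 ms, and the reset restarts the integration, giving a regular spike train.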

6 0.1219467 2 nips-2001-3 state neurons for contextual processing

7 0.093409576 87 nips-2001-Group Redundancy Measures Reveal Redundancy Reduction in the Auditory Pathway

8 0.081792921 141 nips-2001-Orientation-Selective aVLSI Spiking Neurons

9 0.068456151 174 nips-2001-Spike timing and the coding of naturalistic sounds in a central auditory area of songbirds

10 0.06041703 49 nips-2001-Circuits for VLSI Implementation of Temporally Asymmetric Hebbian Learning

11 0.059015609 23 nips-2001-A theory of neural integration in the head-direction system

12 0.058580954 57 nips-2001-Correlation Codes in Neuronal Populations

13 0.058512844 131 nips-2001-Neural Implementation of Bayesian Inference in Population Codes

14 0.05728044 122 nips-2001-Model Based Population Tracking and Automatic Detection of Distribution Changes

15 0.048109569 27 nips-2001-Activity Driven Adaptive Stochastic Resonance

16 0.044427045 65 nips-2001-Effective Size of Receptive Fields of Inferior Temporal Visual Cortex Neurons in Natural Scenes

17 0.044258725 166 nips-2001-Self-regulation Mechanism of Temporally Asymmetric Hebbian Plasticity

18 0.041781761 124 nips-2001-Modeling the Modulatory Effect of Attention on Human Spatial Vision

19 0.038678844 73 nips-2001-Eye movements and the maturation of cortical orientation selectivity

20 0.036159094 97 nips-2001-Information-Geometrical Significance of Sparsity in Gallager Codes


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, -0.112), (1, -0.227), (2, -0.144), (3, 0.052), (4, 0.161), (5, -0.001), (6, 0.126), (7, -0.104), (8, -0.048), (9, -0.01), (10, 0.048), (11, -0.04), (12, 0.108), (13, -0.031), (14, 0.044), (15, -0.148), (16, 0.058), (17, -0.057), (18, -0.173), (19, -0.215), (20, 0.184), (21, 0.149), (22, 0.012), (23, 0.103), (24, -0.066), (25, -0.127), (26, -0.104), (27, 0.216), (28, -0.047), (29, -0.093), (30, 0.194), (31, 0.078), (32, -0.047), (33, 0.091), (34, -0.052), (35, 0.111), (36, -0.01), (37, -0.031), (38, 0.062), (39, -0.081), (40, -0.046), (41, 0.068), (42, 0.235), (43, 0.11), (44, 0.047), (45, -0.046), (46, 0.019), (47, 0.092), (48, 0.018), (49, -0.11)]
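
The (topicId, topicWeight) pairs above are presumably this paper's coordinate vector in a latent semantic (truncated SVD) space, with the simValues below being similarities between such vectors. A hedged toy reconstruction of that computation:

```python
import numpy as np

# Toy term-document matrix: rows are terms, columns are papers.
X = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0],
              [1.0, 0.0, 1.0]])

# Truncated SVD keeps k latent topics; each paper gets a k-dimensional vector.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
coords = (np.diag(s[:k]) @ Vt[:k]).T    # shape (n_papers, k)

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos(coords[0], coords[1]), cos(coords[0], coords[2]))
```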

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.98890805 96 nips-2001-Information-Geometric Decomposition in Spike Analysis


2 0.81093299 37 nips-2001-Associative memory in realistic neuronal networks


3 0.57739198 142 nips-2001-Orientational and Geometric Determinants of Place and Head-direction


4 0.55411166 197 nips-2001-Why Neuronal Dynamics Should Control Synaptic Learning Rules


5 0.33802983 2 nips-2001-3 state neurons for contextual processing

Author: Ádám Kepecs, S. Raghavachari

Abstract: Neurons receive excitatory inputs via both fast AMPA and slow NMDA type receptors. We find that neurons receiving input via NMDA receptors can have two stable membrane states which are input dependent. Action potentials can only be initiated from the higher voltage state. Similar observations have been made in several brain areas which might be explained by our model. The interactions between the two kinds of inputs lead us to suggest that some neurons may operate in 3 states: disabled, enabled and firing. Such enabled, but non-firing modes can be used to introduce context-dependent processing in neural networks. We provide a simple example and discuss possible implications for neuronal processing and response variability. 1

6 0.31931114 87 nips-2001-Group Redundancy Measures Reveal Redundancy Reduction in the Auditory Pathway

7 0.31278569 72 nips-2001-Exact differential equation population dynamics for integrate-and-fire neurons

8 0.30820352 83 nips-2001-Geometrical Singularities in the Neuromanifold of Multilayer Perceptrons

9 0.27043456 158 nips-2001-Receptive field structure of flow detectors for heading perception

10 0.25932777 124 nips-2001-Modeling the Modulatory Effect of Attention on Human Spatial Vision

11 0.2530638 166 nips-2001-Self-regulation Mechanism of Temporally Asymmetric Hebbian Plasticity

12 0.23302406 57 nips-2001-Correlation Codes in Neuronal Populations

13 0.23288317 141 nips-2001-Orientation-Selective aVLSI Spiking Neurons

14 0.18408974 131 nips-2001-Neural Implementation of Bayesian Inference in Population Codes

15 0.17030506 125 nips-2001-Modularity in the motor system: decomposition of muscle patterns as combinations of time-varying synergies

16 0.15985028 23 nips-2001-A theory of neural integration in the head-direction system

17 0.15739238 14 nips-2001-A Neural Oscillator Model of Auditory Selective Attention

18 0.14107081 111 nips-2001-Learning Lateral Interactions for Feature Binding and Sensory Segmentation

19 0.13868389 49 nips-2001-Circuits for VLSI Implementation of Temporally Asymmetric Hebbian Learning

20 0.13833824 42 nips-2001-Bayesian morphometry of hippocampal cells suggests same-cell somatodendritic repulsion


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(14, 0.035), (19, 0.027), (27, 0.111), (30, 0.076), (38, 0.047), (59, 0.034), (72, 0.022), (79, 0.02), (83, 0.017), (91, 0.178), (93, 0.312)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.81689876 96 nips-2001-Information-Geometric Decomposition in Spike Analysis


2 0.68682879 65 nips-2001-Effective Size of Receptive Fields of Inferior Temporal Visual Cortex Neurons in Natural Scenes

Author: Thomas P. Trappenberg, Edmund T. Rolls, Simon M. Stringer

Abstract: Inferior temporal cortex (IT) neurons have large receptive fields when a single effective object stimulus is shown against a blank background, but have much smaller receptive fields when the object is placed in a natural scene. Thus, translation invariant object recognition is reduced in natural scenes, and this may help object selection. We describe a model which accounts for this by competition within an attractor in which the neurons are tuned to different objects in the scene, and the fovea has a higher cortical magnification factor than the peripheral visual field. Furthermore, we show that top-down object bias can increase the receptive field size, facilitating object search in complex visual scenes, and providing a model of object-based attention. The model leads to the prediction that introduction of a second object into a scene with blank background will reduce the receptive field size to values that depend on the closeness of the second object to the target stimulus. We suggest that mechanisms of this type enable the output of IT to be primarily about one object, so that the areas that receive from IT can select the object as a potential target for action.

3 0.57653052 66 nips-2001-Efficiency versus Convergence of Boolean Kernels for On-Line Learning Algorithms

Author: Roni Khardon, Dan Roth, Rocco A. Servedio

Abstract: We study online learning in Boolean domains using kernels which capture feature expansions equivalent to using conjunctions over basic features. We demonstrate a tradeoff between the computational efficiency with which these kernels can be computed and the generalization ability of the resulting classifier. We first describe several kernel functions which capture either limited forms of conjunctions or all conjunctions. We show that these kernels can be used to efficiently run the Perceptron algorithm over an exponential number of conjunctions; however we also prove that using such kernels the Perceptron algorithm can make an exponential number of mistakes even when learning simple functions. We also consider an analogous use of kernel functions to run the multiplicative-update Winnow algorithm over an expanded feature space of exponentially many conjunctions. While known upper bounds imply that Winnow can learn DNF formulae with a polynomial mistake bound in this setting, we prove that it is computationally hard to simulate Winnow’s behavior for learning DNF over such a feature set, and thus that such kernel functions for Winnow are not efficiently computable.
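
The construction behind this abstract is that the kernel counting all conjunctions (over positive and negated literals) satisfied by two Boolean examples has a simple closed form: 2 raised to the number of coordinates on which the examples agree. That lets the Perceptron run implicitly over exponentially many conjunction features. A small sketch (our illustration; names and the toy target are ours):

```python
import numpy as np

def conj_kernel(x, y):
    """Number of conjunctions of literals true on both x and y: 2**(#agreements)."""
    return 2.0 ** int(np.sum(x == y))

def kernel_perceptron(X, y, epochs=5):
    """Dual-form Perceptron over the implicit conjunction feature space."""
    alpha = np.zeros(len(X))
    for _ in range(epochs):
        for i in range(len(X)):
            s = sum(alpha[j] * y[j] * conj_kernel(X[j], X[i]) for j in range(len(X)))
            if y[i] * s <= 0:      # mistake: add this example to the expansion
                alpha[i] += 1.0
    return alpha

# Toy target: x1 AND x2 over four Boolean variables.
X = np.array([[1,1,0,0],[1,1,1,0],[0,1,1,1],[1,0,0,1],[0,0,1,1],[1,1,0,1]])
y = np.array([1 if a and b else -1 for a, b, *_ in X])
print(kernel_perceptron(X, y))     # mistake counts per example
```

The paper's negative results concern exactly this setting: the kernel is cheap to evaluate, yet the abstract notes the Perceptron can still make exponentially many mistakes on simple targets.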

4 0.57367074 160 nips-2001-Reinforcement Learning and Time Perception -- a Model of Animal Experiments

Author: Jonathan L. Shapiro, J. Wearden

Abstract: Animal data on delayed-reward conditioning experiments shows a striking property - the data for different time intervals collapses into a single curve when the data is scaled by the time interval. This is called the scalar property of interval timing. Here a simple model of a neural clock is presented and shown to give rise to the scalar property. The model is an accumulator consisting of noisy, linear spiking neurons. It is analytically tractable and contains only three parameters. When coupled with reinforcement learning it simulates peak procedure experiments, producing both the scalar property and the pattern of single trial covariances. 1
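
One simple way a noisy accumulator yields the scalar property is if trial-to-trial variability in the accumulation rate dominates the Poisson counting noise at long intervals; the coefficient of variation of the count then tends to a constant, so response curves collapse when scaled by the interval. The sketch below illustrates that effect under this assumption (our illustration, not necessarily the paper's exact three-parameter noise model):

```python
import numpy as np

rng = np.random.default_rng(1)

def timed_counts(interval_ms, trials=20000, base_rate=0.5, rate_cv=0.15):
    """Accumulated spike counts over an interval, with per-trial rate noise."""
    rates = base_rate * (1.0 + rate_cv * rng.standard_normal(trials)).clip(min=0.0)
    return rng.poisson(rates * interval_ms)

for t in [200.0, 400.0, 800.0]:
    n = timed_counts(t)
    print(f"t={t:.0f} ms  scaled mean={n.mean()/t:.3f}  cv={n.std()/n.mean():.3f}")
# As t grows, the CV approaches the rate-noise level, i.e., it becomes scale-invariant.
```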

5 0.56682479 57 nips-2001-Correlation Codes in Neuronal Populations

Author: Maoz Shamir, Haim Sompolinsky

Abstract: Population codes often rely on the tuning of the mean responses to the stimulus parameters. However, this information can be greatly suppressed by long range correlations. Here we study the efficiency of coding information in the second order statistics of the population responses. We show that the Fisher Information of this system grows linearly with the size of the system. We propose a bilinear readout model for extracting information from correlation codes, and evaluate its performance in discrimination and estimation tasks. It is shown that the main source of information in this system is the stimulus dependence of the variances of the single neuron responses.

6 0.56615412 111 nips-2001-Learning Lateral Interactions for Feature Binding and Sensory Segmentation

7 0.56567872 123 nips-2001-Modeling Temporal Structure in Classical Conditioning

8 0.56483698 182 nips-2001-The Fidelity of Local Ordinal Encoding

9 0.56471449 100 nips-2001-Iterative Double Clustering for Unsupervised and Semi-Supervised Learning

10 0.56397718 89 nips-2001-Grouping with Bias

11 0.56282246 68 nips-2001-Entropy and Inference, Revisited

12 0.56269652 52 nips-2001-Computing Time Lower Bounds for Recurrent Sigmoidal Neural Networks

13 0.56140047 174 nips-2001-Spike timing and the coding of naturalistic sounds in a central auditory area of songbirds

14 0.56139338 144 nips-2001-Partially labeled classification with Markov random walks

15 0.56090105 150 nips-2001-Probabilistic Inference of Hand Motion from Neural Activity in Motor Cortex

16 0.56050587 161 nips-2001-Reinforcement Learning with Long Short-Term Memory

17 0.55898106 27 nips-2001-Activity Driven Adaptive Stochastic Resonance

18 0.55755889 54 nips-2001-Contextual Modulation of Target Saliency

19 0.55747497 131 nips-2001-Neural Implementation of Bayesian Inference in Population Codes

20 0.55737883 7 nips-2001-A Dynamic HMM for On-line Segmentation of Sequential Data