nips nips2000 nips2000-88 knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Adrienne L. Fairhall, Geoffrey D. Lewen, William Bialek, Robert R. de Ruyter van Steveninck
Abstract: Many neural systems extend their dynamic range by adaptation. We examine the timescales of adaptation in the context of dynamically modulated rapidly-varying stimuli, and demonstrate in the fly visual system that adaptation to the statistical ensemble of the stimulus dynamically maximizes information transmission about the time-dependent stimulus. Further, while the rate response has long transients, the adaptation takes place on timescales consistent with optimal variance estimation.
Reference: text
sentIndex sentText sentNum sentScore
1 Multiple timescales of adaptation in a neural code Adrienne L. [sent-1, score-0.325]
2 de Ruyter van Steveninck NEC Research Institute 4 Independence Way Princeton, New Jersey 08540 adrienne! [sent-4, score-0.153]
3 We examine the timescales of adaptation in the context of dynamically modulated rapidly-varying stimuli, and demonstrate in the fly visual system that adaptation to the statistical ensemble of the stimulus dynamically maximizes information transmission about the time-dependent stimulus. [sent-9, score-1.509]
4 Further, while the rate response has long transients, the adaptation takes place on timescales consistent with optimal variance estimation. [sent-10, score-0.826]
5 1 Introduction Adaptation was one of the first phenomena discovered when Adrian recorded the responses of single sensory neurons [1, 2]. [sent-11, score-0.117]
6 Since that time, many different forms of adaptation have been found in almost all sensory systems. [sent-12, score-0.349]
7 The simplest forms of adaptation, such as light and dark adaptation in the visual system, seem to involve just discarding a large constant background signal so that the system can maintain sensitivity to small changes. [sent-13, score-0.357]
8 Adaptation to statistics might happen on evolutionary time scales, or, at the opposite extreme, it might happen in real time as an animal moves through the world. [sent-15, score-0.212]
9 Perhaps the simplest of statistical adaptation experiments, as in Ref [7] and Fig. [sent-17, score-0.281]
10 1, is to switch between stimuli that are drawn from different probability distributions and ask how the neuron responds to the switch. [sent-18, score-0.261]
11 When we 'repeat' the experiment we repeat the time dependence of the parameters describing the distribution, but we choose new signals from the same distributions; thus we probe the response or adaptation to the distributions and not to the particular signals. [sent-19, score-0.638]
12 These switching experiments typically reveal transient responses to the switch that have rather long time scales, and it is tempting to identify these long time scales as the time scales of adaptation. [sent-20, score-0.905]
13 In this work we re-examine the phenomena of statistical adaptation in the motion sensitive neurons of the fly visual system. [sent-23, score-0.39]
14 Specifically, we are interested in adaptation to the variance or dynamic range of the velocity distribution [10]. [sent-24, score-0.484]
15 Further, the precise form of rescaling chosen by the fly's visual system is that which maximizes information transmission. [sent-27, score-0.317]
16 There are several natural questions: (1) How long does it take the system to accomplish the rescaling of its input/output relation? [sent-28, score-0.27]
17 (2) Are the transients seen in switching experiments an indication of gradual rescaling? [sent-29, score-0.322]
18 (3) If the system adapts to the variance of its inputs, is the neural signal ambiguous about the absolute scale of velocity? [sent-30, score-0.209]
19 (4) Can we see the optimization of information transmission occurring in real time? [sent-31, score-0.14]
20 2 Stimulus structure and experimental setup A fly (Calliphora vicina) is immobilized in wax and views a computer-controlled oscilloscope display while we record action potentials from the identified neuron H1 using standard methods. [sent-32, score-0.143]
21 The stimulus movie is a random pattern of dark and light vertical bars, and the entire pattern moves along a random trajectory with velocity S(t); since the neuron is motion (and not position) sensitive we refer to this signal as the stimulus. [sent-33, score-0.625]
22 We construct the stimulus S(t) as the product of a normalized white noise s(t), constructed from a random number sequence refreshed every T_s = 2 ms, and an amplitude or standard deviation σ(t) which varies on a characteristic timescale τ_σ ≫ T_s. [sent-34, score-0.821]
23 For analysis all spike times are discretized at the 2 ms resolution of the movie. [sent-36, score-0.35]
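As a concrete illustration of this stimulus construction, here is a minimal numpy sketch; everything except the 2 ms refresh and the 5:1 amplitude ratio (durations, units, the particular square-wave σ(t)) is an illustrative assumption, not taken from the paper.

    import numpy as np

    dt = 0.002                      # 2 ms refresh / discretization step (from the text)
    T_total = 40.0                  # illustrative duration
    t = np.arange(0.0, T_total, dt)

    # normalized white noise s(t): a new random value every 2 ms bin
    s = np.random.randn(t.size)
    s /= s.std()

    # slowly varying amplitude sigma(t): square wave with a 5:1 ratio, period T = 20 s
    sigma1, sigma2 = 5.0, 1.0       # arbitrary units
    T_switch = 20.0
    sigma = np.where((t % T_switch) < T_switch / 2.0, sigma1, sigma2)

    # dynamically modulated stimulus S(t) = sigma(t) * s(t)
    S = sigma * s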
24 Figure 1: (a) The spike rate measured in response to a square-wave modulated white noise stimulus s(t), averaged over many presentations of s(t), and normalized by the mean and standard deviation. [sent-44, score-0.987]
25 (b) Decay time of the rate following an upward switch as a function of switching period T. [sent-45, score-0.669]
26 3 Spike rate dynamics Switching experiments as described above correspond to a stimulus such that the amplitude σ(t) is a square wave, alternating between two values σ1 and σ2, σ1 > σ2. [sent-46, score-0.681]
27 Experiments were performed over a range of switching periods (T = 40, 20, 10, 4 s), with the amplitudes σ1 and σ2 in a ratio of 5:1. [sent-47, score-0.348]
28 Remarkably, the timescales of the response depend strongly on those of the experiment; in fact, the response times rescale with T, as is seen in Fig. 1. [sent-48, score-0.497]
29 The decay of the rate in the first half of the experiment is fitted by an exponential, and in Fig. [sent-50, score-0.198]
30 1(b), the fitted decay times are well described as a linear function of the stimulus period. [sent-53, score-0.466]
31 This demonstrates that the timescale of adaptation of the rate is not absolute, but is a function of the timescale established in the experiment. [sent-54, score-0.604]
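A sketch of how such decay times could be extracted, assuming a trial-averaged rate in 2 ms bins is available for each switching period; the single-exponential model, initial guesses, and function names are illustrative assumptions.

    import numpy as np
    from scipy.optimize import curve_fit

    def decay_time(rate, dt, T):
        """Fit r(t) = r_inf + A*exp(-t/tau) over the first (high-variance) half-cycle
        after the upward switch and return the fitted time constant tau."""
        n_half = int((T / 2.0) / dt)
        t = np.arange(n_half) * dt
        r = np.asarray(rate[:n_half], dtype=float)
        model = lambda t, r_inf, A, tau: r_inf + A * np.exp(-t / tau)
        p0 = [r[-1], r[0] - r[-1], T / 10.0]      # rough initial guess
        popt, _ = curve_fit(model, t, r, p0=p0, maxfev=10000)
        return popt[2]

    # taus = [decay_time(r_T, 0.002, T) for T, r_T in zip(periods, mean_rates)]
    # A roughly linear taus-versus-periods relation reproduces the behaviour in Fig. 1(b).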
32 A typical averaged rate response to the exponential-sinusoidal stimulus is shown in Fig. 2(a). [sent-56, score-0.597]
33 The rate is close to sinusoidal over this parameter regime, indicating a logarithmic encoding of the stimulus variance. [sent-58, score-0.528]
34 Significantly, the rate response shows a phase lead φ with respect to the stimulus. [sent-59, score-0.223]
35 This may be interpreted as the effect of adaptation: at every point on the cycle, the gain of the response is set to a value defined by the stimulus a short time before. [sent-60, score-0.573]
36 (b) The time shift δ between response and stimulus, for a range of periods T. [sent-75, score-0.33]
37 As before, the response of the system was measured over a range of periods T. [sent-76, score-0.336]
38 Fig. 2(b) shows the measured relation of the timeshift δ(T) of the response (the phase lead φ expressed as a time shift) as a function of T. [sent-78, score-0.338]
39 One observes that the relation is nearly linear over more than one order of magnitude in T; that is, the phase shift is approximately constant. [sent-79, score-0.209]
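One way this phase lead and time shift could be estimated is sketched below; it assumes the rate and the logarithm of the sinusoidal envelope are sampled on the same time grid and that the phase is read off at the single modulation frequency 1/T. The function name is hypothetical.

    import numpy as np

    def phase_lead(rate, log_sigma, dt, T):
        """Phase of the rate relative to log sigma(t) at the modulation frequency 1/T.
        Positive values mean the rate leads the stimulus envelope."""
        f = 1.0 / T
        t = np.arange(len(rate)) * dt
        basis = np.exp(-2j * np.pi * f * t)
        phi = np.angle(np.sum(rate * basis)) - np.angle(np.sum(log_sigma * basis))
        return np.angle(np.exp(1j * phi))          # wrap into (-pi, pi]

    # corresponding time shift: delta = phase_lead(rate, np.log(sigma), dt, T) * T / (2 * np.pi)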
40 Once again there is a strong and simple dependence of the apparent timescale of adaptation on the stimulus parameters. [sent-80, score-0.818]
41 Responses to stimulus sequences composed of many frequencies also exhibit a phase shift, consistent with that observed for the single frequency experiments. [sent-81, score-0.374]
42 4 The dynamic input-output relation Both the switching and sinusoidally modulated experiments indicate that responses to changing the variance of input signals have multiple time scales, ranging from a few seconds to several minutes. [sent-82, score-0.738]
43 Does it really take the system this long to adjust its input/output relation to the new input distribution? [sent-83, score-0.277]
44 In the range of velocities used, and at the contrast level used in the laboratory, spiking in H1 depends on features of the velocity waveform that occur within a window of ~100 ms. [sent-84, score-0.206]
45 After a few seconds, then, the system has had access to several tens of independent samples of the motion signal, and should be able to estimate its variance to within ~20%; after a minute the precision would be better than a few percent. [sent-85, score-0.209]
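These precision figures follow from the standard result that the fractional error of a variance estimate from N roughly independent Gaussian samples is about sqrt(2/N); a quick Monte-Carlo check is sketched below, with sample counts chosen (as an assumption) to represent a few seconds and about a minute of ~100 ms samples.

    import numpy as np

    def frac_error_of_variance(N, trials=5000):
        """Monte-Carlo fractional rms error of a variance estimate from N Gaussian samples."""
        x = np.random.randn(trials, N)
        v = x.var(axis=1, ddof=1)
        return v.std() / v.mean()

    for N in (50, 600):            # ~5 s and ~1 min of ~100 ms samples
        print(N, frac_error_of_variance(N), np.sqrt(2.0 / N))
    # ~50 samples -> ~20% error; ~600 samples -> ~6%, consistent with the estimates above.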
46 In practice, we are changing the input variance not by a few percent but by a factor of two or ten; if the system were really efficient, these changes would be detected and compensated by adaptation on much shorter time scales. [sent-86, score-0.575]
47 To address this, we look directly at the input/output relation as the standard deviation σ(t) varies in time. [sent-87, score-0.306]
48 Following Ref. [10], we identify features of the stimulus that modulate the probability of occurrence of individual spikes, P(spike|stimulus); we will not consider patterns of spikes, although the same methods can be easily generalised. [sent-89, score-0.374]
49 The space of stimulus histories of length ~100 ms, discretised at 2 ms, leading up to a spike has a dimensionality of ~50, too large to allow adequate sampling of P(spike|stimulus) from the data, so we must begin by reducing the dimensionality of the stimulus description. [sent-90, score-0.802]
50 The simplest way to do so is to find a subset of directions in stimulus space determined to be relevant for the system, and to project the stimulus onto that set of directions. [sent-91, score-0.748]
51 The rescaling observed in steady state experiments was seen to occur independently in both dimensions, so without loss of generality we will use as our filter the single dimension given by the spike-triggered average. [sent-95, score-0.589]
52 The stimulus projected onto this filter will be denoted by s0. [sent-96, score-0.568]
53 The filtered stimulus is then passed through a nonlinear decision process akin to a threshold. [sent-97, score-0.374]
54 The distribution P(s0|spike) is estimated from the projected stimulus evaluated at the spike times, and the ratio of this to the prior distribution P(s0) is the nonlinear input/output relation. [sent-100, score-0.632]
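A compact sketch of this reverse-correlation recipe, under the assumptions that the stimulus S is sampled in 2 ms bins, spike_idx holds spike-time bin indices, and a single ~100 ms (50-bin) spike-triggered average is used as the filter; bin counts and names are illustrative.

    import numpy as np

    def sta_and_nonlinearity(S, spike_idx, n_lags=50, n_bins=25):
        """Spike-triggered average filter and the input/output relation
        g(s0) proportional to P(s0|spike) / P(s0)."""
        spike_idx = np.asarray(spike_idx)
        valid = spike_idx[spike_idx >= n_lags]

        # filter: average stimulus history preceding each spike
        sta = np.mean([S[i - n_lags:i] for i in valid], axis=0)
        sta /= np.linalg.norm(sta)

        # project the whole stimulus onto the filter: proj[k] is s0 at time bin k + n_lags
        proj = np.correlate(S, sta, mode='valid')
        s0_at_spikes = proj[valid - n_lags]

        edges = np.linspace(proj.min(), proj.max(), n_bins + 1)
        p_prior, _ = np.histogram(proj, bins=edges, density=True)
        p_spike, _ = np.histogram(s0_at_spikes, bins=edges, density=True)
        g = np.where(p_prior > 0, p_spike / np.maximum(p_prior, 1e-12), np.nan)
        return sta, 0.5 * (edges[:-1] + edges[1:]), g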
55 A number of experiments have shown that the filter characteristics of H1 are adaptive, and we see this in the present experiments as well: as the amplitude σ(t) is decreased, the filter changes both in overall amplitude and shape. [sent-101, score-0.742]
56 The filter becomes increasingly extended: the system integrates over longer periods of time under conditions of low velocities. [sent-102, score-0.385]
57 Thus the filter depends on the input variance, and we expect that there should be an observable relaxation of the filter to its new steady state form after a switch in variance. [sent-103, score-0.67]
58 We find, however, that within 200 ms following the switch, the amplitude of the filter has already adjusted to the new variance, and further that the detailed shape of the filter has attained its steady state form in less than 1 s. [sent-104, score-0.721]
59 The precise timescale of the establishment of the new filter shape depends on the value of σ: for the change to σ1, the steady state form is achieved within 200 ms. [sent-105, score-0.475]
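One way this relaxation can be tracked is sketched below, assuming 2 ms bins, spike and switch times given as bin indices, and a short post-switch window pooled over cycles; all names are hypothetical.

    import numpy as np

    def sta_after_switch(S, spike_idx, switch_idx, window_s=0.2, dt=0.002, n_lags=50):
        """Spike-triggered average using only spikes that fall within a short window after
        each switch, pooled over cycles, to follow the filter's adjustment in time."""
        w = int(window_s / dt)
        keep = np.zeros(len(S), dtype=bool)
        for i0 in switch_idx:
            keep[i0:i0 + w] = True
        spike_idx = np.asarray(spike_idx)
        sp = spike_idx[(spike_idx >= n_lags) & keep[spike_idx]]
        return np.mean([S[i - n_lags:i] for i in sp], axis=0)

    # Sliding the window later after the switch traces how quickly the filter reaches
    # its new steady-state amplitude and shape.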
60 The long tail of the low variance filter (for σ2 ≪ σ1) is established more slowly. [sent-106, score-0.288]
61 Nonetheless, these time scales which characterize adaptation of the filter are much shorter than the rate transients seen in the switching experiments, and are closer to what we might expect for an efficient estimator. [sent-107, score-0.979]
62 We construct time dependent input/output relations by forming conditional distributions using spikes from particular time slices in a periodic experiment. [sent-108, score-0.303]
63 In Fig. 1(c), we show the input/output relation calculated in 1 s bins throughout the switching experiment. [sent-112, score-0.479]
64 Within the first second the input/output relation is almost indistinguishable from its steady state form. [sent-113, score-0.334]
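A sketch of this time-resolved analysis, assuming s0 is the filtered stimulus aligned so that s0[i] is the projection associated with time bin i, sigma_local gives the standard deviation in each slice, and the bin range is an illustrative choice; names are hypothetical.

    import numpy as np

    def io_by_slice(s0, spike_idx, sigma_local, slice_len_s, dt=0.002, n_bins=20):
        """Input/output relations in consecutive time slices, with the projected stimulus
        normalised by the local standard deviation, to test the collapse described above."""
        spike_idx = np.asarray(spike_idx)
        n_slice = int(slice_len_s / dt)
        edges = np.linspace(-4.0, 4.0, n_bins + 1)
        curves = []
        for k in range(len(s0) // n_slice):
            lo, hi = k * n_slice, (k + 1) * n_slice
            x = s0[lo:hi] / sigma_local[k]
            sp = spike_idx[(spike_idx >= lo) & (spike_idx < hi)] - lo
            p_prior, _ = np.histogram(x, bins=edges, density=True)
            p_spk, _ = np.histogram(x[sp], bins=edges, density=True)
            curves.append(np.where(p_prior > 0, p_spk / np.maximum(p_prior, 1e-12), np.nan))
        return 0.5 * (edges[:-1] + edges[1:]), np.array(curves)

    # Overlaying the rows of `curves` tests whether the rescaled relations collapse.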
65 Further, it takes the same form for the two halves of the experiment: it is rescaled by the standard deviation, as was seen for the steady state experiments. [sent-114, score-0.204]
66 The close collapse or rescaling of the input/output relations depends not only on the normalisation by the standard deviation, but also on the use of the "local" adapted filter (i.e. the filter measured in the same time slice). [sent-115, score-0.44]
67 Returning to the sinusoidal experiments, the input/output relations were ... [sent-118, score-0.145]
68 Figure 3: Input/output relations for (a) switching, (b) sinusoidal and (c) randomly modulated experiments. [sent-129, score-0.201]
69 Panels labelled 1 show the modulation envelope σ(t), on a log scale for (b) and (c) (solid), and the measured rate (dotted), normalised by mean and standard deviation. [sent-132, score-0.25]
70 Panels labelled 2 show input/output relations calculated in non-overlapping bins throughout the stimulus cycle, with the input s0 in units of the standard deviation of the whole stimulus. [sent-135, score-0.643]
71 Once again the functions show a remarkable rescaling which is sharpened by the use of the appropriate local filter: see Fig. 3. [sent-141, score-0.158]
72 Finally, we consider an amplitude which varies randomly with correlation time τ_σ ~ 3 s: σ(t) is a repeated segment of the exponential of a Gaussian random process, pictured in Fig. 3(c). [sent-144, score-0.299]
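A sketch of one way such an envelope could be generated, assuming an Ornstein-Uhlenbeck (Gaussian) process for log σ with correlation time τ ≈ 3 s; the log-amplitude scale and function name are illustrative assumptions.

    import numpy as np

    def lognormal_envelope(T_total, dt=0.002, tau=3.0, sigma_log=0.8, seed=0):
        """sigma(t) as the exponential of an Ornstein-Uhlenbeck process with correlation
        time tau; one realisation can be repeated to form the periodic segment."""
        rng = np.random.default_rng(seed)
        n = int(T_total / dt)
        a = np.exp(-dt / tau)
        drive = sigma_log * np.sqrt(1.0 - a * a)   # keeps Var[log sigma] = sigma_log**2
        x = np.zeros(n)
        for i in range(1, n):
            x[i] = a * x[i - 1] + drive * rng.standard_normal()
        return np.exp(x)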
73 Dividing the stimulus into sequential bins of 2 s in width, we obtain the filters for each timeslice, and calculate the local prior distributions, which are not Gaussian in this case as they are distorted by the local variations of σ(t). [sent-147, score-0.456]
74 Nonetheless, the ratio P(s0|spike)/P(s0) conspires such that the form of the input/output relation is preserved. [sent-148, score-0.165]
75 In all three cases, our results show that the system rapidly and continuously adjusts its coding strategy, rescaling the input/output relation with respect to the local variance of the input as for steady state stimuli. [sent-149, score-0.789]
76 Variance normalisation occurs as rapidly as is measurable, and the system chooses a similar form for the input/output relation in each case. [sent-150, score-0.337]
77 5 Information transmission What does this mean for the coding efficiency of the neuron? [sent-151, score-0.168]
78 An experiment was designed to track the information transmission as a function of time. [sent-152, score-0.199]
79 We then ask how much information the spike train conveys about (a) which of the random segments s_i(t) and (b) which of the amplitudes σ_j was used. [sent-157, score-0.337]
80 Specifically, the experiment consists of a series of trials of length 2 s where the fast component is one of the sequences {s_i}, and after 1 s, the amplitude switches from σ1 to σ2 or vice versa. [sent-158, score-0.255]
81 This allows us to measure the mutual information between the response and either the fast or the slow component of the stimulus as a function of time across the 2 s repeated segment. [sent-160, score-0.706]
82 The spike response is represented by "words" [13], generated from the spike times discretised to timebins of 2 ms, where no spike is represented by 0, and a spike by 1. [sent-162, score-1.106]
83 Similarly, one can calculate the information about the amplitude using a given probe s: I_s(w(t); σ) = H[P_s(w(t))] − Σ_{j=1}^{2} P(σ_j) H[P_s(w(t) | σ_j)]. (5) [sent-166, score-0.25]
84 The amount of information for each s_j varies rapidly depending on the presence or absence of spikes, so we average these contributions over the {s_j} to give I(w; σ). [sent-167, score-0.176]
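A small sketch of this word-based estimate, under the assumptions of binary spike trains per trial, equal priors P(σ1) = P(σ2) = 1/2, a fixed probe sequence within each group of trials, and no finite-size bias correction; word length and names are illustrative.

    import numpy as np
    from collections import Counter

    def entropy_bits(counter):
        p = np.array(list(counter.values()), dtype=float)
        p /= p.sum()
        return float(-np.sum(p * np.log2(p)))

    def words_at(trials, t_idx, word_len=10):
        """One binary word (tuple of 0/1 bins) per trial, starting at time bin t_idx."""
        return [tuple(trial[t_idx:t_idx + word_len]) for trial in trials]

    def info_about_sigma(trials_sigma1, trials_sigma2, t_idx, word_len=10):
        """Eq. (5)-style estimate: entropy of the pooled word distribution at time t minus
        the prior-weighted conditional entropies for the two amplitudes (in bits)."""
        w1 = words_at(trials_sigma1, t_idx, word_len)
        w2 = words_at(trials_sigma2, t_idx, word_len)
        h_total = entropy_bits(Counter(w1 + w2))
        h_cond = 0.5 * entropy_bits(Counter(w1)) + 0.5 * entropy_bits(Counter(w2))
        return h_total - h_cond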
85 Figure 4: Information per spike as a function of time relative to the switch (sec), where σ is switched every 2 s. [sent-172, score-0.477]
86 As one would expect, the amount of information transmitted per second about the stimulus details, or s, depends on the ensemble parameter σ: larger velocities allow a higher SNR for velocity estimation, and the system is able to transmit more information. [sent-175, score-0.697]
87 However, when we convert the information rate to bits/spike, we find that the system is transmitting at a constant efficiency of around 1. [sent-176, score-0.295]
88 Any change in information rate during a switch from σ1 to σ2 is undetectable. [sent-178, score-0.33]
89 For a switch from σ2 to σ1, the time to recovery is of order 100 ms. [sent-179, score-0.258]
90 This demonstrates explicitly that the system is indeed rapidly maximising its information transmission. [sent-180, score-0.175]
91 Further, the transient "excess" of spikes following an upward switch provides information at a constant rate per spike. [sent-181, score-0.498]
92 Thus, information about the ensemble variable is retained at all times: the response is not ambiguous with respect to the absolute scale of velocity. [sent-183, score-0.255]
93 Despite the rescaling of input/output curves, responses within different ensembles are distinguishable. [sent-184, score-0.207]
94 6 Discussion We find that the neural response to a stimulus with well-separated timescales S(t) = σ(t)s(t) takes the form of a rate⊗timing code, where the response r(t) may be approximately modelled as r(t) = R[σ(t)] g(s(t)). (6) [sent-185, score-0.827]
95 Here R modulates the overall rate and depends on the slow dynamics of the variance envelope, while the precise timing of a given spike in response to fast events in the stimulus is determined by the nonlinear input/output relation g, which depends only on the normalised quantity s(t). [sent-186, score-1.274]
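A minimal sketch of this factorised model (Eq. 6); the particular functional forms of R and g below are illustrative stand-ins, not the measured functions.

    import numpy as np

    def rate_times_timing(S, sigma, g=None, R=None):
        """r(t) = R[sigma(t)] * g(s(t)) with s(t) = S(t)/sigma(t): a slow, envelope-dependent
        gain multiplying a fixed nonlinearity of the normalised fast stimulus."""
        if g is None:
            g = lambda s: np.maximum(s - 1.0, 0.0)     # illustrative threshold-linear g
        if R is None:
            R = lambda sig: np.log1p(sig)              # illustrative slow, log-like gain
        return R(sigma) * g(S / sigma)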
96 Through this apparent normalisation by the local standard deviation, g maximises information transmission about the fast components of the stimulus, as for steady-state experiments. [sent-187, score-0.273]
97 The function R modulating the rate varies on much slower timescales and so cannot be taken as an indicator of the extent of the system's adaptation to a new ensemble. [sent-188, score-0.673]
98 Rather, R appears to function as an independent degree of freedom, capable of transmitting information, at a slower rate, about the slow stimulus modulations. [sent-189, score-0.495]
99 The presence of many timescales in R may itself be an adaptation to the many timescales of variation in natural signals. [sent-190, score-0.659]
100 At the same time, the rapid readjustment of the input/output relation - and the consequent recovery of information after a sudden change in 0" - indicate that the adaptive mechanisms approach the limiting speed set by the need to gather statistics. [sent-191, score-0.26]
wordName wordTfidf (topN-words)
[('stimulus', 0.374), ('adaptation', 0.281), ('spike', 0.219), ('switch', 0.191), ('switching', 0.191), ('timescales', 0.189), ('steady', 0.169), ('relation', 0.165), ('rescaling', 0.158), ('filter', 0.155), ('amplitude', 0.155), ('response', 0.132), ('timescale', 0.116), ('bialek', 0.11), ('solspike', 0.108), ('velocity', 0.106), ('ruyter', 0.105), ('steveninck', 0.105), ('variance', 0.097), ('transmission', 0.092), ('rate', 0.091), ('spikes', 0.087), ('ms', 0.087), ('periods', 0.087), ('van', 0.087), ('bins', 0.082), ('relations', 0.082), ('spikelso', 0.081), ('upward', 0.081), ('varies', 0.077), ('normalised', 0.076), ('system', 0.076), ('fly', 0.073), ('scales', 0.07), ('neuron', 0.07), ('amplitudes', 0.07), ('transients', 0.07), ('sensory', 0.068), ('time', 0.067), ('de', 0.066), ('deviation', 0.064), ('sinusoidal', 0.063), ('experiments', 0.061), ('experiment', 0.059), ('modulated', 0.056), ('ql', 0.055), ('attneave', 0.054), ('discretised', 0.054), ('pcr', 0.054), ('potters', 0.054), ('shorter', 0.054), ('spikelstimulus', 0.054), ('velocities', 0.054), ('sec', 0.052), ('signals', 0.052), ('rapidly', 0.051), ('responses', 0.049), ('decay', 0.048), ('hi', 0.048), ('period', 0.048), ('information', 0.048), ('apparent', 0.047), ('probe', 0.047), ('hateren', 0.047), ('adrienne', 0.047), ('sudden', 0.047), ('warland', 0.047), ('occur', 0.046), ('normalisation', 0.045), ('times', 0.044), ('shift', 0.044), ('sj', 0.044), ('slow', 0.044), ('transmitting', 0.042), ('flight', 0.042), ('adrian', 0.042), ('brenner', 0.042), ('envelope', 0.042), ('fast', 0.041), ('measured', 0.041), ('throughout', 0.041), ('projected', 0.039), ('ensemble', 0.039), ('movie', 0.039), ('wk', 0.039), ('happen', 0.039), ('motions', 0.039), ('presentations', 0.039), ('efficiency', 0.038), ('coding', 0.038), ('long', 0.036), ('absolute', 0.036), ('motion', 0.036), ('precise', 0.035), ('white', 0.035), ('rescaled', 0.035), ('continuously', 0.035), ('slower', 0.035), ('barlow', 0.035), ('nonetheless', 0.035)]
simIndex simValue paperId paperTitle
same-paper 1 0.99999976 88 nips-2000-Multiple Timescales of Adaptation in a Neural Code
Author: Adrienne L. Fairhall, Geoffrey D. Lewen, William Bialek, Robert R. de Ruyter van Steveninck
Abstract: Many neural systems extend their dynamic range by adaptation. We examine the timescales of adaptation in the context of dynamically modulated rapidly-varying stimuli, and demonstrate in the fly visual system that adaptation to the statistical ensemble of the stimulus dynamically maximizes information transmission about the time-dependent stimulus. Further, while the rate response has long transients, the adaptation takes place on timescales consistent with optimal variance estimation.
2 0.31735134 146 nips-2000-What Can a Single Neuron Compute?
Author: Blaise Agüera y Arcas, Adrienne L. Fairhall, William Bialek
Abstract: In this paper we formulate a description of the computation performed by a neuron as a combination of dimensional reduction and nonlinearity. We implement this description for the HodgkinHuxley model, identify the most relevant dimensions and find the nonlinearity. A two dimensional description already captures a significant fraction of the information that spikes carry about dynamic inputs. This description also shows that computation in the Hodgkin-Huxley model is more complex than a simple integrateand-fire or perceptron model. 1
3 0.283288 141 nips-2000-Universality and Individuality in a Neural Code
Author: Elad Schneidman, Naama Brenner, Naftali Tishby, Robert R. de Ruyter van Steveninck, William Bialek
Abstract: The problem of neural coding is to understand how sequences of action potentials (spikes) are related to sensory stimuli, motor outputs, or (ultimately) thoughts and intentions. One clear question is whether the same coding rules are used by different neurons, or by corresponding neurons in different individuals. We present a quantitative formulation of this problem using ideas from information theory, and apply this approach to the analysis of experiments in the fly visual system. We find significant individual differences in the structure of the code, particularly in the way that temporal patterns of spikes are used to convey information beyond that available from variations in spike rate. On the other hand, all the flies in our ensemble exhibit a high coding efficiency, so that every spike carries the same amount of information in all the individuals. Thus the neural code has a quantifiable mixture of individuality and universality. 1
4 0.1648967 55 nips-2000-Finding the Key to a Synapse
Author: Thomas Natschläger, Wolfgang Maass
Abstract: Experimental data have shown that synapses are heterogeneous: different synapses respond with different sequences of amplitudes of postsynaptic responses to the same spike train. Neither the role of synaptic dynamics itself nor the role of the heterogeneity of synaptic dynamics for computations in neural circuits is well understood. We present in this article methods that make it feasible to compute for a given synapse with known synaptic parameters the spike train that is optimally fitted to the synapse, for example in the sense that it produces the largest sum of postsynaptic responses. To our surprise we find that most of these optimally fitted spike trains match common firing patterns of specific types of neurons that are discussed in the literature.
5 0.16345438 67 nips-2000-Homeostasis in a Silicon Integrate and Fire Neuron
Author: Shih-Chii Liu, Bradley A. Minch
Abstract: In this work, we explore homeostasis in a silicon integrate-and-fire neuron. The neuron adapts its firing rate over long time periods on the order of seconds or minutes so that it returns to its spontaneous firing rate after a lasting perturbation. Homeostasis is implemented via two schemes. One scheme looks at the presynaptic activity and adapts the synaptic weight depending on the presynaptic spiking rate. The second scheme adapts the synaptic
7 0.14265162 129 nips-2000-Temporally Dependent Plasticity: An Information Theoretic Account
8 0.13522236 89 nips-2000-Natural Sound Statistics and Divisive Normalization in the Auditory System
9 0.13438335 125 nips-2000-Stability and Noise in Biochemical Switches
10 0.11231387 102 nips-2000-Position Variance, Recurrence and Perceptual Learning
11 0.10179531 43 nips-2000-Dopamine Bonuses
12 0.098683372 104 nips-2000-Processing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics
13 0.098526776 80 nips-2000-Learning Switching Linear Models of Human Motion
14 0.091214582 124 nips-2000-Spike-Timing-Dependent Learning for Oscillatory Networks
15 0.090903968 42 nips-2000-Divisive and Subtractive Mask Effects: Linking Psychophysics and Biophysics
16 0.089741334 8 nips-2000-A New Model of Spatial Representation in Multimodal Brain Areas
17 0.089499131 49 nips-2000-Explaining Away in Weight Space
18 0.08579202 100 nips-2000-Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks
19 0.081790276 10 nips-2000-A Productive, Systematic Framework for the Representation of Visual Structure
20 0.079667911 19 nips-2000-Adaptive Object Representation with Hierarchically-Distributed Memory Sites
topicId topicWeight
[(0, 0.254), (1, -0.301), (2, -0.308), (3, 0.01), (4, 0.002), (5, -0.051), (6, -0.062), (7, 0.229), (8, -0.004), (9, -0.013), (10, 0.023), (11, 0.032), (12, 0.053), (13, 0.192), (14, -0.115), (15, -0.041), (16, -0.011), (17, -0.054), (18, -0.153), (19, -0.075), (20, 0.059), (21, 0.122), (22, 0.133), (23, -0.018), (24, 0.08), (25, 0.035), (26, -0.11), (27, -0.031), (28, -0.03), (29, -0.17), (30, -0.038), (31, -0.073), (32, -0.044), (33, -0.039), (34, 0.062), (35, 0.0), (36, -0.007), (37, -0.049), (38, 0.072), (39, -0.002), (40, 0.025), (41, 0.019), (42, 0.043), (43, -0.0), (44, -0.052), (45, 0.054), (46, 0.047), (47, 0.056), (48, 0.077), (49, -0.016)]
simIndex simValue paperId paperTitle
same-paper 1 0.97695333 88 nips-2000-Multiple Timescales of Adaptation in a Neural Code
Author: Adrienne L. Fairhall, Geoffrey D. Lewen, William Bialek, Robert R. de Ruyter van Steveninck
Abstract: Many neural systems extend their dynamic range by adaptation. We examine the timescales of adaptation in the context of dynamically modulated rapidly-varying stimuli, and demonstrate in the fly visual system that adaptation to the statistical ensemble of the stimulus dynamically maximizes information transmission about the time-dependent stimulus. Further, while the rate response has long transients, the adaptation takes place on timescales consistent with optimal variance estimation.
2 0.83005857 141 nips-2000-Universality and Individuality in a Neural Code
Author: Elad Schneidman, Naama Brenner, Naftali Tishby, Robert R. de Ruyter van Steveninck, William Bialek
Abstract: The problem of neural coding is to understand how sequences of action potentials (spikes) are related to sensory stimuli, motor outputs, or (ultimately) thoughts and intentions. One clear question is whether the same coding rules are used by different neurons, or by corresponding neurons in different individuals. We present a quantitative formulation of this problem using ideas from information theory, and apply this approach to the analysis of experiments in the fly visual system. We find significant individual differences in the structure of the code, particularly in the way that temporal patterns of spikes are used to convey information beyond that available from variations in spike rate. On the other hand, all the flies in our ensemble exhibit a high coding efficiency, so that every spike carries the same amount of information in all the individuals. Thus the neural code has a quantifiable mixture of individuality and universality. 1
3 0.78102887 146 nips-2000-What Can a Single Neuron Compute?
Author: Blaise Agüera y Arcas, Adrienne L. Fairhall, William Bialek
Abstract: In this paper we formulate a description of the computation performed by a neuron as a combination of dimensional reduction and nonlinearity. We implement this description for the HodgkinHuxley model, identify the most relevant dimensions and find the nonlinearity. A two dimensional description already captures a significant fraction of the information that spikes carry about dynamic inputs. This description also shows that computation in the Hodgkin-Huxley model is more complex than a simple integrateand-fire or perceptron model. 1
4 0.51811594 125 nips-2000-Stability and Noise in Biochemical Switches
Author: William Bialek
Abstract: Many processes in biology, from the regulation of gene expression in bacteria to memory in the brain, involve switches constructed from networks of biochemical reactions. Crucial molecules are present in small numbers, raising questions about noise and stability. Analysis of noise in simple reaction schemes indicates that switches stable for years and switchable in milliseconds can be built from fewer than one hundred molecules. Prospects for direct tests of this prediction, as well as implications, are discussed. 1
5 0.47491494 55 nips-2000-Finding the Key to a Synapse
Author: Thomas Natschläger, Wolfgang Maass
Abstract: Experimental data have shown that synapses are heterogeneous: different synapses respond with different sequences of amplitudes of postsynaptic responses to the same spike train. Neither the role of synaptic dynamics itself nor the role of the heterogeneity of synaptic dynamics for computations in neural circuits is well understood. We present in this article methods that make it feasible to compute for a given synapse with known synaptic parameters the spike train that is optimally fitted to the synapse, for example in the sense that it produces the largest sum of postsynaptic responses. To our surprise we find that most of these optimally fitted spike trains match common firing patterns of specific types of neurons that are discussed in the literature.
7 0.45286742 43 nips-2000-Dopamine Bonuses
8 0.42171532 67 nips-2000-Homeostasis in a Silicon Integrate and Fire Neuron
9 0.40848911 42 nips-2000-Divisive and Subtractive Mask Effects: Linking Psychophysics and Biophysics
10 0.35899755 89 nips-2000-Natural Sound Statistics and Divisive Normalization in the Auditory System
11 0.33807227 129 nips-2000-Temporally Dependent Plasticity: An Information Theoretic Account
12 0.3127971 102 nips-2000-Position Variance, Recurrence and Perceptual Learning
13 0.30511275 49 nips-2000-Explaining Away in Weight Space
14 0.3032054 8 nips-2000-A New Model of Spatial Representation in Multimodal Brain Areas
15 0.3005718 80 nips-2000-Learning Switching Linear Models of Human Motion
16 0.28156042 19 nips-2000-Adaptive Object Representation with Hierarchically-Distributed Memory Sites
17 0.25951654 65 nips-2000-Higher-Order Statistical Properties Arising from the Non-Stationarity of Natural Signals
18 0.25267553 10 nips-2000-A Productive, Systematic Framework for the Representation of Visual Structure
19 0.22229935 99 nips-2000-Periodic Component Analysis: An Eigenvalue Method for Representing Periodic Structure in Speech
20 0.21548679 96 nips-2000-One Microphone Source Separation
topicId topicWeight
[(10, 0.027), (17, 0.073), (32, 0.015), (33, 0.059), (42, 0.367), (54, 0.011), (55, 0.026), (60, 0.019), (62, 0.037), (65, 0.024), (67, 0.058), (76, 0.034), (79, 0.019), (81, 0.089), (90, 0.022), (91, 0.011), (93, 0.016), (97, 0.014)]
simIndex simValue paperId paperTitle
same-paper 1 0.8998878 88 nips-2000-Multiple Timescales of Adaptation in a Neural Code
Author: Adrienne L. Fairhall, Geoffrey D. Lewen, William Bialek, Robert R. de Ruyter van Steveninck
Abstract: Many neural systems extend their dynamic range by adaptation. We examine the timescales of adaptation in the context of dynamically modulated rapidly-varying stimuli, and demonstrate in the fly visual system that adaptation to the statistical ensemble of the stimulus dynamically maximizes information transmission about the time-dependent stimulus. Further, while the rate response has long transients, the adaptation takes place on timescales consistent with optimal variance estimation.
2 0.89610004 42 nips-2000-Divisive and Subtractive Mask Effects: Linking Psychophysics and Biophysics
Author: Barbara Zenger, Christof Koch
Abstract: We describe an analogy between psychophysically measured effects in contrast masking, and the behavior of a simple integrate-andfire neuron that receives time-modulated inhibition. In the psychophysical experiments, we tested observers ability to discriminate contrasts of peripheral Gabor patches in the presence of collinear Gabor flankers. The data reveal a complex interaction pattern that we account for by assuming that flankers provide divisive inhibition to the target unit for low target contrasts, but provide subtractive inhibition to the target unit for higher target contrasts. A similar switch from divisive to subtractive inhibition is observed in an integrate-and-fire unit that receives inhibition modulated in time such that the cell spends part of the time in a high-inhibition state and part of the time in a low-inhibition state. The similarity between the effects suggests that one may cause the other. The biophysical model makes testable predictions for physiological single-cell recordings. 1 Psychophysics Visual images of Gabor patches are thought to excite a small and specific subset of neurons in the primary visual cortex and beyond. By measuring psychophysically in humans the contrast detection and discrimination thresholds of peripheral Gabor patches, one can estimate the sensitivity of this subset of neurons. Furthermore, spatial interactions between different neuronal populations can be probed by testing the effects of additional Gabor patches (masks) on performance. Such experiments have revealed a highly configuration-specific pattern of excitatory and inhibitory spatial interactions [1, 2]. 1.1 Methods Two vertical Gabor patches with a spatial frequency of 4cyc/deg were presented at 4 deg eccentricity left and right of fixation, and observers had to report which patch had the higher contrast (spatial 2AFC). In the
Author: Kevin A. Archie, Bartlett W. Mel
Abstract: Neurons in area V4 have relatively large receptive fields (RFs), so multiple visual features are simultaneously
4 0.48824018 141 nips-2000-Universality and Individuality in a Neural Code
Author: Elad Schneidman, Naama Brenner, Naftali Tishby, Robert R. de Ruyter van Steveninck, William Bialek
Abstract: The problem of neural coding is to understand how sequences of action potentials (spikes) are related to sensory stimuli, motor outputs, or (ultimately) thoughts and intentions. One clear question is whether the same coding rules are used by different neurons, or by corresponding neurons in different individuals. We present a quantitative formulation of this problem using ideas from information theory, and apply this approach to the analysis of experiments in the fly visual system. We find significant individual differences in the structure of the code, particularly in the way that temporal patterns of spikes are used to convey information beyond that available from variations in spike rate. On the other hand, all the flies in our ensemble exhibit a high coding efficiency, so that every spike carries the same amount of information in all the individuals. Thus the neural code has a quantifiable mixture of individuality and universality. 1
5 0.46483499 55 nips-2000-Finding the Key to a Synapse
Author: Thomas Natschläger, Wolfgang Maass
Abstract: Experimental data have shown that synapses are heterogeneous: different synapses respond with different sequences of amplitudes of postsynaptic responses to the same spike train. Neither the role of synaptic dynamics itself nor the role of the heterogeneity of synaptic dynamics for computations in neural circuits is well understood. We present in this article methods that make it feasible to compute for a given synapse with known synaptic parameters the spike train that is optimally fitted to the synapse, for example in the sense that it produces the largest sum of postsynaptic responses. To our surprise we find that most of these optimally fitted spike trains match common firing patterns of specific types of neurons that are discussed in the literature.
6 0.458821 146 nips-2000-What Can a Single Neuron Compute?
7 0.44934541 125 nips-2000-Stability and Noise in Biochemical Switches
8 0.44853008 8 nips-2000-A New Model of Spatial Representation in Multimodal Brain Areas
9 0.44609776 89 nips-2000-Natural Sound Statistics and Divisive Normalization in the Auditory System
10 0.43043864 43 nips-2000-Dopamine Bonuses
11 0.42646834 104 nips-2000-Processing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics
12 0.42316979 124 nips-2000-Spike-Timing-Dependent Learning for Oscillatory Networks
13 0.40191248 10 nips-2000-A Productive, Systematic Framework for the Representation of Visual Structure
14 0.39261091 131 nips-2000-The Early Word Catches the Weights
15 0.3845723 102 nips-2000-Position Variance, Recurrence and Perceptual Learning
16 0.37492493 49 nips-2000-Explaining Away in Weight Space
17 0.36751226 129 nips-2000-Temporally Dependent Plasticity: An Information Theoretic Account
18 0.36456311 103 nips-2000-Probabilistic Semantic Video Indexing
19 0.35565382 137 nips-2000-The Unscented Particle Filter
20 0.34998456 91 nips-2000-Noise Suppression Based on Neurophysiologically-motivated SNR Estimation for Robust Speech Recognition