NIPS 2010, paper 119 — knowledge-graph extraction (maker-knowledge-mining)
Title: Implicit encoding of prior probabilities in optimal neural populations
Author: Deep Ganguli, Eero P. Simoncelli
Source: PDF
Here we consider the influence of a prior probability distribution over sensory variables on the optimal allocation of neurons and spikes in a population. We model the spikes of each cell as samples from an independent Poisson process with rate governed by an associated tuning curve. For this response model, we approximate the Fisher information in terms of the density and amplitude of the tuning curves, under the assumption that tuning width varies inversely with cell density. We consider a family of objective functions based on Fisher information; this family includes lower bounds on mutual information and perceptual discriminability as special cases. In all cases, we find a closed-form expression for the optimum, in which the density and gain of the cells in the population are power-law functions of the stimulus prior. We show preliminary evidence that the theory successfully predicts the relationship between empirically measured stimulus priors, physiologically measured neural response properties (cell density, tuning widths, and firing rates), and psychophysically measured discrimination thresholds.
1 Introduction

Many bottom-up theories of neural encoding posit that sensory systems are optimized to represent sensory information, subject to limitations of noise and resources. A substantial literature has considered population models in which each neuron's mean response to a scalar variable is characterized by a tuning curve. In these results, the distribution of sensory variables is assumed to be uniform, and the populations are assumed to be homogeneous with regard to tuning curve shape, spacing, and amplitude. It would seem natural that a neural system should devote more resources to regions of sensory space that occur with higher probability, analogous to results in coding theory [11]. At the population level, non-uniform allocations of neurons with identical tuning curves have been shown to be optimal for non-uniform stimulus distributions [16, 17]. Here, we examine the influence of a sensory prior on the optimal allocation of neurons and spikes in a population, and the implications of this optimal allocation for subsequent perception.
Given a prior distribution over a scalar stimulus parameter, and a resource budget of N neurons with an average of R spikes/sec for the entire population, we seek the optimal shapes, positions, and amplitudes of the tuning curves. We assume a population with independent Poisson spiking, and consider a family of objective functions based on Fisher information. We then approximate the Fisher information in terms of two continuous resource variables, the density and gain of the tuning curves. For all objective functions, we find that the optimal tuning curve properties (cell density, tuning width, and gain) are power-law functions of the stimulus prior, with exponents dependent on the specific choice of objective function. Through the Fisher information, we also derive a bound on perceptual discriminability, again in the form of a power law of the stimulus prior. Thus, our framework provides direct and experimentally testable links between sensory priors, properties of the neural representation, and perceptual discriminability.
2 Encoding model

We assume a conventional model for a population of N neurons responding to a single scalar variable, s [1–6]. The number of spikes emitted (per unit time) by the nth neuron is a sample from an independent Poisson process, with mean rate determined by its tuning function, h_n(s). The probability density of the population response can be written as

\( p(\mathbf{r}\,|\,s) = \prod_{n=1}^{N} \frac{h_n(s)^{r_n}\, e^{-h_n(s)}}{r_n!} . \)

We also assume the total expected spike rate, R, of the population is fixed, which places a constraint on the tuning curves:

\( \sum_{n=1}^{N} \int h_n(s)\, p(s)\, ds = R , \)   (1)

where p(s) is the probability distribution of stimuli in the environment.
We refer to this as a sensory prior, in anticipation of its future use in Bayesian decoding of the population response. To formulate a family of objective functions that depend on both p(s) and the tuning curves, we first rely on the Fisher information, I_f(s), which can be written as a function of the tuning curves [1, 18]:

\( I_f(s) = -\int p(\mathbf{r}\,|\,s)\, \frac{\partial^2}{\partial s^2} \log p(\mathbf{r}\,|\,s)\, d\mathbf{r} = \sum_{n=1}^{N} \frac{h_n'^2(s)}{h_n(s)} . \)

The Fisher information can be used to express lower bounds on mutual information [16], the variance of an unbiased estimator [18], and perceptual discriminability [19]. The Cramér-Rao inequality allows us to express the minimum expected squared stimulus discriminability achievable by any decoder¹:

\( \int \frac{p(s)}{I_f(s)}\, ds . \)

We formulate a generalized objective function that includes the Fisher bounds on information and discriminability as special cases:

\( \arg\max_{\{h_n(s)\}} \int p(s)\, f\!\left( \sum_{n=1}^{N} \frac{h_n'^2(s)}{h_n(s)} \right) ds , \)

for a monotonic function f, subject to the resource constraints.
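As an illustrative numerical check (not from the paper; the Gaussian tuning shape and the width value are assumptions), the population Fisher information \( I_f(s) = \sum_n h_n'^2(s)/h_n(s) \) can be computed directly from the tuning curves:

```python
import numpy as np

def fisher_information(s, centers, width=0.55, gain=1.0):
    """I_f(s) = sum_n h_n'(s)^2 / h_n(s) for independent-Poisson neurons
    with Gaussian tuning curves h_n(s) = gain * exp(-(s-c_n)^2 / (2w^2)).
    Each term simplifies to gain * ((s-c_n)/w^2)^2 * exp(-(s-c_n)^2/(2w^2)),
    which avoids dividing two tiny numbers far from a neuron's center."""
    delta = np.asarray(s)[:, None] - np.asarray(centers)[None, :]
    phi = gain * (delta / width**2) ** 2 * np.exp(-delta**2 / (2 * width**2))
    return phi.sum(axis=1)

centers = np.arange(-10, 11)                 # unit lattice of preferred values
s = np.linspace(-2.0, 2.0, 101)
If = fisher_information(s, centers)
print(If.std() / If.mean())                  # nearly constant over s
```

For a sufficiently wide Gaussian on a unit lattice, the summed per-neuron Fisher curves tile the axis, so the relative variation printed above is well under a percent.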
To make the problem tractable, we first introduce a parametrization of the population in terms of cell density and gain. The cell density controls both the spacing and width of the tuning curves, and the gain controls their maximum average firing rates. Finally, re-writing the objective function and constraints in these terms allows us to obtain closed-form solutions for the optimal tuning curves.
3.1 Density and gain for a homogeneous population

If p(s) is uniform, then by symmetry, the Fisher information for an optimal neural population should also be uniform. We assume a convolutional population of tuning curves, evenly spaced on the unit lattice, such that they approximately "tile" the space:

\( \sum_{n=1}^{N} h(s - n) \approx 1 . \)

We also assume that this population has an approximately constant Fisher information:

\( I_f(s) = \sum_{n=1}^{N} \frac{h'^2(s-n)}{h(s-n)} = \sum_{n=1}^{N} \phi(s-n) \approx I_{conv} . \)   (5)

That is, we assume that the Fisher information curves for the individual neurons, φ(s − n), also tile the stimulus space. The value of the constant, I_conv, is dependent on the details of the tuning curve shape, h(s), which we leave unspecified. Fig. 1(a–b) shows that the Fisher information for a convolutional population of Gaussian tuning curves, with appropriate width, is approximately constant. Now we introduce two scalar values, a gain (g) and a density (d), that affect the convolutional population as follows:

\( h_n(s) = g\, h\!\left( d\,(s - \tfrac{n}{d}) \right) . \)   (6)

¹ Here, we use it to bound the squared discriminability of the estimator, as expressed in the stimulus space, which is independent of bias [19].
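A small sketch (illustrative, with an assumed Gaussian tuning shape) of how the scalar density d and gain g in Eq. (6) rescale the Fisher information: substituting h_n(s) = g h(d(s − n/d)) into I_f multiplies the tiling sum by a factor of d²g.

```python
import numpy as np

def population_fisher(s, d=1.0, g=1.0, width=0.55, n_max=40):
    """Fisher information of the scaled convolutional population
    h_n(s) = g * h(d*(s - n/d)), with Gaussian prototype h (assumed shape).
    Each neuron contributes h_n'(s)^2 / h_n(s)
      = g * d^2 * (u/width^2)^2 * exp(-u^2 / (2*width^2)),  u = d*s - n."""
    n = np.arange(-n_max, n_max + 1)
    u = d * np.asarray(s)[:, None] - n
    phi = g * d**2 * (u / width**2) ** 2 * np.exp(-u**2 / (2 * width**2))
    return phi.sum(axis=1)

s = np.linspace(-1.0, 1.0, 11)
ratio = population_fisher(s, d=2.0, g=3.0) / population_fisher(s)
print(ratio.mean())   # ~ d^2 * g = 12
```

Doubling the density quadruples the Fisher information while tripling the gain triples it, consistent with the d²(s)g(s) approximation used later in the text.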
Figure 1: (a) Homogeneous population with Gaussian tuning curves on the unit lattice. The tuning width is chosen so that the curves approximately tile the stimulus space. (b) The Fisher information of the convolutional population (green) is approximately constant. (c) The cumulative integral of the density, D(s), alters the positions and widths of the tuning curves in the convolutional population. (d) The warped population, with tuning curve peaks (aligned with tick marks, at locations s_n = D⁻¹(n)), scaled by the gain function, g(s) (blue). A single tuning curve is highlighted (red) to illustrate the effect of the warping and scaling operations. (e) The Fisher information of the inhomogeneous population is approximately proportional to d²(s)g(s).
The density controls both the spacing and width of the tuning curves: as the density increases, the tuning curves become narrower, and are spaced closer together so as to maintain their tiling of stimulus space. It follows, as in Eq. (5), that the Fisher information of the convolutional population is approximately constant with respect to s. If the original (unit-spacing) convolutional population is supported on the interval (0, Q) of the stimulus space, then the number of neurons in the modulated population must be N(d) = Qd to cover the same interval. Under the assumption that the tuning curves tile the stimulus space, Eq. (1) fixes the gain in terms of the total rate budget R.
3.2 Density and gain for a heterogeneous population

Intuitively, if p(s) is non-uniform, the optimal Fisher information should also be non-uniform. This can be achieved through inhomogeneities in either the tuning curve density or gain. We thus generalize density and gain to be continuous functions of the stimulus, d(s) and g(s), that warp and scale the convolutional population:

\( h_n(s) = g(s_n)\, h\big(D(s) - n\big) . \)
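A sketch of this warping construction (the function choices here are illustrative, not from the paper): D(s) is accumulated numerically from d(s), the preferred values are read off as s_n = D⁻¹(n), and each curve is g(s_n) h(D(s) − n).

```python
import numpy as np

def warped_population(s_grid, density, gain, width=0.55, n_neurons=14):
    """Build h_n(s) = g(s_n) * h(D(s) - n), where D is the cumulative
    integral of the density d(s) and s_n = D^{-1}(n) is the n-th
    preferred stimulus. The Gaussian prototype h is an assumed shape."""
    d = density(s_grid)
    dx = np.diff(s_grid)
    D = np.concatenate([[0.0], np.cumsum(0.5 * (d[1:] + d[:-1]) * dx)])
    ns = np.arange(1, n_neurons + 1)
    s_n = np.interp(ns, D, s_grid)               # invert D at integer n
    h = gain(s_n)[None, :] * np.exp(-(D[:, None] - ns[None, :])**2
                                    / (2 * width**2))
    return h, s_n

s_grid = np.linspace(0.0, 10.0, 2001)
density = lambda s: 0.5 + s / 5.0                # more neurons at high s (toy)
gain = lambda s: np.ones_like(s)                 # uniform gain for simplicity
h, s_n = warped_population(s_grid, density, gain)
# Higher density -> preferred values packed more tightly, narrower curves:
print(np.diff(s_n)[0], np.diff(s_n)[-1])
```

With this toy increasing density, the spacing between preferred values shrinks at high s, mirroring the statement that density controls both spacing and width.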
Table 1: Optimal heterogeneous population properties, for the objective functions specified above.

Here, \( D(s) = \int_{-\infty}^{s} d(t)\, dt \), the cumulative integral of d(s), warps the shape of the prototype tuning curve. The value s_n = D⁻¹(n) represents the preferred stimulus value of the (warped) nth tuning curve (Fig. 1d). Note that the warped population retains the tiling properties of the original convolutional population. As in the uniform case, the density controls both the spacing and width of the tuning curves. We can now write the Fisher information of the heterogeneous population of neurons as approximately proportional to the squared density times the gain, \( I_f(s) \approx I_{conv}\, d^2(s)\, g(s) \). As earlier, the constant I_conv is determined by the precise shape of the tuning curves. To attain the proper rate, we use the fact that the warped tuning curves sum to unity (before multiplication by the gain function) and use Eq. (1) to constrain the gain.
3.3 Objective function and solution for a heterogeneous population

Approximating the Fisher information as proportional to squared density and gain allows us to re-write the objective function and resource constraints in terms of d(s) and g(s). In all cases, the solution specifies a power-law relationship between the prior and the density and gain of the tuning curves. In general, all solutions allocate more neurons, with correspondingly narrower tuning curves, to higher-probability stimuli. The shape of the optimal gain function depends on the objective function: for α < 0, neurons with lower firing rates are used to represent stimuli with higher probabilities, and for α > 0, neurons with higher firing rates are used for stimuli with higher probabilities.
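Table 1 itself did not survive extraction, so the exponents below are assumptions except for the infomax case, which the text restates later: cell density and inverse discrimination thresholds should match the stimulus prior, i.e. d(s) ∝ p(s). A sketch of that allocation:

```python
import numpy as np

def infomax_allocation(prior, s_grid, n_neurons, r_total):
    """Infomax-style allocation sketch: density proportional to the prior,
    d(s) ∝ p(s), and the discriminability bound proportional to 1/p(s)
    (the relationships the text attributes to the infomax column of
    Table 1). Uniform gain is an additional simplifying assumption."""
    ds = s_grid[1] - s_grid[0]
    p = prior / (prior.sum() * ds)               # normalize the prior
    d = n_neurons * p                            # integrates to n_neurons
    delta = 1.0 / p                              # threshold ∝ inverse prior
    g = np.full_like(p, r_total / n_neurons)     # assumed uniform gain
    return d, g, delta

s = np.linspace(-np.pi / 2, np.pi / 2, 361)
prior = 1.0 + 0.5 * np.cos(4 * s)                # toy cardinal-biased prior
d, g, delta = infomax_allocation(prior, s, n_neurons=100, r_total=1000.0)
i_card, i_obl = len(s) // 2, len(s) // 4         # s = 0 (cardinal), s = -pi/4
print(d[i_card] > d[i_obl], delta[i_card] < delta[i_obl])
```

More neurons and lower thresholds land at the cardinal orientations, where the toy prior is largest, matching the qualitative pattern reported in Section 5.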
Figure 2 (excerpt): (c) Orientation discrimination thresholds averaged across four human subjects [24]. (d & e) Infomax and discrimax predictions of the orientation distribution.
In addition to power-law relationships between tuning properties and sensory priors, our formulation offers a direct relationship between the sensory prior and perceptual discriminability.

5 Experimental evidence

Our framework predicts a quantitative link between the sensory prior, physiological parameters (the density, tuning widths, and gain of cells), and psychophysically measured discrimination thresholds. We obtained subsets of these quantities for two visual stimulus variables, orientation and spatial frequency, both of which are believed to be encoded by cells in primary visual cortex (area V1). For each variable, we use the infomax and discrimax solutions to convert the physiological and perceptual measurements, using the appropriate exponents from Table 1, into predictions of the stimulus prior p(s).
5.1 Orientation

We estimated the prior distribution of orientations in the environment by averaging orientation statistics across three natural image databases. The average distribution of orientations exhibits higher probability at the cardinal orientations (vertical and horizontal) than at the oblique orientations (Fig. 2). Measurements of cell density for a population of 79 orientation-tuned V1 cells in macaque [23] show more cells tuned to the cardinal orientations than to the oblique orientations (Fig. 2). Finally, perceptual discrimination thresholds, averaged across four human subjects [24], show a similar bias (Fig. 2c). If a neural population is designed to maximize information, then the cell density and inverse discrimination thresholds should match the stimulus prior, as expressed in the infomax column of Table 1.
Figure 3 (excerpt): (b) Cell density as a function of preferred spatial frequency for a population of 317 V1 cells [25, 28]. Dark blue: average number of cells tuned to each spatial frequency. (d & e) Infomax and discrimax predictions of the spatial frequency distribution.
We see that the predictions arising from cell density and discrimination thresholds are consistent with one another, and both are consistent with the stimulus prior. For the discrimax objective function, the exponents in the power-law relationships (expressed in Table 1) are too small, resulting in poor qualitative agreement between the stimulus prior and predictions from the physiology and perception (Fig. 2e). For example, predicting the prior from perceptual data, under the discrimax objective function, requires exponentiating discrimination thresholds to the fourth power, resulting in an over-exaggeration of the cardinal bias.
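To illustrate the point about exponents (with synthetic threshold data; the exponent of 4 is the discrimax value stated in the text, and the sign of the inversion is an assumption):

```python
import numpy as np

def prior_from_thresholds(thresholds, s_grid, exponent):
    """Invert a power-law link between thresholds and the prior:
    p_hat(s) ∝ thresholds(s)^(-exponent), normalized over s_grid.
    exponent=1 is the infomax case (inverse thresholds match the prior);
    exponent=4 is the discrimax case described in the text. The negative
    sign (low threshold -> high probability) is an assumption."""
    p_hat = thresholds ** (-float(exponent))
    ds = s_grid[1] - s_grid[0]
    return p_hat / (p_hat.sum() * ds)

s = np.linspace(-np.pi / 2, np.pi / 2, 181)
true_prior = 1.0 + 0.5 * np.cos(4 * s)
true_prior /= true_prior.sum() * (s[1] - s[0])
thresholds = 1.0 / true_prior                    # infomax-consistent toy data
p1 = prior_from_thresholds(thresholds, s, exponent=1)
p4 = prior_from_thresholds(thresholds, s, exponent=4)
# Raising thresholds to the fourth power exaggerates the cardinal bias:
print(p1.max() / p1.min(), p4.max() / p4.min())
```

The max/min ratio of the fourth-power prediction is the fourth power of the first-order one, which is why even a modest cardinal bias in the thresholds becomes wildly overstated under the discrimax inversion.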
5.2 Spatial frequency

We obtained a prior distribution over spatial frequencies averaged across two natural image databases [20, 21]. We also obtained spatial frequency tuning properties for a population of 317 V1 cells [25]. On average, we see that there are more cells, with correspondingly narrower tuning widths, tuned to low spatial frequencies (Fig. 3b). These data support the model assumption that tuning width is inversely proportional to cell density. We also obtained average discrimination thresholds for sinusoidal gratings of different spatial frequencies from two studies (Fig. 3). We again test the infomax and discrimax solutions by comparing predicted distributions obtained from the physiological and perceptual data to the measured prior. The infomax case shows striking agreement between the measured stimulus prior and predictions based on the physiological and perceptual measurements (Fig. 3d). However, as in the orientation case, the discrimax predictions are poor (Fig. 3e), suggesting that information maximization provides a better optimality principle than discrimination maximization for explaining the neural and perceptual encoding of spatial frequency.
6 Discussion

We have examined the influence of sensory priors on the optimal allocation of neural resources, as well as the influence of these optimized resources on subsequent perception. Fisher information is known to provide a poor bound on mutual information when there are a small number of neurons, a short decoding time, or non-smooth tuning curves [16, 29]. Our assumptions allow us to approximate the Fisher information in terms of cell density and gain (Fig. 1). Our framework offers an important generalization of the population-coding literature, allowing for non-uniformity of sensory priors, and corresponding heterogeneity in tuning and gain properties. Second, tuning curve encoding models only specify neural responses to single stimulus values. Previous studies assume that prior probabilities are either uniform [6], represented in the spiking activity of a separate population of neurons [5], or represented (in sample form) in the spontaneous activity [35]. Our encoding formulation provides a mechanism whereby the prior is implicitly encoded in the density and gains of the tuning curves, which presumably arise from the strength of synaptic connections.
References (partial; titles recovered from the extraction):
- Narrow versus wide tuning curves: What's best for a population code?
- Optimal tuning widths in population coding of periodic variables.
- Maximally informative stimuli and tuning curves for sigmoidal rate-coding neurons and populations.
- Optimal neural population coding of an auditory spatial cue.
Similar papers (by similarity score):

same-paper 1 0.9999997 119 nips-2010-Implicit encoding of prior probabilities in optimal neural populations
Author: Deep Ganguli, Eero P. Simoncelli
2 0.32152444 161 nips-2010-Linear readout from a neural population with partial correlation data
Author: Adrien Wohrer, Ranulfo Romo, Christian K. Machens
Abstract: How much information does a neural population convey about a stimulus? Answers to this question are known to strongly depend on the correlation of response variability in neural populations. These noise correlations, however, are essentially immeasurable as the number of parameters in a noise correlation matrix grows quadratically with population size. Here, we suggest to bypass this problem by imposing a parametric model on a noise correlation matrix. Our basic assumption is that noise correlations arise due to common inputs between neurons. On average, noise correlations will therefore reflect signal correlations, which can be measured in neural populations. We suggest an explicit parametric dependency between signal and noise correlations. We show how this dependency can be used to ”fill the gaps” in noise correlations matrices using an iterative application of the Wishart distribution over positive definitive matrices. We apply our method to data from the primary somatosensory cortex of monkeys performing a two-alternativeforced choice task. We compare the discrimination thresholds read out from the population of recorded neurons with the discrimination threshold of the monkey and show that our method predicts different results than simpler, average schemes of noise correlations. 1
3 0.27011311 268 nips-2010-The Neural Costs of Optimal Control
Author: Samuel Gershman, Robert Wilson
Abstract: Optimal control entails combining probabilities and utilities. However, for most practical problems, probability densities can be represented only approximately. Choosing an approximation requires balancing the benefits of an accurate approximation against the costs of computing it. We propose a variational framework for achieving this balance and apply it to the problem of how a neural population code should optimally represent a distribution under resource constraints. The essence of our analysis is the conjecture that population codes are organized to maximize a lower bound on the log expected utility. This theory can account for a plethora of experimental data, including the reward-modulation of sensory receptive fields, GABAergic effects on saccadic movements, and risk aversion in decisions under uncertainty. 1
4 0.22286481 81 nips-2010-Evaluating neuronal codes for inference using Fisher information
Author: Haefner Ralf, Matthias Bethge
Abstract: Many studies have explored the impact of response variability on the quality of sensory codes. The source of this variability is almost always assumed to be intrinsic to the brain. However, when inferring a particular stimulus property, variability associated with other stimulus attributes also effectively act as noise. Here we study the impact of such stimulus-induced response variability for the case of binocular disparity inference. We characterize the response distribution for the binocular energy model in response to random dot stereograms and find it to be very different from the Poisson-like noise usually assumed. We then compute the Fisher information with respect to binocular disparity, present in the monocular inputs to the standard model of early binocular processing, and thereby obtain an upper bound on how much information a model could theoretically extract from them. Then we analyze the information loss incurred by the different ways of combining those inputs to produce a scalar single-neuron response. We find that in the case of depth inference, monocular stimulus variability places a greater limit on the extractable information than intrinsic neuronal noise for typical spike counts. Furthermore, the largest loss of information is incurred by the standard model for position disparity neurons (tuned-excitatory), that are the most ubiquitous in monkey primary visual cortex, while more information from the inputs is preserved in phase-disparity neurons (tuned-near or tuned-far) primarily found in higher cortical regions. 1
5 0.21901914 21 nips-2010-Accounting for network effects in neuronal responses using L1 regularized point process models
Author: Ryan Kelly, Matthew Smith, Robert Kass, Tai S. Lee
Abstract: Activity of a neuron, even in the early sensory areas, is not simply a function of its local receptive field or tuning properties, but depends on global context of the stimulus, as well as the neural context. This suggests the activity of the surrounding neurons and global brain states can exert considerable influence on the activity of a neuron. In this paper we implemented an L1 regularized point process model to assess the contribution of multiple factors to the firing rate of many individual units recorded simultaneously from V1 with a 96-electrode “Utah” array. We found that the spikes of surrounding neurons indeed provide strong predictions of a neuron’s response, in addition to the neuron’s receptive field transfer function. We also found that the same spikes could be accounted for with the local field potentials, a surrogate measure of global network states. This work shows that accounting for network fluctuations can improve estimates of single trial firing rate and stimulus-response transfer functions. 1
6 0.14555848 127 nips-2010-Inferring Stimulus Selectivity from the Spatial Structure of Neural Network Dynamics
7 0.11893485 65 nips-2010-Divisive Normalization: Justification and Effectiveness as Efficient Coding Transform
8 0.10718781 96 nips-2010-Fractionally Predictive Spiking Neurons
9 0.1026597 17 nips-2010-A biologically plausible network for the computation of orientation dominance
10 0.088675387 143 nips-2010-Learning Convolutional Feature Hierarchies for Visual Recognition
11 0.085691139 200 nips-2010-Over-complete representations on recurrent neural networks can support persistent percepts
12 0.077668034 44 nips-2010-Brain covariance selection: better individual functional connectivity models using population prior
13 0.074470542 95 nips-2010-Feature Transitions with Saccadic Search: Size, Color, and Orientation Are Not Alike
14 0.070412725 252 nips-2010-SpikeAnts, a spiking neuron network modelling the emergence of organization in a complex system
15 0.070328057 98 nips-2010-Functional form of motion priors in human motion perception
16 0.069272757 20 nips-2010-A unified model of short-range and long-range motion perception
17 0.068883516 157 nips-2010-Learning to localise sounds with spiking neural networks
18 0.066898763 34 nips-2010-Attractor Dynamics with Synaptic Depression
19 0.066094577 115 nips-2010-Identifying Dendritic Processing
20 0.063078143 68 nips-2010-Effects of Synaptic Weight Diffusion on Learning in Decision Making Networks
topicId topicWeight
[(0, 0.174), (1, 0.051), (2, -0.276), (3, 0.217), (4, 0.095), (5, 0.159), (6, -0.049), (7, 0.022), (8, 0.012), (9, -0.025), (10, 0.074), (11, -0.042), (12, -0.038), (13, -0.053), (14, 0.003), (15, -0.042), (16, -0.045), (17, -0.035), (18, -0.02), (19, 0.241), (20, 0.066), (21, -0.236), (22, 0.212), (23, 0.018), (24, 0.002), (25, 0.027), (26, 0.046), (27, 0.137), (28, -0.072), (29, 0.103), (30, -0.053), (31, -0.124), (32, 0.018), (33, -0.001), (34, -0.043), (35, -0.022), (36, 0.069), (37, -0.028), (38, 0.058), (39, 0.005), (40, 0.059), (41, -0.053), (42, -0.106), (43, -0.028), (44, -0.043), (45, 0.083), (46, -0.07), (47, -0.096), (48, -0.038), (49, -0.079)]
simIndex simValue paperId paperTitle
same-paper 1 0.98444879 119 nips-2010-Implicit encoding of prior probabilities in optimal neural populations
Author: Deep Ganguli, Eero P. Simoncelli
Abstract: unkown-abstract
2 0.90451616 81 nips-2010-Evaluating neuronal codes for inference using Fisher information
Author: Haefner Ralf, Matthias Bethge
Abstract: Many studies have explored the impact of response variability on the quality of sensory codes. The source of this variability is almost always assumed to be intrinsic to the brain. However, when inferring a particular stimulus property, variability associated with other stimulus attributes also effectively act as noise. Here we study the impact of such stimulus-induced response variability for the case of binocular disparity inference. We characterize the response distribution for the binocular energy model in response to random dot stereograms and find it to be very different from the Poisson-like noise usually assumed. We then compute the Fisher information with respect to binocular disparity, present in the monocular inputs to the standard model of early binocular processing, and thereby obtain an upper bound on how much information a model could theoretically extract from them. Then we analyze the information loss incurred by the different ways of combining those inputs to produce a scalar single-neuron response. We find that in the case of depth inference, monocular stimulus variability places a greater limit on the extractable information than intrinsic neuronal noise for typical spike counts. Furthermore, the largest loss of information is incurred by the standard model for position disparity neurons (tuned-excitatory), that are the most ubiquitous in monkey primary visual cortex, while more information from the inputs is preserved in phase-disparity neurons (tuned-near or tuned-far) primarily found in higher cortical regions. 1
3 0.8439008 161 nips-2010-Linear readout from a neural population with partial correlation data
Author: Adrien Wohrer, Ranulfo Romo, Christian K. Machens
Abstract: How much information does a neural population convey about a stimulus? Answers to this question are known to strongly depend on the correlation of response variability in neural populations. These noise correlations, however, are essentially immeasurable as the number of parameters in a noise correlation matrix grows quadratically with population size. Here, we suggest to bypass this problem by imposing a parametric model on a noise correlation matrix. Our basic assumption is that noise correlations arise due to common inputs between neurons. On average, noise correlations will therefore reflect signal correlations, which can be measured in neural populations. We suggest an explicit parametric dependency between signal and noise correlations. We show how this dependency can be used to ”fill the gaps” in noise correlations matrices using an iterative application of the Wishart distribution over positive definitive matrices. We apply our method to data from the primary somatosensory cortex of monkeys performing a two-alternativeforced choice task. We compare the discrimination thresholds read out from the population of recorded neurons with the discrimination threshold of the monkey and show that our method predicts different results than simpler, average schemes of noise correlations. 1
4 0.74311841 21 nips-2010-Accounting for network effects in neuronal responses using L1 regularized point process models
Author: Ryan Kelly, Matthew Smith, Robert Kass, Tai S. Lee
Abstract: Activity of a neuron, even in the early sensory areas, is not simply a function of its local receptive field or tuning properties, but depends on global context of the stimulus, as well as the neural context. This suggests the activity of the surrounding neurons and global brain states can exert considerable influence on the activity of a neuron. In this paper we implemented an L1 regularized point process model to assess the contribution of multiple factors to the firing rate of many individual units recorded simultaneously from V1 with a 96-electrode “Utah” array. We found that the spikes of surrounding neurons indeed provide strong predictions of a neuron’s response, in addition to the neuron’s receptive field transfer function. We also found that the same spikes could be accounted for with the local field potentials, a surrogate measure of global network states. This work shows that accounting for network fluctuations can improve estimates of single trial firing rate and stimulus-response transfer functions. 1
5 0.68180758 268 nips-2010-The Neural Costs of Optimal Control
Author: Samuel Gershman, Robert Wilson
Abstract: Optimal control entails combining probabilities and utilities. However, for most practical problems, probability densities can be represented only approximately. Choosing an approximation requires balancing the benefits of an accurate approximation against the costs of computing it. We propose a variational framework for achieving this balance and apply it to the problem of how a neural population code should optimally represent a distribution under resource constraints. The essence of our analysis is the conjecture that population codes are organized to maximize a lower bound on the log expected utility. This theory can account for a plethora of experimental data, including the reward-modulation of sensory receptive fields, GABAergic effects on saccadic movements, and risk aversion in decisions under uncertainty.
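The "lower bound on the log expected utility" is an instance of Jensen's inequality, log E[U] >= E[log U], which a few lines of Monte Carlo can verify on a toy setup (the stimulus prior and utility function below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200_000)                # samples from a toy stimulus prior
U = np.exp(-0.5 * (x - 1.0) ** 2)           # a strictly positive toy utility
log_EU = np.log(U.mean())                   # log expected utility (often intractable)
bound = np.log(U).mean()                    # Jensen lower bound: E[log U] <= log E[U]
```

Maximizing the tractable quantity `bound` with respect to code parameters, rather than `log_EU` itself, is the kind of substitution the variational framework formalizes.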
6 0.62636817 127 nips-2010-Inferring Stimulus Selectivity from the Spatial Structure of Neural Network Dynamics
7 0.528763 17 nips-2010-A biologically plausible network for the computation of orientation dominance
8 0.49805784 157 nips-2010-Learning to localise sounds with spiking neural networks
9 0.48006254 95 nips-2010-Feature Transitions with Saccadic Search: Size, Color, and Orientation Are Not Alike
10 0.45839351 34 nips-2010-Attractor Dynamics with Synaptic Depression
11 0.42893928 19 nips-2010-A rational decision making framework for inhibitory control
12 0.40298176 96 nips-2010-Fractionally Predictive Spiking Neurons
13 0.3995207 200 nips-2010-Over-complete representations on recurrent neural networks can support persistent percepts
14 0.39870507 121 nips-2010-Improving Human Judgments by Decontaminating Sequential Dependencies
15 0.39737576 252 nips-2010-SpikeAnts, a spiking neuron network modelling the emergence of organization in a complex system
16 0.39176133 65 nips-2010-Divisive Normalization: Justification and Effectiveness as Efficient Coding Transform
17 0.36858398 3 nips-2010-A Bayesian Framework for Figure-Ground Interpretation
18 0.32309851 82 nips-2010-Evaluation of Rarity of Fingerprints in Forensics
19 0.30332962 263 nips-2010-Switching state space model for simultaneously estimating state transitions and nonstationary firing rates
20 0.29500145 266 nips-2010-The Maximal Causes of Natural Scenes are Edge Filters
topicId topicWeight
[(13, 0.019), (17, 0.016), (27, 0.601), (30, 0.037), (35, 0.01), (45, 0.109), (50, 0.02), (52, 0.029), (60, 0.016), (77, 0.04), (90, 0.026)]
simIndex simValue paperId paperTitle
same-paper 1 0.95471531 119 nips-2010-Implicit encoding of prior probabilities in optimal neural populations
Author: Deep Ganguli, Eero P. Simoncelli
Abstract: unknown-abstract
2 0.92717946 128 nips-2010-Infinite Relational Modeling of Functional Connectivity in Resting State fMRI
Author: Morten Mørup, Kristoffer Madsen, Anne-marie Dogonowski, Hartwig Siebner, Lars K. Hansen
Abstract: Functional magnetic resonance imaging (fMRI) can be applied to study the functional connectivity of the neural elements which form complex networks at a whole-brain level. Most analyses of functional resting state networks (RSN) have been based on the analysis of correlation between the temporal dynamics of various regions of the brain. While these models can identify coherently behaving groups in terms of correlation, they give little insight into how these groups interact. In this paper we take a different view on the analysis of functional resting state networks. Starting from the definition of resting state as functionally coherent groups, we search for functional units of the brain that communicate with other parts of the brain in a coherent manner as measured by mutual information. We use the infinite relational model (IRM) to quantify functionally coherent groups of resting state networks and demonstrate how the extracted component interactions can be used to discriminate between functional resting state activity in multiple sclerosis and normal subjects.
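The "coherent manner as measured by mutual information" criterion can be illustrated with a simple histogram-based MI estimate between two regional time series. This is a rough sketch; real fMRI analyses would use more careful estimators and bias corrections.

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Plug-in histogram estimate of mutual information (in nats)
    between two time series."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # joint distribution over bins
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                          # avoid log(0) on empty bins
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))
```

Under this measure, a region whose signal is coupled to another scores clearly higher than an independent one, which is the property the IRM grouping exploits.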
3 0.90552294 60 nips-2010-Deterministic Single-Pass Algorithm for LDA
Author: Issei Sato, Kenichi Kurihara, Hiroshi Nakagawa
Abstract: We develop a deterministic single-pass algorithm for latent Dirichlet allocation (LDA) in order to process received documents one at a time and then discard them in an excess text stream. Our algorithm does not need to store old statistics for all data. The proposed algorithm is much faster than a batch algorithm and is comparable to the batch algorithm in terms of perplexity in experiments.
4 0.87523639 121 nips-2010-Improving Human Judgments by Decontaminating Sequential Dependencies
Author: Harold Pashler, Matthew Wilder, Robert Lindsey, Matt Jones, Michael C. Mozer, Michael P. Holmes
Abstract: For over half a century, psychologists have been struck by how poor people are at expressing their internal sensations, impressions, and evaluations via rating scales. When individuals make judgments, they are incapable of using an absolute rating scale, and instead rely on reference points from recent experience. This relativity of judgment limits the usefulness of responses provided by individuals to surveys, questionnaires, and evaluation forms. Fortunately, the cognitive processes that transform internal states to responses are not simply noisy, but rather are influenced by recent experience in a lawful manner. We explore techniques to remove sequential dependencies, and thereby decontaminate a series of ratings to obtain more meaningful human judgments. In our formulation, decontamination is fundamentally a problem of inferring latent states (internal sensations) which, because of the relativity of judgment, have temporal dependencies. We propose a decontamination solution using a conditional random field with constraints motivated by psychological theories of relative judgment. Our exploration of decontamination models is supported by two experiments we conducted to obtain ground-truth rating data on a simple length estimation task. Our decontamination techniques yield an over 20% reduction in the error of human judgments.
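A toy version of the decontamination idea, under a deliberately simplified generative model (not the paper's conditional random field): each observed rating mixes the true sensation with the previous response, and inverting that recursion recovers the latent series. The mixing weight `alpha` is an assumed, known parameter here; the paper's contribution is inferring the latent states when no such clean inversion exists.

```python
import numpy as np

def contaminate(latent, alpha=0.7):
    """Forward model: each observed rating drifts toward the previous response."""
    obs = [latent[0]]
    for s in latent[1:]:
        obs.append(alpha * s + (1 - alpha) * obs[-1])
    return np.array(obs)

def decontaminate(obs, alpha=0.7):
    """Invert the recursion above to recover the latent sensations."""
    latent = [obs[0]]
    for t in range(1, len(obs)):
        latent.append((obs[t] - (1 - alpha) * obs[t - 1]) / alpha)
    return np.array(latent)
```

Because the forward model is a known deterministic recursion, the inversion is exact; with noise or an unknown `alpha`, probabilistic inference over the latent states becomes necessary.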
5 0.840002 39 nips-2010-Bayesian Action-Graph Games
Author: Albert X. Jiang, Kevin Leyton-brown
Abstract: Games of incomplete information, or Bayesian games, are an important game-theoretic model and have many applications in economics. We propose Bayesian action-graph games (BAGGs), a novel graphical representation for Bayesian games. BAGGs can represent arbitrary Bayesian games, and furthermore can compactly express Bayesian games exhibiting commonly encountered types of structure including symmetry, action- and type-specific utility independence, and probabilistic independence of type distributions. We provide an algorithm for computing expected utility in BAGGs, and discuss conditions under which the algorithm runs in polynomial time. Bayes-Nash equilibria of BAGGs can be computed by adapting existing algorithms for complete-information normal form games and leveraging our expected utility algorithm. We show both theoretically and empirically that our approaches improve significantly on the state of the art.
6 0.81636739 266 nips-2010-The Maximal Causes of Natural Scenes are Edge Filters
7 0.74552161 81 nips-2010-Evaluating neuronal codes for inference using Fisher information
8 0.70714164 161 nips-2010-Linear readout from a neural population with partial correlation data
9 0.68337446 6 nips-2010-A Discriminative Latent Model of Image Region and Object Tag Correspondence
10 0.67964637 127 nips-2010-Inferring Stimulus Selectivity from the Spatial Structure of Neural Network Dynamics
11 0.64794916 21 nips-2010-Accounting for network effects in neuronal responses using L1 regularized point process models
12 0.64085478 97 nips-2010-Functional Geometry Alignment and Localization of Brain Areas
13 0.6159054 98 nips-2010-Functional form of motion priors in human motion perception
14 0.60185415 194 nips-2010-Online Learning for Latent Dirichlet Allocation
15 0.60027206 268 nips-2010-The Neural Costs of Optimal Control
16 0.595061 19 nips-2010-A rational decision making framework for inhibitory control
17 0.59005803 244 nips-2010-Sodium entry efficiency during action potentials: A novel single-parameter family of Hodgkin-Huxley models
18 0.58883065 44 nips-2010-Brain covariance selection: better individual functional connectivity models using population prior
19 0.58101171 123 nips-2010-Individualized ROI Optimization via Maximization of Group-wise Consistency of Structural and Functional Profiles
20 0.5756861 17 nips-2010-A biologically plausible network for the computation of orientation dominance