nips nips2008 nips2008-45 knowledge-graph by maker-knowledge-mining

45 nips-2008-Characterizing neural dependencies with copula models


Source: pdf

Author: Pietro Berkes, Frank Wood, Jonathan W. Pillow

Abstract: The coding of information by neural populations depends critically on the statistical dependencies between neuronal responses. However, there is no simple model that can simultaneously account for (1) marginal distributions over single-neuron spike counts that are discrete and non-negative; and (2) joint distributions over the responses of multiple neurons that are often strongly dependent. Here, we show that both marginal and joint properties of neural responses can be captured using copula models. Copulas are joint distributions that allow random variables with arbitrary marginals to be combined while incorporating arbitrary dependencies between them. Different copulas capture different kinds of dependencies, allowing for a richer and more detailed description of dependencies than traditional summary statistics, such as correlation coefficients. We explore a variety of copula models for joint neural response distributions, and derive an efficient maximum likelihood procedure for estimating them. We apply these models to neuronal data collected in macaque pre-motor cortex, and quantify the improvement in coding accuracy afforded by incorporating the dependency structure between pairs of neurons. We find that more than one third of neuron pairs show dependencies concentrated in the lower or upper tails of their firing rate distributions.

Reference: text


Summary: the most important sentences generated by tfidf model

sentIndex sentText sentNum sentScore

1 Characterizing neural dependencies with copula models Pietro Berkes, Volen Center for Complex Systems, Brandeis University, Waltham, MA 02454, berkes@brandeis.edu [sent-1, score-1.042]

2 (Abstract) The coding of information by neural populations depends critically on the statistical dependencies between neuronal responses. [sent-5, score-0.225]

3 However, there is no simple model that can simultaneously account for (1) marginal distributions over single-neuron spike counts that are discrete and non-negative; and (2) joint distributions over the responses of multiple neurons that are often strongly dependent. [sent-6, score-0.448]

4 Here, we show that both marginal and joint properties of neural responses can be captured using copula models. [sent-7, score-1.104]

5 Copulas are joint distributions that allow random variables with arbitrary marginals to be combined while incorporating arbitrary dependencies between them. [sent-8, score-0.347]

6 Different copulas capture different kinds of dependencies, allowing for a richer and more detailed description of dependencies than traditional summary statistics, such as correlation coefficients. [sent-9, score-0.369]

7 We explore a variety of copula models for joint neural response distributions, and derive an efficient maximum likelihood procedure for estimating them. [sent-10, score-1.019]

8 We apply these models to neuronal data collected in macaque pre-motor cortex, and quantify the improvement in coding accuracy afforded by incorporating the dependency structure between pairs of neurons. [sent-11, score-0.281]

9 We find that more than one third of neuron pairs show dependencies concentrated in the lower or upper tails of their firing rate distributions. [sent-12, score-0.278]

10 The stochastic spiking activity of individual neurons in cortex is often well described by a Poisson distribution. [sent-14, score-0.173]

11 Responses from multiple neurons also exhibit strong dependencies (i.e., they are correlated). [sent-15, score-0.228]

12 Recent work has focused on the construction of large parametric models that capture inter-neuronal dependencies using generalized linear point-process models [5, 6, 7, 8, 9] and binary second-order maximum-entropy models [10, 11, 12]. [sent-22, score-0.228]

13 Although these approaches are quite powerful, they model spike trains only in very fine time bins, and thus describe the dependencies in neural spike count distributions only implicitly. [sent-23, score-0.289]

14 Modeling the joint distribution of neural activities is therefore an important open problem. [sent-24, score-0.122]

15 Here we show how to construct non-independent joint distributions over firing rates using copulas. [sent-25, score-0.108]

16 Figure 1 caption (top row): The marginal distributions (the leftmost marginal is uniform, by definition of a copula). [sent-28, score-0.196]

17 Finally, in Section 6 we review the insights provided by neural copula models and discuss several extensions and future directions. [sent-31, score-0.913]

18 A copula C(u1, . . . , un) : [0, 1]^n → [0, 1] is a multivariate distribution function on the unit cube with uniform marginals [13, 14]. [sent-35, score-0.265]

19 The basic idea behind copulas is quite simple, and is closely related to that of histogram equalization: for a random variable yi with continuous cumulative distribution function (cdf) Fi , the random variable ui := Fi (yi ) is uniformly distributed on the interval [0, 1]. [sent-36, score-0.351]
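
This probability-integral transform is easy to check numerically. A minimal sketch (not from the paper; the gamma distribution and all names are illustrative): applying a continuous cdf to samples from that same distribution yields values that are uniform on [0, 1].

```python
# Sketch: the probability-integral transform behind copulas.
# For a continuous random variable y with cdf F, u := F(y) is Uniform(0, 1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.gamma(shape=2.0, scale=1.5, size=100_000)  # any continuous variable
u = stats.gamma(a=2.0, scale=1.5).cdf(y)           # u := F(y)

# u should be statistically indistinguishable from Uniform(0, 1)
print(stats.kstest(u, "uniform").pvalue)           # large p-value expected
```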

20 Given marginal distribution functions F1, . . . , Fn and joint distribution F, there exists a unique copula C such that for all ui: C(u1, . . . , un) = F(F1^{−1}(u1), . . . , Fn^{−1}(un)) (1). [sent-44, score-0.965]

21 Conversely, given a copula C and marginal cdfs, C(F1(y1), . . . , Fn(yn)) (2) is an n-variate distribution function with marginal distribution functions F1, . . . , Fn. [sent-59, score-0.134]

22 This result gives a way to derive a copula given the joint and marginal distributions (using Eq. 1). [sent-63, score-1.041]
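
As a concrete illustration of Eq. 2, the following sketch builds a joint distribution with a fixed dependency structure (a Gaussian copula) and arbitrary marginals (here Poisson, with the λ1 = 2, λ2 = 3 used later in this summary). The function name and parameters are ours, not the paper's; the scheme mirrors the generative model shown later in Figure 3.

```python
# Sketch of the Sklar-style construction: copula for the dependencies,
# inverse marginal cdfs for the (discrete) margins.
import numpy as np
from scipy import stats

def sample_gauss_copula_poisson(rho, lambdas, n, rng):
    """Draw n samples with Gaussian-copula dependence and Poisson marginals."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(np.zeros(2), cov, size=n)  # latent Gaussian
    u = stats.norm.cdf(z)                                  # uniform marginals
    # quantile (inverse-cdf) transform imposes the desired discrete marginals
    y = np.column_stack([stats.poisson(mu).ppf(u[:, i])
                         for i, mu in enumerate(lambdas)])
    return y.astype(int)

rng = np.random.default_rng(1)
y = sample_gauss_copula_poisson(rho=0.6, lambdas=(2.0, 3.0), n=5000, rng=rng)
print(np.corrcoef(y.T)[0, 1])  # positive count correlation induced by the copula
```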

23 More importantly here, it also gives a way to construct a joint distribution by specifying the marginal distributions and the dependency structure separately (Eq. 2). [sent-64, score-0.296]

24 For example, one can keep the dependency structure fixed and vary the marginals (Fig. 1). [sent-66, score-0.207]

25 Or, vice versa, given fixed marginal distributions one can define new joint distributions using parametrized copula families (Fig. 2). [sent-67, score-1.219]

26 Since copulas do not depend on the marginals, one can define in this way dependency measures that are insensitive to non-linear transformations of the individual variables [14] and generalize correlation coefficients, which are only appropriate for elliptic distributions. [sent-71, score-0.305]

27 The copula representation has also been used to estimate the conditional entropy of neural latencies by separating the contribution of the individual latencies from that coming from their correlations [16]. [sent-72, score-1.007]

28 One notable example is the Gaussian copula, which generalizes the dependency structure of the multivariate Gaussian distribution to arbitrary marginal distributions (Fig. 1). [sent-74, score-0.28]

29 It is defined as C(u1, u2; Σ) = ΦΣ(φ^{−1}(u1), φ^{−1}(u2)) (3). Figure 2 caption: Samples drawn from a joint distribution with fixed Gaussian marginals and dependency structure defined by parametric copula families, as indicated by the labels. [sent-75, score-1.176]

30 Here φ(u) is the cdf of the standard univariate Gaussian (mean 0, variance 1), and ΦΣ is the cdf of a multivariate Gaussian with mean 0 and covariance matrix Σ. [sent-80, score-0.166]
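
Eq. 3 can be evaluated directly with standard numerical routines. A sketch assuming scipy's multivariate normal cdf as the backend (the helper name is illustrative); this cdf call is the expensive numerical step mentioned later in the text.

```python
# Sketch of Eq. 3: the bivariate Gaussian copula cdf.
import numpy as np
from scipy import stats

def gaussian_copula_cdf(u1, u2, rho):
    """C(u1, u2; rho) = Phi_Sigma(phi^{-1}(u1), phi^{-1}(u2))."""
    x = stats.norm.ppf([u1, u2])  # phi^{-1}: standard-normal quantiles
    cov = [[1.0, rho], [rho, 1.0]]
    return stats.multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf(x)

print(gaussian_copula_cdf(0.3, 0.7, rho=0.5))
```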

31 Other families derive from the economics literature, and are typically one-parameter families that capture various possible dependencies, for example dependencies only in one of the tails of the distribution. [sent-81, score-0.466]

32 Table 1 shows the definitions of the copula distributions used in this paper (see [14] for an overview of known copulas and copula construction methods). [sent-82, score-1.956]
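
Since Table 1 itself is not reproduced in this summary, the sketch below gives the standard textbook closed forms of the three one-parameter families shortlisted later (Frank, Clayton, Gumbel); the parametrization may differ in detail from the paper's Table 1.

```python
# Standard closed-form cdfs of three Archimedean copula families (u, v in (0, 1]).
import numpy as np

def clayton_cdf(u, v, theta):   # lower-tail dependence, theta > 0
    return (u**(-theta) + v**(-theta) - 1.0) ** (-1.0 / theta)

def gumbel_cdf(u, v, theta):    # upper-tail dependence, theta >= 1
    return np.exp(-(((-np.log(u))**theta + (-np.log(v))**theta) ** (1.0 / theta)))

def frank_cdf(u, v, theta):     # symmetric, Gaussian-like, theta != 0
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    return -np.log1p(num / (np.exp(-theta) - 1.0)) / theta

print(clayton_cdf(0.3, 0.7, 2.0), gumbel_cdf(0.3, 0.7, 2.0), frank_cdf(0.3, 0.7, 2.0))
```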

33 (Section 3, Maximum Likelihood estimation for discrete marginal distributions) In the case where the random variables have discrete distribution functions, as for neural firing rates, only a weaker version of Theorem 1 is valid: a copula satisfying the theorem always exists, but it is not unique. [sent-83, score-1.113]

34 With discrete data, the probability of a particular outcome is determined by an integral over the region of [0, 1]^n corresponding to that outcome; any two copulas that integrate to the same values on all such regions produce the same joint distribution. [sent-85, score-0.291]

35 These marginals can be given by the empirical cumulative distribution of firing rates (as in this paper) or by any parametrized family of univariate distributions (such as Poisson). [sent-88, score-0.296]

36 The ML estimate maximizes the copula mass on the data rectangles: argmax_θ p(y|θ) = argmax_θ ∫_{F1(y1−1)}^{F1(y1)} ··· ∫_{Fn(yn−1)}^{Fn(yn)} cθ(u1, . . . , un) du (5), where the copula density is p(u|θ) = cθ(u1, . . . , un). [sent-92, score-0.109]

37 Figure 3 caption: Graphical representation of the copula model with discrete marginals. [sent-101, score-0.884]

38 Uniform marginals u are drawn from the copula density function cθ(u1, . . . , un). [sent-102, score-0.965]

39 The discrete responses are then generated deterministically using the inverse cdfs of the marginals, which are parametrized by λ. [sent-106, score-0.25]

40 Figure 4 caption: Distribution of the maximum likelihood estimates of the parameters of four copula families, for various settings of their parameter (x-axis). [sent-111, score-0.891]

41 The last equation is the copula probability mass inside the volume defined by the vertices Fi(yi) and Fi(yi − 1), and can be readily computed using the copula distribution Cθ(u1, . . . , un). [sent-115, score-1.759]

42 For example, in the bivariate case one obtains argmax_θ p(y1, y2|θ) = argmax_θ [Cθ(u1, u2) + Cθ(u1−, u2−) − Cθ(u1−, u2) − Cθ(u1, u2−)] (6), where ui = Fi(yi) and ui− = Fi(yi − 1). [sent-119, score-0.145]

43 Given neural data in the form of firing rates y1 , y2 from a pair of neurons, we collect the empirical cumulative histogram of responses, Fi (k) = P (yi ≤ k). [sent-122, score-0.15]

44 The data is then transformed through the cdfs ui = Fi(yi), and the copula model is fit according to Eq. 6. [sent-123, score-0.925]
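
A sketch of this fitting pipeline under stated assumptions: empirical marginal cdfs, the transform ui = Fi(yi), and a bounded one-dimensional likelihood search over a Frank-copula parameter via Eq. 6. The toy data (positively dependent counts with Poisson(2) and Poisson(3) marginals) and all names are illustrative, not the paper's; swapping in any other copula family with a tractable cdf only requires replacing frank_cdf.

```python
# Sketch: ML fit of a one-parameter copula to discrete counts via Eq. 6.
import numpy as np
from scipy.optimize import minimize_scalar

def frank_cdf(u, v, theta):
    """Frank copula cdf (standard closed form), theta != 0."""
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    return -np.log1p(num / (np.exp(-theta) - 1.0)) / theta

def rectangle_mass(theta, u1, u1m, u2, u2m):
    """Eq. 6: copula mass on [F1(y1-1), F1(y1)] x [F2(y2-1), F2(y2)]."""
    return (frank_cdf(u1, u2, theta) + frank_cdf(u1m, u2m, theta)
            - frank_cdf(u1m, u2, theta) - frank_cdf(u1, u2m, theta))

def empirical_cdf(y):
    """F(k) = P(y <= k) from observed spike counts."""
    return np.cumsum(np.bincount(y)) / y.size

# Toy spike counts with positive dependence via a shared Poisson component
rng = np.random.default_rng(0)
common = rng.poisson(1.0, 4000)
y1 = common + rng.poisson(1.0, 4000)  # marginally Poisson(2)
y2 = common + rng.poisson(2.0, 4000)  # marginally Poisson(3)

F1, F2 = empirical_cdf(y1), empirical_cdf(y2)
u1, u2 = F1[y1], F2[y2]                                 # u_i = F_i(y_i)
u1m = np.where(y1 > 0, F1[np.maximum(y1 - 1, 0)], 0.0)  # F_i(y_i - 1), F_i(-1) = 0
u2m = np.where(y2 > 0, F2[np.maximum(y2 - 1, 0)], 0.0)

def nll(theta):
    p = rectangle_mass(theta, u1, u1m, u2, u2m)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))     # guard vanishing masses

theta_hat = minimize_scalar(nll, bounds=(0.05, 30.0), method="bounded").x
print(f"fitted Frank theta: {theta_hat:.2f}")
```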

45 If a parametric distribution family is used for the marginals, the parameters of the copula θ and those of the marginals λ can be estimated simultaneously, or alternatively λ can be fitted first, followed by θ. [sent-125, score-1.04]

46 We checked for biases in ML estimation due to a limited amount of data and low firing rates by generating data from the discrete copula model (Fig. 3). [sent-127, score-0.902]

47 We did this for a number of copula families and Poisson marginals with parameters λ1 = 2, λ2 = 3. [sent-128, score-1.076]

48 Inaccuracy in the estimation becomes larger as the copulas approach functional dependency (i.e., u2 = f(u1) for a deterministic function f). [sent-132, score-0.308]

49 This is the case for the Gaussian copula when ρ tends to 1, and for the Gumbel copula as θ goes to infinity. [sent-134, score-0.173]

50 Figure 5 caption: Empirical joint distribution and copula fit for two neuron pairs. [sent-138, score-0.985]

51 The top row shows two neurons that have dependencies mainly in the upper tails of their marginal distributions. [sent-139, score-0.442]

52 (Section 4, Results) To demonstrate the ability of copula models to fit joint firing rate distributions, we model neural data recorded using a multi-electrode array implanted in the pre-motor cortex (PMd) of a macaque monkey [18, 19]. [sent-147, score-1.109]

53 We fit the copula model using the marginal distribution of neural activity over the entire recording session, including data recorded between trials (i.e., without conditioning on a stimulus). [sent-151, score-1.041]

54 While neural coding is often analyzed through the stimulus-conditional response distribution, the marginal response distribution is an important statistical object in its own right, and has been the focus of much recent literature [10, 11]. [sent-156, score-0.147]

55 For example, the joint activity across neurons, averaged over stimuli, is the only distribution the brain has access to, and must be sufficient for learning to construct representations of the external world. [sent-157, score-0.099]

56 We collected spike responses in 100ms bins, and selected at random, without repetition, a training set of 4000 bins and a test set of 2000 bins. [sent-158, score-0.138]

57 Out of a total of 194 neurons, we selected a subset of 33 neurons that fired a minimum of 2500 spikes over the whole data set. [sent-159, score-0.198]

58 For every pair of neurons in this subset (528 pairs), we fit the parameters of several copula families to the joint firing rate. [sent-160, score-1.136]

59 Figure 5 shows two examples of the kinds of dependencies present in the data set and how they are fit by different copula families. [sent-161, score-0.985]

60 This is confirmed by the empirical copula, which shows the probability mass in the regions defined by the cdfs of the marginal distributions. [sent-163, score-0.156]

61 Since the marginal cdfs are discrete, the data is projected onto a discrete set of points on the unit cube; the colors in the empirical copula plots represent the probability mass in the regions where the marginal cdfs are constant. [sent-164, score-1.182]

62 The axes in the empirical copula plots should be interpreted as the quantiles of the marginal distributions (for example, a value of 0.5 on an axis corresponds to the median firing rate). [sent-165, score-0.986]

63 The higher probability mass in the upper right corner of the plot means that the two neurons tend to be in the upper tails of their distributions simultaneously, and thus to have high firing rates together. [sent-167, score-0.302]
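
A sketch (illustrative, not the paper's code) of how such an empirical copula plot can be computed: accumulate the joint mass over count pairs and place the cell edges at the marginal cdf values, so that the axes are in quantile units.

```python
# Sketch: empirical copula mass on the grid where the marginal cdfs are constant.
import numpy as np

def empirical_copula_mass(y1, y2):
    """Joint mass over (count1, count2) cells, with quantile-valued cell edges."""
    joint = np.zeros((y1.max() + 1, y2.max() + 1))
    np.add.at(joint, (y1, y2), 1.0)  # histogram of count pairs
    joint /= y1.size
    edges1 = np.concatenate([[0.0], np.cumsum(joint.sum(axis=1))])  # F1 values
    edges2 = np.concatenate([[0.0], np.cumsum(joint.sum(axis=0))])  # F2 values
    return joint, edges1, edges2

# e.g. with matplotlib: plt.pcolormesh(edges2, edges1, joint) reproduces the
# quantile-axis heat map described above.
```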

64 On the right, one can see that this dependency structure is well captured by the Gumbel copula fit. [sent-168, score-0.969]

65 Although this is not readily visible in the joint histogram, the dependency becomes clear in the empirical copula plot. [sent-170, score-1.007]

66 This structure is captured by the Frank copula fit. [sent-171, score-0.887]

67 The goodness-of-fit of the copula families is evaluated by cross-validation: We fit different models on training data, and compute the log-likelihood of test data under the fitted model. [sent-175, score-0.984]

68 The models are scored according to the difference between the log-likelihood of the copula model and the log-likelihood of a model that assumes independent neurons. [sent-176, score-0.972]

69 This measure (appropriately renormalized) can be interpreted as the number of bits per second that can be saved when coding the firing rate by taking into account the dependencies encoded by the copula family. [sent-177, score-1.012]
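
A sketch of this score under stated assumptions: the exact renormalization is not spelled out in this summary, so we assume per-bin averaging of natural-log likelihoods and the 100ms bins mentioned above; all names are illustrative.

```python
# Sketch: cross-validated coding gain, in bits per second.
import numpy as np

def bits_per_second_saved(loglik_copula, loglik_indep, n_test_bins, bin_s=0.1):
    """Test log-likelihood gain of the copula model over independence."""
    delta_nats = (loglik_copula - loglik_indep) / n_test_bins  # nats per bin
    return delta_nats / np.log(2.0) / bin_s                    # bits per second
```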

70 We took particular care in selecting a small set of copula families that would be able to capture the dependencies occurring in the data. [sent-179, score-1.113]

71 Some of the families that we considered at first capture similar kinds of dependencies, and their scores are highly correlated. [sent-180, score-0.128]

72 For example, the Frank and Gaussian copulas are able to represent both positive and negative dependencies in the data, simultaneously in the lower and upper tails, although the dependencies in the tails are less strong for the Frank family (compare the corresponding copula densities). [sent-181, score-1.46]

73 An advantage of the Frank copula is that it is much more efficient to fit, since the Gaussian copula requires multiple evaluations of the bivariate Gaussian cdf, which involves expensive numerical calculations. [sent-185, score-1.764]

74 In addition, the Gaussian copula was found to be more prone to overfitting on this data set. [sent-186, score-0.856]

75 With similar procedures we shortlisted a total of 3 families that cover the vast majority of dependencies in our data set: Frank, Clayton, and Gumbel copulas. [sent-189, score-0.24]

76 Examples of the copula densities of these families are shown in the figures above. [sent-190, score-0.967]

77 The Clayton and Gumbel copulas describe dependencies in the lower and upper tails of the distributions, respectively. [sent-192, score-0.444]

78 We did not find any neuron pair whose dependency was concentrated in the upper tail for one distribution and in the lower tail for the other, nor any more complicated dependency structures. [sent-193, score-0.236]

79 Dependencies in the data set thus seem to be widespread, despite the fact that individual neurons are recorded from electrodes that are up to 4 mm apart. [sent-196, score-0.139]

80 The most common dependency structure over all neuron pairs is the Gaussian-like dependency of the Frank copula (54% of the pairs). [sent-200, score-1.188]

81 Interestingly, a large proportion of the neuron pairs showed dependencies concentrated in the upper tails (Gumbel copula, 22%) or lower tails (Clayton copula, 16%) of the distributions. [sent-201, score-0.47]

82 (Footnote 1) We computed the significance level by generating an artificial data set using independent neurons with the same empirical pdf as the monkey data. [sent-203, score-0.146]

83 Figure caption (right panel): Pie chart of the copula families that best fit the neuron pairs. [sent-210, score-1.014]

84 (Section 5, Discussion) The results presented here show that it is possible to represent neuronal spike responses using a model that preserves discrete, non-negative marginals while incorporating various types of dependencies between neurons. [sent-211, score-0.385]

85 However, many copula families have only one or two parameters, regardless of the copula dimensionality. [sent-215, score-1.823]

86 If the dependency structure across a neural population is relatively homogeneous, then these copulas may be useful in that they can be estimated using far less data than would otherwise be required. [sent-216, score-0.384]

87 On the other hand, if the dependencies within a population vary markedly for different pairs of neurons (as in the data set examined here), such copulas will lack the flexibility to capture the complicated dependencies within a full population. [sent-219, score-0.647]

88 In such cases, we can still apply the Gaussian copula (and other copulas derived from elliptically symmetric distributions), since it is parametrized by the same covariance matrix as an n-dimensional Gaussian. [sent-220, score-1.109]

89 However, the Gaussian copula becomes prohibitively expensive to fit in high dimensions, since evaluating the likelihood requires an exponential number of evaluations of the multivariate Gaussian cdf, which itself must be computed numerically. [sent-221, score-0.921]
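
The exponential cost follows from the n-dimensional generalization of Eq. 6: the mass on one data rectangle is an inclusion-exclusion sum over all 2^n vertices, each requiring one copula-cdf evaluation. A sketch (illustrative helper; assumes lo_i < hi_i for all i):

```python
# Sketch: rectangle probability under an n-dimensional copula by inclusion-exclusion.
import itertools
import numpy as np

def rectangle_mass_nd(copula_cdf, lo, hi):
    """Mass on prod_i [lo_i, hi_i]: signed sum over the 2^n corners."""
    total = 0.0
    for corner in itertools.product(*zip(lo, hi)):         # 2^n vertices
        n_lower = sum(c == l for c, l in zip(corner, lo))  # coords at lower edges
        total += (-1.0) ** n_lower * copula_cdf(np.array(corner))
    return total
```

For the Gaussian copula, each of these 2^n calls is itself a numerically integrated multivariate normal cdf, which is what makes high-dimensional fitting prohibitive.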

90 One challenge for future work will therefore be to design new parametric families of copulas whose parameters grow with the number of neurons, but remain tractable enough for maximum-likelihood estimation. [sent-222, score-0.35]

91 Recently, Kirshner [20] proposed a copula-based representation for multivariate distributions using a model that averages over tree-structured copula distributions. [sent-223, score-0.94]

92 The basic idea is that pairwise copulas can be easily combined to produce a tree-structured representation of a multivariate distribution, and that averaging over such trees gives an even more flexible class of multivariate distributions. [sent-224, score-0.304]

93 Another future challenge is to combine explicit models of the stimulus-dependence underlying neural responses with models capable of capturing their joint response dependencies. [sent-226, score-0.207]

94 The data set analyzed here concerned the distribution over spike responses during all stimulus conditions. [sent-227, score-0.127]

95 That is, it concerned the marginal distribution over responses, as opposed to the conditional response distribution given a stimulus. [sent-229, score-0.154]

96 Although this marginal response distribution is interesting in its own right, for many applications one is interested in separating correlations that are induced by external stimuli from internal correlations due to the network interactions. [sent-230, score-0.241]

97 One obvious approach is to consider a hybrid model with a Linear-Nonlinear-Poisson model [21] capturing stimulus-induced correlations, adjoined to a copula distribution that models the residual dependencies between neurons. [sent-231, score-1.142]

98 The LNP part of the model removes stimulus-induced correlations from the neural data, so that the copula model can take into account residual network-related dependencies. [sent-237, score-0.943]

99 A point process framework for relating neural spiking activity to spiking history, neural ensemble and extrinsic covariate effects. [sent-283, score-0.167]

100 Bayesian inference for spiking neuron models with a sparsity prior. [sent-298, score-0.099]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('copula', 0.856), ('copulas', 0.208), ('frank', 0.132), ('dependencies', 0.129), ('families', 0.111), ('marginals', 0.109), ('ring', 0.101), ('neurons', 0.099), ('gumbel', 0.09), ('tails', 0.084), ('dependency', 0.082), ('marginal', 0.08), ('clayton', 0.078), ('fi', 0.067), ('un', 0.062), ('fn', 0.059), ('responses', 0.058), ('fellows', 0.056), ('pindep', 0.056), ('joint', 0.055), ('bivariate', 0.052), ('cdf', 0.051), ('histogram', 0.049), ('multivariate', 0.048), ('neuron', 0.047), ('correlations', 0.047), ('ml', 0.046), ('parametrized', 0.045), ('pillow', 0.042), ('cdfs', 0.042), ('spike', 0.042), ('neural', 0.04), ('population', 0.038), ('paninski', 0.037), ('distributions', 0.036), ('spiking', 0.035), ('argmax', 0.033), ('poisson', 0.033), ('monkey', 0.033), ('parametric', 0.031), ('neuronal', 0.029), ('berkes', 0.028), ('hatsopoulos', 0.028), ('litke', 0.028), ('macke', 0.028), ('shlens', 0.028), ('sklar', 0.028), ('discrete', 0.028), ('row', 0.027), ('pairs', 0.027), ('distribution', 0.027), ('coding', 0.027), ('ui', 0.027), ('gaussian', 0.026), ('yi', 0.025), ('macaque', 0.024), ('improvement', 0.024), ('upper', 0.023), ('latencies', 0.022), ('yn', 0.022), ('cortex', 0.022), ('gauss', 0.021), ('implanted', 0.021), ('bins', 0.021), ('recorded', 0.021), ('colors', 0.02), ('response', 0.02), ('separating', 0.02), ('sher', 0.02), ('mass', 0.02), ('array', 0.02), ('primate', 0.019), ('electrodes', 0.019), ('cube', 0.019), ('estimation', 0.018), ('incorporating', 0.018), ('deterministically', 0.017), ('likelihood', 0.017), ('collected', 0.017), ('capture', 0.017), ('family', 0.017), ('activity', 0.017), ('rates', 0.017), ('models', 0.017), ('univariate', 0.016), ('structure', 0.016), ('uj', 0.015), ('pair', 0.015), ('cumulative', 0.015), ('concentrated', 0.015), ('gatsby', 0.015), ('correlation', 0.015), ('tail', 0.015), ('edition', 0.015), ('captured', 0.015), ('empirical', 0.014), ('simultaneously', 0.014), ('du', 0.014), ('hybrid', 0.014), ('derive', 0.014)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.99999964 45 nips-2008-Characterizing neural dependencies with copula models

Author: Pietro Berkes, Frank Wood, Jonathan W. Pillow

Abstract: The coding of information by neural populations depends critically on the statistical dependencies between neuronal responses. However, there is no simple model that can simultaneously account for (1) marginal distributions over single-neuron spike counts that are discrete and non-negative; and (2) joint distributions over the responses of multiple neurons that are often strongly dependent. Here, we show that both marginal and joint properties of neural responses can be captured using copula models. Copulas are joint distributions that allow random variables with arbitrary marginals to be combined while incorporating arbitrary dependencies between them. Different copulas capture different kinds of dependencies, allowing for a richer and more detailed description of dependencies than traditional summary statistics, such as correlation coefficients. We explore a variety of copula models for joint neural response distributions, and derive an efficient maximum likelihood procedure for estimating them. We apply these models to neuronal data collected in macaque pre-motor cortex, and quantify the improvement in coding accuracy afforded by incorporating the dependency structure between pairs of neurons. We find that more than one third of neuron pairs show dependencies concentrated in the lower or upper tails of their firing rate distributions.

2 0.52712512 137 nips-2008-Modeling Short-term Noise Dependence of Spike Counts in Macaque Prefrontal Cortex

Author: Arno Onken, Steffen Grünewälder, Matthias Munk, Klaus Obermayer

Abstract: Correlations between spike counts are often used to analyze neural coding. The noise is typically assumed to be Gaussian. Yet, this assumption is often inappropriate, especially for low spike counts. In this study, we present copulas as an alternative approach. With copulas it is possible to use arbitrary marginal distributions such as Poisson or negative binomial that are better suited for modeling noise distributions of spike counts. Furthermore, copulas place a wide range of dependence structures at the disposal and can be used to analyze higher order interactions. We develop a framework to analyze spike count data by means of copulas. Methods for parameter inference based on maximum likelihood estimates and for computation of mutual information are provided. We apply the method to our data recorded from macaque prefrontal cortex. The data analysis leads to three findings: (1) copula-based distributions provide significantly better fits than discretized multivariate normal distributions; (2) negative binomial margins fit the data significantly better than Poisson margins; and (3) the dependence structure carries 12% of the mutual information between stimuli and responses. 1

3 0.080721155 81 nips-2008-Extracting State Transition Dynamics from Multiple Spike Trains with Correlated Poisson HMM

Author: Kentaro Katahira, Jun Nishikawa, Kazuo Okanoya, Masato Okada

Abstract: Neural activity is non-stationary and varies across time. Hidden Markov Models (HMMs) have been used to track the state transition among quasi-stationary discrete neural states. Within this context, independent Poisson models have been used for the output distribution of HMMs; hence, the model is incapable of tracking the change in correlation without modulating the firing rate. To achieve this, we applied a multivariate Poisson distribution with correlation terms for the output distribution of HMMs. We formulated a Variational Bayes (VB) inference for the model. The VB could automatically determine the appropriate number of hidden states and correlation types while avoiding the overlearning problem. We developed an efficient algorithm for computing posteriors using the recursive relationship of a multivariate Poisson distribution. We demonstrated the performance of our method on synthetic data and a real spike train recorded from a songbird. 1

4 0.059300762 192 nips-2008-Reducing statistical dependencies in natural signals using radial Gaussianization

Author: Siwei Lyu, Eero P. Simoncelli

Abstract: We consider the problem of transforming a signal to a representation in which the components are statistically independent. When the signal is generated as a linear transformation of independent Gaussian or non-Gaussian sources, the solution may be computed using a linear transformation (PCA or ICA, respectively). Here, we consider a complementary case, in which the source is non-Gaussian but elliptically symmetric. Such a source cannot be decomposed into independent components using a linear transform, but we show that a simple nonlinear transformation, which we call radial Gaussianization (RG), is able to remove all dependencies. We apply this methodology to natural signals, demonstrating that the joint distributions of nearby bandpass filter responses, for both sounds and images, are closer to being elliptically symmetric than linearly transformed factorial sources. Consistent with this, we demonstrate that the reduction in dependency achieved by applying RG to either pairs or blocks of bandpass filter responses is significantly greater than that achieved by PCA or ICA.

5 0.051982857 59 nips-2008-Dependent Dirichlet Process Spike Sorting

Author: Jan Gasthaus, Frank Wood, Dilan Gorur, Yee W. Teh

Abstract: In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle “appearance” and “disappearance” of neurons. Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground truth labeling is known. 1

6 0.051183183 109 nips-2008-Interpreting the neural code with Formal Concept Analysis

7 0.050667968 220 nips-2008-Spike Feature Extraction Using Informative Samples

8 0.046811696 230 nips-2008-Temporal Difference Based Actor Critic Learning - Convergence and Neural Implementation

9 0.044329394 90 nips-2008-Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity

10 0.043111023 50 nips-2008-Continuously-adaptive discretization for message-passing algorithms

11 0.040294968 8 nips-2008-A general framework for investigating how far the decoding process in the brain can be simplified

12 0.039220773 43 nips-2008-Cell Assemblies in Large Sparse Inhibitory Networks of Biologically Realistic Spiking Neurons

13 0.038572911 58 nips-2008-Dependence of Orientation Tuning on Recurrent Excitation and Inhibition in a Network Model of V1

14 0.037762322 60 nips-2008-Designing neurophysiology experiments to optimally constrain receptive field models along parametric submanifolds

15 0.037714843 231 nips-2008-Temporal Dynamics of Cognitive Control

16 0.037535343 12 nips-2008-Accelerating Bayesian Inference over Nonlinear Differential Equations with Gaussian Processes

17 0.037414715 224 nips-2008-Structured ranking learning using cumulative distribution networks

18 0.037274458 204 nips-2008-Self-organization using synaptic plasticity

19 0.035770047 24 nips-2008-An improved estimator of Variance Explained in the presence of noise

20 0.030815445 110 nips-2008-Kernel-ARMA for Hand Tracking and Brain-Machine interfacing During 3D Motor Control


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, -0.109), (1, 0.057), (2, 0.177), (3, 0.199), (4, -0.164), (5, 0.002), (6, 0.04), (7, -0.028), (8, -0.063), (9, 0.007), (10, -0.035), (11, -0.058), (12, 0.043), (13, -0.056), (14, -0.005), (15, -0.018), (16, 0.034), (17, 0.014), (18, -0.02), (19, 0.196), (20, -0.061), (21, -0.043), (22, 0.046), (23, -0.09), (24, -0.049), (25, -0.079), (26, 0.098), (27, 0.082), (28, -0.033), (29, -0.157), (30, -0.154), (31, 0.096), (32, 0.104), (33, 0.024), (34, 0.544), (35, 0.023), (36, 0.115), (37, -0.18), (38, 0.051), (39, -0.072), (40, -0.043), (41, -0.026), (42, -0.056), (43, 0.056), (44, -0.013), (45, -0.039), (46, 0.126), (47, -0.003), (48, -0.004), (49, -0.048)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.95357323 45 nips-2008-Characterizing neural dependencies with copula models

Author: Pietro Berkes, Frank Wood, Jonathan W. Pillow

Abstract: The coding of information by neural populations depends critically on the statistical dependencies between neuronal responses. However, there is no simple model that can simultaneously account for (1) marginal distributions over single-neuron spike counts that are discrete and non-negative; and (2) joint distributions over the responses of multiple neurons that are often strongly dependent. Here, we show that both marginal and joint properties of neural responses can be captured using copula models. Copulas are joint distributions that allow random variables with arbitrary marginals to be combined while incorporating arbitrary dependencies between them. Different copulas capture different kinds of dependencies, allowing for a richer and more detailed description of dependencies than traditional summary statistics, such as correlation coefficients. We explore a variety of copula models for joint neural response distributions, and derive an efficient maximum likelihood procedure for estimating them. We apply these models to neuronal data collected in macaque pre-motor cortex, and quantify the improvement in coding accuracy afforded by incorporating the dependency structure between pairs of neurons. We find that more than one third of neuron pairs shows dependency concentrated in the lower or upper tails for their firing rate distribution. 1

2 0.81480211 137 nips-2008-Modeling Short-term Noise Dependence of Spike Counts in Macaque Prefrontal Cortex

Author: Arno Onken, Steffen Grünewälder, Matthias Munk, Klaus Obermayer

Abstract: Correlations between spike counts are often used to analyze neural coding. The noise is typically assumed to be Gaussian. Yet, this assumption is often inappropriate, especially for low spike counts. In this study, we present copulas as an alternative approach. With copulas it is possible to use arbitrary marginal distributions such as Poisson or negative binomial that are better suited for modeling noise distributions of spike counts. Furthermore, copulas place a wide range of dependence structures at the disposal and can be used to analyze higher order interactions. We develop a framework to analyze spike count data by means of copulas. Methods for parameter inference based on maximum likelihood estimates and for computation of mutual information are provided. We apply the method to our data recorded from macaque prefrontal cortex. The data analysis leads to three findings: (1) copula-based distributions provide significantly better fits than discretized multivariate normal distributions; (2) negative binomial margins fit the data significantly better than Poisson margins; and (3) the dependence structure carries 12% of the mutual information between stimuli and responses. 1

3 0.37857229 81 nips-2008-Extracting State Transition Dynamics from Multiple Spike Trains with Correlated Poisson HMM

Author: Kentaro Katahira, Jun Nishikawa, Kazuo Okanoya, Masato Okada

Abstract: Neural activity is non-stationary and varies across time. Hidden Markov Models (HMMs) have been used to track the state transition among quasi-stationary discrete neural states. Within this context, independent Poisson models have been used for the output distribution of HMMs; hence, the model is incapable of tracking the change in correlation without modulating the firing rate. To achieve this, we applied a multivariate Poisson distribution with correlation terms for the output distribution of HMMs. We formulated a Variational Bayes (VB) inference for the model. The VB could automatically determine the appropriate number of hidden states and correlation types while avoiding the overlearning problem. We developed an efficient algorithm for computing posteriors using the recursive relationship of a multivariate Poisson distribution. We demonstrated the performance of our method on synthetic data and a real spike train recorded from a songbird. 1

4 0.26868317 8 nips-2008-A general framework for investigating how far the decoding process in the brain can be simplified

Author: Masafumi Oizumi, Toshiyuki Ishii, Kazuya Ishibashi, Toshihiko Hosoya, Masato Okada

Abstract: “How is information decoded in the brain?” is one of the most difficult and important questions in neuroscience. Whether neural correlation is important or not in decoding neural activities is of special interest. We have developed a general framework for investigating how far the decoding process in the brain can be simplified. First, we hierarchically construct simplified probabilistic models of neural responses that ignore more than Kth-order correlations by using a maximum entropy principle. Then, we compute how much information is lost when information is decoded using the simplified models, i.e., “mismatched decoders”. We introduce an information theoretically correct quantity for evaluating the information obtained by mismatched decoders. We applied our proposed framework to spike data for vertebrate retina. We used 100-ms natural movies as stimuli and computed the information contained in neural activities about these movies. We found that the information loss is negligibly small in population activities of ganglion cells even if all orders of correlation are ignored in decoding. We also found that if we assume stationarity for long durations in the information analysis of dynamically changing stimuli like natural movies, pseudo correlations seem to carry a large portion of the information. 1

5 0.2500734 90 nips-2008-Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity

Author: Byron M. Yu, John P. Cunningham, Gopal Santhanam, Stephen I. Ryu, Krishna V. Shenoy, Maneesh Sahani

Abstract: We consider the problem of extracting smooth, low-dimensional neural trajectories that summarize the activity recorded simultaneously from tens to hundreds of neurons on individual experimental trials. Current methods for extracting neural trajectories involve a two-stage process: the data are first “denoised” by smoothing over time, then a static dimensionality reduction technique is applied. We first describe extensions of the two-stage methods that allow the degree of smoothing to be chosen in a principled way, and account for spiking variability that may vary both across neurons and across time. We then present a novel method for extracting neural trajectories, Gaussian-process factor analysis (GPFA), which unifies the smoothing and dimensionality reduction operations in a common probabilistic framework. We applied these methods to the activity of 61 neurons recorded simultaneously in macaque premotor and motor cortices during reach planning and execution. By adopting a goodness-of-fit metric that measures how well the activity of each neuron can be predicted by all other recorded neurons, we found that GPFA provided a better characterization of the population activity than the two-stage methods. 1

6 0.23259109 236 nips-2008-The Mondrian Process

7 0.2160223 232 nips-2008-The Conjoint Effect of Divisive Normalization and Orientation Selectivity on Redundancy Reduction

8 0.20949748 50 nips-2008-Continuously-adaptive discretization for message-passing algorithms

9 0.20698193 192 nips-2008-Reducing statistical dependencies in natural signals using radial Gaussianization

10 0.19810158 224 nips-2008-Structured ranking learning using cumulative distribution networks

11 0.19156286 109 nips-2008-Interpreting the neural code with Formal Concept Analysis

12 0.18989363 186 nips-2008-Probabilistic detection of short events, with application to critical care monitoring

13 0.1823884 220 nips-2008-Spike Feature Extraction Using Informative Samples

14 0.17341426 59 nips-2008-Dependent Dirichlet Process Spike Sorting

15 0.17248856 96 nips-2008-Hebbian Learning of Bayes Optimal Decisions

16 0.1706792 127 nips-2008-Logistic Normal Priors for Unsupervised Probabilistic Grammar Induction

17 0.16486824 82 nips-2008-Fast Computation of Posterior Mode in Multi-Level Hierarchical Models

18 0.16184624 74 nips-2008-Estimating the Location and Orientation of Complex, Correlated Neural Activity using MEG

19 0.15959084 18 nips-2008-An Efficient Sequential Monte Carlo Algorithm for Coalescent Clustering

20 0.15151493 94 nips-2008-Goal-directed decision making in prefrontal cortex: a computational framework


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(6, 0.057), (7, 0.428), (12, 0.026), (15, 0.023), (27, 0.053), (28, 0.141), (57, 0.06), (59, 0.014), (63, 0.016), (77, 0.028), (83, 0.023), (94, 0.011)]

similar papers list:

simIndex simValue paperId paperTitle

1 0.9694227 109 nips-2008-Interpreting the neural code with Formal Concept Analysis

Author: Dominik Endres, Peter Foldiak

Abstract: We propose a novel application of Formal Concept Analysis (FCA) to neural decoding: instead of just trying to figure out which stimulus was presented, we demonstrate how to explore the semantic relationships in the neural representation of large sets of stimuli. FCA provides a way of displaying and interpreting such relationships via concept lattices. We explore the effects of neural code sparsity on the lattice. We then analyze neurophysiological data from high-level visual cortical area STSa, using an exact Bayesian approach to construct the formal context needed by FCA. Prominent features of the resulting concept lattices are discussed, including hierarchical face representation and indications for a product-of-experts code in real neurons. 1

2 0.96422434 12 nips-2008-Accelerating Bayesian Inference over Nonlinear Differential Equations with Gaussian Processes

Author: Ben Calderhead, Mark Girolami, Neil D. Lawrence

Abstract: Identification and comparison of nonlinear dynamical system models using noisy and sparse experimental data is a vital task in many fields, however current methods are computationally expensive and prone to error due in part to the nonlinear nature of the likelihood surfaces induced. We present an accelerated sampling procedure which enables Bayesian inference of parameters in nonlinear ordinary and delay differential equations via the novel use of Gaussian processes (GP). Our method involves GP regression over time-series data, and the resulting derivative and time delay estimates make parameter inference possible without solving the dynamical system explicitly, resulting in dramatic savings of computational time. We demonstrate the speed and statistical accuracy of our approach using examples of both ordinary and delay differential equations, and provide a comprehensive comparison with current state of the art methods. 1

3 0.95041752 56 nips-2008-Deep Learning with Kernel Regularization for Visual Recognition

Author: Kai Yu, Wei Xu, Yihong Gong

Abstract: In this paper we aim to train deep neural networks for rapid visual recognition. The task is highly challenging, largely due to the lack of a meaningful regularizer on the functions realized by the networks. We propose a novel regularization method that takes advantage of kernel methods, where an oracle kernel function represents prior knowledge about the recognition task of interest. We derive an efficient algorithm using stochastic gradient descent, and demonstrate encouraging results on a wide range of recognition tasks, in terms of both accuracy and speed. 1

4 0.93313378 51 nips-2008-Convergence and Rate of Convergence of a Manifold-Based Dimension Reduction Algorithm

Author: Andrew Smith, Hongyuan Zha, Xiao-ming Wu

Abstract: We study the convergence and the rate of convergence of a local manifold learning algorithm: LTSA [13]. The main technical tool is the perturbation analysis on the linear invariant subspace that corresponds to the solution of LTSA. We derive a worst-case upper bound of errors for LTSA which naturally leads to a convergence result. We then derive the rate of convergence for LTSA in a special case. 1

same-paper 5 0.91590416 45 nips-2008-Characterizing neural dependencies with copula models

Author: Pietro Berkes, Frank Wood, Jonathan W. Pillow

Abstract: The coding of information by neural populations depends critically on the statistical dependencies between neuronal responses. However, there is no simple model that can simultaneously account for (1) marginal distributions over single-neuron spike counts that are discrete and non-negative; and (2) joint distributions over the responses of multiple neurons that are often strongly dependent. Here, we show that both marginal and joint properties of neural responses can be captured using copula models. Copulas are joint distributions that allow random variables with arbitrary marginals to be combined while incorporating arbitrary dependencies between them. Different copulas capture different kinds of dependencies, allowing for a richer and more detailed description of dependencies than traditional summary statistics, such as correlation coefficients. We explore a variety of copula models for joint neural response distributions, and derive an efficient maximum likelihood procedure for estimating them. We apply these models to neuronal data collected in macaque pre-motor cortex, and quantify the improvement in coding accuracy afforded by incorporating the dependency structure between pairs of neurons. We find that more than one third of neuron pairs shows dependency concentrated in the lower or upper tails for their firing rate distribution. 1

6 0.75457543 137 nips-2008-Modeling Short-term Noise Dependence of Spike Counts in Macaque Prefrontal Cortex

7 0.73815733 71 nips-2008-Efficient Sampling for Gaussian Process Inference using Control Variables

8 0.73159063 213 nips-2008-Sparse Convolved Gaussian Processes for Multi-output Regression

9 0.70340902 188 nips-2008-QUIC-SVD: Fast SVD Using Cosine Trees

10 0.70232439 221 nips-2008-Stochastic Relational Models for Large-scale Dyadic Data using MCMC

11 0.69817823 60 nips-2008-Designing neurophysiology experiments to optimally constrain receptive field models along parametric submanifolds

12 0.696383 99 nips-2008-High-dimensional support union recovery in multivariate regression

13 0.68512726 54 nips-2008-Covariance Estimation for High Dimensional Data Vectors Using the Sparse Matrix Transform

14 0.66530836 8 nips-2008-A general framework for investigating how far the decoding process in the brain can be simplified

15 0.66527265 192 nips-2008-Reducing statistical dependencies in natural signals using radial Gaussianization

16 0.66498357 83 nips-2008-Fast High-dimensional Kernel Summations Using the Monte Carlo Multipole Method

17 0.66235369 66 nips-2008-Dynamic visual attention: searching for coding length increments

18 0.66110557 146 nips-2008-Multi-task Gaussian Process Learning of Robot Inverse Dynamics

19 0.65998888 63 nips-2008-Dimensionality Reduction for Data in Multiple Feature Representations

20 0.65967011 62 nips-2008-Differentiable Sparse Coding