nips nips2013 nips2013-262 knowledge-graph by maker-knowledge-mining

262 nips-2013-Real-Time Inference for a Gamma Process Model of Neural Spiking


Source: pdf

Author: David Carlson, Vinayak Rao, Joshua T. Vogelstein, Lawrence Carin

Abstract: With simultaneous measurements from ever-increasing populations of neurons, there is a growing need for sophisticated tools to recover signals from individual neurons. In electrophysiology experiments, this classically proceeds in a two-step process: (i) threshold the waveforms to detect putative spikes and (ii) cluster the waveforms into single units (neurons). We extend previous Bayesian nonparametric models of neural spiking to jointly detect and cluster neurons using a Gamma process model. Importantly, we develop an online approximate inference scheme enabling real-time analysis, with performance exceeding the previous state-of-the-art. Via exploratory data analysis—using data with partial ground truth as well as two novel data sets—we find that several features of our model collectively contribute to our improved performance including: (i) accounting for colored noise, (ii) detecting overlapping spikes, (iii) tracking waveform dynamics, and (iv) using multiple channels. We hope to enable novel experiments simultaneously measuring many thousands of neurons and possibly adapting stimuli dynamically to probe ever deeper into the mysteries of the brain. 1

Reference: text


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 In electrophysiology experiments, this classically proceeds in a two-step process: (i) threshold the waveforms to detect putative spikes and (ii) cluster the waveforms into single units (neurons). [sent-5, score-0.869]

2 We extend previous Bayesian nonparametric models of neural spiking to jointly detect and cluster neurons using a Gamma process model. [sent-6, score-0.468]

3 Crucial for this endeavor is the advancement of our ability to understand the dynamics of the brain, via the measurement of large populations of neural activity at the single neuron level. [sent-11, score-0.347]

4 , electrodes or calcium imaging), real-time decoding of individual neuron responses requires identifying and labeling individual spikes from recordings from large populations. [sent-15, score-0.664]

5 In other words, real-time decoding requires real-time spike sorting. [sent-16, score-0.504]

6 Automatic spike sorting methods are continually evolving to deal with more sophisticated experiments. [sent-17, score-0.557]

7 Most recently, several methods have been proposed to (i) learn the number of separable neurons on each electrode or “multi-trode” [1, 2], or (ii) operate online to resolve overlapping spikes from multiple neurons [3]. [sent-18, score-0.773]

8 Our model explains the continuous output of each neuron by a latent marked Poisson process, with the “marks” characterizing the shape of each spike. [sent-21, score-0.405]

9 Previous efforts to address overlapping spiking often assume a fixed kernel for each waveform, but joint intracellular and extracellular recordings clearly indicate that this assumption is false (see Figure 3c). [sent-22, score-0.541]

10 Our work therefore suggests that further improvements in real-time decoding of activity may be most effective if directed at simultaneous real-time spike sorting and decoding. [sent-30, score-0.641]

11 2 Model. Our data is a time series of multielectrode recordings X ≡ (x_1, · · · , x_T), and consists of T recordings from M channels. [sent-32, score-0.244]

12 Each neuron generates a continuous-time voltage trace, and the outputs of all neurons are superimposed and discretely sampled to produce the recordings X. [sent-35, score-0.616]

13 In §2.1 we model the continuous-time output of each neuron as a series of idealized Poisson events smoothed with appropriate kernels, while §2. [sent-37, score-0.266]

14 §2.1 Modeling the continuous-time output of a single neuron: There is a rich literature characterizing the spiking activity of a single neuron [4], accounting in detail for factors like non-stationarity, refractoriness and spike waveform. [sent-44, score-1.196]

15 First, we model the spiking activity of each neuron as stationary and memoryless, so that its set of spike times is distributed as a homogeneous Poisson process (PP). [sent-46, score-0.939]
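The homogeneous Poisson process above is easy to simulate: inter-spike intervals are i.i.d. exponential with mean 1/rate. A minimal sketch (the rate and duration are illustrative values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pp(rate, t_max, rng):
    """Sample a homogeneous Poisson process on [0, t_max) by accumulating
    i.i.d. exponential inter-spike intervals with mean 1/rate."""
    times = []
    t = rng.exponential(1.0 / rate)
    while t < t_max:
        times.append(t)
        t += rng.exponential(1.0 / rate)
    return np.array(times)

# illustrative values: a 20 Hz neuron recorded for 10 seconds
spike_times = sample_pp(rate=20.0, t_max=10.0, rng=rng)
```

Superimposing one such train per neuron, each with its own rate, gives the latent event times of the full model.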

16 We model the neurons themselves as heterogeneous, with the ith neuron having an (unknown) firing rate λ_i. [sent-47, score-0.425]

17 Call the ordered set of spike times of the ith neuron T_i = (τ_{i1}, τ_{i2}, …). [sent-48, score-0.74]

18 The actual electrical output of a neuron is not binary; instead each spiking event is a smooth perturbation in voltage about a resting state. [sent-53, score-0.463]

19 This perturbation forms the shape of the spike, with the spike shapes varying across neurons as well as across different spikes of the same neuron. [sent-54, score-0.919]

20 However, each neuron has its own characteristic distribution over shapes, and we let θ*_i ∈ Θ parametrize this distribution for neuron i. [sent-55, score-0.532]

21 Whenever this neuron emits a spike, a new shape is drawn independently from the corresponding distribution. [sent-56, score-0.33]

22 This waveform is then offset to the time of the spike, and contributes to the voltage trace associated with that spike. [sent-57, score-0.494]

23 The complete recording from the neuron is the superposition of all these spike waveforms plus noise. [sent-58, score-1.088]

24 We model each spike shape as a weighted superposition of a dictionary of K basis functions d(t) ≡ (d_1(t), · · · , d_K(t))^T. [sent-62, score-0.593]

25 Each spike time τ_ij is associated with a random K-dimensional weight vector y*_ij ≡ (y*_{ij1}, …, [sent-66, score-0.474]

26 y*_{ijK})^T, and the shape of this spike at time t is given by the weighted sum Σ_{k=1}^K y*_{ijk} d_k(t − τ_ij). [sent-69, score-0.642]

27 Then, at any time t, the output of neuron i is x_i(t) = Σ_j Σ_{k=1}^K y*_{ijk} d_k(t − τ_ij). [sent-71, score-0.37]
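The double sum above — over spikes j and basis functions k — can be sketched as a discrete-time renderer. Everything below (sampling rate, kernel length, the Gaussian-bump dictionary, spike times, weights) is an illustrative stand-in, not the learned quantities from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

fs, L, K = 10_000, 30, 3      # sampling rate (Hz), kernel support, basis size

# toy dictionary: three smooth bumps standing in for learned basis functions
D = np.stack([np.exp(-0.5 * ((np.arange(L) - c) / 3.0) ** 2)
              for c in (8, 15, 22)])                  # shape (K, L)

def render_trace(spike_samples, weights, n_samples):
    """x_i(t) = sum_j sum_k y_ijk d_k(t - tau_ij): superimpose one weighted
    kernel per spike onto a zero trace."""
    x = np.zeros(n_samples)
    for s, y in zip(spike_samples, weights):
        stop = min(s + L, n_samples)
        x[s:stop] += (y @ D)[: stop - s]   # y @ D is this spike's waveform
    return x

spikes = np.array([100, 112, 400])         # the first two overlap in time
Y = rng.normal(size=(3, K))                # one weight vector per spike
trace = render_trace(spikes, Y, n_samples=500)
```

Because the model is linear in the weights, overlapping spikes simply add in voltage space — the property exploited later to resolve them.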

28 Assume for the moment there are N neurons, and define T ≡ ∪_{i∈[N]} T_i as the (ordered) union of the spike times of all neurons. [sent-73, score-0.474]

29 Let τ_l ∈ T indicate the time of the lth overall spike, whereas τ_ij ∈ T_i is the time of the jth spike of neuron i. [sent-74, score-0.777]

30 In words, ν_l ∈ ℕ is the neuron to which the lth element of T belongs, while p_l indexes this spike in the spike train T_{ν_l}. [sent-78, score-1.279]

31 Let θ_l ≡ (μ_l, Σ_l) be the neuron parameter associated with spike l, so that θ_l = θ*_{ν_l}. [sent-79, score-0.74]

32 Define y_l ≡ (y_{l1}, …, y_{lK})^T ≡ y*_{ν_l p_l} as the weight vector of spike τ_l. [sent-83, score-0.474]

33 Each event τ_l ∈ T has a pair of labels: its neuron parameter θ_l ≡ (μ_l, Σ_l), and y_l, the weight vector characterizing the spike shape. [sent-86, score-0.926]

34 With probability one, the neurons have distinct parameters, so that the mark θ_l identifies the neuron which produced spike l: G(θ_l = θ*_i) = P(ν_l = i) = λ_i/Λ. [sent-92, score-0.899]

35 The output waveform x(t) is then a linear functional of this marked Poisson process. [sent-95, score-0.447]

36 2 A nonparametric model of population activity In practice, the number of neurons driving the recorded activity is unknown. [sent-97, score-0.314]

37 Recalling that each neuron is characterized by a pair of parameters (λ_i, θ*_i), we map the infinite collection of pairs {(λ_i, θ*_i)} to a random measure Λ(·) on Θ: Λ(dθ) = Σ_{i=1}^∞ λ_i δ_{θ*_i}. [sent-106, score-0.266]

38 Crucially, a normalized Gamma process is the Dirichlet process (DP) [15], so that the spike parameters θ are i.i.d. [sent-127, score-0.616]
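This connection can be checked numerically in a finite-dimensional sketch: independent Gamma(a_i, 1) weights, once normalized to sum to one, are jointly Dirichlet(a_1, …, a_n) distributed — the same sense in which a normalized Gamma process yields a Dirichlet process (the `a` values and sample count below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Finite-dimensional analogue: independent Gamma(a_i, 1) weights,
# normalized to sum to one, are jointly Dirichlet(a_1, ..., a_n).
a = np.array([1.0, 2.0, 3.0])                       # illustrative shape parameters
g = rng.gamma(shape=a, scale=1.0, size=(100_000, 3))
p = g / g.sum(axis=1, keepdims=True)

emp_mean = p.mean(axis=0)                           # should approach a / sum(a)
```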

39 For spike l, the shape vector is drawn from a normal with parameters (μ_l, Σ_l): these are thus draws from a DP mixture (DPM) of Gaussians [16]. [sent-131, score-0.58]

40 We can exploit the connection with the DP to integrate out the infinite-dimensional measure G(·) (and thus ⇤(·)), and assign spikes to neurons via the so-called Chinese restaurant process (CRP) [17]. [sent-132, score-0.452]

41 Under this scheme, the lth spike is assigned the same parameter as an earlier spike with probability proportional to the number of earlier spikes having that parameter. [sent-133, score-1.235]

42 It is assigned a new parameter (and thus, a new neuron is observed) with probability proportional to α. [sent-134, score-0.294]
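The two assignment rules above can be sketched as a sequential simulation (`alpha` and the number of spikes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def crp_assign(n_spikes, alpha, rng):
    """Assign spikes to neurons sequentially: join an existing neuron with
    probability proportional to its spike count, or open a new neuron with
    probability proportional to alpha (the Chinese restaurant process)."""
    counts, labels = [], []
    for _ in range(n_spikes):
        probs = np.array(counts + [alpha], dtype=float)
        k = rng.choice(len(probs), p=probs / probs.sum())
        if k == len(counts):
            counts.append(1)          # a new neuron is observed
        else:
            counts[k] += 1
        labels.append(k)
    return labels, counts

labels, counts = crp_assign(n_spikes=500, alpha=2.0, rng=rng)
```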

43 The shape of the jth spike is now a vector of length L and, for matrix D̃ and weight vector y, is given by D̃y. [sent-154, score-0.538]

44 Thus, ν̃_t and θ̃_t are the neuron and neuron parameter associated with time bin t, and ỹ_t is its weight vector. [sent-158, score-0.564]

45 Let the binary variable z̃_t indicate whether or not a spike is present in time bin t (recall that z̃_t ∼ Bernoulli(Λ∆)). [sent-159, score-0.604]

46 If there is no spike associated with bin t, then we ignore the marks µ and y. [sent-160, score-0.574]

47 We let every spike affect the recordings at all channels, with the spike shape varying across channels. [sent-167, score-1.109]

48 For spike l in channel m, call the weight vector y_l^m. [sent-168, score-0.74]

49 All these vectors must be correlated as they correspond to the same spike; we do this simply by concatenating the set of vectors into a single MK-element vector y_l = (y_l^1; · · · ; y_l^M), and modeling this as a multivariate normal. [sent-169, score-0.258]

50 We also relax the requirement that the parameters θ*_i of each neuron remain constant, and instead allow μ*_i, the mean of the weight-vector distribution, to evolve with time (we keep the covariance parameter Σ*_i fixed, however). [sent-171, score-0.266]

51 Each neuron is now associated with a vector-valued function θ*_i(·), rather than a constant. [sent-176, score-0.266]

52 When a spike at time τ_l is assigned to neuron i, it is assigned a weight vector y_l drawn from a Gaussian with mean μ*_i(τ_l). [sent-177, score-0.925]

53 It also maintains the identities of the neurons that it assigned each of these spikes to, as well as the weight vectors determining the shapes of the associated spike waveforms. [sent-196, score-0.883]

54 We indicate these point estimates with the hat operator, so, for example, T̂_{it} is the set of estimated spike times before time t assigned to neuron i. [sent-197, score-0.768]

55 A posterior q_{i,t}(θ_i) is maintained over the parameters θ_i ≡ (μ*_i, Σ*_i) of neuron i given the observations until time t. [sent-201, score-0.266]

56 Having identified the time and shape of spikes from earlier times, we can calculate their contribution to the recordings x_t^L ≡ (x_t, · · · , x_{t+L−1})^T. [sent-202, score-0.383]

57 Recalling that the basis functions D, and thus all spike waveforms, span L time bins, the residual at time t + t_1 is then given by x̄_{t+t_1} = x_{t+t_1} − Σ_{h∈[L−t_1]} ẑ_{t−h} D ŷ_{t−h} (at time t, for t_1 > 0, we define ẑ_{t+t_1} = 0). [sent-203, score-0.637]
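A sketch of this residual bookkeeping, under the assumption that detections ẑ and weight estimates ŷ are stored per time bin and that a spike at bin t−h spills the last L−h samples of its waveform into the current window (the exact indexing in the paper may differ; the dictionary here is a placeholder):

```python
import numpy as np

K, L = 3, 30
D = np.eye(K, L)                       # placeholder dictionary, shape (K, L)

def residual_window(x, t, z_hat, y_hat, D):
    """Window x[t:t+L] with the contributions of previously detected spikes
    subtracted: a spike at bin t-h (0 < h < L) spills its waveform's last
    L-h samples into the current window."""
    L = D.shape[1]
    r = x[t:t + L].astype(float)       # copy, so x is not mutated
    for h in range(1, L):
        if t - h >= 0 and z_hat[t - h]:
            w = y_hat[t - h] @ D       # that spike's full waveform
            r[: L - h] -= w[h:]        # the part overlapping [t, t+L)
    return r

# one past spike at bin 40; the window starting at bin 45 overlaps its tail
x = np.zeros(100)
z_hat = np.zeros(100, dtype=bool)
y_hat = np.zeros((100, K))
z_hat[40] = True
y_hat[40] = [1.0, 2.0, 0.5]
x[40:70] += y_hat[40] @ D
r = residual_window(x, 45, z_hat, y_hat, D)   # the past spike is fully explained
```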

58 The latter is used to calculate qi,t+1 (✓i ), the new distribution over neuron parameters at time t + 1. [sent-205, score-0.266]

59 If this probability exceeds 0.5, we decide that there is a spike present at time t; otherwise, we set ẑ_t = 0. [sent-211, score-0.523]

60 In the event of a spike (ẑ_t = 1), we use these point estimates to update the posterior distribution over the parameters of cluster ν̂_t, to obtain q_{i,t+1}(·) from q_{i,t}(·); this is straightforward because of conjugacy. [sent-214, score-0.627]

61 The recording was made simultaneously on all electrodes and was set up such that the cell with the intracellular electrode was also recorded on the extracellular array implanted in the hippocampus of an anesthetized rat. [sent-223, score-0.499]

62 The intracellular recording is relatively noiseless and gives nearly certain firing times of the intracellular neuron. [sent-224, score-0.409]

63 The extracellular recording contains the spike waveforms from the intracellular neuron as well as an unknown number of additional neurons. [sent-225, score-1.294]

64 The neighboring electrode sites in these devices have 30 µm between electrode edges and 60 µm between electrode centers. [sent-232, score-0.331]

65 These devices are close enough that a locally firing neuron could appear on multiple electrode sites [2], so neighboring channels warrant joint processing. [sent-233, score-0.497]

66 To define D, we used the first five principal components of all spikes detected with a threshold (three times the standard deviation of the noise above the mean) in the first five seconds. [sent-238, score-0.279]
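This dictionary construction — threshold, window, PCA — can be sketched on synthetic data. The injected template, sampling rate, and the median-based noise estimate below are illustrative choices (the paper states only "three times the standard deviation of the noise above the mean"; the robust median estimator is a common stand-in when the noise floor is not known):

```python
import numpy as np

rng = np.random.default_rng(4)
fs, L, K = 10_000, 30, 5

# synthetic 5-second trace: unit-variance noise plus 200 injected events
x = rng.normal(size=5 * fs)
template = np.hanning(L) * 8.0
for s in rng.choice(np.arange(L, len(x) - L), size=200, replace=False):
    x[s:s + L] += template

# robust stand-in for the noise standard deviation, then a 3-sigma threshold
sigma = np.median(np.abs(x)) / 0.6745
thr = 3.0 * sigma
crossings = np.flatnonzero((x[1:] > thr) & (x[:-1] <= thr))

# window each upward crossing and keep the first K principal components
snippets = np.stack([x[c:c + L] for c in crossings if c + L <= len(x)])
snippets = snippets - snippets.mean(axis=0)
_, _, Vt = np.linalg.svd(snippets, full_matrices=False)
D = Vt[:K]                             # the dictionary, shape (K, L)
```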

67 In this dataset, we only have a partial ground truth, so we can only verify accuracy for the neuron with the intracellular (IC) recording. [sent-244, score-0.462]

68 We define a detected spike to be an IC spike if the IC recording has a spike within 0.5 milliseconds (ms) of the detected spike in the extracellular recording. [sent-245, score-0.474] [sent-246, score-0.581]
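The matching rule can be sketched as follows; the sampling rate and the spike times are hypothetical values, and times are in samples:

```python
import numpy as np

fs = 10_000                            # assumed sampling rate (Hz)
tol = int(0.5e-3 * fs)                 # 0.5 ms tolerance, in samples

def label_ic_spikes(detected, ic_truth, tol):
    """Flag each detected spike as an IC spike if the (sorted) intracellular
    spike times contain one within tol samples of it."""
    ic_truth = np.asarray(ic_truth)
    flags = []
    for d in detected:
        i = np.searchsorted(ic_truth, d)
        near = [abs(d - ic_truth[j]) for j in (i - 1, i)
                if 0 <= j < len(ic_truth)]
        flags.append(bool(near) and min(near) <= tol)
    return np.array(flags)

detected = [100, 260, 900]             # hypothetical detections (samples)
ic_truth = [103, 500, 903]             # hypothetical IC spike times
flags = label_ic_spikes(detected, ic_truth, tol)
```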

70 We define the cluster with the greatest number of intracellular spikes as the “IC cluster”. [sent-247, score-0.456]

71 We refer to these data as “partial ground truth data”, because we know the ground truth spike times for one of the neurons, but not all the others. [sent-248, score-0.608]

72 The spike detections for the offline methods used a threshold of three times the noise standard deviation [5] (unless stated otherwise), and windowed at a size L = 30. [sent-259, score-0.536]

73 For multichannel data, we concatenated the M channels for each waveform to obtain an M × L-dimensional vector. [sent-260, score-0.591]
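Concatenating per-channel vectors and modeling them with a single multivariate normal — as in the multichannel extension described earlier — can be sketched as follows (dimensions and the covariance are illustrative, not estimated from data):

```python
import numpy as np

rng = np.random.default_rng(5)
M, K = 4, 3                            # channels, basis functions per channel

# Concatenate the per-channel weight vectors of one spike into a single
# M*K-element vector and model it with one multivariate normal, so that
# cross-channel correlations are captured.
mu = np.zeros(M * K)
A = rng.normal(size=(M * K, M * K))
Sigma = A @ A.T + np.eye(M * K)        # an arbitrary positive-definite covariance

y = rng.multivariate_normal(mu, Sigma, size=1000)   # 1000 simulated spikes
per_channel = y.reshape(1000, M, K)    # split back into channel blocks
```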

74 Running time in unoptimized MATLAB code for 4 minutes of data was 31 seconds for a single channel and 3 minutes for all 4 channels on a 3. [sent-269, score-0.273]

75 Performance on partial ground truth data The main empirical result of our contribution is that all variants of O P A S S detect more true positives with fewer false positives than any of the other algorithms on the partial ground truth data (see Fig. [sent-272, score-0.349]

76 Figure 2: O P A S S detects multiple overlapping waveforms (Top Left) The observed voltage (solid black), MAP waveform 1 (red), MAP waveform 2 (blue), and waveform from the sum (dashed-black). [sent-300, score-1.619]

77 When spikes overlap, although the result can accurately be modeled as a linear sum in voltage space, the resulting waveform often does not appear in any cluster in PC space (see [1]). [sent-311, score-0.786]

78 Note that even though the waveform peaks are approximately 1 ms from one another, thresholding algorithms do not pick up these spikes, because they look different in PC space. [sent-315, score-0.438]

79 Indeed, by virtue of estimating the presence of multiple spikes, the residual squared error between the expected voltage and observed voltage shrinks for this snippet (bottom left). [sent-316, score-0.259]

80 Of the 135 pairs of overlapping spikes, 37 of those spikes came from the intracellular neuron. [sent-320, score-0.479]

81 Thus, while it seems detecting overlapping spikes helps, it does not fully explain the improvements over the competitor algorithms. [sent-321, score-0.346]

82 Time-Varying Waveform Adaptation As has been demonstrated previously [26], the waveform shape of a neuron may change over time. [sent-322, score-0.73]

83 The mean waveform over time for the intracellular neuron is shown in Fig. [sent-323, score-0.83]

84 Fig. 3c shows that the auto-regressive model for the mean dictionary weights yields a time-varying posterior (top), whereas the static prior yields a constant posterior mean with increasing posterior marginal variances (bottom). [sent-330, score-0.247]
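Why a drifting-mean prior tracks waveform change while a static prior cannot can be sketched with a scalar random-walk analogue and a Kalman filter (all variances are illustrative; the paper's model is over the full weight-vector mean, not a scalar):

```python
import numpy as np

rng = np.random.default_rng(6)

q, r = 0.04, 0.25                      # illustrative process / observation variances
T = 400
true_mu = 1.0 + np.cumsum(rng.normal(scale=np.sqrt(q), size=T))  # drifting mean
obs = true_mu + rng.normal(scale=np.sqrt(r), size=T)             # noisy observations

# Kalman filter under the random-walk (AR) prior mu_t = mu_{t-1} + noise
mu_hat, P, kalman = 0.0, 1.0, []
for y in obs:
    P += q                             # predict: uncertainty grows each step
    gain = P / (P + r)
    mu_hat += gain * (y - mu_hat)      # update toward the new observation
    P *= 1.0 - gain
    kalman.append(mu_hat)
kalman = np.array(kalman)

# static prior: the posterior mean is just the running average of all spikes
static = np.cumsum(obs) / np.arange(1, T + 1)

kalman_err = np.mean((kalman[50:] - true_mu[50:]) ** 2)
static_err = np.mean((static[50:] - true_mu[50:]) ** 2)
```

After a burn-in, the filtered estimate stays close to the drifting mean, while the running average lags far behind it.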

85 Figure 3: The IC waveform changes over time, which our posterior parameters track. [sent-341, score-0.492]

86 Each colored line represents the mean of the waveform averaged over 24 seconds with color denoting the time interval. [sent-343, score-0.434]

87 This neuron decreases in amplitude over the period of the recording. [sent-344, score-0.311]

88 (c) The mean and standard deviation of the waveforms at three time points for the auto-regressive prior on the mean waveform (top) and static prior (bottom). [sent-346, score-0.662]

89 Note that the left panel has a waveform that appears on both channel 2 and channel 3, whereas the waveform in the right panel only appears in channel 3. [sent-354, score-1.289]

90 Had only the third electrode been used, these two waveforms would not be distinct (as evidenced by their substantial overlap in PC space upon using only the third channel in Fig. [sent-361, score-0.47]

91 5 Discussion Our improved sensitivity and specificity seem to arise from multiple sources including (i) improved detection, (ii) accounting for correlated noise, (iii) capturing overlapping spikes, (iv) tracking waveform dynamics, and (v) utilizing multiple channels. [sent-366, score-0.558]

92 While others have developed closely related Bayesian models for clustering [8, 27], deconvolution based techniques [1], time-varying waveforms [26], or online methods [25, 3], we are the first to our knowledge to incorporate all of these. [sent-367, score-0.271]

93 An interesting implication of our work is that it seems that our errors may be irreconcilable using merely first order methods (that only consider the mean waveform to detect and cluster). [sent-368, score-0.447]

94 Fig. 8a shows that the mean waveforms of the true and false positives are essentially identical, suggesting that even in the full 30-dimensional space, excluding those waveforms from the intracellular cluster would be difficult. [sent-371, score-0.976]

95 Projecting each waveform into the first two PCs is similarly suggestive, as the missed positives do not seem to be in the cluster of the true positives (Supp. [sent-372, score-0.586]

96 A model-based spike sorting algorithm for removing correlation artifacts in multi-neuron recordings. [sent-379, score-0.557]

97 Fast, scalable, c Bayesian spike identification for multi-electrode arrays. [sent-382, score-0.474]

98 An online spike detection and spike classification algorithm capable of instantaneous resolution of overlapping spikes. [sent-385, score-1.114]

99 A review of methods for spike sorting: the detection and classification of neural action potentials. [sent-391, score-0.508]

100 Kalman filter mixture model for spike sorting of non-stationary data. [sent-465, score-0.599]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('spike', 0.474), ('waveform', 0.4), ('neuron', 0.266), ('waveforms', 0.232), ('spikes', 0.222), ('intracellular', 0.164), ('neurons', 0.159), ('channel', 0.137), ('yl', 0.129), ('ic', 0.126), ('poisson', 0.11), ('channels', 0.102), ('electrode', 0.101), ('gamma', 0.098), ('recordings', 0.097), ('voltage', 0.094), ('overlapping', 0.093), ('multichannel', 0.089), ('osort', 0.084), ('sorting', 0.083), ('recording', 0.081), ('extracellular', 0.077), ('spiking', 0.074), ('process', 0.071), ('xt', 0.07), ('cluster', 0.07), ('marks', 0.068), ('qit', 0.067), ('shape', 0.064), ('vy', 0.064), ('positives', 0.058), ('dictionary', 0.055), ('dk', 0.054), ('activity', 0.054), ('posterior', 0.054), ('false', 0.052), ('mo', 0.051), ('multielectrode', 0.05), ('neuronexus', 0.05), ('yijk', 0.05), ('ylk', 0.05), ('ti', 0.05), ('zt', 0.049), ('dp', 0.049), ('electrodes', 0.049), ('marked', 0.047), ('gmm', 0.047), ('nonparametric', 0.047), ('detect', 0.047), ('amplitude', 0.045), ('residual', 0.044), ('mixture', 0.042), ('pc', 0.041), ('online', 0.039), ('bayesian', 0.039), ('panel', 0.039), ('putative', 0.039), ('ms', 0.038), ('dpmm', 0.037), ('lth', 0.037), ('ar', 0.036), ('batch', 0.036), ('dirichlet', 0.035), ('truth', 0.035), ('detections', 0.035), ('superposition', 0.035), ('gp', 0.035), ('city', 0.035), ('detection', 0.034), ('seconds', 0.034), ('autoregressive', 0.034), ('heightened', 0.034), ('rpm', 0.034), ('sugs', 0.034), ('accounting', 0.034), ('pp', 0.033), ('crp', 0.033), ('bin', 0.032), ('ground', 0.032), ('residuals', 0.032), ('mv', 0.032), ('sensitivity', 0.031), ('competitor', 0.031), ('static', 0.03), ('detected', 0.03), ('decoding', 0.03), ('carlson', 0.03), ('crm', 0.03), ('crms', 0.03), ('khz', 0.03), ('event', 0.029), ('devices', 0.028), ('characterizing', 0.028), ('assigned', 0.028), ('pl', 0.028), ('mor', 0.027), ('implanted', 0.027), ('snippet', 0.027), ('threshold', 0.027), ('populations', 0.027)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.99999982 262 nips-2013-Real-Time Inference for a Gamma Process Model of Neural Spiking

Author: David Carlson, Vinayak Rao, Joshua T. Vogelstein, Lawrence Carin

Abstract: With simultaneous measurements from ever-increasing populations of neurons, there is a growing need for sophisticated tools to recover signals from individual neurons. In electrophysiology experiments, this classically proceeds in a two-step process: (i) threshold the waveforms to detect putative spikes and (ii) cluster the waveforms into single units (neurons). We extend previous Bayesian nonparametric models of neural spiking to jointly detect and cluster neurons using a Gamma process model. Importantly, we develop an online approximate inference scheme enabling real-time analysis, with performance exceeding the previous state-of-the-art. Via exploratory data analysis—using data with partial ground truth as well as two novel data sets—we find that several features of our model collectively contribute to our improved performance including: (i) accounting for colored noise, (ii) detecting overlapping spikes, (iii) tracking waveform dynamics, and (iv) using multiple channels. We hope to enable novel experiments simultaneously measuring many thousands of neurons and possibly adapting stimuli dynamically to probe ever deeper into the mysteries of the brain. 1

2 0.30165681 6 nips-2013-A Determinantal Point Process Latent Variable Model for Inhibition in Neural Spiking Data

Author: Jasper Snoek, Richard Zemel, Ryan P. Adams

Abstract: Point processes are popular models of neural spiking behavior as they provide a statistical distribution over temporal sequences of spikes and help to reveal the complexities underlying a series of recorded action potentials. However, the most common neural point process models, the Poisson process and the gamma renewal process, do not capture interactions and correlations that are critical to modeling populations of neurons. We develop a novel model based on a determinantal point process over latent embeddings of neurons that effectively captures and helps visualize complex inhibitory and competitive interaction. We show that this model is a natural extension of the popular generalized linear model to sets of interacting neurons. The model is extended to incorporate gain control or divisive normalization, and the modulation of neural spiking based on periodic phenomena. Applied to neural spike recordings from the rat hippocampus, we see that the model captures inhibitory relationships, a dichotomy of classes of neurons, and a periodic modulation by the theta rhythm known to be present in the data. 1

3 0.2755017 51 nips-2013-Bayesian entropy estimation for binary spike train data using parametric prior knowledge

Author: Evan W. Archer, Il M. Park, Jonathan W. Pillow

Abstract: Shannon’s entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes. Estimating the entropy of a discrete distribution from samples is an important and difficult problem that has received considerable attention in statistics and theoretical neuroscience. However, neural responses have characteristic statistical structure that generic entropy estimators fail to exploit. For example, existing Bayesian entropy estimators make the naive assumption that all spike words are equally likely a priori, which makes for an inefficient allocation of prior probability mass in cases where spikes are sparse. Here we develop Bayesian estimators for the entropy of binary spike trains using priors designed to flexibly exploit the statistical structure of simultaneously recorded spike responses. We define two prior distributions over spike words using mixtures of Dirichlet distributions centered on simple parametric models. The parametric model captures high-level statistical features of the data, such as the average spike count in a spike word, which allows the posterior over entropy to concentrate more rapidly than with standard estimators (e.g., in cases where the probability of spiking differs strongly from 0.5). Conversely, the Dirichlet distributions assign prior mass to distributions far from the parametric model, ensuring consistent estimates for arbitrary distributions. We devise a compact representation of the data and prior that allows for computationally efficient implementations of Bayesian least squares and empirical Bayes entropy estimators with large numbers of neurons. We apply these estimators to simulated and real neural data and show that they substantially outperform traditional methods.

4 0.24243073 246 nips-2013-Perfect Associative Learning with Spike-Timing-Dependent Plasticity

Author: Christian Albers, Maren Westkott, Klaus Pawelzik

Abstract: Recent extensions of the Perceptron as the Tempotron and the Chronotron suggest that this theoretical concept is highly relevant for understanding networks of spiking neurons in the brain. It is not known, however, how the computational power of the Perceptron might be accomplished by the plasticity mechanisms of real synapses. Here we prove that spike-timing-dependent plasticity having an anti-Hebbian form for excitatory synapses as well as a spike-timing-dependent plasticity of Hebbian shape for inhibitory synapses are sufficient for realizing the original Perceptron Learning Rule if these respective plasticity mechanisms act in concert with the hyperpolarisation of the post-synaptic neurons. We also show that with these simple yet biologically realistic dynamics Tempotrons and Chronotrons are learned. The proposed mechanism enables incremental associative learning from a continuous stream of patterns and might therefore underly the acquisition of long term memories in cortex. Our results underline that learning processes in realistic networks of spiking neurons depend crucially on the interactions of synaptic plasticity mechanisms with the dynamics of participating neurons.

5 0.23052126 121 nips-2013-Firing rate predictions in optimal balanced networks

Author: David G. Barrett, Sophie Denève, Christian K. Machens

Abstract: How are firing rates in a spiking network related to neural input, connectivity and network function? This is an important problem because firing rates are a key measure of network activity, in both the study of neural computation and neural network dynamics. However, it is a difficult problem, because the spiking mechanism of individual neurons is highly non-linear, and these individual neurons interact strongly through connectivity. We develop a new technique for calculating firing rates in optimal balanced networks. These are particularly interesting networks because they provide an optimal spike-based signal representation while producing cortex-like spiking activity through a dynamic balance of excitation and inhibition. We can calculate firing rates by treating balanced network dynamics as an algorithm for optimising signal representation. We identify this algorithm and then calculate firing rates by finding the solution to the algorithm. Our firing rate calculation relates network firing rates directly to network input, connectivity and function. This allows us to explain the function and underlying mechanism of tuning curves in a variety of systems. 1

6 0.20947081 304 nips-2013-Sparse nonnegative deconvolution for compressive calcium imaging: algorithms and phase transitions

7 0.1941435 173 nips-2013-Least Informative Dimensions

8 0.18632701 286 nips-2013-Robust learning of low-dimensional dynamics from large neural ensembles

9 0.15884729 49 nips-2013-Bayesian Inference and Online Experimental Design for Mapping Neural Microcircuits

10 0.15619273 341 nips-2013-Universal models for binary spike patterns using centered Dirichlet processes

11 0.155779 141 nips-2013-Inferring neural population dynamics from multiple partial recordings of the same neural circuit

12 0.15013281 305 nips-2013-Spectral methods for neural characterization using generalized quadratic models

13 0.14627549 208 nips-2013-Neural representation of action sequences: how far can a simple snippet-matching model take us?

14 0.12445516 205 nips-2013-Multisensory Encoding, Decoding, and Identification

15 0.1154085 266 nips-2013-Recurrent linear models of simultaneously-recorded neural populations

16 0.11033179 210 nips-2013-Noise-Enhanced Associative Memories

17 0.10470752 157 nips-2013-Learning Multi-level Sparse Representations

18 0.10017709 237 nips-2013-Optimal integration of visual speed across different spatiotemporal frequency channels

19 0.093117379 236 nips-2013-Optimal Neural Population Codes for High-dimensional Stimulus Variables

20 0.091771483 308 nips-2013-Spike train entropy-rate estimation using hierarchical Dirichlet process priors


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, 0.215), (1, 0.108), (2, -0.122), (3, -0.108), (4, -0.426), (5, 0.003), (6, 0.029), (7, -0.053), (8, 0.057), (9, 0.043), (10, -0.076), (11, 0.006), (12, -0.03), (13, 0.03), (14, 0.052), (15, -0.06), (16, 0.108), (17, 0.068), (18, 0.0), (19, 0.066), (20, -0.034), (21, 0.099), (22, -0.019), (23, -0.151), (24, 0.012), (25, -0.051), (26, -0.04), (27, -0.081), (28, 0.084), (29, 0.01), (30, -0.051), (31, 0.063), (32, 0.005), (33, -0.006), (34, 0.013), (35, 0.069), (36, -0.025), (37, 0.005), (38, -0.055), (39, -0.033), (40, 0.01), (41, 0.037), (42, 0.072), (43, -0.02), (44, -0.048), (45, -0.037), (46, 0.008), (47, 0.046), (48, -0.034), (49, 0.002)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.95425159 262 nips-2013-Real-Time Inference for a Gamma Process Model of Neural Spiking

Author: David Carlson, Vinayak Rao, Joshua T. Vogelstein, Lawrence Carin

Abstract: With simultaneous measurements from ever-increasing populations of neurons, there is a growing need for sophisticated tools to recover signals from individual neurons. In electrophysiology experiments, this classically proceeds in a two-step process: (i) threshold the waveforms to detect putative spikes and (ii) cluster the waveforms into single units (neurons). We extend previous Bayesian nonparametric models of neural spiking to jointly detect and cluster neurons using a Gamma process model. Importantly, we develop an online approximate inference scheme enabling real-time analysis, with performance exceeding the previous state-of-the-art. Via exploratory data analysis—using data with partial ground truth as well as two novel data sets—we find that several features of our model collectively contribute to our improved performance including: (i) accounting for colored noise, (ii) detecting overlapping spikes, (iii) tracking waveform dynamics, and (iv) using multiple channels. We hope to enable novel experiments simultaneously measuring many thousands of neurons and possibly adapting stimuli dynamically to probe ever deeper into the mysteries of the brain. 1

2 0.80339903 51 nips-2013-Bayesian entropy estimation for binary spike train data using parametric prior knowledge

Author: Evan W. Archer, Il M. Park, Jonathan W. Pillow

Abstract: Shannon’s entropy is a basic quantity in information theory, and a fundamental building block for the analysis of neural codes. Estimating the entropy of a discrete distribution from samples is an important and difficult problem that has received considerable attention in statistics and theoretical neuroscience. However, neural responses have characteristic statistical structure that generic entropy estimators fail to exploit. For example, existing Bayesian entropy estimators make the naive assumption that all spike words are equally likely a priori, which makes for an inefficient allocation of prior probability mass in cases where spikes are sparse. Here we develop Bayesian estimators for the entropy of binary spike trains using priors designed to flexibly exploit the statistical structure of simultaneously recorded spike responses. We define two prior distributions over spike words using mixtures of Dirichlet distributions centered on simple parametric models. The parametric model captures high-level statistical features of the data, such as the average spike count in a spike word, which allows the posterior over entropy to concentrate more rapidly than with standard estimators (e.g., in cases where the probability of spiking differs strongly from 0.5). Conversely, the Dirichlet distributions assign prior mass to distributions far from the parametric model, ensuring consistent estimates for arbitrary distributions. We devise a compact representation of the data and prior that allows for computationally efficient implementations of Bayesian least squares and empirical Bayes entropy estimators with large numbers of neurons. We apply these estimators to simulated and real neural data and show that they substantially outperform traditional methods.

3 0.79994267 6 nips-2013-A Determinantal Point Process Latent Variable Model for Inhibition in Neural Spiking Data

Author: Jasper Snoek, Richard Zemel, Ryan P. Adams

Abstract: Point processes are popular models of neural spiking behavior as they provide a statistical distribution over temporal sequences of spikes and help to reveal the complexities underlying a series of recorded action potentials. However, the most common neural point process models, the Poisson process and the gamma renewal process, do not capture interactions and correlations that are critical to modeling populations of neurons. We develop a novel model based on a determinantal point process over latent embeddings of neurons that effectively captures and helps visualize complex inhibitory and competitive interactions. We show that this model is a natural extension of the popular generalized linear model to sets of interacting neurons. The model is extended to incorporate gain control or divisive normalization, and the modulation of neural spiking based on periodic phenomena. Applied to neural spike recordings from the rat hippocampus, we see that the model captures inhibitory relationships, a dichotomy of classes of neurons, and a periodic modulation by the theta rhythm known to be present in the data.
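The repulsive character of a determinantal point process can be seen directly from the L-ensemble form P(A) ∝ det(L_A). A minimal numpy sketch with made-up 1-D latent embeddings and an RBF similarity kernel (both illustrative, not the paper's fitted model):

```python
import numpy as np

def dpp_unnorm_prob(L, subset):
    """Unnormalised L-ensemble probability det(L_A) of subset A."""
    idx = np.asarray(subset)
    return np.linalg.det(L[np.ix_(idx, idx)])

# Latent 1-D embeddings for four neurons; similarity via an RBF kernel.
z = np.array([0.0, 0.1, 2.0, 2.1])
L = np.exp(-(z[:, None] - z[None, :]) ** 2)

similar = dpp_unnorm_prob(L, [0, 1])  # neurons with nearby embeddings
diverse = dpp_unnorm_prob(L, [0, 2])  # neurons with distant embeddings
print(similar, diverse)  # the diverse pair gets far more probability mass
```

Because det(L_A) shrinks as rows of the kernel become nearly collinear, co-firing of neurons with similar embeddings is penalised, which is exactly the inhibitory/competitive structure the model targets.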

4 0.7442075 205 nips-2013-Multisensory Encoding, Decoding, and Identification

Author: Aurel A. Lazar, Yevgeniy Slutskiy

Abstract: We investigate a spiking neuron model of multisensory integration. Multiple stimuli from different sensory modalities are encoded by a single neural circuit comprised of a multisensory bank of receptive fields in cascade with a population of biophysical spike generators. We demonstrate that stimuli of different dimensions can be faithfully multiplexed and encoded in the spike domain and derive tractable algorithms for decoding each stimulus from the common pool of spikes. We also show that the identification of multisensory processing in a single neuron is dual to the recovery of stimuli encoded with a population of multisensory neurons, and prove that only a projection of the circuit onto input stimuli can be identified. We provide an example of multisensory integration using natural audio and video and discuss the performance of the proposed decoding and identification algorithms.
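The "population of biophysical spike generators" can be illustrated with the simplest such generator, a leaky integrate-and-fire neuron; the parameters below are illustrative, not taken from the paper:

```python
import numpy as np

def lif_encode(stimulus, dt=1e-3, tau=0.02, threshold=1.0):
    """Leaky integrate-and-fire encoding: integrate the input with a leak,
    emit a spike and reset whenever the membrane crosses threshold."""
    v, spike_times = 0.0, []
    for t, s in enumerate(stimulus):
        v += dt * (-v / tau + s)
        if v >= threshold:
            spike_times.append(t * dt)
            v = 0.0
    return spike_times

# A 5 Hz sinusoidal stimulus encoded into a spike train over one second.
stim = 60.0 + 40.0 * np.sin(2 * np.pi * 5 * np.arange(1000) * 1e-3)
spikes = lif_encode(stim)
print(len(spikes))  # stronger stimulus phases yield denser spiking
```

Decoding then amounts to inverting this map: recovering the stimulus from the common pool of spike times, which is the problem the paper makes tractable.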

5 0.72868592 341 nips-2013-Universal models for binary spike patterns using centered Dirichlet processes

Author: Il M. Park, Evan W. Archer, Kenneth Latimer, Jonathan W. Pillow

Abstract: Probabilistic models for binary spike patterns provide a powerful tool for understanding the statistical dependencies in large-scale neural recordings. Maximum entropy (or “maxent”) models, which seek to explain dependencies in terms of low-order interactions between neurons, have enjoyed remarkable success in modeling such patterns, particularly for small groups of neurons. However, these models are computationally intractable for large populations, and low-order maxent models have been shown to be inadequate for some datasets. To overcome these limitations, we propose a family of “universal” models for binary spike patterns, where universality refers to the ability to model arbitrary distributions over all 2^m binary patterns. We construct universal models using a Dirichlet process centered on a well-behaved parametric base measure, which naturally combines the flexibility of a histogram and the parsimony of a parametric model. We derive computationally efficient inference methods using Bernoulli and cascaded logistic base measures, which scale tractably to large populations. We also establish a condition for equivalence between the cascaded logistic and the 2nd-order maxent or “Ising” model, making cascaded logistic a reasonable choice for base measure in a universal model. We illustrate the performance of these models using neural data.
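The "centering" construction can be sketched with the Dirichlet-process posterior predictive, which blends empirical word counts with a parametric base measure G0; here G0 is an independent-Bernoulli model (the simpler of the paper's two base measures), and the counts and hyperparameters are made up:

```python
from itertools import product
import numpy as np

def bernoulli_base(word, p=0.2):
    """Independent-Bernoulli base measure G0 over binary spike words."""
    word = np.asarray(word)
    return float(np.prod(np.where(word == 1, p, 1 - p)))

def dp_predictive(word, observed, alpha=10.0, p=0.2):
    """DP posterior predictive: P(w) = (n_w + alpha * G0(w)) / (N + alpha),
    mixing a histogram of the data with the parametric base measure."""
    n_w = sum(1 for o in observed if tuple(o) == tuple(word))
    return (n_w + alpha * bernoulli_base(word, p)) / (len(observed) + alpha)

observed = [(0, 0, 0)] * 7 + [(1, 0, 0)] * 3   # ten observed spike words
total = sum(dp_predictive(w, observed) for w in product([0, 1], repeat=3))
print(total)  # the predictive is a proper distribution over all 2**3 words
```

Unseen words keep positive probability through G0, which is what makes the model universal over all 2^m patterns while staying parsimonious when the base measure fits the data well.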

6 0.7235409 121 nips-2013-Firing rate predictions in optimal balanced networks

7 0.68293548 141 nips-2013-Inferring neural population dynamics from multiple partial recordings of the same neural circuit

8 0.66703391 305 nips-2013-Spectral methods for neural characterization using generalized quadratic models

9 0.66277546 246 nips-2013-Perfect Associative Learning with Spike-Timing-Dependent Plasticity

10 0.62082809 49 nips-2013-Bayesian Inference and Online Experimental Design for Mapping Neural Microcircuits

11 0.61591768 304 nips-2013-Sparse nonnegative deconvolution for compressive calcium imaging: algorithms and phase transitions

12 0.57480401 286 nips-2013-Robust learning of low-dimensional dynamics from large neural ensembles

13 0.57033181 308 nips-2013-Spike train entropy-rate estimation using hierarchical Dirichlet process priors

14 0.54363507 86 nips-2013-Demixing odors - fast inference in olfaction

15 0.53449965 236 nips-2013-Optimal Neural Population Codes for High-dimensional Stimulus Variables

16 0.52958846 173 nips-2013-Least Informative Dimensions

17 0.52104354 208 nips-2013-Neural representation of action sequences: how far can a simple snippet-matching model take us?

18 0.46547654 157 nips-2013-Learning Multi-level Sparse Representations

19 0.45496261 266 nips-2013-Recurrent linear models of simultaneously-recorded neural populations

20 0.42664737 237 nips-2013-Optimal integration of visual speed across different spatiotemporal frequency channels


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(2, 0.022), (16, 0.055), (33, 0.126), (34, 0.111), (41, 0.02), (49, 0.141), (56, 0.099), (70, 0.04), (72, 0.17), (85, 0.049), (89, 0.044), (93, 0.036), (95, 0.012)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.86029011 262 nips-2013-Real-Time Inference for a Gamma Process Model of Neural Spiking

Author: David Carlson, Vinayak Rao, Joshua T. Vogelstein, Lawrence Carin

Abstract: With simultaneous measurements from ever-increasing populations of neurons, there is a growing need for sophisticated tools to recover signals from individual neurons. In electrophysiology experiments, this classically proceeds in a two-step process: (i) threshold the waveforms to detect putative spikes and (ii) cluster the waveforms into single units (neurons). We extend previous Bayesian nonparametric models of neural spiking to jointly detect and cluster neurons using a Gamma process model. Importantly, we develop an online approximate inference scheme enabling real-time analysis, with performance exceeding the previous state-of-the-art. Via exploratory data analysis—using data with partial ground truth as well as two novel data sets—we find several features of our model collectively contribute to our improved performance, including: (i) accounting for colored noise, (ii) detecting overlapping spikes, (iii) tracking waveform dynamics, and (iv) using multiple channels. We hope to enable novel experiments simultaneously measuring many thousands of neurons and possibly adapting stimuli dynamically to probe ever deeper into the mysteries of the brain.

2 0.84626305 126 nips-2013-Gaussian Process Conditional Copulas with Applications to Financial Time Series

Author: José Miguel Hernández-Lobato, James R. Lloyd, Daniel Hernández-Lobato

Abstract: The estimation of dependencies between multiple variables is a central problem in the analysis of financial time series. A common approach is to express these dependencies in terms of a copula function. Typically the copula function is assumed to be constant but this may be inaccurate when there are covariates that could have a large influence on the dependence structure of the data. To account for this, a Bayesian framework for the estimation of conditional copulas is proposed. In this framework the parameters of a copula are non-linearly related to some arbitrary conditioning variables. We evaluate the ability of our method to predict time-varying dependencies on several equities and currencies and observe consistent performance gains compared to static copula models and other time-varying copula methods.
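For readers unfamiliar with copulas, a static Gaussian copula is the simplest reference point: it couples uniform marginals through a correlated Gaussian. The sketch below is only that static baseline; the paper's contribution is letting the copula parameter vary with covariates through a GP, which is not shown:

```python
import math
import numpy as np

def gaussian_copula_sample(rho, n, rng):
    """Draw n pairs of Uniform(0,1) variables whose dependence follows a
    Gaussian copula with correlation rho."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    std_normal_cdf = np.vectorize(
        lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))
    return std_normal_cdf(z)  # push each Gaussian margin through its CDF

rng = np.random.default_rng(1)
u = gaussian_copula_sample(0.9, 5000, rng)
print(np.corrcoef(u[:, 0], u[:, 1])[0, 1])  # strong positive dependence
```

In the conditional setting, rho would itself be a function of the conditioning variables rather than the fixed constant used here.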

3 0.84489566 263 nips-2013-Reasoning With Neural Tensor Networks for Knowledge Base Completion

Author: Richard Socher, Danqi Chen, Christopher D. Manning, Andrew Ng

Abstract: Knowledge bases are an important resource for question answering and other tasks but often suffer from incompleteness and lack of ability to reason over their discrete entities and relationships. In this paper we introduce an expressive neural tensor network suitable for reasoning over relationships between two entities. Previous work represented entities as either discrete atomic units or with a single entity vector representation. We show that performance can be improved when entities are represented as an average of their constituting word vectors. This allows sharing of statistical strength between, for instance, facts involving the “Sumatran tiger” and “Bengal tiger.” Lastly, we demonstrate that all models improve when these word vectors are initialized with vectors learned from unsupervised large corpora. We assess the model by considering the problem of predicting additional true relations between entities given a subset of the knowledge base. Our model outperforms previous models and can classify unseen relationships in WordNet and FreeBase with an accuracy of 86.2% and 90.0%, respectively.
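The core of the neural tensor network is a bilinear tensor scoring layer, roughly u^T tanh(e1^T W e2 + V[e1; e2] + b); the numpy sketch below uses made-up dimensions and random weights just to show the shapes involved:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 4, 2  # entity dimension and number of tensor slices (illustrative)

# Relation-specific parameters, randomly initialised for the sketch.
W = rng.normal(size=(k, d, d))    # bilinear tensor slices
V = rng.normal(size=(k, 2 * d))   # standard feed-forward weights
b = rng.normal(size=k)
u = rng.normal(size=k)

def ntn_score(e1, e2):
    """Score an (entity1, relation, entity2) triple with one relation's
    parameters: u^T tanh(e1^T W e2 + V [e1; e2] + b)."""
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)
    linear = V @ np.concatenate([e1, e2])
    return float(u @ np.tanh(bilinear + linear + b))

e1, e2 = rng.normal(size=d), rng.normal(size=d)
print(ntn_score(e1, e2))  # a scalar plausibility score for the triple
```

In the paper, e1 and e2 would be averages of pre-trained word vectors, and the parameters are trained so that true triples score higher than corrupted ones.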

4 0.80755514 266 nips-2013-Recurrent linear models of simultaneously-recorded neural populations

Author: Marius Pachitariu, Biljana Petreska, Maneesh Sahani

Abstract: Population neural recordings with long-range temporal structure are often best understood in terms of a common underlying low-dimensional dynamical process. Advances in recording technology provide access to an ever-larger fraction of the population, but the standard computational approaches available to identify the collective dynamics scale poorly with the size of the dataset. We describe a new, scalable approach to discovering low-dimensional dynamics that underlie simultaneously recorded spike trains from a neural population. We formulate the Recurrent Linear Model (RLM) by generalising the Kalman-filter-based likelihood calculation for latent linear dynamical systems to incorporate a generalised-linear observation process. We show that RLMs describe motor-cortical population data better than either directly-coupled generalised-linear models or latent linear dynamical system models with generalised-linear observations. We also introduce the cascaded generalised-linear model (CGLM) to capture low-dimensional instantaneous correlations in neural populations. The CGLM describes the cortical recordings better than either Ising or Gaussian models and, like the RLM, can be fit exactly and quickly. The CGLM can also be seen as a generalisation of a low-rank Gaussian model, in this case factor analysis. The computational tractability of the RLM and CGLM allow both to scale to very high-dimensional neural data.
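The RLM is described as a generalisation of the Kalman-filter likelihood. For reference, here is a minimal Kalman filter for the plain linear-Gaussian case (1-D latent state, 2-D observations, all parameters made up):

```python
import numpy as np

def kalman_filter(ys, A, C, Q, R, mu0, V0):
    """Filtered latent means for x_t = A x_{t-1} + w_t, y_t = C x_t + v_t
    with Gaussian noise (w ~ N(0, Q), v ~ N(0, R))."""
    mu, V, means = mu0, V0, []
    I = np.eye(len(mu0))
    for y in ys:
        mu_p, V_p = A @ mu, A @ V @ A.T + Q   # predict
        S = C @ V_p @ C.T + R                 # innovation covariance
        K = V_p @ C.T @ np.linalg.inv(S)      # Kalman gain
        mu = mu_p + K @ (y - C @ mu_p)        # update
        V = (I - K @ C) @ V_p
        means.append(mu.copy())
    return np.array(means)

# Simulate a 1-D latent process observed through two noisy channels.
rng = np.random.default_rng(0)
A, C = np.array([[0.95]]), np.array([[1.0], [0.5]])
Q, R = np.array([[0.1]]), 0.2 * np.eye(2)
x, ys = np.zeros(1), []
for _ in range(50):
    x = A @ x + rng.multivariate_normal([0.0], Q)
    ys.append(C @ x + rng.multivariate_normal([0.0, 0.0], R))
means = kalman_filter(ys, A, C, Q, R, np.zeros(1), np.eye(1))
print(means.shape)  # (50, 1): one filtered latent mean per time step
```

The RLM replaces the Gaussian observation y_t = C x_t + v_t with a generalised-linear spiking observation, which is what adapts this recursion to spike-count data.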

5 0.80381048 6 nips-2013-A Determinantal Point Process Latent Variable Model for Inhibition in Neural Spiking Data

Author: Jasper Snoek, Richard Zemel, Ryan P. Adams

Abstract: Point processes are popular models of neural spiking behavior as they provide a statistical distribution over temporal sequences of spikes and help to reveal the complexities underlying a series of recorded action potentials. However, the most common neural point process models, the Poisson process and the gamma renewal process, do not capture interactions and correlations that are critical to modeling populations of neurons. We develop a novel model based on a determinantal point process over latent embeddings of neurons that effectively captures and helps visualize complex inhibitory and competitive interactions. We show that this model is a natural extension of the popular generalized linear model to sets of interacting neurons. The model is extended to incorporate gain control or divisive normalization, and the modulation of neural spiking based on periodic phenomena. Applied to neural spike recordings from the rat hippocampus, we see that the model captures inhibitory relationships, a dichotomy of classes of neurons, and a periodic modulation by the theta rhythm known to be present in the data.

6 0.80208915 131 nips-2013-Geometric optimisation on positive definite matrices for elliptically contoured distributions

7 0.80009383 221 nips-2013-On the Expressive Power of Restricted Boltzmann Machines

8 0.79932427 70 nips-2013-Contrastive Learning Using Spectral Methods

9 0.79720014 303 nips-2013-Sparse Overlapping Sets Lasso for Multitask Learning and its Application to fMRI Analysis

10 0.7928766 244 nips-2013-Parametric Task Learning

11 0.79230976 121 nips-2013-Firing rate predictions in optimal balanced networks

12 0.78889853 323 nips-2013-Synthesizing Robust Plans under Incomplete Domain Models

13 0.78878051 274 nips-2013-Relevance Topic Model for Unstructured Social Group Activity Recognition

14 0.78027683 345 nips-2013-Variance Reduction for Stochastic Gradient Optimization

15 0.76166743 167 nips-2013-Learning the Local Statistics of Optical Flow

16 0.75972611 336 nips-2013-Translating Embeddings for Modeling Multi-relational Data

17 0.75411308 141 nips-2013-Inferring neural population dynamics from multiple partial recordings of the same neural circuit

18 0.7417112 64 nips-2013-Compete to Compute

19 0.74023575 353 nips-2013-When are Overcomplete Topic Models Identifiable? Uniqueness of Tensor Tucker Decompositions with Structured Sparsity

20 0.7395004 236 nips-2013-Optimal Neural Population Codes for High-dimensional Stimulus Variables