nips nips2008 nips2008-59 knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Jan Gasthaus, Frank Wood, Dilan Gorur, Yee W. Teh
Abstract: In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle “appearance” and “disappearance” of neurons. Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground truth labeling is known. 1
Reference: text
sentIndex sentText sentNum sentScore
1 Abstract In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle “appearance” and “disappearance” of neurons. [sent-5, score-1.402]
2 Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. [sent-6, score-0.742]
3 We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground truth labeling is known. [sent-7, score-0.647]
4 1 Introduction Spike sorting (see [1] and [2] for review and methodological background) is the name given to the problem of grouping action potentials by source neuron. [sent-8, score-0.563]
5 In each of these, instead of pursuing the optimal sorting, multiple sortings of the spikes are produced (in fact what each model produces is a posterior distribution over spike trains). [sent-14, score-0.557]
6 Neural data analyses may then be averaged over the resulting spike train distribution to account for uncertainties that may have arisen at various points in the spike sorting process and would not have been explicitly accounted for otherwise. [sent-15, score-1.165]
7 Our work builds on this new Bayesian approach to spike sorting, going beyond previous approaches in the way steps five and six are accomplished. [sent-16, score-0.413]
8 Specifically we apply the generalized Polya urn dependent Dirichlet process mixture model (GPUDPM) [7, 8] to the problem of spike sorting and show how it allows us to model waveform drift and account for neuron appearance and disappearance. [sent-17, score-1.318]
9 By introducing a time dependent likelihood into the model we are also able to eliminate refractory period violations. [sent-18, score-0.279]
10 The need for a spike sorting approach with these features arises from several domains. [sent-19, score-0.729]
11-13 Waveform non-stationarities, either due to changes in the recording environment (e.g. movement of the electrode) or due to changes in the firing activity of the neuron itself (e.g. burstiness), cause almost all current spike sorting approaches to fail. [sent-20, score-0.144] [sent-22, score-0.149] [sent-24, score-0.729]
14 This is because most pool waveforms over time, discarding the time at which the action potentials were observed. [sent-25, score-0.467]
15 A notable exception to this is the spike sorting approach of [9], in which waveforms were pooled and clustered in short fixed time intervals. [sent-26, score-0.949]
16 Multiple Gaussian mixture models are then fit to the waveforms in each interval and then are pruned and smoothed until a single coherent sequence of mixture models is left that describes the entire time course of the data. [sent-27, score-0.312]
17-18 A recent study by [10] puts forward a compelling case for online spike sorting algorithms that can handle waveform non-stationarity, as well as sudden jumps in waveform shape (e.g. abrupt electrode movements due to high acceleration events), and appearance and disappearance of neurons from the recording over time. [sent-30, score-1.221] [sent-32, score-0.318]
19 This paper introduces a chronic recording paradigm in which a chronically implanted recording device is mated with appropriate storage such that very long-term recordings can be made. [sent-33, score-0.336]
20 Unfortunately, as the animal being recorded from is allowed its full range of natural movements, accelerations may cause the signal characteristics of the recording to vary dramatically over short time intervals. [sent-34, score-0.214]
21 As such data can theoretically be recorded forever without stopping, forward-backward spike sorting algorithms such as that in [9] are ruled out. [sent-35, score-0.786]
22 2 Review Our model is based on the generalized Polya urn Dirichlet process mixture model (GPUDPM) described in [7, 8]. [sent-37, score-0.121]
23 The GPUDPM is a time dependent Dirichlet process (DDP) mixture model formulated in the Chinese restaurant process (CRP) sampling representation of a Dirichlet process mixture model (DPM). [sent-38, score-0.247]
24 In this representation we track the distinct φ_k drawn from G_0 for each cluster, and use the Chinese restaurant process to sample the conditional distributions of the indicator variables c_i, with P(c_i = k | c_1, ..., c_{i−1}, α) proportional to the table count m_k for an existing cluster k and to α for a new cluster. [sent-51, score-0.138]
25 The GPUDPM consists of a sequence of such DPMs, one per time step t = 1, ..., T, all tied together through a particular way of sharing the component parameters φ_k^t and table occupancy counts m_k^t between adjacent time steps (here t indexes the parameters and cluster sizes of the T DPMs). [sent-61, score-0.263]
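As a minimal illustration of the CRP sampling step above, the following Python sketch computes the conditional assignment probabilities from the current table counts and the concentration parameter α; the function and variable names are hypothetical, not taken from the paper.

    import numpy as np

    def crp_probs(counts, alpha):
        """CRP conditional for the next indicator: an existing cluster k with
        probability proportional to counts[k], a new cluster with probability
        proportional to alpha."""
        weights = np.append(np.asarray(counts, dtype=float), alpha)
        return weights / weights.sum()

    # Example: three occupied tables with 5, 2 and 1 customers, alpha = 1.0
    probs = crp_probs([5, 2, 1], alpha=1.0)
    assignment = np.random.choice(len(probs), p=probs)   # index 3 means "open a new cluster"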
26 Dependence among the m_k^t is introduced by perturbing the number of customers sitting at each table k when moving forward through time. [sent-62, score-0.273]
27 Let m^t = (m_1^t, ..., m_{K^t}^t) denote the vector containing the number of customers sitting at each table at time t before a “deletion” step, where K^t is the number of non-empty tables at time t. [sent-66, score-0.351]
28 The perturbation of the class counts from one step to the next is then governed by the process m^{t+1} | m^t, ρ ∼ { m^t − ξ^t with probability γ; m^t − ζ^t with probability 1 − γ } (3), where ξ_k^t ∼ Binomial(m_k^t, 1 − ρ), ζ_j^t = m_j^t for j = j* and ζ_j^t = 0 for j ≠ j*, with j* ∼ Discrete(m^t / Σ_{k=1}^{K^t} m_k^t). [sent-68, score-0.575]
29 Before seating the customers arriving at time step t + 1, the number of customers sitting at each table is initialized to m^{t+1}. [sent-69, score-0.304]
30 This perturbation process can either remove some number of customers from a table or effectively delete a table altogether. [sent-70, score-0.139]
31 This deletion procedure accounts for the ability of the GPUDPM to model births and deaths of clusters. [sent-71, score-0.115]
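A minimal sketch of this perturbation step, assuming the reconstruction of Eq. (3) above (binomial thinning of every table with probability γ, otherwise emptying one table chosen proportionally to its occupancy); the function name and defaults are illustrative only.

    import numpy as np

    def perturb_counts(m, rho, gamma, rng=None):
        """One GPUDPM count perturbation step (sketch of Eq. 3): with probability
        gamma every table is thinned binomially, otherwise a single table chosen
        proportionally to its occupancy is emptied."""
        rng = np.random.default_rng() if rng is None else rng
        m = np.asarray(m, dtype=int)
        if rng.random() < gamma:
            xi = rng.binomial(m, 1.0 - rho)          # xi_k ~ Binomial(m_k, 1 - rho)
            return m - xi
        j = rng.choice(len(m), p=m / m.sum())        # table to empty
        zeta = np.zeros_like(m)
        zeta[j] = m[j]
        return m - zeta

    m_next = perturb_counts([10, 4, 1], rho=0.985, gamma=1 - 1e-5)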
32 This drift is modeled by tying together the component parameters φ_k^t through a transition kernel P(φ_k^t | φ_k^{t−1}) from which the class parameter at time t is sampled given the class parameter at time t − 1. [sent-73, score-0.194]
33 3 Model In order to apply the GPUDPM model to spike sorting problems one first has to make a number of modeling assumptions. [sent-78, score-0.729]
34 In the following we describe modeling choices we made for the spike sorting task, as well as how the continuous spike occurrence times can be incorporated into the model to allow for correct treatment of neuron behaviour during the absolute refractory period. [sent-80, score-1.502]
35-36 Let {x^t}_{t=1}^T be the set of action potential waveforms extracted from an extracellular recording (referred to as “spikes” in the following), and let τ^1, ..., τ^T be the time stamps (in ms) associated with these spikes in ascending order. [sent-81, score-0.665] [sent-84, score-0.161]
37 We distinguish between the index t = 1, ..., T corresponding to the time steps in the GPUDPM model and the actual spike times τ^t at which the spike x^t occurs in the recording. [sent-90, score-0.865]
38-39 We assume that only one spike occurs per time step t, i.e. we set N = 1 in the model above and identify the indicator vector c^t = (c_1^t) with the single indicator c^t. [sent-91, score-0.452] [sent-93, score-0.118]
40 It is well known that the distribution of action potential waveforms originating from a single neuron in a PCA feature space is well approximated by a Normal distribution [1]. [sent-94, score-0.54]
41 While correlations between the components can be observed in neural recordings, they can at least partially be attributed to temporal waveform variation. [sent-111, score-0.234]
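As a small illustration of this modeling choice, the sketch below evaluates a per-cluster Gaussian log-likelihood for a spike's PCA features; the covariance passed by the caller is an assumption for illustration, not the paper's exact parameterization.

    import numpy as np
    from scipy.stats import multivariate_normal

    def cluster_log_likelihood(x, mu, cov):
        """Gaussian waveform model: log-likelihood of a spike's PCA feature
        vector x under one cluster's Normal distribution (illustrative)."""
        return multivariate_normal.logpdf(x, mean=mu, cov=cov)

    x = np.array([0.3, -1.2, 0.5])                    # first three principal components
    ll = cluster_log_likelihood(x, mu=np.zeros(3), cov=np.eye(3))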
42-43 To account for the fact that neurons have an absolute refractory period following each action potential during which no further action potential can occur, we extend the GPUDPM by conditioning the model on the spike occurrence times τ^1, ..., τ^T and modifying the conditional probability of assigning a spike to a cluster given the other cluster labels and the spike occurrence times τ^1, ..., τ^t in the following way: [sent-112, score-1.17] [sent-115, score-1.013]
44-45 P(c^t = k | m^t, c^{1:t−1}, τ^{1:t}, α) ∝ 0 if τ^t − τ̂_k^t ≤ r_abs; ∝ m_k^t if τ^t − τ̂_k^t > r_abs and k ∈ {1, ..., K^{t−1}}; ∝ α if τ^t − τ̂_k^t > r_abs and k = K^{t−1} + 1 (6), where τ̂_k^t is the spike time of the last spike assigned to cluster k before time step t. [sent-118, score-0.201] [sent-121, score-0.126]
46 Essentially, the conditional probability of assigning the spike at time t to cluster k is zero if the difference of the occurrence time of this spike and the occurrence time of the last spike associated with cluster k is smaller than the refractory period rabs . [sent-124, score-1.931]
47 If the time difference is larger than rabs then the usual CRP conditional probabilities are used. [sent-125, score-0.126]
48 In terms of the Chinese restaurant metaphor, this setup corresponds to a restaurant in which seating a customer at a table removes that table as an option for new customers for some period of time. [sent-126, score-0.332]
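A minimal sketch of the refractory-constrained assignment rule in Eq. (6): existing clusters whose last spike falls within r_abs of the current spike time get zero probability, otherwise the usual CRP weights apply; the value r_abs = 2.0 ms used here is an assumption, not taken from the paper.

    import numpy as np

    def refractory_crp_probs(counts, last_spike_time, t_now, alpha, r_abs=2.0):
        """Refractory-constrained assignment probabilities (sketch of Eq. 6):
        an existing cluster is blocked if its most recent spike occurred within
        r_abs ms of the current spike; a new cluster is always allowed."""
        counts = np.asarray(counts, dtype=float)
        last = np.asarray(last_spike_time, dtype=float)
        weights = np.where(t_now - last > r_abs, counts, 0.0)
        weights = np.append(weights, alpha)          # weight for opening a new cluster
        return weights / weights.sum()

    p = refractory_crp_probs([12, 3], last_spike_time=[95.0, 99.2],
                             t_now=100.0, alpha=0.5)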
49 The transition kernel P(φ_k^t | φ_k^{t−1}) specifies how the action potential waveshape can vary over time. [sent-131, score-0.34]
50 To meet the technical requirements of the GPUDPM, and because its waveform drift modeling semantics are reasonable, we use the update rule of the Metropolis algorithm [11] as the transition kernel P(φ_k^t | φ_k^{t−1}). [sent-132, score-0.35]
51 This choice of P(φ_k^t | φ_k^{t−1}) ensures that G_0 is the invariant distribution of the transition kernel, while at the same time allowing us to control the amount of correlation between time steps through σ. [sent-136, score-0.115]
52 A transition kernel of this form allows the distribution of the action potential waveforms to vary slowly (if σ is chosen small) from one time step to the next both in mean waveform shape as well as in variance. [sent-137, score-0.714]
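A minimal sketch of such a Metropolis-update transition kernel, assuming for illustration that φ is the cluster mean and that G_0 is a multivariate Gaussian base measure; a symmetric Gaussian random-walk proposal with standard deviation σ is accepted or rejected against G_0, which keeps G_0 invariant while σ limits per-step drift.

    import numpy as np
    from scipy.stats import multivariate_normal

    def drift_kernel(phi, g0_mean, g0_cov, sigma=0.01, rng=None):
        """Sketch of the Metropolis-update transition kernel P(phi_t | phi_{t-1}):
        propose a small symmetric perturbation, then accept or reject against the
        base measure G0 so that G0 remains the invariant distribution."""
        rng = np.random.default_rng() if rng is None else rng
        proposal = phi + sigma * rng.standard_normal(phi.shape)
        log_ratio = (multivariate_normal.logpdf(proposal, g0_mean, g0_cov)
                     - multivariate_normal.logpdf(phi, g0_mean, g0_cov))
        accept = rng.random() < np.exp(min(0.0, log_ratio))
        return proposal if accept else phi

    phi_next = drift_kernel(np.zeros(3), g0_mean=np.zeros(3), g0_cov=np.eye(3))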
53 1 Methodology Experiments were performed on a subset of the publicly available data set described in [12, 13], which consists of simultaneous intracellular and extracellular recordings of cells in the hippocampus of anesthetized rats. [sent-141, score-0.454]
54 Recordings from an extracellular tetrode and an intracellular electrode were made simultaneously, such that the cell recorded on the intracellular electrode was also recorded extracellularly by a tetrode. [sent-142, score-0.799]
55 Action potentials detected on the intracellular (IC) channel are an almost certain indicator that the cell being recorded spiked. [sent-143, score-0.396]
56 Action potentials detected on the extracellular (EC) channels may include the action potentials generated by the intracellularly recorded cell, but almost certainly include spiking activity from other cells as well. [sent-144, score-0.684]
57 The intracellular recording can therefore be used to obtain a ground truth labeling for the spikes originating from one neuron, which in turn can be used to evaluate the performance of human sorters and automatic spike sorting algorithms that sort extracellular recordings [13]. [sent-145, score-1.737]
58 However, by this method ground truth can only be determined for one of the neurons whose spikes are present in the extracellular recording, and this should be kept in mind when evaluating the performance of spike sorting algorithms on such a data set. [sent-146, score-1.208]
59 Neither the correct number of distinct neurons recorded from by the extracellular electrode nor the correct labeling for any spikes not originating from the neuron recorded intracellularly can be determined by this methodology. [sent-147, score-0.796]
60 Table 1: Performance of both algorithms on the two data sets: % false positives (FP), % false negatives (FN), # of refractory period violations (RPV). [sent-165, score-0.195]
61 The subset of that data set that was used for the experiments consisted of two recordings from different animals (4 minutes each), recorded at 10 kHz. [sent-167, score-0.157]
62 The data was bandpass filtered (300Hz – 3kHz), and spikes on the intracellular channel were detected as the local maxima of the first derivative of the signal larger than a manually chosen threshold. [sent-168, score-0.326]
63 Spikes on the extracellular channels were determined as the local minima exceeding 4 standard deviations in magnitude. [sent-169, score-0.206]
64 Spike waveforms of length 1 ms were extracted from around each spike (4 samples before and 5 samples after the peak). [sent-170, score-0.617]
65 The positions of the minima within the spike waveforms were aligned by upsampling, shifting and then downsampling the waveforms. [sent-171, score-0.594]
66 The extracellular spikes corresponding to action potentials from the identified neuron were determined as the spikes occurring within 0. [sent-172, score-0.781]
67 For each spike the signals from the four tetrode channels were combined into a vector of length 40. [sent-174, score-0.522]
68 Each dimension was scaled by the maximal variance among all dimensions and PCA dimensionality reduction was performed on the scaled data sets (for each of the two recordings separately). [sent-175, score-0.1]
69 The first three principal components were used as input to our spike sorting algorithm. [sent-176, score-0.729]
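A hedged sketch of this preprocessing pipeline (bandpass filtering, threshold detection, 1 ms window extraction across the four channels, scaling by the maximal variance, PCA); the filter order, detection on the channel-wise minimum, and the omission of the upsample/shift/downsample alignment step are simplifying assumptions, not the paper's exact settings.

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.decomposition import PCA

    def preprocess(ec_signals, fs=10000, threshold_sd=4.0, n_components=3):
        """Sketch: bandpass filter each tetrode channel, detect spikes as local
        minima exceeding threshold_sd standard deviations, cut 1 ms windows
        (4 samples before, 5 after the peak) from all channels, scale by the
        maximal variance, and reduce with PCA."""
        b, a = butter(3, [300, 3000], btype="band", fs=fs)
        filtered = np.array([filtfilt(b, a, ch) for ch in ec_signals])
        detect = filtered.min(axis=0)                          # channel-wise minimum
        thresh = -threshold_sd * detect.std()
        peaks = [i for i in range(4, detect.size - 5)
                 if detect[i] < thresh and detect[i] == detect[i - 4:i + 6].min()]
        waveforms = np.array([filtered[:, i - 4:i + 6].reshape(-1) for i in peaks])
        waveforms = waveforms / waveforms.var(axis=0).max()    # scale by maximal variance
        features = PCA(n_components=n_components).fit_transform(waveforms)
        return features, np.array(peaks) / fs * 1000.0         # spike times in ms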
70 The first recording (data set 1) consists of 3187 spikes, of which 831 originate from the identified neuron, while the second (data set 2) contains 3502 spikes, 553 of which were also detected on the IC channel. [sent-177, score-0.16]
71 As shown in Figure 1, there is a clearly visible change in waveform shape of the identified neuron over time in data set 1, while in data set 2 the waveform shapes remain roughly constant. [sent-178, score-0.654]
72 Presumably this change in waveform shape is due to the slow death of the cell as a result of the damage done to the cell by the intracellular recording procedure. [sent-179, score-0.617]
73 We set ρ = 0.985 and γ = 1 − 10^−5, reflecting the fact that we consider relative firing rates of the neurons to stay roughly constant over time and neuron death a relatively rare process, respectively. [sent-183, score-0.271]
74 We set σ = 0.01, favoring small changes in the cluster parameters from one time step to the next. [sent-185, score-0.128]
75 For comparison, the same data set was also sorted using the DPM-based spike sorting algorithm described in [6], which pools waveforms over time and thus does not make use of any information about the occurrence times of the spikes. [sent-191, score-1.01]
76 As expected, our algorithm outperforms the DPM-based algorithm on data set 1, which exhibits waveform drift that the DPM cannot account for. [sent-201, score-0.313]
77 As data set 2 does not show waveform drift, it can be adequately modeled without introducing time dependence. [sent-202, score-0.352]
78 Figure 1: A comparison of DPM to GPUDPM spike sorting for two channels of tetrode data for which the ground truth labeling of one neuron is known (panels (a,b): ground truth; (c,d): DPM; (e,f): GPUDPM). [sent-208, score-1.165]
79 The top row of graphs shows the ground truth labeling of both data sets where the action potentials known to have been generated by a single neuron are labeled with x’s. [sent-211, score-0.574]
80 Other points in the top row of graphs may also correspond to action potentials but as we do not know the ground truth labeling for them we label them all with dots. [sent-212, score-0.451]
81 The middle row shows the maximum a posteriori labeling of both data sets produced by a DP mixture model spike sorting algorithm which does not utilize the time at which waveforms were captured, nor does it model waveform shape change. [sent-213, score-1.347]
82 The bottom row shows the maximum a posteriori labeling of both data sets produced by our GPUDPM spike sorting algorithm which does model both the time at which the spikes occurred and the changing action potential waveshape. [sent-214, score-1.21]
83 The left column shows that the GPUDPM performs better than the DPM when the waveshape of the underlying neurons changes over time. [sent-215, score-0.188]
84 The right column shows that the GPUDPM performs no worse than the DPM when the waveshape of the underlying neurons stays constant. [sent-216, score-0.162]
85 With a larger number of particles (or samples in the Gibbs sampler), one would expect both models to perform equally well, with possibly a slight advantage for the GPUDPM which can exploit the information contained in the refractory period violations. [sent-219, score-0.283]
86 As dictated by the model, the GPUDPM algorithm does not assign two spikes that are within the refractory period of each other to the same cluster, whereas the DPM does not incorporate this restriction, and therefore can produce labelings containing refractory period violations. [sent-220, score-0.625]
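A small sketch of how refractory period violations in a labeling (the RPV column of Table 1) could be counted; the 2 ms refractory period used in the example is an assumed value, not taken from the paper.

    import numpy as np

    def count_rpv(spike_times_ms, labels, r_abs=2.0):
        """Count refractory period violations: pairs of consecutive spikes
        assigned to the same cluster that are closer together than r_abs ms."""
        spike_times_ms = np.asarray(spike_times_ms, dtype=float)
        labels = np.asarray(labels)
        violations = 0
        for k in np.unique(labels):
            t = np.sort(spike_times_ms[labels == k])
            violations += int(np.sum(np.diff(t) <= r_abs))
        return violations

    n_rpv = count_rpv([1.0, 2.5, 10.0, 10.5], [0, 0, 1, 1])   # -> 2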
87 In the former case the algorithm incorrectly places the waveforms from the IC channel and the waveform of another neuron in one cluster, while in the latter case the algorithm starts assigning the IC waveforms to a different cluster after some point in time. [sent-223, score-0.807]
88 Here the labels assigned to both the neuron with changing waveshape and one of the neurons with stationary waveshape change approximately half-way through the recording. [sent-230, score-0.389]
89 While for this data set we know this labeling to be wrong because we know the ground truth, in other recordings such an “injection of noise” could, for instance, signal a shift in electrode position requiring similar rapid births and deaths of clusters. [sent-233, score-0.375]
90 5 Discussion We have demonstrated that spike sorting using time-varying Dirichlet process mixtures in general, and more specifically our spike sorting specialization of the GPUDPM, produce promising results. [sent-234, score-1.481]
91 With such a spike sorting approach we, within a single model, are able to account for action potential waveform drift, refractory period violations, and neuron appearance and disappearance from a recording. [sent-235, score-1.599]
92 Previously no single model addressed all of these simultaneously, requiring solutions in the form of ad hoc combinations of strategies and algorithms that produced spike sorting results that were potentially difficult to characterize. [sent-236, score-0.729]
93 Directions for further research include the development of a more efficient sequential inference scheme or a hybrid sequential/Gibbs sampler scheme that allows propagation of interspike interval information backwards in time. [sent-240, score-0.084]
94 Parametric models of the interspike interval density for each neuron, whose parameters are inferred from the data and which can improve spike sorting results [15], can also be incorporated into the model. [sent-241, score-0.927]
95 A review of methods for spike sorting: the detection and classification of neural action potentials. [sent-247, score-0.563]
96 An application of reversible-jump Markov chain Monte Carlo to spike classification of multi-unit extracellular recordings. [sent-260, score-0.58]
97 HermesB: A continuous neural recording system for freely behaving primates. [sent-315, score-0.118]
98 Intracellular features predicted by extracellular recordings in the hippocampus in vivo. [sent-338, score-0.29]
99 Accuracy of tetrode spike separation as determined by simultaneous intracellular and extracellular measurements. [sent-348, score-0.787]
100 Improved spike-sorting by modeling firing statistics and burst-dependent spike amplitude attenuation: A Markov Chain Monte Carlo approach. [sent-359, score-0.413]
wordName wordTfidf (topN-words)
[('gpudpm', 0.452), ('spike', 0.413), ('sorting', 0.316), ('waveform', 0.234), ('waveforms', 0.181), ('dpm', 0.181), ('extracellular', 0.167), ('refractory', 0.153), ('action', 0.15), ('mt', 0.138), ('intracellular', 0.137), ('neuron', 0.123), ('spikes', 0.122), ('recording', 0.118), ('waveshape', 0.104), ('recordings', 0.1), ('potentials', 0.097), ('period', 0.087), ('rabs', 0.087), ('drift', 0.079), ('labeling', 0.072), ('customers', 0.07), ('dpms', 0.07), ('tetrode', 0.07), ('electrode', 0.068), ('truth', 0.067), ('ground', 0.065), ('cluster', 0.063), ('occurrence', 0.061), ('ct', 0.059), ('neurons', 0.058), ('recorded', 0.057), ('ic', 0.056), ('dirichlet', 0.053), ('interspike', 0.052), ('rpv', 0.052), ('urn', 0.052), ('violations', 0.052), ('potential', 0.049), ('restaurant', 0.047), ('polya', 0.046), ('mixture', 0.046), ('deletion', 0.045), ('particles', 0.043), ('detected', 0.042), ('disappearance', 0.042), ('sitting', 0.042), ('ci', 0.041), ('particle', 0.039), ('channels', 0.039), ('wood', 0.039), ('time', 0.039), ('cell', 0.038), ('dp', 0.038), ('lille', 0.037), ('originating', 0.037), ('avg', 0.037), ('transition', 0.037), ('births', 0.035), ('buzs', 0.035), ('csicsvari', 0.035), ('deaths', 0.035), ('henze', 0.035), ('intracellularly', 0.035), ('rosenbluth', 0.035), ('seating', 0.035), ('cj', 0.034), ('fp', 0.034), ('appearance', 0.032), ('sampler', 0.032), ('neurophysiology', 0.031), ('fn', 0.031), ('dilan', 0.03), ('ddp', 0.03), ('chinese', 0.029), ('false', 0.028), ('crp', 0.028), ('death', 0.028), ('mk', 0.027), ('proposal', 0.027), ('kt', 0.027), ('occurred', 0.027), ('publicly', 0.027), ('changes', 0.026), ('harris', 0.026), ('carlo', 0.026), ('channel', 0.025), ('monte', 0.025), ('arriving', 0.025), ('shape', 0.024), ('labelings', 0.023), ('hippocampus', 0.023), ('frank', 0.023), ('incorporated', 0.023), ('gibbs', 0.023), ('ms', 0.023), ('table', 0.023), ('process', 0.023), ('metropolis', 0.023), ('produced', 0.022)]
simIndex simValue paperId paperTitle
same-paper 1 0.99999946 59 nips-2008-Dependent Dirichlet Process Spike Sorting
Author: Jan Gasthaus, Frank Wood, Dilan Gorur, Yee W. Teh
Abstract: In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle “appearance” and “disappearance” of neurons. Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground truth labeling is known. 1
2 0.43475974 220 nips-2008-Spike Feature Extraction Using Informative Samples
Author: Zhi Yang, Qi Zhao, Wentai Liu
Abstract: This paper presents a spike feature extraction algorithm that targets real-time spike sorting and facilitates miniaturized microchip implementation. The proposed algorithm has been evaluated on synthesized waveforms and experimentally recorded sequences. When compared with many spike sorting approaches our algorithm demonstrates improved speed, accuracy and allows unsupervised execution. A preliminary hardware implementation has been realized using an integrated microchip interfaced with a personal computer. 1
3 0.23278117 137 nips-2008-Modeling Short-term Noise Dependence of Spike Counts in Macaque Prefrontal Cortex
Author: Arno Onken, Steffen Grünewälder, Matthias Munk, Klaus Obermayer
Abstract: Correlations between spike counts are often used to analyze neural coding. The noise is typically assumed to be Gaussian. Yet, this assumption is often inappropriate, especially for low spike counts. In this study, we present copulas as an alternative approach. With copulas it is possible to use arbitrary marginal distributions such as Poisson or negative binomial that are better suited for modeling noise distributions of spike counts. Furthermore, copulas place a wide range of dependence structures at the disposal and can be used to analyze higher order interactions. We develop a framework to analyze spike count data by means of copulas. Methods for parameter inference based on maximum likelihood estimates and for computation of mutual information are provided. We apply the method to our data recorded from macaque prefrontal cortex. The data analysis leads to three findings: (1) copula-based distributions provide significantly better fits than discretized multivariate normal distributions; (2) negative binomial margins fit the data significantly better than Poisson margins; and (3) the dependence structure carries 12% of the mutual information between stimuli and responses. 1
4 0.22457224 81 nips-2008-Extracting State Transition Dynamics from Multiple Spike Trains with Correlated Poisson HMM
Author: Kentaro Katahira, Jun Nishikawa, Kazuo Okanoya, Masato Okada
Abstract: Neural activity is non-stationary and varies across time. Hidden Markov Models (HMMs) have been used to track the state transition among quasi-stationary discrete neural states. Within this context, independent Poisson models have been used for the output distribution of HMMs; hence, the model is incapable of tracking the change in correlation without modulating the firing rate. To achieve this, we applied a multivariate Poisson distribution with correlation terms for the output distribution of HMMs. We formulated a Variational Bayes (VB) inference for the model. The VB could automatically determine the appropriate number of hidden states and correlation types while avoiding the overlearning problem. We developed an efficient algorithm for computing posteriors using the recursive relationship of a multivariate Poisson distribution. We demonstrated the performance of our method on synthetic data and a real spike train recorded from a songbird. 1
5 0.20234308 16 nips-2008-Adaptive Template Matching with Shift-Invariant Semi-NMF
Author: Jonathan L. Roux, Alain D. Cheveigné, Lucas C. Parra
Abstract: How does one extract unknown but stereotypical events that are linearly superimposed within a signal with variable latencies and variable amplitudes? One could think of using template matching or matching pursuit to find the arbitrarily shifted linear components. However, traditional matching approaches require that the templates be known a priori. To overcome this restriction we use instead semi Non-Negative Matrix Factorization (semiNMF) that we extend to allow for time shifts when matching the templates to the signal. The algorithm estimates templates directly from the data along with their non-negative amplitudes. The resulting method can be thought of as an adaptive template matching procedure. We demonstrate the procedure on the task of extracting spikes from single channel extracellular recordings. On these data the algorithm essentially performs spike detection and unsupervised spike clustering. Results on simulated data and extracellular recordings indicate that the method performs well for signalto-noise ratios of 6dB or higher and that spike templates are recovered accurately provided they are sufficiently different. 1
6 0.1516531 209 nips-2008-Short-Term Depression in VLSI Stochastic Synapse
7 0.1350764 113 nips-2008-Kernelized Sorting
8 0.11087389 204 nips-2008-Self-organization using synaptic plasticity
9 0.10158247 43 nips-2008-Cell Assemblies in Large Sparse Inhibitory Networks of Biologically Realistic Spiking Neurons
10 0.093777731 38 nips-2008-Bio-inspired Real Time Sensory Map Realignment in a Robotic Barn Owl
11 0.084564544 230 nips-2008-Temporal Difference Based Actor Critic Learning - Convergence and Neural Implementation
12 0.075306237 90 nips-2008-Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity
13 0.067309462 8 nips-2008-A general framework for investigating how far the decoding process in the brain can be simplified
14 0.066033885 177 nips-2008-Particle Filter-based Policy Gradient in POMDPs
15 0.059442054 60 nips-2008-Designing neurophysiology experiments to optimally constrain receptive field models along parametric submanifolds
16 0.057557065 110 nips-2008-Kernel-ARMA for Hand Tracking and Brain-Machine interfacing During 3D Motor Control
17 0.056319166 109 nips-2008-Interpreting the neural code with Formal Concept Analysis
18 0.056023899 58 nips-2008-Dependence of Orientation Tuning on Recurrent Excitation and Inhibition in a Network Model of V1
19 0.051982857 45 nips-2008-Characterizing neural dependencies with copula models
20 0.046453953 116 nips-2008-Learning Hybrid Models for Image Annotation with Partially Labeled Data
topicId topicWeight
[(0, -0.171), (1, 0.115), (2, 0.299), (3, 0.26), (4, -0.259), (5, 0.057), (6, 0.039), (7, -0.097), (8, -0.143), (9, -0.071), (10, 0.02), (11, -0.133), (12, 0.077), (13, -0.064), (14, 0.047), (15, 0.028), (16, -0.072), (17, -0.064), (18, 0.003), (19, 0.091), (20, 0.049), (21, 0.018), (22, 0.03), (23, -0.033), (24, -0.005), (25, 0.047), (26, -0.014), (27, -0.029), (28, -0.03), (29, 0.097), (30, 0.055), (31, -0.025), (32, 0.038), (33, -0.057), (34, -0.193), (35, -0.059), (36, -0.049), (37, -0.014), (38, 0.034), (39, 0.079), (40, 0.009), (41, 0.001), (42, 0.06), (43, -0.018), (44, 0.027), (45, -0.03), (46, -0.073), (47, 0.008), (48, -0.006), (49, -0.032)]
simIndex simValue paperId paperTitle
same-paper 1 0.96130002 59 nips-2008-Dependent Dirichlet Process Spike Sorting
Author: Jan Gasthaus, Frank Wood, Dilan Gorur, Yee W. Teh
Abstract: In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle “appearance” and “disappearance” of neurons. Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground truth labeling is known. 1
2 0.95848233 220 nips-2008-Spike Feature Extraction Using Informative Samples
Author: Zhi Yang, Qi Zhao, Wentai Liu
Abstract: This paper presents a spike feature extraction algorithm that targets real-time spike sorting and facilitates miniaturized microchip implementation. The proposed algorithm has been evaluated on synthesized waveforms and experimentally recorded sequences. When compared with many spike sorting approaches our algorithm demonstrates improved speed, accuracy and allows unsupervised execution. A preliminary hardware implementation has been realized using an integrated microchip interfaced with a personal computer. 1
3 0.82781571 209 nips-2008-Short-Term Depression in VLSI Stochastic Synapse
Author: Peng Xu, Timothy K. Horiuchi, Pamela A. Abshire
Abstract: We report a compact realization of short-term depression (STD) in a VLSI stochastic synapse. The behavior of the circuit is based on a subtractive single release model of STD. Experimental results agree well with simulation and exhibit expected STD behavior: the transmitted spike train has negative autocorrelation and lower power spectral density at low frequencies which can remove redundancy in the input spike train, and the mean transmission probability is inversely proportional to the input spike rate which has been suggested as an automatic gain control mechanism in neural systems. The dynamic stochastic synapse could potentially be a powerful addition to existing deterministic VLSI spiking neural systems. 1
4 0.80390596 81 nips-2008-Extracting State Transition Dynamics from Multiple Spike Trains with Correlated Poisson HMM
Author: Kentaro Katahira, Jun Nishikawa, Kazuo Okanoya, Masato Okada
Abstract: Neural activity is non-stationary and varies across time. Hidden Markov Models (HMMs) have been used to track the state transition among quasi-stationary discrete neural states. Within this context, independent Poisson models have been used for the output distribution of HMMs; hence, the model is incapable of tracking the change in correlation without modulating the firing rate. To achieve this, we applied a multivariate Poisson distribution with correlation terms for the output distribution of HMMs. We formulated a Variational Bayes (VB) inference for the model. The VB could automatically determine the appropriate number of hidden states and correlation types while avoiding the overlearning problem. We developed an efficient algorithm for computing posteriors using the recursive relationship of a multivariate Poisson distribution. We demonstrated the performance of our method on synthetic data and a real spike train recorded from a songbird. 1
5 0.79189444 16 nips-2008-Adaptive Template Matching with Shift-Invariant Semi-NMF
Author: Jonathan L. Roux, Alain D. Cheveigné, Lucas C. Parra
Abstract: How does one extract unknown but stereotypical events that are linearly superimposed within a signal with variable latencies and variable amplitudes? One could think of using template matching or matching pursuit to find the arbitrarily shifted linear components. However, traditional matching approaches require that the templates be known a priori. To overcome this restriction we use instead semi Non-Negative Matrix Factorization (semiNMF) that we extend to allow for time shifts when matching the templates to the signal. The algorithm estimates templates directly from the data along with their non-negative amplitudes. The resulting method can be thought of as an adaptive template matching procedure. We demonstrate the procedure on the task of extracting spikes from single channel extracellular recordings. On these data the algorithm essentially performs spike detection and unsupervised spike clustering. Results on simulated data and extracellular recordings indicate that the method performs well for signalto-noise ratios of 6dB or higher and that spike templates are recovered accurately provided they are sufficiently different. 1
6 0.62399405 137 nips-2008-Modeling Short-term Noise Dependence of Spike Counts in Macaque Prefrontal Cortex
7 0.45326257 38 nips-2008-Bio-inspired Real Time Sensory Map Realignment in a Robotic Barn Owl
8 0.45263079 204 nips-2008-Self-organization using synaptic plasticity
9 0.42648625 8 nips-2008-A general framework for investigating how far the decoding process in the brain can be simplified
10 0.37703922 90 nips-2008-Gaussian-process factor analysis for low-dimensional single-trial analysis of neural population activity
11 0.32216799 43 nips-2008-Cell Assemblies in Large Sparse Inhibitory Networks of Biologically Realistic Spiking Neurons
12 0.30289003 110 nips-2008-Kernel-ARMA for Hand Tracking and Brain-Machine interfacing During 3D Motor Control
13 0.28498697 113 nips-2008-Kernelized Sorting
14 0.2594972 41 nips-2008-Breaking Audio CAPTCHAs
15 0.24854606 230 nips-2008-Temporal Difference Based Actor Critic Learning - Convergence and Neural Implementation
16 0.23563367 60 nips-2008-Designing neurophysiology experiments to optimally constrain receptive field models along parametric submanifolds
17 0.22878018 186 nips-2008-Probabilistic detection of short events, with application to critical care monitoring
18 0.2261519 247 nips-2008-Using Bayesian Dynamical Systems for Motion Template Libraries
19 0.22384824 13 nips-2008-Adapting to a Market Shock: Optimal Sequential Market-Making
20 0.22229663 98 nips-2008-Hierarchical Semi-Markov Conditional Random Fields for Recursive Sequential Data
topicId topicWeight
[(6, 0.061), (7, 0.071), (12, 0.041), (15, 0.023), (28, 0.124), (51, 0.307), (57, 0.081), (59, 0.023), (63, 0.014), (71, 0.076), (77, 0.031), (78, 0.011), (83, 0.048)]
simIndex simValue paperId paperTitle
same-paper 1 0.73129052 59 nips-2008-Dependent Dirichlet Process Spike Sorting
Author: Jan Gasthaus, Frank Wood, Dilan Gorur, Yee W. Teh
Abstract: In this paper we propose a new incremental spike sorting model that automatically eliminates refractory period violations, accounts for action potential waveform drift, and can handle “appearance” and “disappearance” of neurons. Our approach is to augment a known time-varying Dirichlet process that ties together a sequence of infinite Gaussian mixture models, one per action potential waveform observation, with an interspike-interval-dependent likelihood that prohibits refractory period violations. We demonstrate this model by showing results from sorting two publicly available neural data recordings for which a partial ground truth labeling is known. 1
2 0.62001085 223 nips-2008-Structure Learning in Human Sequential Decision-Making
Author: Daniel Acuna, Paul R. Schrater
Abstract: We use graphical models and structure learning to explore how people learn policies in sequential decision making tasks. Studies of sequential decision-making in humans frequently find suboptimal performance relative to an ideal actor that knows the graph model that generates reward in the environment. We argue that the learning problem humans face also involves learning the graph structure for reward generation in the environment. We formulate the structure learning problem using mixtures of reward models, and solve the optimal action selection problem using Bayesian Reinforcement Learning. We show that structure learning in one and two armed bandit problems produces many of the qualitative behaviors deemed suboptimal in previous studies. Our argument is supported by the results of experiments that demonstrate humans rapidly learn and exploit new reward structure. 1
3 0.51048213 66 nips-2008-Dynamic visual attention: searching for coding length increments
Author: Xiaodi Hou, Liqing Zhang
Abstract: A visual attention system should respond placidly when common stimuli are presented, while at the same time keep alert to anomalous visual inputs. In this paper, a dynamic visual attention model based on the rarity of features is proposed. We introduce the Incremental Coding Length (ICL) to measure the perspective entropy gain of each feature. The objective of our model is to maximize the entropy of the sampled visual features. In order to optimize energy consumption, the limit amount of energy of the system is re-distributed amongst features according to their Incremental Coding Length. By selecting features with large coding length increments, the computational system can achieve attention selectivity in both static and dynamic scenes. We demonstrate that the proposed model achieves superior accuracy in comparison to mainstream approaches in static saliency map generation. Moreover, we also show that our model captures several less-reported dynamic visual search behaviors, such as attentional swing and inhibition of return. 1
4 0.50921744 62 nips-2008-Differentiable Sparse Coding
Author: J. A. Bagnell, David M. Bradley
Abstract: Prior work has shown that features which appear to be biologically plausible as well as empirically useful can be found by sparse coding with a prior such as a laplacian (L1 ) that promotes sparsity. We show how smoother priors can preserve the benefits of these sparse priors while adding stability to the Maximum A-Posteriori (MAP) estimate that makes it more useful for prediction problems. Additionally, we show how to calculate the derivative of the MAP estimate efficiently with implicit differentiation. One prior that can be differentiated this way is KL-regularization. We demonstrate its effectiveness on a wide variety of applications, and find that online optimization of the parameters of the KL-regularized model can significantly improve prediction performance. 1
5 0.50632036 220 nips-2008-Spike Feature Extraction Using Informative Samples
Author: Zhi Yang, Qi Zhao, Wentai Liu
Abstract: This paper presents a spike feature extraction algorithm that targets real-time spike sorting and facilitates miniaturized microchip implementation. The proposed algorithm has been evaluated on synthesized waveforms and experimentally recorded sequences. When compared with many spike sorting approaches our algorithm demonstrates improved speed, accuracy and allows unsupervised execution. A preliminary hardware implementation has been realized using an integrated microchip interfaced with a personal computer. 1
6 0.50488454 11 nips-2008-A spatially varying two-sample recombinant coalescent, with applications to HIV escape response
7 0.50405037 27 nips-2008-Artificial Olfactory Brain for Mixture Identification
8 0.50284797 75 nips-2008-Estimating vector fields using sparse basis field expansions
9 0.502334 131 nips-2008-MDPs with Non-Deterministic Policies
10 0.50230318 95 nips-2008-Grouping Contours Via a Related Image
11 0.50224137 79 nips-2008-Exploring Large Feature Spaces with Hierarchical Multiple Kernel Learning
12 0.50210655 161 nips-2008-On the Complexity of Linear Prediction: Risk Bounds, Margin Bounds, and Regularization
13 0.50185692 63 nips-2008-Dimensionality Reduction for Data in Multiple Feature Representations
14 0.50152475 205 nips-2008-Semi-supervised Learning with Weakly-Related Unlabeled Data : Towards Better Text Categorization
15 0.50105041 245 nips-2008-Unlabeled data: Now it helps, now it doesn't
16 0.50084335 176 nips-2008-Partially Observed Maximum Entropy Discrimination Markov Networks
17 0.50026256 179 nips-2008-Phase transitions for high-dimensional joint support recovery
18 0.49939847 208 nips-2008-Shared Segmentation of Natural Scenes Using Dependent Pitman-Yor Processes
19 0.49931806 16 nips-2008-Adaptive Template Matching with Shift-Invariant Semi-NMF
20 0.49928826 118 nips-2008-Learning Transformational Invariants from Natural Movies