nips nips2002 nips2002-5 knowledge-graph by maker-knowledge-mining

5 nips-2002-A Digital Antennal Lobe for Pattern Equalization: Analysis and Design


Source: pdf

Author: Alex Holub, Gilles Laurent, Pietro Perona

Abstract: Re-mapping patterns in order to equalize their distribution may greatly simplify both the structure and the training of classifiers. Here, the properties of one such map, obtained by running a few steps of a discrete-time dynamical system, are explored. The system is called 'Digital Antennal Lobe' (DAL) because it is inspired by recent studies of the antennal lobe, a structure in the olfactory system of the grasshopper. The pattern-spreading properties of the DAL, as well as its average behavior as a function of its (few) design parameters, are analyzed by extending previous results of van Vreeswijk and Sompolinsky. Furthermore, a technique for adapting the parameters of the initial design in order to obtain opportune noise-rejection behavior is suggested. Our results are demonstrated with a number of simulations.

Reference: text


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 Here, the properties of one such map obtained by running a few steps of a discrete-time dynamical system are explored. [sent-5, score-0.261]

2 The system is called 'Digital Antennal Lobe' (DAL) because it is inspired by recent studies of the antennal lobe, a structure in the olfactory system of the grasshopper. [sent-6, score-0.292]

3 1 Introduction The complexity of classifiers and the difficulty of learning their parameters are affected by the distribution of the input patterns. [sent-10, score-0.169]

4 It is easier to obtain simple and accurate classifiers when the patterns associated with different classes are spaced far apart and evenly in the input space. [sent-11, score-0.321]

5 In olfaction numerous odors which we wish to discriminate are chemically very similar, for example the citrus family (orange, lemon, lime . [sent-14, score-0.155]

6 The uneven chemical spacing for the odors of interest is expensive: in biological systems there is a premium on the simplicity of the classifiers that will recognize each individual odor. [sent-18, score-0.195]

7 N < 1000), one may transform an uneven distribution of patterns into an evenly distributed one by means of a map that 'randomizes' the position of each pattern, i. [sent-23, score-0.232]

8 We explore a simple dynamical system which realizes one such map for spreading patterns in a high-dimensional space. [sent-27, score-0.462]

9 The input space is the analog D-dimensional hypercube (0,1)^D and the output space the digital hypercube {0,1}^D. [sent-28, score-0.227]

10 The map is implemented by iterating a discrete-time first-order dynamical system consisting of two steps at each iteration: a first-order linear dynamical system followed by memoryless thresholding. [sent-29, score-0.48]

11 on the order of D neurons or transistors) and yet it achieves good equalization in a few time steps. [sent-32, score-0.21]

12 The ideas that we present are inspired by a computation that may take place in the olfactory system as suggested in Friedrich and Laurent [1] and Laurent [2, 3]. [sent-33, score-0.233]

13 2 The digital antennal lobe The dynamical system we propose is inspired by the overall architecture of the antennal lobe and is designed to explore its computational capabilities. [sent-36, score-1.046]

14 We apply two key simplifications: we discretize time into equally spaced 'epochs', synchronously updating the state of all the neurons in the network at each epoch, and we discretize the value of the state of each unit to the binary set {0, 1}. [sent-37, score-0.491]

15 At some time an input is applied, causing the network to take values that are different from zero. [sent-43, score-0.253]

16 The state of the network after a given constant number of time-steps (e. [sent-45, score-0.153]

17 Let us introduce the following notation: N_E, N_I, N_U Number of excitatory, inhibitory, and external input units. [sent-48, score-0.221]

18 Total number of excitatory and inhibitory units (N = N_E + N_I). Neuron index: i ∈ {1, . [sent-49, score-0.594]

19 x^t Vector of values for all excitatory and inhibitory units at time t. [sent-57, score-0.625]

20 a_E, a_I, a_U Excitatory, inhibitory, and input weights (A_ij ∈ {a_I, 0, a_E}). [sent-65, score-0.154]

21 T Activation thresholds for all the neurons. i^t Vector of pattern inputs. [sent-66, score-0.179]

22 B Matrix of excitatory connections from pattern inputs to units. [sent-67, score-0.426]

23 m^t = Σ_i x_i^t / N. m_U Fraction of the external inputs which are active. [sent-75, score-0.281]

24 Assume excitatory connection weight a_E = a_U = 1 (this is a normalization constant). [sent-77, score-0.392]

25 Generate random connection matrices A and B with average connectivity e and connection weights a_E, a_I. [sent-79, score-0.197]

26 Solve the following dynamical system forward in time from a zero initial condition (the system is restated in sentence 33 below). [sent-80, score-0.25]

27 (Left) Response of a DAL to 10 uniformly distributed random olfactory input patterns applied at time epoch t = 3. [sent-92, score-0.458]

28 Each vertical panel represents the state of excitatory units at a given time epoch (epochs 2,4,8,10 and excitatory units 1-200 are shown) in response to all stimuli. [sent-93, score-1.133]

29 In a given panel the row index refers to a given excitatory unit and the column index to a given input pattern (200 of 1024 excitatory units shown and 10 input patterns). [sent-94, score-1.176]

30 The salt-and-pepper pattern present in each panel indicates that excitatory units respond differently to each input pattern. [sent-99, score-0.768]

31 The horizontal streaks in the panels corresponding to early epochs (t = 4 and t = 6) indicate that the excitatory units respond equally or similarly to all input patterns. [sent-104, score-0.669]

32 The salt-and-pepper pattern in later epochs indicates that the time course of each excitatory unit's state becomes increasingly different in time. [sent-105, score-0.689]

33 x^0 = 0 (zero initial condition); y^t = A x^{t-1} + B i^t - T for t > 0 (neuronal input); x^t = 1(y^t) (state update), for some (constant) input pattern i^t. [sent-111, score-0.386]
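
These update equations are compact enough to simulate directly. The following is a minimal sketch (Python/numpy); the sizes, connectivity, weights, and threshold are illustrative placeholders rather than the paper's exact settings, and we assume the inhibitory weight a_I is negative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and parameters (not the paper's exact values).
N_E = N_I = 1024          # excitatory / inhibitory units
N_U = 100                 # external input units
conn = 0.1                # average connectivity
a_E, a_I, a_U, T = 1.0, -3.0, 1.0, 2.0

N = N_E + N_I
col_w = np.concatenate([np.full(N_E, a_E), np.full(N_I, a_I)])
A = (rng.random((N, N)) < conn) * col_w[None, :]   # A_ij in {a_I, 0, a_E}
B = (rng.random((N, N_U)) < conn) * a_U            # excitatory input weights

def run_dal(i_pat, t_max=10):
    """Iterate x^t = 1(A x^{t-1} + B i^t - T) from a zero initial condition."""
    x = np.zeros(N)
    states = []
    for _ in range(t_max):
        y = A @ x + B @ i_pat - T          # neuronal input
        x = (y > 0).astype(float)          # memoryless Heaviside thresholding
        states.append(x.copy())
    return np.array(states)

states = run_dal(rng.random(N_U))          # analog input pattern in (0,1)^N_U
print("mean activity per epoch:", states.mean(axis=1).round(3))
```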

34 The overall behavior of the DAL in response to different olfactory inputs is illustrated in Figure 1. [sent-113, score-0.229]

35 (1) In response to an input each unit exhibits a complex temporal pattern of activity. [sent-115, score-0.269]

36 (3) The average activity rate of the neurons is approximately independent of the input pattern. [sent-117, score-0.403]

37 (4) When very different input patterns are applied, the average normalized Hamming distance between excitatory unit states is almost maximal immediately after the onset of the input stimulus. [sent-118, score-0.762]

38 (5) When very similar input patterns are applied (e. [sent-119, score-0.236]

39 1 % average difference), the average normalized Hamming distance between excitatory unit patterns is initially very small, i. [sent-122, score-0.562]

40 initially the excitatory units respond similarly to similar inputs. [sent-124, score-0.498]
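
Properties (4) and (5) are easy to probe numerically: apply two nearly identical inputs to the same network and track the normalized Hamming distance between the excitatory states over epochs. A sketch, reusing the illustrative run_dal network from the earlier block:

```python
# Two inputs differing by roughly 1% on average; watch how quickly their
# excitatory-unit states decorrelate (normalized Hamming distance per epoch).
i1 = rng.random(N_U)
i2 = np.clip(i1 + 0.01 * rng.standard_normal(N_U), 0.0, 1.0)

s1, s2 = run_dal(i1), run_dal(i2)
hamming = np.abs(s1[:, :N_E] - s2[:, :N_E]).mean(axis=1)
print("normalized Hamming distance per epoch:", hamming.round(3))
```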

41 The 'chaotic' properties of sparsely connected networks of neurons were noticed and studied by van Vreeswijk and Sompolinsky [5] in the limit of infinitely many neurons. [sent-126, score-0.201]

42 In this paper we study networks with a small number of neurons comparable to the number observed within the antennal lobe. [sent-127, score-0.376]

43 Additionally, we propose a technique for the design of such networks, and demonstrate the possibility of 'stabilizing' some trajectories by parameter learning. [sent-128, score-0.397]

44 1 Analytic solution and equilibrium of network The use of simplified neural elements, namely McCulloch-Pitts units [4], allows us to represent the system as a simple discrete-time dynamical system. [sent-130, score-0.539]

45 Several distributions can be used to approximate the number of active units in the population of excitatory, inhibitory, and external units, including: (1) the Binomial distribution, (2) the Poisson distribution, and (3) the Gaussian distribution. [sent-132, score-0.376]

46 An approximation common to all three is that the activities of all units are uncorrelated. [sent-133, score-0.18]

47 Given the population activity at time t, m^t, we can calculate the expected value for the population activity at the next time step, m^{t+1}: E(m^{t+1}) = Σ_{e=0}^{K_E} Σ_{i=0}^{K_I} Σ_{u=0}^{K_U} p(e) p(i) p(u) 1(a_E e + a_I i + a_U u - T). [sent-135, score-0.418]

48 Here p(e), p(i), and p(u) are the probabilities of e excitatory, i inhibitory, and u external inputs being active. [sent-141, score-0.158]
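
A direct numerical transcription of this triple sum under the binomial approximation might look as follows; scipy is assumed, K_E, K_I, K_U are treated as the expected numbers of excitatory, inhibitory, and external connections per unit, and the weight values and sign conventions are our assumptions:

```python
import numpy as np
from scipy.stats import binom

def expected_next_m(m, m_U, K_E, K_I, K_U,
                    a_E=1.0, a_I=-3.0, a_U=1.0, T=2.0):
    """Mean-field update E(m^{t+1}) under the binomial approximation."""
    p_e = binom.pmf(np.arange(K_E + 1), K_E, m)     # P(e excitatory inputs active)
    p_i = binom.pmf(np.arange(K_I + 1), K_I, m)     # P(i inhibitory inputs active)
    p_u = binom.pmf(np.arange(K_U + 1), K_U, m_U)   # P(u external inputs active)
    e, i, u = np.meshgrid(np.arange(K_E + 1), np.arange(K_I + 1),
                          np.arange(K_U + 1), indexing="ij")
    fires = ((a_E * e + a_I * i + a_U * u - T) > 0).astype(float)  # Heaviside 1(.)
    return np.einsum("e,i,u,eiu->", p_e, p_i, p_u, fires)
```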

49 Both e and i are binomially distributed with mean activity m = m^t, while the external input is binomially distributed with mean activity m = m_U. The Poisson distribution can be used to approximate the binomial distribution for reasonable values of λ, where for instance λ_e = K_E m^t. [sent-142, score-0.671]

50 Using the Poisson approximation, the probability of j units being active is given by p(j) = λ^j e^{-λ} / j!. In the limit as N → ∞, the distributions for the sum of the number of excitatory, inhibitory, and external units active approach normal distributions. [sent-143, score-0.543]

51 Since the sum of Gaussian random variables is itself a Gaussian random variable, we can model the net input to a unit as the sum of the excitatory, inhibitory, and external input shifted by a constant representing the threshold. [sent-144, score-0.398]

52 The equilibrium condition is satisfied when m^t = m^{t+1}. [sent-146, score-0.158]
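
With the hypothetical expected_next_m above, the equilibrium m* can be located by simple fixed-point iteration (a sketch; the iteration settles only at stable fixed points, and the parameter values are illustrative):

```python
# Iterate m <- E(m^{t+1} | m) until it settles at the equilibrium m* = f(m*).
m, m_U = 0.5, 0.2
for _ in range(50):
    m = expected_next_m(m, m_U, K_E=100, K_I=100, K_U=10)
print("equilibrium mean activity m* ~", round(float(m), 4))
```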

53 Light gray indicates inhibition-threshold values that yield a stable dynamical system. [sent-149, score-0.234]

54 That is, small perturbations of firing activity do not result in large fluctuations in activity later in time. [sent-150, score-0.358]

55 inhibition-threshold values for which the dynamical system rests at a constant mean-firing rate. [sent-153, score-0.219]

56 Using this chart one may design an antennal lobe: for any given connectivity choose inhibition and threshold values that produce a desired mean firing rate. [sent-155, score-0.408]
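
A chart of this kind can be regenerated by sweeping inhibition-threshold pairs and keeping those whose equilibrium activity lands at the target mean firing rate. A sketch built on the hypothetical expected_next_m above; the grid ranges, target rate, and tolerance are illustrative:

```python
# Sweep (inhibition, threshold) pairs; keep settings whose equilibrium
# activity lands near a target mean firing rate.
target, tol = 0.15, 0.01
good = []
for a_I in np.linspace(-5.0, -0.5, 10):
    for T in np.linspace(0.5, 5.0, 10):
        m = 0.5
        for _ in range(30):                 # relax toward the fixed point
            m = expected_next_m(m, 0.2, K_E=100, K_I=100, K_U=10,
                                a_I=a_I, T=T)
        if abs(m - target) < tol:
            good.append((round(a_I, 2), round(T, 2), round(float(m), 3)))
print(good[:5])
```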

57 (Right) The design procedure produces networks that behave as desired. [sent-156, score-0.157]

58 The values indexing the arrows correspond to the absolute difference of the predicted activity (. [sent-158, score-0.223]

59 15) using a binomial approximation and the mean simulation activity across 10 random inputs to 10 different networks with the specified parameter sets. [sent-159, score-0.348]

60 We found the binomial approximation to yield the most accurate predictions in parameter ranges of interest to us, namely 500-4000 total units and connectivities ranging from . [sent-160, score-0.31]

61 Figure 2 outlines the design technique for a network of 512 excitatory and 512 inhibitory units and a population mean activity of . [sent-168, score-0.949]

62 The predicted activity of the network for different parameter sets corresponds well with that observed in Monte Carlo simulations. [sent-170, score-0.26]

63 0061 between the predicted mean activity and that found in the simulations (see Figure 2, right plot). [sent-172, score-0.158]

64 4 Learning for trajectory stabilization Consider a 'physical' implementation of the DAL, either by means of neurons in a biological system or by transistors in an electronic circuit. [sent-173, score-0.352]

65 In the presence of noise the same input applied multiple times to the same network will produce divergent trajectories, hence different final conditions, thus making the use of DALs for pattern classification problematic. [sent-176, score-0.663]

66 Consider the possibility that noise is present in the system: as a result of fluctuations in the level of the input i^t, fluctuations in the biophysical properties of the neurons, etc. [sent-177, score-0.281]

67 We may represent this noise as an additional term n^t in the dynamical system: y^t = A x^t + B i^t - T, x^{t+1} = 1(y^t + n^t). Whatever the statistics of the noise, it is clear that it may influence the trajectory x of the dynamical system. [sent-178, score-0.517]
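
The divergence itself is simple to reproduce: add independent noise to the net input at each epoch and run the same input pattern twice. A sketch reusing the illustrative network from the earlier block; the Gaussian noise model and its level are assumptions:

```python
def run_dal_noisy(i_pat, sigma=0.1, t_max=10):
    """x^{t+1} = 1(y^t + n^t), with Gaussian noise n^t added to the net input."""
    x = np.zeros(N)
    states = []
    for _ in range(t_max):
        y = A @ x + B @ i_pat - T
        x = (y + sigma * rng.standard_normal(N) > 0).astype(float)
        states.append(x.copy())
    return np.array(states)

i_pat = rng.random(N_U)
d = np.abs(run_dal_noisy(i_pat) - run_dal_noisy(i_pat)).mean(axis=1)
print("divergence of two noisy runs of the same input:", d.round(3))
```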

68 Indeed, if y_i^t, the nominal input to a neuron, is sufficiently close to zero, then even a small amount of noise may change the state x_i^t of that neuron. [sent-179, score-0.358]

69 As we saw in earlier sections this implies that the ensuing trajectory will diverge from the trajectory of the same system with the same inputs and no noise or the same inputs and a different realization of the same noise process. [sent-180, score-0.64]

70 On the other hand, if y_i^t is far from zero, then x_i^t will not change even with large amounts of noise. [sent-182, score-0.199]

71 Ideally, for any given initial condition and input, and for any ε, there exists a constant y_0 > 0 such that any initial condition and input in a y_0-ball around the original input and initial condition will produce trajectories that differ by at most ε. [sent-184, score-0.667]

72 the trajectory is required to be identical to that of the noiseless system) then all trajectories of the system must coincide, which is not very useful. [sent-187, score-0.519]

73 If the total number of patterns to be discriminated is not too large (probably 10-1000 in the case of olfaction), one could think of requiring noise robustness only for the trajectories x that are specific to those patterns. [sent-191, score-0.585]

74 We therefore explored whether it was in principle possible to stabilize trajectories corresponding to different odor presentations rather than all trajectories. [sent-192, score-0.416]

75 We wish to change the connection weights A, B and thresholds T so that the network is robust with respect to noise around a given trajectory x(i). [sent-193, score-0.348]

76 In order to achieve this we wish to ensure that at no time t does neuron i have an input that is close to the threshold. [sent-194, score-0.196]

77 x_i^t = 0) then its input must be comfortably less than zero (i. [sent-197, score-0.248]

78 for some constant y_0 > 0, y_i^t < -y_0) and vice versa for x_i^t = 1. [sent-199, score-0.199]

79 g(y) = exp(y/y_0); then the cost of neuron i at time t is C_i^t = g(y_i^t) if x_i^t = 0, and C_i^t = g(-y_i^t) if x_i^t = 1. [sent-203, score-0.332]

80 The equations for the gradient follow from the chain rule: ∂C_i^t/∂A_ij = (∂C_i^t/∂y_i^t)(∂y_i^t/∂A_ij), and similarly for ∂C_i^t/∂B_ij via ∂y_i^t/∂B_ij. [Figure 3 panel titles: 'Divergence of 22 Trajectories Before Learning' (left), 'Divergence of Trajectories After Learning' (right); horizontal axis: time steps.] Figure 3: Robustness of trajectories to noise resulting from network learning. [sent-205, score-0.591]

81 Each curve corresponds to the divergence rate between 10 identical trajectories in the presence of 5% Gaussian synaptic noise added to each active presynaptic synapse. [sent-207, score-0.492]

82 All patterns achieve maximum spreading in 9-10 steps as also shown in Figure 1. [sent-208, score-0.201]

83 (Right) The divergence rate of the same trajectories after learning the first 10 steps of each trajectory. [sent-209, score-0.322]

84 Each trajectory was learned sequentially, with the trajectory labelled 1 learned first. [sent-210, score-0.372]

85 Note that trajectories learned later, for instance trajectory 20, diverge more slowly than earlier learned trajectories. [sent-211, score-0.629]

86 Thus, the trajectories learned earlier are forgotten while more recently acquired trajectories are maintained. [sent-212, score-0.781]

87 Furthermore, the trajectories maintain their stereotyped ability to decorrelate both after they are forgotten (e. [sent-213, score-0.376]

88 Untrained trajectories behave the same as trajectories in the left panel. [sent-218, score-0.684]

89 Before learning, all trajectories are susceptible to synaptic noise. [sent-220, score-0.322]

90 After learning, those trajectories learned last exhibit robustness to noise, while trajectories learned earlier are slowly forgotten. [sent-221, score-0.82]

91 We can compare each learned trajectory to a curve in multi-dimensional space with a 'robustness pipe' surrounding it. [sent-222, score-0.225]

92 Any points lying within this pipe will be part of trajectories that remain within the pipe. [sent-223, score-0.392]

93 In the case of olfactory processing, different odors correspond to unique trajectories, while trajectories lying within a common pipe correspond to the same input odor presentation. [sent-224, score-0.854]

94 A few details on the experiment: The network contained 2048 neurons, half of which were excitatory and the other half inhibitory. [sent-225, score-0.411]
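
The stabilization scheme of sentences 75-80 lends itself to a compact sketch. The version below descends the cost C_i^t in a single greedy pass, treating x^{t-1} as constant when differentiating y^t; that shortcut, together with the values of y_0 and the learning rate and the reuse of the illustrative network above, is our assumption rather than the paper's exact procedure:

```python
def stabilize_trajectory(A, B, T_vec, i_pat, y0=0.5, lr=0.01, epochs=10):
    """Push each neuron's net input away from threshold along one trajectory.
    Cost per neuron: C_i^t = exp(y_i^t/y0) if x_i^t = 0, exp(-y_i^t/y0) if
    x_i^t = 1; x^{t-1} is treated as constant when differentiating y^t."""
    x = np.zeros(A.shape[0])
    for _ in range(epochs):
        y = A @ x + B @ i_pat - T_vec
        x_new = (y > 0).astype(float)
        sign = np.where(x_new == 0, 1.0, -1.0)      # +1: push y more negative
        dC_dy = sign * np.exp(sign * y / y0) / y0   # dC_i^t / dy_i^t
        A -= lr * np.outer(dC_dy, x)                # dy_i^t/dA_ij = x_j^{t-1}
        B -= lr * np.outer(dC_dy, i_pat)            # dy_i^t/dB_ik = i_k^t
        T_vec += lr * dC_dy                         # dy_i^t/dT_i = -1
        x = x_new
    return A, B, T_vec

# Learn the first 10 steps of the trajectory of one (hypothetical) odor pattern.
A2, B2, T2 = stabilize_trajectory(A.copy(), B.copy(),
                                  np.full(N, T), rng.random(N_U))
```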

95 5 Discussion and Conclusions Sparsely connected networks of neurons have 'chaotic' properties which may be used for equalizing a set of patterns in order to make their classification easier. [sent-232, score-0.277]

96 In studying the properties of such networks we extend previous results on networks with infinitely many neurons by van Vreeswijk and Sompolinsky to the case of a small number of neurons. [sent-233, score-0.249]

97 Moreover, we propose a learning technique to make the network immune to noise around chosen trajectories while preserving the equalization property elsewhere. [sent-235, score-0.543]

98 A precise characterization of the effects of the DAL on the distribution of the input patterns, and of the consequent improvement in the ease of pattern classification, is still missing. [sent-237, score-0.18]

99 the number of trajectories which can be learned in any given network before older trajectories are forgotten. [sent-241, score-0.794]

100 (2001) Odor encoding as an active, dynamical process: experiments, computation, and theory. [sent-249, score-0.16]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('dal', 0.403), ('trajectories', 0.322), ('excitatory', 0.309), ('antennal', 0.215), ('lobe', 0.164), ('dynamical', 0.16), ('units', 0.149), ('olfactory', 0.14), ('trajectory', 0.138), ('inhibitory', 0.136), ('laurent', 0.128), ('xf', 0.128), ('activity', 0.124), ('input', 0.12), ('neurons', 0.119), ('vreeswijk', 0.117), ('patterns', 0.116), ('odors', 0.108), ('network', 0.102), ('external', 0.101), ('binomial', 0.094), ('odor', 0.094), ('sompolinsky', 0.094), ('ae', 0.089), ('spreading', 0.085), ('mt', 0.085), ('design', 0.075), ('active', 0.072), ('yf', 0.071), ('ke', 0.07), ('pipe', 0.07), ('yo', 0.07), ('pattern', 0.06), ('equalization', 0.06), ('gt', 0.06), ('noise', 0.059), ('system', 0.059), ('firing', 0.059), ('connectivity', 0.059), ('inputs', 0.057), ('unit', 0.057), ('cf', 0.056), ('population', 0.054), ('ayf', 0.054), ('binomially', 0.054), ('forgotten', 0.054), ('holub', 0.054), ('leam', 0.054), ('simplifications', 0.054), ('panel', 0.052), ('blue', 0.051), ('epoch', 0.051), ('epochs', 0.051), ('fluctuations', 0.051), ('state', 0.051), ('connection', 0.049), ('classifiers', 0.049), ('learned', 0.048), ('heaviside', 0.047), ('olfaction', 0.047), ('van', 0.046), ('neuron', 0.045), ('robustness', 0.045), ('aij', 0.043), ('discriminated', 0.043), ('friedrich', 0.043), ('map', 0.042), ('networks', 0.042), ('behave', 0.04), ('discretize', 0.04), ('perona', 0.04), ('sparsely', 0.04), ('average', 0.04), ('respond', 0.04), ('curve', 0.039), ('indicates', 0.038), ('equilibrium', 0.038), ('diverge', 0.038), ('mu', 0.038), ('uneven', 0.038), ('yield', 0.036), ('evenly', 0.036), ('hypercube', 0.036), ('transistors', 0.036), ('condition', 0.035), ('digital', 0.035), ('poisson', 0.035), ('earlier', 0.035), ('au', 0.034), ('inspired', 0.034), ('predicted', 0.034), ('difference', 0.033), ('green', 0.033), ('rev', 0.033), ('carlo', 0.033), ('response', 0.032), ('hamming', 0.032), ('ofthe', 0.032), ('approximation', 0.031), ('time', 0.031)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.99999982 5 nips-2002-A Digital Antennal Lobe for Pattern Equalization: Analysis and Design

Author: Alex Holub, Gilles Laurent, Pietro Perona

Abstract: Re-mapping patterns in order to equalize their distribution may greatly simplify both the structure and the training of classifiers. Here, the properties of one such map, obtained by running a few steps of a discrete-time dynamical system, are explored. The system is called 'Digital Antennal Lobe' (DAL) because it is inspired by recent studies of the antennal lobe, a structure in the olfactory system of the grasshopper. The pattern-spreading properties of the DAL, as well as its average behavior as a function of its (few) design parameters, are analyzed by extending previous results of van Vreeswijk and Sompolinsky. Furthermore, a technique for adapting the parameters of the initial design in order to obtain opportune noise-rejection behavior is suggested. Our results are demonstrated with a number of simulations.

2 0.20129077 76 nips-2002-Dynamical Constraints on Computing with Spike Timing in the Cortex

Author: Arunava Banerjee, Alexandre Pouget

Abstract: If the cortex uses spike timing to compute, the timing of the spikes must be robust to perturbations. Based on a recent framework that provides a simple criterion to determine whether a spike sequence produced by a generic network is sensitive to initial conditions, and numerical simulations of a variety of network architectures, we argue within the limits set by our model of the neuron, that it is unlikely that precise sequences of spike timings are used for computation under conditions typically found in the cortex.

3 0.19137166 155 nips-2002-Nonparametric Representation of Policies and Value Functions: A Trajectory-Based Approach

Author: Christopher G. Atkeson, Jun Morimoto

Abstract: A longstanding goal of reinforcement learning is to develop nonparametric representations of policies and value functions that support rapid learning without suffering from interference or the curse of dimensionality. We have developed a trajectory-based approach, in which policies and value functions are represented nonparametrically along trajectories. These trajectories, policies, and value functions are updated as the value function becomes more accurate or as a model of the task is updated. We have applied this approach to periodic tasks such as hopping and walking, which required handling discount factors and discontinuities in the task dynamics, and using function approximation to represent value functions at discontinuities. We also describe extensions of the approach to make the policies more robust to modeling error and sensor noise.

4 0.16788018 171 nips-2002-Reconstructing Stimulus-Driven Neural Networks from Spike Times

Author: Duane Q. Nykamp

Abstract: We present a method to distinguish direct connections between two neurons from common input originating from other, unmeasured neurons. The distinction is computed from the spike times of the two neurons in response to a white noise stimulus. Although the method is based on a highly idealized linear-nonlinear approximation of neural response, we demonstrate via simulation that the approach can work with a more realistic, integrate-and-fire neuron model. We propose that the approach exemplified by this analysis may yield viable tools for reconstructing stimulus-driven neural networks from data gathered in neurophysiology experiments.

5 0.15503004 51 nips-2002-Classifying Patterns of Visual Motion - a Neuromorphic Approach

Author: Jakob Heinzle, Alan Stocker

Abstract: We report a system that classifies and can learn to classify patterns of visual motion on-line. The complete system is described by the dynamics of its physical network architectures. The combination of the following properties makes the system novel: Firstly, the front-end of the system consists of an aVLSI optical flow chip that collectively computes 2-D global visual motion in real-time [1]. Secondly, the complexity of the classification task is significantly reduced by mapping the continuous motion trajectories to sequences of ’motion events’. And thirdly, all the network structures are simple and with the exception of the optical flow chip based on a Winner-Take-All (WTA) architecture. We demonstrate the application of the proposed generic system for a contactless man-machine interface that allows to write letters by visual motion. Regarding the low complexity of the system, its robustness and the already existing front-end, a complete aVLSI system-on-chip implementation is realistic, allowing various applications in mobile electronic devices.

6 0.12471371 123 nips-2002-Learning Attractor Landscapes for Learning Motor Primitives

7 0.11195768 11 nips-2002-A Model for Real-Time Computation in Generic Neural Microcircuits

8 0.098590083 160 nips-2002-Optoelectronic Implementation of a FitzHugh-Nagumo Neural Model

9 0.097824834 43 nips-2002-Binary Coding in Auditory Cortex

10 0.095082365 187 nips-2002-Spikernels: Embedding Spiking Neurons in Inner-Product Spaces

11 0.091447964 116 nips-2002-Interpreting Neural Response Variability as Monte Carlo Sampling of the Posterior

12 0.087769717 180 nips-2002-Selectivity and Metaplasticity in a Unified Calcium-Dependent Model

13 0.086811408 153 nips-2002-Neural Decoding of Cursor Motion Using a Kalman Filter

14 0.080354489 71 nips-2002-Dopamine Induced Bistability Enhances Signal Processing in Spiny Neurons

15 0.079381742 184 nips-2002-Spectro-Temporal Receptive Fields of Subthreshold Responses in Auditory Cortex

16 0.076903909 186 nips-2002-Spike Timing-Dependent Plasticity in the Address Domain

17 0.072373003 44 nips-2002-Binary Tuning is Optimal for Neural Rate Coding with High Temporal Resolution

18 0.068523556 131 nips-2002-Learning to Classify Galaxy Shapes Using the EM Algorithm

19 0.067805141 129 nips-2002-Learning in Spiking Neural Assemblies

20 0.067137823 137 nips-2002-Location Estimation with a Differential Update Network


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, -0.205), (1, 0.189), (2, -0.057), (3, -0.081), (4, 0.029), (5, 0.075), (6, 0.043), (7, 0.017), (8, 0.115), (9, 0.111), (10, -0.073), (11, 0.047), (12, 0.069), (13, -0.072), (14, -0.177), (15, -0.043), (16, -0.032), (17, 0.01), (18, 0.034), (19, 0.07), (20, -0.013), (21, 0.092), (22, 0.034), (23, -0.005), (24, 0.12), (25, -0.046), (26, 0.037), (27, 0.086), (28, 0.045), (29, 0.172), (30, 0.03), (31, -0.07), (32, -0.11), (33, 0.017), (34, 0.041), (35, 0.06), (36, -0.03), (37, 0.158), (38, -0.074), (39, -0.022), (40, 0.017), (41, -0.11), (42, 0.027), (43, 0.116), (44, -0.033), (45, 0.008), (46, -0.117), (47, 0.045), (48, -0.036), (49, 0.007)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.96477985 5 nips-2002-A Digital Antennal Lobe for Pattern Equalization: Analysis and Design

Author: Alex Holub, Gilles Laurent, Pietro Perona

Abstract: Re-mapping patterns in order to equalize their distribution may greatly simplify both the structure and the training of classifiers. Here, the properties of one such map, obtained by running a few steps of a discrete-time dynamical system, are explored. The system is called 'Digital Antennal Lobe' (DAL) because it is inspired by recent studies of the antennal lobe, a structure in the olfactory system of the grasshopper. The pattern-spreading properties of the DAL, as well as its average behavior as a function of its (few) design parameters, are analyzed by extending previous results of van Vreeswijk and Sompolinsky. Furthermore, a technique for adapting the parameters of the initial design in order to obtain opportune noise-rejection behavior is suggested. Our results are demonstrated with a number of simulations.

2 0.74502718 160 nips-2002-Optoelectronic Implementation of a FitzHugh-Nagumo Neural Model

Author: Alexandre R. Romariz, Kelvin Wagner

Abstract: An optoelectronic implementation of a spiking neuron model based on the FitzHugh-Nagumo equations is presented. A tunable semiconductor laser source and a spectral filter provide a nonlinear mapping from driver voltage to detected signal. Linear electronic feedback completes the implementation, which allows either electronic or optical input signals. Experimental results for a single system and numeric results of model interaction confirm that important features of spiking neural models can be implemented through this approach.

3 0.66507924 11 nips-2002-A Model for Real-Time Computation in Generic Neural Microcircuits

Author: Wolfgang Maass, Thomas Natschläger, Henry Markram

Abstract: A key challenge for neural modeling is to explain how a continuous stream of multi-modal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real-time. We propose a new computational model that is based on principles of high dimensional dynamical systems in combination with statistical learning theory. It can be implemented on generic evolved or found recurrent circuitry.

4 0.66002196 171 nips-2002-Reconstructing Stimulus-Driven Neural Networks from Spike Times

Author: Duane Q. Nykamp

Abstract: We present a method to distinguish direct connections between two neurons from common input originating from other, unmeasured neurons. The distinction is computed from the spike times of the two neurons in response to a white noise stimulus. Although the method is based on a highly idealized linear-nonlinear approximation of neural response, we demonstrate via simulation that the approach can work with a more realistic, integrate-and-fire neuron model. We propose that the approach exemplified by this analysis may yield viable tools for reconstructing stimulus-driven neural networks from data gathered in neurophysiology experiments.

5 0.64688176 76 nips-2002-Dynamical Constraints on Computing with Spike Timing in the Cortex

Author: Arunava Banerjee, Alexandre Pouget

Abstract: If the cortex uses spike timing to compute, the timing of the spikes must be robust to perturbations. Based on a recent framework that provides a simple criterion to determine whether a spike sequence produced by a generic network is sensitive to initial conditions, and numerical simulations of a variety of network architectures, we argue within the limits set by our model of the neuron, that it is unlikely that precise sequences of spike timings are used for computation under conditions typically found in the cortex.

6 0.62720221 123 nips-2002-Learning Attractor Landscapes for Learning Motor Primitives

7 0.61089033 22 nips-2002-Adaptive Nonlinear System Identification with Echo State Networks

8 0.5866462 51 nips-2002-Classifying Patterns of Visual Motion - a Neuromorphic Approach

9 0.5181126 155 nips-2002-Nonparametric Representation of Policies and Value Functions: A Trajectory-Based Approach

10 0.46530157 128 nips-2002-Learning a Forward Model of a Reflex

11 0.41551721 50 nips-2002-Circuit Model of Short-Term Synaptic Dynamics

12 0.41253015 43 nips-2002-Binary Coding in Auditory Cortex

13 0.38864613 136 nips-2002-Linear Combinations of Optic Flow Vectors for Estimating Self-Motion - a Real-World Test of a Neural Model

14 0.37141475 137 nips-2002-Location Estimation with a Differential Update Network

15 0.37065637 44 nips-2002-Binary Tuning is Optimal for Neural Rate Coding with High Temporal Resolution

16 0.36995327 71 nips-2002-Dopamine Induced Bistability Enhances Signal Processing in Spiny Neurons

17 0.35118997 180 nips-2002-Selectivity and Metaplasticity in a Unified Calcium-Dependent Model

18 0.34849158 144 nips-2002-Minimax Differential Dynamic Programming: An Application to Robust Biped Walking

19 0.3305648 187 nips-2002-Spikernels: Embedding Spiking Neurons in Inner-Product Spaces

20 0.33013448 129 nips-2002-Learning in Spiking Neural Assemblies


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(11, 0.013), (23, 0.061), (42, 0.064), (54, 0.079), (55, 0.048), (57, 0.021), (60, 0.256), (67, 0.02), (68, 0.11), (74, 0.089), (87, 0.011), (92, 0.03), (98, 0.106)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.81505311 5 nips-2002-A Digital Antennal Lobe for Pattern Equalization: Analysis and Design

Author: Alex Holub, Gilles Laurent, Pietro Perona

Abstract: Re-mapping patterns in order to equalize their distribution may greatly simplify both the structure and the training of classifiers. Here, the properties of one such map, obtained by running a few steps of a discrete-time dynamical system, are explored. The system is called 'Digital Antennal Lobe' (DAL) because it is inspired by recent studies of the antennal lobe, a structure in the olfactory system of the grasshopper. The pattern-spreading properties of the DAL, as well as its average behavior as a function of its (few) design parameters, are analyzed by extending previous results of van Vreeswijk and Sompolinsky. Furthermore, a technique for adapting the parameters of the initial design in order to obtain opportune noise-rejection behavior is suggested. Our results are demonstrated with a number of simulations.

2 0.60952091 62 nips-2002-Coulomb Classifiers: Generalizing Support Vector Machines via an Analogy to Electrostatic Systems

Author: Sepp Hochreiter, Michael C. Mozer, Klaus Obermayer

Abstract: We introduce a family of classifiers based on a physical analogy to an electrostatic system of charged conductors. The family, called Coulomb classifiers, includes the two best-known support-vector machines (SVMs), the ν–SVM and the C–SVM. In the electrostatics analogy, a training example corresponds to a charged conductor at a given location in space, the classification function corresponds to the electrostatic potential function, and the training objective function corresponds to the Coulomb energy. The electrostatic framework provides not only a novel interpretation of existing algorithms and their interrelationships, but it suggests a variety of new methods for SVMs including kernels that bridge the gap between polynomial and radial-basis functions, objective functions that do not require positive-definite kernels, regularization techniques that allow for the construction of an optimal classifier in Minkowski space. Based on the framework, we propose novel SVMs and perform simulation studies to show that they are comparable or superior to standard SVMs. The experiments include classification tasks on data which are represented in terms of their pairwise proximities, where a Coulomb Classifier outperformed standard SVMs. 1

3 0.60732883 76 nips-2002-Dynamical Constraints on Computing with Spike Timing in the Cortex

Author: Arunava Banerjee, Alexandre Pouget

Abstract: If the cortex uses spike timing to compute, the timing of the spikes must be robust to perturbations. Based on a recent framework that provides a simple criterion to determine whether a spike sequence produced by a generic network is sensitive to initial conditions, and numerical simulations of a variety of network architectures, we argue within the limits set by our model of the neuron, that it is unlikely that precise sequences of spike timings are used for computation under conditions typically found in the cortex.

4 0.59556162 73 nips-2002-Dynamic Bayesian Networks with Deterministic Latent Tables

Author: David Barber

Abstract: The application of latent/hidden variable Dynamic Bayesian Networks is constrained by the complexity of marginalising over latent variables. For this reason either small latent dimensions or Gaussian latent conditional tables linearly dependent on past states are typically considered in order that inference is tractable. We suggest an alternative approach in which the latent variables are modelled using deterministic conditional probability tables. This specialisation has the advantage of tractable inference even for highly complex non-linear/non-Gaussian visible conditional probability tables. This approach enables the consideration of highly complex latent dynamics whilst retaining the benefits of a tractable probabilistic model. 1

5 0.5877533 11 nips-2002-A Model for Real-Time Computation in Generic Neural Microcircuits

Author: Wolfgang Maass, Thomas Natschläger, Henry Markram

Abstract: A key challenge for neural modeling is to explain how a continuous stream of multi-modal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real-time. We propose a new computational model that is based on principles of high dimensional dynamical systems in combination with statistical learning theory. It can be implemented on generic evolved or found recurrent circuitry.

6 0.57865047 148 nips-2002-Morton-Style Factorial Coding of Color in Primary Visual Cortex

7 0.57344931 141 nips-2002-Maximally Informative Dimensions: Analyzing Neural Responses to Natural Signals

8 0.57246804 43 nips-2002-Binary Coding in Auditory Cortex

9 0.57053602 44 nips-2002-Binary Tuning is Optimal for Neural Rate Coding with High Temporal Resolution

10 0.56900781 51 nips-2002-Classifying Patterns of Visual Motion - a Neuromorphic Approach

11 0.56893903 102 nips-2002-Hidden Markov Model of Cortical Synaptic Plasticity: Derivation of the Learning Rule

12 0.56823945 123 nips-2002-Learning Attractor Landscapes for Learning Motor Primitives

13 0.56440848 28 nips-2002-An Information Theoretic Approach to the Functional Classification of Neurons

14 0.56331581 199 nips-2002-Timing and Partial Observability in the Dopamine System

15 0.5577926 50 nips-2002-Circuit Model of Short-Term Synaptic Dynamics

16 0.55480283 204 nips-2002-VIBES: A Variational Inference Engine for Bayesian Networks

17 0.55409431 81 nips-2002-Expected and Unexpected Uncertainty: ACh and NE in the Neocortex

18 0.55120969 127 nips-2002-Learning Sparse Topographic Representations with Products of Student-t Distributions

19 0.55113518 184 nips-2002-Spectro-Temporal Receptive Fields of Subthreshold Responses in Auditory Cortex

20 0.55103242 173 nips-2002-Recovering Intrinsic Images from a Single Image