nips nips2000 nips2000-34 knowledge-graph by maker-knowledge-mining

34 nips-2000-Competition and Arbors in Ocular Dominance


Source: pdf

Author: Peter Dayan

Abstract: Hebbian and competitive Hebbian algorithms are almost ubiquitous in modeling pattern formation in cortical development. We analyse in theoretical detail a particular model (adapted from Piepenbrock & Obermayer, 1999) for the development of 1d stripe-like patterns, which places competitive and interactive cortical influences, and free and restricted initial arborisation onto a common footing.

Reference: text


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 Abstract Hebbian and competitive Hebbian algorithms are almost ubiquitous in modeling pattern formation in cortical development. [sent-5, score-0.297]

2 We analyse in theoretical detail a particular model (adapted from Piepenbrock & Obermayer, 1999) for the development of 1d stripe-like patterns, which places competitive and interactive cortical influences, and free and restricted initial arborisation onto a common footing. [sent-6, score-0.394]

3 These well-known fingerprint patterns have been a seductive target for models of cortical pattern formation because of the mix of competition and cooperation they suggest. [sent-8, score-0.406]

4 A wealth of synaptic adaptation algorithms has been suggested to account for them (and also the concomitant refinement of the topography of the map between the eyes and the cortex), many of which are based on forms of Hebbian learning. [sent-9, score-0.328]

5 Different models make different predictions, as these parameters vary, about whether ocular dominance should form at all and, if it does, about what determines the width of the stripes, which is the main experimental observable. [sent-11, score-0.787]

6 Although particular classes of models excite fervid criticism from the experimental community, it is to be hoped that the general principles of competitive and cooperative pattern formation that underlie them will remain relevant. [sent-12, score-0.212]

7 Piepenbrock & Obermayer (1999) suggested an interesting model in which varying a single parameter spans a spectrum from cortical competition to cooperation. [sent-14, score-0.261]

8 However, the nature of competition in their model makes it hard to predict the outcome of adaptation completely, except in some special cases. [sent-15, score-0.218]

9 In this paper, we suggest a slightly different model of competition which makes the analysis tractable, and simultaneously generalise the model to consider an additional spectrum between flat and peaked arborisation. [sent-16, score-0.304]

10 It is based on the competitive model of Piepenbrock & Obermayer (1999), who developed it in order to explore a continuum between competitive and linear cortical interactions. [sent-18, score-0.323]

11 We use a slightly different competition mechanism and also [sent-19, score-0.208]

12 Figure 1: Competitive ocular dominance model. [sent-20, score-1.056]

13 A) Left (L) and right (R) input units (with activities uL(b) and uR(b) at the same location b in input space) project through weights WL(a, b) and WR(a, b) and a restricted-topography arbor function A(a, b) (B) to an output layer, which is subject to lateral competitive interactions. [sent-21, score-0.839]

14 C) Stable weight patterns W(a, b) showing ocular dominance. [sent-22, score-0.45]

15 D) (left) difference in the connections W− = WR − WL between the right and left eyes; (right) the difference summed across b, showing the net ocularity for each a. [sent-23, score-0.209]

16 There are N = 100 units in each input layer and the output layer. [sent-29, score-0.121]

17 extend the model with an arbor function (as in Miller et al, 1989). [sent-31, score-0.202]

18 The model has two input layers (representing input from the thalamus from left 'L' and right 'R' eyes), each containing N units, laid out in a single spatial dimension. [sent-32, score-0.242]

19 These connect to an output layer (layer IV of area V1) with N units too, which is also laid out in a single spatial dimension. [sent-33, score-0.107]

20 We use a continuum approximation, so labeling weights WL(a, b) and WR(a, b). [sent-34, score-0.112]

21 An arbor function, A(a, b), represents the multiplicity of each such connection (an example is given in figure 1B). [sent-35, score-0.202]

22 Four characteristics define the model: the arbor function; the statistics of the input; the mapping from input to output; and the rule by which the weights change. [sent-37, score-0.32]

23 The arbor function A(a, b) specifies the basic topography of the map at the time that the pattern of synaptic growth is being established. [sent-38, score-0.516]

24 The two ends of the spectrum for the arbor are flat, when A(a, b) = α is constant (σA = ∞), and rigid or punctate, when A(a, b) ∝ δ(a − b) (σA = 0), so that input cells are mapped only to their topographically matched cells in the cortex. [sent-40, score-0.356]
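
For concreteness, this spectrum of arbors can be realised numerically as a Gaussian of width σA on a discrete grid; the following is a minimal sketch, where the grid layout and lack of normalisation are our assumptions rather than the paper's exact construction.

```python
import numpy as np

def arbor(N=100, sigma_A=5.0):
    """Gaussian arbor A(a, b) of width sigma_A on an N x N grid.
    sigma_A = np.inf gives the flat end of the spectrum; small sigma_A
    approaches the rigid/punctate limit A(a, b) ~ delta(a - b)."""
    a = np.arange(N)[:, None]  # output positions
    b = np.arange(N)[None, :]  # input positions
    if np.isinf(sigma_A):
        return np.ones((N, N))  # flat arbor: A(a, b) constant
    return np.exp(-(a - b) ** 2 / (2.0 * sigma_A ** 2))
```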

25 Since the model is non-linear, pattern formation is a function of aspects of the input in addition to the two-point correlations between input units that drive development of standard, non-competitive, Hebbian models. [sent-42, score-0.293]

26 5 each), and determines whether the input is more from the right or left projection. [sent-46, score-0.126]
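
The extracted sentence above is truncated, but the gist is that each input pattern is randomly biased toward one eye. A hypothetical generator in that spirit follows; the Gaussian bump shape and the bias parameter are illustrative assumptions, not the paper's exact input statistics.

```python
import numpy as np

def make_input(N=100, sigma_u=3.0, bias=0.3, rng=np.random.default_rng()):
    """One input pattern: a localized bump of activity at a random
    position, assigned more strongly to one randomly chosen eye."""
    b = np.arange(N)
    b0 = rng.integers(N)  # random location in input space
    bump = np.exp(-(b - b0) ** 2 / (2.0 * sigma_u ** 2))
    eye = rng.choice([-1.0, 1.0])  # favoured eye, probability 0.5 each
    uL = (1.0 + eye * bias) * bump  # all activities stay positive
    uR = (1.0 - eye * bias) * bump
    return uL, uR
```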

27 The third component of the model is the way that input activities and the weights conspire to form output activities. [sent-48, score-0.208]

28 This happens in linear (l), competitive (c) and interactive (i) steps: l: v(a) = ∫db A(a, b) (WL(a, b) uL(b) + WR(a, b) uR(b)) (2); c: vc(a) = (v(a))^β / ∫da' (v(a'))^β (3c); i: vi(a) = ∫da' I(a, a') vc(a') (3i). Weights, arbor and input and output activities are all positive. [sent-49, score-0.469]

29 In equation 3c, β ≥ 1 is a parameter governing the strength of competition between the cortical cells. [sent-50, score-0.334]
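
Equations 2 and 3 translate directly into a discrete sketch, with sums standing in for the integrals; this is a minimal rendering with arbitrary parameter values.

```python
import numpy as np

def activate(WL, WR, uL, uR, A, beta=10.0, sigma_I=5.0):
    """Linear (l), competitive (c) and interactive (i) steps of eqs 2-3."""
    # l: v(a) = sum_b A(a,b) * (WL(a,b) uL(b) + WR(a,b) uR(b))
    v = (A * (WL * uL[None, :] + WR * uR[None, :])).sum(axis=1)
    # c: divisive competition of strength beta (beta >= 1)
    vb = v ** beta
    vc = vb / vb.sum()
    # i: smooth with the purely positive Gaussian interaction I(a, a')
    a = np.arange(len(vc))
    I = np.exp(-(a[:, None] - a[None, :]) ** 2 / (2.0 * sigma_I ** 2))
    return I @ vc
```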

30 This form of competition makes it possible to perform analyses of pattern formation that are hard for the model of Piepenbrock & Obermayer (1999). [sent-52, score-0.29]

31 A natural form for the cortical interactions of equation 3i is the purely positive Gaussian I(a, a') = exp(−(a − a')² / 2σI²). [sent-53, score-0.163]

32 The initial values for the weights are WL,R = w exp(−(a − b)² / 2σw²) + η δWL,R, where w is chosen to satisfy the normalisation constraints, η is small, and δWL(a, b) and δWR(a, b) are random perturbations constrained so that normalisation is still satisfied. [sent-58, score-0.305]
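
A sketch of this initialisation; enforcing the constraint ∫db WL(a, b) + WR(a, b) = n by a row-wise rescaling is our simplification of the normalisation step.

```python
import numpy as np

def init_weights(N=100, sigma_w=5.0, eta=0.01, n=1.0,
                 rng=np.random.default_rng()):
    """W^{L,R} = w exp(-(a - b)^2 / 2 sigma_w^2) + eta * dW^{L,R},
    rescaled so that sum_b WL(a, b) + WR(a, b) = n for every a."""
    a = np.arange(N)[:, None]
    b = np.arange(N)[None, :]
    base = np.exp(-(a - b) ** 2 / (2.0 * sigma_w ** 2))
    WL = base + eta * rng.random((N, N))  # small random perturbations
    WR = base + eta * rng.random((N, N))
    scale = n / (WL.sum(axis=1, keepdims=True) + WR.sum(axis=1, keepdims=True))
    return WL * scale, WR * scale
```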

33 Values of σw < ∞ can emerge as equilibrium values of the weights if there is sufficient competition (sufficiently large β) or a restricted arbor (σA < ∞). [sent-59, score-0.645]

34 3 Pattern Formation. We analyse pattern formation in the standard manner, finding the equilibrium points (which requires solving a non-linear equation), linearising about them and finding which linear mode grows the fastest. [sent-60, score-0.487]
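
Numerically, "linearising about an equilibrium and finding the fastest-growing mode" amounts to estimating the Jacobian of the update map and taking its leading eigenvector; a generic sketch for any update function F, not code from the paper:

```python
import numpy as np

def fastest_growing_mode(F, w_eq, eps=1e-6):
    """Finite-difference Jacobian of the update map F at the equilibrium
    w_eq, and the eigenvalue/eigenvector pair that grows fastest."""
    n = w_eq.size
    J = np.empty((n, n))
    F0 = F(w_eq)
    for j in range(n):
        dw = np.zeros(n)
        dw[j] = eps
        J[:, j] = (F(w_eq + dw) - F0) / eps  # column j of the Jacobian
    vals, vecs = np.linalg.eig(J)
    i = np.argmax(vals.real)  # leading mode of the linearised dynamics
    return vals[i].real, vecs[:, i].real
```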

35 By symmetry, the system separates into two modes, one involving the sum of the weight perturbations δW+ = δWR + δWL, which governs the precision of the topography of the final mapping, and one involving the difference δW− = δWR − δWL, which governs ocular dominance. [sent-61, score-0.849]

36 The development of ocular dominance requires that a mode of δW−(a, b) ≠ 0 grows, for which each output cell has weights of only one sign (either positive or negative). [sent-62, score-1.046]

37 The stripe width is determined by changes in this sign across the output layer. [sent-63, score-0.307]
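
The stripe structure can be read off a converged weight pattern by counting sign alternations of the net ocularity, as in figure 1D; a small diagnostic helper of our own, not from the paper:

```python
import numpy as np

def stripe_frequency(WL, WR):
    """Count sign alternations of the net ocularity across the output."""
    ocularity = (WR - WL).sum(axis=1)  # net ocularity for each output a
    signs = np.sign(ocularity)
    return int((np.diff(signs) != 0).sum())  # number of sign changes
```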

38 Equilibrium solution. The equilibrium values of the weights can be found by solving (5) for the Λ+ determined such that the normalisation constraint ∫db WL(a, b) + WR(a, b) = n is satisfied for all a. [sent-65, score-0.344]

39 The result is ((β + 1)I + βU)W² + (Λ((β + 1)I + βU) − (β − 1)UI)W − βΛIU = 0 (6). Figure 2 shows how the resulting physically realisable (W > 0) equilibrium value of σw depends on β, σA and σI, varying each in turn about a single set of values in figure 1. [sent-69, score-0.229]
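
Since the reconstructed equation 6 is quadratic in W, the physically realisable equilibrium can be read off numerically; here I, U and Λ stand for the constants appearing in the equation and are taken as given (a sketch against our reconstruction above, not the paper's own code).

```python
import numpy as np

def equilibrium_W(beta, I, U, Lam):
    """Physically realisable root (W > 0) of the quadratic, equation 6."""
    a = (beta + 1) * I + beta * U
    b = Lam * a - (beta - 1) * U * I
    c = -beta * Lam * I * U
    roots = np.roots([a, b, c])
    real = roots[np.isreal(roots)].real
    positive = real[real > 0]  # keep only the physically realisable root
    return positive.max() if positive.size else None
```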

40 Figure 2A shows that the width rapidly asymptotes as β grows, and it only gets large as the arbor function gets large for β near 1. [sent-70, score-0.364]

41 For β = 1 (the dotted line), which quite closely parallels the non-competitive case of Miller et al (1989), … [sent-72, score-0.127]

42 Figure 2: Log-log plots of the equilibrium values of σw in the case of multiplicative normalisation. [sent-99, score-0.298]

43 B) σw as a function of σA for β = 10 (solid), β = 1. [sent-108, score-0.192]

44 σw grows roughly like the square root of σA as the arborisation gets flatter. [sent-113, score-0.226]

45 For any β > 1, one equilibrium value of σw has a finite asymptote with σA. [sent-114, score-0.305]

46 For absolutely flat topography (σA = ∞) and β > 1, there are actually two equilibrium values for σw: one with σw = ∞, ie flat weights; the other with σw taking values such as the asymptotic values for the dotted and solid lines in figure 2B. [sent-115, score-0.817]

47 The sum mode. The update equation for (normalised) perturbations to the sum mode is δW+(a, b) → (1 − εΛ+)δW+(a, b) [sent-116, score-0.270]

48 + (εβ/2) ∫∫ da₁db₁ O(a, b, a₁, b₁) δW+(a₁, b₁) − ε… (7) [sent-117, score-0.103]

49 Here, the values of Λ+ and Λ'(a) = β ∫∫∫ db da₁db₁ A(a, b) O(a, b, a₁, b₁) δW+(a₁, b₁) / 2n² (10) come from the normalisation condition. [sent-119, score-0.318]

50 We consider the full eigenfunctions of O(a, b, a₁, b₁) below. [sent-123, score-0.301]

51 However, the case that Piepenbrock & Obermayer (1999) studied of a flat arbor function (σA = ∞) turns out to be special, admitting two equilibrium solutions, one flat, one with topography, whose stability depends on β. [sent-124, score-0.460]

52 For σA < ∞, the only Gaussian equilibrium solution for the weights has a refined topography (as one might expect), and this is stable. [sent-125, score-0.447]

53 This width depends on the parameters in a way shown in equation 6 and figure 2, in particular reaching a non-zero asymptote even as β gets very large. [sent-126, score-0.201]

54 The difference mode. The sum mode controls the refinement of topography, whereas the difference mode controls the development and nature of ocular dominance. [sent-127, score-0.883]

55 The equilibrium value of W−(a, b) is always 0, by symmetry, and the linearised difference equation for the mode is δW−(a, b) → (1 − ε… [sent-128, score-0.363]

56 Figure 3: Eigenfunctions and eigenvalues of O₁ (left block), O₂ (centre block), and the theoretical and empirical approximations to O (right columns). [sent-137, score-0.131]

57 Here, as in equation 12, k is the frequency of alternation of ocularity across the output (which is integral for a finite system); n is the order of the Hermite polynomial. [sent-138, score-0.211]

58 which is almost the same as equation 7 (with the same operator O), except that the multiplier for the integral is βγ²/2 rather than β/2. [sent-141, score-0.088]

59 Since γ < 1, the eigenvalues for the difference mode are therefore all less than those for the sum mode, and by the same fraction. [sent-142, score-0.258]

60 Note that the equilibrium values of the weights (controlled by σw) affect the operator O, and hence its eigenfunctions and eigenvalues. [sent-144, score-0.511]

61 Provided that the arbor and the initial values of the weights are not both flat (σA ≠ ∞), [sent-145, score-0.403]

62 the principal eigenfunctions of O₁ and O₂ have the general form (12), where Pn(r, k) is a polynomial (related to a Hermite polynomial) of degree n in r whose coefficients depend on k. [sent-147, score-0.235]

63 Here k controls the periodicity in the projective field of each input cell b to the output cells, and ultimately the periodicity of any ocular dominance stripes that might form. [sent-148, score-1.1]

64 Operator O₂ has zero eigenvalues for the polynomials of degree n > 0. [sent-150, score-0.168]

65 The expressions for the coefficients of the polynomials and the non-zero eigenvalues of O₁ and O₂ are rather complicated. [sent-151, score-0.131]

66 The left 4 × 3 block shows eigenfunctions and eigenvalues of O₁ for k = 0.5 [sent-153, score-0.410]

67 and n = 0, 1, 2; the middle 4 × 3 block, the equivalent eigenfunctions and eigenvalues of O₂. [sent-156, score-0.329]

68 The eigenvalues come essentially from a Gaussian, whose standard deviation is smaller for O₂. [sent-157, score-0.131]

69 To a crude first approximation, therefore, the eigenvalues of O resemble the difference of two Gaussians in k, and so have a peak at a non-zero value of k, ie a finite ocular dominance periodicity. [sent-158, score-0.986]
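
That approximation makes the preferred periodicity easy to see: a broad Gaussian in k minus a narrower one peaks at non-zero k. A toy sketch with assumed widths s1 and s2:

```python
import numpy as np

def preferred_k(s1=2.0, s2=1.0, kmax=10):
    """Peak of g(k) = exp(-k^2/2 s1^2) - exp(-k^2/2 s2^2) over integer k.
    s2 < s1 mimics the narrower Gaussian associated with O2, so the
    difference peaks at a non-zero stripe frequency."""
    k = np.arange(kmax + 1, dtype=float)
    g = np.exp(-k ** 2 / (2 * s1 ** 2)) - np.exp(-k ** 2 / (2 * s2 ** 2))
    return int(k[np.argmax(g)])

print(preferred_k())  # -> 2: a finite ocular dominance periodicity
```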

70 Although the eigenfunctions of O₁ and O₂ shown in figure 3 look almost identical, they are, in fact, subtly different, since O₁ and O₂ do not commute (except for flat or rigid topography). [sent-160, score-0.340]

71 The similarity between the eigenfunctions makes it possible to approximate the eigenfunctions of O very closely by expanding those of O₂ in terms of O₁ (or vice-versa). [sent-161, score-0.396]

72 Expanding for n ≤ 2 leads to the approximate eigenfunctions and eigenvalues for O shown in the penultimate column on the right of figure 3. [sent-163, score-0.368]

73 …/N) (dotted line) and the ocular dominance eigenvalues e(k)(Q/N) (solid line γ = 1; dotted line γ = 0.5) [sent-165, score-1.084]

74 of βγ²O/2 as a function of σI, where k is the stripe frequency associated with the maximum eigenvalue. [sent-166, score-0.286]

75 For σI too large, the ocular dominance eigenfunction no longer dominates. [sent-167, score-0.821]

76 The star and hexagon show the maximum values of σI such that ocular dominance can form in each case. [sent-168, score-0.890]

77 B) Stripe frequency k associated with the largest eigenvalue as a function of σI. [sent-170, score-0.215]

78 The star and hexagon are the same as in (A), showing that the critical preferred stripe frequency is greater for higher correlations between the inputs (lower γ). [sent-171, score-0.366]

79 For comparison, the farthest right column shows empirically calculated eigenfunctions and eigenvalues of O (using a 50 × 50 grid). [sent-174, score-0.368]
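
Such an empirical calculation is just a matter of discretising O as a matrix and diagonalising it; a generic sketch, where how O itself is assembled is model-specific and omitted (we assume it is supplied as a matrix, e.g. 2500 × 2500 for a 50 × 50 grid of (a, b) pairs):

```python
import numpy as np

def leading_modes(O, top=4):
    """Largest eigenvalues of the discretised operator O and the
    corresponding eigenfunctions (columns), sorted by real part."""
    vals, vecs = np.linalg.eig(O)
    order = np.argsort(-vals.real)[:top]
    return vals[order].real, vecs[:, order].real
```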

80 For the parameters of figure 3, the case with k = 3 has the largest eigenvalue, and exactly this leads to the outcome of figure 1C;D. [sent-176, score-0.101]

81 4 Results. We can now predict the outcome of development for any set of parameters. [sent-177, score-0.096]

82 First, the analysis of the behavior of the sum mode (including, if necessary, the point about multiple equilibria for flat initial topography) allows a prediction of the equilibrium value of σw, which indicates the degree of topographic refinement. [sent-178, score-0.418]

83 Second, this value of σw can be used to calculate the value of the normalisation parameter λ [sent-179, score-0.141]

84 that the eigenvalues of O must surmount for a solution that is not completely binocular to develop. [sent-182, score-0.131]

85 Third, if the peak eigenvalue of O is indeed sufficiently large that ocular dominance develops, then the favored periodicity is set by the value of k associated with this eigenvalue. [sent-183, score-0.951]

86 Of course, if many eigenfunctions have similarly large eigenvalues, then slightly different stripe periodicities may be observed depending on the initial conditions. [sent-184, score-0.415]

87 The solid line in figure 4A shows the largest eigenvalue of βγ²O/2 as a function of the width of the cortical interactions σI, for γ = 1, the value of σw specified through the equilibrium analysis, and values of the other parameters as in figure 1. [sent-185, score-0.642]

88 The largest value of σI for which ocular dominance still forms is indicated by the star. [sent-188, score-0.878]

89 For γ = 0.5, the eigenvalues are reduced by a factor of γ² = 0.25. [sent-190, score-0.131]

90 Figure 4B shows the frequency of the stripes associated with the largest eigenvalue. [sent-192, score-0.297]

91 This line is jagged because only integers are acceptable as stripe frequencies. [sent-194, score-0.226]

92 If the frequency of the stripes is most strongly determined by the frequency that grows fastest when σI is first sufficiently small that stripes grow, we can analyse plots such as those in figure 4 to determine the outcome of development. [sent-197, score-0.612]

93 Figure 5: First three figures: maximal values of σI for which ocular dominance will develop as a function of γ. [sent-212, score-0.851]

94 Last three figures: value of stripe frequency k associated with the maximal eigenvalue for parameters as in the left three plots, at the critical value of σI. [sent-217, score-0.535]

95 The top plots show the largest values of σI for which ocular dominance can develop; the bottom plots show the stripe frequencies associated with these critical values of σI (like the stars and hexagons in figure 4), in both cases as a function of γ. [sent-218, score-1.227]

96 Where no value of σI permits ocular dominance to form, no line is shown. [sent-223, score-0.860]

97 From the plots, we can see that the more similar the inputs (the smaller γ) or the weaker the competition (the smaller β), the harder it is for ocular dominance to form. [sent-224, score-0.963]

98 However, if ocular dominance does form, then the width of the stripes depends only weakly on the degree of competition, and slightly more strongly on the width of the arbors. [sent-225, score-1.149]

99 For rigid topography, as σA → 0, the critical value of σI depends roughly linearly on γ. [sent-227, score-0.125]

100 Note that the stripe width predicted by the linear analysis does not depend on the correlation between the input projections unless other parameters (such as σI) change, although ocular dominance might not develop for some values of the parameters. [sent-229, score-1.169]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('ocular', 0.45), ('dominance', 0.337), ('topography', 0.215), ('arbor', 0.202), ('eigenfunctions', 0.198), ('stripe', 0.185), ('obermayer', 0.176), ('competition', 0.176), ('equilibrium', 0.162), ('wl', 0.137), ('fra', 0.137), ('piepenbrock', 0.137), ('stripes', 0.137), ('eigenvalues', 0.131), ('wr', 0.107), ('ur', 0.107), ('bl', 0.103), ('competitive', 0.098), ('flat', 0.096), ('mode', 0.091), ('cortical', 0.085), ('dotted', 0.084), ('aa', 0.084), ('formation', 0.08), ('width', 0.078), ('normalisation', 0.077), ('uw', 0.076), ('fr', 0.075), ('ul', 0.075), ('aw', 0.072), ('miller', 0.072), ('weights', 0.07), ('analyse', 0.067), ('ua', 0.067), ('frequency', 0.066), ('hebbian', 0.063), ('fl', 0.06), ('largest', 0.059), ('arborisation', 0.059), ('erwin', 0.059), ('ocularity', 0.059), ('swindale', 0.059), ('solid', 0.059), ('ow', 0.057), ('eigenvalue', 0.055), ('fj', 0.055), ('development', 0.054), ('grows', 0.053), ('governs', 0.051), ('input', 0.048), ('critical', 0.047), ('activities', 0.046), ('perturbations', 0.046), ('rigid', 0.046), ('operator', 0.046), ('output', 0.044), ('plots', 0.044), ('al', 0.043), ('eyes', 0.042), ('continuum', 0.042), ('periodicity', 0.042), ('block', 0.042), ('gets', 0.042), ('outcome', 0.042), ('equation', 0.042), ('line', 0.041), ('left', 0.039), ('asymptote', 0.039), ('hermite', 0.039), ('jda', 0.039), ('schulten', 0.039), ('thalamic', 0.039), ('right', 0.039), ('degree', 0.037), ('dashed', 0.037), ('synaptic', 0.037), ('difference', 0.036), ('interactions', 0.036), ('projections', 0.036), ('values', 0.035), ('associated', 0.035), ('pattern', 0.034), ('refinement', 0.034), ('hexagon', 0.034), ('eigenfunction', 0.034), ('kohonen', 0.034), ('laid', 0.034), ('star', 0.034), ('thalamus', 0.034), ('value', 0.032), ('slightly', 0.032), ('cooperation', 0.031), ('eigenmodes', 0.031), ('governing', 0.031), ('interactive', 0.031), ('vi', 0.03), ('cells', 0.03), ('units', 0.029), ('figures', 0.029), ('growth', 0.028)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.9999994 34 nips-2000-Competition and Arbors in Ocular Dominance

Author: Peter Dayan

Abstract: Hebbian and competitive Hebbian algorithms are almost ubiquitous in modeling pattern formation in cortical development. We analyse in theoretical detail a particular model (adapted from Piepenbrock & Obermayer, 1999) for the development of 1d stripe-like patterns, which places competitive and interactive cortical influences, and free and restricted initial arborisation onto a common footing.

2 0.082683496 102 nips-2000-Position Variance, Recurrence and Perceptual Learning

Author: Zhaoping Li, Peter Dayan

Abstract: Stimulus arrays are inevitably presented at different positions on the retina in visual tasks, even those that nominally require fixation. In particular, this applies to many perceptual learning tasks. We show that perceptual inference or discrimination in the face of positional variance has a structurally different quality from inference about fixed position stimuli, involving a particular, quadratic, non-linearity rather than a purely linear discrimination. We show the advantage taking this non-linearity into account has for discrimination, and suggest it as a role for recurrent connections in area V1, by demonstrating the superior discrimination performance of a recurrent network. We propose that learning the feedforward and recurrent neural connections for these tasks corresponds to the fast and slow components of learning observed in perceptual learning tasks.

3 0.074648678 129 nips-2000-Temporally Dependent Plasticity: An Information Theoretic Account

Author: Gal Chechik, Naftali Tishby

Abstract: The paradigm of Hebbian learning has recently received a novel interpretation with the discovery of synaptic plasticity that depends on the relative timing of pre and post synaptic spikes. This paper derives a temporally dependent learning rule from the basic principle of mutual information maximization and studies its relation to the experimentally observed plasticity. We find that a supervised spike-dependent learning rule sharing similar structure with the experimentally observed plasticity increases mutual information to a stable near optimal level. Moreover, the analysis reveals how the temporal structure of time-dependent learning rules is determined by the temporal filter applied by neurons over their inputs. These results suggest experimental prediction as to the dependency of the learning rule on neuronal biophysical parameters 1

4 0.070375763 124 nips-2000-Spike-Timing-Dependent Learning for Oscillatory Networks

Author: Silvia Scarpetta, Zhaoping Li, John A. Hertz

Abstract: We apply to oscillatory networks a class of learning rules in which synaptic weights change proportional to pre- and post-synaptic activity, with a kernel A(r) measuring the effect for a postsynaptic spike a time r after the presynaptic one. The resulting synaptic matrices have an outer-product form in which the oscillating patterns are represented as complex vectors. In a simple model, the even part of A(r) enhances the resonant response to learned stimulus by reducing the effective damping, while the odd part determines the frequency of oscillation. We relate our model to the olfactory cortex and hippocampus and their presumed roles in forming associative memories and input representations. 1

5 0.069223158 110 nips-2000-Regularization with Dot-Product Kernels

Author: Alex J. Smola, Zoltán L. Óvári, Robert C. Williamson

Abstract: In this paper we give necessary and sufficient conditions under which kernels of dot product type k(x, y) = k(x . y) satisfy Mercer's condition and thus may be used in Support Vector Machines (SVM), Regularization Networks (RN) or Gaussian Processes (GP). In particular, we show that if the kernel is analytic (i.e. can be expanded in a Taylor series), all expansion coefficients have to be nonnegative. We give an explicit functional form for the feature map by calculating its eigenfunctions and eigenvalues. 1

6 0.067687467 40 nips-2000-Dendritic Compartmentalization Could Underlie Competition and Attentional Biasing of Simultaneous Visual Stimuli

7 0.066704914 8 nips-2000-A New Model of Spatial Representation in Multimodal Brain Areas

8 0.058007993 81 nips-2000-Learning Winner-take-all Competition Between Groups of Neurons in Lateral Inhibitory Networks

9 0.056671564 66 nips-2000-Hippocampally-Dependent Consolidation in a Hierarchical Model of Neocortex

10 0.050465308 104 nips-2000-Processing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics

11 0.049227491 49 nips-2000-Explaining Away in Weight Space

12 0.048834294 77 nips-2000-Learning Curves for Gaussian Processes Regression: A Framework for Good Approximations

13 0.048790637 107 nips-2000-Rate-coded Restricted Boltzmann Machines for Face Recognition

14 0.047634412 146 nips-2000-What Can a Single Neuron Compute?

15 0.047136575 99 nips-2000-Periodic Component Analysis: An Eigenvalue Method for Representing Periodic Structure in Speech

16 0.046128001 108 nips-2000-Recognizing Hand-written Digits Using Hierarchical Products of Experts

17 0.046066698 100 nips-2000-Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks

18 0.045779336 11 nips-2000-A Silicon Primitive for Competitive Learning

19 0.043094169 27 nips-2000-Automatic Choice of Dimensionality for PCA

20 0.042846322 92 nips-2000-Occam's Razor


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, 0.15), (1, -0.081), (2, -0.084), (3, -0.022), (4, 0.023), (5, 0.044), (6, -0.004), (7, -0.074), (8, 0.023), (9, -0.037), (10, -0.024), (11, -0.043), (12, -0.034), (13, -0.051), (14, 0.121), (15, 0.01), (16, 0.037), (17, -0.05), (18, 0.139), (19, -0.016), (20, 0.026), (21, 0.12), (22, -0.051), (23, -0.025), (24, -0.071), (25, 0.081), (26, 0.039), (27, 0.043), (28, -0.001), (29, 0.011), (30, 0.064), (31, -0.026), (32, 0.06), (33, -0.057), (34, 0.055), (35, -0.015), (36, -0.209), (37, 0.071), (38, 0.005), (39, -0.06), (40, 0.17), (41, 0.109), (42, 0.123), (43, 0.283), (44, -0.178), (45, 0.088), (46, -0.105), (47, 0.087), (48, 0.106), (49, -0.116)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.95288336 34 nips-2000-Competition and Arbors in Ocular Dominance

Author: Peter Dayan

Abstract: Hebbian and competitive Hebbian algorithms are almost ubiquitous in modeling pattern formation in cortical development. We analyse in theoretical detail a particular model (adapted from Piepenbrock & Obermayer, 1999) for the development of 1d stripe-like patterns, which places competitive and interactive cortical influences, and free and restricted initial arborisation onto a common footing.

2 0.59896988 66 nips-2000-Hippocampally-Dependent Consolidation in a Hierarchical Model of Neocortex

Author: Szabolcs Káli, Peter Dayan

Abstract: In memory consolidation, declarative memories which initially require the hippocampus for their recall, ultimately become independent of it. Consolidation has been the focus of numerous experimental and qualitative modeling studies, but only little quantitative exploration. We present a consolidation model in which hierarchical connections in the cortex, that initially instantiate purely semantic information acquired through probabilistic unsupervised learning, come to instantiate episodic information as well. The hippocampus is responsible for helping complete partial input patterns before consolidation is complete, while also training the cortex to perform appropriate completion by itself.

3 0.41282776 124 nips-2000-Spike-Timing-Dependent Learning for Oscillatory Networks

Author: Silvia Scarpetta, Zhaoping Li, John A. Hertz

Abstract: We apply to oscillatory networks a class of learning rules in which synaptic weights change proportional to pre- and post-synaptic activity, with a kernel A(r) measuring the effect for a postsynaptic spike a time r after the presynaptic one. The resulting synaptic matrices have an outer-product form in which the oscillating patterns are represented as complex vectors. In a simple model, the even part of A(r) enhances the resonant response to learned stimulus by reducing the effective damping, while the odd part determines the frequency of oscillation. We relate our model to the olfactory cortex and hippocampus and their presumed roles in forming associative memories and input representations. 1

4 0.38535196 131 nips-2000-The Early Word Catches the Weights

Author: Mark A. Smith, Garrison W. Cottrell, Karen L. Anderson

Abstract: The strong correlation between the frequency of words and their naming latency has been well documented. However, as early as 1973, the Age of Acquisition (AoA) of a word was alleged to be the actual variable of interest, but these studies seem to have been ignored in most of the literature. Recently, there has been a resurgence of interest in AoA. While some studies have shown that frequency has no effect when AoA is controlled for, more recent studies have found independent contributions of frequency and AoA. Connectionist models have repeatedly shown strong effects of frequency, but little attention has been paid to whether they can also show AoA effects. Indeed, several researchers have explicitly claimed that they cannot show AoA effects. In this work, we explore these claims using a simple feed forward neural network. We find a significant contribution of AoA to naming latency, as well as conditions under which frequency provides an independent contribution. 1 Background Naming latency is the time between the presentation of a picture or written word and the beginning of the correct utterance of that word. It is undisputed that there are significant differences in the naming latency of many words, even when controlling word length, syllabic complexity, and other structural variants. The cause of differences in naming latency has been the subject of numerous studies. Earlier studies found that the frequency with which a word appears in spoken English is the best determinant of its naming latency (Oldfield & Wingfield, 1965). More recent psychological studies, however, show that the age at which a word is learned, or its Age of Acquisition (AoA), may be a better predictor of naming latency. Further, in many multiple regression analyses, frequency is not found to be significant when AoA is controlled for (Brown & Watson, 1987; Carroll & White, 1973; Morrison et al. 1992; Morrison & Ellis, 1995). These studies show that frequency and AoA are highly correlated (typically r =-.6) explaining the confound of older studies on frequency. However, still more recent studies question this finding and find that both AoA and frequency are significant and contribute independently to naming latency (Ellis & Morrison, 1998; Gerhand & Barry, 1998,1999). Much like their psychological counterparts, connectionist networks also show very strong frequency effects. However, the ability of a connectionist network to show AoA effects has been doubted (Gerhand & Barry, 1998; Morrison & Ellis, 1995). Most of these claims are based on the well known fact that connectionist networks exhibit

5 0.38085312 102 nips-2000-Position Variance, Recurrence and Perceptual Learning

Author: Zhaoping Li, Peter Dayan

Abstract: Stimulus arrays are inevitably presented at different positions on the retina in visual tasks, even those that nominally require fixation. In particular, this applies to many perceptual learning tasks. We show that perceptual inference or discrimination in the face of positional variance has a structurally different quality from inference about fixed position stimuli, involving a particular, quadratic, non-linearity rather than a purely linear discrimination. We show the advantage taking this non-linearity into account has for discrimination, and suggest it as a role for recurrent connections in area V1, by demonstrating the superior discrimination performance of a recurrent network. We propose that learning the feedforward and recurrent neural connections for these tasks corresponds to the fast and slow components of learning observed in perceptual learning tasks.

6 0.30922857 110 nips-2000-Regularization with Dot-Product Kernels

7 0.29036292 99 nips-2000-Periodic Component Analysis: An Eigenvalue Method for Representing Periodic Structure in Speech

8 0.28165671 18 nips-2000-Active Support Vector Machine Classification

9 0.27207109 43 nips-2000-Dopamine Bonuses

10 0.25815213 56 nips-2000-Foundations for a Circuit Complexity Theory of Sensory Processing

11 0.24986908 49 nips-2000-Explaining Away in Weight Space

12 0.24081913 87 nips-2000-Modelling Spatial Recall, Mental Imagery and Neglect

13 0.22850537 40 nips-2000-Dendritic Compartmentalization Could Underlie Competition and Attentional Biasing of Simultaneous Visual Stimuli

14 0.22457264 11 nips-2000-A Silicon Primitive for Competitive Learning

15 0.22053188 125 nips-2000-Stability and Noise in Biochemical Switches

16 0.21580465 135 nips-2000-The Manhattan World Assumption: Regularities in Scene Statistics which Enable Bayesian Inference

17 0.2113139 92 nips-2000-Occam's Razor

18 0.20820105 48 nips-2000-Exact Solutions to Time-Dependent MDPs

19 0.2074904 79 nips-2000-Learning Segmentation by Random Walks

20 0.20697868 93 nips-2000-On Iterative Krylov-Dogleg Trust-Region Steps for Solving Neural Networks Nonlinear Least Squares Problems


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(10, 0.048), (17, 0.088), (21, 0.376), (33, 0.031), (54, 0.013), (55, 0.012), (62, 0.033), (65, 0.013), (67, 0.063), (75, 0.01), (76, 0.066), (79, 0.025), (81, 0.038), (90, 0.05), (91, 0.011), (93, 0.016), (97, 0.012)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.82089055 34 nips-2000-Competition and Arbors in Ocular Dominance

Author: Peter Dayan

Abstract: Hebbian and competitive Hebbian algorithms are almost ubiquitous in modeling pattern formation in cortical development. We analyse in theoretical detail a particular model (adapted from Piepenbrock & Obermayer, 1999) for the development of 1d stripe-like patterns, which places competitive and interactive cortical influences, and free and restricted initial arborisation onto a common footing.

2 0.3545832 122 nips-2000-Sparse Representation for Gaussian Process Models

Author: Lehel Csató, Manfred Opper

Abstract: We develop an approach for a sparse representation for Gaussian Process (GP) models in order to overcome the limitations of GPs caused by large data sets. The method is based on a combination of a Bayesian online algorithm together with a sequential construction of a relevant subsample of the data which fully specifies the prediction of the model. Experimental results on toy examples and large real-world data sets indicate the efficiency of the approach.

3 0.3544018 102 nips-2000-Position Variance, Recurrence and Perceptual Learning

Author: Zhaoping Li, Peter Dayan

Abstract: Stimulus arrays are inevitably presented at different positions on the retina in visual tasks, even those that nominally require fixation. In particular, this applies to many perceptual learning tasks. We show that perceptual inference or discrimination in the face of positional variance has a structurally different quality from inference about fixed position stimuli, involving a particular, quadratic, non-linearity rather than a purely linear discrimination. We show the advantage taking this non-linearity into account has for discrimination, and suggest it as a role for recurrent connections in area V1, by demonstrating the superior discrimination performance of a recurrent network. We propose that learning the feedforward and recurrent neural connections for these tasks corresponds to the fast and slow components of learning observed in perceptual learning tasks.

4 0.35317084 104 nips-2000-Processing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics

Author: Thomas Natschläger, Wolfgang Maass, Eduardo D. Sontag, Anthony M. Zador

Abstract: Experimental data show that biological synapses behave quite differently from the symbolic synapses in common artificial neural network models. Biological synapses are dynamic, i.e., their

5 0.35131416 146 nips-2000-What Can a Single Neuron Compute?

Author: Blaise Agüera y Arcas, Adrienne L. Fairhall, William Bialek

Abstract: In this paper we formulate a description of the computation performed by a neuron as a combination of dimensional reduction and nonlinearity. We implement this description for the HodgkinHuxley model, identify the most relevant dimensions and find the nonlinearity. A two dimensional description already captures a significant fraction of the information that spikes carry about dynamic inputs. This description also shows that computation in the Hodgkin-Huxley model is more complex than a simple integrateand-fire or perceptron model. 1

6 0.34992772 74 nips-2000-Kernel Expansions with Unlabeled Examples

7 0.34778434 7 nips-2000-A New Approximate Maximal Margin Classification Algorithm

8 0.34567261 106 nips-2000-Propagation Algorithms for Variational Bayesian Learning

9 0.34469289 4 nips-2000-A Linear Programming Approach to Novelty Detection

10 0.3437199 21 nips-2000-Algorithmic Stability and Generalization Performance

11 0.34132227 49 nips-2000-Explaining Away in Weight Space

12 0.34115478 9 nips-2000-A PAC-Bayesian Margin Bound for Linear Classifiers: Why SVMs work

13 0.34078285 69 nips-2000-Incorporating Second-Order Functional Knowledge for Better Option Pricing

14 0.34066683 37 nips-2000-Convergence of Large Margin Separable Linear Classification

15 0.34063077 119 nips-2000-Some New Bounds on the Generalization Error of Combined Classifiers

16 0.33999625 95 nips-2000-On a Connection between Kernel PCA and Metric Multidimensional Scaling

17 0.33963507 64 nips-2000-High-temperature Expansions for Learning Models of Nonnegative Data

18 0.33890632 111 nips-2000-Regularized Winnow Methods

19 0.33860406 134 nips-2000-The Kernel Trick for Distances

20 0.33562836 79 nips-2000-Learning Segmentation by Random Walks