nips nips2000 nips2000-125 knowledge-graph by maker-knowledge-mining

125 nips-2000-Stability and Noise in Biochemical Switches


Source: pdf

Author: William Bialek

Abstract: Many processes in biology, from the regulation of gene expression in bacteria to memory in the brain, involve switches constructed from networks of biochemical reactions. Crucial molecules are present in small numbers, raising questions about noise and stability. Analysis of noise in simple reaction schemes indicates that switches stable for years and switchable in milliseconds can be built from fewer than one hundred molecules. Prospects for direct tests of this prediction, as well as implications, are discussed. 1

Reference: text


Summary: the most important sentences generated by tfidf model

sentIndex sentText sentNum sentScore

1 Stability and noise in biochemical switches William Bialek NEC Research Institute 4 Independence Way Princeton, New Jersey 08540 bialek@research. [sent-1, score-0.534]

2 com Abstract Many processes in biology, from the regulation of gene expression in bacteria to memory in the brain, involve switches constructed from networks of biochemical reactions. [sent-4, score-0.544]

3 Crucial molecules are present in small numbers, raising questions about noise and stability. [sent-5, score-0.569]

4 Analysis of noise in simple reaction schemes indicates that switches stable for years and switchable in milliseconds can be built from fewer than one hundred molecules. [sent-6, score-0.746]

5 1 Introduction The problem of building a reliable switch arises in several different biological contexts. [sent-8, score-0.244]

6 The classical example is the switching on and off of gene expression during development [1], or in simpler systems such as phage λ. [sent-9, score-0.359]

7 It is likely that the cell cycle should also be viewed as a sequence of switching events among discrete states, rather than as a continuously running clock [3]. [sent-11, score-0.203]

8 The stable switching of a specific class of kinase molecules between active and inactive states is believed to play a role in synaptic plasticity, and by implication in the maintenance of stored memories [4]. [sent-12, score-1.078]

9 First, the stable states of the switches are dissipative, so that they reflect a balance among competing biochemical reactions. [sent-14, score-0.713]

10 Second, the total number of molecules involved in the construction of the switch is not large. [sent-15, score-0.78]

11 Finally, the switch, once flipped, must be stable for a time long compared to the switching time, perhaps, for development and for memory, even for a time comparable to the life of the organism. [sent-16, score-0.398]

12 Intuitively we might expect that systems with small numbers of molecules would be subject to noise and instability [5], and while this is true we shall see that extremely stable biochemical switches can in fact be built from a few tens of molecules. [sent-17, score-1.336]

13 This has interesting implications for how we think about several cellular processes, and should be testable directly. [sent-18, score-0.036]

14 Many biological molecules can exist in multiple states, and biochemical switches use this molecular multistability so that the state of the switch can be 'read out' by sampling the states (or enzymatic activities) of individual molecules. [sent-19, score-1.389]

15 Nonetheless, these biochemical switches are based on a network of reactions, with stable states that are collective properties of the network dynamics and not of any individual molecule. [sent-20, score-0.753]

16 Most previous work on the properties of biochemical reaction networks has involved detailed simulation of particular kinetic schemes [6], for example in discussing the kinase switch that is involved in synaptic plasticity [7]. [sent-21, score-1.065]

17 Even the problem of noise has been discussed heuristically in this context [8]. [sent-22, score-0.083]

18 The goal in the present analysis is to separate the problem of noise and stability from other issues, and to see if it is possible to make some general statements about the limits to stability in switches built from a small number of molecules. [sent-23, score-0.427]

19 2 Stochastic kinetic equations Imagine that we write down the kinetic equations for some set of biochemical reactions which describe the putative switch. [sent-25, score-0.833]

20 Now let us assume that most of the reactions are fast, so that there is a single molecular species whose concentration varies more slowly than all the others. [sent-26, score-0.398]

21 Then the dynamics of the switch are essentially one dimensional, and this simplification allows a complete discussion using standard analytical methods. [sent-27, score-0.284]

22 In particular, in this limit there are general bounds on the stability of switches, and these bounds are independent of (incompletely known) details in the biochemical kinetics. [sent-28, score-0.36]

23 It should be possible to make progress on multidimensional versions of the problem, but the point here is to show that there exists a limit in which stable switches can be built from small numbers of molecules. [sent-29, score-0.452]

24 Let the number of molecules of the 'slow species' be n. [sent-30, score-0.486]

25 All the different reactions can be broken into two classes: the synthesis of the slow species at a rate f(n) molecules per second, and its degradation at a rate g(n) molecules per second; the dependencies on n can be complicated because they include the effects of all other species in the system. [sent-31, score-1.483]

26 Then, if we could neglect fluctuations, we would write the effective kinetic equation dn/dt = f(n) − g(n). (1) [sent-32, score-0.385]

27 If the system is to function as a switch, then the stationarity condition f(n) = g(n) must have multiple solutions with appropriate local stability properties. [sent-33, score-0.087]

28 The fact that molecules are discrete units means that we need to give the chemical kinetic Eq. (1) a more microscopic, stochastic interpretation. [sent-34, score-0.783]

29 It is the mean field approximation to a stochastic process in which there is a probability per unit time f(n) of making the transition n → n+1, and a probability per unit time g(n) of the opposite transition n → n−1. [sent-36, score-0.142]

30 Thus if we consider the probability P(n, t) for there being n molecules at time t, this distribution obeys the evolution (or 'master') equation ∂P(n, t)/∂t = f(n−1)P(n−1, t) + g(n+1)P(n+1, t) − [f(n) + g(n)]P(n, t), (2) with obvious corrections for n = 0, 1. [sent-37, score-0.486]
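
As a side note (not from the paper), the one-step process behind the master equation (2) can be simulated exactly with the Gillespie algorithm. The rate functions f and g below are hypothetical placeholders, chosen only to produce two stable states; a minimal Python sketch:

```python
import numpy as np

def gillespie_birth_death(f, g, n0, t_max, seed=0):
    """Exact stochastic simulation of the one-step master equation (2):
    birth n -> n+1 at rate f(n), death n -> n-1 at rate g(n)."""
    rng = np.random.default_rng(seed)
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_max:
        up, down = f(n), g(n)
        total = up + down
        if total <= 0.0:
            break
        t += rng.exponential(1.0 / total)              # waiting time to the next reaction
        n += 1 if rng.random() < up / total else -1    # synthesis vs. degradation
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

# Hypothetical bistable scheme (not from the paper): cooperative synthesis, linear degradation.
# Because g(0) = 0, the count n never goes negative.
f = lambda n: 2.0 + 48.0 * n**4 / (25.0**4 + n**4)   # molecules synthesized per second
g = lambda n: 1.0 * n                                 # molecules degraded per second

times, counts = gillespie_birth_death(f, g, n0=5, t_max=1000.0)
```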

31 Then the unit step of one molecule is small compared with typical values of n, and we can approximate P(n, t) as a smooth function of n. [sent-39, score-0.054]

32 The resulting equation (3) is analogous to the diffusion equation for a particle moving in a potential, but this analogy works only if we allow the effective temperature to vary with the position of the particle. [sent-42, score-0.372]

33 As with diffusion or Brownian motion, there is an alternative to the diffusion equation for P(n, t) and this is to write an equation of motion for n(t) which supplements Eq. [sent-43, score-0.179]

34 (1) by the addition of a random or Langevin force ξ(t): dn/dt = f(n) − g(n) + ξ(t), (4) with ⟨ξ(t)ξ(t′)⟩ = [f(n) + g(n)] δ(t − t′). (5) [sent-44, score-0.092]

35 From the Langevin equation we can also develop the distribution functional for the probability of trajectories n(t). [sent-45, score-0.08]

36 It should be emphasized that all of these approaches are equivalent provided that we are careful to treat the spatial variations of the effective temperature [10]. [sent-46, score-0.261]
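
For comparison, here is a sketch of the Langevin route, Eqs. (4)-(5), integrated with a simple Euler-Maruyama step; the state-dependent noise amplitude means the discretization convention matters, and the Itô choice is assumed here. The rates are the same hypothetical ones used in the Gillespie sketch above:

```python
import numpy as np

def langevin_switch(f, g, n0, t_max, dt=1e-3, seed=0):
    """Euler-Maruyama integration of dn/dt = f(n) - g(n) + xi(t),
    with <xi(t) xi(t')> = [f(n) + g(n)] delta(t - t')  (Ito convention assumed)."""
    rng = np.random.default_rng(seed)
    steps = int(t_max / dt)
    n = np.empty(steps + 1)
    n[0] = n0
    for k in range(steps):
        drift = f(n[k]) - g(n[k])
        noise = np.sqrt((f(n[k]) + g(n[k])) * dt) * rng.standard_normal()
        n[k + 1] = max(n[k] + drift * dt + noise, 0.0)   # keep the molecule count nonnegative
    return n

# Same hypothetical rates as before.
f = lambda n: 2.0 + 48.0 * n**4 / (25.0**4 + n**4)
g = lambda n: 1.0 * n
trace = langevin_switch(f, g, n0=5.0, t_max=1000.0)
```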

37 For any particular kinetic scheme we can compute the effective potential and temperature, and kinetic schemes with multiple stable states correspond to potential functions with multiple minima. [sent-48, score-0.876]

38 3 Noise induced switching rates We want to know how the noise term destabilizes the distinct stable states of the switch. [sent-49, score-0.548]

39 If the noise is small, then by analogy with thermal noise we expect that there will be some small jitter around the stable states, but also some rate of spontaneous jumping between the states, analogous to thermal activation over an energy barrier as in a chemical reaction. [sent-50, score-0.923]

40 This jumping rate should be the product of an "attempt frequency" (of order the relaxation rate in the neighborhood of one stable state) and a "Boltzmann factor" that expresses the exponentially small probability of going over the barrier. [sent-51, score-0.28]

41 For ordinary chemical reactions this Boltzmann factor is just exp(−F†/kBT), where F† is the activation free energy. [sent-52, score-0.483]

42 If we want to build a switch that can be stable for a time much longer than the switching time itself, then the Boltzmann factor has to provide this large ratio of time scales. [sent-53, score-0.682]

43 There are several ways to calculate the analog of the Boltzmann factor for the dynamics in Eq. [sent-54, score-0.08]

44 The first step is to make more explicit the analogy with Brownian motion and thermal activation. [sent-56, score-0.147]

45 Comparing with Eq. (4), we see that our problem is equivalent to a particle with γ = 1 in an effective potential V_eff(n) such that V′_eff(n) = g(n) − f(n), at an effective temperature T_eff(n) = [f(n) + g(n)]/2. [sent-59, score-0.389]

46 Their reference for the failure of Langevin methods [12], however, seems to consider only Langevin terms with constant spectral density, thus ignoring (in the present language) the spatial variations of effective temperature. [sent-61, score-0.143]

47 For the present problem this would mean replacing the noise correlation function [f(n) + g(n)]δ(t − t′) in Eq. (5) by one with a constant spectral density. [sent-62, score-0.12]

48 This indeed is wrong, and is not equivalent to the master equation. [sent-64, score-0.036]

49 If the claims of Refs. [11, 12] were generally correct, they would imply that Langevin methods could not be used for the description of Brownian motion with a spatially varying temperature, and this would be quite a surprise. [sent-66, score-0.121]

50 The equilibrium distribution is P_eq(n) ∝ [1/T_eff(n)] exp[ −∫^n dy (g(y) − f(y)) / T_eff(y) ]. (8) One way to identify the Boltzmann factor for spontaneous switching is then to compute the relative equilibrium occupancy of the stable states (n0 and n1) and the unstable "transition state" at n*. [sent-68, score-0.701]

51 The result is that the effective activation energy for transitions from a stable state at n = n0 to the stable state at n = n1 > n0 is F†(n0 → n1) = 2 kB T ∫_{n0}^{n*} dn [g(n) − f(n)] / [g(n) + f(n)], (9) [sent-69, score-0.631]

52 where n* is the unstable point, and similarly for the reverse transition, F†(n1 → n0) = 2 kB T ∫_{n*}^{n1} dn [f(n) − g(n)] / [f(n) + g(n)]. (10) [sent-70, score-0.113]
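
To make these expressions concrete, here is a short sketch (my own, using the same hypothetical rates as in the earlier code) that locates the fixed points and evaluates the integrals in Eqs. (9) and (10) numerically:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

# Same hypothetical bistable rates as in the earlier sketches.
f = lambda n: 2.0 + 48.0 * n**4 / (25.0**4 + n**4)
g = lambda n: 1.0 * n

# Fixed points of f(n) = g(n): stable n0, unstable n*, stable n1 (brackets chosen by inspection).
fixed = [brentq(lambda n: f(n) - g(n), a, b) for a, b in [(0.1, 10.0), (10.0, 35.0), (35.0, 60.0)]]
n0, n_star, n1 = fixed

# Eqs. (9)-(10): activation energies in units of k_B T.
F_up = 2.0 * quad(lambda n: (g(n) - f(n)) / (g(n) + f(n)), n0, n_star)[0]
F_down = 2.0 * quad(lambda n: (f(n) - g(n)) / (f(n) + g(n)), n_star, n1)[0]

print(f"n0 = {n0:.1f}, n* = {n_star:.1f}, n1 = {n1:.1f}")
print(f"barriers / kT: up = {F_up:.1f}, down = {F_down:.1f}; bound 2(n* - n0) = {2 * (n_star - n0):.1f}")
```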

53 The use of optimal path ideas in chemical kinetics has a long history, going back at least to Onsager. [sent-72, score-0.184]

54 For equations of the general form dn/dt = −V′_eff(n) + ξ(t), (11) with ⟨ξ(t)ξ(t′)⟩ = 2 T_eff(t) δ(t − t′), the probability distribution for trajectories P[n(t)] can be written as [10] P[n(t)] ∝ exp(−S[n(t)]), (12) with S[n(t)] = ∫ dt [1/(4 T_eff(t))] [ṅ(t) + V′_eff(n(t))]² − (1/2) ∫ dt V″_eff(n(t)). (13) [sent-75, score-0.08]

55 If the temperature T_eff is small, then the trajectories that minimize the action should be determined primarily by minimizing the first term in Eq. [sent-76, score-0.198]

56 Identifying the effective potential and temperature as above, the relevant term is (1/2) ∫ dt [ṅ − f(n) + g(n)]² / [f(n) + g(n)] = (1/2) ∫ dt ṅ² / [f(n) + g(n)] + (1/2) ∫ dt [f(n) − g(n)]² / [f(n) + g(n)] − ∫ dt ṅ [f(n) − g(n)] / [f(n) + g(n)]. (14) [sent-78, score-0.287]

57 We are searching for trajectories which take n(t) from a stable point n0 where f(n0) = g(n0) through the unstable point n* where f and g are again equal but the derivative of their difference (the curvature of the potential) has changed sign. [sent-79, score-0.336]

58 First we note that along any trajectory from n0 to n* we can simplify the third term in Eq. (14). [sent-82, score-0.034]

59 ∫ dt ṅ [f(n) − g(n)] / [f(n) + g(n)] = ∫_{n0}^{n*} dn [f(n) − g(n)] / [f(n) + g(n)]. (15) This term thus depends on the endpoints of the trajectory and not on the path, and therefore cannot contribute to the structure of the optimal path. [sent-84, score-0.07]

60 In the analogy to mechanics, the first two terms are equivalent to the (Euclidean) action for a particle with position-dependent mass in a potential; this means that along extremal trajectories there is a conserved energy E = (1/2) ṅ² / [f(n) + g(n)] − (1/2) [f(n) − g(n)]² / [f(n) + g(n)]. (16) [sent-85, score-0.25]

61 At the endpoints of the trajectory we have ṅ = 0 and f(n) = g(n), so we are looking for zero-energy trajectories, along which ṅ(t) = ±[f(n(t)) − g(n(t))]. [sent-86, score-0.126]

62 Both the 'transition state' and the optimal path method involve approximations, but if the noise is not too large the approximations are good and the results of the two methods agree. [sent-90, score-0.116]

63 Yet another approach is to solve the master equation (2) directly, and again one gets the same answer for the switching rate when the noise is small, as expected since all the different approaches are equivalent if we make consistent approximations. [sent-91, score-0.358]
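
For a one-variable scheme this direct route is easy to check, since the stationary solution of the master equation (2) satisfies detailed balance, P(n+1)/P(n) = f(n)/g(n+1). A sketch, again with the hypothetical rates used in the earlier code (the window boundaries below are chosen by eye for these rates):

```python
import numpy as np

# Same hypothetical bistable rates as in the earlier sketches.
f = lambda n: 2.0 + 48.0 * n**4 / (25.0**4 + n**4)
g = lambda n: 1.0 * n

# Exact stationary distribution of Eq. (2) from detailed balance, accumulated in log space.
n_max = 80
logP = np.zeros(n_max + 1)
for n in range(n_max):
    logP[n + 1] = logP[n] + np.log(f(n) / g(n + 1))

n0 = int(np.argmax(logP[:15]))             # lower stable state
n_star = 15 + int(np.argmin(logP[15:35]))  # transition state
n1 = 35 + int(np.argmax(logP[35:]))        # upper stable state

print("stable states:", n0, n1, "  transition state:", n_star)
print("log occupancy ratios (Boltzmann factors):",
      logP[n0] - logP[n_star], logP[n1] - logP[n_star])
```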

64 It is much more work to find the prefactors of the rates, but we are concerned here with orders of magnitude, and hence the prefactors aren't so important. [sent-92, score-0.11]

65 The integrands in Eqs. (9, 10) are bounded by one, so the activation energy (in units of the thermal energy kBT) is bounded by twice the change in the number of molecules. [sent-94, score-0.166]

66 Translating back to the spontaneous switching rates, the result is that the noise-driven switching time is longer than the relaxation time after switching by a factor that is bounded: the ratio (spontaneous switching time)/(relaxation time after switching) is at most exponential in Δn, [sent-95, score-1.176]

67 where Δn is the change in the number of molecules required to go from one stable 'switched' state to the other. [sent-98, score-0.717]

68 Imagine that we have a reaction scheme in which the difference between the two stable states corresponds to roughly 25 molecules. [sent-99, score-0.371]

69 Then it is possible to have a Boltzmann factor of up to exp(25) ~ 10^10. [sent-100, score-0.095]

70 Usually we think of this as a limit to stability: with 25 molecules we can have a Boltzmann factor of no more than ~ 10^10. [sent-101, score-0.581]

71 But here I want to emphasize the positive statement that there exist kinetic schemes in which just 25 molecules would be sufficient to have this level of stability. [sent-102, score-0.771]

72 This corresponds to years per millisecond: with twenty five molecules, a biochemical switch that can flip in milliseconds can be stable for years. [sent-103, score-0.785]
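
A quick sanity check of the "years per millisecond" claim (my arithmetic, not the paper's):

```python
import math

ratio = math.exp(25)                  # Boltzmann factor quoted above, ~7e10
switch_time_s = 1e-3                  # a millisecond switching time
lifetime_years = ratio * switch_time_s / 3.15e7
print(f"exp(25) = {ratio:.1e}; stable lifetime ~ {lifetime_years:.1f} years")  # ~2.3 years
```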

73 Real chemical reaction schemes will not saturate this bound, but certainly such stability is possible with roughly 100 molecules. [sent-104, score-0.377]

74 The genetic switch in λ phage operates with roughly 100 copies of the repressor molecules, and even in this simple system there is extreme stability: the genetic switch is flipped spontaneously only once in 10^5 generations of the host bacterium [2]. [sent-105, score-0.775]

75 Kinetic schemes with greater cooperativity get closer to the bound, achieving greater stability for the same number of molecules. [sent-106, score-0.153]

76 In electronics, the construction of digital elements provides insulation against fluctuations on a microscopic scale and allows a separation between the logical and physical design of a large system. [sent-107, score-0.036]

77 We see that, once a cell has access to several tens of molecules, it is possible to construct 'digital' switch elements with dynamics that are no longer significantly affected by microscopic fluctuations. [sent-108, score-0.361]

78 Furthermore, weak interactions of these molecules with other cellular components cannot change the basic 'states' of the switch, although these interactions can couple state changes to other events. [sent-109, score-0.558]

79 The importance of this 'digitization' on the scale of 10 -100 molecules is illustrated by different models for pattern formation in development. [sent-110, score-0.486]

80 In this picture, the spatial structure of the pattern is linked directly to physical properties of the molecules. [sent-112, score-0.036]

81 An alternative is that each spatial location is labelled by a set of discrete possible states, and patterns evolve out of the 'automaton' rules by which each location changes state in relation to the neighboring states. [sent-113, score-0.072]

82 In this picture states and rules are more abstract, and the dynamics of pattern formation is really at a different level of description from the molecular dynamics of chemical reactions and diffusion. [sent-114, score-0.557]

83 Reliable implementations of automaton rules apparently are accessible as soon as the relevant chemical reactions involve a few dozen molecules. [sent-115, score-0.383]

84 Biochemical switches have been reconstituted in vitro, but I am not aware of any attempts to verify that stable switching is possible with small numbers of molecules. [sent-116, score-0.612]

85 It would be most interesting to study model systems in which one could confine and monitor sufficiently few molecules that it becomes possible to observe spontaneous switching, that is, the breakdown of stability. [sent-117, score-0.622]

86 Although genetic switches have certain advantages, even the simplest systems would require full enzymatic apparatus for gene expression (but see Ref. [sent-118, score-0.519]

87 [16] for recent progress on controllable in vitro expression systems). [sent-119, score-0.087]

88 Kinase switches are much simpler, since they can be constructed from just a few proteins and can be triggered by calcium; caged calcium allows for an optical pulse to serve as input. [sent-120, score-0.465]

89 At reasonable protein concentrations, 10-100 molecules are found in a volume of roughly 1 μm³. [sent-121, score-0.541]

90 μm, such that solutions of kinase and accessory proteins would switch stably in the larger cells but exhibit instability and spontaneous switching in the smaller cells. [sent-124, score-0.773]
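
A quick check of the numbers quoted above (my arithmetic; the two concentrations are only illustrative):

```python
N_A = 6.022e23          # molecules per mole
V_liters = 1e-15        # one cubic micron expressed in liters
for conc_nM in (17, 170):
    copies = conc_nM * 1e-9 * N_A * V_liters
    print(f"{conc_nM} nM  ->  ~{copies:.0f} molecules per cubic micron")
```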

91 A related idea would be to construct vesicles containing ligand-gated ion channels which can conduct calcium, and then have inside the vesicle enzymes for synthesis and degradation of the ligand which are calcium sensitive. [sent-126, score-0.53]

92 The cGMP channels of rod photoreceptors are an example, and in rods the cyclase synthesizing cGMP is calcium sensitive, but the sign is wrong to make a switch [17]; presumably this could be solved by appropriate mixing and matching of protein components from different cells. [sent-127, score-0.487]

93 (Footnote 2) Note also that reactions involving polymer synthesis (mRNA from DNA or protein from mRNA) are not 'elementary' reactions in the sense described by Eq. [sent-128, score-0.985]

94 Synthesis of a single mRNA molecule involves thousands of steps, each of which occurs (conditionally) at constant probability per unit time, and so the noise in the overall synthesis reaction is very different. [sent-130, score-0.312]

95 Thus there is some subtlety in trying to relate a simple model to the complex sequence of reactions involved in gene expression. [sent-132, score-0.343]

96 This combination of circumstances would make experiments on a minimal, in vitro genetic switch especially interesting. [sent-134, score-0.427]

97 In such a vesicle, the different stable states would be distinguished by different levels of internal calcium (as with adaptation states in the rod), and these could be read out optically using calcium indicators; caged calcium would again provide an optical input to flip the switch. [sent-135, score-0.662]

98 Amusingly, a close-packed array of such vesicles with ~100 nm dimension would provide an optically addressable and writable memory with storage density comparable to current RAM, albeit with much slower switching. [sent-136, score-0.176]

99 In summary, it should be possible to build stable biochemical switches from a few tens of molecules, and it seems likely that nature makes use of these. [sent-137, score-0.687]

100 To test our understanding of stability we have to construct systems which cross the threshold for observable instabilities, and this seems accessible experimentally in several systems. [sent-138, score-0.123]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('molecules', 0.486), ('switch', 0.244), ('biochemical', 0.237), ('reactions', 0.232), ('switches', 0.214), ('switching', 0.203), ('stable', 0.195), ('kinetic', 0.182), ('langevin', 0.148), ('calcium', 0.146), ('kinase', 0.127), ('temperature', 0.118), ('chemical', 0.115), ('reaction', 0.109), ('spontaneous', 0.099), ('genetic', 0.091), ('stability', 0.087), ('synthesis', 0.083), ('noise', 0.083), ('trajectories', 0.08), ('effective', 0.074), ('boltzmann', 0.073), ('particle', 0.068), ('states', 0.067), ('schemes', 0.066), ('diffusion', 0.066), ('brownian', 0.066), ('arkin', 0.063), ('molecular', 0.063), ('phage', 0.063), ('proteins', 0.063), ('teff', 0.063), ('species', 0.061), ('unstable', 0.061), ('gene', 0.061), ('ff', 0.057), ('energy', 0.056), ('potential', 0.055), ('mcadams', 0.055), ('vitro', 0.055), ('prefactors', 0.055), ('protein', 0.055), ('rv', 0.055), ('thermal', 0.054), ('ion', 0.054), ('dn', 0.052), ('involved', 0.05), ('motion', 0.047), ('analogy', 0.046), ('relaxation', 0.043), ('built', 0.043), ('activat', 0.042), ('apparatus', 0.042), ('blackwell', 0.042), ('caged', 0.042), ('cgmp', 0.042), ('concentration', 0.042), ('concentrations', 0.042), ('dykman', 0.042), ('enzymatic', 0.042), ('enzymes', 0.042), ('flipped', 0.042), ('jdt', 0.042), ('jumping', 0.042), ('kennedy', 0.042), ('ligand', 0.042), ('lisman', 0.042), ('mrna', 0.042), ('optically', 0.042), ('polymer', 0.042), ('polymerization', 0.042), ('ptashne', 0.042), ('rod', 0.042), ('vesicle', 0.042), ('vesicles', 0.042), ('tens', 0.041), ('dt', 0.04), ('dynamics', 0.04), ('factor', 0.04), ('nl', 0.039), ('would', 0.037), ('per', 0.037), ('endpoints', 0.036), ('cellular', 0.036), ('master', 0.036), ('libchaber', 0.036), ('accessible', 0.036), ('flip', 0.036), ('kinetics', 0.036), ('microscopic', 0.036), ('milliseconds', 0.036), ('spatial', 0.036), ('nd', 0.036), ('state', 0.036), ('trajectory', 0.034), ('transition', 0.034), ('path', 0.033), ('variations', 0.033), ('quantum', 0.033), ('expression', 0.032)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 1.0000005 125 nips-2000-Stability and Noise in Biochemical Switches

Author: William Bialek

Abstract: Many processes in biology, from the regulation of gene expression in bacteria to memory in the brain, involve switches constructed from networks of biochemical reactions. Crucial molecules are present in small numbers, raising questions about noise and stability. Analysis of noise in simple reaction schemes indicates that switches stable for years and switchable in milliseconds can be built from fewer than one hundred molecules. Prospects for direct tests of this prediction, as well as implications, are discussed. 1

2 0.13438335 88 nips-2000-Multiple Timescales of Adaptation in a Neural Code

Author: Adrienne L. Fairhall, Geoffrey D. Lewen, William Bialek, Robert R. de Ruyter van Steveninck

Abstract: Many neural systems extend their dynamic range by adaptation. We examine the timescales of adaptation in the context of dynamically modulated rapidly-varying stimuli, and demonstrate in the fly visual system that adaptation to the statistical ensemble of the stimulus dynamically maximizes information transmission about the time-dependent stimulus. Further, while the rate response has long transients, the adaptation takes place on timescales consistent with optimal variance estimation.

3 0.10334257 80 nips-2000-Learning Switching Linear Models of Human Motion

Author: Vladimir Pavlovic, James M. Rehg, John MacCormick

Abstract: The human figure exhibits complex and rich dynamic behavior that is both nonlinear and time-varying. Effective models of human dynamics can be learned from motion capture data using switching linear dynamic system (SLDS) models. We present results for human motion synthesis, classification, and visual tracking using learned SLDS models. Since exact inference in SLDS is intractable, we present three approximate inference algorithms and compare their performance. In particular, a new variational inference algorithm is obtained by casting the SLDS model as a Dynamic Bayesian Network. Classification experiments show the superiority of SLDS over conventional HMM's for our problem domain.

4 0.084722616 98 nips-2000-Partially Observable SDE Models for Image Sequence Recognition Tasks

Author: Javier R. Movellan, Paul Mineiro, Ruth J. Williams

Abstract: This paper explores a framework for recognition of image sequences using partially observable stochastic differential equation (SDE) models. Monte-Carlo importance sampling techniques are used for efficient estimation of sequence likelihoods and sequence likelihood gradients. Once the network dynamics are learned, we apply the SDE models to sequence recognition tasks in a manner similar to the way Hidden Markov models (HMMs) are commonly applied. The potential advantage of SDEs over HMMS is the use of continuous state dynamics. We present encouraging results for a video sequence recognition task in which SDE models provided excellent performance when compared to hidden Markov models. 1

5 0.068555698 42 nips-2000-Divisive and Subtractive Mask Effects: Linking Psychophysics and Biophysics

Author: Barbara Zenger, Christof Koch

Abstract: We describe an analogy between psychophysically measured effects in contrast masking, and the behavior of a simple integrate-andfire neuron that receives time-modulated inhibition. In the psychophysical experiments, we tested observers ability to discriminate contrasts of peripheral Gabor patches in the presence of collinear Gabor flankers. The data reveal a complex interaction pattern that we account for by assuming that flankers provide divisive inhibition to the target unit for low target contrasts, but provide subtractive inhibition to the target unit for higher target contrasts. A similar switch from divisive to subtractive inhibition is observed in an integrate-and-fire unit that receives inhibition modulated in time such that the cell spends part of the time in a high-inhibition state and part of the time in a low-inhibition state. The similarity between the effects suggests that one may cause the other. The biophysical model makes testable predictions for physiological single-cell recordings. 1 Psychophysics Visual images of Gabor patches are thought to excite a small and specific subset of neurons in the primary visual cortex and beyond. By measuring psychophysically in humans the contrast detection and discrimination thresholds of peripheral Gabor patches, one can estimate the sensitivity of this subset of neurons. Furthermore, spatial interactions between different neuronal populations can be probed by testing the effects of additional Gabor patches (masks) on performance. Such experiments have revealed a highly configuration-specific pattern of excitatory and inhibitory spatial interactions [1, 2]. 1.1 Methods Two vertical Gabor patches with a spatial frequency of 4cyc/deg were presented at 4 deg eccentricity left and right of fixation, and observers had to report which patch had the higher contrast (spatial 2AFC). In the

6 0.063683882 76 nips-2000-Learning Continuous Distributions: Simulations With Field Theoretic Priors

7 0.061104223 100 nips-2000-Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks

8 0.060855597 21 nips-2000-Algorithmic Stability and Generalization Performance

9 0.058303174 64 nips-2000-High-temperature Expansions for Learning Models of Nonnegative Data

10 0.053915262 13 nips-2000-A Tighter Bound for Graphical Models

11 0.053423297 146 nips-2000-What Can a Single Neuron Compute?

12 0.051923003 38 nips-2000-Data Clustering by Markovian Relaxation and the Information Bottleneck Method

13 0.051638681 82 nips-2000-Learning and Tracking Cyclic Human Motion

14 0.050724268 14 nips-2000-A Variational Mean-Field Theory for Sigmoidal Belief Networks

15 0.050627615 83 nips-2000-Machine Learning for Video-Based Rendering

16 0.049882587 142 nips-2000-Using Free Energies to Represent Q-values in a Multiagent Reinforcement Learning Task

17 0.048130028 72 nips-2000-Keeping Flexible Active Contours on Track using Metropolis Updates

18 0.047822688 129 nips-2000-Temporally Dependent Plasticity: An Information Theoretic Account

19 0.045518838 124 nips-2000-Spike-Timing-Dependent Learning for Oscillatory Networks

20 0.044288911 137 nips-2000-The Unscented Particle Filter


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, 0.163), (1, -0.095), (2, -0.018), (3, -0.055), (4, 0.031), (5, -0.041), (6, 0.021), (7, 0.015), (8, -0.035), (9, -0.026), (10, 0.063), (11, 0.047), (12, 0.045), (13, 0.045), (14, -0.073), (15, -0.136), (16, 0.131), (17, -0.125), (18, 0.046), (19, -0.132), (20, 0.083), (21, 0.103), (22, 0.053), (23, -0.093), (24, -0.089), (25, 0.059), (26, -0.054), (27, -0.176), (28, 0.007), (29, -0.211), (30, -0.02), (31, 0.02), (32, -0.113), (33, 0.024), (34, 0.236), (35, 0.143), (36, -0.121), (37, -0.022), (38, 0.124), (39, -0.035), (40, -0.009), (41, -0.067), (42, -0.054), (43, -0.042), (44, -0.07), (45, 0.03), (46, 0.057), (47, 0.113), (48, -0.006), (49, 0.012)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.95960128 125 nips-2000-Stability and Noise in Biochemical Switches

Author: William Bialek

Abstract: Many processes in biology, from the regulation of gene expression in bacteria to memory in the brain, involve switches constructed from networks of biochemical reactions. Crucial molecules are present in small numbers, raising questions about noise and stability. Analysis of noise in simple reaction schemes indicates that switches stable for years and switchable in milliseconds can be built from fewer than one hundred molecules. Prospects for direct tests of this prediction, as well as implications, are discussed. 1

2 0.50145286 80 nips-2000-Learning Switching Linear Models of Human Motion

Author: Vladimir Pavlovic, James M. Rehg, John MacCormick

Abstract: The human figure exhibits complex and rich dynamic behavior that is both nonlinear and time-varying. Effective models of human dynamics can be learned from motion capture data using switching linear dynamic system (SLDS) models. We present results for human motion synthesis, classification, and visual tracking using learned SLDS models. Since exact inference in SLDS is intractable, we present three approximate inference algorithms and compare their performance. In particular, a new variational inference algorithm is obtained by casting the SLDS model as a Dynamic Bayesian Network. Classification experiments show the superiority of SLDS over conventional HMM's for our problem domain.

3 0.47935128 88 nips-2000-Multiple Timescales of Adaptation in a Neural Code

Author: Adrienne L. Fairhall, Geoffrey D. Lewen, William Bialek, Robert R. de Ruyter van Steveninck

Abstract: Many neural systems extend their dynamic range by adaptation. We examine the timescales of adaptation in the context of dynamically modulated rapidly-varying stimuli, and demonstrate in the fly visual system that adaptation to the statistical ensemble of the stimulus dynamically maximizes information transmission about the time-dependent stimulus. Further, while the rate response has long transients, the adaptation takes place on timescales consistent with optimal variance estimation.

4 0.38891098 98 nips-2000-Partially Observable SDE Models for Image Sequence Recognition Tasks

Author: Javier R. Movellan, Paul Mineiro, Ruth J. Williams

Abstract: This paper explores a framework for recognition of image sequences using partially observable stochastic differential equation (SDE) models. Monte-Carlo importance sampling techniques are used for efficient estimation of sequence likelihoods and sequence likelihood gradients. Once the network dynamics are learned, we apply the SDE models to sequence recognition tasks in a manner similar to the way Hidden Markov models (HMMs) are commonly applied. The potential advantage of SDEs over HMMS is the use of continuous state dynamics. We present encouraging results for a video sequence recognition task in which SDE models provided excellent performance when compared to hidden Markov models. 1

5 0.34370947 83 nips-2000-Machine Learning for Video-Based Rendering

Author: Arno Schödl, Irfan A. Essa

Abstract: We present techniques for rendering and animation of realistic scenes by analyzing and training on short video sequences. This work extends the new paradigm for computer animation, video textures, which uses recorded video to generate novel animations by replaying the video samples in a new order. Here we concentrate on video sprites, which are a special type of video texture. In video sprites, instead of storing whole images, the object of interest is separated from the background and the video samples are stored as a sequence of alpha-matted sprites with associated velocity information. They can be rendered anywhere on the screen to create a novel animation of the object. We present methods to create such animations by finding a sequence of sprite samples that is both visually smooth and follows a desired path. To estimate visual smoothness, we train a linear classifier to estimate visual similarity between video samples. If the motion path is known in advance, we use beam search to find a good sample sequence. We can specify the motion interactively by precomputing the sequence cost function using Q-Iearning.

6 0.30680007 25 nips-2000-Analysis of Bit Error Probability of Direct-Sequence CDMA Multiuser Demodulators

7 0.30664343 82 nips-2000-Learning and Tracking Cyclic Human Motion

8 0.27782091 48 nips-2000-Exact Solutions to Time-Dependent MDPs

9 0.2664094 42 nips-2000-Divisive and Subtractive Mask Effects: Linking Psychophysics and Biophysics

10 0.26021656 21 nips-2000-Algorithmic Stability and Generalization Performance

11 0.24613062 13 nips-2000-A Tighter Bound for Graphical Models

12 0.24283354 14 nips-2000-A Variational Mean-Field Theory for Sigmoidal Belief Networks

13 0.2362003 76 nips-2000-Learning Continuous Distributions: Simulations With Field Theoretic Priors

14 0.23273998 126 nips-2000-Stagewise Processing in Error-correcting Codes and Image Restoration

15 0.23080729 90 nips-2000-New Approaches Towards Robust and Adaptive Speech Recognition

16 0.2281452 8 nips-2000-A New Model of Spatial Representation in Multimodal Brain Areas

17 0.22791316 38 nips-2000-Data Clustering by Markovian Relaxation and the Information Bottleneck Method

18 0.21780412 142 nips-2000-Using Free Energies to Represent Q-values in a Multiagent Reinforcement Learning Task

19 0.20747249 64 nips-2000-High-temperature Expansions for Learning Models of Nonnegative Data

20 0.19850439 114 nips-2000-Second Order Approximations for Probability Models


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(10, 0.02), (17, 0.063), (18, 0.014), (32, 0.01), (33, 0.036), (36, 0.01), (42, 0.05), (45, 0.105), (55, 0.03), (60, 0.012), (62, 0.046), (65, 0.021), (67, 0.077), (69, 0.213), (75, 0.017), (76, 0.043), (79, 0.018), (81, 0.027), (90, 0.018), (91, 0.015), (94, 0.016), (97, 0.021)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.85477924 125 nips-2000-Stability and Noise in Biochemical Switches

Author: William Bialek

Abstract: Many processes in biology, from the regulation of gene expression in bacteria to memory in the brain, involve switches constructed from networks of biochemical reactions. Crucial molecules are present in small numbers, raising questions about noise and stability. Analysis of noise in simple reaction schemes indicates that switches stable for years and switchable in milliseconds can be built from fewer than one hundred molecules. Prospects for direct tests of this prediction, as well as implications, are discussed. 1

2 0.79544574 50 nips-2000-FaceSync: A Linear Operator for Measuring Synchronization of Video Facial Images and Audio Tracks

Author: Malcolm Slaney, Michele Covell

Abstract: FaceSync is an optimal linear algorithm that finds the degree of synchronization between the audio and image recordings of a human speaker. Using canonical correlation, it finds the best direction to combine all the audio and image data, projecting them onto a single axis. FaceSync uses Pearson's correlation to measure the degree of synchronization between the audio and image data. We derive the optimal linear transform to combine the audio and visual information and describe an implementation that avoids the numerical problems caused by computing the correlation matrices. 1 Motivation In many applications, we want to know about the synchronization between an audio signal and the corresponding image data. In a teleconferencing system, we might want to know which of the several people imaged by a camera is heard by the microphones; then, we can direct the camera to the speaker. In post-production for a film, clean audio dialog is often dubbed over the video; we want to adjust the audio signal so that the lip-sync is perfect. When analyzing a film, we want to know when the person talking is in the shot, instead of off camera. When evaluating the quality of dubbed films, we can measure of how well the translated words and audio fit the actor's face. This paper describes an algorithm, FaceSync, that measures the degree of synchronization between the video image of a face and the associated audio signal. We can do this task by synthesizing the talking face, using techniques such as Video Rewrite [1], and then comparing the synthesized video with the test video. That process, however, is expensive. Our solution finds a linear operator that, when applied to the audio and video signals, generates an audio-video-synchronization-error signal. The linear operator gathers information from throughout the image and thus allows us to do the computation inexpensively. Hershey and Movellan [2] describe an approach based on measuring the mutual information between the audio signal and individual pixels in the video. The correlation between the audio signal, x, and one pixel in the image y, is given by Pearson's correlation, r. The mutual information between these two variables is given by f(x,y) = -1/2 log(l-?). They create movies that show the regions of the video that have high correlation with the audio; 1. Currently at IBM Almaden Research, 650 Harry Road, San Jose, CA 95120. 2. Currently at Yes Video. com, 2192 Fortune Drive, San Jose, CA 95131. Standard Deviation of Testing Data FaceSync

3 0.71804756 118 nips-2000-Smart Vision Chip Fabricated Using Three Dimensional Integration Technology

Author: Hiroyuki Kurino, M. Nakagawa, Kang Wook Lee, Tomonori Nakamura, Yuusuke Yamada, Ki Tae Park, Mitsumasa Koyanagi

Abstract: The smart VISIOn chip has a large potential for application in general purpose high speed image processing systems . In order to fabricate smart vision chips including photo detector compactly, we have proposed the application of three dimensional LSI technology for smart vision chips. Three dimensional technology has great potential to realize new neuromorphic systems inspired by not only the biological function but also the biological structure. In this paper, we describe our three dimensional LSI technology for neuromorphic circuits and the design of smart vision chips .

4 0.56310189 72 nips-2000-Keeping Flexible Active Contours on Track using Metropolis Updates

Author: Trausti T. Kristjansson, Brendan J. Frey

Abstract: Condensation, a form of likelihood-weighted particle filtering, has been successfully used to infer the shapes of highly constrained

5 0.53013164 75 nips-2000-Large Scale Bayes Point Machines

Author: Ralf Herbrich, Thore Graepel

Abstract: The concept of averaging over classifiers is fundamental to the Bayesian analysis of learning. Based on this viewpoint, it has recently been demonstrated for linear classifiers that the centre of mass of version space (the set of all classifiers consistent with the training set) - also known as the Bayes point - exhibits excellent generalisation abilities. However, the billiard algorithm as presented in [4] is restricted to small sample size because it requires o (m 2 ) of memory and 0 (N . m2 ) computational steps where m is the number of training patterns and N is the number of random draws from the posterior distribution. In this paper we present a method based on the simple perceptron learning algorithm which allows to overcome this algorithmic drawback. The method is algorithmically simple and is easily extended to the multi-class case. We present experimental results on the MNIST data set of handwritten digits which show that Bayes point machines (BPMs) are competitive with the current world champion, the support vector machine. In addition, the computational complexity of BPMs can be tuned by varying the number of samples from the posterior. Finally, rejecting test points on the basis of their (approximative) posterior probability leads to a rapid decrease in generalisation error, e.g. 0.1% generalisation error for a given rejection rate of 10%. 1

6 0.47224391 146 nips-2000-What Can a Single Neuron Compute?

7 0.46314144 104 nips-2000-Processing of Time Series by Neural Circuits with Biologically Realistic Synaptic Dynamics

8 0.46231085 88 nips-2000-Multiple Timescales of Adaptation in a Neural Code

9 0.45306373 129 nips-2000-Temporally Dependent Plasticity: An Information Theoretic Account

10 0.44177374 106 nips-2000-Propagation Algorithms for Variational Bayesian Learning

11 0.44064796 134 nips-2000-The Kernel Trick for Distances

12 0.4402875 133 nips-2000-The Kernel Gibbs Sampler

13 0.43860659 98 nips-2000-Partially Observable SDE Models for Image Sequence Recognition Tasks

14 0.43158671 46 nips-2000-Ensemble Learning and Linear Response Theory for ICA

15 0.4303391 9 nips-2000-A PAC-Bayesian Margin Bound for Linear Classifiers: Why SVMs work

16 0.42718443 49 nips-2000-Explaining Away in Weight Space

17 0.42705956 69 nips-2000-Incorporating Second-Order Functional Knowledge for Better Option Pricing

18 0.42597955 64 nips-2000-High-temperature Expansions for Learning Models of Nonnegative Data

19 0.42523515 20 nips-2000-Algebraic Information Geometry for Learning Machines with Singularities

20 0.4251335 13 nips-2000-A Tighter Bound for Graphical Models