
184 nips-2003-The Diffusion-Limited Biochemical Signal-Relay Channel


Source: pdf

Author: Peter J. Thomas, Donald J. Spencer, Sierra K. Hampton, Peter Park, Joseph P. Zurkus

Abstract: Biochemical signal-transduction networks are the biological information-processing systems by which individual cells, from neurons to amoebae, perceive and respond to their chemical environments. We introduce a simplified model of a single biochemical relay and analyse its capacity as a communications channel. A diffusible ligand is released by a sending cell and received by binding to a transmembrane receptor protein on a receiving cell. This receptor-ligand interaction creates a nonlinear communications channel with non-Gaussian noise. We model this channel numerically and study its response to input signals of different frequencies in order to estimate its channel capacity. Stochastic effects introduced in both the diffusion process and the receptor-ligand interaction give the channel low-pass characteristics. We estimate the channel capacity using a water-filling formula adapted from the additive white-noise Gaussian channel.

∗ Corresponding author: pjthomas@salk.edu   † dspencer@salk.edu

1 Introduction: The Diffusion-Limited Biochemical Signal-Relay Channel

The term signal-transduction network refers to the web of biochemical interactions by which single cells process sensory information about their environment. Just as neural networks underlie the interaction of many multicellular organisms with their environments, these biochemical networks allow cells to perceive, evaluate and react to chemical stimuli [1]. Examples include chemical signaling across the synaptic cleft, calcium signaling within the postsynaptic dendritic spine, pathogen localization by the immune system, growth-cone guidance during neuronal development, phototransduction in the retina, rhythmic chemotactic signaling in social amoebae, and many others. The introduction of quantitative measurements of the distribution and activation of chemical reactants within living cells [2] has prepared the way for detailed quantitative analysis of their properties, aided by numerical simulations. One of the key questions that can now be addressed concerns the fundamental limits to cell-to-cell communication using chemical signaling.

To communicate via chemical signaling, cells must contend with the unreliability inherent in chemical diffusion and in the interactions of limited numbers of signaling molecules and receptors [3]. We study a simplified situation in which one cell secretes a signaling molecule, or ligand, which can be detected by a receptor on another cell. Limiting ourselves to one ligand-receptor interaction allows a treatment of this communications system using elementary concepts from information theory. The information capacity of this fundamental signaling system is the maximum of the mutual information between the ensemble of input signals, the time-varying rate of ligand secretion s(t), and the output signal r(t), a piecewise continuous function taking the values one or zero as the receptor is bound to ligand or unbound. Using numerical simulation we can estimate the channel capacity via a standard "water-filling" information measure [4], as described below.

2 Methods: Numerical Simulation of the Biochemical Relay

We simulate a biochemical relay system as follows: in a two-dimensional rectangular volume V measuring 5 micrometers by 10 micrometers, we locate two cells spaced 5 micrometers apart. Cell A emits ligand molecules from location xs = [2.5µ, 2.5µ] with rate s(t) ≥ 0; they diffuse with a given diffusion constant D and decay at a rate α. Both secretion and decay occur as random Poisson processes, and diffusion is realized as a discrete random walk with Gaussian-distributed displacements. The boundaries of V are taken to be reflecting. We track the positions of each of N particles {xi, i = 1, ..., N} at intervals of ∆t = 1 msec.
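The particle-level dynamics just described (Poisson secretion and decay, Gaussian random-walk steps, reflecting walls, 1 msec updates) can be sketched in a few lines. The following Python fragment is a minimal illustration under the stated parameter values; the function and variable names, the random-number generator, and the per-step survival probability exp(−α∆t) are our own choices, not the authors' code.

```python
import numpy as np

# Minimal sketch of one 1-ms update of the particle simulation described above.
# Parameter values follow the text (D = 0.25 um^2/ms, alpha = 1/s, dt = 1 ms);
# names and the exact update structure are illustrative, not the authors' code.
rng = np.random.default_rng(0)

D, alpha, dt = 0.25, 1.0e-3, 1.0            # um^2/ms, 1/ms, ms
box = np.array([10.0, 5.0])                 # rectangular volume V, um
x_source = np.array([2.5, 2.5])             # cell A (ligand source), um

def step(particles, s_t):
    """Advance all ligand positions by one timestep.

    particles: (N, 2) array of positions in um; s_t: secretion rate in molecules/ms.
    """
    # Poisson secretion at cell A
    n_new = rng.poisson(s_t * dt)
    particles = np.vstack([particles, np.tile(x_source, (n_new, 1))])
    # Poisson decay: each molecule survives this step with probability exp(-alpha*dt)
    particles = particles[rng.random(len(particles)) < np.exp(-alpha * dt)]
    # Diffusion: Gaussian-distributed displacement, variance 2*D*dt per coordinate
    particles = particles + rng.normal(0.0, np.sqrt(2.0 * D * dt), particles.shape)
    # Reflecting boundaries of V
    particles = np.abs(particles)                      # reflect at the x = 0 and y = 0 walls
    particles = box - np.abs(box - particles)          # reflect at the far walls
    return particles
```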
The local concentration in a neighborhood of size σ around a location x is given by the convolution

    ĉ(x, t) = ∫_V Σ_{i=1..N} δ(x′ − xi) g(x − x′, σ) dx′        (1)

where g(·, σ) is a normalized Gaussian distribution in the plane, with mean 0 and variance σ². The motions of the individual particles cause ĉ(x, t) to fluctuate about the mean concentration, causing the local concentration at cell B, ĉ(xr, t), to be a noisy, low-pass filtered version of the original signal s(t) (see Figure 1).

Cell B, located at xr = [7.5µ, 2.5µ], registers the presence of ligand through binding and unbinding transitions, which form a two-state Markov process with time-varying transition rates. Given an unbound receptor, the binding transition happens at a rate that depends on the ligand concentration around the receptor: k+ ĉ(xr, t). The size of the neighborhood σ reflects the range of the receptor, with binding most likely in a small region close to xr. Once the receptor is bound to a ligand molecule, no more binding events occur until the receptor releases the ligand. The receiver is insensitive to fluctuations in ĉ(xr, t) while it is in the bound state (see Figure 1). The unbinding transition occurs with a fixed rate k−.

Figure 1: Biochemical Signaling Simulation. Top: Cell A secretes a signaling molecule (red dots) with a time-varying rate s(t). Molecules diffuse throughout the two-dimensional volume, leading to locally fluctuating concentrations that carry a corrupted version of the signal. Molecules within a neighborhood of cell B can bind to a receptor molecule, giving a received signal r(t) ∈ {0, 1}. Bottom Left: Input signal: mean instantaneous rate of molecule release (thousands of molecules per second). Molecule release is a Poisson process with time-varying rate. Bottom Center: Local concentration fluctuations, as seen by cell B, indicated by the number of molecules within 0.2 microns of the receptor. The receptor is sensitive to fluctuations in local concentration only while it is unbound; while the receptor is bound, it does not register changes in the local concentration (indicated by constant plateaus corresponding to intervals when r(t) = 1 in the bottom right panel). Bottom Right: Output signal r(t). At each moment the receptor is either bound (1) or unbound (0). The receiver output is a piecewise constant function with a finite number of transitions.

For concreteness, we take values for D, α, k−, k+, and σ appropriate for cyclic AMP signaling between Dictyostelium amoebae, a model organism for chemical communication: D = 0.25 µ² msec⁻¹, α = 1 sec⁻¹, σ = 0.1µ, k− = 1 sec⁻¹, k+ = 2πσ² sec⁻¹. Kd = k−/k+ is the dissociation constant, the concentration at which the receptor is on average bound half the time. For the chosen values of the reaction constants k±, we have Kd ≈ 15.9 molecules/µ² ≈ 26.4 nMol, comparable to the most sensitive values reported for the cyclic AMP receptor [2]. At this concentration the volume V = 50 µ² contains about 800 signaling molecules, assuming a nominal depth of 1µ.
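A sketch of the receiver side follows: the Gaussian-kernel concentration estimate of Eq. (1) evaluated at the receptor location, and one ∆t update of the two-state binding/unbinding Markov process with rates k+ ĉ(xr, t) and k−. As above, the names and the per-step exponential transition probabilities are our own illustrative choices, not the authors' code.

```python
import numpy as np

# Sketch of the receiver: the Gaussian-kernel local concentration of Eq. (1)
# evaluated at the receptor, and one dt update of the two-state binding /
# unbinding Markov process.  Names and the per-step transition probabilities
# are our own illustrative choices, not the authors' code.
rng = np.random.default_rng(1)

sigma = 0.1                                   # receptor range, um
k_minus = 1.0e-3                              # unbinding rate, 1/ms
k_plus = 2.0 * np.pi * sigma**2 * 1.0e-3      # binding constant, um^2/ms (Kd = k-/k+ ~ 15.9 um^-2)
x_receptor = np.array([7.5, 2.5])             # cell B location, um
dt = 1.0                                      # ms

def local_concentration(particles, x):
    """Eq. (1): sum of normalized 2-D Gaussians centred on the particle positions."""
    d2 = np.sum((particles - x) ** 2, axis=1)
    return np.sum(np.exp(-d2 / (2.0 * sigma**2))) / (2.0 * np.pi * sigma**2)

def receptor_step(bound, particles):
    """One dt update of the receptor state r(t): 1 = bound, 0 = unbound."""
    if bound:
        # unbinding occurs at the fixed rate k-
        return 0 if rng.random() < 1.0 - np.exp(-k_minus * dt) else 1
    # binding occurs at rate k+ * c_hat(x_r, t); only the unbound receptor
    # is sensitive to concentration fluctuations
    c_hat = local_concentration(particles, x_receptor)
    return 1 if rng.random() < 1.0 - np.exp(-k_plus * c_hat * dt) else 0
```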
3 Results: Estimating Information Capacity via Frequency Response

Communications channels mediated by diffusion and ligand-receptor interaction are nonlinear with non-Gaussian noise. The expected value of the output signal, 0 ≤ E[r] < 1, is a sigmoidal function of the log concentration for a constant concentration c:

    E[r] = c / (c + Kd) = 1 / (1 + e^−(y−y₀))        (2)

where y = ln(c) and y₀ = ln(Kd). The mean response saturates for high concentrations, c ≫ Kd, and the noise statistics become pronouncedly Poissonian (rather than Gaussian) for low concentrations.

Several different kinds of stimuli can be used to characterize such a channel. The steady-state response to constant input reflects the static (equilibrium) transfer function. Concentrations ranging from 100 Kd to 0.01 Kd occupy 98% of the steady-state operating range, 0.99 > E[r] > 0.01 [5]. For a finite observation time T the actual fraction of time spent bound, r̄_T, is distributed about E[r] with a variance that depends on T. The biochemical relay may be used as a binary symmetric channel by randomly selecting a 'high' or 'low' secretion rate, and 'decoding' by setting a suitable threshold for r̄_T. As T increases, the variance of r̄_T and the probability of error decrease.

The binary symmetric channel makes only crude use of this signaling mechanism. Other possible communication schemes include sending all-or-none bursts of signaling molecule, as in synaptic transmission, or detecting discrete stepped responses. Here we use the frequency response of the channel as a way of estimating the information capacity of the biochemical channel. For an idealized linear channel with additive white Gaussian noise (AWGN channel) the channel capacity under a mean input power constraint P is given by the so-called "water-filling formula" [4],

    C = (1/2) ∫_{ωmin}^{ωmax} log₂ ( 1 + (ν − N(ω))₊ / N(ω) ) dω        (3)

given the constraining condition

    ∫_{ωmin}^{ωmax} (ν − N(ω))₊ dω ≤ P        (4)

where the constant ν is the sum of the noise and the signal power in the usable frequency range, N(ω) is the power of the additive noise at frequency ω, and (X)₊ indicates the positive part of X. The formula applies when each frequency band (ω, ω + dω) is subject to noise of power N(ω) independently of all other frequency bands, and reflects the optimal allocation of signal power S(ω) = (ν − N(ω))₊, with greater signal power invested in frequencies at which the noise power is smallest. The capacity C is in bits/second.
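A numerical version of the water-filling allocation in Eqs. (3)-(4) can be written for a sampled noise spectrum. The sketch below is ours, not part of the paper: it assumes N(ω) is given on a uniform grid with spacing dω (Hz) and finds the water level ν by bisection on the power constraint.

```python
import numpy as np

# Water-filling capacity of Eqs. (3)-(4) for a sampled noise spectrum N(w)
# on a uniform frequency grid with spacing dw (Hz), under a mean signal-power
# constraint P.  The bisection search for the water level nu is our own
# implementation detail; it is not described in the paper.
def waterfill_capacity(N, dw, P, iters=100):
    N = np.asarray(N, dtype=float)

    def power_used(nu):
        # total allocated signal power sum((nu - N(w))_+) * dw, Eq. (4)
        return np.sum(np.clip(nu - N, 0.0, None)) * dw

    lo, hi = 0.0, N.max() + P / dw           # bracket the water level
    for _ in range(iters):                   # bisection on the power constraint
        nu = 0.5 * (lo + hi)
        if power_used(nu) < P:
            lo = nu
        else:
            hi = nu
    S = np.clip(nu - N, 0.0, None)           # optimal allocation S(w) = (nu - N(w))_+
    return 0.5 * np.sum(np.log2(1.0 + S / N)) * dw   # Eq. (3), bits/second
```

For a flat noise floor (e.g. waterfill_capacity(np.full(100, 0.1), dw=0.01, P=0.15)) the allocation is uniform, as expected; structured noise spectra receive more signal power where the noise is smallest.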
For an input signal of finite duration T = 100 sec, we can independently specify the amplitudes and phases of its frequency components at ω = [0.01 Hz, 0.02 Hz, ..., 500 Hz], where 500 Hz is the Nyquist frequency given a 1 msec simulation timestep. Because the population of secreted signaling molecules decays exponentially with a time constant of 1/α = 1 sec, the concentration signal is unable to pass frequencies ω ≥ 1 Hz (see Figure 2), providing a natural high-frequency cutoff.

Figure 2: Frequency Response of Biochemical Relay Channel. The sending cell secreted signaling molecules at a mean rate of 1000 + 1000 sin(2πωt) molecules per second. From top to bottom, the input frequencies were 1.0, 0.5, 0.2, 0.1, 0.05, 0.02 and 0.01 Hz. The total signal duration was T = 100 seconds. Left Column: Total number of molecules in the volume. Attenuation of the original signal results from exponential decay of the signaling molecule population. Right Column: A one-second moving average of the output signal r(t), which takes the value one when the receptor molecule is bound to ligand, and zero when the receptor is unbound.

Figure 3: Frequency Transmission Spectrum. Noise power N(ω), calculated as the total power in r(t) − r̄ in all frequency components save the input frequency ω. Frequencies were binned in intervals of 0.01 Hz = 1/T. The maximum possible power in r(t) over all frequencies is 0.25; the power successfully transmitted by the channel is given by 0.25/N(ω). The lower curve is N(ω) for input signals of the form s(t) = 1000 + 1000 sin 2πωt, which uses the full dynamic range of the receptor. Decreasing the dynamic range used reduces the amount of power transmitted at the sending frequency: the upper curve is N(ω) for signals of the form s(t) = 1000 + 500 sin 2πωt.

For the AWGN channel the input and output signals share the same units (e.g. rms voltage); for the biological relay the input s(t) is in molecules/second while the output r(t) is a function with binary range {r = 0, r = 1}. The maximum of the mean output power for a binary function r(t) is (1/T) ∫₀ᵀ |r(t) − r̄|² dt ≤ 1/4. This total possible output power will be distributed between different frequencies depending on the frequency of the input. We wish to estimate the channel capacity by comparing the portion of the output power present in the sending frequency ω to the limiting output power 0.25. Therefore we set the total output power constant to ν = 0.25. Given a pure sinusoidal input signal s(t) = a₀ + a₁ sin(2πωt), we consider the power in the output spectrum at ω Hz to be the residual power from the input, and the rest of the power in the spectrum of r(t) to be analogous to the additive noise power spectrum N(ω) in the AWGN channel. We calculate N(ω) to be the total power of r(t) − r̄ in all frequency bands except ω. For signals of length T = 100 sec, the possible frequencies are discretized at intervals ∆ω = 0.01 Hz. Because the noise power N(ω) ≤ 0.25, the water-filling formula (3) for the capacity reduces to

    Cest = (1/2) ∫_{0.01 Hz}^{1 Hz} log₂ ( 0.25 / N(ω) ) dω.        (5)

As mentioned above, frequencies ω ≥ 1 Hz do not transmit any information about the signal (see Figure 2) and do not contribute to the capacity. We approximate this integral using linear interpolation of log₂(N(ω)) between the measured values at ω = [0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0] Hz (see Figure 3). This procedure gives an estimate of the channel capacity, Cest = 0.087 bits/second.
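The noise spectrum N(ω) and the reduced water-filling estimate of Eq. (5) can be computed from a simulated output trace r(t) roughly as follows. This is our own sketch: the periodogram normalization and the use of np.interp/np.trapz for the log-linear interpolation and the integral are assumptions, not the authors' published procedure.

```python
import numpy as np

# Sketch of the capacity estimate of Eq. (5).  N(w) is measured as all output
# power of r(t) - mean(r) outside the driving frequency, and log2(0.25 / N(w))
# is integrated over 0.01-1 Hz with linear interpolation of log2 N(w) between
# the measured frequencies.  Normalization and interpolation details are our
# assumptions, not the authors' procedure.
def noise_power(r, dt_s, omega):
    """Total power of r(t) - mean(r) in all 1/T-wide bins except omega (Hz)."""
    r = np.asarray(r, dtype=float) - np.mean(r)
    spec = 2.0 * np.abs(np.fft.rfft(r)) ** 2 / len(r) ** 2   # one-sided power per bin
    freqs = np.fft.rfftfreq(len(r), d=dt_s)
    return np.sum(spec[~np.isclose(freqs, omega)])

def capacity_estimate(omegas, N_values):
    """Eq. (5): 0.5 * integral over [0.01, 1] Hz of log2(0.25 / N(w)) dw.

    omegas (Hz, increasing) are the measured input frequencies; N_values the
    corresponding noise powers.
    """
    grid = np.arange(0.01, 1.0 + 1e-9, 0.01)                 # 0.01 Hz bins
    logN = np.interp(grid, omegas, np.log2(N_values))        # interpolate log2 N(w)
    return 0.5 * np.trapz(np.log2(0.25) - logN, grid)        # bits/second

# Example call with the paper's measured frequencies (the N values here are
# hypothetical placeholders, not the paper's data):
# capacity_estimate([0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0],
#                   [0.05, 0.06, 0.09, 0.13, 0.18, 0.22, 0.24])
```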
4 Discussion & Conclusions

Diffusion and the Markov switching between bound and unbound states create a low-pass filter that removes high-frequency information in the biochemical relay channel. A general Poisson-type communications channel, such as is commonly encountered in optical communications engineering, can achieve an arbitrarily large capacity by transmitting high frequencies and high amplitudes, unless bounded by a maximum or mean amplitude constraint [6]. In the biochemical channel, the effective input amplitude is naturally constrained by the saturation of the receptor at concentrations above the Kd, and high-frequency transmission is limited by the inherent dynamics of the Markov process. Therefore this channel has a finite capacity.

The channel capacity estimate we derived, Cest = 0.087 bits/second, seems quite low compared to signaling rates in the nervous system, requiring long signaling times to transfer information successfully. However, temporal dynamics in cellular systems can be quite deliberate; cell-cell communication in the social amoeba Dictyostelium, for example, is achieved by means of a carrier wave with a period of seven minutes. In addition, cells typically possess thousands of copies of the receptors for important signaling molecules, allowing for more complex detection schemes than those investigated here.

Our simplified treatment suggests several avenues for further work. For example, signal-transducing receptors often form Markov chains with more complicated dynamics reflecting many more than two states [7]. Also, the nonlinear nature of the channel is probably not well served by our additive noise approximation, and might be better suited to a treatment via multiplicative noise [8]. Whether cells engage in complicated temporal coding/decoding schemes, as has been proposed for neural information processing, or whether instead they achieve efficient communication by evolutionary matching of the noise characteristics of sender and receiver, remains to be investigated. We note that the dependence of the channel capacity C on such parameters as the system geometry, the diffusion and decay constants, the binding constants and the range of the receptor may shed light on evolutionary mechanisms and constraints on communication within cellular biological systems.

Acknowledgments

This work would not have been possible without the generous support of the Howard Hughes Medical Institute and the resources of the Computational Neurobiology Laboratory, Terrence J. Sejnowski, Director.

References

[1] Rappel, W.M., Thomas, P.J., Levine, H. & Loomis, W.F. (2002) Establishing Direction during Chemotaxis in Eukaryotic Cells. Biophysical Journal 83:1361-1367.
[2] Ueda, M., Sako, Y., Tanaka, T., Devreotes, P. & Yanagida, T. (2001) Single Molecule Analysis of Chemotactic Signaling in Dictyostelium Cells. Science 294:864-867.
[3] Detwiler, P.B., Ramanathan, S., Sengupta, A. & Shraiman, B.I. (2000) Engineering Aspects of Enzymatic Signal Transduction: Photoreceptors in the Retina. Biophysical Journal 79:2801-2817.
[4] Cover, T.M. & Thomas, J.A. (1991) Elements of Information Theory. New York: Wiley.
[5] Getz, W.M. & Lansky, P. (2001) Receptor Dissociation Constants and the Information Entropy of Membranes Coding Ligand Concentration. Chem. Senses 26:95-104.
[6] Frey, R.M. (1991) Information Capacity of the Poisson Channel. IEEE Transactions on Information Theory 37(2):244-256.
[7] Uteshev, V.V. & Pennefather, P.S. (1997) Analytical Description of the Activation of Multi-State Receptors by Continuous Neurotransmitter Signals at Brain Synapses. Biophysical Journal 72:1127-1134.
[8] Mitra, P.P. & Stark, J.B. (2001) Nonlinear limits to the information capacity of optical fibre communications. Nature 411:1027-1030.

Reference: text




similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('signaling', 0.375), ('receptor', 0.349), ('biochemical', 0.308), ('ligand', 0.26), ('channel', 0.235), ('molecules', 0.227), ('molecule', 0.206), ('capacity', 0.2), ('relay', 0.15), ('power', 0.143), ('diffusion', 0.132), ('concentration', 0.122), ('chemical', 0.12), ('concentrations', 0.118), ('frequency', 0.112), ('binding', 0.105), ('kd', 0.094), ('signal', 0.092), ('frequencies', 0.091), ('sending', 0.087), ('xr', 0.087), ('hz', 0.085), ('communications', 0.083), ('sec', 0.077), ('cell', 0.074), ('cells', 0.071), ('amoebae', 0.071), ('cest', 0.071), ('dictyostelium', 0.071), ('micrometers', 0.071), ('secretion', 0.071), ('receptors', 0.066), ('unbound', 0.062), ('lling', 0.057), ('noise', 0.057), ('sin', 0.053), ('biophysical', 0.052), ('decay', 0.047), ('amp', 0.047), ('awng', 0.047), ('chemotactic', 0.047), ('dissociation', 0.047), ('secreted', 0.047), ('secretes', 0.047), ('terrence', 0.047), ('unbinding', 0.047), ('uctuations', 0.047), ('signals', 0.046), ('output', 0.045), ('formula', 0.045), ('additive', 0.044), ('constants', 0.044), ('communication', 0.043), ('interaction', 0.043), ('bound', 0.043), ('intervals', 0.041), ('spectrum', 0.038), ('receiver', 0.038), ('transmission', 0.038), ('rt', 0.038), ('neurobiology', 0.037), ('diffuse', 0.037), ('input', 0.036), ('evolutionary', 0.035), ('jolla', 0.035), ('social', 0.035), ('rate', 0.035), ('bottom', 0.034), ('response', 0.034), ('msec', 0.033), ('perceive', 0.033), ('poisson', 0.033), ('neighborhood', 0.032), ('transmitted', 0.031), ('particles', 0.031), ('treatment', 0.031), ('thomas', 0.031), ('amplitudes', 0.03), ('bands', 0.03), ('mediated', 0.03), ('cellular', 0.03), ('release', 0.03), ('schemes', 0.029), ('biological', 0.029), ('ects', 0.028), ('sejnowski', 0.028), ('constant', 0.028), ('la', 0.027), ('thousands', 0.027), ('range', 0.025), ('cyclic', 0.024), ('laboratory', 0.024), ('optical', 0.024), ('peter', 0.024), ('total', 0.024), ('duration', 0.024), ('piecewise', 0.024), ('nonlinear', 0.023), ('simulation', 0.022), ('synaptic', 0.022)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 1.000001 184 nips-2003-The Diffusion-Limited Biochemical Signal-Relay Channel

Author: Peter J. Thomas, Donald J. Spencer, Sierra K. Hampton, Peter Park, Joseph P. Zurkus


2 0.06685558 15 nips-2003-A Probabilistic Model of Auditory Space Representation in the Barn Owl

Author: Brian J. Fischer, Charles H. Anderson

Abstract: The barn owl is a nocturnal hunter, capable of capturing prey using auditory information alone [1]. The neural basis for this localization behavior is the existence of auditory neurons with spatial receptive fields [2]. We provide a mathematical description of the operations performed on auditory input signals by the barn owl that facilitate the creation of a representation of auditory space. To develop our model, we first formulate the sound localization problem solved by the barn owl as a statistical estimation problem. The implementation of the solution is constrained by the known neurobiology.

3 0.064151138 16 nips-2003-A Recurrent Model of Orientation Maps with Simple and Complex Cells

Author: Paul Merolla, Kwabena A. Boahen

Abstract: We describe a neuromorphic chip that utilizes transistor heterogeneity, introduced by the fabrication process, to generate orientation maps similar to those imaged in vivo. Our model consists of a recurrent network of excitatory and inhibitory cells in parallel with a push-pull stage. Similar to a previous model the recurrent network displays hotspots of activity that give rise to visual feature maps. Unlike previous work, however, the map for orientation does not depend on the sign of contrast. Instead, signindependent cells driven by both ON and OFF channels anchor the map, while push-pull interactions give rise to sign-preserving cells. These two groups of orientation-selective cells are similar to complex and simple cells observed in V1. 1 Orientation Maps Neurons in visual areas 1 and 2 (V1 and V2) are selectively tuned for a number of visual features, the most pronounced feature being orientation. Orientation preference of individual cells varies across the two-dimensional surface of the cortex in a stereotyped manner, as revealed by electrophysiology [1] and optical imaging studies [2]. The origin of these preferred orientation (PO) maps is debated, but experiments demonstrate that they exist in the absence of visual experience [3]. To the dismay of advocates of Hebbian learning, these results suggest that the initial appearance of PO maps rely on neural mechanisms oblivious to input correlations. Here, we propose a model that accounts for observed PO maps based on innate noise in neuron thresholds and synaptic currents. The network is implemented in silicon where heterogeneity is as ubiquitous as it is in biology. 2 Patterned Activity Model Ernst et al. have previously described a 2D rate model that can account for the origin of visual maps [4]. Individual units in their network receive isotropic feedforward input from the geniculate and recurrent connections from neighboring units in a Mexican hat profile, described by short-range excitation and long-range inhibition. If the recurrent connections are sufficiently strong, hotspots of activity (or ‘bumps’) form periodically across space. In a homogeneous network, these bumps of activity are equally stable at any position in the network and are free to wander. Introducing random jitter to the Mexican hat connectivity profiles breaks the symmetry and reduces the number of stable states for the bumps. Subsequently, the bumps are pinned down at the locations that maximize their net local recurrent feedback. In this regime, moving gratings are able to shift the bumps away from their stability points such that the responses of the network resemble PO maps. Therefore, the recurrent network, given an ample amount of noise, can innately generate its own orientation specificity without the need for specific hardwired connections or visually driven learning rules. 2.1 Criticisms of the Bump model We might posit that the brain uses a similar opportunistic model to derive and organize its feature maps – but the parallels between the primary visual cortex and the Ernst et al. bump model are unconvincing. For instance, the units in their model represent the collective activity of a column, reducing the network dynamics to a firing-rate approximation. But this simplification ignores the rich temporal dynamics of spiking networks, which are known to affect bump stability. More fundamentally, there is no role for functionally distinct neuron types. 
The primary criticism of the Ernst et al.’s bump model is that its input only consists of a luminance channel, and it is not obvious how to replace this channel with ON and OFF rectified channels to account for simple and complex cells. One possibility would be to segregate ON-driven and OFF-driven cells (referred to as simple cells in this paper) into two distinct recurrent networks. Because each network would have its own innate noise profile, bumps would form independently. Consequently, there is no guarantee that ON-driven maps would line up with OFF-driven maps, which would result in conflicting orientation signals when these simple cells converge onto sign-independent (complex) cells. 2.2 Simple Cells Solve a Complex Problem To ensure that both ON-driven and OFF-driven simple cells have the same orientation maps, both ON and OFF bumps must be computed in the same recurrent network so that they are subjected to the same noise profile. We achieve this by building our recurrent network out of cells that are sign-independent; that is both ON and OFF channels drive the network. These cells exhibit complex cell-like behavior (and are referred to as complex cells in this paper) because they are modulated at double the spatial frequency of a sinusoidal grating input. The simple cells subsequently derive their responses from two separate signals: an orientation selective feedback signal from the complex cells indicating the presence of either an ON or an OFF bump, and an ON–OFF selection signal that chooses the appropriate response flavor. Figure 1 left illustrates the formation of bumps (highlighted cells) by a recurrent network with a Mexican hat connectivity profile. Extending the Ernst et al. model, these complex bumps seed simple bumps when driven by a grating. Simple bumps that match the sign of the input survive, whereas out-of-phase bumps are extinguished (faded cells) by push-pull inhibition. Figure 1 right shows the local connections within a microcircuit. An EXC (excitatory) cell receives excitatory input from both ON and OFF channels, and projects to other EXC (not shown) and INH (inhibitory) cells. The INH cell projects back in a reciprocal configuration to EXC cells. The divergence is indicated in left. ON-driven and OFF-driven simple cells receive input in a push-pull configuration (i.e., ON cells are excited by ON inputs and inhibited by OFF inputs, and vise-versa), while additionally receiving input from the EXC–INH recurrent network. In this model, we implement our push-pull circuit using monosynaptic inhibitory connections, despite the fact that geniculate input is strictly excitatory. This simplification, while anatomically incorrect, yields a more efficient implementation that is functionally equivalent. ON Input Luminance OFF Input left right EXC EXC Divergence INH INH Simple Cells Complex Cells ON & OFF Input ON OFF OFF Space Figure 1: left, Complex and simple cell responses to a sinusoidal grating input. Luminance is transformed into ON (green) and OFF (red) pathways by retinal processing. Complex cells form a recurrent network through excitatory and inhibitory projections (yellow and blue lines, respectively), and clusters of activity occur at twice the spatial frequency of the grating. ON input activates ON-driven simple cells (bright green) and suppresses OFF-driven simple cells (faded red), and vise-versa. 
right, The bump model’s local microcircuit: circles represent neurons, curved lines represent axon arbors that end in excitatory synapses (v shape) or inhibitory synapses (open circles). For simplicity, inhibitory interneurons were omitted in our push-pull circuit.

2.3 Mathematical Description

The neurons in our network follow the equation

$C\dot{V} = -\sum_n \delta(t - t_n) + I_{syn} - I_{KCa} - I_{leak}$,

where C is the membrane capacitance, $\dot{V}$ is the temporal derivative of the membrane voltage, $\delta(\cdot)$ is the Dirac delta function, which resets the membrane at the times $t_n$ when it crosses threshold, $I_{syn}$ is the synaptic current from the network, $I_{KCa}$ is the adaptation current described below, and $I_{leak}$ is a constant leak current. Neurons receive synaptic current of the form:

$I_{syn}^{ON} = w_+ I_{ON} - w_- I_{OFF} + w_{EE} I_{EXC} - w_{EI} I_{INH}$,
$I_{syn}^{EXC} = w_+ (I_{ON} + I_{OFF}) + w_{EE} I_{EXC} - w_{EI} I_{INH} + I_{back}$,
$I_{syn}^{OFF} = w_+ I_{OFF} - w_- I_{ON} + w_{EE} I_{EXC} - w_{EI} I_{INH}$,
$I_{syn}^{INH} = w_{IE} I_{EXC}$,

where $w_+$ is the excitatory synaptic strength for ON and OFF input synapses, $w_-$ is the strength of the push-pull inhibition, $w_{EE}$ is the synaptic strength for EXC cell projections to other EXC cells, $w_{EI}$ is the strength of INH cell projections to EXC cells, $w_{IE}$ is the strength of EXC cell projections to INH cells, $I_{back}$ is a constant input current, and $I_{\{ON,OFF,EXC,INH\}}$ account for all impinging synapses from each of the four cell types. These terms are calculated for cell i using an arbor function that consists of a spatial weighting J(r) and a post-synaptic current waveform α(t):

$I_X = \sum_{k,n} J(i - k)\,\alpha(t - t_n^k)$,

where k spans all cells of a given type and n indexes their spike times. The spatial weighting function is described by $J(i - k) = \exp(-|i - k|/\sigma)$, with σ as the space constant. The current waveform, which is non-zero for t > 0, convolves a 1/t function with a decaying exponential: $\alpha(t) = (t/\tau_c + \alpha_0)^{-1} \ast \exp(-t/\tau_e)$, where $\tau_c$ is the decay rate and $\tau_e$ is the time constant of the exponential. Finally, we model spike-rate adaptation with a calcium-dependent potassium channel (KCa), which integrates Ca triggered by spikes at times $t_n$ with a gain K and a time constant $\tau_k$, as described by $I_{KCa} = \sum_n K \exp((t_n - t)/\tau_k)$.

3 Silicon Implementation

We implemented our model in silicon using the TSMC (Taiwan Semiconductor Manufacturing Company) 0.25µm 5-metal layer CMOS process. The final chip consists of a 2-D core of 48x48 pixels, surrounded by asynchronous digital circuitry that transmits and receives spikes in real-time. Neurons that reach threshold within the array are encoded as address-events and sent off-chip, and concurrently, incoming address-events are sent to their appropriate synapse locations. This interface is compatible with other spike-based chips that use address-events [5]. The fabricated bump chip has close to 460,000 transistors packed in 10 mm2 of silicon area for a total of 9,216 neurons.

3.1 Circuit Design

Our neural circuit was morphed into hardware using four building blocks. Figure 2 shows the transistor implementation for synapses, axonal arbors (diffuser), KCa analogs, and neurons. The circuits are designed to operate in the subthreshold region (except for the spiking mechanism of the neuron). Noise is not purposely designed into the circuits. Instead, random variations from the fabrication process introduce significant deviations in the I-V curves of theoretically identical MOS transistors.
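As a rough numerical illustration of the arbor function defined in Section 2.3, the sketch below computes the post-synaptic current waveform α(t) as a discretized 1/t-type response convolved with a decaying exponential, along with the exponential spatial weighting J(i − k). The time constants, space constant, and step size are illustrative assumptions, not the chip's actual parameter values.

```python
import numpy as np

# Sketch of alpha(t) and J(i - k) from Section 2.3; all parameter values are
# illustrative placeholders, not the fabricated chip's biases.
dt = 1e-4                                  # s
t = np.arange(dt, 0.2, dt)                 # 200 ms window (start at dt, t > 0)
tau_c, alpha_0, tau_e = 0.02, 0.01, 0.005  # decay rate, offset, rise time constant

mirror = 1.0 / (t / tau_c + alpha_0)       # slow ~1/t decay (current-mirror integrator)
rise = np.exp(-t / tau_e)                  # log-domain low-pass (RC-like) kernel
alpha = np.convolve(mirror, rise)[: t.size] * dt   # discrete approximation of *

sigma = 3.0                                # diffuser space constant, in pixels (assumed)
offsets = np.arange(-10, 11)
J = np.exp(-np.abs(offsets) / sigma)       # J(i - k) = exp(-|i - k| / sigma)

print(alpha[:5], J[offsets == 0])          # leading samples of alpha(t); center weight 1
```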
The function of the synapse circuit is to convert a brief voltage pulse (neuron spike) into a postsynaptic current with biologically realistic temporal dynamics. Our synapse achieves this by cascading a current-mirror integrator with a log-domain low-pass filter. The current-mirror integrator has a current impulse response that decays as 1/t (with a decay rate set by the voltage τc and an amplitude set by A). This time-extended current pulse is fed into a log-domain low-pass filter (equivalent to a current-domain RC circuit) that imposes a rise-time on the post-synaptic current set by τe. ON and OFF input synapses receive presynaptic spikes from the off-chip link, whereas EXC and INH synapses receive presynaptic spikes from local on-chip neurons.

Figure 2: Transistor implementations are shown for a synapse, diffuser, KCa analog, and neuron (simplified), with circuit insignias in the top-left of each box. The circuits they interact with are indicated (e.g. the neuron receives synaptic current from the diffuser as well as adaptation current from the KCa analog; the neuron in turn drives the KCa analog). The far right shows the layout for one pixel of the bump chip (vertical dimension is 83µm, horizontal is 30µm).

The diffuser circuit models axonal arbors that project to a local region of space with an exponential weighting. Analogous to resistive divider networks, diffusers [6] efficiently distribute synaptic currents to multiple targets. We use four diffusers to implement axonal projections for: the ON pathway, which excites ON and EXC cells and inhibits OFF cells; the OFF pathway, which excites OFF and EXC cells and inhibits ON cells; the EXC cells, which excite all cell types; and the INH cells, which inhibit EXC, ON, and OFF cells. Each diffuser node connects to its six neighbors through transistors that have a pseudo-conductance set by σr, and to its target site through a pseudo-conductance set by σg; the space constant of the exponential synaptic decay is set by the relative levels of σr and σg.

The neuron circuit integrates diffuser currents on its membrane capacitance. Diffusers either directly inject current (excitatory), or siphon off current (inhibitory) through a current-mirror. Spikes are generated by an inverter with positive feedback (modified from [7]), and the membrane is subsequently reset by the spike signal. We model a calcium concentration in the cell with a KCa analog. K controls the amount of calcium that enters the cell per spike; the concentration decays exponentially with a time constant set by τk. Elevated charge levels activate a KCa-like current that throttles the spike-rate of the neuron.

3.2 Experimental Setup

Our setup uses either a silicon retina [8] or a National Instruments DIO (digital input–output) card as input to the bump chip. This allows us to test our V1 model with real-time visual stimuli, similar to the experimental paradigm of electrophysiologists. More specifically, the setup uses an address-event link [5] to establish virtual point-to-point connectivity between ON or OFF ganglion cells from the retina chip (or DIO card) and ON or OFF synapses on the bump chip. Both the input activity and the output activity of the bump chip are displayed in real-time using receiver chips, which integrate incoming spikes and display their rates as pixel intensities on a monitor. A logic analyzer is used to capture spike output from the bump chip so it can be further analyzed. We investigated responses of the bump chip to gratings moving in sixteen different directions, both qualitatively and quantitatively.
For the qualitative aspect, we created a PO map by taking each cell’s average activity for each stimulus direction and computing the vector sum. To obtain a quantitative measure, we looked at the normalized vector magnitude (NVM), which reveals the sharpness of a cell’s tuning. The NVM is calculated by dividing the vector sum by the magnitude sum for each cell. The NVM is 0 if a cell responds equally to all orientations, and 1 if a cell’s orientation selectivity is perfect such that it only responds at a single orientation.

4 Results

We presented sixteen moving gratings to the network, with directions ranging from 0 to 360 degrees. The spatial frequency of the grating is tuned to match the size of the average bump, and the temporal frequency is 1 Hz. Figure 3a shows a resulting PO map for directions from 180 to 360 degrees, looking at the inhibitory cell population (the data looks similar for other cell types). Black contours represent stable bump regions, or equivalently, the regions that exceed a prescribed threshold (90 spikes) for all directions. The PO map from the bump chip reveals structure that resembles data from real cortex. Nearby cells tend to prefer similar orientations except at fractures. There are even regions that are similar to pinwheels (delimited by a white rectangle).

A PO map is a useful tool to describe a network’s selectivity, but it only paints part of the picture. So we have additionally computed an NVM map and an NVM histogram, shown in Figure 3b and 3c respectively. The NVM map shows that cells with sharp selectivity tend to cluster, particularly around the edge of the bumps. The histogram also reveals that the distribution of cell selectivity across the network varies considerably, skewed towards broadly tuned cells. We also looked at spike rasters from different cell types to gain insight into their phase relationship with the stimulus. In particular, we present recordings for the site indicated by the arrow (see Figure 3a) for gratings moving in eight directions ranging from 0 to 360 degrees in 45-degree increments (this location was chosen because it is in the vicinity of a pinwheel, is reasonably selective, and shows considerable modulation in its firing rate). Figure 4 shows the luminance of the stimulus (bottom sinusoids), ON- (cyan) and OFF-input (magenta) spike trains, and the resulting spike trains from EXC (yellow), INH (blue), ON- (green), and OFF-driven (red) cell types for each of the eight directions. The center polar plot summarizes the orientation selectivity for each cell type by showing the normalized number of spikes for each stimulus. Data is shown for one period. Even though all cell types are selective for the same orientation (regardless of grating direction), complex cell responses tend to be phase-insensitive while the simple cell responses are modulated at the fundamental frequency. It is worth noting that the simple cells have sharper orientation selectivity compared to the complex cells. This trend is characteristic of our data.

Figure 3: (a) PO map for the inhibitory cell population stimulated with eight different directions from 180 to 360 degrees (black represents no activity, contours delineate regions that exceed 90 spikes for all stimuli). Normalized vector magnitude (NVM) data is presented as (b) a map and (c) a histogram.
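A minimal sketch of the PO and NVM computation described above: each cell's mean response to each grating direction is treated as a vector in that direction, the vectors are summed, and the magnitude of the vector sum is divided by the sum of magnitudes. The responses below are random placeholders standing in for measured spike counts.

```python
import numpy as np

# PO / NVM sketch; the response matrix is a stand-in, not chip data.
directions = np.deg2rad(np.arange(0.0, 360.0, 22.5))       # 16 grating directions
responses = np.random.default_rng(2).random((48 * 48, directions.size))

vec = responses @ np.exp(1j * directions)                   # complex vector sum per cell
po = np.degrees(np.angle(vec)) % 360.0                       # preferred direction (deg)
nvm = np.abs(vec) / responses.sum(axis=1)                    # 0 = untuned, 1 = perfect
print(po[:3], nvm[:3])
```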
Figure 4: Spike rasters and polar plot for 8 directions ranging from 0 to 360 degrees. Each set of spike rasters represents, from bottom to top, ON- (cyan) and OFF-input (magenta), INH (yellow), EXC (blue), and ON- (green) and OFF-driven (red). The stimulus period is 1 sec.

5 Discussion

We have implemented a large-scale network of spiking neurons in a silicon chip that is based on layer 4 of the visual cortex. The initial testing of the network reveals a PO map, inherited from innate chip heterogeneities, resembling cortical maps. Our microcircuit proposes a novel function for complex-like cells; that is, they create a sign-independent orientation-selective signal, which through a push-pull circuit creates sharply tuned simple cells with the same orientation preference. Recently, Ringach et al. surveyed orientation selectivity in the macaque [9]. They observed that, in a population of V1 neurons (N=308), the distribution of orientation selectivity is quite broad, having a median NVM of 0.39. We have measured median NVMs ranging from 0.25 to 0.32. Additionally, Ringach et al. found a negative correlation between spontaneous firing rate and NVM. This is consistent with our model because cells closer to the center of the bump have higher firing rates and broader tuning.

While the results from the bump chip are promising, our maps are less consistent and noisier than the maps Ernst et al. have reported. We believe this is because our network is tuned to operate in a fluid state where bumps come on, travel a short distance and disappear (motivated by cortical imaging studies). But excessive fluidity can cause non-dominant bumps to briefly appear and adversely shift the PO maps. We are currently investigating the role of lateral connections between bumps as a means to suppress these spontaneous shifts.

The neural mechanisms that underlie the orientation selectivity of V1 neurons are still highly debated. This may be because neuron responses are not only shaped by feedforward inputs, but are also influenced at the network level. If modeling is going to be a useful guide for electrophysiologists, we must model at the network level while retaining cell-level detail. Our results demonstrate that a spike-based neuromorphic system is well suited to model layer 4 of the visual cortex. The same approach may be used to build large-scale models of other cortical regions.

References
1. Hubel, D. and T. Wiesel, Receptive fields, binocular interaction and functional architecture in the cat's visual cortex. J. Physiol, 1962. 160: p. 106-154.
2. Blasdel, G.G., Orientation selectivity, preference, and continuity in monkey striate cortex. J Neurosci, 1992. 12(8): p. 3139-61.
3. Crair, M.C., D.C. Gillespie, and M.P. Stryker, The role of visual experience in the development of columns in cat visual cortex. Science, 1998. 279(5350): p. 566-70.
4. Ernst, U.A., et al., Intracortical origin of visual maps. Nat Neurosci, 2001. 4(4): p. 431-6.
5. Boahen, K., Point-to-Point Connectivity. IEEE Transactions on Circuits & Systems II, 2000. vol 47 no 5: p. 416-434.
6. Boahen, K. and A. Andreou, A contrast sensitive silicon retina with reciprocal synapses. in NIPS91. 1992: IEEE.
7. Culurciello, E., R. Etienne-Cummings, and K. Boahen, A Biomorphic Digital Image Sensor. IEEE Journal of Solid State Circuits, 2003. vol 38 no 2: p. 281-294.
8. Zaghloul, K., A silicon implementation of a novel model for retinal processing, in Neuroscience. 2002, UPENN: Philadelphia.
9. Ringach, D.L., R.M. Shapley, and M.J.
Hawken, Orientation selectivity in macaque V1: diversity and laminar dependence. J Neurosci, 2002. 22(13): p. 5639-51.

4 0.060741693 157 nips-2003-Plasticity Kernels and Temporal Statistics

Author: Peter Dayan, Michael Häusser, Michael London

Abstract: Computational mysteries surround the kernels relating the magnitude and sign of changes in efficacy as a function of the time difference between pre- and post-synaptic activity at a synapse. One important idea is that kernels result from filtering, i.e., an attempt by synapses to eliminate noise corrupting learning. This idea has hitherto been applied to trace learning rules; we apply it to experimentally-defined kernels, using it to reverse-engineer assumed signal statistics. We also extend it to consider the additional goal for filtering of weighting learning according to statistical surprise, as in the Z-score transform. This provides a fresh view of observed kernels and can lead to different, and more natural, signal statistics.

5 0.057386339 27 nips-2003-Analytical Solution of Spike-timing Dependent Plasticity Based on Synaptic Biophysics

Author: Bernd Porr, Ausra Saudargiene, Florentin Wörgötter

Abstract: Spike timing plasticity (STDP) is a special form of synaptic plasticity where the relative timing of post- and presynaptic activity determines the change of the synaptic weight. On the postsynaptic side, active backpropagating spikes in dendrites seem to play a crucial role in the induction of spike timing dependent plasticity. We argue that postsynaptically the temporal change of the membrane potential determines the weight change. Coming from the presynaptic side induction of STDP is closely related to the activation of NMDA channels. Therefore, we will calculate analytically the change of the synaptic weight by correlating the derivative of the membrane potential with the activity of the NMDA channel. Thus, for this calculation we utilise biophysical variables of the physiological cell. The final result shows a weight change curve which conforms with measurements from biology. The positive part of the weight change curve is determined by the NMDA activation. The negative part of the weight change curve is determined by the membrane potential change. Therefore, the weight change curve should change its shape depending on the distance from the soma of the postsynaptic cell. We find temporally asymmetric weight change close to the soma and temporally symmetric weight change in the distal dendrite. 1

6 0.056270394 45 nips-2003-Circuit Optimization Predicts Dynamic Networks for Chemosensory Orientation in Nematode C. elegans

7 0.055724278 18 nips-2003-A Summating, Exponentially-Decaying CMOS Synapse for Spiking Neural Systems

8 0.055563651 185 nips-2003-The Doubly Balanced Network of Spiking Neurons: A Memory Model with High Capacity

9 0.052400894 159 nips-2003-Predicting Speech Intelligibility from a Population of Neurons

10 0.048669469 89 nips-2003-Impact of an Energy Normalization Transform on the Performance of the LF-ASD Brain Computer Interface

11 0.044704057 104 nips-2003-Learning Curves for Stochastic Gradient Descent in Linear Feedforward Networks

12 0.04463568 115 nips-2003-Linear Dependent Dimensionality Reduction

13 0.043419339 142 nips-2003-On the Concentration of Expectation and Approximate Inference in Layered Networks

14 0.04312294 5 nips-2003-A Classification-based Cocktail-party Processor

15 0.043020315 43 nips-2003-Bounded Invariance and the Formation of Place Fields

16 0.042793959 79 nips-2003-Gene Expression Clustering with Functional Mixture Models

17 0.042567722 67 nips-2003-Eye Micro-movements Improve Stimulus Detection Beyond the Nyquist Limit in the Peripheral Retina

18 0.042513013 49 nips-2003-Decoding V1 Neuronal Activity using Particle Filtering with Volterra Kernels

19 0.042090978 144 nips-2003-One Microphone Blind Dereverberation Based on Quasi-periodicity of Speech Signals

20 0.041222896 162 nips-2003-Probabilistic Inference of Speech Signals from Phaseless Spectrograms


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, -0.118), (1, 0.028), (2, 0.131), (3, 0.016), (4, 0.004), (5, 0.023), (6, 0.046), (7, -0.013), (8, -0.037), (9, 0.038), (10, 0.027), (11, 0.002), (12, 0.066), (13, 0.005), (14, -0.067), (15, -0.038), (16, 0.029), (17, -0.035), (18, 0.014), (19, 0.04), (20, -0.066), (21, 0.026), (22, -0.007), (23, -0.014), (24, -0.073), (25, 0.055), (26, -0.041), (27, 0.046), (28, 0.03), (29, -0.027), (30, 0.026), (31, -0.031), (32, -0.054), (33, -0.029), (34, 0.057), (35, -0.031), (36, 0.062), (37, -0.015), (38, -0.135), (39, 0.015), (40, 0.009), (41, -0.091), (42, 0.013), (43, 0.159), (44, 0.074), (45, 0.049), (46, -0.089), (47, -0.05), (48, -0.085), (49, -0.05)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.95999575 184 nips-2003-The Diffusion-Limited Biochemical Signal-Relay Channel

Author: Peter J. Thomas, Donald J. Spencer, Sierra K. Hampton, Peter Park, Joseph P. Zurkus

Both secretion and decay occur as random Poisson processes, and diffusion is realized as a discrete random walk with Gaussian-distributed displacements. The boundaries of V are taken to be reflecting. We track the positions of each of N particles {x_i, i = 1, · · · , N} at intervals of ∆t = 1 msec. The local concentration in a neighborhood of size σ around a location x is given by the convolution

$\hat{c}(x,t) = \int_V \sum_{i=1}^{N} \delta(x' - x_i)\, g(x - x', \sigma)\, dx'$   (1)

where g(·, σ) is a normalized Gaussian distribution in the plane, with mean 0 and variance σ². The motions of the individual particles cause ĉ(x, t) to fluctuate about the mean concentration, causing the local concentration at cell B, ĉ(x_r, t), to be a noisy, low-pass filtered version of the original signal s(t) (see Figure 1).

Cell B, located at x_r = [7.5µ, 2.5µ], registers the presence of ligand through binding and unbinding transitions, which form a two-state Markov process with time-varying transition rates. Given an unbound receptor, the binding transition happens at a rate that depends on the ligand concentration around the receptor: $k_+ \hat{c}(x_r, t)$. The size of the neighborhood σ reflects the range of the receptor, with binding most likely in a small region close to x_r. Once the receptor is bound to a ligand molecule, no more binding events occur until the receptor releases the ligand. The receiver is insensitive to fluctuations in ĉ(x_r, t) while it is in the bound state (see Figure 1). The unbinding transition occurs with a fixed rate k−.

For concreteness, we take values for D, α, k−, k+, and σ appropriate for cyclic AMP signaling between Dictyostelium amoebae, a model organism for chemical communication: D = 0.25 µ² msec⁻¹, α = 1 sec⁻¹, σ = 0.1µ, k− = 1 sec⁻¹, $k_+ = \frac{1}{2\pi\sigma^2}$ sec⁻¹. K_d = k−/k+ is the dissociation constant, the concentration at which the receptor on average is bound half the time. For the chosen values of the reaction constants k±, we have K_d ≈ 15.9 molecules/µ² ≈ 26.4 nMol, comparable to the most sensitive values reported for the cyclic AMP receptor [2]. At this concentration the volume V = 50µ² contains about 800 signaling molecules, assuming a nominal depth of 1µ.

Figure 1: Biochemical Signaling Simulation. Top: Cell A secretes a signaling molecule (red dots) with a time-varying rate s(t). Molecules diffuse throughout the two-dimensional volume, leading to locally fluctuating concentrations that carry a corrupted version of the signal. Molecules within a neighborhood of cell B can bind to a receptor molecule, giving a received signal r(t) ∈ {0, 1}. Bottom Left: Input signal. Mean instantaneous rate of molecule release (thousands of molecules per second). Molecule release is a Poisson process with time-varying rate. Bottom Center: Local concentration fluctuations, as seen by cell B, indicated by the number of molecules within 0.2 microns of the receptor. The receptor is sensitive to fluctuations in local concentrations only while it is unbound. While the receptor is bound, it does not register changes in the local concentration (indicated by constant plateaus corresponding to intervals when r(t) = 1 in the bottom right panel). Bottom Right: Output signal r(t). At each moment the receptor is either bound (1) or unbound (0). The receiver output is a piecewise constant function with a finite number of transitions.

3 Results: Estimating Information Capacity via Frequency Response

Communications channels mediated by diffusion and ligand-receptor interaction are nonlinear with non-Gaussian noise.
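Before turning to the channel's response properties, a minimal sketch of the simulation described in Section 2 may be helpful: Poisson secretion at x_s, a Gaussian random walk with decay and reflecting boundaries, a Gaussian-kernel concentration estimate at x_r, and a two-state receptor. The parameter values follow those quoted in the text; the input signal, run length, reflection scheme, and the use of the quoted K_d to set the binding rate constant are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Values quoted in the text; everything else is an illustrative assumption.
D = 0.25                                  # diffusion constant, um^2 per msec
alpha, k_minus, sigma = 1.0, 1.0, 0.1     # decay rate (1/s), unbinding rate (1/s), um
K_d = 15.9                                # molecules per um^2, as quoted in the text
k_plus = k_minus / K_d                    # binding rate constant, from K_d = k-/k+
dt = 1e-3                                 # 1 msec timestep, in seconds
x_s, x_r = np.array([2.5, 2.5]), np.array([7.5, 2.5])
box = np.array([10.0, 5.0])               # 10 um x 5 um rectangle

def secretion_rate(t):                    # molecules per second (assumed test input)
    return 1000.0 + 1000.0 * np.sin(2.0 * np.pi * 0.1 * t)

pos = np.zeros((0, 2))
bound, r_out = False, []
for step in range(int(10.0 / dt)):        # 10 s of simulated time
    t = step * dt
    n_new = rng.poisson(secretion_rate(t) * dt)               # Poisson secretion at x_s
    pos = np.vstack([pos, np.tile(x_s, (n_new, 1))])
    pos = pos[rng.random(len(pos)) > alpha * dt]               # Poisson decay
    pos = pos + rng.normal(0.0, np.sqrt(2.0 * D), pos.shape)   # random walk, 1 msec step
    pos = np.abs(pos)                                          # reflect at the lower walls
    pos = box - np.abs(box - pos)                              # reflect at the upper walls
    d2 = np.sum((pos - x_r) ** 2, axis=1)                      # Gaussian-kernel estimate
    c_hat = np.sum(np.exp(-d2 / (2.0 * sigma**2))) / (2.0 * np.pi * sigma**2)
    if bound:                                                  # two-state Markov receptor
        bound = rng.random() >= k_minus * dt
    else:
        bound = rng.random() < k_plus * c_hat * dt
    r_out.append(int(bound))
print("fraction of time bound:", np.mean(r_out))
```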
The expected value of the output signal, 0 ≤ E[r] < 1, is a sigmoidal function of the log concentration for a constant concentration c:

$E[r] = \frac{c}{c + K_d} = \frac{1}{1 + e^{-(y - y_0)}}$   (2)

where y = ln(c), y_0 = ln(K_d). The mean response saturates for high concentrations, c ≫ K_d, and the noise statistics become pronouncedly Poissonian (rather than Gaussian) for low concentrations.

Several different kinds of stimuli can be used to characterize such a channel. The steady-state response to constant input reflects the static (equilibrium) transfer function. Concentrations ranging from 100 K_d to 0.01 K_d occupy 98% of the steady-state operating range, 0.99 > E[r] > 0.01 [5]. For a finite observation time T the actual fraction of time spent bound, $\bar{r}_T$, is distributed about E[r] with a variance that depends on T. The biochemical relay may be used as a binary symmetric channel by randomly selecting a ‘high’ or ‘low’ secretion rate, and ‘decoding’ by setting a suitable threshold for $\bar{r}_T$. As T increases, the variance of $\bar{r}_T$ and the probability of error decrease.

The binary symmetric channel makes only crude use of this signaling mechanism. Other possible communication schemes include sending all-or-none bursts of signaling molecule, as in synaptic transmission, or detecting discrete stepped responses. Here we use the frequency response of the channel as a way of estimating the information capacity of the biochemical channel. For an idealized linear channel with additive white Gaussian noise (AWGN channel) the channel capacity under a mean input power constraint P is given by the so-called “water-filling formula” [4],

$C = \frac{1}{2} \int_{\omega_{min}}^{\omega_{max}} \log_2\!\left(1 + \frac{(\nu - N(\omega))_+}{N(\omega)}\right) d\omega$   (3)

given the constraining condition

$\int_{\omega_{min}}^{\omega_{max}} (\nu - N(\omega))_+ \, d\omega \le P$   (4)

where the constant ν is the sum of the noise and the signal power in the usable frequency range, N(ω) is the power of the additive noise at frequency ω and (X)_+ indicates the positive part of X. The formula applies when each frequency band (ω, ω + dω) is subject to noise of power N(ω) independently of all other frequency bands, and reflects the optimal allocation of signal power S(ω) = (ν − N(ω))_+, with greater signal power invested in frequencies at which the noise power is smallest. The capacity C is in bits/second.

For an input signal of finite duration T = 100 sec, we can independently specify the amplitudes and phases of its frequency components at ω = [0.01 Hz, 0.02 Hz, · · · , 500 Hz], where 500 Hz is the Nyquist frequency given a 1 msec simulation timestep. Because the population of secreted signaling molecules decays exponentially with a time constant of 1/α = 1 sec, the concentration signal is unable to pass frequencies ω ≥ 1 Hz (see Figure 2), providing a natural high-frequency cutoff.

Figure 2: Frequency Response of Biochemical Relay Channel. The sending cell secreted signaling molecules at a mean rate of 1000 + 1000 sin(2πωt) molecules per second. From top to bottom, the input frequencies were 1.0, 0.5, 0.2, 0.1, 0.05, 0.02 and 0.01 Hz. The total signal duration was T = 100 seconds. Left Column: Total number of molecules in the volume. Attenuation of the original signal results from exponential decay of the signaling molecule population. Right Column: A one-second moving average of the output signal r(t), which takes the value one when the receptor molecule is bound to ligand, and zero when the receptor is unbound.
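As an illustration of equations (3) and (4), the sketch below computes the water-filling capacity for a given noise spectrum N(ω) by bisecting on the water level ν until the allocated signal power matches the budget P. The noise spectrum and power budget used here are toy values, not quantities measured from the simulation.

```python
import numpy as np

# Water-filling capacity sketch; N(w) and P below are illustrative only.
def waterfill_capacity(freqs, N, P):
    lo, hi = 0.0, N.max() + P + 1.0
    for _ in range(100):                         # bisection on the water level nu
        nu = 0.5 * (lo + hi)
        if np.trapz(np.clip(nu - N, 0.0, None), freqs) > P:
            hi = nu
        else:
            lo = nu
    S = np.clip(lo - N, 0.0, None)               # optimal allocation (nu - N)_+
    return 0.5 * np.trapz(np.log2(1.0 + S / N), freqs)   # bits per second

freqs = np.linspace(0.01, 1.0, 100)              # usable band, in Hz
N = 0.05 + 0.2 * freqs                           # assumed noise power per band
print(waterfill_capacity(freqs, N, P=0.1))
```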
For the AWGN channel the input and output signals share the same units (e.g. rms voltage); for the biological relay the input s(t) is in molecules/second while the output r(t) is a function with binary range {r = 0, r = 1}. The maximum of the mean output power for a binary function r(t) is $\frac{1}{T}\int_{t=0}^{T} |r(t) - \bar{r}|^2\, dt \le \frac{1}{4}$. This total possible output power will be distributed between different frequencies depending on the frequency of the input. We wish to estimate the channel capacity by comparing the portion of the output power present in the sending frequency ω to the limiting output power 0.25. Therefore we set the total output power constant to ν = 0.25.

Given a pure sinusoidal input signal s(t) = a_0 + a_1 sin(2πωt), we consider the power in the output spectrum at ω Hz to be the residual power from the input, and the rest of the power in the spectrum of r(t) to be analogous to the additive noise power spectrum N(ω) in the AWGN channel. We calculate N(ω) to be the total power of r(t) − r̄ in all frequency bands except ω. For signals of length T = 100 sec, the possible frequencies are discretized at intervals ∆ω = 0.01 Hz. Because the noise power N(ω) ≤ 0.25, the water-filling formula (3) for the capacity reduces to

$C_{est} = \frac{1}{2} \int_{0.01\,\mathrm{Hz}}^{1\,\mathrm{Hz}} \log_2 \frac{0.25}{N(\omega)}\, d\omega$.   (5)

As mentioned above, frequencies ω ≥ 1 Hz do not transmit any information about the signal (see Figure 2) and do not contribute to the capacity. We approximate this integral using linear interpolation of log2(N(ω)) between the measured values at ω = [0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0] Hz. (See Figure 3.) This procedure gives an estimate of the channel capacity, C_est = 0.087 bits/second.

Figure 3: Frequency Transmission Spectrum. Noise power N(ω), calculated as the total power in r(t) − r̄ in all frequency components save the input frequency ω. Frequencies were binned in intervals of 0.01 Hz = 1/T. The maximum possible power in r(t) over all frequencies is 0.25; the power successfully transmitted by the channel is given by 0.25/N(ω). The lower curve is N(ω) for input signals of the form s(t) = 1000 + 1000 sin 2πωt, which uses the full dynamic range of the receptor. Decreasing the dynamic range used reduces the amount of power transmitted at the sending frequency: the upper curve is N(ω) for signals of the form s(t) = 1000 + 500 sin 2πωt.

4 Discussion & Conclusions

Diffusion and the Markov switching between bound and unbound states create a low-pass filter that removes high-frequency information in the biochemical relay channel. A general Poisson-type communications channel, such as commonly encountered in optical communications engineering, can achieve an arbitrarily large capacity by transmitting high frequencies and high amplitudes, unless bounded by a max or mean amplitude constraint [6]. In the biochemical channel, the effective input amplitude is naturally constrained by the saturation of the receptor at concentrations above the K_d, and the high-frequency transmission is limited by the inherent dynamics of the Markov process. Therefore this channel has a finite capacity. The channel capacity estimate we derived, C_est = 0.087 bits/second, seems quite low compared to signaling rates in the nervous system, requiring long signaling times to transfer information successfully. However, temporal dynamics in cellular systems can be quite deliberate; cell-cell communication in the social amoeba Dictyostelium, for example, is achieved by means of a carrier wave with a period of seven minutes.
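As a concrete illustration of the reduced estimate in equation (5), the sketch below linearly interpolates log2 N(ω) between the measured frequencies and integrates 0.5 log2(0.25/N(ω)) from 0.01 Hz to 1 Hz. The noise powers are placeholders standing in for the measured values, so the printed number will not reproduce the 0.087 bits/second reported above.

```python
import numpy as np

# Sketch of C_est from equation (5); N(w) values below are assumed placeholders.
f_meas = np.array([0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0])        # Hz
N_meas = np.array([0.03, 0.04, 0.06, 0.09, 0.14, 0.21, 0.24])     # assumed N(w)

f = np.linspace(0.01, 1.0, 2000)
log2N = np.interp(f, f_meas, np.log2(N_meas))      # linear interpolation of log2 N(w)
C_est = 0.5 * np.trapz(np.log2(0.25) - log2N, f)   # 0.5 * integral of log2(0.25 / N)
print(f"C_est ~ {C_est:.3f} bits/second")
```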
In addition, cells typically possess thousands of copies of the receptors for important signaling molecules, allowing for more complex detection schemes than those investigated here. Our simplified treatment suggests several avenues for further work. For example, signal transducing receptors often form Markov chains with more complicated dynamics reflecting many more than two states [7]. Also, the nonlinear nature of the channel is probably not well served by our additive noise approximation, and might be better suited to a treatment via multiplicative noise [8]. Whether cells engage in complicated temporal coding/decoding schemes, as has been proposed for neural information processing, or whether instead they achieve efficient communication by evolutionary matching of the noise characteristics of sender and receiver, remain to be investigated. We note that the dependence of the channel capacity C on such parameters as the system geometry, the diffusion and decay constants, the binding constants and the range of the receptor may shed light on evolutionary mechanisms and constraints on communication within cellular biological systems.

Acknowledgments

This work would not have been possible without the generous support of the Howard Hughes Medical Institute and the resources of the Computational Neurobiology Laboratory, Terrence J. Sejnowski, Director.

References
[1] Rappel, W.M., Thomas, P.J., Levine, H. & Loomis, W.F. (2002) Establishing Direction during Chemotaxis in Eukaryotic Cells. Biophysical Journal 83:1361-1367.
[2] Ueda, M., Sako, Y., Tanaka, T., Devreotes, P. & Yanagida, T. (2001) Single Molecule Analysis of Chemotactic Signaling in Dictyostelium Cells. Science 294:864-867.
[3] Detwiler, P.B., Ramanathan, S., Sengupta, A. & Shraiman, B.I. (2000) Engineering Aspects of Enzymatic Signal Transduction: Photoreceptors in the Retina. Biophysical Journal 79:2801-2817.
[4] Cover, T.M. & Thomas, J.A. (1991) Elements of Information Theory. New York: Wiley.
[5] Getz, W.M. & Lansky, P. (2001) Receptor Dissociation Constants and the Information Entropy of Membranes Coding Ligand Concentration. Chem. Senses 26:95-104.
[6] Frey, R.M. (1991) Information Capacity of the Poisson Channel. IEEE Transactions on Information Theory 37(2):244-256.
[7] Uteshev, V.V. & Pennefather, P.S. (1997) Analytical Description of the Activation of Multi-State Receptors by Continuous Neurotransmitter Signals at Brain Synapses. Biophysical Journal 72:1127-1134.
[8] Mitra, P.P. & Stark, J.B. (2001) Nonlinear limits to the information capacity of optical fibre communications. Nature 411:1027-1030.

2 0.54522961 157 nips-2003-Plasticity Kernels and Temporal Statistics

Author: Peter Dayan, Michael Häusser, Michael London

Abstract: Computational mysteries surround the kernels relating the magnitude and sign of changes in efficacy as a function of the time difference between pre- and post-synaptic activity at a synapse. One important idea is that kernels result from filtering, i.e., an attempt by synapses to eliminate noise corrupting learning. This idea has hitherto been applied to trace learning rules; we apply it to experimentally-defined kernels, using it to reverse-engineer assumed signal statistics. We also extend it to consider the additional goal for filtering of weighting learning according to statistical surprise, as in the Z-score transform. This provides a fresh view of observed kernels and can lead to different, and more natural, signal statistics.

3 0.4910574 144 nips-2003-One Microphone Blind Dereverberation Based on Quasi-periodicity of Speech Signals

Author: Tomohiro Nakatani, Masato Miyoshi, Keisuke Kinoshita

Abstract: Speech dereverberation is desirable with a view to achieving, for example, robust speech recognition in the real world. However, it is still a challenging problem, especially when using a single microphone. Although blind equalization techniques have been exploited, they cannot deal with speech signals appropriately because their assumptions are not satisfied by speech signals. We propose a new dereverberation principle based on an inherent property of speech signals, namely quasi-periodicity. The present methods learn the dereverberation filter from a lot of speech data with no prior knowledge of the data, and can achieve high quality speech dereverberation especially when the reverberation time is long. 1

4 0.48769215 15 nips-2003-A Probabilistic Model of Auditory Space Representation in the Barn Owl

Author: Brian J. Fischer, Charles H. Anderson

Abstract: The barn owl is a nocturnal hunter, capable of capturing prey using auditory information alone [1]. The neural basis for this localization behavior is the existence of auditory neurons with spatial receptive fields [2]. We provide a mathematical description of the operations performed on auditory input signals by the barn owl that facilitate the creation of a representation of auditory space. To develop our model, we first formulate the sound localization problem solved by the barn owl as a statistical estimation problem. The implementation of the solution is constrained by the known neurobiology.

5 0.46512502 89 nips-2003-Impact of an Energy Normalization Transform on the Performance of the LF-ASD Brain Computer Interface

Author: Yu Zhou, Steven G. Mason, Gary E. Birch

Abstract: This paper presents an energy normalization transform as a method to reduce system errors in the LF-ASD brain-computer interface. The energy normalization transform has two major benefits to the system performance. First, it can increase class separation between the active and idle EEG data. Second, it can desensitize the system to the signal amplitude variability. For four subjects in the study, the benefits resulted in the performance improvement of the LF-ASD in the range from 7.7% to 18.9%, while for the fifth subject, who had the highest non-normalized accuracy of 90.5%, the performance did not change notably with normalization.

1 Introduction

In an effort to provide alternative communication channels for people who suffer from severe loss of motor function, several researchers have worked over the past two decades to develop a direct Brain-Computer Interface (BCI). Since the electroencephalographic (EEG) signal has good time resolution and is non-invasive, it is commonly used as the data source for a BCI. A BCI system converts the input EEG into control signals, which are then used to control devices like computers, environmental control systems and neuro-prostheses. Mason and Birch [1] proposed the Low-Frequency Asynchronous Switch Design (LF-ASD) as a BCI which detected imagined voluntary movement-related potentials (IVMRPs) in spontaneous EEG. The principal signal processing components of the LF-ASD are shown in Figure 1.

Figure 1: The original LF-ASD design.

The input to the low-pass filter (LPF), denoted as SIN in Figure 1, consists of six bipolar EEG signals recorded from F1-FC1, Fz-FCz, F2-FC2, FC1-C1, FCz-Cz and FC2-C2 sampled at 128 Hz. The cutoff frequency of the LPF implemented by Mason and Birch was 4 Hz. The Feature Extractor of the LF-ASD extracts custom features related to IVMRPs. The Feature Classifier implements a one-nearest-neighbor (1NN) classifier, which determines if the input signals are related to a user state of voluntary movement or passive (idle) observation. The LF-ASD was able to achieve True Positive (TP) values in the range of 44%-81%, with the corresponding False Positive (FP) values around 1% [1]. Although encouraging, the current error rates of the LF-ASD are insufficient for real-world applications. This paper proposes a method to improve the system performance.

2 Design and Rationale

The improved design of the LF-ASD with the Energy Normalization Transform (ENT) is provided in Figure 2.

Figure 2: The improved LF-ASD with the Energy Normalization Transform.

The design of the Feature Extractor and Feature Classifier was the same as shown in Figure 1. The Energy Normalization Transform (ENT) is implemented as

$S_N(n) = \frac{S_{IN}(n)}{\sqrt{\sum_{s=-(W_N-1)/2}^{(W_N-1)/2} S_{IN}(n-s)^2 / W_N}}$

where W_N (normalization window size) is the only parameter in the equation. The optimal parameter value was obtained by exhaustive search for the best class separation between active and idle EEG data. The method of obtaining the active and idle EEG data is provided in Section 3.1.

The idea to use energy normalization to improve the LF-ASD design was based primarily on an observation that high-frequency power decreases significantly around movement. For example, Jasper and Penfield [3] and Pfurtscheller et al. [4] reported EEG power decrease in the mu (8-12 Hz) and beta rhythm (18-26 Hz) when people are involved in motor-related activity.
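A minimal sketch of the ENT as reconstructed above: each sample is divided by the RMS of the input over a centered window of W_N samples. The square-root reading of the windowed-energy denominator, the edge padding, and the toy signal are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

# Energy Normalization Transform sketch; window size and signal are placeholders.
def energy_normalize(s_in, w_n=51):
    half = (w_n - 1) // 2
    padded = np.pad(np.asarray(s_in, dtype=float), half, mode="edge")
    out = np.empty(len(s_in))
    for n in range(len(s_in)):
        window = padded[n:n + w_n]                        # samples s_in[n-half .. n+half]
        out[n] = s_in[n] / (np.sqrt(np.mean(window ** 2)) + 1e-12)
    return out

eeg = np.random.default_rng(1).normal(size=1280)          # 10 s of toy 128 Hz "EEG"
print(energy_normalize(eeg, w_n=51)[:5])
```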
Also Mason [5] found that the power in the frequency components greater than 4Hz decreased significantly during movement-related potential periods, while power in the frequency components less than 4Hz did not. Thus energy normalization, which would increase the low frequency power level, would strengthen the 0-4 Hz features used in the LF-ASD and hence reduce errors. In addition, as a side benefit, it can automatically adjust the mean scale of the input signal and desensitize the system to change in EEG power, which is known to vary over time [2]. Therefore, it was postulated that the addition of ENT into the improved design would have two major benefits. First, it can increase the EEG power around motor potentials, consequently increasing the class separation and feature strength. Second, it can desensitize the system to amplitude variance of the input signal. In addition, since the system components of the modified LF-ASD after the ENT were the same as in the original design, a major concern was whether or not the ENT distorted the features used by the LF-ASD. Since the features used by the LFASD are generated from the 0-4 Hz band, if the ENT does not distort the phase and magnitude spectrum in this specific band, it would not distort the features related to movement potential detection in the application. 3 3.1 Evaluation Test data Two types of EEG data were pre-recorded from five able-bodied individuals as shown in Figure 3. Active Data Type and Idle Data Type. Active Data was recorded during repeated right index finger flexions alternating with periods of no motor activity; Idle Data was recorded during extended periods of passive observation. Figure 3: Data Definition of M1, M2, Idle1 and Idle2. Observation windows centered at the time of the finger switch activations (as shown in Figure 4) were imposed in the active data to separate data related to movements from data during periods of idleness. For purpose of this study, data in the front part of the observation window was defined as M1 and data in the rear part of the window was defined as M2. Data falling out of the observation window was defined as Idle2. All the data in the Idle Data Type was defined as Idle1 for comparison with Idle2. Figure 4: Ensemble Average of EEG centered on finger activations. Figure 5: Density distribution of Idle1, Idle2, M1 and M2. It was noted, in terms of the density distribution of active and idle data, the separation between M2 and Idle2 was the largest and Idle1 and Idle2 were nearly identical (see Figure 5). For the study, M2 and Idle2 were chosen to represent the active and idle data classes and the separation between M2 and Idle2 data was defined by the difference of means (DOM) scaled by the amplitude range of Idle2. 3.2 Optimal parameter determination The optimal combination of normalization window size, W N, and observation window size, W O was selected to be that which achieved the maximal DOM value. This was determined by exhaustive search, and discussed in Section 4.1. 3.3 Effect of ENT on the Low Pass Filter output As mentioned previously, it was postulated that the ENT had two major impacts: increasing the class separation between active and idle EEG and desensitizing the system to the signal amplitude variance. The hypothesis was evaluated by comparing characteristics of SNLPF and SLPF in Figure 1 and Figure 2. DOM was applied to measure the increased class separation. The signal with the larger DOM meant larger class separation. 
In addition, the signal with smaller standard deviation may result in a more stable feature set. 3.4 Effect of ENT on the LF-ASD output The performances of the original and improved designs were evaluated by comparing the signal characteristics of SNFC in Figure 2 to SFC in Figure 1. A Receiver Operating Characteristic Curve (ROC Curve) [6] was generated for the original and improved designs. The ROC Curve characterizes the system performance over a range of TP vs. FP values. The larger area under ROC Curve indicates better system performance. In real applications, a BCI with high-level FP rates could cause frustration for subjects. Therefore, in this work only the LF-ASD performance when the FP values are less than 1% were studied. 4 4.1 Results Optimal normalization window size (WN) The method to choose optimal WN was an exhaustive search for maximal DOM between active and idle classes. This method was possibly dependent on the observation window size (W O). However, as shown in Figure 6a, the optimal WN was found to be independent of WO. Experimentally, the W O values were selected in the range of 50-60 samples, which corresponded to largest DOM between nonnormalized active and idle data. The optimal WN was obtained by exhaustive search for the largest DOM through normalized active and idle data. The DOM vs. WN profile for Subject 1 is shown in Figure 6b. a) b) Figure 6: Optimal parameter determination for Subject 1 in Channel 1 a) DOM vs. WO; b) DOM vs. WN. When using ENT, a small W N value may cause distortion to the feature set used by the LF-ASD. Thus, the optimal W N was not selected in this range (< 40 samples). When W N is greater than 200, the ENT has lost its capability to increase class separation and the DOM curve gradually goes towards the best separation without normalization. Thus, the optimal W N should correspond to the maximal DOM value when W N is in the range from 40 to 200. In Figure 6b, the optimal WN is around 51. 4.2 Effect of ENT on the Low Pass Filter output With ENT, the standard deviation of the low frequency EEG signal decreased from around 1.90 to 1.30 over the six channels and over the five subjects. This change resulted in more stable feature sets. Thus, the ENT desensitizes the system to input signal variance. a) b) Figure 7: Density distribution of the active vs. idle class without (a) and with (b) ENT, for Subject 1 in Channel 1. As shown in Figure 7, by increasing the EEG power around motor potentials, ENT can increase class separations between active and idle EEG data. The class separation in (frontal) Channels 1-3 across all subjects increased consistently with the proposed ENT. The same was true for (midline) Channels 4-6, for all subjects except Subject 5, whose DOM in channel 5-6 decreased by 2.3% and 3.4% respectively with normalization. That is consistent with the fact that his EEG power in Channels 4-6 does not decrease. On average, across all five subjects, DOM increases with normalization to about 28.8%, 26.4%, 39.4%, 20.5%, 17.8% and 22.5% over six channels respectively. In addition, the magnitude and phase spectrums of the EEG signal before and after ENT is provided in Figure 8. The ENT has no visible distortion to the signal in the low frequency band (0-4 Hz) used by the LF-ASD. Therefore, the ENT does not distort the features used by the LF-ASD. (a) (b) Figure 8: Magnitude and phase spectrum of the EEG signal before and after ENT. 
4.3 Effect of ENT on the LF-ASD output

The two major benefits of the ENT to the low-frequency EEG data result in the performance improvement of the LF-ASD. Subject 1’s ROC Curves with and without ENT are shown in Figure 9, where the ROC Curve with ENT at the optimal parameter value is above the ROC Curve without ENT. This indicates that the improved LF-ASD performs better. Table I compares the system performance with and without ENT in terms of TP with corresponding FP at 1% across all 5 subjects.

Figure 9: The ROC Curves (in the section of interest) of Subject 1 with different WN values and the corresponding ROC Curve without ENT.

Table I: Performance of the LF-ASD with and without ENT in terms of the True Positive rate with corresponding False Positive at 1%.

                           Subject 1   Subject 2   Subject 3   Subject 4   Subject 5
TP without ENT               66.1%       82.7%       79.7%       79.3%       90.5%
TP with ENT                  85.0%       90.4%       88.0%       87.8%       88.7%
Performance Improvement      18.9%        7.7%        8.3%        8.5%       -1.8%

For 4 out of 5 subjects, corresponding with the FP at 1%, the improved system with ENT increased the TP value by 7.7%, 8.3%, 8.5% and 18.9% respectively. Thus, for these subjects, the range of TP with FP at 1% was improved from 66.1%-82.7% to 85.0%-90.4% with ENT. For the fifth subject, who had the highest non-normalized accuracy of 90.5%, the performance remained around 90% with ENT. In addition, this evaluation is conservative. Since the codebook in the Feature Classifier and the parameters in the Feature Extractor of the LF-ASD were derived from non-normalized EEG, they work in favor of the non-normalized EEG. Therefore, if the parameters and the codebook of the modified LF-ASD are generated from the normalized EEG in the future, the modified LF-ASD may show better performance than this evaluation.

5 Conclusion

The evaluation with data from five able-bodied subjects indicates that the proposed system with the Energy Normalization Transform (ENT) has better performance than the original. This study has verified the original hypotheses that the improved design with ENT might have two major benefits: increasing the class separation between active and idle EEG and desensitizing the system performance to input amplitude variance. As a side benefit, the ENT can also make the design less sensitive to the mean input scale. In the broad band, the Energy Normalization Transform is a non-linear transform. However, it has no visible distortion to the signal in the 0-4 Hz band. Therefore, it does not distort the features used by the LF-ASD. For 4 out of 5 subjects, with the corresponding False Positive rate at 1%, the proposed transform increased the system performance by 7.7%, 8.3%, 8.5% and 18.9% respectively in terms of True Positive rate. Thus, the overall performance of the LF-ASD for these subjects was improved from 66.1%-82.7% to 85.0%-90.4%. For the fifth subject, who had the highest non-normalized accuracy of 90.5%, the performance did not change notably with normalization. In the future, with the codebook derived from the normalized data, the performance could be further improved.

References
[1] Mason, S. G. and Birch, G. E., (2000) A Brain-Controlled Switch for Asynchronous Control Applications. IEEE Trans Biomed Eng, 47(10):1297-1307.
[2] Vaughan, T. M., Wolpaw, J. R., and Donchin, E. (1996) EEG-Based Communication: Prospects and Problems. IEEE Trans Reh Eng, 4(4):425-430.
[3] Jasper, H. and Penfield, W. (1949) Electrocorticograms in man: Effect of voluntary movement upon the electrical activity of the precentral gyrus.
Arch.Psychiat.Nervenkr., 183:163-174. [4] Pfurtscheller, G., Neuper, C., and Flotzinger, D. (1997) EEG-based discrimination between imagination of right and left hand movement. Electroencephalography and Clinical Neurophysiology, 103:642-651. [5] Mason, S. G. (1997) Detection of single trial index finger flexions from continuous, spatiotemporal EEG. PhD Thesis, UBC, January. [6] Green, D. M. and Swets, J. A. (1996) Signal Detection Theory and Psychophysics New York: John Wiley and Sons, Inc.

6 0.45117134 175 nips-2003-Sensory Modality Segregation

7 0.44151723 159 nips-2003-Predicting Speech Intelligibility from a Population of Neurons

8 0.43797007 162 nips-2003-Probabilistic Inference of Speech Signals from Phaseless Spectrograms

9 0.43212435 13 nips-2003-A Neuromorphic Multi-chip Model of a Disparity Selective Complex Cell

10 0.42655632 76 nips-2003-GPPS: A Gaussian Process Positioning System for Cellular Networks

11 0.4245103 16 nips-2003-A Recurrent Model of Orientation Maps with Simple and Complex Cells

12 0.40700629 45 nips-2003-Circuit Optimization Predicts Dynamic Networks for Chemosensory Orientation in Nematode C. elegans

13 0.38143983 187 nips-2003-Training a Quantum Neural Network

14 0.37681282 119 nips-2003-Local Phase Coherence and the Perception of Blur

15 0.37664157 142 nips-2003-On the Concentration of Expectation and Approximate Inference in Layered Networks

16 0.37261629 25 nips-2003-An MCMC-Based Method of Comparing Connectionist Models in Cognitive Science

17 0.37223598 5 nips-2003-A Classification-based Cocktail-party Processor

18 0.36734971 185 nips-2003-The Doubly Balanced Network of Spiking Neurons: A Memory Model with High Capacity

19 0.35164684 27 nips-2003-Analytical Solution of Spike-timing Dependent Plasticity Based on Synaptic Biophysics

20 0.34495652 114 nips-2003-Limiting Form of the Sample Covariance Eigenspectrum in PCA and Kernel PCA


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(0, 0.023), (11, 0.015), (29, 0.025), (30, 0.04), (35, 0.035), (45, 0.409), (53, 0.112), (59, 0.012), (71, 0.046), (76, 0.03), (85, 0.059), (91, 0.091)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.83296329 184 nips-2003-The Diffusion-Limited Biochemical Signal-Relay Channel

Author: Peter J. Thomas, Donald J. Spencer, Sierra K. Hampton, Peter Park, Joseph P. Zurkus

Abstract: Biochemical signal-transduction networks are the biological information-processing systems by which individual cells, from neurons to amoebae, perceive and respond to their chemical environments. We introduce a simplified model of a single biochemical relay and analyse its capacity as a communications channel. A diffusible ligand is released by a sending cell and received by binding to a transmembrane receptor protein on a receiving cell. This receptor-ligand interaction creates a nonlinear communications channel with non-Gaussian noise. We model this channel numerically and study its response to input signals of different frequencies in order to estimate its channel capacity. Stochastic effects introduced in both the diffusion process and the receptor-ligand interaction give the channel low-pass characteristics. We estimate the channel capacity using a water-filling formula adapted from the additive white-noise Gaussian channel. 1 Introduction: The Diffusion-Limited Biochemical Signal-Relay Channel The term signal-transduction network refers to the web of biochemical interactions by which single cells process sensory information about their environment. Just as neural networks underly the interaction of many multicellular organisms with their environments, these biochemical networks allow cells to perceive, evaluate and react to chemical stimuli [1]. Examples include chemical signaling across the synaptic cleft, calcium signaling within the postsynaptic dendritic spine, pathogen localization by the immune system, ∗ † Corresponding author: pjthomas@salk.edu dspencer@salk.edu growth-cone guidance during neuronal development, phototransduction in the retina, rhythmic chemotactic signaling in social amoebae, and many others. The introduction of quantitative measurements of the distribution and activation of chemical reactants within living cells [2] has prepared the way for detailed quantitative analysis of their properties, aided by numerical simulations. One of the key questions that can now be addressed is the fundamental limits to cell-to-cell communication using chemical signaling. To communicate via chemical signaling cells must contend with the unreliability inherent in chemical diffusion and in the interactions of limited numbers of signaling molecules and receptors [3]. We study a simplified situation in which one cell secretes a signaling molecule, or ligand, which can be detected by a receptor on another cell. Limiting ourselves to one ligand-receptor interaction allows a treatment of this communications system using elementary concepts from information theory. The information capacity of this fundamental signaling system is the maximum of the mutual information between the ensemble of input signals, the time-varying rate of ligand secretion s(t), and the output signal r(t), a piecewise continuous function taking the values one or zero as the receptor is bound to ligand or unbound. Using numerical simulation we can estimate the channel capacity via a standard ”water-filling” information measure [4], as described below. 2 Methods: Numerical Simulation of the Biochemical Relay We simulate a biochemical relay system as follows: in a two-dimensional rectangular volume V measuring 5 micrometers by 10 micrometers, we locate two cells spaced 5 micrometers apart. Cell A emits ligand molecules from location xs = [2.5µ, 2.5µ] with rate s(t) ≥ 0; they diffuse with a given diffusion constant D and decay at a rate α. 
Both secretion and decay occur as random Poisson processes, and diffusion is realized as a discrete random walk with Gaussian-distributed displacements. The boundaries of V are taken to be reflecting. We track the positions of each of the N particles {x_i, i = 1, …, N} at intervals of ∆t = 1 msec. The local concentration in a neighborhood of size σ around a location x is given by the convolution

ĉ(x, t) = ∫_V Σ_{i=1}^{N} δ(x′ − x_i) g(x − x′, σ) dx′    (1)

where g(·, σ) is a normalized Gaussian distribution in the plane with mean 0 and variance σ². The motions of the individual particles cause ĉ(x, t) to fluctuate about the mean concentration, making the local concentration at cell B, ĉ(x_r, t), a noisy, low-pass-filtered version of the original signal s(t) (see Figure 1).

Cell B, located at x_r = [7.5µ, 2.5µ], registers the presence of ligand through binding and unbinding transitions, which form a two-state Markov process with time-varying transition rates. Given an unbound receptor, the binding transition occurs at a rate that depends on the ligand concentration around the receptor: k₊ ĉ(x_r, t). The size of the neighborhood σ reflects the range of the receptor, with binding most likely in a small region close to x_r. Once the receptor is bound to a ligand molecule, no further binding events occur until the receptor releases the ligand; the receiver is insensitive to fluctuations in ĉ(x_r, t) while it is in the bound state (see Figure 1). The unbinding transition occurs at a fixed rate k₋.

Figure 1: Biochemical Signaling Simulation. Top: Cell A secretes a signaling molecule (red dots) with a time-varying rate s(t). Molecules diffuse throughout the two-dimensional volume, leading to locally fluctuating concentrations that carry a corrupted version of the signal. Molecules within a neighborhood of cell B can bind to a receptor molecule, giving a received signal r(t) ∈ {0, 1}. Bottom left: input signal, the mean instantaneous rate of molecule release (thousands of molecules per second); molecule release is a Poisson process with time-varying rate. Bottom center: local concentration fluctuations as seen by cell B, indicated by the number of molecules within 0.2 microns of the receptor. The receptor is sensitive to fluctuations in local concentration only while it is unbound; while the receptor is bound, it does not register changes in the local concentration (indicated by the constant plateaus corresponding to intervals when r(t) = 1 in the bottom right panel). Bottom right: output signal r(t). At each moment the receptor is either bound (1) or unbound (0); the receiver output is a piecewise constant function with a finite number of transitions.

For concreteness, we take values for D, α, k₋, k₊, and σ appropriate for cyclic AMP signaling between Dictyostelium amoebae, a model organism for chemical communication: D = 0.25 µ² msec⁻¹, α = 1 sec⁻¹, σ = 0.1 µ, k₋ = 1 sec⁻¹, k₊ = 2πσ² sec⁻¹. K_d = k₋/k₊ is the dissociation constant, the concentration at which the receptor is on average bound half the time. For the chosen values of the reaction constants k± we have K_d ≈ 15.9 molecules µ⁻² ≈ 26.4 nMol, comparable to the most sensitive values reported for the cyclic AMP receptor [2]. At this concentration the volume V = 50 µ² contains about 800 signaling molecules, assuming a nominal depth of 1 µ.
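To make the simulation loop concrete, the following is a minimal Python sketch of the procedure described in this section. It is illustrative only, not the authors' implementation: the geometry and parameter values are taken from the text, the per-millisecond rate conversions are assumptions, and k₊ is set so that K_d = k₋/k₊ matches the quoted value of roughly 15.9 molecules µ⁻².

```python
# Minimal sketch of the relay simulation described above (hypothetical code,
# not the authors' implementation). Units: micrometers and milliseconds.
import numpy as np

rng = np.random.default_rng(0)

DT      = 1.0            # timestep, ms
D       = 0.25           # diffusion constant, um^2/ms
ALPHA   = 1e-3           # decay rate, 1/ms (= 1 /s)
SIGMA   = 0.1            # receptor neighborhood, um
K_MINUS = 1e-3           # unbinding rate, 1/ms (= 1 /s)
# Binding rate per unit concentration, chosen so Kd = K_MINUS/K_PLUS ~ 15.9 molecules/um^2
K_PLUS  = 2 * np.pi * SIGMA**2 * 1e-3
BOX     = np.array([10.0, 5.0])   # volume V, um
XS      = np.array([2.5, 2.5])    # sender location
XR      = np.array([7.5, 2.5])    # receiver location

def local_concentration(pos, x, sigma=SIGMA):
    """Gaussian-kernel estimate of the concentration at x (molecules / um^2), Eq. (1)."""
    if len(pos) == 0:
        return 0.0
    d2 = np.sum((pos - x) ** 2, axis=1)
    return np.sum(np.exp(-d2 / (2 * sigma**2)) / (2 * np.pi * sigma**2))

def simulate(s_of_t, T_ms):
    """s_of_t: secretion rate in molecules/ms; returns the binary output r(t)."""
    pos = np.empty((0, 2))
    bound = False
    r = np.zeros(T_ms, dtype=int)
    for t in range(T_ms):
        # Poisson secretion at the sender
        n_new = rng.poisson(max(s_of_t(t * DT), 0.0) * DT)
        pos = np.vstack([pos, np.tile(XS, (n_new, 1))])
        # Poisson decay
        pos = pos[rng.random(len(pos)) > ALPHA * DT]
        # Gaussian random-walk step with reflecting boundaries
        pos += rng.normal(0.0, np.sqrt(2 * D * DT), size=pos.shape)
        pos = np.where(pos < 0, -pos, pos)
        pos = np.where(pos > BOX, 2 * BOX - pos, pos)
        # Two-state Markov receptor at the receiver
        if bound:
            if rng.random() < K_MINUS * DT:
                bound = False
        else:
            c = local_concentration(pos, XR)
            if rng.random() < K_PLUS * c * DT:
                bound = True
        r[t] = int(bound)
    return r

# Example: 0.05 Hz sinusoidal secretion, 1000 + 1000*sin(2*pi*f*t) molecules per second
r = simulate(lambda t_ms: 1e-3 * (1000 + 1000 * np.sin(2 * np.pi * 0.05e-3 * t_ms)), T_ms=20_000)
print("fraction of time bound:", r.mean())
```

Averaging r(t) over a long run at a constant secretion rate should recover the equilibrium transfer function discussed in the next section.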
3 Results: Estimating Information Capacity via Frequency Response

Communications channels mediated by diffusion and ligand-receptor interaction are nonlinear, with non-Gaussian noise. The expected value of the output signal, 0 ≤ E[r] < 1, is a sigmoidal function of the log concentration for a constant concentration c:

E[r] = c / (c + K_d) = 1 / (1 + e^{−(y − y₀)})    (2)

where y = ln c and y₀ = ln K_d. The mean response saturates for high concentrations, c ≫ K_d, and the noise statistics become markedly Poissonian (rather than Gaussian) for low concentrations.

Several different kinds of stimuli can be used to characterize such a channel. The steady-state response to constant input reflects the static (equilibrium) transfer function. Concentrations ranging from 100 K_d to 0.01 K_d occupy 98% of the steady-state operating range, 0.99 > E[r] > 0.01 [5]. For a finite observation time T, the actual fraction of time spent bound, r̄_T, is distributed about E[r] with a variance that depends on T. The biochemical relay may therefore be used as a binary symmetric channel, randomly selecting a 'high' or 'low' secretion rate and 'decoding' by setting a suitable threshold for r̄_T. As T increases, the variance of r̄_T and the probability of error decrease.

The binary symmetric channel makes only crude use of this signaling mechanism. Other possible communication schemes include sending all-or-none bursts of signaling molecule, as in synaptic transmission, or detecting discrete stepped responses. Here we use the frequency response of the channel to estimate the information capacity of the biochemical channel. For an idealized linear channel with additive white Gaussian noise (AWGN channel), the channel capacity under a mean input power constraint P is given by the so-called "water-filling formula" [4],

C = (1/2) ∫_{ω_min}^{ω_max} log₂( 1 + (ν − N(ω))₊ / N(ω) ) dω    (3)

subject to the constraint

∫_{ω_min}^{ω_max} (ν − N(ω))₊ dω ≤ P    (4)

where the constant ν is the sum of the noise and signal power in the usable frequency range, N(ω) is the power of the additive noise at frequency ω, and (X)₊ denotes the positive part of X. The formula applies when each frequency band (ω, ω + dω) is subject to noise of power N(ω) independently of all other frequency bands, and it reflects the optimal allocation of signal power S(ω) = (ν − N(ω))₊, with greater signal power invested in frequencies at which the noise power is smallest. The capacity C is in bits/second.
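As an illustration of how the water-filling allocation in Eqs. (3) and (4) is computed in practice, here is a small numerical sketch for a generic discretized noise spectrum. It is not part of the paper; the bisection search for the water level ν and the toy noise spectrum are assumptions made for the example.

```python
# Hypothetical sketch of the discretized water-filling computation in Eqs. (3)-(4),
# for a generic set of noise powers N(w) on bands of width dw; not taken from the paper.
import numpy as np

def water_filling(noise, dw, power):
    """Return (capacity in bits/s, water level nu) for noise powers `noise`
    on frequency bands of width `dw`, under the mean power constraint `power`."""
    # Find the water level nu by bisection on sum((nu - N)_+ * dw) = power.
    lo, hi = 0.0, noise.max() + power / (len(noise) * dw)
    for _ in range(100):
        nu = 0.5 * (lo + hi)
        used = np.sum(np.clip(nu - noise, 0.0, None)) * dw
        lo, hi = (nu, hi) if used < power else (lo, nu)
    signal = np.clip(nu - noise, 0.0, None)            # optimal S(w) = (nu - N(w))_+
    capacity = 0.5 * np.sum(np.log2(1.0 + signal / noise)) * dw
    return capacity, nu

# Example: noise rising with frequency; most signal power goes into the quiet low bands.
omega = np.arange(0.01, 1.01, 0.01)                    # Hz
N = 0.01 + 0.2 * omega                                 # toy noise spectrum
C, nu = water_filling(N, dw=0.01, power=0.05)
print(f"capacity ~ {C:.3f} bits/s, water level nu ~ {nu:.3f}")
```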
For an input signal of finite duration T = 100 sec, we can independently specify the amplitudes and phases of its frequency components at ω = 0.01 Hz, 0.02 Hz, …, 500 Hz, where 500 Hz is the Nyquist frequency given a 1 msec simulation timestep. Because the population of secreted signaling molecules decays exponentially with a time constant of 1/α = 1 sec, the concentration signal is unable to pass frequencies ω ≥ 1 Hz (see Figure 2), providing a natural high-frequency cutoff.

Figure 2: Frequency Response of the Biochemical Relay Channel. The sending cell secreted signaling molecules at a mean rate of 1000 + 1000 sin(2πωt) molecules per second. From top to bottom, the input frequencies were 1.0, 0.5, 0.2, 0.1, 0.05, 0.02 and 0.01 Hz. The total signal duration was T = 100 seconds. Left column: total number of molecules in the volume; attenuation of the original signal results from exponential decay of the signaling-molecule population. Right column: a one-second moving average of the output signal r(t), which takes the value one when the receptor molecule is bound to ligand and zero when the receptor is unbound.

Figure 3: Frequency Transmission Spectrum. Noise power N(ω), calculated as the total power in r(t) − r̄ in all frequency components except the input frequency ω. Frequencies were binned in intervals of 0.01 Hz = 1/T. The maximum possible power in r(t) over all frequencies is 0.25; the power successfully transmitted by the channel is given by 0.25/N(ω). The lower curve is N(ω) for input signals of the form s(t) = 1000 + 1000 sin 2πωt, which use the full dynamic range of the receptor. Decreasing the dynamic range used reduces the amount of power transmitted at the sending frequency: the upper curve is N(ω) for signals of the form s(t) = 1000 + 500 sin 2πωt.

For the AWGN channel the input and output signals share the same units (e.g. rms voltage); for the biological relay the input s(t) is in molecules/second while the output r(t) is a function with binary range {0, 1}. The maximum of the mean output power for a binary function r(t) is

(1/T) ∫₀ᵀ |r(t) − r̄|² dt ≤ 1/4.

This total possible output power will be distributed between different frequencies depending on the frequency of the input. We wish to estimate the channel capacity by comparing the portion of the output power present at the sending frequency ω to the limiting output power 0.25. Therefore we set the total output power constant to ν = 0.25. Given a pure sinusoidal input signal s(t) = a₀ + a₁ sin(2πωt), we consider the power in the output spectrum at ω Hz to be the residual power from the input, and the rest of the power in the spectrum of r(t) to be analogous to the additive noise power spectrum N(ω) in the AWGN channel. We calculate N(ω) as the total power of r(t) − r̄ in all frequency bands except ω. For signals of length T = 100 sec, the possible frequencies are discretized at intervals ∆ω = 0.01 Hz.

Because the noise power N(ω) ≤ 0.25, the water-filling formula (3) for the capacity reduces to

C_est = (1/2) ∫_{0.01 Hz}^{1 Hz} log₂( 0.25 / N(ω) ) dω.    (5)

As mentioned above, frequencies ω ≥ 1 Hz do not transmit any information about the signal (see Figure 2) and do not contribute to the capacity. We approximate this integral using linear interpolation of log₂(N(ω)) between the measured values at ω = 0.01, 0.02, 0.05, 0.1, 0.2, 0.5 and 1.0 Hz (see Figure 3). This procedure gives an estimate of the channel capacity, C_est = 0.087 bits/second.
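A possible numerical implementation of this estimate is sketched below: the noise power N(ω) is measured from a simulated output trace r(t) at each probe frequency, and Eq. (5) is evaluated by linearly interpolating log₂ N(ω) on the 0.01 Hz grid. The spectral normalization and the example N(ω) values are assumptions for illustration, not measurements from the paper.

```python
# Hypothetical sketch of the capacity estimate in Eq. (5): measure the noise power N(w)
# from a simulated output r(t) at each probe frequency, then integrate log2(0.25/N(w))
# with linear interpolation of log2 N(w). Variable names and numbers are illustrative.
import numpy as np

DT = 1e-3          # simulation timestep, s
T  = 100.0         # signal duration, s

def noise_power(r, omega):
    """Total power of r(t) - mean(r) in all 0.01 Hz bins except the input frequency omega.
    Intended to be applied to binary output traces r(t) from the simulation sketch above."""
    x = r - r.mean()
    total = np.mean(x ** 2)                            # total power (Parseval)
    spec = np.abs(np.fft.rfft(x)) ** 2 / len(x) ** 2   # one-sided power spectrum
    spec[1:-1] *= 2.0
    freqs = np.fft.rfftfreq(len(x), d=DT)
    k = np.argmin(np.abs(freqs - omega))               # bin carrying the input frequency
    return total - spec[k]

def capacity_estimate(probe_freqs, N_measured):
    """Eq. (5) with linear interpolation of log2 N(w) on a 0.01 Hz grid, in bits/s."""
    grid = np.arange(0.01, 1.0 + 1e-9, 0.01)
    logN = np.interp(grid, probe_freqs, np.log2(N_measured))
    return 0.5 * np.trapz(np.log2(0.25) - logN, grid)

# Example with made-up measurements at the probe frequencies used in the paper:
probe  = np.array([0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0])
N_meas = np.array([0.13, 0.15, 0.18, 0.21, 0.23, 0.245, 0.25])   # illustrative values only
print("Cest ~", capacity_estimate(probe, N_meas), "bits/s")
```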
4 Discussion & Conclusions

Diffusion and the Markov switching between bound and unbound states create a low-pass filter that removes high-frequency information in the biochemical relay channel. A general Poisson-type communications channel, such as is commonly encountered in optical communications engineering, can achieve an arbitrarily large capacity by transmitting high frequencies and high amplitudes, unless bounded by a maximum or mean amplitude constraint [6]. In the biochemical channel, the effective input amplitude is naturally constrained by saturation of the receptor at concentrations above K_d, and high-frequency transmission is limited by the inherent dynamics of the Markov process. Therefore this channel has a finite capacity. The channel capacity estimate we derived, C_est = 0.087 bits/second, seems quite low compared with signaling rates in the nervous system, and would require long signaling times to transfer information successfully. However, temporal dynamics in cellular systems can be quite deliberate; cell-cell communication in the social amoeba Dictyostelium, for example, is achieved by means of a carrier wave with a period of seven minutes. In addition, cells typically possess thousands of copies of the receptors for important signaling molecules, allowing for more complex detection schemes than those investigated here.

Our simplified treatment suggests several avenues for further work. For example, signal-transducing receptors often form Markov chains with more complicated dynamics, reflecting many more than two states [7]. Also, the nonlinear nature of the channel is probably not well served by our additive-noise approximation, and might be better suited to a treatment via multiplicative noise [8]. Whether cells engage in complicated temporal coding/decoding schemes, as has been proposed for neural information processing, or whether instead they achieve efficient communication by evolutionary matching of the noise characteristics of sender and receiver, remains to be investigated. We note that the dependence of the channel capacity C on such parameters as the system geometry, the diffusion and decay constants, the binding constants and the range of the receptor may shed light on evolutionary mechanisms and constraints on communication within cellular biological systems.

Acknowledgments

This work would not have been possible without the generous support of the Howard Hughes Medical Institute and the resources of the Computational Neurobiology Laboratory, Terrence J. Sejnowski, Director.

References

[1] Rappel, W.M., Thomas, P.J., Levine, H. & Loomis, W.F. (2002) Establishing Direction during Chemotaxis in Eukaryotic Cells. Biophysical Journal 83:1361-1367.
[2] Ueda, M., Sako, Y., Tanaka, T., Devreotes, P. & Yanagida, T. (2001) Single Molecule Analysis of Chemotactic Signaling in Dictyostelium Cells. Science 294:864-867.
[3] Detwiler, P.B., Ramanathan, S., Sengupta, A. & Shraiman, B.I. (2000) Engineering Aspects of Enzymatic Signal Transduction: Photoreceptors in the Retina. Biophysical Journal 79:2801-2817.
[4] Cover, T.M. & Thomas, J.A. (1991) Elements of Information Theory. New York: Wiley.
[5] Getz, W.M. & Lansky, P. (2001) Receptor Dissociation Constants and the Information Entropy of Membranes Coding Ligand Concentration. Chem. Senses 26:95-104.
[6] Frey, R.M. (1991) Information Capacity of the Poisson Channel. IEEE Transactions on Information Theory 37(2):244-256.
[7] Uteshev, V.V. & Pennefather, P.S. (1997) Analytical Description of the Activation of Multi-State Receptors by Continuous Neurotransmitter Signals at Brain Synapses. Biophysical Journal 72:1127-1134.
[8] Mitra, P.P. & Stark, J.B. (2001) Nonlinear limits to the information capacity of optical fibre communications. Nature 411:1027-1030.

2 0.51932091 122 nips-2003-Margin Maximizing Loss Functions

Author: Saharon Rosset, Ji Zhu, Trevor J. Hastie

Abstract: Margin maximizing properties play an important role in the analysis of classification models, such as boosting and support vector machines. Margin maximization is theoretically interesting because it facilitates generalization error analysis, and practically interesting because it presents a clear geometric interpretation of the models being built. We formulate and prove a sufficient condition for the solutions of regularized loss functions to converge to margin maximizing separators, as the regularization vanishes. This condition covers the hinge loss of SVM, the exponential loss of AdaBoost and logistic regression loss. We also generalize it to multi-class classification problems, and present margin maximizing multiclass versions of logistic regression and support vector machines. 1
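The abstract above refers to three margin-based losses: the hinge loss of SVM, the exponential loss of AdaBoost, and the logistic regression loss. A tiny, generic sketch of these losses as functions of the margin m = y·f(x) follows; it is illustrative only and is not code from the paper.

```python
# Generic margin-based losses referenced in the abstract (hinge, exponential, logistic),
# written as functions of the margin m = y * f(x); illustrative only, not the paper's code.
import numpy as np

def hinge(m):        # SVM
    return np.maximum(0.0, 1.0 - m)

def exponential(m):  # AdaBoost
    return np.exp(-m)

def logistic(m):     # logistic regression (binomial deviance)
    return np.log1p(np.exp(-m))

m = np.linspace(-2, 3, 6)
for name, loss in [("hinge", hinge), ("exponential", exponential), ("logistic", logistic)]:
    print(name, np.round(loss(m), 3))
```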

3 0.38066113 93 nips-2003-Information Dynamics and Emergent Computation in Recurrent Circuits of Spiking Neurons

Author: Thomas Natschläger, Wolfgang Maass

Abstract: We employ an efficient method using Bayesian and linear classifiers for analyzing the dynamics of information in high-dimensional states of generic cortical microcircuit models. It is shown that such recurrent circuits of spiking neurons have an inherent capability to carry out rapid computations on complex spike patterns, merging information contained in the order of spike arrival with previously acquired context information. 1

4 0.38016748 107 nips-2003-Learning Spectral Clustering

Author: Francis R. Bach, Michael I. Jordan

Abstract: Spectral clustering refers to a class of techniques which rely on the eigenstructure of a similarity matrix to partition points into disjoint clusters with points in the same cluster having high similarity and points in different clusters having low similarity. In this paper, we derive a new cost function for spectral clustering based on a measure of error between a given partition and a solution of the spectral relaxation of a minimum normalized cut problem. Minimizing this cost function with respect to the partition leads to a new spectral clustering algorithm. Minimizing with respect to the similarity matrix leads to an algorithm for learning the similarity matrix. We develop a tractable approximation of our cost function that is based on the power method of computing eigenvectors. 1
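For readers unfamiliar with the technique, the following is a generic sketch of the standard normalized-cut spectral clustering pipeline (similarity matrix, normalized Laplacian, leading eigenvectors, k-means). It is a baseline illustration only and does not implement the new cost function or the similarity-matrix learning algorithm proposed in this paper.

```python
# Generic normalized-cut spectral clustering sketch (standard pipeline, not the cost
# function or learning algorithm introduced in this paper).
import numpy as np

def spectral_clustering(W, k, n_iter=100, seed=0):
    """W: symmetric nonnegative similarity matrix; returns cluster labels in {0..k-1}."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    L_sym = np.eye(len(W)) - D_inv_sqrt @ W @ D_inv_sqrt     # normalized Laplacian
    eigvals, eigvecs = np.linalg.eigh(L_sym)
    U = eigvecs[:, :k]                                       # k smallest eigenvectors
    U = U / np.maximum(np.linalg.norm(U, axis=1, keepdims=True), 1e-12)
    # Simple k-means on the row-normalized spectral embedding
    rng = np.random.default_rng(seed)
    centers = U[rng.choice(len(U), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((U[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([U[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

# Toy example: two well-separated blobs with a Gaussian similarity
X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 5.0])
W = np.exp(-((X[:, None, :] - X[None]) ** 2).sum(-1) / 2.0)
print(spectral_clustering(W, k=2))
```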

5 0.37985703 20 nips-2003-All learning is Local: Multi-agent Learning in Global Reward Games

Author: Yu-han Chang, Tracey Ho, Leslie P. Kaelbling

Abstract: In large multiagent games, partial observability, coordination, and credit assignment persistently plague attempts to design good learning algorithms. We provide a simple and efficient algorithm that in part uses a linear system to model the world from a single agent’s limited perspective, and takes advantage of Kalman filtering to allow an agent to construct a good training signal and learn an effective policy. 1
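The abstract mentions using a linear world model with Kalman filtering to construct a training signal. As background only, here is a generic one-dimensional Kalman filter sketch for tracking a drifting quantity from noisy observations; it is not the paper's algorithm, and all model parameters are illustrative.

```python
# Generic 1-D Kalman filter sketch (standard algorithm, not the paper's specific
# multi-agent construction): track a slowly drifting hidden quantity from noisy
# observations. All parameters here are illustrative.
import numpy as np

def kalman_filter(observations, q=0.01, r=1.0):
    """q: process-noise variance, r: observation-noise variance. Returns state estimates."""
    x_hat, p = 0.0, 1.0          # initial estimate and its variance
    estimates = []
    for z in observations:
        p = p + q                              # predict: random-walk state model
        k_gain = p / (p + r)                   # Kalman gain
        x_hat = x_hat + k_gain * (z - x_hat)   # update with the new observation z
        p = (1.0 - k_gain) * p
        estimates.append(x_hat)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_signal = np.cumsum(rng.normal(0, 0.1, size=200))   # drifting hidden signal
obs = true_signal + rng.normal(0, 1.0, size=200)         # noisy observations
est = kalman_filter(obs)
print("final estimate:", est[-1], "true value:", true_signal[-1])
```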

6 0.37759736 179 nips-2003-Sparse Representation and Its Applications in Blind Source Separation

7 0.37604031 80 nips-2003-Generalised Propagation for Fast Fourier Transforms with Partial or Missing Data

8 0.37533635 161 nips-2003-Probabilistic Inference in Human Sensorimotor Processing

9 0.37413329 73 nips-2003-Feature Selection in Clustering Problems

10 0.37412322 30 nips-2003-Approximability of Probability Distributions

11 0.37368584 126 nips-2003-Measure Based Regularization

12 0.3734366 115 nips-2003-Linear Dependent Dimensionality Reduction

13 0.37302476 4 nips-2003-A Biologically Plausible Algorithm for Reinforcement-shaped Representational Learning

14 0.37247154 104 nips-2003-Learning Curves for Stochastic Gradient Descent in Linear Feedforward Networks

15 0.3724387 79 nips-2003-Gene Expression Clustering with Functional Mixture Models

16 0.37229794 82 nips-2003-Geometric Clustering Using the Information Bottleneck Method

17 0.37199268 125 nips-2003-Maximum Likelihood Estimation of a Stochastic Integrate-and-Fire Neural Model

18 0.37197727 66 nips-2003-Extreme Components Analysis

19 0.37187362 162 nips-2003-Probabilistic Inference of Speech Signals from Phaseless Spectrograms

20 0.37167937 81 nips-2003-Geometric Analysis of Constrained Curves