
Noise-Enhanced Associative Memories (NIPS 2013, paper 210)


Source: pdf

Author: Amin Karbasi, Amir Hesam Salavati, Amin Shokrollahi, Lav R. Varshney

Abstract: Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms allow reliable learning and recall of an exponential number of patterns. Though these designs correct external errors in recall, they assume neurons compute noiselessly, in contrast to highly variable neurons in hippocampus and olfactory cortex. Here we consider associative memories with noisy internal computations and analytically characterize performance. As long as internal noise is less than a specified threshold, error probability in the recall phase can be made exceedingly small. More surprisingly, we show internal noise actually improves performance of the recall phase. Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks.


References

[1] A. Treves and E. T. Rolls, “Computational analysis of the role of the hippocampus in memory,” Hippocampus, vol. 4, pp. 374–391, Jun. 1994.

[2] D. A. Wilson and R. M. Sullivan, “Cortical processing of odor objects,” Neuron, vol. 72, pp. 506–519, Nov. 2011.

[3] J. J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities,” Proc. Natl. Acad. Sci. U.S.A., vol. 79, pp. 2554–2558, Apr. 1982.

[4] R. J. McEliece, E. C. Posner, E. R. Rodemich, and S. S. Venkatesh, “The capacity of the Hopfield associative memory,” IEEE Trans. Inf. Theory, vol. IT-33, pp. 461–482, Jul. 1987.

[5] D. J. Amit and S. Fusi, “Learning in neural networks with material synapses,” Neural Comput., vol. 6, pp. 957–982, Sep. 1994.

[6] B. A. Olshausen and D. J. Field, “Sparse coding of sensory inputs,” Curr. Opin. Neurobiol., vol. 14, pp. 481–487, Aug. 2004.

[7] A. A. Koulakov and D. Rinberg, “Sparse incomplete representations: A potential role of olfactory granule cells,” Neuron, vol. 72, pp. 124–136, Oct. 2011.

[8] A. H. Salavati and A. Karbasi, “Multi-level error-resilient neural networks,” in Proc. 2012 IEEE Int. Symp. Inf. Theory, Jul. 2012, pp. 1064–1068.

[9] A. Karbasi, A. H. Salavati, and A. Shokrollahi, “Iterative learning and denoising in convolutional neural associative memories,” in Proc. 30th Int. Conf. Mach. Learn. (ICML 2013), Jun. 2013, pp. 445–453.

[10] N. Brunel, V. Hakim, P. Isope, J.-P. Nadal, and B. Barbour, “Optimal information storage and the distribution of synaptic weights: Perceptron versus Purkinje cell,” Neuron, vol. 43, pp. 745–757, Sep. 2004.

[11] L. R. Varshney, P. J. Sjöström, and D. B. Chklovskii, “Optimal information storage in noisy synapses under resource constraints,” Neuron, vol. 52, pp. 409–423, Nov. 2006.

[12] C. Koch, Biophysics of Computation. New York: Oxford University Press, 1999.

[13] M. D. McDonnell and L. M. Ward, “The benefits of noise in neural systems: bridging theory and experiment,” Nat. Rev. Neurosci., vol. 12, pp. 415–426, Jul. 2011.

[14] H. Chen, P. K. Varshney, S. M. Kay, and J. H. Michels, “Theory of the stochastic resonance effect in signal detection: Part I–fixed detectors,” IEEE Trans. Signal Process., vol. 55, pp. 3172–3184, Jul. 2007.

[15] D. A. Spielman and S.-H. Teng, “Smoothed analysis of algorithms: Why the simplex algorithm usually takes polynomial time,” J. ACM, vol. 51, pp. 385–463, May 2004.

[16] D. J. Amit, Modeling Brain Function. Cambridge: Cambridge University Press, 1992.

[17] M. G. Taylor, “Reliable information storage in memories designed from unreliable components,” Bell Syst. Tech. J., vol. 47, pp. 2299–2337, Dec. 1968.

[18] A. V. Kuznetsov, “Information storage in a memory assembled from unreliable components,” Probl. Inf. Transm., vol. 9, pp. 100–114, Jul.–Sep. 1973.

[19] L. R. Varshney, “Performance of LDPC codes under faulty iterative decoding,” IEEE Trans. Inf. Theory, vol. 57, pp. 4427–4444, Jul. 2011.

[20] V. Gripon and C. Berrou, “Sparse neural networks with large learning diversity,” IEEE Trans. Neural Netw., vol. 22, pp. 1087–1096, Jul. 2011.

[21] P. Vincent, H. Larochelle, Y. Bengio, and P.-A. Manzagol, “Extracting and composing robust features with denoising autoencoders,” in Proc. 25th Int. Conf. Mach. Learn. (ICML 2008), Jul. 2008, pp. 1096–1103.

[22] Q. V. Le, J. Ngiam, Z. Chen, D. Chia, P. W. Koh, and A. Y. Ng, “Tiled convolutional neural networks,” in Advances in Neural Information Processing Systems 23, J. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R. S. Zemel, and A. Culotta, Eds. Cambridge, MA: MIT Press, 2010, pp. 1279–1287.

[23] A. Karbasi, A. H. Salavati, A. Shokrollahi, and L. R. Varshney, “Noise-enhanced associative memories,” arXiv preprint, 2013.

[24] M. G. Luby, M. Mitzenmacher, M. A. Shokrollahi, and D. A. Spielman, “Efficient erasure correcting codes,” IEEE Trans. Inf. Theory, vol. 47, pp. 569–584, Feb. 2001.

[25] T. Richardson and R. Urbanke, Modern Coding Theory. Cambridge: Cambridge University Press, 2008.

[26] M. Yoshida, H. Hayashi, K. Tateno, and S. Ishizuka, “Stochastic resonance in the hippocampal CA3–CA1 model: a possible memory recall mechanism,” Neural Netw., vol. 15, pp. 1171–1183, Dec. 2002.

[27] R. Sarpeshkar, “Analog versus digital: Extrapolating from electronics to neurobiology,” Neural Comput., vol. 10, pp. 1601–1638, Oct. 1998.

[28] N. H. Mackworth, “Effects of heat on wireless telegraphy operators hearing and recording Morse messages,” Br. J. Ind. Med., vol. 3, pp. 143–158, Jul. 1946.