
179 nips-2002: Scaling of Probability-Based Optimization Algorithms


Source: pdf

Author: J. L. Shapiro

Abstract: Population-based Incremental Learning is shown to require very sensitive scaling of its learning rate. The learning rate must scale with the system size in a problem-dependent way. This is shown in two problems: the needle-in-a-haystack, in which the learning rate must vanish exponentially in the system size, and a smooth function, in which the learning rate must vanish like the inverse square root of the system size. Two methods are proposed for removing this sensitivity. A learning dynamics which obeys detailed balance is shown to give consistent performance over the entire range of learning rates. An analog of mutation is shown to require a learning rate which scales as the inverse system size, but is problem independent.
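For context, the sketch below shows the standard PBIL update the abstract refers to, as introduced by Baluja [1]: a probability vector over the bits is nudged toward the best sampled string at a learning rate alpha. The function name, population size, and the one-max fitness are illustrative assumptions, not details taken from the paper; setting alpha proportional to 1/sqrt(N) merely echoes the smooth-function scaling discussed in the abstract.

```python
import numpy as np

def pbil(fitness, n, alpha=0.01, pop_size=20, generations=1000, rng=None):
    """Minimal PBIL sketch: maintain a probability vector p over n bits,
    sample a population from it, and move p toward the best sample."""
    rng = np.random.default_rng() if rng is None else rng
    p = np.full(n, 0.5)                      # start unbiased
    best, best_f = None, -np.inf
    for _ in range(generations):
        pop = rng.random((pop_size, n)) < p  # sample binary strings from p
        f = np.array([fitness(x) for x in pop])
        winner = pop[f.argmax()]
        p = (1.0 - alpha) * p + alpha * winner   # learning-rate update
        if f.max() > best_f:
            best, best_f = winner.copy(), f.max()
    return best, best_f, p

# Illustrative smooth test function (one-max); alpha ~ 1/sqrt(N) echoes the
# abstract's scaling for smooth functions.
n = 100
best, best_f, p = pbil(lambda x: x.sum(), n, alpha=1.0 / np.sqrt(n))
```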


reference text

[1] S. Baluja. Population-based incremental learning: A method for integrating genetic search based function optimization and competitive learning. Technical Report CMU-CS-94-163, Computer Science Department, Carnegie Mellon University, 1994.

[2] A. Johnson and J. L. Shapiro. The importance of selection mechanisms in distribution estimation algorithms. In Proceedings of the 5th International Conference on Artificial Evolution AE01, 2001.

[3] P. Larrañaga and J. A. Lozano. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation. Kluwer Academic Publishers, 2001.

[4] Eckhard Limpert, Werner A. Stahel, and Markus Abbt. Log-normal distributions across the sciences: Keys and clues. BioScience, 51(5):341-352, 2001.

[5] H. Mühlenbein. The equation for response to selection and its use for prediction. Evolutionary Computation, 5(3):303-346, 1997.

[6] M. Pelikan, D. E. Goldberg, and F. Lobo. A survey of optimization by building and using probabilistic models. Technical report, University of Illinois at Urbana-Champaign, Illinois Genetic Algorithms Laboratory, 1999.

[7] Jonathan L. Shapiro and Adam Prügel-Bennett. Maximum entropy analysis of genetic algorithm operators. Lecture Notes in Computer Science, 993:14-24, 1995.