A Hierarchical Hybrid Evolutionary Computation for Continuous Function Optimization


Said Mohamed Said
Senlin Guan
Morikazu Nakamura

Abstract

In this paper, we propose a hybrid master/slave approach to optimization problems based on estimation of distribution algorithms (EDAs) and genetic algorithms (GAs). At each iteration, the master process estimates the probability distribution of the search space under a non-dependency model and sends the resulting probability vectors to the slaves. Each slave uses its vector to generate a new initial population for its GA operations. We employ the simplest probability model and compensate for its reduced accuracy by applying GAs to the solutions sampled from it. Moreover, our method can be combined with different search strategies and is easily parallelized. Finally, we conduct experiments to verify the effectiveness of our method.
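The master/slave loop described above can be sketched roughly as follows. This is an illustrative Python sketch under simplifying assumptions, not the paper's implementation: we use one independent Gaussian per variable as the non-dependency model, a toy real-coded GA on each (here sequential) slave, and the sphere function as the objective; all function names and parameters are ours.

```python
import random

random.seed(1)

def sphere(x):
    # Toy benchmark objective: minimize the sum of squares (optimum at 0).
    return sum(v * v for v in x)

def slave_ga(init_pop, fitness, gens=30, sigma=0.1):
    # Minimal real-coded GA run by each slave on its sampled population:
    # truncation selection, arithmetic crossover, Gaussian mutation.
    pop = [list(ind) for ind in init_pop]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: len(pop) // 2]
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            children.append([(x + y) / 2 + random.gauss(0, sigma)
                             for x, y in zip(a, b)])
        pop = parents + children
    return sorted(pop, key=fitness)

def master(dim=5, n_slaves=4, pop_size=20, iters=10):
    # Master keeps one independent Gaussian per variable (the non-dependency
    # model), samples an initial population for each slave, and refits the
    # model from the best solutions the slaves return.
    mean, std = [0.0] * dim, [2.0] * dim
    best = None
    for _ in range(iters):
        elites = []
        for _ in range(n_slaves):
            init = [[random.gauss(m, s) for m, s in zip(mean, std)]
                    for _ in range(pop_size)]
            elites.extend(slave_ga(init, sphere)[:5])
        elites.sort(key=sphere)
        if best is None or sphere(elites[0]) < sphere(best):
            best = list(elites[0])
        top = elites[: len(elites) // 2]
        for j in range(dim):
            col = [ind[j] for ind in top]
            mean[j] = sum(col) / len(col)
            var = sum((v - mean[j]) ** 2 for v in col) / len(col)
            std[j] = max(var ** 0.5, 1e-3)  # floor keeps sampling alive
    return best

best = master()
print(sphere(best))
```

In this sketch the only information exchanged between master and slaves is the model parameters and the elite solutions, which is what makes the scheme straightforward to parallelize.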


How to Cite
Said Mohamed Said, Senlin Guan, & Morikazu Nakamura. (2012). A Hierarchical Hybrid Evolutionary Computation for Continuous Function Optimization. International Journal of Next-Generation Computing, 3(1), 13–28. https://doi.org/10.47164/ijngc.v3i1.25

References

  1. Alba, E., Ed. 2005. Parallel Metaheuristics: A New Class of Algorithms. Wiley-Interscience.
  2. Baluja, S. 1994. Population-based incremental learning: A method for integrating genetic search based function optimization and competitive learning. Technical Report CMU-CS-94-163, Carnegie Mellon University.
  3. Bosman, P. A. N. and Thierens, D. 2000. Continuous iterated density estimation evolutionary algorithms within the IDEA framework. In Proc. of the 2000 Genetic and Evolutionary Computation Conference Workshop Program. 197–200.
  4. Davis, L. 1991. Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York.
  5. Dorigo, M. 1996. Ant system: optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics - Part B 26, 29–41.
  6. Eberhart, R. C. and Shi, Y. 2001. Particle swarm optimization: Developments, applications and resources. In Proc. of the IEEE Congress on Evolutionary Computation 1, 81–86.
  7. Gagné, C. and Parizeau, M. 2003. A robust master-slave distribution architecture for evolutionary computations. In Proc. of the Genetic and Evolutionary Computation Conference, Late Breaking Papers 2003. 80–87.
  8. Gallagher, M. R., Frean, M., and Downs, T. 1999. Real-valued evolutionary optimization using a flexible probability density estimator. In Proc of Genetic and Evolutionary Computation Conference. 840–846.
  9. Gong, Y. and Nakamura, M. 2008. Migration effects of parallel genetic algorithms on line topologies of heterogeneous computing resources. IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E91-A(4), 1121–1128.
  10. Harik, G., Lobo, F. G., and Goldberg, D. E. 1998. The compact genetic algorithm. In Proc. of the IEEE Conference on Evolutionary Computation. 523–528.
  11. Holland, J. H. 1975. Adaptation in Natural and Artificial Systems. The University of Michigan Press; reprinted by MIT Press (1992).
  12. Hu, X. 2004. Recent advances in particle swarm. In Proc. of the IEEE Congress on Evolutionary Computation 1, 90–97.
  13. Zhang, J., Chen, W., Zhong, J., Tan, Z., and Li, Y. 2006. Continuous function optimization using hybrid ant colony approach with orthogonal design scheme. In SEAL 2006, Lecture Notes in Computer Science 4247. 126–133.
  14. Larrañaga, P. and Lozano, J. A. 2002. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation. Kluwer Academic Publishers.
  15. Mendiburu-Alberro, A. 2006. Parallel implementation of estimation of distribution algorithms based on probabilistic graphical models: Application to chemical calibration models. Ph.D. thesis, The University of the Basque Country.
  16. Mühlenbein, H. 1998. The equation for response to selection and its use for prediction. Evolutionary Computation 5, 303–346.
  17. Mühlenbein, H. and Paaß, G. 1996. From recombination of genes to the estimation of distributions I. Binary parameters. In Lecture Notes in Computer Science: Parallel Problem Solving from Nature - PPSN IV.
  18. Ocenasek, J. 2002. Parallel estimation of distribution algorithms. Ph.D. thesis, Brno University of Technology.
  19. Pelikan, M., Goldberg, D., and Cantu-Paz, E. 1999. BOA: The Bayesian optimization algorithm. In Proc. of the Genetic and Evolutionary Computation Conference. 525–532.
  20. Said, S. M. and Nakamura, M. 2010. A hybrid approach of EDAs and GAs based on master/slave cooperation for continuous function optimization. In Proc. of the Second World Congress on Nature and Biologically Inspired Computing (NaBIC 2010). 244–248.
  21. Salomon, R. 1996. Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions: A survey of some theoretical and practical aspects of genetic algorithms. BioSystems 39, 263–278. Elsevier.
  22. Sebag, M. and Ducoulombier, A. 1998. Extending population-based incremental learning to continuous search spaces. In Proc. of Parallel Problem Solving from Nature - PPSN V. 418–427.