
Evolution Strategies (ES)

Evolution Strategies (ES) were originally proposed in the 1960s by two German students, Rechenberg and Schwefel, at the Technical University of Berlin (TUB). For a detailed introduction to the interesting history of ES, we strongly suggest referring to, e.g., [Beyer, 2023, GECCO], [Hansen et al., 2015], [Bäck et al., 2013], [Beyer&Schwefel, 2002], and especially the recollections of two early pioneers (e.g., [ACM SIGEVOlution, 2008], [ACM SIGEVOlution, 2010]). In this book, we mainly focus on modern ES versions and variants: Covariance Matrix Adaptation ES (CMA-ES/MA-ES), Natural ES (NES), Limited-Memory CMA (LM-CMA), OpenAI-ES, Persistent ES (PES), and Meta-/Distributed ES (DES). Although ES are one of the three earliest (1960s) families of evolutionary algorithms, alongside genetic algorithms and evolutionary programming, they are still widely studied in the evolutionary computation community and still applied to approximate the global/local optimum of many (though not all) challenging real-world problems. If novel and powerful ES variants emerge in the future, we expect to add them as soon as possible (reflecting the open nature of this book).

Self-Adaptation and Covariance Matrix Adaptation ES (CMA-ES)

Self-adaptation is widely recognized as the essential feature of ES; it has led to (at least) one powerful ES version, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES).
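
To make the idea concrete, here is a minimal sketch of self-adaptation, assuming nothing beyond NumPy: a (1,λ)-ES in which each offspring first mutates its own step size via a log-normal perturbation and then mutates the search point (our illustration, not code from any of the cited works).

```python
import numpy as np

def self_adaptive_es(f, x0, sigma0=1.0, lam=10, generations=200, seed=0):
    """Minimal (1, lambda)-ES with log-normal step-size self-adaptation."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    tau = 1.0 / np.sqrt(n)  # standard learning rate for the mutation strength
    x, sigma = np.asarray(x0, dtype=float), sigma0
    for _ in range(generations):
        # Each offspring first mutates its own step size, then its search point.
        sigmas = sigma * np.exp(tau * rng.standard_normal(lam))
        offspring = x + sigmas[:, None] * rng.standard_normal((lam, n))
        best = np.argmin([f(child) for child in offspring])
        x, sigma = offspring[best], sigmas[best]  # comma selection: parents are discarded
    return x, sigma

# Example: minimize the 5-dimensional sphere function.
x_best, sigma_best = self_adaptive_es(lambda x: float(np.sum(x**2)), x0=np.ones(5))
```

The key point is that the step size is inherited and selected together with the solution, so the mutation strength adapts without any externally tuned schedule.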

CMA-ES can serve as a strong baseline (if not the state of the art) on some BBO problems, such as offline design of biological sequences (where "for TFbind8, which has a relatively small search space, CMA-ES gave pretty good performances"), informative path planning (where "CMA-ES finds a good trade-off between exploration and exploitation, resulting in the best overall performance among non-learning solvers"), task-constrained planning for robot manipulators in confined environments, and reinforcement learning with human feedback (RLHF). On many (though not all) hard BBO problems, CMA-ES has shown very competitive (and sometimes even state-of-the-art) performance (see the following section for some of its representative applications).
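
As a usage sketch, assuming the pycma reference implementation (pip install cma); any black-box objective f(x) -> float can replace the built-in Rosenbrock test function:

```python
import cma  # pip install cma

# The initial mean (here the origin in 8 dimensions) and the initial step
# size 0.5 define the starting search distribution.
es = cma.CMAEvolutionStrategy(8 * [0.0], 0.5)
es.optimize(cma.ff.rosen)  # cma.ff.rosen is pycma's built-in Rosenbrock function
print(es.result.xbest, es.result.fbest)
```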

Surrogate-assisted CMA-ES has been applied in, e.g., computational chemistry and synthesizability-constrained molecular design.

Natural Gradients and Natural Evolution Strategies (NES)
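
In brief, NES ascends the natural gradient of the expected fitness $J(\theta) = \mathbb{E}_{x \sim \pi(\cdot\,|\,\theta)}[f(x)]$ of a parameterized search distribution $\pi(\cdot\,|\,\theta)$; in the standard presentation (see, e.g., [Ollivier et al., 2017] for the information-geometric view):

```math
\nabla_\theta J(\theta) = \mathbb{E}_{x \sim \pi(\cdot\,|\,\theta)}\big[f(x)\,\nabla_\theta \log \pi(x\,|\,\theta)\big],
\qquad
\theta \leftarrow \theta + \eta\,\mathbf{F}^{-1}\nabla_\theta J(\theta),
```

where $\mathbf{F}$ is the Fisher information matrix of $\pi(\cdot\,|\,\theta)$ and $\eta$ is the learning rate. Preconditioning by $\mathbf{F}^{-1}$ makes the update invariant to reparameterization of $\theta$.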

Gradient Estimation (OpenAI-ES) and Variance Reduction (Persistent ES)
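
For reference, the OpenAI-ES gradient estimator with antithetic (mirrored) sampling, a standard variance-reduction device, can be sketched as follows; this is a minimal NumPy illustration under the maximization convention, not the distributed implementation (Persistent ES further extends such estimators to truncated inner-loop unrolls):

```python
import numpy as np

def openai_es_gradient(f, theta, sigma=0.1, n_pairs=50, rng=None):
    """Antithetic ES estimate of the gradient of E[f(theta + sigma * eps)]."""
    rng = rng or np.random.default_rng(0)
    eps = rng.standard_normal((n_pairs, len(theta)))
    # Mirrored sampling: evaluating +eps and -eps reduces the estimator's
    # variance relative to independent perturbations at the same budget.
    f_plus = np.array([f(theta + sigma * e) for e in eps])
    f_minus = np.array([f(theta - sigma * e) for e in eps])
    return ((f_plus - f_minus)[:, None] * eps).sum(axis=0) / (2 * n_pairs * sigma)

# One ascent step on a toy objective (maximize -||x||^2).
theta = np.zeros(3)
theta += 0.1 * openai_es_gradient(lambda x: -float(np.sum(x**2)), theta)
```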

Meta-ES and Distributed ES (DES)

Typical Optimization Applications of ES

Notes

  1. The term Evolutionary Strategies is also sometimes used in the literature.
  2. We regret that this book does not give the original German references of ES, since we only consider English-language sources here.

Reference

  • Bäck, T.H., Kononova, A.V., van Stein, B., Wang, H., Antonov, K.A., Kalkreuth, R.T., de Nobel, J., Vermetten, D., de Winter, R. and Ye, F., 2023. Evolutionary algorithms for parameter optimization—Thirty years later. Evolutionary Computation, 31(2), pp.81-122.
  • Lee, U.H., Shetty, V.S., Franks, P.W., Tan, J., Evangelopoulos, G., Ha, S. and Rouse, E.J., 2023. User preference optimization for control of ankle exoskeletons using sample efficient active learning. Science Robotics, 8(83), p.eadg3705.
  • Thamm, M. and Rosenow, B., 2023. Machine learning optimization of Majorana hybrid nanowires. Physical Review Letters, 130(11), p.116202.
  • Lin, X., Yang, Z., Zhang, X. and Zhang, Q., 2023, July. Continuation path learning for homotopy optimization. In International Conference on Machine Learning (pp. 21288-21311). PMLR.
  • Antonova, R., Yang, J., Jatavallabhula, K.M. and Bohg, J., 2023, March. Rethinking optimization with differentiable simulation from a global perspective. In Conference on Robot Learning (pp. 276-286). PMLR.
  • Lange, R.T., 2023, July. Evosax: Jax-based evolution strategies. In Proceedings of ACM Companion Conference on Genetic and Evolutionary Computation (pp. 659-662).
  • Wanzenböck, R., Buchner, F., Kovács, P., Madsen, G.K. and Carrete, J., 2023. Clinamen2: Functional-style evolutionary optimization in Python for atomistic structure searches. Computer Physics Communications, p.109065.
  • Van der Meersch, V. and Chuine, I., 2023. Estimating process‐based model parameters from species distribution data using the evolutionary algorithm CMA‐ES. Methods in Ecology and Evolution.
  • Giani, T., Magni, G. and Rojo, J., 2023. SMEFiT: A flexible toolbox for global interpretations of particle physics data with effective field theories. European Physical Journal C, 83(5), p.393.
  • Li, Q., Zhang, C. and Woodland, P.C., 2023. Combining hybrid DNN-HMM ASR systems with attention-based models using lattice rescoring. Speech Communication, 147, pp.12-21.
  • Li, A.C., Macridin, A., Mrenna, S. and Spentzouris, P., 2023. Simulating scalar field theories on quantum computers with limited resources. Physical Review A, 107(3), p.032603.
  • Bonet-Monroig, X., Wang, H., Vermetten, D., Senjean, B., Moussa, C., Bäck, T., Dunjko, V. and O'Brien, T.E., 2023. Performance comparison of optimization methods on variational quantum algorithms. Physical Review A, 107(3), p.032407.
  • Chen, C., Kuvshinov, A., Kruglyakov, M., Munch, F. and Rigaud, R., 2023. Constraining the crustal and mantle conductivity structures beneath islands by a joint inversion of multi‐source magnetic transfer functions. Journal of Geophysical Research: Solid Earth, 128(1), p.e2022JB024106.
  • Real, E., Chen, Y., Rossini, M., de Souza, C., Garg, M., Verghese, A., Firsching, M., Le, Q.V., Cubuk, E.D. and Park, D.H., 2023. AutoNumerics-Zero: Automated discovery of state-of-the-art mathematical functions. arXiv preprint arXiv:2312.08472.
  • Shen, M., Ghosh, S., Sattigeri, P., Das, S., Bu, Y. and Wornell, G., 2023. Reliable gradient-free and likelihood-free prompt tuning. arXiv preprint arXiv:2305.00593.
  • Ciarella, S., Chiappini, M., Boattini, E., Dijkstra, M. and Janssen, L.M., 2023. Dynamics of supercooled liquids from static averaged quantities using machine learning. Machine Learning: Science and Technology, 4(2), p.025010.
  • Tjanaka, B., Fontaine, M.C., Lee, D.H., Kalkar, A. and Nikolaidis, S., 2023. Training diverse high-dimensional controllers by scaling covariance matrix adaptation map-annealing. IEEE Robotics and Automation Letters.
  • Tiboni, G., Arndt, K. and Kyrki, V., 2023. DROPO: Sim-to-real transfer with offline domain randomization. Robotics and Autonomous Systems, 166, p.104432.
  • Liu, W., Leahy, K., Serlin, Z. and Belta, C., 2023, May. Robust multi-agent coordination from catl+ specifications. In American Control Conference (pp. 3529-3534). IEEE.
  • Soni, R., Harnack, D., Isermann, H., Fushimi, S., Kumar, S. and Kirchner, F., 2023, October. End-to-end reinforcement learning for torque based variable height hopping. In IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 7531-7538). IEEE.
  • Thomaser, A., Vogt, M.E., Kononova, A.V. and Bäck, T., 2023, March. Transfer of multi-objectively tuned CMA-ES parameters to a vehicle dynamics problem. In International Conference on Evolutionary Multi-Criterion Optimization (pp. 546-560). Cham: Springer Nature Switzerland.
  • Eichner, T., Hülsenbusch, T., Palmer, G. and Maier, A.R., 2023. Evolutionary optimization and long-term stabilization of a white-light seeded two-stage OPCPA seed laser. Optics Express, 31(22), pp.36915-36927.
  • Hu, Z., Wolle, R., Tian, M., Guan, Q., Humble, T. and Jiang, W., 2023, September. Toward consistent high-fidelity quantum learning on unstable devices via efficient in-situ calibration. In IEEE International Conference on Quantum Computing and Engineering (pp. 848-858). IEEE.
  • Oh, C., Hwang, H., Lee, H.Y., Lim, Y., Jung, G., Jung, J., Choi, H. and Song, K., 2023. BlackVIP: Black-box visual prompting for robust transfer learning. In Proceedings of IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 24224-24235).
  • Irwin, B., Haber, E., Gal, R. and Ziv, A., 2023, July. Neural network accelerated implicit filtering: Integrating neural network surrogates with provably convergent derivative free optimization methods. In International Conference on Machine Learning (pp. 14376-14389). PMLR. [ "These benefits include NNAIF’s ability to minimize structured functions of several thousand variables much more rapidly than well-known alternatives, such as Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and finite difference based variants of gradient descent (GD) and BFGS, as well as its namesake IF." ]
  • Xie, D., Hu, P., Sun, X., Pirk, S., Zhang, J., Mech, R. and Kaufman, A.E., 2023. GAIT: Generating aesthetic indoor tours with deep reinforcement learning. In Proceedings of IEEE/CVF International Conference on Computer Vision (pp. 7409-7419). [ "GAIT-DrQ-v2 and GAITCURL performs generally better than CMA-ES, except the reward with CMA-ES is better than GAIT-DrQ-v2 in the scene of Apartment." ]
  • De Croon, G.C., Dupeyroux, J.J., De Wagter, C., Chatterjee, A., Olejnik, D.A. and Ruffier, F., 2022. Accommodating unobservability to control flight attitude with optic flow. Nature, 610(7932), pp.485-490.
  • Ollivier, Y., Arnold, L., Auger, A. and Hansen, N., 2017. Information-geometric optimization algorithms: A unifying picture via invariance principles. Journal of Machine Learning Research, 18(18), pp.1-65.
  • Tsai, Y.Y., Xu, H., Ding, Z., Zhang, C., Johns, E. and Huang, B., 2021. Droid: Minimizing the reality gap using single-shot human demonstration. IEEE Robotics and Automation Letters, 6(2), pp.3168-3175.
  • https://www.paulvicol.com/pdfs/ES-Single-Slides.pdf
  • Beyer, H.G., 2023, July. What you always wanted to know about evolution strategies, but never dared to ask. In Proceedings of ACM Conference on Genetic and Evolutionary Computation Companion (pp. 878-894).
  • Lange, R.T., Schaul, T., Chen, Y., Zahavy, T., Dalibard, V., Lu, C., Singh, S. and Flennerhag, S., 2023. Discovering evolution strategies via meta-black-box optimization. In International Conference on Learning Representations.
  • Diouane, Y., Gratton, S. and Vicente, L.N., 2015. Globally convergent evolution strategies. Mathematical Programming, 152(1), pp.467-490.
  • Schwefel, H.P., 2002. Deep insight from simple models of evolution. BioSystems, 64(1-3), pp.189-198.
  • Schwefel, H.P., 1994. On the evolution of evolutionary computation. Computational Intelligence: Imitating Life, pp.116-124.
  • Schwefel, H.P., 1993. Evolution and optimum seeking: The sixth generation. John Wiley & Sons, Inc..
  • Schwefel, H.P., 1992. Natural evolution and collective optimum seeking. Computational Systems Analysis–Topics and Trends, pp.5-14.
  • Schwefel, H.P., 1988. Evolutionary learning optimum-seeking on parallel computer architectures. In Systems Analysis and Simulation I (pp. 217-225). Springer, New York, NY.
  • Schwefel, H.P., 1988. Collective intelligence in evolving systems. In Ecodynamics (pp. 95-100). Springer, Berlin, Heidelberg.
  • Schwefel, H.P., 1984. Evolution strategies: A family of non-linear optimization techniques based on imitating some principles of organic evolution. Annals of Operations Research, 1(2), pp.165-167.
  • Schwefel, H.P., 1981. Numerical optimization of computer models. John Wiley & Sons, Inc..
  • Hansen, N., Arnold, D.V. and Auger, A., 2015. Evolution strategies. In Springer Handbook of Computational Intelligence (pp. 871-898). Springer, Berlin, Heidelberg.
  • Bäck, T., Foussette, C. and Krause, P., 2013. Contemporary evolution strategies. Berlin: Springer.
  • Pošík, P., Huyer, W. and Pál, L., 2012. A comparison of global search algorithms for continuous black box optimization. Evolutionary Computation, 20(4), pp.509-541.
  • Beyer, H.G. and Schwefel, H.P., 2002. Evolution strategies–A comprehensive introduction. Natural Computing, 1(1), pp.3-52.
  • Beyer, H.G., 2000. Evolutionary algorithms in noisy environments: Theoretical issues and guidelines for practice. Computer Methods in Applied Mechanics and Engineering, 186(2-4), pp.239-267.
  • Rechenberg, I., 2000. Case studies in evolutionary experimentation and computation. Computer Methods in Applied Mechanics and Engineering, 186(2-4), pp.125-140.
  • Rechenberg, I., 1984. The evolution strategy. A mathematical model of darwinian evolution. In Synergetics—from Microscopic to Macroscopic Order (pp. 122-132). Springer, Berlin, Heidelberg.