Covariance Matrix Adaptation Evolution Strategy (CMAES)

class pypop7.optimizers.es.cmaes.CMAES(problem, options)[source]

Covariance Matrix Adaptation Evolution Strategy (CMAES).

Note

According to the well-recognized review of evolutionary computation published in Nature, CMAES is widely regarded as one of the state-of-the-art (SOTA) evolutionary algorithms for continuous black-box optimization (BBO).

For some (actually nearly all) interesting applications of CMA-ES, please refer to, e.g.: [ICLR-2024 Spotlight], [TMRB-2024], [LWC-2024], [RSIF-2024], [MNRAS-2024], [Medical Physics-2024], [Wolff, 2024], [Jankowski et al., 2024], [Martin, 2024, Ph.D. Dissertation (Harvard University)], [Milekovic et al., 2023, Nature Medicine], [Chen et al., 2023, Science Robotics], [Falk et al., 2023, PNAS], [Thamm&Rosenow, 2023, PRL], [Brea et al., 2023, Nature Communications], [Ghafouri&Biros, 2023], [Barral, 2023, Ph.D. Dissertation (University of Oxford)], [Slade et al., 2022, Nature], [Rudolph et al., 2022, Nature Communications], [Cazenille et al., 2022, Bioinspiration & Biomimetics], [Franks et al., 2021], [Yuan et al., 2021, MNRAS], [Löffler et al., 2021, Nature Communications], [Papadopoulou et al., 2021, JPCB], [Schmucker et al., 2021, PLoS Comput Biol], [Barkley, 2021, Ph.D. Dissertation (Harvard University)], [Fernandes, 2021, Ph.D. Dissertation (Harvard University)], [Quinlivan, 2021, Ph.D. Dissertation (Harvard University)], [Vasios et al., 2020, Soft Robotics], [Pal et al., 2020], [Lei, 2020, Ph.D. Dissertation (University of Oxford)], [Pisaroni et al., 2019, Journal of Aircraft], [Yang et al., 2019, Journal of Aircraft], [Ong et al., 2019, PLOS Computational Biology], [Zhang et al., 2017, Science], [Wei&Mahadevan, 2016, Soft Matter], [Loshchilov&Hutter, 2016], [Molinari et al., 2014, AIAAJ], [Melton, 2014, Acta Astronautica], [Khaira et al., 2014, ACS Macro Lett.], to name just a few.

Parameters:
  • problem (dict) –

    problem arguments with the following common settings (keys):
    • 'fitness_function' - objective function to be minimized (func),

    • 'ndim_problem' - number of dimensionality (int),

    • 'upper_boundary' - upper boundary of search range (array_like),

    • 'lower_boundary' - lower boundary of search range (array_like).

  • options (dict) –

    optimizer options with the following common settings (keys):
    • 'max_function_evaluations' - maximum of function evaluations (int, default: np.inf),

    • 'max_runtime' - maximal runtime to be allowed (float, default: np.inf),

    • 'seed_rng' - seed for random number generator needed to be explicitly set (int);

    and with the following particular settings (keys):
    • 'sigma' - initial global step-size, aka mutation strength (float),

    • 'mean' - initial (starting) point, aka mean of the Gaussian search distribution (array_like),

      • if not given, it will be randomly drawn from the uniform distribution whose search range is bounded by problem['lower_boundary'] and problem['upper_boundary'],

    • 'n_individuals' - number of offspring, aka offspring population size (int, default: 4 + int(3*np.log(problem['ndim_problem']))),

    • 'n_parents' - number of parents, aka parental population size (int, default: int(options['n_individuals']/2)).
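The defaults above can be reproduced with plain numpy. The sketch below (illustrative only, not part of the pypop7 API) computes the default offspring/parental population sizes for a 2-dimensional problem and draws an initial mean uniformly within the search boundaries, as done when 'mean' is not supplied:

```python
import numpy as np

ndim_problem = 2  # illustrative dimensionality
lower_boundary = -5.0 * np.ones((ndim_problem,))
upper_boundary = 5.0 * np.ones((ndim_problem,))

# default offspring population size: 4 + int(3*log(d))
n_individuals = 4 + int(3 * np.log(ndim_problem))
# default parental population size: half of the offspring population
n_parents = int(n_individuals / 2)

# if 'mean' is not given, it is drawn uniformly within the search range
rng = np.random.default_rng(2022)
mean = rng.uniform(lower_boundary, upper_boundary)

print(n_individuals, n_parents)  # -> 6 3
```

Note that the population sizes grow only logarithmically with the problem dimensionality, which is one reason CMA-ES remains practical in moderately high dimensions.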

Examples

Use the black-box optimizer CMAES to minimize the well-known test function Rosenbrock:

>>> import numpy  # engine for numerical computing
>>> from pypop7.benchmarks.base_functions import rosenbrock  # function to be minimized
>>> from pypop7.optimizers.es.cmaes import CMAES
>>> problem = {'fitness_function': rosenbrock,  # to define problem arguments
...            'ndim_problem': 2,
...            'lower_boundary': -5.0*numpy.ones((2,)),
...            'upper_boundary': 5.0*numpy.ones((2,))}
>>> options = {'max_function_evaluations': 5000,  # to set optimizer options
...            'seed_rng': 2022,
...            'mean': 3.0*numpy.ones((2,)),
...            'sigma': 3.0}  # global step-size may need to be fine-tuned for better performance
>>> cmaes = CMAES(problem, options)  # to initialize the optimizer class
>>> results = cmaes.optimize()  # to run the optimization/evolution process
>>> print(f"CMAES: {results['n_function_evaluations']}, {results['best_so_far_y']}")
CMAES: 5000, 0.0017

For a correctness check of its Python coding, please refer to this code-based repeatability report for all details. For automated testing via pytest, please see test_cmaes.py.

best_so_far_x

Best-so-far solution found during the entire optimization process.

Type:

array_like

best_so_far_y

Best-so-far fitness found during the entire optimization process.

Type:

array_like

mean

Initial (starting) point, aka mean of the Gaussian search distribution.

Type:

array_like

n_individuals

Number of offspring, aka offspring population size / sample size.

Type:

int

n_parents

Number of parents, aka parental population size / number of positively selected search points.

Type:

int

sigma

Final global step-size, aka mutation strength (updated during optimization).

Type:

float

References

https://cma-es.github.io/

Hansen, N., 2023. The CMA evolution strategy: A tutorial. arXiv preprint arXiv:1604.00772.

Ollivier, Y., Arnold, L., Auger, A. and Hansen, N., 2017. Information-geometric optimization algorithms: A unifying picture via invariance principles. Journal of Machine Learning Research, 18(18), pp.1-65.

Hansen, N., Atamna, A. and Auger, A., 2014, September. How to assess step-size adaptation mechanisms in randomised search. In International Conference on Parallel Problem Solving From Nature (pp. 60-69). Springer, Cham.

Kern, S., Müller, S.D., Hansen, N., Büche, D., Ocenasek, J. and Koumoutsakos, P., 2004. Learning probability distributions in continuous evolutionary algorithms–a comparative review. Natural Computing, 3, pp.77-112.

Hansen, N., Müller, S.D. and Koumoutsakos, P., 2003. Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Evolutionary Computation, 11(1), pp.1-18.

Hansen, N. and Ostermeier, A., 2001. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9(2), pp.159-195.

Hansen, N. and Ostermeier, A., 1996, May. Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In Proceedings of IEEE International Conference on Evolutionary Computation (pp. 312-317). IEEE.

Please refer to its lightweight Python implementation from cyberagent.ai: https://github.com/CyberAgentAILab/cmaes

Please refer to its official Python implementation from Hansen, N.: https://github.com/CMA-ES/pycma

Basic information on CMA-ES

Champion of the RoboCup 3D Simulation League.

Some high-quality tutorials on CMA-ES

Some applications of CMA-ES
