Paper Title
Incremental Recursive Ranking Grouping for Large Scale Global Optimization
Paper Authors
Paper Abstract
Real-world optimization problems may have different underlying structures. In black-box optimization, the dependencies between decision variables remain unknown. However, some techniques can discover such interactions accurately. In Large Scale Global Optimization (LSGO), problems are high-dimensional. Decomposing an LSGO problem into subproblems and optimizing them separately has been shown to be effective, and the effectiveness of such an approach may depend strongly on the accuracy of the problem decomposition. Many state-of-the-art decomposition strategies are derived from Differential Grouping (DG). However, if a given problem consists of non-additively separable subproblems, DG-based strategies may report many non-existing interactions. On the other hand, the monotonicity-checking strategies proposed so far do not report non-existing interactions for any separable subproblems, but may miss many of the existing ones. Therefore, we propose Incremental Recursive Ranking Grouping (IRRG), which suffers from neither of these flaws. IRRG consumes more fitness function evaluations than recent DG-based propositions such as Recursive DG 3 (RDG3). Nevertheless, for problems with additively separable subproblems, which suit RDG3, the effectiveness of the considered Cooperative Co-evolution frameworks is similar whether IRRG or RDG3 is embedded. After replacing additive separability with non-additive separability, embedding IRRG leads to results of significantly higher quality.
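To make the contrast drawn in the abstract concrete, the following Python sketch compares a Differential-Grouping-style interaction check with a monotonicity-style check on a toy function that is non-additively (here multiplicatively) separable. This is not the paper's IRRG algorithm; the function names (`dg_interact`, `mono_interact`, `f_sep`), the perturbation scheme, and the threshold `eps` are illustrative assumptions. On this example the DG-style test reports an interaction between two variables that are in fact separable, while the monotonicity-style test does not, mirroring the trade-off the abstract describes.

```python
import numpy as np

# Toy objective: multiplicatively (non-additively) separable in x0 and x1,
# so each variable can still be optimized on its own.
def f_sep(x):
    return (x[0] ** 2 + 1.0) * (x[1] ** 2 + 1.0)

def dg_interact(f, dim, i, j, lb=0.0, ub=1.0, eps=1e-3):
    """DG-style check (sketch): compare the fitness change caused by perturbing
    x_i at two different values of x_j; a differing change is read as interaction."""
    delta = (ub - lb) / 2.0
    a = np.full(dim, lb)
    b = a.copy(); b[i] += delta
    d1 = f(b) - f(a)                      # change from perturbing x_i with x_j = lb
    c = a.copy(); c[j] = (lb + ub) / 2.0
    d = c.copy(); d[i] += delta
    d2 = f(d) - f(c)                      # same perturbation with x_j moved
    return abs(d1 - d2) > eps

def mono_interact(f, dim, i, j, lb=0.0, ub=1.0):
    """Monotonicity-style check (sketch): report interaction only if the direction
    of the fitness change caused by perturbing x_i flips when x_j moves."""
    delta = (ub - lb) / 2.0
    a = np.full(dim, lb)
    b = a.copy(); b[i] += delta
    s1 = np.sign(f(b) - f(a))
    c = a.copy(); c[j] = (lb + ub) / 2.0
    d = c.copy(); d[i] += delta
    s2 = np.sign(f(d) - f(c))
    return s1 != s2

if __name__ == "__main__":
    # The DG-style test flags a spurious interaction on this separable problem,
    # while the monotonicity-style test does not.
    print(dg_interact(f_sep, 2, 0, 1))    # True  (false positive)
    print(mono_interact(f_sep, 2, 0, 1))  # False
```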