Paper Title
Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars
Paper Authors
Paper Abstract
The discovery of neural architectures from simple building blocks is a long-standing goal of Neural Architecture Search (NAS). Hierarchical search spaces are a promising step towards this goal but lack a unifying search space design framework and typically only search over some limited aspect of architectures. In this work, we introduce a unifying search space design framework based on context-free grammars that can naturally and compactly generate expressive hierarchical search spaces that are 100s of orders of magnitude larger than common spaces from the literature. By enhancing and using their properties, we effectively enable search over the complete architecture and can foster regularity. Further, we propose an efficient hierarchical kernel design for a Bayesian Optimization search strategy to efficiently search over such huge spaces. We demonstrate the versatility of our search space design framework and show that our search strategy can be superior to existing NAS approaches. Code is available at https://github.com/automl/hierarchical_nas_construction.
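To make the core idea concrete, the sketch below shows how a context-free grammar can compactly generate a hierarchical space of architecture strings: nonterminals expand into sub-structures (sequential composition, residual blocks) until only primitive operations remain. This is a minimal toy grammar for illustration only; the grammar, nonterminal names, and operations here are assumptions, not the paper's actual search space (see the linked repository for that).

```python
import random

# Toy context-free grammar (hypothetical, for illustration).
# Nonterminals map to lists of productions; anything not in the
# dict is treated as a terminal (a primitive operation or token).
GRAMMAR = {
    "ARCH": [["seq(", "BLOCK", ",", "BLOCK", ")"]],
    "BLOCK": [["res(", "OP", ",", "OP", ")"], ["OP"]],
    "OP": [["conv3x3"], ["conv1x1"], ["identity"]],
}

def sample(symbol, rng):
    """Recursively expand `symbol` by sampling productions uniformly,
    returning one derived architecture string."""
    if symbol not in GRAMMAR:  # terminal: emit as-is
        return symbol
    production = rng.choice(GRAMMAR[symbol])
    return "".join(sample(s, rng) for s in production)

print(sample("ARCH", random.Random(0)))
# e.g. a string such as "seq(res(conv3x3,identity),conv1x1)"
```

Even this tiny grammar derives many distinct architectures; because productions nest recursively, adding a few rules multiplies the space combinatorially, which is how such grammars can describe spaces far larger than hand-enumerated cell-based ones.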