Title

Learning Intermediate Representations using Graph Neural Networks for NUMA and Prefetchers Optimization

Authors

Ali TehraniJamsaz, Mihail Popov, Akash Dutta, Emmanuelle Saillard, Ali Jannesari

Abstract

There is a large space of NUMA and hardware prefetcher configurations that can significantly impact the performance of an application. Previous studies have demonstrated how a model can automatically select configurations based on the dynamic properties of the code to achieve speedups. This paper demonstrates how the static Intermediate Representation (IR) of the code can guide NUMA/prefetcher optimizations without the prohibitive cost of performance profiling. We propose a method to create a comprehensive dataset that includes a diverse set of intermediate representations along with optimum configurations. We then apply a graph neural network model in order to validate this dataset. We show that our static intermediate representation based model achieves 80% of the performance gains provided by expensive dynamic performance profiling based strategies. We further develop a hybrid model that uses both static and dynamic information. Our hybrid model achieves the same gains as the dynamic models but at a reduced cost by only profiling 30% of the programs.
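To make the approach in the abstract concrete, below is a minimal sketch (not the paper's implementation) of a graph neural network that maps a program's static IR graph to one of a fixed set of NUMA/prefetcher configurations. The library choice (PyTorch Geometric), the GCN layers, the feature dimensions, and the number of candidate configurations are all illustrative assumptions.

import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool

class IRConfigClassifier(torch.nn.Module):
    # Hypothetical graph-level classifier: node features come from the IR graph
    # (e.g., instruction/operand embeddings) and the output is one score per
    # candidate NUMA/prefetcher configuration.
    def __init__(self, num_node_features, num_configs, hidden=64):
        super().__init__()
        self.conv1 = GCNConv(num_node_features, hidden)  # message passing over IR edges
        self.conv2 = GCNConv(hidden, hidden)
        self.out = torch.nn.Linear(hidden, num_configs)  # one logit per configuration

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)  # pool node embeddings into one graph embedding
        return self.out(x)              # predicted configuration = argmax of these logits

Trained on pairs of IR graphs and their best-performing configurations, a model of this kind can select a configuration from static code alone, which is what lets it avoid per-program performance profiling at deployment time.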
