Paper Title
Multi-task graph neural networks for simultaneous prediction of global and atomic properties in ferromagnetic systems
Paper Authors
Paper Abstract
We introduce a multi-tasking graph convolutional neural network, HydraGNN, to simultaneously predict both global and atomic physical properties, and demonstrate it with ferromagnetic materials. We train HydraGNN on an open-source ab initio density functional theory (DFT) dataset for iron-platinum (FePt) with a fixed body-centered tetragonal (BCT) lattice structure and fixed volume to simultaneously predict the mixing enthalpy (a global feature of the system), the atomic charge transfer, and the atomic magnetic moment across configurations that span the entire compositional range. By taking advantage of underlying physical correlations between material properties, multi-task learning (MTL) with HydraGNN provides effective training even with modest amounts of data. Moreover, this is achieved with just one architecture instead of three, as required by single-task learning (STL). The first convolutional layers of the HydraGNN architecture are shared by all learning tasks and extract features common to all material properties. The following layers discriminate the features of the different properties, the results of which are fed to the separate heads of the final layer to produce predictions. Numerical results show that HydraGNN effectively captures the relation between the configurational entropy and the material properties over the entire compositional range. Overall, the accuracy of simultaneous MTL predictions is comparable to the accuracy of the STL predictions. In addition, the computational cost of training HydraGNN for MTL is much lower than that of the original DFT calculations, and also lower than that of training separate STL models for each property.
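To illustrate the shared-layers-plus-separate-heads design described in the abstract, the sketch below shows a minimal multi-task GNN in PyTorch Geometric. It is not the actual HydraGNN implementation; the class name, layer counts, hidden dimension, and loss weights are illustrative assumptions. It uses shared graph-convolution layers, task-specific convolutions, a pooled graph-level head for the mixing enthalpy, and per-node heads for charge transfer and magnetic moment.

```python
# Minimal sketch (not the actual HydraGNN code) of a multi-task GNN:
# shared convolutions -> task-specific convolutions -> separate heads.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, global_mean_pool


class MultiTaskGNN(nn.Module):
    def __init__(self, node_feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Shared convolutional layers: extract features common to all properties.
        self.shared_convs = nn.ModuleList([
            GCNConv(node_feat_dim, hidden_dim),
            GCNConv(hidden_dim, hidden_dim),
        ])
        # Task-specific convolutional layers: discriminate per-property features.
        self.enthalpy_conv = GCNConv(hidden_dim, hidden_dim)
        self.charge_conv = GCNConv(hidden_dim, hidden_dim)
        self.moment_conv = GCNConv(hidden_dim, hidden_dim)
        # Separate output heads.
        self.enthalpy_head = nn.Linear(hidden_dim, 1)  # global (per graph)
        self.charge_head = nn.Linear(hidden_dim, 1)    # atomic (per node)
        self.moment_head = nn.Linear(hidden_dim, 1)    # atomic (per node)

    def forward(self, x, edge_index, batch):
        # Shared feature extraction over the atomic graph.
        for conv in self.shared_convs:
            x = torch.relu(conv(x, edge_index))
        # Global property: pool node features per graph, then predict.
        h_enthalpy = torch.relu(self.enthalpy_conv(x, edge_index))
        enthalpy = self.enthalpy_head(global_mean_pool(h_enthalpy, batch))
        # Atomic properties: predict one value per node.
        charge = self.charge_head(torch.relu(self.charge_conv(x, edge_index)))
        moment = self.moment_head(torch.relu(self.moment_conv(x, edge_index)))
        return enthalpy, charge, moment


def mtl_loss(preds, targets, weights=(1.0, 1.0, 1.0)):
    # Multi-task loss as a weighted sum of per-task MSE terms
    # (equal weights here are an illustrative choice).
    return sum(w * nn.functional.mse_loss(p, t)
               for w, p, t in zip(weights, preds, targets))
```

In this kind of design, the shared layers let the three tasks reuse features that reflect common physics, while the separate heads allow graph-level and node-level targets to be trained jointly with a single backward pass.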