Paper Title

Adaptive Graph Encoder for Attributed Graph Embedding

Authors

Ganqu Cui, Jie Zhou, Cheng Yang, Zhiyuan Liu

Abstract

Attributed graph embedding, which learns vector representations from graph topology and node features, is a challenging task for graph analysis. Recently, methods based on graph convolutional networks (GCNs) have made great progress on this task. However, existing GCN-based methods have three major drawbacks. Firstly, our experiments indicate that the entanglement of graph convolutional filters and weight matrices harms both performance and robustness. Secondly, we show that the graph convolutional filters in these methods turn out to be special cases of generalized Laplacian smoothing filters, but they do not preserve optimal low-pass characteristics. Finally, the training objectives of existing algorithms usually involve recovering the adjacency matrix or the feature matrix, which is not always consistent with real-world applications. To address these issues, we propose Adaptive Graph Encoder (AGE), a novel attributed graph embedding framework. AGE consists of two modules: (1) to better alleviate the high-frequency noise in the node features, AGE first applies a carefully designed Laplacian smoothing filter; (2) AGE employs an adaptive encoder that iteratively strengthens the filtered features to obtain better node embeddings. We conduct experiments on four public benchmark datasets to validate AGE on node clustering and link prediction tasks. Experimental results show that AGE consistently outperforms state-of-the-art graph embedding methods on these tasks by a considerable margin.
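The first module above, the Laplacian smoothing filter, admits a compact sketch. As a rough illustration (not the paper's exact implementation), the NumPy snippet below applies a generalized smoothing filter H = I - kL to the raw feature matrix, where L is the symmetrically normalized Laplacian of the graph with self-loops. The function name laplacian_smoothing, the dense-matrix setup, and the default values of k and t are illustrative assumptions rather than the authors' settings.

```python
import numpy as np

def laplacian_smoothing(adj: np.ndarray, features: np.ndarray,
                        k: float = 2.0 / 3.0, t: int = 2) -> np.ndarray:
    """Apply t rounds of a generalized Laplacian smoothing filter
    H = I - k * L to the node feature matrix.

    adj:      (n, n) binary adjacency matrix without self-loops
    features: (n, d) raw node feature matrix
    k:        filter strength (hypothetical default; the paper motivates
              the choice of k from the spectrum of the Laplacian)
    t:        number of smoothing iterations (parameter-free stacking)
    """
    n = adj.shape[0]
    adj_sl = adj + np.eye(n)                   # add self-loops
    deg = adj_sl.sum(axis=1)                   # degrees of self-looped graph
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    # Symmetrically normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(n) - d_inv_sqrt @ adj_sl @ d_inv_sqrt
    h = np.eye(n) - k * lap                    # smoothing filter H
    smoothed = features
    for _ in range(t):                         # no trainable weights involved
        smoothed = h @ smoothed
    return smoothed

# Toy usage: a 4-node path graph with random 8-dimensional features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.rand(4, 8)
X_smooth = laplacian_smoothing(A, X)
```

Because H has no trainable weights, this step decouples filtering from the encoder, matching the abstract's critique of entangled filters and weight matrices: the filter acts as a fixed low-pass operator that suppresses high-frequency feature noise before any learned transformation is applied.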
