IS HOMOPHILY A NECESSITY FOR GRAPH NEURAL NETWORKS?

Yao Ma, Xiaorui Liu et al.

Document Type

Conference Proceeding

Publication Date

1-1-2022

Abstract

Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks. When applied to semi-supervised node classification, GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and to fail to generalize to heterophilous graphs where dissimilar nodes connect. Recent works have designed new architectures to overcome such heterophily-related limitations. However, we empirically find that standard graph convolutional networks (GCNs) can actually achieve strong performance on some commonly used heterophilous graphs. This motivates us to reconsider whether homophily is truly necessary for good GNN performance. We find that this claim is not quite accurate: certain kinds of "good" heterophily exist under which GCNs can achieve strong performance. Our work carefully characterizes the implications of different heterophily conditions, and provides supporting theoretical understanding and empirical observations. Finally, we examine existing heterophilous graph benchmarks and reconcile how the GCN (under)performs on them based on this understanding.
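As a point of reference for the homophily/heterophily distinction the abstract relies on, the sketch below computes the widely used edge homophily ratio: the fraction of edges whose endpoints share a class label. This is an illustrative example with a toy graph, not code or data from the paper itself.

```python
# Hedged sketch (not from the paper): the "edge homophily ratio" --
# the fraction of edges whose two endpoints have the same label.
# Values near 1 indicate homophily ("like attracts like");
# values near 0 indicate heterophily (dissimilar nodes connect).
def edge_homophily(edges, labels):
    """edges: iterable of (u, v) node pairs; labels: dict node -> class."""
    edges = list(edges)
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Toy graph: a triangle of class-A nodes plus one cross-class edge to a B node.
labels = {0: "A", 1: "A", 2: "A", 3: "B"}
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
print(edge_homophily(edges, labels))  # 3 of 4 edges are intra-class -> 0.75
```

The paper's point is that a low value of such a ratio alone does not predict poor GCN performance; the structure of the cross-class connections matters too.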

Identifier

85150385124 (Scopus)

Publication Title

ICLR 2022 - 10th International Conference on Learning Representations

Grant

CNS1815636

Fund Ref

National Science Foundation

