Enhanced Knowledge Graph Attention Networks for Efficient Graph Learning

Document Type

Conference Proceeding

Publication Date

1-1-2024

Abstract

This paper presents Enhanced Knowledge Graph Attention Networks (EKGAT), a design that improves representation learning for complex relationships in graph-structured data. By integrating TransformerConv layers, EKGAT captures complex node relationships more effectively than traditional KGAT models. EKGAT also incorporates disentanglement learning to segment entity representations into independent components, capturing distinct semantic aspects more effectively. Comprehensive experiments on the Cora, PubMed, and Amazon datasets show substantial improvements in node classification accuracy and convergence speed. The TransformerConv layers significantly accelerate convergence of the training loss while maintaining or improving accuracy, which is particularly advantageous for large-scale, real-time applications. t-SNE and PCA analyses illustrate the superior embedding separability achieved by the model, underscoring its enhanced representation capability. These findings highlight the potential of EKGAT to advance graph analytics and network science, providing robust, scalable solutions for applications ranging from recommendation systems and social network analysis to biomedical data interpretation and real-time big data processing.
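The EKGAT architecture described above builds on attention-based neighborhood aggregation. As a rough illustration of that underlying mechanism (not the paper's implementation), the following is a minimal NumPy sketch of single-head scaled dot-product attention over graph neighborhoods, in the spirit of a TransformerConv-style layer; all function and variable names here are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(h, adj, Wq, Wk, Wv):
    """Single-head dot-product attention over graph neighborhoods,
    loosely following the TransformerConv idea (illustrative sketch).
    h:   (N, d) node feature matrix
    adj: (N, N) binary adjacency matrix, assumed to include self-loops
    Wq, Wk, Wv: (d, d) query/key/value projection matrices
    """
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    d = q.shape[1]
    out = np.zeros_like(v)
    for i in range(len(h)):
        nbrs = np.flatnonzero(adj[i])                # neighbor indices of node i
        scores = (k[nbrs] @ q[i]) / np.sqrt(d)       # scaled dot-product scores
        alpha = softmax(scores)                      # attention over neighbors
        out[i] = alpha @ v[nbrs]                     # weighted neighbor aggregation
    return out

# Toy example: 3 nodes, 2 features, fully connected with self-loops
rng = np.random.default_rng(0)
h = rng.normal(size=(3, 2))
adj = np.ones((3, 3))
Wq = Wk = Wv = np.eye(2)
z = attention_aggregate(h, adj, Wq, Wk, Wv)
print(z.shape)
```

Because each output row is a convex combination of neighbor value vectors, the aggregated features stay within the range of the input features; production layers (e.g. in PyTorch Geometric) additionally use multiple heads, learned projections, and sparse edge lists.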

Identifier

105002709899 (Scopus)

ISBN

9798350387131

Publication Title

2024 IEEE High Performance Extreme Computing Conference, HPEC 2024

External Full Text Location

https://doi.org/10.1109/HPEC62836.2024.10938526

Grant

CCF-2109988

Fund Ref

National Science Foundation
