Hierarchical Attention Link Prediction Neural Network

Zhitao Wang, Wenjie Li, Hanjing Su

Research output: Journal article › Academic research › peer-review

13 Citations (Scopus)

Abstract

In this paper, a novel end-to-end neural link prediction model, named the Hierarchical Attention Link Prediction Neural Network (HalpNet), is proposed. HalpNet comprehensively explores neighborhood information, which has proved important for link prediction, via its core component, a hierarchical attention mechanism. The proposed hierarchical attention mechanism consists of two neural attention layers that model crucial structural information at the node level and the subgraph level, respectively. At the node level, a structure-preserving attention is developed to preserve the structural features of each node in the neighborhood subgraph. Based on these latent node features, a structure-aggregating attention at the subgraph level is designed to learn how important each node in the subgraph is for the linkage of the target node pair and to aggregate the node features, weighted by the learned attentions, into a comprehensive subgraph representation. Given this expressive representation of the neighborhood subgraph, HalpNet is able to predict the link score of the target node pair effectively. We evaluate HalpNet on 8 benchmark datasets against 14 popular and state-of-the-art approaches. The experimental results demonstrate its significant superiority and wide applicability to the link prediction problem.
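As a rough illustration of the two-level design described in the abstract, the sketch below shows a node-level self-attention over a neighborhood subgraph followed by a subgraph-level importance-weighted aggregation and a link-score head. This is not the authors' implementation: the layer sizes, the use of dot-product multi-head attention at the node level, and the sigmoid scoring head are all illustrative assumptions.

# Minimal PyTorch sketch of hierarchical (node-level + subgraph-level)
# attention for link prediction. Illustrative only; not HalpNet itself.
import torch
import torch.nn as nn


class HierarchicalAttentionSketch(nn.Module):
    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        # Node level: project raw features, then let each subgraph node
        # attend to the others to preserve structural context.
        self.node_proj = nn.Linear(in_dim, hid_dim)
        self.node_attn = nn.MultiheadAttention(hid_dim, num_heads=1, batch_first=True)
        # Subgraph level: score each node's importance for the target pair.
        self.score = nn.Linear(hid_dim, 1)
        # Link predictor over the aggregated subgraph representation.
        self.out = nn.Linear(hid_dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_subgraph_nodes, in_dim) -- features of the nodes in
        # the neighborhood subgraph of each target node pair.
        h = torch.relu(self.node_proj(x))
        # Node-level attention over the subgraph nodes.
        h, _ = self.node_attn(h, h, h)
        # Subgraph-level attention: softmax-normalized importance per node,
        # then a weighted sum as the subgraph representation.
        alpha = torch.softmax(self.score(h), dim=1)      # (batch, nodes, 1)
        subgraph = (alpha * h).sum(dim=1)                # (batch, hid_dim)
        # Link score for the target node pair.
        return torch.sigmoid(self.out(subgraph)).squeeze(-1)


if __name__ == "__main__":
    model = HierarchicalAttentionSketch(in_dim=16, hid_dim=32)
    x = torch.randn(4, 10, 16)   # 4 target pairs, 10 subgraph nodes each
    print(model(x).shape)        # torch.Size([4])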

Original language: English
Article number: 107431
Pages (from-to): 1-9
Journal: Knowledge-Based Systems
Volume: 232
DOIs
Publication status: Published - 28 Nov 2021

Keywords

  • Hierarchical attention
  • Link prediction
  • Neural network

ASJC Scopus subject areas

  • Management Information Systems
  • Software
  • Information Systems and Management
  • Artificial Intelligence
