MGT: Multi-Granularity Transformer leveraging multi-level relation for sequential recommendation

Yihu Zhang, Bo Yang, Runze Mao, Qing Li

Research output: Journal article publication › Journal article › Academic research › peer-review

3 Citations (Scopus)

Abstract

Sequential recommendation has become a popular research topic in recent years; it aims to predict the next item in a user's sequence based on their past behaviors. Self-Attention (SA)-based models have shown state-of-the-art performance in this domain. These SA-based models adopt the vanilla self-attention mechanism, which takes every single item as the minimum modeling unit and is sufficient to capture the point-level relation: several previously interacted items affecting the target item individually. However, we argue that the vanilla self-attention mechanism in existing SA-based models neglects the collective influence of a group of items and thus cannot explicitly capture the union-level relation: several previous items affecting the target item jointly. To address this limitation, we propose the Multi-Granularity Transformer (MGT), which leverages both point-level and union-level relation for sequential recommendation. The proposed MGT employs a new multi-granularity self-attention (MGSA) mechanism that simultaneously captures multi-level relation (point-level and union-level relation). Specifically, MGSA partitions the item latent space across different attention heads and forces different attention heads to account for point-level and union-level relation, respectively. Moreover, to improve the ability of the feedforward layer to model local patterns, we further incorporate a cross-token scheme into the existing point-wise feedforward layer, enabling local information interaction between adjacent items. Extensive experiments on three widely used benchmark datasets demonstrate the effectiveness and rationality of the proposed MGT over several state-of-the-art sequential recommendation models.
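The abstract does not specify the exact formulation of MGSA or the cross-token feedforward, so the following PyTorch sketch is only an illustration of the general idea under stated assumptions: some attention heads attend over individual item embeddings (point-level) while the remaining heads attend over locally pooled groups of adjacent items (union-level), and the point-wise feedforward is augmented with a depthwise convolution so adjacent items exchange local information. The head split, the average-pooling used to form group representations, the window sizes, and all class and parameter names here are hypothetical, not the paper's actual design.

```python
# Hypothetical sketch of multi-granularity self-attention plus a cross-token
# feedforward; details are assumptions, not the published MGT architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiGranularitySelfAttention(nn.Module):
    def __init__(self, d_model=64, n_heads=4, union_heads=2, union_window=3):
        super().__init__()
        assert d_model % n_heads == 0 and union_heads <= n_heads
        self.h, self.uh, self.dk = n_heads, union_heads, d_model // n_heads
        self.window = union_window
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):                      # x: (batch, seq_len, d_model)
        B, L, _ = x.shape
        # Union-level keys/values: average-pool a sliding window of adjacent
        # items so each position summarizes a small group (assumed scheme).
        pooled = F.avg_pool1d(x.transpose(1, 2), self.window, stride=1,
                              padding=self.window // 2,
                              count_include_pad=False).transpose(1, 2)

        def split(t):                          # (B, L, d) -> (B, heads, L, dk)
            return t.view(B, L, self.h, self.dk).transpose(1, 2)

        q = split(self.q_proj(x))
        k_pt, v_pt = split(self.k_proj(x)), split(self.v_proj(x))
        k_un, v_un = split(self.k_proj(pooled)), split(self.v_proj(pooled))

        # First (h - uh) heads see individual items; last uh heads see groups.
        k = torch.cat([k_pt[:, :self.h - self.uh], k_un[:, self.h - self.uh:]], dim=1)
        v = torch.cat([v_pt[:, :self.h - self.uh], v_un[:, self.h - self.uh:]], dim=1)

        attn = torch.softmax(q @ k.transpose(-2, -1) / self.dk ** 0.5, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, L, -1)
        return self.out(out)


class CrossTokenFeedForward(nn.Module):
    """Point-wise feedforward preceded by a depthwise 1D convolution so that
    adjacent items interact locally (an illustrative cross-token scheme)."""
    def __init__(self, d_model=64, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2, groups=d_model)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                nn.Linear(4 * d_model, d_model))

    def forward(self, x):                      # x: (batch, seq_len, d_model)
        x = x + self.conv(x.transpose(1, 2)).transpose(1, 2)
        return self.ff(x)


if __name__ == "__main__":
    seq = torch.randn(2, 10, 64)               # toy item-embedding sequence
    attn_out = MultiGranularitySelfAttention()(seq)
    print(CrossTokenFeedForward()(attn_out).shape)  # torch.Size([2, 10, 64])
```

One block of this kind would then be stacked with the usual residual connections and layer normalization, as in standard SA-based recommenders; how MGT composes these pieces is described in the full paper, not here.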

Original language: English
Article number: 121808
Pages (from-to): 1-13
Journal: Expert Systems with Applications
Volume: 238
DOIs
Publication status: Published - 15 Mar 2024

Keywords

  • Multi-level relation
  • Recommender system
  • Self-attention mechanism
  • Sequential recommendation
  • Transformer

ASJC Scopus subject areas

  • General Engineering
  • Computer Science Applications
  • Artificial Intelligence
