Workflow performance prediction based on graph structure aware deep attention neural network

Jixiang Yu, Ming Gao, Yuchan Li, Zehui Zhang, Wai Hung Ip, Kai Leung Yung

Research output: Journal article publication › Journal article › Academic research › peer-review

19 Citations (Scopus)

Abstract

With the rapid growth of cloud computing, efficient operational optimization and resource scheduling of complex cloud business processes rely on real-time and accurate performance prediction. Previous research on cloud computing performance prediction has focused on qualitative (heuristic-rule), model-driven, or coarse-grained time-series approaches, which overlook the historical performance, resource allocation status, and service sequence relationships of workflow services. Studies on prediction over workflow graph data are even scarcer, owing to the lack of available public datasets. In this study, we extract nearly one billion offline task instance records from Alibaba Cloud's Cluster-trace-v2018 into a new dataset containing approximately one million workflows and their corresponding directed acyclic graph (DAG) matrices. We propose a novel workflow performance prediction model (DAG-Transformer) to address these challenges. In DAG-Transformer, we design a customized position encoding matrix and an attention mask for workflows, which make full use of workflow sequential and graph relations to improve the embedding representation and perception ability of the deep neural network. The experiments validate the necessity of integrating graph-structure information in workflow prediction. Compared with mainstream deep learning (DL) methods and several classic machine learning (ML) algorithms, DAG-Transformer achieves the highest accuracy: 85-92% for CPU prediction and 94-98% for memory prediction, while maintaining high efficiency and low overhead. This study establishes a new paradigm and baseline for workflow performance prediction and provides a new way to facilitate workflow scheduling.
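The abstract does not spell out how the DAG matrix is converted into the workflow attention mask. As a rough, hedged sketch of the general idea only, the snippet below builds an additive attention mask from a workflow adjacency matrix, letting each task attend to itself and to its ancestors and descendants while masking out unrelated (parallel) tasks. The function name `dag_attention_mask`, the reachability-based visibility rule, and the -inf masking convention are illustrative assumptions, not the paper's actual design.

```python
# A minimal sketch (not the authors' implementation) of turning a workflow's
# DAG adjacency matrix into an additive attention mask for a Transformer.
import numpy as np

def dag_attention_mask(adj: np.ndarray) -> np.ndarray:
    """Build an additive attention mask from a DAG adjacency matrix.

    adj[i, j] == 1 means task i must finish before task j starts.
    Under the assumed rule, a task may attend to itself and to any task
    connected to it through the DAG (ancestor or descendant); all other
    pairs are masked with -inf before the softmax.
    """
    n = adj.shape[0]
    # Transitive closure by repeated boolean matrix products.
    reach = adj.astype(bool) | np.eye(n, dtype=bool)
    for _ in range(n):
        reach = reach | ((reach.astype(int) @ reach.astype(int)) > 0)
    visible = reach | reach.T            # ancestors and descendants are visible
    return np.where(visible, 0.0, -np.inf)

# Example: a 4-task workflow  0 -> 1 -> 3  and  0 -> 2 -> 3
adj = np.array([[0, 1, 1, 0],
                [0, 0, 0, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 0]])
print(dag_attention_mask(adj))   # parallel tasks 1 and 2 cannot attend to each other
```

In this sketch, the resulting matrix would be added to the scaled dot-product attention scores so that masked positions receive zero weight after the softmax; the paper's customized position encoding is a separate component not reproduced here.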

Original language: English
Article number: 100337
Number of pages: 17
Journal: Journal of Industrial Information Integration
Volume: 27
DOIs
Publication status: Published - May 2022

Keywords

  • DAG structure
  • DAG-Transformer
  • Deep Learning
  • Performance prediction
  • Workflow in cloud computing

ASJC Scopus subject areas

  • Industrial and Manufacturing Engineering
  • Information Systems and Management

