TY - JOUR
T1 - Workflow performance prediction based on graph structure aware deep attention neural network
AU - Yu, Jixiang
AU - Gao, Ming
AU - Li, Yuchan
AU - Zhang, Zehui
AU - Ip, Wai Hung
AU - Yung, Kai Leung
N1 - Funding Information:
This work is supported by the National Natural Science Foundation of China (71772033, 71831003, 71801031, 72172025), Natural Science Foundation of Liaoning Province, China (Joint Funds for Key Scientific Innovation Bases, 2020-KF-11-11), and also supported by a grant from the Department of Industrial and Systems Engineering of the Hong Kong Polytechnic University (H-ZG3K).
Publisher Copyright:
© 2022 The Author(s)
PY - 2022/5
Y1 - 2022/5
N2 - With the rapid growth of cloud computing, efficient operational optimization and resource scheduling of complex cloud business processes rely on real-time and accurate performance prediction. Previous research on cloud computing performance prediction has focused on qualitative (heuristic-rule), model-driven, or coarse-grained time-series approaches, which ignore the historical performance, resource-allocation status, and service-sequence relationships of workflow services. There are even fewer studies on prediction for workflow graph data, owing to the lack of available public datasets. In this study, from Alibaba Cloud's Cluster-trace-v2018, we extract nearly one billion offline task instance records into a new dataset, which contains approximately one million workflows and their corresponding directed acyclic graph (DAG) matrices. We propose a novel workflow performance prediction model (DAG-Transformer) to address the aforementioned challenges. In DAG-Transformer, we design a customized position-encoding matrix and an attention mask for workflows, which make full use of workflow sequential and graph relations to improve the embedding representation and perception ability of the deep neural network. The experiments validate the necessity of integrating graph-structure information in workflow prediction. Compared with mainstream deep learning (DL) methods and several classic machine learning (ML) algorithms, DAG-Transformer achieves the highest accuracy: 85-92% for CPU prediction and 94-98% for memory prediction, while maintaining high efficiency and low overhead. This study establishes a new paradigm and baseline for workflow performance prediction and provides a new way to facilitate workflow scheduling.
AB - With the rapid growth of cloud computing, efficient operational optimization and resource scheduling of complex cloud business processes rely on real-time and accurate performance prediction. Previous research on cloud computing performance prediction has focused on qualitative (heuristic-rule), model-driven, or coarse-grained time-series approaches, which ignore the historical performance, resource-allocation status, and service-sequence relationships of workflow services. There are even fewer studies on prediction for workflow graph data, owing to the lack of available public datasets. In this study, from Alibaba Cloud's Cluster-trace-v2018, we extract nearly one billion offline task instance records into a new dataset, which contains approximately one million workflows and their corresponding directed acyclic graph (DAG) matrices. We propose a novel workflow performance prediction model (DAG-Transformer) to address the aforementioned challenges. In DAG-Transformer, we design a customized position-encoding matrix and an attention mask for workflows, which make full use of workflow sequential and graph relations to improve the embedding representation and perception ability of the deep neural network. The experiments validate the necessity of integrating graph-structure information in workflow prediction. Compared with mainstream deep learning (DL) methods and several classic machine learning (ML) algorithms, DAG-Transformer achieves the highest accuracy: 85-92% for CPU prediction and 94-98% for memory prediction, while maintaining high efficiency and low overhead. This study establishes a new paradigm and baseline for workflow performance prediction and provides a new way to facilitate workflow scheduling.
KW - DAG structure
KW - DAG-Transformer
KW - Deep learning
KW - Performance prediction
KW - Workflow in cloud computing
UR - http://www.scopus.com/inward/record.url?scp=85124583611&partnerID=8YFLogxK
U2 - 10.1016/j.jii.2022.100337
DO - 10.1016/j.jii.2022.100337
M3 - Journal article
AN - SCOPUS:85124583611
SN - 2452-414X
VL - 27
JO - Journal of Industrial Information Integration
JF - Journal of Industrial Information Integration
M1 - 100337
ER -