TY - GEN
T1 - Memory-Gated Recurrent Networks
AU - Zhang, Yaquan
AU - Wu, Qi
AU - Peng, Nanbo
AU - Dai, Min
AU - Zhang, Jing
AU - Wang, Hu
N1 - Funding Information:
Qi WU acknowledges the support from the JD Digits - CityU Joint Laboratory in Financial Technology and Engineering, the Laboratory for AI Powered Financial Technologies, and the GRF support from the Hong Kong Research Grants Council under GRF 14211316, 14206117, and 11219420.
Funding Information:
Min DAI acknowledges support from Singapore AcRF grants (Grant No. R-703-000-032-112, R-146-000-306-114 and R-146-000-311-114), and the National Natural Science Foundation of China (Grant 11671292).
Publisher Copyright:
Copyright © 2021, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
PY - 2021/2
Y1 - 2021/2
N2 - The essence of multivariate sequential learning lies in how to extract dependencies in data. These data sets, such as hourly medical records in intensive care units and multi-frequency phonetic time series, often exhibit not only strong serial dependencies in the individual components (the “marginal” memory) but also non-negligible memories in the cross-sectional dependencies (the “joint” memory). Because of the multivariate complexity in the evolution of the joint distribution that underlies the data-generating process, we take a data-driven approach and construct a novel recurrent network architecture, termed Memory-Gated Recurrent Networks (mGRN), with gates explicitly regulating two distinct types of memories: the marginal memory and the joint memory. Through a combination of comprehensive simulation studies and empirical experiments on a range of public datasets, we show that our proposed mGRN architecture consistently outperforms state-of-the-art architectures targeting multivariate time series.
AB - The essence of multivariate sequential learning lies in how to extract dependencies in data. These data sets, such as hourly medical records in intensive care units and multi-frequency phonetic time series, often exhibit not only strong serial dependencies in the individual components (the “marginal” memory) but also non-negligible memories in the cross-sectional dependencies (the “joint” memory). Because of the multivariate complexity in the evolution of the joint distribution that underlies the data-generating process, we take a data-driven approach and construct a novel recurrent network architecture, termed Memory-Gated Recurrent Networks (mGRN), with gates explicitly regulating two distinct types of memories: the marginal memory and the joint memory. Through a combination of comprehensive simulation studies and empirical experiments on a range of public datasets, we show that our proposed mGRN architecture consistently outperforms state-of-the-art architectures targeting multivariate time series.
UR - http://www.scopus.com/inward/record.url?scp=85130059586&partnerID=8YFLogxK
M3 - Conference article published in proceeding or book
AN - SCOPUS:85130059586
VL - 12B
T3 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
SP - 10956
EP - 10963
BT - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
PB - Association for the Advancement of Artificial Intelligence
T2 - 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Y2 - 2 February 2021 through 9 February 2021
ER -