TY - GEN
T1 - Memory-Efficient Domain Incremental Learning for Internet of Things
AU - Zhao, Yuqing
AU - Saxena, Divya
AU - Cao, Jiannong
N1 - Funding Information:
We would like to thank the anonymous reviewers for their valuable comments. This research was supported by the Research Institute for Artificial Intelligence of Things, The Hong Kong Polytechnic University, Collaborative Research Fund (CRF), No. C5026-18G, and RIF-RGC Research Impact Fund No. R5034-18.
Publisher Copyright:
© 2022 ACM.
PY - 2022/11/6
Y1 - 2022/11/6
N2 - In Internet of Things (IoT) scenarios such as smart homes, autonomous vehicles, and wearable devices, data patterns change over time due to changing environments and user requirements, known as domain shifts. When encountering domain shifts, deep neural network models in IoT suffer from performance degradation and must be retrained from scratch to adapt to domain shifts incrementally. Therefore, incremental learning is needed to adapt a model to domain shifts without retraining. Existing methods using the parameter isolation technique perform well in incremental learning of new domains without performance degradation. However, they cannot be directly adopted in IoT applications as they store masks and require users to label the task to indicate task-specific parameters during inference, which is memory-inefficient and cumbersome. In this paper, we propose a memory-efficient method for IoT to incrementally adapt to domain shifts in a fixed neural network, named E-DomainIL. Our method freezes learned parameters and allows reusing them later in training to avoid interference between different domains. E-DomainIL does not require task labels or stored masks, as it uses all parameters during inference. We use data-driven pruning to adjust the parameter ratio according to the dataset, thus maintaining the balance between accuracy and parameter efficiency. Experimental results on image classification benchmarks demonstrate our method's efficiency and accuracy.
AB - In Internet of Things (IoT) scenarios such as smart homes, autonomous vehicles, and wearable devices, data patterns change over time due to changing environments and user requirements, known as domain shifts. When encountering domain shifts, deep neural network models in IoT suffer from performance degradation and must be retrained from scratch to adapt to domain shifts incrementally. Therefore, incremental learning is needed to adapt a model to domain shifts without retraining. Existing methods using the parameter isolation technique perform well in incremental learning of new domains without performance degradation. However, they cannot be directly adopted in IoT applications as they store masks and require users to label the task to indicate task-specific parameters during inference, which is memory-inefficient and cumbersome. In this paper, we propose a memory-efficient method for IoT to incrementally adapt to domain shifts in a fixed neural network, named E-DomainIL. Our method freezes learned parameters and allows reusing them later in training to avoid interference between different domains. E-DomainIL does not require task labels or stored masks, as it uses all parameters during inference. We use data-driven pruning to adjust the parameter ratio according to the dataset, thus maintaining the balance between accuracy and parameter efficiency. Experimental results on image classification benchmarks demonstrate our method's efficiency and accuracy.
KW - catastrophic forgetting
KW - data-driven pruning
KW - domain incremental learning
KW - domain shift
KW - internet of things
UR - https://www.scopus.com/pages/publications/85147542693
U2 - 10.1145/3560905.3568436
DO - 10.1145/3560905.3568436
M3 - Conference article published in proceeding or book
AN - SCOPUS:85147542693
T3 - SenSys 2022 - Proceedings of the 20th ACM Conference on Embedded Networked Sensor Systems
SP - 1175
EP - 1181
BT - SenSys 2022 - Proceedings of the 20th ACM Conference on Embedded Networked Sensor Systems
PB - Association for Computing Machinery, Inc
T2 - 20th ACM Conference on Embedded Networked Sensor Systems, SenSys 2022
Y2 - 6 November 2022 through 9 November 2022
ER -