A Graph-Attention-Based Method for Single-Resident Daily Activity Recognition in Smart Homes

Jiancong Ye, Hongjie Jiang (Corresponding Author), Junpei Zhong (Corresponding Author)

Research output: Journal article › Academic research › peer-review

2 Citations (Scopus)

Abstract

In ambient-assisted living facilitated by smart home systems, the recognition of daily human activities is of great importance. The task is to infer the resident's daily activities from triggered sensor observation sequences with varying time intervals between successive readouts. This paper introduces a novel deep learning framework based on embedding technology and graph attention networks, namely the time-oriented and location-oriented graph attention (TLGAT) network. The embedding layer converts discrete sensor observations into corresponding feature vectors. TLGAT then treats a sensor observation sequence as a fully connected graph so as to model both the temporal correlation and the sensor-location correlation among sensor observations, and it refines the feature representation of each sensor observation by attending to and weighting the other observations in the sequence. Experiments were conducted on two public datasets under diverse setups of sensor event sequence length. The results show that the proposed method achieves favorable performance across these setups.
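
The abstract's core idea, embedding discrete sensor events and letting each event attend to every other event in a fully connected graph, can be illustrated with a short sketch. The code below is not the authors' TLGAT implementation; it is a minimal, hypothetical PyTorch example (the class name SimpleGraphAttentionHAR, all dimensions, and the mean-pooling classifier head are illustrative assumptions) showing how an embedding layer plus a single GAT-style attention layer over sensor-event nodes could produce an activity prediction.

```python
# Hypothetical sketch, not the authors' TLGAT code: embed sensor events and
# apply one graph-attention layer over a fully connected graph of events.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGraphAttentionHAR(nn.Module):
    def __init__(self, num_sensor_ids: int, embed_dim: int = 32,
                 attn_dim: int = 32, num_classes: int = 10):
        super().__init__()
        # Embedding converts each discrete sensor observation (sensor id)
        # into a dense feature vector.
        self.embed = nn.Embedding(num_sensor_ids, embed_dim)
        self.proj = nn.Linear(embed_dim, attn_dim, bias=False)
        # Additive attention score over pairs of node features (GAT-style).
        self.attn = nn.Linear(2 * attn_dim, 1, bias=False)
        self.classifier = nn.Linear(attn_dim, num_classes)

    def forward(self, sensor_ids: torch.Tensor) -> torch.Tensor:
        # sensor_ids: (batch, seq_len) integer ids of triggered sensors.
        h = self.proj(self.embed(sensor_ids))            # (B, T, D)
        B, T, D = h.shape
        # Build all (i, j) node pairs of the fully connected event graph.
        hi = h.unsqueeze(2).expand(B, T, T, D)
        hj = h.unsqueeze(1).expand(B, T, T, D)
        scores = self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1)  # (B, T, T)
        alpha = F.softmax(F.leaky_relu(scores), dim=-1)
        # Each observation aggregates the others, weighted by attention.
        h_new = torch.einsum("btj,bjd->btd", alpha, h)
        # Mean-pool the refined event features and classify the activity.
        return self.classifier(h_new.mean(dim=1))


if __name__ == "__main__":
    model = SimpleGraphAttentionHAR(num_sensor_ids=50, num_classes=8)
    batch = torch.randint(0, 50, (4, 20))  # 4 sequences of 20 sensor events
    print(model(batch).shape)              # torch.Size([4, 8])
```

In TLGAT itself the attention is reportedly split into time-oriented and location-oriented components; here a single generic attention layer stands in for both, purely to make the graph-attention mechanism concrete.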

Original language: English
Article number: 1626
Journal: Sensors
Volume: 23
Issue number: 3
DOIs
Publication status: Published - 2 Feb 2023

Keywords

  • deep learning
  • embedding
  • graph attention network
  • human activity recognition
  • smart home

ASJC Scopus subject areas

  • Analytical Chemistry
  • Information Systems
  • Atomic and Molecular Physics, and Optics
  • Biochemistry
  • Instrumentation
  • Electrical and Electronic Engineering
