Facial Recognition Technology in both Physical and Virtual Environments: Its Measurement, and Mechanism

Hee Sun Choi, Wang Zhang, Tian Cheng, Xinmei Liang

Research output: Unpublished conference presentation (presented paper, abstract, poster) · Conference presentation (not published in journal/proceedings/book) · Academic research · peer-reviewed

Abstract

This study explores the potential of facial emotion recognition as a neurological research tool in urban studies, particularly in privately owned public spaces within highly dense residential areas. Data collected from 30 residents and users were analyzed by capturing their facial expressions in real time in response to three-dimensional design elements. High recognition rates were achieved for emotions such as "Neutral," "Happiness," "Sadness," and "Frustration." The study highlights two key findings. Firstly, it underscores the value of facial recognition technology in studying the spatial cognition of living environments. Secondly, it provides visual mapping of this spatial cognition through Agent-Based Modeling, allowing the nuances of how spatial settings and elements affect users to be captured.
In light of these findings, this study demonstrates the validity and reliability of facial emotion recognition in urban morphology based on: (a) emotionally-based morphology theories, (b) principles of computer algorithms, and (c) visualization studies.
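The "high recognition rates" reported above can be quantified per emotion as the share of correctly classified expressions for each category. The sketch below illustrates that computation; the labels and data are hypothetical stand-ins, not the study's actual dataset of 30 participants.

```python
from collections import Counter

# Hypothetical ground-truth and predicted emotion labels, for illustration only;
# the study's real-time capture data from 30 residents is not reproduced here.
true_labels = ["Neutral", "Happiness", "Sadness", "Frustration", "Neutral", "Happiness"]
predicted   = ["Neutral", "Happiness", "Sadness", "Neutral",     "Neutral", "Happiness"]

def recognition_rates(truth, preds):
    """Per-emotion recognition rate: correct predictions / occurrences of that emotion."""
    totals = Counter(truth)                                    # how often each emotion occurs
    correct = Counter(t for t, p in zip(truth, preds) if t == p)  # how often it was matched
    return {emotion: correct[emotion] / totals[emotion] for emotion in totals}

rates = recognition_rates(true_labels, predicted)
# e.g. "Frustration" is misclassified once here, so its rate is 0.0
```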
Original language: English
Publication status: Published - 1 Jul 2025
Event: IFoU (International Forum on Urbanism) 2025, 17th IFoU Conference Lisbon: Future Livings - Universidade de Lisboa, Lisbon, Portugal
Duration: 1 Jul 2025 - 4 Jul 2025
Conference number: 17
https://ifou2025.fa.ulisboa.pt

Conference

Conference: IFoU (International Forum on Urbanism) 2025, 17th IFoU Conference Lisbon: Future Livings
Country/Territory: Portugal
City: Lisbon
Period: 1/07/25 - 4/07/25
Internet address: https://ifou2025.fa.ulisboa.pt
