TY - JOUR
T1 - Facial Expressions of Comprehension (FEC)
AU - Turan, Cigdem
AU - Neergaard, Karl David
AU - Lam, Kin Man
N1 - Funding Information:
The work described in this article was supported in part by the GRF Grant PolyU 15217719 (project code: Q73V) of the Hong Kong SAR Government.
Publisher Copyright:
© 2010-2012 IEEE.
PY - 2022/1
Y1 - 2022/1
N2 - While the relationship between facial expressions and emotion has been a productive area of inquiry, research has only recently begun to explore whether a link exists between facial expressions and cognitive processes. Using findings from psychology and neuroscience to guide predictions of affectation during a cognitive task, this article aimed to study facial dynamics as a means to understand comprehension. We present a new multimodal facial expression database, named Facial Expressions of Comprehension (FEC), consisting of videos recorded during a computer-mediated task in which each trial consisted of reading, answering, and receiving feedback on true and false general-knowledge statements. To identify the level of engagement with the corresponding stimuli, we present a new methodology using animation units (AnUs) from the Kinect v2 device to explore the changes in facial configuration caused by an event: Event-Related Intensities (ERIs). To identify dynamic facial configurations, we used ERIs in statistical analyses with generalized additive models. To identify differential facial dynamics linked to knowing vs. guessing and true vs. false responses, we employed an SVM classifier with facial appearance information extracted using LPQ-TOP. Results of ERIs in sentence comprehension show that facial dynamics are a promising means of understanding affective and cognitive states of the mind.
AB - While the relationship between facial expressions and emotion has been a productive area of inquiry, research has only recently begun to explore whether a link exists between facial expressions and cognitive processes. Using findings from psychology and neuroscience to guide predictions of affectation during a cognitive task, this article aimed to study facial dynamics as a means to understand comprehension. We present a new multimodal facial expression database, named Facial Expressions of Comprehension (FEC), consisting of videos recorded during a computer-mediated task in which each trial consisted of reading, answering, and receiving feedback on true and false general-knowledge statements. To identify the level of engagement with the corresponding stimuli, we present a new methodology using animation units (AnUs) from the Kinect v2 device to explore the changes in facial configuration caused by an event: Event-Related Intensities (ERIs). To identify dynamic facial configurations, we used ERIs in statistical analyses with generalized additive models. To identify differential facial dynamics linked to knowing vs. guessing and true vs. false responses, we employed an SVM classifier with facial appearance information extracted using LPQ-TOP. Results of ERIs in sentence comprehension show that facial dynamics are a promising means of understanding affective and cognitive states of the mind.
KW - Facial behavior understanding
KW - facial expression database
KW - sentence comprehension
UR - http://www.scopus.com/inward/record.url?scp=85075332783&partnerID=8YFLogxK
U2 - 10.1109/TAFFC.2019.2954498
DO - 10.1109/TAFFC.2019.2954498
M3 - Journal article
AN - SCOPUS:85075332783
SN - 1949-3045
VL - 13
SP - 335
EP - 346
JO - IEEE Transactions on Affective Computing
JF - IEEE Transactions on Affective Computing
IS - 1
M1 - 8907450
ER -