Integrating psychology and virtual reality to enable dynamic posture recognition and delivery
ANR-PRC (ANR-22-CE38-0010-01)
Emotions are major cues in human communication: they help us understand others' mental states and provide appropriate responses. As a complex phenomenon, they are expressed through different channels: verbal, facial (e.g., Emotional Facial Expressions, EFEs) or bodily. The POSTURE project aims to determine the role of dynamic posture in nonverbal communication, using new technologies such as motion capture and virtual reality.
Context and objectives
Current knowledge about the influence of postural cues on emotional expression is limited. From an integrative perspective, many questions arise: do postural cues reinforce or complement EFEs, or are they intrinsic cues in their own right? Do certain body segments stand out in the transmission of an emotion? Are there gender differences? Is the role of posture the same whatever the type of emotion (basic, social)? Answering these questions requires tackling unsolved problems in both psychology and virtual reality. To address this challenge, computer scientists have to go beyond the limits of current technologies, combining imaging and AI (machine learning).
Method
By combining multidisciplinary expertise in motion capture and computation (LICIIS, URCA), avatar rendering (ICUBE, UNISTRA) and experimental psychology (C2S), we plan to achieve these objectives by providing an open-access database of animated avatars delivering postures with or without EFEs.
Impact
This project, and the database it will produce, is intended to serve as a reference for future studies in the social sciences and humanities (SHS), but it could also have a broad impact on virtual reality applications in domains involving social interaction, such as education, learning, clinical rehabilitation or professional integration.