Evaluation of an Online Remote Photoplethysmography Methodology for Emotion Recognition in a Child-Robot Interaction
Name: LUCAS CÔGO LAMPIER
Publication date: 28/04/2020
Supervisor:
Name | Role |
---|---|
TEODIANO FREIRE BASTOS FILHO | Advisor |
Examining board:
Name | Role |
---|---|
ALAN SILVA DA PAZ FLORIANO | External Examiner |
PATRICK MARQUES CIARELLI | Internal Examiner |
TEODIANO FREIRE BASTOS FILHO | Advisor |
Summary: The New Mobile Autonomous Robot for Interaction with Autistics (N-MARIA) is a robot built at UFES to assist therapy with children with Autism Spectrum Disorder (ASD). It has audiovisual communication equipment, a system that allows the therapist to send commands to the robot, and an algorithm that allows it to follow the child at a safe distance. Aiming to improve N-MARIA's Child-Robot Interaction (CRI), this work proposes an online remote photoplethysmography (rPPG) method to extract the pulse rate signal using a webcam. The extracted cardiac information is then used to train a classifier and infer the child's emotion during CRI. The results are presented to the therapist through a Graphical User Interface (GUI). The algorithm is designed to work online using a low-cost webcam. Different rPPG techniques from the literature are evaluated in terms of accuracy and processing time and compared with ground-truth Electrocardiography (ECG) and Photoplethysmography (PPG). The results show that the heart rate measurement error is relatively low while the subject stays still in front of the camera (median error of 3 bpm), but the method fails in situations of fast movement (median error of 15 bpm). For the emotion recognition system, the classifier could not reliably differentiate the emotions using either the rPPG signal or the ECG; however, in general, emotion classification with rPPG performed better than with ECG.
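As an illustration of the kind of processing the abstract describes, the sketch below shows a common rPPG baseline (not necessarily the exact pipeline of this thesis): average the green channel of the face region in each webcam frame, then pick the dominant spectral peak inside the physiological band to obtain a pulse rate in bpm. The function name, band limits, and synthetic test trace are assumptions for the example.

```python
import numpy as np

def estimate_heart_rate(green_means, fps=30.0, band=(0.7, 4.0)):
    """Estimate pulse rate (bpm) from per-frame mean green-channel values.

    A classic rPPG baseline: remove the DC component of the trace,
    compute its magnitude spectrum, and select the strongest frequency
    within the physiological band (0.7-4 Hz covers ~42-240 bpm).
    """
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                               # detrend (remove DC)
    spectrum = np.abs(np.fft.rfft(x))              # magnitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)   # frequency axis in Hz
    mask = (freqs >= band[0]) & (freqs <= band[1]) # physiological band
    peak_hz = freqs[mask][np.argmax(spectrum[mask])]
    return 60.0 * peak_hz                          # Hz -> beats per minute

# Synthetic 10 s trace at 30 fps: a 1.2 Hz (72 bpm) pulse plus noise,
# standing in for the green-channel means of a real face video.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 30.0)
trace = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)
print(estimate_heart_rate(trace))  # close to 72 bpm
```

In a live setting the motion failures reported above show up here as energy spread across the band, which is why stillness in front of the camera matters for this class of method.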