Simulated Empathy for Social Robots


Start date: 01.08.2013
Funded by: Universität Augsburg
Local head of project: Prof. Dr. Elisabeth André
Local scientists: M.Sc. Kathrin Janowski
M.Sc. Hannes Ritschel
M.Sc. Florian Lingenfelser
Dr. Johannes Wagner
M.Sc. Markus Häring
Prof. Dr. Birgit Lugrin

Abstract

This project's objective is to make social robots react empathically to the emotional state of their human conversation partner.

Description

[Figure: Zeno mirrors the user's facial expression]

The simplest form of empathy is mirroring the emotions observed in the interlocutor, something humans usually do subconsciously.

Our current prototype employs the Social Signal Interpretation (SSI) framework to analyse the user's tone of voice and facial expressions. The recognized emotions are then mapped to numerical values along the two axes "pleasure" and "arousal".
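As an illustration of this mapping, the following sketch shows how discrete emotion labels could be placed on the pleasure/arousal plane and how the estimates from face and voice might be fused into a single point. The category names, coordinates and weighting are illustrative assumptions, not the project's actual calibration or the SSI pipeline itself.

# Illustrative sketch only: label names, coordinates and fusion weight are assumptions.

# Each emotion label maps to a (pleasure, arousal) pair in [-1, 1].
EMOTION_TO_PA = {
    "joy":      ( 0.8,  0.5),
    "anger":    (-0.6,  0.7),
    "sadness":  (-0.7, -0.4),
    "surprise": ( 0.2,  0.8),
    "neutral":  ( 0.0,  0.0),
}

def fuse_modalities(face_emotion: str, voice_emotion: str,
                    face_weight: float = 0.6) -> tuple[float, float]:
    """Combine facial and vocal estimates into one point on the pleasure/arousal plane."""
    fp, fa = EMOTION_TO_PA.get(face_emotion, (0.0, 0.0))
    vp, va = EMOTION_TO_PA.get(voice_emotion, (0.0, 0.0))
    pleasure = face_weight * fp + (1.0 - face_weight) * vp
    arousal = face_weight * fa + (1.0 - face_weight) * va
    return pleasure, arousal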

The robot we use, "Zeno", is a RoboKind R50 built by the Texan company Hanson Robotics. He has a human-like body and additional servo motors under a synthetic skin which allow his facial expressions to be adapted. The application which controls his behavior continually adapts his facial expression and head pose to the emotional state recognized by the SSI pipeline.
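A minimal sketch of how such continual adaptation could look, assuming the pleasure/arousal values are translated into a few abstract expression parameters and smoothed over time so the face changes gradually rather than jumping. The parameter names, value ranges and gains are assumptions for illustration and do not reflect the actual RoboKind control interface.

# Illustrative sketch only: parameter names, ranges and gains are assumptions.

class ExpressionController:
    def __init__(self, smoothing: float = 0.2):
        self.smoothing = smoothing           # 0 = frozen, 1 = instant change
        self.state = {"mouth_corners": 0.0,  # -1 frown .. +1 smile
                      "eyebrows": 0.0,       # -1 lowered .. +1 raised
                      "head_pitch": 0.0}     # -1 tilted down .. +1 tilted up

    def update(self, pleasure: float, arousal: float) -> dict:
        """Move the current expression a small step towards the recognized state."""
        target = {
            "mouth_corners": pleasure,       # smile follows pleasure
            "eyebrows": max(0.0, arousal),   # raised brows with high arousal
            "head_pitch": 0.3 * pleasure,    # slight upward tilt when positive
        }
        for name, value in target.items():
            self.state[name] += self.smoothing * (value - self.state[name])
        return dict(self.state)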

Additionally, Zeno has several lists of sentences which he can use to comment on the situation when the user pauses in their speech. The exact moment at which the robot speaks depends on various factors, for example the intensity of the current emotional state or the time that has passed since the last comment, as sketched below.
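One way this timing decision could be expressed, assuming the intensity is measured as the distance of the pleasure/arousal point from neutral and a minimum interval is enforced between comments; the thresholds and the comment-selection helper are hypothetical.

# Illustrative sketch only: thresholds and helper functions are assumptions.

import math
import random
import time

def should_comment(pleasure: float, arousal: float,
                   last_comment_time: float,
                   min_interval: float = 10.0,
                   intensity_threshold: float = 0.4) -> bool:
    """Decide whether to comment during a pause in the user's speech."""
    intensity = math.hypot(pleasure, arousal)   # distance from the neutral state
    elapsed = time.time() - last_comment_time
    if elapsed < min_interval:                  # avoid commenting too frequently
        return False
    return intensity >= intensity_threshold

def pick_comment(pleasure: float, positive_comments: list[str],
                 negative_comments: list[str]) -> str:
    """Choose a sentence from the list matching the sign of the pleasure value."""
    pool = positive_comments if pleasure >= 0 else negative_comments
    return random.choice(pool)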

Media Coverage (German)

"Roboter Zeno - Ist es bescheuert, Gefühle für eine Maschine zu haben?" bei "PULS", BR 24.03.2016

"Extrem Robotern - Ein echt netter Typ", Deutschlandfunk Nova 07.03.2016

"Algorithmus der Gefühle - Menschliche Roboter", SWR2 05.02.2016

"Robo Sapiens - Können Roboter den Menschen ersetzen?" bei "X:enius", arte 18.06.2015

"Empathische Alice" bei "Quarks & Co", WDR 12.11.2013