It is a robot and it has feelings (II)
In an earlier post, titled “It’s a robot and it has feelings”, I discussed the possibility of incorporating “feelings” into robots, in a way similar to how human emotions are displayed in response to external stimuli. Following that post, an obvious question arises: what for? Or, more pragmatically: what advantages can a robot gain from emulating emotional responses?
Empathy is defined as the cognitive ability to perceive what another being feels, and can be divided into two main components: affective empathy, also called emotional empathy, which is the ability to respond emotionally and appropriately to the mental states of another being; and cognitive empathy, which is the ability to understand the point of view or mental state of another being.
What is the added value of this type of “empathic” communication? On the one hand, empathy improves the efficiency of interaction. As we perform actions, we send signals that communicate our intentions (gaze, movements of the hands and body, etc.), which beings equipped to perceive these signals can identify and use to collaborate more efficiently toward joint objectives. On the other hand, empathic interaction could help lessen the apprehension that some users feel when interacting with robotic devices, making them more comfortable with robots and other machines.
Endowing robots with behavior that simulates human “emotional” behavior is one of the ultimate goals of robotics. Such emotional behavior could allow robots to show their mood (affective empathy) as well as perceive that of the users interacting with them (cognitive empathy).
However, despite the impressive advances made in recent years in artificial intelligence, speech recognition and synthesis, computer vision and many other disciplines directly or indirectly related to emotion recognition and artificial emotional expressiveness, we are still far from being able to endow robots with empathic capacities similar to those of a human being.
Take as an example a friend who has just gone to two job interviews but was offered only one of the jobs. Should you show satisfaction or disappointment for your friend, or give the event any importance at all? Your response will obviously depend on what you know of your friend’s goals and aspirations. There are two components here that would be difficult to implement in a robot. First, the robot would need rich knowledge of itself, including personal motivations, weaknesses, strengths, history of successes and failures, etc. Second, its own identity would have to overlap with that of its human companion enough to provide shared knowledge that is meaningful and genuine.
Since we are still far from being able to develop robots with such human-like empathy, robotic system developers need to understand when to show empathy, at what level and in what contexts. For example, empathy may not be necessary in a washing machine, but empathic behavior can clearly improve the performance of robotic applications in areas such as education or the home. On the other hand, pre-programmed empathic behaviors can become annoying and ineffective: studies indicate that drivers end up refusing (as an act of rebellion) to listen to a car that repeatedly says “you seem to be tired and should stop.”
In this sense, one of the lines of research in which CARTIF participates is the development of robots and social avatars capable of recognizing and expressing emotions, and of doing so in a way that suits their operating environment and the services they offer. Along this line, and as members of the European robotics platform euRobotics AISBL (Association Internationale Sans But Lucratif), we actively interact with other European centers of recognized prestige within the group “Natural Interaction with Social Robots”, whose goal is the discussion and dissemination of cutting-edge advances at the European level in the field of interaction between humans and social robots.
One of the recent activities carried out in this context was the European Robotics Forum held last March in Edinburgh, where we had the opportunity to discuss with other members of the group the needs, recommendations and future lines of work in the development of robots with empathic capacity. From the discussions that took place at this forum, I would like to summarize the following notes which, although general in nature, will surely shape European research trends in social robotics in the coming years:
- It is necessary to continue researching what empathy means for different types of robots, such as exoskeletons, social robots, service robots, manufacturing robots, etc., and to investigate how those robots can express empathy in their respective contexts of application.
- Empathic interaction must be a dynamic process that evolves in order to build a relationship with the user over time. Pre-programmed, repetitive behaviors are not perceived as empathic by the user, especially when the behavioral cues used to trigger the robot’s actions are already known to the user.
- Because robots do not possess the physiological processes that would allow them to be genuinely empathic, the practical solution is to detect the socio-emotional signals transmitted by humans and have the robot mimic the empathic behavioral responses that a human would display (a minimal sketch of this detect-and-mimic idea follows this list).
- During experimentation with empathic robots, we should use systems that are sufficiently complex and capable to investigate the different aspects of empathic behavior and to assess their impact quantitatively. Once the different aspects of empathic interaction have been studied, the most relevant ones could be selected and implemented in simple, low-cost systems for commercialization.
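To make the detect-and-mimic idea above a little more concrete, here is a minimal sketch in Python of how a robot controller might map a user’s detected emotional state to a mimicked behavioral response. Everything in it is an illustrative assumption rather than an existing API: the emotion labels, the DetectedEmotion structure and the MIMIC_RESPONSES table would in practice be provided by, and tuned to, a real perception module (facial expression, speech tone, posture, etc.).

```python
# Hypothetical sketch of the detect-and-mimic loop described above.
# The emotion labels, DetectedEmotion structure and response table are
# illustrative assumptions, not an existing robot API.

from dataclasses import dataclass


@dataclass
class DetectedEmotion:
    """Output a perception module might produce for the user's state."""
    label: str         # e.g. "joy", "sadness", "anger", "neutral"
    confidence: float  # detector confidence in the range 0.0 .. 1.0


# Mapping from the user's detected emotion to the behavior the robot mimics:
# a facial expression plus a spoken phrase. Purely illustrative values.
MIMIC_RESPONSES = {
    "joy":     ("smile",   "That sounds great!"),
    "sadness": ("concern", "I'm sorry to hear that."),
    "anger":   ("calm",    "I understand this is frustrating."),
    "neutral": ("neutral", ""),
}


def empathic_reaction(emotion: DetectedEmotion, threshold: float = 0.6):
    """Return the (expression, utterance) pair the robot should mimic.

    Falls back to a neutral response when the detector is not confident
    enough, so the robot does not react to noisy signals.
    """
    if emotion.confidence < threshold:
        return MIMIC_RESPONSES["neutral"]
    return MIMIC_RESPONSES.get(emotion.label, MIMIC_RESPONSES["neutral"])


if __name__ == "__main__":
    # Example: the perception module reports that the user looks sad.
    detected = DetectedEmotion(label="sadness", confidence=0.82)
    expression, utterance = empathic_reaction(detected)
    print(f"Robot shows '{expression}' and says: {utterance}")
```

In line with the earlier point about empathic interaction being a dynamic process, such a fixed response table would eventually have to give way to a policy that adapts to the individual user over time, rather than repeating the same pre-programmed reactions.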