“Do you know Castle in the Sky?”

“Yes. It’s a favorite among Studio Ghibli films.”

“Ok, when was the movie released?”

“1986, I think. Pazu helps Princess Sheeta at the risk of his life during the whole story. It’s nice.”

Overheard out of context, this might seem like a pretty average conversation between friends on trivia night, except it’s not. We are in a room at Japan’s Waseda University, where a researcher is chatting with SCHEMA, a plastic-bodied humanoid robot that has all the charm of WALL-E and the encyclopedic knowledge of a true movie buff. This is the Perceptual Computing Laboratory, where researchers led by Professor Tetsunori Kobayashi are busy working on human-computer interaction that promises to create friendly robots much like Baymax of the big screen.

The era of emotional robots

The field of human-computer interaction has made considerable headway since its infancy in the 1980s. Now it is being used to crack the next frontier in artificial intelligence: empathy. The first rung of the empathy ladder is motor mimicry and emotional contagion, which is what SCHEMA attempts to simulate through a sophisticated algorithm that lets it pick the most natural-sounding response from its dialogue datasets.
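The article does not describe SCHEMA’s internals, but a toy retrieval-style responder gives a rough feel for what “picking the most natural response from a dataset” can mean in practice. Everything below, including the dialogue pairs, function names and token-overlap scoring, is an illustrative assumption, not the lab’s actual system.

```python
# Illustrative sketch only: a toy retrieval-based responder that picks the
# stored reply whose context best matches the user's utterance.
# The dialogue pairs and the scoring rule are invented for illustration.

from collections import Counter

# Hypothetical (context, response) pairs standing in for a dialogue dataset.
DIALOGUE_DATA = [
    ("do you know castle in the sky", "Yes. It's a favorite among Studio Ghibli films."),
    ("when was the movie released", "1986, I think."),
    ("who directed it", "Hayao Miyazaki directed it."),
]

def _tokens(text: str) -> Counter:
    """Lowercased bag-of-words representation of an utterance."""
    return Counter(text.lower().split())

def overlap_score(a: str, b: str) -> int:
    """Count how many tokens the two utterances share (a crude similarity)."""
    ta, tb = _tokens(a), _tokens(b)
    return sum((ta & tb).values())

def pick_response(user_utterance: str) -> str:
    """Return the stored response whose context overlaps most with the input."""
    _, best_response = max(
        DIALOGUE_DATA, key=lambda pair: overlap_score(user_utterance, pair[0])
    )
    return best_response

if __name__ == "__main__":
    print(pick_response("Ok, when was the movie released?"))  # -> "1986, I think."
```

Real dialogue systems score candidates with far richer features than word overlap, but the structure, matching an input against stored examples and returning the best-scoring reply, is the same basic idea.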

Even more remarkable are the facial recognition systems being developed to help such robots read people. Psychologist Dr. Paul Ekman certainly never imagined that his research would be used as a primer on human emotion for robots. Knowing what other people are feeling is something we take for granted, but it is a considerable challenge for those building the robot nannies and caretakers of the future.
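Ekman’s research catalogued how basic emotions show up as combinations of facial movements (action units), and emotion-reading systems often build on that mapping. The sketch below is a heavily simplified, assumed prototype-matching rule over hypothetical action-unit detections; real systems train classifiers on video features rather than using a lookup like this.

```python
# Illustrative sketch only: mapping detected facial action units (AUs) to
# Ekman's six basic emotions with a simple prototype-matching rule.
# The AU-to-emotion prototypes are a simplified illustration, not a
# production affect-recognition model.

# Hypothetical prototypes: which AUs tend to co-occur with each basic emotion.
EMOTION_PROTOTYPES = {
    "happiness": {"AU6", "AU12"},                   # cheek raiser, lip corner puller
    "sadness":   {"AU1", "AU4", "AU15"},
    "surprise":  {"AU1", "AU2", "AU5", "AU26"},
    "fear":      {"AU1", "AU2", "AU4", "AU5", "AU20"},
    "anger":     {"AU4", "AU5", "AU7", "AU23"},
    "disgust":   {"AU9", "AU15"},
}

def classify_emotion(detected_aus: set) -> str:
    """Pick the emotion whose prototype shares the most AUs with the detection."""
    def match(emotion: str) -> int:
        return len(EMOTION_PROTOTYPES[emotion] & detected_aus)
    return max(EMOTION_PROTOTYPES, key=match)

if __name__ == "__main__":
    # A smile detected as cheek raiser + lip corner puller.
    print(classify_emotion({"AU6", "AU12"}))  # -> "happiness"
```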

Challenges to overcome

The lack of empathy in robots is an important challenge to overcome, argues Richard Yonck, executive director of Intelligent Future Consulting. “Emotions will be critical in making machine intelligence more compatible with our own,” he said. And given the burgeoning ability of machine learning to process copious amounts of data, it is easy to understand why we might need to temper logical analysis with empathy.

Other challenges on the horizon include a potential backlash against the mass collection of emotional data. While we may be increasingly comfortable entrusting our credit card numbers, addresses and all manner of personal data to large tech behemoths, will we agree to our emotions being tracked and recorded for purposes we know little about? […]