Artificial intelligence is learning to imitate human compassion – and it’s taking shape everywhere from customer service to psychiatric therapy to full imitations of our dead relatives. But can we automate empathy? Should AI be marketed as having compassion?


Automated compassion: The oxymoron we’ll have to live with


No matter which way you spin it, the idea of automating empathy and compassion is a contradiction. Automated compassion is, by definition, imitated compassion. A machine is only capable of comprehension, identification, and acknowledgement in the sense that it has been programmed to copy human reactions.

But is comparing machines to humans really the point? Artificial intelligence is astonishing in its own right, and has already shown real promise in areas such as healthcare.

The idea of AI pretending to be human, and pretending to comprehend human suffering, is an odd one. Even so, it may have a genuinely useful place in the healthcare of the future, especially when a human isn’t available.

What is automated compassion exactly?

While it’s largely what it says on the tin, some aspects may not be obvious. It’s AI that analyses and reacts to perceived mental or physical distress in humans. It’s a simulation of human emotion, a false impression of it if you will, and it can be used to deliver Cognitive Behavioural Therapy (CBT) or to power a mental health chatbot.

It can also be trained to acknowledge the woes and difficulties voiced in customer service interactions, improving the customer experience and making customers feel understood.

Sentient AI “does not exist”

Gordon Midwood, co-founder of London AI firm Anything World, argues that AI’s lack of sentience makes it hard for it to be genuinely compassionate.

He told Health Tech World: “I think currently automated compassion is something of an oxymoron given that general Artificial Intelligence does not really exist in any meaningful form. This means that a sentient artificial intelligence does not exist.

Given that a sentient Artificial Intelligence does not (yet) exist any attempt at emotion is clearly simulated. And simulated feelings are not genuine feelings, so any compassion given is essentially faked compassion.”[…]
