The problem with AI and ‘empathy’

If technology redefines what our language means, it could also change our perceptions of ourselves.

One problem with calling machines “empathic” is that it provides cover for actions which would otherwise feel morally uncomfortable, says the writer.

PHOTO: REUTERS

Sarah O’Connor

One after another, the “uniquely human” traits we once thought would remain untouched by the rise of the machines have started to look vulnerable after all. First it was creativity. Is empathy next?

If you have been reading the research of late, you could be forgiven for thinking so. In one study, a team of licensed healthcare professionals compared the responses of chatbots and real doctors to patient questions posed in an online forum. The chatbot responses were rated significantly higher, not just for quality but for empathy.
