Why these friendly robots can't be good friends to our kids

Jibo's face is a touchscreen with one white eye that looks around, blinks and even closes when he gets bored with you. PHOTO: WASHINGTON POST

Jibo the robot swivels around when it hears its name and tilts its touchscreen face upwards, expectantly.

"I am a robot, but I am not just a machine," it says. "I have a heart. Well, not a real heart. But feelings. Well, not human feelings. You know what I mean."

Actually, I'm not sure we do. And that's what unsettles me about the wave of "sociable robots" that are coming online. The new releases include Jibo, Cozmo, Kuri and Meccano M.A.X. Although they bear some resemblance to assistants such as Apple's Siri, Google Home and Amazon's Alexa, these robots come with an added dose of personality. They are designed to win us over not with their smarts but with their sociability. They are marketed as companions. And they do more than engage us in conversation - they feign emotion and empathy. This can be disconcerting.

Time magazine, which featured Jibo on the cover of its "25 Best Inventions of 2017" issue last month, hailed the robot as seeming "human in a way that his predecessors do not", in a way that "could fundamentally reshape how we interact with machines".

Reviewers are accepting these robots as "he" or "she" rather than "it". "He told us that blue is his favourite colour and that the shape of macaroni pleases him more than any other," Mr Jeffrey Van Camp wrote about Jibo for Wired.

"Just the other day, he told me how much fun, yet scary it would be to ride on top of a lightning bolt. Somewhere along the way, learning these things, we began to think of him more like a person than an appliance."

But whereas adults may be able to catch themselves in such thoughts and remind themselves that sociable robots are, in fact, appliances, children tend to struggle with that distinction. They are especially susceptible to these robots' pre-programmed bids for attachment. So, before adding a sociable robot to the holiday gift list, parents may want to pause to consider what they would be inviting into their homes.

These machines are seductive and offer the wrong payoff: the illusion of companionship without the demands of friendship, the illusion of connection without the reciprocity of a mutual relationship. And interacting with these empathy machines may get in the way of children's ability to develop a capacity for empathy themselves.


Jibo's creator, Ms Cynthia Breazeal, is a friend and colleague of mine at the Massachusetts Institute of Technology. We've debated the ethics of sociable robots for years. She's excited about the potential for robots that communicate the way people do to enrich our daily lives. I'm concerned about the ways those robots exploit our vulnerabilities and bring us into relationships that diminish our humanity.

In 2001, Ms Breazeal and I did a study - along with Yale robotics pioneer Brian Scassellati and Ms Olivia Daste, who develops robots for the elderly - looking at the emotional impact of sociable robots on children. We introduced 60 children, aged eight to 13, to two early sociable robots: Kismet, built by Ms Breazeal, and Cog, a project on which Mr Scassellati was a principal designer.

I found the encounters worrisome. The children saw the robots as "sort of alive" - alive enough to have thoughts and emotions, alive enough to care about you, alive enough that their feelings for you mattered. They asked the robots: Are you happy? Do you love me? As one 11-year-old girl put it: "It's not like a toy because you can't teach a toy, it's like something that's part of you, you know, something you love, kind of, like another person, like a baby."

In our study, the children were so invested in their relationships with Kismet and Cog that they insisted on understanding the robots as living beings, even when the roboticists explained how the machines worked or when the robots were temporarily broken.

Ms Breazeal talked to an eight-year-old boy about what Kismet was made of and how long it took to build, and still the child thought the robot wasn't broken, but "sleeping with his eyes open, just like my dad does". Their relationships with the robots affected their state of mind and self-esteem. Another boy, also aged eight, concluded that Kismet stopped talking to him because the robot liked his brothers better.

We were led to wonder whether a broken robot can break a child. Kids are central to the sociable-robot project, whose agenda is to make people more comfortable with robots in roles normally reserved for people. Robotics companies know kids are vulnerable consumers who can bring the whole family along.

As Washington Post tech columnist Geoffrey Fowler noted: "Kids... are the most open to making new friends, so that's where bot-makers are focused for now." So far, the main objection to sociable robots for kids has been over privacy. The privacy policies for these robots tend to be squishy, allowing companies to share the data their devices collect - recorded conversations, photos, videos - with vaguely defined service providers and vendors.

That's generating pushback. In October, Mattel scrapped plans for Aristotle - a kind of Alexa for the nursery, designed to accompany children as they progress from lullabies and bedtime stories through high school homework - after lawmakers and child advocacy groups argued that the data the device collected about children could be misused by Mattel, marketers, hackers and other third parties.

Privacy, though, should not be our only concern.

Ms Breazeal's position is this: People have relationships with many classes of things. They have relationships with children and with adults, with animals and with machines. Now, we are going to add robots to the list. Those relationships will be more powerful than the ones we have with pets, less powerful than the ones we have with people. We'll figure it out.

To support their argument, roboticists sometimes point to how children deal with toy dolls. Children animate dolls and turn them into imaginary friends. Why make such a fuss, then, about robots? I've been comparing how children play with traditional dolls and how they relate to robots since Tamagotchis were released in the United States in 1997 as the first computational playmates that asked you to take care of them. The nature of the attachments to dolls and sociable machines is different.

When children play with dolls, they project thoughts and emotions onto them. A girl who has broken her mother's crystal will put her Barbies into detention and use them to work on her feelings of guilt. The dolls take the role she needs them to take. Sociable machines, by contrast, have their own agenda. Playing with robots is not about the psychology of projection but the psychology of engagement. Children try to meet the robot's needs, to understand the robot's unique nature and wants. There is an attempt to build a mutual relationship.

I saw this even with the (relatively) primitive Furby in the early 2000s. A nine-year-old boy summed up the difference between Furbies and action figures: "You don't play with the Furby, you sort of hang out with it. You do try to get power over it, but it has power over you, too."

Today's robots are even more powerful, telling children flat-out that they have emotions, friendships, even dreams to share. Some people might consider that a good thing: encouraging children to think beyond their own needs and goals. Except the whole commercial programme is an exercise in emotional deception.

I've watched people shift from thinking that robotic friends might be good for lonely, elderly people to thinking that robots - offering constant companionship with no fear of loss - may be better than anything human life can provide. In the process, we can forget what is most central to our humanity: truly understanding each other.

For so long, we dreamed of artificial intelligence offering us not only instrumental help but the simple salvations of conversation and care. But now that our fantasy is becoming reality, it is time to confront the emotional downside of living with the robots of our dreams.

WASHINGTON POST


  • The writer is a professor of the social studies of science at MIT.


A version of this article appeared in the print edition of The Straits Times on December 09, 2017, with the headline "Why these friendly robots can't be good friends to our kids".