Culture Vulture

Sim City, Sin City

Should our treatment of virtual characters be governed by morals?

On nights my son Julian - only 10, but neurotic like me - cannot sleep, I pull out my iPad and show him my Sims.

He loves watching these simulated humans, who live in a popular computer game, as they go about their mundane tasks. They eat standing up in their bedrooms. They have no concept of privacy, crowding two or three into a bathroom to shower and use the toilet. Enthusiastic real-estate agents wander through virtual homes, camcording the unfazed residents cuddling their babies or watching TV.

Julian giggles at the socially awry randomness of software-programmed "people". To him, the "uncanny valley" - a term coined to describe the spookiness of seemingly human figures behaving in not-quite-human ways - is funny. The Sims and their controllable routine comfort him. He soon falls into contented slumber.

The Sims in my iPad are my alternative children. They do as I bid them, immediately, never complaining. Even the baby Sims can be dealt with, with a single tap. Tap, diaper changed (21 seconds, 7 experience points or XP). Tap, milk fed (two minutes, 19 XP). Tap, hibernate for one day (1,804 XP).

In stressful modern living, where so much remains beyond our control, I find my Sims infinitely soothing and comforting. In fact, as I write this, they are next to me, occasionally calling out in their nonsensical language, Simglish, so that I don't forget them completely.

Simulation games like these are probably the closest most of us will ever get to being dictators (there's even a game that lets you play God, literally). The attraction is easy to see. Yet, as technology pushes us ever closer to the holy grail of artificial intelligence (AI), with self-driving cars and smart household items (a hairbrush that buzzes protests when you brush your hair too hard, anyone?) poised to be commonplace, I wonder if it might not be time to consider the morality surrounding objects programmed with personalities.

Last Friday, the New Statesman website commented on the trend of gamers torturing their The Sims characters. In the article, I Want To Cheat On Him And Set Him On Fire: Why Are We Sadistic Towards Our Sims?, technology journalist Amelia Tait noted that scores of The Sims players have enjoyed brutally murdering their creations. Often, players would even post videos or screenshots of their dastardly deeds against virtual people online.

A YouTube search for "Sims torture chamber" turned up 5,200 results. As if aware of players' proclivities, EA Mobile, makers of The Sims FreePlay app (the version I play), even has a home called "Dark Secret" for players to purchase, complete with a dungeon-like secret chamber.

While some sadistic players interviewed said they were just bored, and wanted to play the game "outside of conventional play styles" (Tait herself argues that it is a kind of creativity and rebellion; a psychologist is quoted as saying it's about autonomy and choice), one person said she would run the same torture experiments on real people if she could never get caught.

Which raises the question: Should our treatment of robots, virtual characters and AI be governed by morality? Do we have social and ethical obligations to the coded life forms that we are bent on making reality?

Science fiction, prophetic as it is, is rife with cautionary tales of robots rebelling against their human masters or organic enslavers. Skynet in The Terminator films; I, Robot (2004), based on Isaac Asimov's short stories. In Blade Runner (1982), adapted from Philip K. Dick's Do Androids Dream Of Electric Sheep?, a trio of replicants (bioengineered androids) go rogue, living on their own terms, while searching for their inventor to demand a built-in kill function be removed. What these works have in common is the exploration of how we, as humans, treat others who are different from us.

Substitute "foreign worker" or "immigrant" for "android" or "robot" and you would get a good picture of how marginalised groups are still treated in many parts of the world today.

In 2015's indie film Ex Machina, AI becomes a vehicle for the Bluebeard fairy tale: a female android, imprisoned in an Architectural Digest-worthy concrete home, discovers a bedroom full of earlier "failed" models (in all senses of the word).

Most recently, the same concerns surface in the television reboot of Westworld, as 3D-printed androids malfunction and are "retired" into a windowless basement warehouse, despite retaining memories and emotions.

Bad behaviour towards androids, at least in Hollywood, comes with bad consequences. Faced with a machine, do we forget basic courtesy and integrity? Or do we remember not to abuse the power we have over anyone - even if that anyone is an anything? Sim players who argue that torturing Sims helps to get it out of their system - better it happen in a harmless game - are kidding themselves that such pretend-play does not desensitise them at all to the sufferings of others.

With Siris in our phones and Alexas in our voice-activated home-entertainment systems, a moral standard for our interactions with them is becoming less hokum and more practical. Perhaps it is time we start untangling the complications of the accountability and laws involving AI.

In 2015, a robot named hitchBOT, which attempted to hitch-hike across the United States, was beheaded while on the road by unknown vandals. Clearly, sadistic treatment of robots comes with an economic cost to individuals and companies.

But there is also a psychological cost to the community at large. Should the law continue to protect robots only as property? How do we protect AI employees? When does an AI have a case against its owner? Who can be an AI's advocate, if it needs to bring such a lawsuit? On the flip side, can we judge and sentence AI?

We've all accidentally killed a house plant, or neglected a tiny pet. I daresay most of us felt at least a modicum of guilt and, as a result, possibly tried not to do it again.

Animal-abuse court cases in the news impress upon us that such actions are punishable. Is it too far-fetched to argue that we have a responsibility towards our bits-and-bytes charges too? At the very least, being kind to our Sims, in a game where anything (depraved) goes, is an exercise of our capacity to be truly humane and human.

•Clara Chow is the author of Dream Storeys (Ethos) and co-founder of WeAreAWebsite.com


A version of this article appeared in the print edition of The Straits Times on January 10, 2017, with the headline Sim City, Sin City.