Singapore's plan to put autonomous vehicles (AVs) on the roads here was praised by global robotics expert Ayanna Howard, who also raised questions about how AV tests are currently conducted overseas.
A former Nasa researcher named by Forbes magazine as one of the top 50 women in tech in the United States, Dr Howard also said there is a need to consider the human factor in testing AVs.
"A lot of times, with self-driving cars, the issue is that other cars are driven by humans. Most humans don't always follow traffic rules exactly. In a test done in California, drivers would stop, then go, because they wanted to see what the self-driving car would do," said the 47-year-old professor and chair of interactive computing at Georgia Tech in the US.
"You may also not see a child running across the street during testing. How do you test for such a scenario?"
Last week, Senior Minister of State for Transport Janil Puthucheary announced that public roads in western Singapore will become a test bed for self-driving vehicles.
This will allow companies to test their AVs in neighbourhoods such as Bukit Timah, Clementi and Jurong.
The test bed will be expanded gradually to cover more than 1,000km of public roads here over the next several years.
Dr Howard, who was in Singapore yesterday, spoke to The Sunday Times on the sidelines of the Out Of The Box Conference 2019: Artificial Intelligence (AI), held at the Legacy Centre in Marymount. The conference was part of a series of professional talks organised by AMP Singapore, a non-profit organisation that serves Muslim professionals.
Dr Howard said it is important that AV tests are conducted here and that researchers are not reliant on just one set of data.
"Having this testing done in a place like Singapore is so important because most of the testing for autonomous vehicles is done in the US," she said.
The value of data from Singapore's AV tests, she said, is that it would differ from data gathered in the US, since Singapore motorists drive on the opposite side of the road and the two countries probably have different driving cultures.
She also said there is little reason to fear technology, including robots.
Dr Howard worked in the National Aeronautics and Space Administration's Jet Propulsion Laboratory from 1993 to 2005, where she developed a next-generation Mars rover for future missions. Powered by AI, the robot is designed to be independent enough to explore the Martian terrain on its own, without having its every move programmed by a human.
"Most of the fear of robots is because of movies and films. What people fear is a robot in a human form... But part of our society already uses robotics, like in manufacturing, where robots are used for sorting and assembling things, in Amazon and other companies," she said.
But she acknowledged that there are issues that need to be addressed. One of them is the question of ethics as technology use gains pace.
"How do we ensure that the data represents an entire domain, taking into account factors such as geography, gender, culture and ethnicity? Another aspect is that systems should be able to quantify how they're making decisions, such as an AI system that provides diagnoses in a hospital," she said.
In January, Singapore released a framework on how AI can be ethically and responsibly used, which businesses here and elsewhere can adopt as they grapple with issues that emerge with new technology.
The framework, released at the annual World Economic Forum meeting, was the first in Asia to provide detailed and readily implementable guidance to private-sector organisations using AI.
To illustrate why robots are not to be feared, Dr Howard described how she and her team at Georgia Tech have designed emotional robots for children with autism.
The robots use body gestures based on cognitive science. Open movements and lifted heads show happiness, while closed body gestures express sadness.
She said: "My group designs robots that work with kids for therapy exercises. For children who don't have access to therapy every day, they can learn movement, speech or how to take turns. We find that when the robot emotes based on how the child reacts, the child sees the robot as a friend."