Forum: Introducing AI at Primary 4 carries a real developmental risk
As a digital wellness educator, I read Education Minister Desmond Lee’s comments on introducing artificial intelligence at Primary 4 with deep concern (New committee to guide AI use in Singapore’s higher-education sector, April 1).
The minister said this will be done “under close supervision” and with “low exposure”.
But the conversation cannot be framed only around academic learning or classroom efficiency. It must include what neuroscience tells us about how children grow, learn and mature.
Ten-year-olds are still building the foundations for attention, impulse control, discernment, and the ability to persist through difficulty. These capacities shape how a child learns, relates and responds to the world.
Introducing AI systems – which are designed to respond instantly and reduce cognitive effort – into that window carries real developmental risk.
We have been here before. When social media spread rapidly through our children’s lives, the guardrails came years too late. We are still living with the ripple effects on youth mental health, sleep, attention and family life. Several countries are now moving to restrict social media access for those under 16.
International guidance is more cautious than many realise. In 2023, UNESCO called for regulation of generative AI in schools, data protection and privacy standards, teacher training, and an age limit of 13 for classroom AI tools.
The parents I work with are not anti-technology. They are trying to take responsibility for their children’s healthy development.
But what data will be collected on their child, how long will it be kept, and will third-party vendors have access? How will teachers be equipped to provide genuine supervision? Will there be an opt-out option for families who are not yet ready?
Before AI becomes the new normal in schools, parents need answers to questions like these.
Carol Loi Pui Wan
Founder
Village Consultancy