Healthcare cluster rolls out ‘AI-free’ periods for doctors to prevent deskilling
SINGAPORE – Healthcare cluster National University Health System (NUHS) has been rolling out artificial intelligence-free periods for doctors in the past year as part of efforts to prevent deskilling.
Deskilling is the loss of previously held skills due to over-reliance on technology.
Another cluster, NHG Health, is exploring similar measures.
During these “AI-free” periods, the healthcare professionals do not use AI tools for clinical work or assessments.
Other possible ways to curb deskilling include tracking clinicians’ performance and educating medical students and clinicians about the risks of using AI.
The threat was highlighted by a study that found that in just a few months, experienced doctors who used AI assistance to detect pre-cancerous growths in the colon were less adept at doing so without the tool.
Published in The Lancet Gastroenterology & Hepatology journal in August 2025, the study involved doctors at four endoscopy centres in Poland who used an AI tool to detect polyps, which can be either benign or cancerous.
In the three months before the technology’s roll-out, doctors detected growths in roughly 28 per cent of colonoscopies, but the rate later dropped to around 22 per cent.
As Singapore’s population rapidly ages and demand for healthcare services rises, AI is being deployed in the public healthcare sector to enable faster and more accurate diagnoses, to power predictive analytics for early intervention, and to automate routine tasks.
The Straits Times approached the three healthcare clusters and medical schools here to ask how they manage the risk of AI deskilling.
Adjunct Professor Ngiam Kee Yuan, head of NUHS’ Artificial Intelligence Office, said that while the study highlights the need for careful and responsible use of AI, it reflects only one specific trial setting with some limitations, and does not mean that AI inevitably weakens doctors’ skills.
He pointed out that one limitation of the study is that it is not fully clear how the AI tool was used, for example, whether false positive results that could skew detection rates were included.
Despite the study’s limitations, Prof Ngiam said there should be “AI-free” periods for AI use cases that depend on skills and judgment, to keep doctors’ abilities current and their performance sharp. The cluster has rolled these out over the past year.
“The exact proportion of AI-free time should be determined by the clinicians who use AI in their domains without inadvertently deskilling expert operators, if at all,” said Prof Ngiam.
In June 2023, the cluster deployed the Champ chatbot, which helps chronic disease patients track health readings such as blood pressure and heart rate. In early 2024, it rolled out an AI-enabled system to identify polyps during certain endoscopic procedures, and in April 2025, it deployed SerenityBot, an AI assistant that suggests recommendations for breast cancer treatment.
Meanwhile, at NHG Health, Dr Jonty Heaversedge, the cluster’s chief clinical informatics officer of population health, said NHG Health has deliberately chosen to trial the most mature, lowest-risk AI solutions for clinical care, to allow the cluster to better understand the opportunity that AI offers.
Examples of AI solutions on trial at NHG Health include tools for medical imaging, clinical note-taking, and predicting patients’ fall risk and length of hospital stay, as well as a range of back-office functions.
Dr Jonty said NHG Health is proactively exploring and identifying safeguards in its AI use, and one approach is to put in place usage limits.
“If usage limits are considered, deliberate boundaries would be placed on how frequently or extensively AI tools are used in clinical workflows. This ensures that clinicians maintain core competencies and avoid over-reliance on automation,” he added.
Another approach is to have “AI-on” and “AI-off” modes, he added.
“For example, a doctor may use AI to help summarise routine patient consultations (AI-on) but choose to manually write notes for complex cases that require more nuanced judgment (AI-off),” said Dr Jonty. Another safeguard is to track clinicians’ performance, both with and without AI.
Professor Kenneth Kwek, SingHealth’s deputy group chief executive officer for digital health, said adoption of AI must be thoughtful and measured, always keeping in focus the possibility of deskilling.
He said AI is primarily used at SingHealth to streamline and reduce routine administrative tasks, and as clinical decision support.
“Every proposed clinical AI solution undergoes a rigorous evaluation process covering all aspects of deployment through our domain experts such as our medical board before implementation is considered,” said Prof Kwek, adding that when deployed appropriately, AI can elevate professional practice and enhance technical skills.
National healthtech agency Synapxe, which develops tech solutions for the public healthcare sector, said it deploys AI tools in close collaboration with healthcare clusters and clinicians.
“All AI tools undergo rigorous evaluation, and are deployed such that a human continues to be in the loop and clinicians retain full decision-making authority, making the final judgment on the most appropriate care pathway and intervention,” said Synapxe.
Similarly, Singapore’s three medical schools – Yong Loo Lin School of Medicine (NUS Medicine), Lee Kong Chian School of Medicine (LKCMedicine) and Duke-NUS Medical School – have incorporated AI training into their curricula, exposing students to the ethics of deploying AI and teaching them about its benefits and risks.
LKCMedicine dean Joseph Sung said that to avoid deskilling of medical practitioners, medical students and junior doctors should be trained not to overly rely on AI, and be equipped with fundamental knowledge and basic skills before they use technology.
“For example, during training in colonoscopy, gastroenterology residents and fellows should start with basic endoscopy techniques, instead of using AI entirely to replace their hands and eyes,” said Professor Sung.
LKCMedicine also conducted a curriculum review in 2023, and included new areas of focus such as digital health. Since 2024, digital health courses in areas such as medical data science, data analytics, precision medicine and AI have been integrated throughout the five-year curriculum.
Prof Sung said: “We also bring in ethicists, lawyers and patient advocates to discuss issues related to AI. This emphasis will help students develop a firm foundation in the ethical and legal consequences of AI and healthcare informatics, and be aware of both its limits and benefits, making them confident and discerning users of technology.”
Associate Professor Shiva Sarraf-Yazdi, vice-dean of education at Duke-NUS Medical School, said deskilling is not the only risk AI presents to medical practitioners.
There are also concerns such as “never-skilling”, where a practitioner fails to acquire critical capabilities, and “mis-skilling”, where AI-generated errors or biases are reinforced. In the context of medical education, it thus remains essential to get the foundations right, said Prof Sarraf-Yazdi.
The graduate medical school’s curriculum has a substantive component that fosters research skills and critical thinking.
The school has also convened multidisciplinary workgroups of students, staff and faculty from the SingHealth Duke-NUS Academic Medical Centre to develop ethically and pedagogically sound approaches to AI’s risks and benefits, and to apply AI in education.
“Facing an uncertain future, we are staying adaptive and committed to reskilling and upskilling alongside our learners, using AI to augment human capabilities and clinical judgment,” said Prof Sarraf-Yazdi.
Prof Ngiam, who is also an adjunct professor at NUS Medicine’s department of surgery, said the medical school is also preparing the next generation of doctors to use AI tools responsibly.
For example, its first-year medical students must now complete a minor in biomedical informatics, which was introduced in the 2023-2024 academic year.
In this minor, students learn about topics such as AI and machine learning in the healthcare context, and are taught to use data derived from medical records systems effectively.
Correction note: In an earlier version of this story, we referred to NHG Health as the National Healthcare Group. The group became known as NHG Health on July 1. The article has been updated to reflect this.