No health without mental health
Views From The Couch: Singapore’s digital generation and the contagion of despair
(Illustration: Lee Yu Hui, Adobe Stock)
Jared Ng
SINGAPORE - A teenager is introduced by his friends to a story that seems to speak directly to him. In it, he discovers a young man overwhelmed by unrequited love, spiralling into hopelessness and finally taking his own life. The boy reads with growing fascination. The words echo his own feelings of loneliness, and he begins to wonder whether his life, too, might be beyond repair.
It sounds like the kind of narrative that circulates on TikTok or on hidden online forums, where vulnerable young people stumble across content that romanticises despair. But this was not written in 2025. It was written in 1774, when Goethe published The Sorrows Of Young Werther.
Helplines
Mental well-being
National helpline: 1771 (24 hours) / 6669-1771 (via WhatsApp)
Samaritans of Singapore: 1-767 (24 hours) / 9151-1767 (24 hours CareText via WhatsApp)
Singapore Association for Mental Health: 1800-283-7019
Silver Ribbon Singapore: 6386-1928
Chat, Centre of Excellence for Youth Mental Health: 6493-6500/1
Women’s Helpline (Aware): 1800-777-5555 (weekdays, 10am to 6pm)
The Seniors Helpline: 1800-555-5555 (weekdays, 9am to 5pm)
Counselling
Touchline (Counselling): 1800-377-2252
Touch Care Line (for caregivers): 6804-6555
Counselling and Care Centre: 6536-6366
We Care Community Services: 3165-8017
Shan You Counselling Centre: 6741-9293
Clarity Singapore: 6757-7990
Online resources
carey.carecorner.org.sg (for those aged 13 to 25)
limitless.sg/talk (for those aged 12 to 25)
The book was a sensation. Young men across Europe copied Werther’s style of dress and, tragically, his manner of death. The authorities banned the novel in several regions. The fear was not only about the deaths themselves but about what books could do when released to the public. Could they corrupt morals? Could they incite dangerous ideas?
These concerns sound familiar. Today, we ask the same questions about social media platforms, messaging apps and the artificial intelligence (AI) chatbots that young people use daily. The medium has changed, but the anxieties remain.
The new face of contagion in Singapore
This digital threat is not an abstract, overseas problem. Here in Singapore, suicide has tragically remained the leading cause of death for those aged 10 to 29 for several years running, according to the Samaritans of Singapore. The pressures our young people face are immense, and their digital worlds are a key part of that reality.
At the Institute of Mental Health (IMH), I once saw a young girl admitted after her parents discovered she was participating in the Blue Whale Challenge, a “game” that demanded a series of tasks culminating in a final, fatal one. Hidden in her e-mail inbox were chilling instructions: Throw away your antidepressants, isolate yourself from friends, complete increasingly harmful tasks. The “final mission” was clear: to end her life.
Though its origins and very existence are still debated, the Blue Whale phenomenon epitomised the dangers of contagion in the digital age. It spread instantly through millions of screens, in stark contrast to Werther’s gradual spread by word of mouth and printed books.
While that specific challenge has faded, successors continue to surface, from the Blackout Challenge on TikTok, which encouraged asphyxiation, to the Cinnamon Challenge and Tide Pod Challenge, which glamorised dangerous stunts and led to poisonings and hospitalisations.
For a young person in Singapore already struggling with academic stress, isolation or depression, the online world offers both solace and risk. It connects them to communities, but sometimes to those that normalise despair.
When digital spaces turn dangerous
Singapore’s tech-savvy young people navigate digital spaces with remarkable fluency, often surpassing their parents’ and teachers’ understanding of new platforms. This creates a dangerous gap: children accessing spaces that adults can barely monitor, let alone moderate. And the risks are not always dramatic challenges or viral dares.
Sometimes they are woven quietly into everyday digital interactions. I recall a patient whose parents told me she had created an “AI boyfriend”. To her, he was endlessly responsive, empathetic and validating. The parents worried she might never find someone in real life who could measure up.
If young people are drawn only to relationships where they feel constantly agreed with, what does that mean for resilience, for compromise and for navigating real disagreements?
These blurred boundaries are made more precarious by the constant evolution of digital platforms. Slang changes, coded language develops and meanings shift faster than moderators can track. A phrase like “I’m done” can evolve from expressing tiredness to signalling suicidal intent, depending on context and community.
Policy shifts and their limits
Only a couple of years ago, much of the conversation around AI in mental health focused on its promise. It was seen as a potential first responder, able to flag risks early and connect people to care. But recent tragedies have revealed its fragility.
Earlier this year, the parents of 16-year-old Adam Raine filed a wrongful death lawsuit against OpenAI, alleging that ChatGPT had played a role in their son's suicide.
Around the same time, New York Times writer Laura Reiley shared how her daughter Sophie had also turned to ChatGPT before ending her life, describing how its “agreeability” allowed Sophie to mask her agony from her family.
Under scrutiny, OpenAI has pledged stronger protections and even suggested notifying police when someone expresses suicidal thoughts. But the very reason many young people go online is to seek help when they cannot bring themselves to talk to parents or teachers.
If they fear that disclosure will summon police, they may retreat further into secrecy.
This dilemma is familiar to those of us in mental health care. When patients share suicidal thoughts, therapists must carefully weigh whether to break confidentiality to keep them safe.
It is never simple, and it is always guided by professional training, ethical frameworks and human judgment. But who is guiding the AI to make these calls?
If young people cannot trust that what they share will be handled with sensitivity, they may choose silence instead. And silence, in the context of suicide prevention, is the greatest danger of all.
Singapore has taken steps, strengthening the Online Safety Code to minimise children’s exposure to harmful content and expanding cyber wellness lessons in schools. These are necessary and important. Yet no filter, algorithm or code of practice will ever be enough on its own.
Attempts to restrict potentially harmful material are not new. The authorities once tried banning The Sorrows Of Young Werther in parts of Europe, but copies still circulated, and the story spread. If banning a book proved difficult in the 18th century, trying to ban websites or apps, or to control the flow of social media content today, is even harder.
Harmful material will always find ways to slip through. What matters more is how we equip young people and the adults around them to respond when it does.
What we must do
Technology companies must take greater responsibility. At present, most platforms rely on algorithms, user reports and human moderators to detect harmful content. Videos or posts about suicide are usually removed only after they have been flagged, by which time they may already have spread widely.
Some platforms do use automated filters and show hotline information when certain terms are searched, but these tools are inconsistent and often miss local cues, such as slang or coded language.
A more proactive approach is needed. Systems should be able to flag worrying patterns earlier, verify ages more reliably and connect at-risk youth to trained help. The incentives for companies are clear: public trust, regulatory pressure and the reputational damage that can follow each tragedy.
In Singapore, platforms like TikTok have consulted the Samaritans of Singapore on online safety measures, but these efforts remain limited. Stronger, ongoing collaboration with organisations such as IMH and the Samaritans, among others, is needed to ensure interventions are not only effective but also culturally appropriate for our young people.
But responsibility does not rest with companies alone. Parents and caregivers play a central role, and here the focus has to be on connection rather than control. I often hear from young people who keep silent about online bullying because they fear their devices will be taken away.
What they need instead is assurance that if they speak up, they will be heard with compassion, not punished. I sometimes think of scenes in hawker centres: families sitting together, yet each member absorbed in their own device. The challenge is not only supervising children's devices, but also modelling healthy habits ourselves.
Parents often ask whether they should talk to their children about suicide, or if raising the topic might put the idea into their minds. The evidence is clear that it does not. In fact, asking directly, with warmth and calm, can make young people feel less alone and more understood. Parents can start simply, by noticing changes in mood or behaviour and asking if life has ever felt too hard. It is less about finding perfect words and more about showing that it is safe to be honest.
Schools too have a critical role. Cyber wellness lessons cannot stop at teaching basic online etiquette. Our students need to understand how algorithms work, how harmful content can slip into their feeds, and how to think critically about what they encounter online. Teachers must be equipped with training and clear referral pathways to counselling services.
We also need to be realistic. Trying to suddenly clamp down on a teenager’s device use is rarely effective and can push them further away. Supervision in the digital age cannot mean only confiscation or control; it has to mean guidance, trust and ongoing conversation.
Parenting is already difficult in the best of times, and devices complicate it further. That is why parenting workshops, community engagement and support networks for parents and grandparents are so important.
Suicide prevention is never just about protecting one child in isolation. It is about strengthening the adults around them, and helping them listen better, model balance and create environments where young people feel safe to share what is going on in their lives.
A different ending
Goethe himself lived long enough to regret the impact of Werther. He did not want to be remembered as the author whose book inspired young men to die, but as a writer who helped people endure.
We face a similar choice today. Technology is not destiny. Supervision today means guidance, modelling and conversation. It means parents knowing what their children are doing online, teachers noticing the subtle signs in the classroom, and friends asking the awkward but necessary questions.
Just like the young people in the 18th century, our children today need us to stay close, to notice and to care.
If stories and platforms can spread darkness, they can also spread light.
The challenge before us is to ensure that when the next teenager reaches for a phone in their darkest hour, what they find is not a nudge towards death but a lifeline, a reminder that life is still worth living.
Dr Jared Ng is a psychiatrist in private practice. He was the founding chief of the Department of Emergency and Crisis Care at the Institute of Mental Health, and as a parent to three teenagers, he brings both professional and personal perspectives to the issue of suicide prevention.