At over 30 years old, the World Wide Web has grown into a sprawling creature – one that is not only more intricate than ever but also much harder to control.
Starting life as a mostly static network of information, it has evolved into a fast-moving, highly interactive space where people watch, post, comment, share and shape what others see, all at their fingertips.
Governments worldwide have been trying to keep pace with the rapid online developments, putting in place new safeguards to protect users, especially children and teens, from harmful material and online risks.
The latest example is the United Kingdom, which has introduced sweeping new regulations under its Online Safety Act aimed at protecting youth under 18 from harmful online content, such as pornography, self-harm, suicide and eating disorder content.
Passed in 2023, the Act's age-verification provisions came fully into force in late July this year.
Last November, Australia passed a law to delay children's access to social media platforms, such as Facebook, Instagram, YouTube and X (formerly Twitter), until the age of 16, and the ban will come into effect this December.
Reception to these laws has been mixed. In the UK, for example, supporters view the measures as necessary to protect children and reduce their exposure to harmful content. Critics, however, argue that the laws resemble censorship, fall short in regulating private messaging apps and risk infringing on users' freedoms.
Some also contend that age verification methods are overly invasive while still failing to effectively keep younger users off restricted platforms.
But what is clear is that there is an urgent need to act. For example, on Thursday (Aug 14), Reuters reviewed an internal Meta Platforms policy document on chatbot behaviour that showed the company’s AI creations were allowed to engage children in romantic or sensual conversations, generate false medical information and make racially demeaning statements.
Some of these chatbots operate on Meta's social media platforms, including Facebook, WhatsApp and Instagram. The more than 200-page document outlines what staff and contractors should consider acceptable chatbot behaviours when building and training the company's AI products.
With digital landscapes being in a constant state of flux, debates about online safety, along with the pros and cons of stronger online protection, have been gaining momentum in other countries too, including Singapore.
While Singapore has yet to mandate age limits for social media, it has introduced several measures to protect younger users, including a Code of Practice requiring stricter safety tools on platforms and rules mandating age checks in app stores.
In January, Minister of State for Digital Development and Information Rahayu Mahzam said in Parliament that Singapore shares the same objectives as Australia in legislating age limits for social media access, and is currently studying the effectiveness of mandating age limits.
Beyond regulation, the government is also investing heavily in education. Cyber wellness lessons in schools, public toolkits like the Digital for Life portal and parenting initiatives are part of a broader effort to equip families with the know-how to stay safe online.
As one of the world's most connected nations, Singapore faces a dilemma much like other countries grappling with this issue today: what laws and safeguards to introduce against online harms, and how to strike a balance between protection, practicality and privacy.
CNA TODAY spoke to parents and experts to explore what a truly balanced approach to digital safety looks like in a society where children are growing up with devices from a very young age.
WHY THE URGENCY NOW?
In the early days of the internet, going online meant browsing static pages mainly filled with text and perhaps some animated GIF files.
The internet of today is a different beast altogether – children are growing up in a digital world shaped by algorithms, machine learning and artificial intelligence, an ecosystem that was never built with youngsters' safety in mind.
With digital life becoming inseparable from physical life, the impact on young people has become impossible to ignore, said cybersecurity consultant Emil Tan.
He said that from the early days of "the dot-com boom" to the explosion of social media platforms like Facebook, governments have been trying to figure things out as the digital world evolved.
"It has always been a balancing act. Overregulation risks turning into a police state, whilst underregulation leaves users, especially the young, vulnerable. For years, regulations were intentionally light-touch.
"But we are now facing a hard truth. Without proper guardrails, the internet doesn't just reflect the risks in society, it amplifies them," said Mr Tan. He is also co-founder and director of Infosec In the City: SINCON, which organises one of Singapore's annual international cybersecurity conferences.
What has changed over the years, he added, is that many countries are now at a point where both the urgency and the ability to act have aligned.
Ms Nicole Quinn, vice president of public policy and government affairs in Asia Pacific and Japan at Palo Alto Networks, a multinational cybersecurity company, likened the situation to the early days of the automobile, when there were no rules or safety standards.
But as car usage grew and the risks became clearer, policymakers had to introduce regulations to keep people safe on the roads, she said, adding that social media and online platforms are now going through a similar process.
"The online threat landscape has evolved dramatically. This shift has created an exponential increase in potential points of vulnerability for cyber attacks and data exploitation," Ms Quinn said.
"The COVID-19 pandemic further accelerated this, with extended periods of remote learning and socialisation online, deepening children's reliance on these platforms."
Ms Quinn also noted that the rise of algorithm-driven platforms, built to maximise engagement by serving emotionally charged content or reinforcing biases, has heightened concerns, as they tend to amplify harmful material.
"The cyber threats themselves have also become far more sophisticated, moving beyond simple content filters to combating phishing scams, ransomware and social engineering attacks that target young users for personal data or even access to their parents' financial information," she added.
From a psychological perspective, Dr Jean Liu, associate professor of psychology at the Singapore Institute of Technology, said that while regulation efforts for child online safety are not entirely new, calls for stricter controls have increased recently because of several converging factors.
"First, the COVID-19 pandemic placed a more intense spotlight on youth mental health, heightening public awareness of the issue," she said.
"Second, there is now more research on the benefits and harms of internet use to youths. Finally, there has been high-profile advocacy. For example, social commentator Jonathan Haidt wrote a bestselling book, 'The Anxious Generation', linking youth mental health to digital exposure."

In the past few years, platforms like YouTube, TikTok, Discord and OnlyFans have come under scrutiny for how easily young users can encounter inappropriate or harmful content.
YouTube's recommendation engine, for instance, has faced criticism for steering users toward conspiracy theories and disturbing material, such as those related to COVID-19 and 9/11, though some studies have challenged how widespread or systematic this effect truly is.
TikTok's "For You" feed has also been shown to surface eating disorder and self-harm content within minutes of use.
While most platforms have since rolled out stronger content moderation policies, default restrictions for minors, and parental control tools, experts said these safeguards remain uneven and, in many cases, too easy for children to bypass.
AT HOME, GLUED TO THEIR SCREENS
Parents interviewed by CNA TODAY are fully aware of the dangers their children could be exposed to on the internet, citing risks such as addiction, exposure to inappropriate or harmful content, online predators, and potential negative effects on mental health and academic performance.
They also face the challenge of children outsmarting parental controls and finding creative workarounds to satisfy their craving for device use.
In Ms Kate Lim’s household, while she recognises the educational benefits her children have gained from the internet, her biggest worry is exposure to inappropriate content.
"Even YouTube has videos that aren't filtered by age. Things on racism, war, violence, LGBTQ (lesbian, gay, bisexual, transgender and queer) topics… Some of these are difficult for children to understand at their age," said the mother of three boys aged 10, 15 and 17.

Pornography is another concern. Ms Lim, 46, a speech therapist and counsellor, said it is a tricky balance between saying too much too early and waiting too long. She and her husband have already had talks about pornographic content with their older two children, but the youngest still does not understand what porn is.
"We told them: if you come across anything like that, talk to us or my husband, specifically. We'd rather have an informed conversation than have them watching it in secret."
Ms Lim told CNA TODAY that she also enforces offline time, with measures such as programming the Wi-Fi to go dark between 6pm and 8pm, when her boys sit down for dinner, and again from midnight to 7am.
Describing her children's relationship with the internet and digital devices, Ms Lim said she understands that laptops are needed for school assignments and that her children's social lives run partly through group chats and online platforms.
For single mother Jaslyn Ng, 41, who has a 13-year-old daughter and 11-year-old son, the challenge lies not just in controlling their screen time, but in managing what her children consume.
The financial services director said she noticed how exposure to aggressive games like Mobile Legends influenced her son's language and behaviour, prompting her to delete the app and introduce word puzzles instead, as a way to help him "detox" from screens.
She added that limiting screen time has made a difference, with both children becoming noticeably more mild-tempered after their access was reduced from near-unlimited usage to about an hour a day.
The limits were enforced through Apple’s parental controls feature Screen Time, which allows caregivers to lock app usage on the children’s phones behind a passcode and receive notifications if it is bypassed.
"There was one time my daughter managed to crack the four-digit pin code just by blind guessing. She didn't realise my sister would get notified. We had to set things straight after that."

A civil servant and father of two aged seven and 11, who wanted to be known only as Mr Jay, said that while he allows limited use during travel or waiting times, he sets clear boundaries on gaming: the children get a maximum of 30 minutes after homework on school days, and up to an hour on weekends.
Mr Jay said he has not seen attempts to bypass limits yet, but expects it eventually as his son and daughter grow older.
"The reality is, they'll be exposed to inappropriate things. Kids watching pornography is something that's been around for decades," he said.
"My worry is what exists online now, how dark and warped some content can be, and how that might affect a young mind."
Mr Jay also said that cyberbullying is his biggest concern, as he is aware how comment sections online can be "full of hate and negativity".
"That kind of environment can really affect a child's mental health and development, especially with how anonymous people are when they post."
Other than monitoring their children’s internet usage, some parents told CNA TODAY that they dislike the idea of handphones being allowed in schools, especially since there is currently no blanket ban on these devices.
Ms Shulin Lee, a legal recruiter and a mother of two aged six and nine, said she was shocked to learn from friends and people she met at work that primary school children as young as seven and eight had been caught watching explicit content at school.
Hearing of such incidents has reinforced her belief that smartphones should not be allowed on school grounds.
"It doesn't matter that they say smartphones will be confiscated if used during class. Kids are using them in the toilets and looking at inappropriate material when waiting for the school buses together."
Worried that her own son, currently in Primary 3, might have been exposed to such content, Ms Lee said that she spoke with him after hearing the stories.
"I told him clearly what to do if anyone shows him inappropriate content, that he should walk away and tell a trusted adult, like a teacher or parent. These aren't easy conversations, but they're necessary."
Where there are screen time limits, there will also be teens figuring out how to bypass them.
Junior college student Jarrett Er, 17, Ms Lim's eldest son, once woke up at 5am for a week straight just to play video games before his parents got out of bed. His ploy was discovered when his father happened to get up for a drink around the same time and caught him red-handed.
Jarrett also said that he tried to bypass the late-night outage by using his handphone's hotspot for internet access, which his parents eventually discovered.
"After the 5am ploy, my parents did not allow me to play video games for a few months."
Polytechnic student Sujaish Kumar, 18, said his parents did set screen time limits when he got his smartphone in Secondary 1. But by Secondary 4, he had found ways around the restrictions, such as telling his parents he had extra classes just so he could stay out later and use his phone without restrictions.
He has come across violent content on Instagram, such as school fights, gore and even videos showing people being killed, often without proper warnings.
"It was disturbing at first, especially when I was younger. I used to think the world was peaceful, then I realised these things happen everywhere," he said.
Sujaish added that he never told his parents about such content but sometimes discussed it with friends, who were also the ones showing him such videos.
"I definitely feel more desensitised since I got used to that kind of content," he admitted.
While Sujaish thinks that 16 is a reasonable age to be granted more access to social media, he said it could backfire if parents are too strict.
"I have a friend whose parents were super strict. They didn't let him use his phone, and now you can tell he doesn't really know much about things going on in the world."
As for Jarrett, his online experience has also exposed him to scam ads, deepfakes and radical opinions, including content that promotes racism or strong views against certain groups.
"When I see that kind of content, I try to read widely, judge it based on my own thinking and evaluate if what they're saying is valid before jumping to conclusions," he said.
Asked whether schools or parents should be doing more to protect young people online, Jarrett felt that while cyber wellness is emphasised a lot in school, it feels like the same messages are merely being repeated.
"They should focus more on real scenarios that are relevant to us, like how to critically evaluate influencer content, especially when it spreads harmful ideas," he said.
BALANCING SCREENS AND STUDY
CNA TODAY asked the Ministry of Education about its approach to teaching cyber wellness and digital literacy, as well as the safeguards on school-issued devices against age-inappropriate or harmful content. In response, an MOE spokesperson said the ministry works closely with schools to ensure that the use of digital devices and educational technology is age-appropriate, purposeful and grounded in sound pedagogy.
"Curriculum design and delivery in primary schools remain focused on non-digital learning experiences such as print-based reading, writing by hand and working with concrete manipulatives, interspersed with digital experiences where appropriate.
"As such, primary schools are provided with devices for in-class use supervised by teachers, with light use at lower primary levels and progressively increasing from Primary 3 onwards."
For secondary school students, the spokesperson said schools ensure that the use of personal learning devices (PLD) is balanced in relation to other modes of learning.
"The PLDs provide opportunities for older students to collaborate with peers in and out of class and leverage technology for self-directed learning, such as through resources on SLS," said the spokesperson, referring to Singapore Student Learning Space, the online platform for teaching and learning created by MOE.
The spokesperson added that PLDs are installed with a device management application (DMA), which schools use to set default safety settings – including blocking access to undesirable internet content and selected applications – and to regulate device usage through functions such as sleep mode activation to support students' well-being and rest.
"For parents who prefer a greater say over how their child uses the device, MOE provides them with options to manage their child's PLD after school hours."
Beyond technical safeguards, Cyber Wellness education plays a key role in helping students develop the social-emotional competencies and critical thinking skills needed to navigate the online world safely and responsibly.
Students are taught to:
- Verify the credibility of online information and sources
- Be discerning of negative influences and inappropriate websites
- Understand the risks and consequences of inappropriate online interactions
- Take steps to protect themselves when navigating digital spaces
- Establish a healthy balance between online and offline activities, including managing screen time and social media use
- Seek help from trusted adults, such as parents, teachers or school counsellors, for issues like addictive online behaviour, online harassment, or cyberbullying
- Use safe reporting channels to alert teachers and school leaders about hurtful or harmful online behaviour
- Report cyber-related incidents to the relevant platforms, with guidance from their schools
The spokesperson stressed that a strong partnership between schools and parents is key to helping students learn with technology.
"MOE and schools will continue to work closely with parents by sharing tips, strategies and resources, such as Parenting for Wellness, which help parents guide their child in the use of technology and devices.
"Parents can also provide feedback to the school on their child's inappropriate use of PLDs. Schools will work with identified students who require additional support and put in place measures to help them regulate their use of PLDs."
ARE TOUGHER LAWS WORKING?
As global concern over children's online safety intensifies, governments are responding with more comprehensive age-related regulatory measures.
One of the most closely watched developments is the UK's new regulatory framework aimed at protecting minors from harmful online content.
The UK's Online Safety Act mandates that digital platforms take stronger steps to identify, remove and prevent the spread of illegal and harmful content, with particular focus on protecting children.
Under this law, enforced by the UK's communications regulator the Office of Communications (Ofcom), platforms like YouTube, TikTok and Instagram are required to implement age verification mechanisms, robust content moderation and user reporting systems.
Ofcom has the power to investigate platforms, demand transparency reports and issue fines of up to 18 million pounds (S$31.2 million) or 10 per cent of a platform's global turnover – whichever is higher.
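The penalty ceiling is a simple whichever-is-higher rule. As a minimal sketch (the turnover figures below are hypothetical, for illustration only):

```python
# Sketch of the Online Safety Act's maximum fine: the greater of a
# fixed 18 million pounds or 10 per cent of a platform's global turnover.

FIXED_CAP_GBP = 18_000_000

def max_fine(global_turnover_gbp: float) -> float:
    """Return the statutory maximum fine in pounds: whichever is higher."""
    return max(FIXED_CAP_GBP, 0.10 * global_turnover_gbp)

# A platform turning over 1 billion pounds could face up to 100 million.
print(max_fine(1_000_000_000))  # 100000000.0
# A smaller platform is still exposed to the 18 million floor.
print(max_fine(50_000_000))     # 18000000
```

In other words, the fixed sum acts as a floor for smaller platforms, while the turnover percentage scales the exposure for the largest ones.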
Platforms are required to publish detailed risk assessments and safety policies, and ensure that users, especially children, are offered safer default settings, such as private accounts and filtered content feeds.
Some major platforms, such as TikTok, Reddit, Meta and YouTube, have begun rolling out age verification tools and safer default settings to comply with the law. Porn sites have also seen sharp drops in UK traffic following stricter enforcement, suggesting active compliance with age-check rules.
Among the critics of the law, the right-wing political party Reform UK has branded it "borderline dystopian" and pledged to repeal it if the party comes to power.

In the European Union, the Digital Services Act (DSA), which entered into full effect last year, serves as the bloc’s content moderation law.
The EU law imposes strict requirements on online platforms to mitigate risks to children, including enhanced content moderation, greater transparency and safeguards against harmful or illegal content such as targeted advertising and algorithmic amplification.
On whether such regulations are effective, Mr Yeong Zee Kin, chief executive of the Singapore Academy of Law, noted that it is still early days. "Given that they're still relatively new laws, it's still too early to determine how effective they are."
He explained that many of these laws build on existing frameworks – such as mandatory complaint-handling systems already in place in regions like the EU – but newer rules are raising the bar. For instance, some laws now require platforms to fast-track reports from trusted parties that flag harmful or illegal content.
"An interesting development in new online safety laws is the requirements for transparency, where platforms are required to provide information about their efforts, such as the number of complaints handled, to regulators," said Mr Yeong.
"This type of regular monitoring is effective in changing corporate culture within the platforms. If the level of awareness and vigilance increases and harm is reduced or avoided, that is better than enforcement action taken after harm is done."
Mr Tan, the cybersecurity consultant, acknowledged that criticism around age-based content restrictions in places like the UK and Australia is valid.
He said that focusing solely on restrictions, without accounting for user behaviour and the broader digital environment, risks backfiring, as users may migrate to less regulated or underground platforms where risks are greater and safeguards are absent.
Mr Tan added that no country has found the perfect formula yet, but some are making meaningful moves.
"The UK's Online Safety Act is one of the most comprehensive attempts. It pushes platforms to assess and mitigate risks to children by design, rather than relying solely on age gates.
"It shifts the burden to platforms to build safer environments, but implementation has been uneven, and questions remain about enforceability and privacy trade-offs."
In the same vein, experts said that technology companies should take the initiative to design platforms with safety in mind from the outset.
Ms Quinn of Palo Alto Networks said policymakers will have to encourage technology companies to build platforms with youth in mind, embedding privacy-by-design principles – that is, designing privacy and safety features into the platform from the ground up, rather than adding them later.
This includes default privacy settings and intuitive parental controls that give both young people and their families a sense of agency and control over their online experience.
"Adopting these design principles and embedding safety, privacy and security features into platforms from the very start makes protections more effective and harder to bypass," she added.
BLANKET BANS NOT THE ANSWER
Restricting access is just one piece of the puzzle. Without education and guidance, experts said, restrictions alone risk doing more harm than good.
Mr Gopal Mahey, psychotherapist at Centre for Psychotherapy (C4P), told CNA TODAY that age-based restrictions delay exposure, but without guidance, they can be futile and even backfire.
He noted that adolescents are naturally inclined to test boundaries as part of their developmental stage, and when restrictions are purely external – like rules or filters – the youngsters often respond by becoming more secretive, borrowing older siblings' identification details, or turning to anonymous platforms.
"The real issue is internal restriction – the ability to self-regulate... If we only tell them 'don’t do this', but don't explore the 'why', we miss a chance to cultivate resilience and ethical thinking," said Mr Gopal.
Likewise, Mr Haja Navaz, founder and principal counsellor at Sparkz Counselling Services, said people, especially teens and younger individuals, naturally resist restrictions on their freedom and often seek to reclaim control as quickly as possible.
This is heightened by the fact that the youths' prefrontal cortex, which governs impulse control and judgment, is still developing, he added.
"To make it worse, the peer influence and perceived social validation expectation add pressure on the 'restricted' children. Therefore, it is only natural for the children to find ways to bypass controls."
If Singapore were to consider curbs similar to those in the UK and Australia, Mr Haja noted that drastic blanket bans may not be the answer here, as they could cause more harm than expected.
He recounted a case from a colleague: a 15-year-old schoolgirl raised in an overly restrictive and religious household faced extreme parental control under the guise of protection, culminating in a rebellion and an unintended teenage pregnancy.
"This case highlights how excessive restrictions, without open communication or trust, can backfire, especially when teens don't yet have the cognitive maturity to assess long-term risks."
Mr Haja advocated for a hybrid approach, combining reasonable access limits with education, and stressed that the government, education system, parents, and educators need to work gradually with children to build a constructive and safer digital environment.
For Mr Tan, the cybersecurity consultant, when it comes to keeping children safe online, policy and technology are two sides of the same coin, but they must work together, anchored by education.
"I would start with policy as the core, because everything else – technology and education – needs to be built on clear, enforceable, and adaptable rules."
He added that policies should be layered based on risk: a moderated learning app, for instance, might require lighter safeguards, while high-risk spaces like adult-content platforms or anonymous chat services would demand stringent verification, monitoring and reporting requirements.
Flexibility is key because young users often try to bypass restrictions, said Mr Tan.
"Policymaking should be informed by red-teaming and simulation exercises, where you actively test how the system could be evaded or exploited," he said.
Technology, in such a scenario, becomes the enabler.
In Singapore, Mr Tan suggested applying privacy-preserving age checks with trusted digital identity systems like SingPass, especially for high-risk environments such as age-restricted digital services or financial platforms with legal thresholds.
These could be supported by AI-enabled behavioural analysis to flag suspicious activity in real time – all without creating invasive data trails.
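One way to read this suggestion is as an attribute-based check: the trusted identity provider answers only a yes/no question about the age threshold, and the platform never sees a birthdate. The sketch below illustrates that idea only; the class, names and dates are invented for illustration and do not reflect SingPass's actual APIs.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of a privacy-preserving age check. The identity
# provider holds the birthdate; the platform receives a single boolean
# attestation, never the underlying personal data.

@dataclass
class IdentityProvider:
    """Stands in for a trusted system such as a national digital ID."""
    birthdates: dict  # user_id -> date of birth, known only to the provider

    def attest_over(self, user_id: str, threshold_years: int, today: date) -> bool:
        """Disclose one boolean attribute, not the birthdate itself."""
        born = self.birthdates[user_id]
        age = today.year - born.year - ((today.month, today.day) < (born.month, born.day))
        return age >= threshold_years

provider = IdentityProvider({"alice": date(2010, 6, 1), "bob": date(2002, 1, 15)})

# The platform asks one question and stores one boolean, leaving no
# invasive data trail on its own servers.
print(provider.attest_over("alice", 18, date(2025, 8, 14)))  # False
print(provider.attest_over("bob", 18, date(2025, 8, 14)))    # True
```

The design choice is that minimisation, not collection, does the privacy work: even if the platform is breached, there is no birthdate to leak.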
"Finally, education is the sustainment layer. Technology can block, policy can direct, but long-term resilience comes from raising digital literacy, so young people, parents and educators understand not just how to use safeguards, but why they exist."

Mr Tan likened this approach to successful frameworks in other sectors. In cybersecurity, for example, critical infrastructure employs layered standards, continuous monitoring and red-teaming exercises to anticipate threats.
In public health, risk-scaled interventions, adaptable protocols and public education campaigns have proven effective in disease outbreak response.
"When you align policy for direction, technology for enablement and education for sustainment, and build a culture that embraces them, you create an environment that not only protects effectively, but also adapts, improves and builds trust of those it's meant to protect," Mr Tan said.
Are age-based content policies, similar to those in the UK and Australia, potentially on the horizon for Singapore, then?
Mr Kieran Donovan, co-founder and CEO of Singapore-based startup k-ID, which helps online service providers comply with global age-related regulations, said Singaporean regulators are known for being measured and deliberate. They take the time to gather input from a wide range of stakeholders, including the industries that will be affected, and it is no different in the online safety space.
"Singapore is taking a thoughtful approach, carefully observing international developments while considering local cultural contexts and capabilities," he said.
For example, Mr Donovan said that through the Ministry of Education, the government is able to allocate sufficient resources to teach digital literacy and cyber awareness to children, helping them become more aware of online harms and dangers.
Asked about the most realistic enforcement mechanisms to keep pace with and potentially outsmart users' resourcefulness, Mr Donovan acknowledged that every protective system will face attempts to bypass it.
"But that doesn't mean we throw the whole system out. We work tirelessly to keep protecting more people online every single day," he said.
"Unfortunately, there are those who call for the removal of all online protections, which would be counterproductive to the protection of the rights of the child."
Mr Donovan stressed that there is a place and need for regulation for all the reasons mentioned previously.
"But regulation alone cannot solve this problem, and in Singapore, that includes continuing with stronger education and training of both parents and children that ultimately empowers our children to be smart and safe digital natives."
Building on that, Ms Quinn also stressed the importance of engaging young people directly in shaping the rules that affect them.
"This direct engagement helps ensure that laws are relevant and effective, making them feel less like arbitrary rules and more like a shared effort to build a safer digital world."