ONSA: Towards Safer Digital Ecosystem, Comprehensive Protection for Users

Every day, millions of children and teenagers in the country use their smartphones to watch videos, chat, share stories or simply tap the ‘like’ button on social media platforms.

Behind the laughter and entertainment offered in the virtual world lies content that they may not fully understand, including those with elements of violence, harassment, image manipulation and messaging that, little by little, can undermine and erode their self-confidence. This has fuelled growing concern among parents, who are no longer merely worried about how much time their children spend on gadgets, but also what they are watching, reading and absorbing online.

The Online Safety Act 2025 (ONSA), which came into force in Malaysia on Jan 1, emerges against this backdrop of concern. The Act aims to create a safer online environment, particularly for children and families, by setting clear responsibilities for online platform providers, who must not only remove content after it goes viral but also detect, prevent and act quickly against harmful content.

In an era where artificial intelligence (AI) is used to generate fake images, spread slander or manipulate emotions, the enforcement of this Act offers hope that the digital space can become safer and more humane.

 

ROLE OF PLATFORM PROVIDERS

According to Malaysian Cyber Consumer Association (MCCA) president Siraj Jalil, ONSA introduces a more systematic and prevention-oriented approach by clearly defining the role of digital platform providers in managing the risks posed by harmful content.


He said this positions the Act as a significant step forward in strengthening the protection of user rights and safety, grounded in the country’s digital legal sovereignty.

“In protecting vulnerable groups, particularly children and teenagers, ONSA provides a stronger legal framework to curb exposure to sexual exploitation content, cyber harassment, image manipulation and extremist material that can have long-term effects on their psychological and social well-being.

“Compared with existing laws that focus more on post-offence action, ONSA sets out more proactive guidelines and is expected to address cyber user issues more effectively.

“Among the key issues it can tackle are the rapid spread of harmful content, failure to remove sensitive material within a reasonable timeframe, the rise in repeated cyber harassment cases and online child exploitation, which were previously difficult to curb due to legal and technical constraints,” he said.

Siraj added that ONSA also opens the door to addressing new challenges related to the misuse of AI, particularly in the creation of fake or manipulative content that can harm individuals and society.

“This approach reflects a shift from merely punishing offenders to building a safer digital system as a whole.

“However, MCCA views the real effectiveness of ONSA as dependent on consistent enforcement, the capacity of regulatory institutions, and the actual level of compliance by digital platforms, especially global platforms operating outside local jurisdictions,” he said.

Commenting on the readiness of digital platforms, he said compliance levels nevertheless remain uneven between licensed and unlicensed platforms.

Citing a recent case involving the misuse of AI on platform X, where images of women and children were manipulated to produce indecent, offensive and harmful content, Siraj said it clearly illustrated a digital governance gap that must be addressed urgently.

“This issue is not merely an ethical violation but a serious threat to digital safety, individual dignity and social well-being, requiring firm and consistent action by authorities to ensure platform providers comply with established laws,” he said.

 

EXCESSIVE EXPOSURE

Meanwhile, CPC International managing director and child psychologist Dr Noor Aishah Rosli said excessive exposure to harmful online content risks disrupting children’s emotional and mental development, leaving lasting negative effects on their psychological health.


She said children lack mature logical thinking and the ability to interpret information critically, and often do not fully understand the consequences of what they watch online.

“The main risks faced by children exposed to violent and aggressive videos include anxiety, excessive fear or worry. This can lead to nightmares; fear of being alone, even to the extent of needing someone to accompany them to the kitchen; and heightened panic. Watching violent or physically abusive content can also be deeply distressing for children and can cause psychological trauma,” she said.

She added that children also face identity issues when they are unable to distinguish between fabricated content, such as acting or staged scenes, and reality. For example, seeing well-dressed, heavily made-up children living lavish lifestyles on social media can lead to low self-esteem because they feel they cannot measure up.

Dr Noor Aishah also said uncontrolled exposure to online content can normalise the use of inappropriate language and high-risk behaviours, such as swearing, hitting and bullying others.

She believes the implementation of ONSA could help reduce negative psychological impacts on minors by serving as an intervention or preventive measure to control exposure to content that is unsuitable for children’s mental health.

“ONSA can help regulate what we call algorithms, so that inappropriate content is not easily accessible to underage users.

“The Act can also create a much safer and better digital environment under regulatory oversight, while sending a clear message to parents that children’s safety in cyberspace is important, including in preventing mental health issues arising from uncontrolled use of technology,” she said.

 

AGE LIMITS CRUCIAL

Dr Noor Aishah stressed that content control through age-based mechanisms is essential to filter out unsuitable material.


“With just one click today, all kinds of content can appear… some beneficial, others harmful. Not everything is appropriate for five-, six- or 12-year-olds. Filtering is crucial, especially for minors (below 18), as their information-processing abilities differ from adults,” she said.

According to her, early signs of negative digital influence in children often begin with emotional and behavioural changes, such as becoming withdrawn, irritable, having difficulty sleeping, losing interest in studies or fearing social interaction.

“In building resilience from a young age, the family is the primary line of defence. Parents can nurture emotional stability through affection, gentle communication and positive role modelling, even through simple gestures like saying ‘thank you’, expressing appreciation and spending more time with their children.

“Balanced digital education and screen time limits, and participation in outdoor and recreational activities together also help foster children’s mental well-being and a family that is healthy emotionally, mentally and physically,” she said.

 

CONTINUOUS MONITORING

Sharing a similar view, Universiti Utara Malaysia School of Multimedia Technology and Communication dean and Advanced Communication Research Unit (ACRU) researcher Associate Prof Dr Mohd Khairie Ahmad said the success of ONSA depends on platform operators’ compliance levels and the authorities’ ability to enforce the law consistently and transparently.


“The law is strong in terms of its framework, but the real challenge lies in its implementation within a digital environment that transcends national borders,” he said.

He explained that most major platforms operate globally, requiring cooperation among governments, online regulatory agencies and technology companies to ensure compliance with local legal requirements.

According to Dr Mohd Khairie, ONSA’s risk-prevention approach aligns with international practices such as the Online Safety Act 2021 (Australia), Online Safety Act 2023 (United Kingdom) and Digital Services Act (European Union), but demands technical expertise and a deep understanding of algorithms, content moderation systems and user behaviour patterns.

“Enforcement must be supported by continuous monitoring, specialised digital training and ongoing dialogue with industry players.

“In addition, media literacy and public communication are crucial to ensure society understands the role and limitations of ONSA, and to avoid misconceptions that the Act is intended to restrict digital freedom or freedom of expression,” he said.

 

© 2026 BERNAMA. All Rights Reserved.