
ONSA: SHIFTING RESPONSIBILITY TO DIGITAL PLATFORMS TO COMBAT CSAM

02/01/2026 09:28 AM

By Mohd Fharkhan Abdul Ghapar

KUALA LUMPUR, Jan 2 (Bernama) -- The growing issue of Child Sexual Abuse Material (CSAM) in the digital realm is not just a hidden crime; it leaves lasting psychological effects that can disrupt victims' development of identity and emotional well-being well into adulthood.

The Online Safety Act 2025 (ONSA), which came into effect yesterday, marks a significant shift in regulatory responsibility. Whereas this burden previously fell largely on victims or their parents, the new law now holds digital platform providers accountable for preventing exploitation.

Universiti Putra Malaysia (UPM) Department of Human Development and Family Studies senior lecturer Dr Nellie Ismail emphasised that ONSA is a crucial tool for curbing the spread of exploitative content.

The Act ensures that platforms are held responsible by requiring them to implement proactive prevention measures and respond swiftly to harmful content.

“While parents remain the first line of defence, structured support like ONSA is essential. Preventing exploitation can no longer rest solely with users; it must be shared by platforms that control content flow,” Nellie told Bernama.

She also noted that children, due to their cognitive and emotional immaturity, are particularly vulnerable to the dangers of the digital world.

“Placing the burden solely on children or parents is unrealistic and risky,” she said.

Children whose emotional needs are unmet are especially susceptible to grooming, as they are more likely to become isolated, go unsupervised, and encounter unsafe content without proper safety literacy.

This highlights the urgent need for secure platform designs and early detection systems, which are no longer optional but critical requirements for safeguarding children online.

She further explained that truly safe platforms must incorporate early prevention strategies, ethical algorithm monitoring, swift response protocols, and clear support pathways.

These measures are vital to ensure ONSA not only addresses technical issues but also reduces the re-exposure of children to traumatic content while facilitating easy access to professional support.

According to Nellie, repeated exposure to content that highlights children's bodies, even without sexual undertones, can normalise early sexualisation and disrupt their self-image and boundaries.

This further reinforces the need for ONSA to ensure that social media algorithms not only chase user engagement but also protect the psychological development of children.

ONSA establishes clear obligations for licensed service providers under the Communications and Multimedia Act 1998. This includes internet messaging services and social media platforms, which are now required to manage and remove content related to CSAM, pornography, incest, and self-harm.

Data from the Malaysian Communications and Multimedia Commission (MCMC) revealed that from 2022 to July 2025, over 82,000 pieces of obscene or abusive content were removed, with 957 items specifically involving children taken down between Jan 1 and Nov 30 last year.

MCMC deputy managing director (Development) Eneng Faridah Iskandar stressed the crucial role platform providers play in safeguarding users, especially children.

“Platforms can no longer rely solely on user reports; they must take a proactive stance by identifying and removing harmful content, including sexual exploitation and online predators,” she said.

She also highlighted the need for platforms to develop comprehensive online safety plans, which should include effective reporting channels, clear usage guidelines, and educational initiatives to help users recognise and protect themselves from unsafe content.

The implementation of ONSA has been met with a wave of support from parents, who are hopeful that this new law will break the ongoing cycle of CSAM dissemination.

Nor Zulkarnain Md Nor Isa, a father of two teenagers, expressed concerns about the challenges parents face in monitoring digital content.

“While we can control screen time, we cannot fully control what algorithms hide behind features like ‘FYP’, ‘For You’ or ‘Trending’. Harmful content can reach our children before we even have the chance to block it,” he said.

Similarly, Nurul Adlina Azureen Suhaimi, a kindergarten teacher, echoed these concerns. “I hope platforms will analyse content before it is uploaded, allowing early prevention to take place. This responsibility should lie with the platforms, not after a child has already fallen victim to harmful content,” she said.

-- BERNAMA