When Platforms Become Judge, Jury and Executioner

02/03/2026 08:22 AM
By Ts Dr Manivannan Rethinam

Digital platforms have reshaped how people work, communicate and earn a living. What began as spaces for sharing creativity and ideas has evolved into powerful economic ecosystems.

Today, digital content creators are not casual users. They are workers, entrepreneurs, educators, entertainers and small business owners whose livelihoods depend on platforms such as TikTok, YouTube, Facebook and Instagram.

Creators invest years building audiences, refining content, understanding algorithms and nurturing communities.

Many rely on live streaming, monetisation tools, brand partnerships and indirect commercial opportunities to sustain themselves.

For a growing number of young Malaysians, digital creation is not a hobby but a primary source of income and professional identity.

It is important to recognise that digital platforms carry enormous responsibility in maintaining safe communities and addressing harmful content at scale.

The concerns raised here are not directed at the necessity of moderation itself but at the absence of transparent safeguards when enforcement actions carry severe livelihood consequences.

What is unfolding is not merely a series of isolated enforcement incidents but a structural governance gap within the digital economy, where private platform rules increasingly shape public economic realities without corresponding public accountability.

A Malaysian creator's case highlights systemic risk

A real Malaysian case illustrates how fragile a creator’s livelihood can be.

A TikTok creator had built a community of more than 100,000 followers and accumulated millions of likes through years of consistent and responsible content creation.

On 5 May 2023, the creator permanently lost access to TikTok LIVE after a spontaneous incident was classified as sexual activity.

During a live session, a WhatsApp notification appeared unexpectedly and was clicked accidentally.

The image that appeared was a cartoon-style sticker: an exaggerated, humorous drawing.

It did not show any real person, any real body part or any sexual act.

Stickers of this nature circulate widely on messaging platforms, and recipients cannot see their contents before opening them.

There was no intent, no deliberate sharing and no search for inappropriate material.

Despite this, the system classified the brief accidental exposure as a severe violation and imposed a permanent ban. An appeal was submitted immediately.

The appeal process was automated and provided no opportunity to explain context, intent or circumstances. The appeal was rejected on the same day.

After nearly three years, a manual appeal supported by evidence and explanation was submitted to TikTok Malaysia’s country management.

The response was brief and final. The permanent live ban would remain and no further appeals would be entertained.

No explanation addressed the factual context. No guideline provision was clarified. No proportionality was considered.

For the creator, the outcome was not a temporary suspension but effectively a professional life sentence.

This case is presented to illustrate broader systemic concerns regarding automated enforcement and appeals processes rather than to attribute intent or wrongdoing to any specific individual or organisation.

Platform moderation operates within complex operational constraints, and this example highlights structural challenges that merit policy attention.

A widespread challenge across platforms

This experience is not limited to a single platform or a single individual.

Similar stories are reported across YouTube, Facebook and Instagram, where creators lose accounts, monetisation or visibility based on automated classifications and vague references to community guidelines.

Platforms rely heavily on automation to manage vast volumes of content. This is understandable, but automation is not judgement.

Algorithms cannot reliably interpret humour, accident, cultural nuance or human intent.

When appeals are also automated, enforcement errors and disproportionate actions become entrenched.

Cases like this illustrate how algorithmic enforcement without contextual human review can transform momentary incidents into irreversible economic punishment, raising serious questions about fairness, proportionality and procedural justice.

Creators frequently report receiving enforcement notices with little explanation and no meaningful opportunity to respond. Decisions are framed as final.

Evidence is not disclosed. Human review is limited or absent. For individuals whose economic survival depends on platform access, the consequences are deeply human.

Creators are workers and their digital identity is their livelihood

For many creators, a platform account is not merely a profile. It is an economic lifeline.

When access is removed, creators lose direct income from live gifts and monetisation tools, brand partnerships and sponsorships, years of community building, professional credibility and future business opportunities.

In traditional industries, removing someone’s ability to earn would require due process, explanation and an independent avenue of appeal. In the digital economy, these safeguards are largely absent.

This legal and governance gap allows platforms to exercise extraordinary power, simultaneously acting as rule maker, investigator, judge and enforcer.

In practice, platform decisions are often shaped by commercial priorities, with enforcement designed to minimise operational risk and protect advertising relationships, sometimes at the expense of fairness and proportionality.

Creators who act responsibly and in good faith can become collateral damage in an ecosystem optimised for scale rather than fairness.

Leading the way: global models for fair and accountable platform governance

Across advanced digital economies, a policy shift is emerging that recognises creators not merely as platform users but as participants in a new form of digital labour market.

This shift acknowledges that algorithmic governance must be accompanied by transparent oversight mechanisms when decisions carry livelihood consequences.

Europe’s Digital Services Act requires platforms to provide clearer explanations for moderation decisions and accessible internal appeals.

Independent dispute resolution bodies now offer neutral human review when internal appeals fail.

These mechanisms do not weaken content moderation. Instead, they enhance legitimacy and trust by ensuring decisions affecting livelihoods are transparent, proportionate and reviewable.

Globally, policymakers are exploring algorithmic accountability, platform transparency obligations and fair contractual practices between creators and platforms.

The emerging consensus is clear: digital economies require governance frameworks that recognise creators as economic actors deserving procedural safeguards.

Malaysia urgently needs an independent digital platform safeguarding mechanism

Malaysia urgently needs to establish an independent digital platform review and safeguarding body.

Creators currently have no independent avenue to challenge wrongful or disproportionate enforcement.

Once a platform decides a ban is final, the creator’s journey ends: there is no neutral body, no forum to consider context, and no safeguard against mistakes.

Such a mechanism would not replace platform moderation but would act as an essential safeguard when internal systems fail. At a minimum, it should provide:

  • A right for creators to present context and evidence
  • Independent human review of severe enforcement actions
  • Proportionality checks before permanent bans are upheld
  • Transparency on which rules were applied and why
  • Consideration of economic harm caused by enforcement decisions

Empowering creators through networks and innovation

Beyond regulation, Malaysia can empower creators through structural innovation.

Around the world, creators are forming structured professional networks, industry associations and creator guilds.

These organisations function similarly to professional bodies in traditional industries. They provide legal guidance, training, peer support and a collective voice when engaging with platforms and policymakers.

Such networks enable creators to share resources, negotiate collectively with platforms, and respond effectively when enforcement actions threaten livelihoods.

Malaysia could also promote initiatives that encourage income diversification, cross-platform audience ownership and digital risk education so creators are less vulnerable to sudden algorithmic shifts.

Platforms themselves could establish independent ombudsman offices staffed by neutral experts to review severe cases involving livelihood impact or safeguarding concerns.

At the heart of these innovations lies empathy. Every account represents a human being navigating uncertainty in a rapidly changing economy.

Responsible digital governance requires partnership between governments, platforms, creators and civil society.

The objective is not to undermine the important role platforms play in protecting users but to ensure that systems affecting livelihoods operate with transparency, proportionality and accessible avenues for independent review.

Shaping a fair and accountable digital future

This is not an argument against content moderation. Safe communities remain essential. Harmful material must be addressed decisively.

However, moderation without transparency, proportionality and accountability risks becoming arbitrary authority.

Digital creators are not disposable users. They are innovators, educators, storytellers and economic contributors shaping Malaysia’s cultural and technological future.

They represent a new generation of entrepreneurs building careers without traditional institutional support.

Malaysia now stands at a defining moment. It can allow opaque automated enforcement to shape the future of digital work, or it can lead the world in building a fair, humane and accountable digital ecosystem that protects both innovation and dignity.

By establishing strong safeguards for creators while upholding community safety, Malaysia has the opportunity to become a global benchmark for responsible platform governance and creator protection in the algorithmic age. The time to act is now.

Robust safeguards for creators, transparency in enforcement and independent avenues for appeal would signal to the world that Malaysia values innovation, fairness and human dignity in the digital age.

Such leadership would not only empower creators domestically but also position Malaysia as a model for responsible, equitable digital governance, demonstrating how a country can protect both the human and economic value of the digital creative economy.

The next phase of digital governance will not be defined solely by how effectively harmful content is removed, but by how fairly power is exercised, how transparently decisions are made, and how human dignity is preserved within automated systems.

The future digital economy will not be built by platforms alone. It will be built by millions of creators who dare to innovate, connect communities, and create value every single day.

-- BERNAMA

Ts Dr Manivannan Rethinam is a distinguished Professional Technologist and technocrat, holding a Doctorate in Business Administration with a specialisation in marketing and technology management. He serves as Chairman of Majlis Gagasan Malaysia and is a leading policy, geopolitical and public affairs analyst with nearly three decades of experience in strategic leadership, technology innovation and national policy transformation.

At the global level, he is a member of the Safe from Harm Global Panel of the World Scouting movement, where he contributes to safeguarding and protection initiatives affecting over 60 million youths and adults across 224 countries and territories. His work is driven by a steadfast commitment to social justice, inter-communal harmony and inclusive nation-building, grounded in empathy, evidence and transformative innovation.

(The views expressed in this article are those of the author(s) and do not reflect the official policy or position of BERNAMA)