Online Safety Commission: A watchdog with teeth to protect victims: Opinion
Source: Straits Times
Article Date: 21 Oct 2025
Author: Lim Sun Sun
Some people may play down online harms as being merely in the realm of the virtual, but that view is both shortsighted and ill-informed, says the writer.
Isabelle became the target of intense sexual harassment after her phone number and e-mail address were posted online in sexual service advertisements impersonating her and using her image. She was inundated with explicit messages daily across Facebook, Instagram, WhatsApp, and even regular SMS, with strangers offering money for sex.
Despite her desperate efforts to block contacts, report the posts and change her accounts’ privacy settings, it took two long weeks before the messages began to subside. The ordeal took a toll on her mental health, causing anxiety, sleeplessness, and difficulty concentrating at work.
Isabelle’s experience, recounted in SG Her Empowerment’s (SHE) report, 404 Help Not Found: Lived Experiences Of Online Harms Survivors, underscores the pressing need for a faster response to online abuse.
The cruel reality is that while the harm done to victims like Isabelle is instantaneous, remedies tend to be agonisingly slow, and piecemeal at that.
SHE’s 2023 report, Study On Online Harms, found that more than eight in 10 respondents wanted offensive material removed swiftly and permanently. Yet the Infocomm Media Development Authority’s (IMDA) Online Safety Assessment Report 2024 revealed that most major platforms acted appropriately on only about half the harmful content reported and often took five days or more to do so. For victims of such abuse or harassment, five days can feel like five lifetimes.
Some people may play down online harms as being merely in the realm of the virtual, but that view is both shortsighted and ill-informed. First, it no longer makes sense to speak of the online and offline divide in our digitalised world where children are growing up surrounded by devices and engaged in screen time from a very young age.
Second, the term online harm is a misnomer – its ill effects spill easily over into the offline world. For its victims, the nightmare of reputational harm and inability to seek redress can translate into very real deleterious consequences for personal well-being and daily functioning.
To make matters worse, unlike a physical assault, the damage done via text, image or video is amplified and prolonged by the internet; the victim is left in a constant state of anxiety as to who among the millions online might have seen the content and when it might resurface.
Ensuring the safety of people in the online space should thus not be seen as a secondary undertaking. Instead, its importance must be elevated as daily lived experiences increasingly shift online.
In this light, Singapore’s decision to introduce the Online Safety (Relief and Accountability) Bill on Oct 15 is a crucial and timely move.
Powers and remedies
A key feature of the Bill is the setting up of an Online Safety Commission (OSC) to provide a range of remedies to victims of online harm.
For instance, the OSC, to be headed by a Commissioner appointed by the Minister for Digital Development and Information, could direct a social media platform to ensure that the harmful content cannot be accessed in Singapore, with non-compliance constituting a criminal offence punishable with fines.
Under the Bill, individuals may be fined up to $20,000 and jailed for up to 12 months, while entities may be fined up to $500,000 for failing to comply with the OSC’s orders. A further fine not exceeding $50,000 for each day the offence continues may be imposed on entities after conviction.
Further, the OSC will be empowered to issue access blocking orders to internet service providers and app removal orders to app stores.
It is also noteworthy that the statutory torts introduced by the Bill will give online platforms another reason to sit up and take online harms brought to their notice more seriously, or risk being held liable by victims for damages and more.
The Bill, in particular the OSC, helps to compensate for the sluggish response from the tech sector.
Structural reasons impede more substantive action by tech companies, especially those offering social media services. These platforms live in tension between profit and safety. Content that goes viral, be it joyous or cruel, fuels engagement and, in turn, advertising revenue. By contrast, safety measures are expensive to build and inconvenient to enforce, while also slowing down content delivery.
Expecting corporations to prioritise user well-being over profit margins and shareholder value is a faint hope. Isabelle, for instance, was left disillusioned with platform reporting as she did not get the help she needed at a time of great distress.
Neither is going to court the ideal solution for victims of online harms. SHE’s 2023 study found that over seven in 10 respondents wanted online harms resolved without involving lawyers or judges.
Litigation is perceived as costly, complex, and time-consuming and exacerbates victims’ suffering as it necessitates recounting and “reliving” the ordeal. Most simply want the harms to cease, and not to embark on a protracted legal battle. At the same time, respondents recognised that when platforms or perpetrators refuse to act, victims should have recourse to legally enforceable orders.
Other countries have already recognised this need. Australia’s eSafety Commissioner, a position established in 2015, has the power to order platforms to remove harmful online content, such as cyber-bullying material targeting children, within 24 hours, with penalties for non-compliance, setting a benchmark for swift intervention.
The United Kingdom’s Online Safety Act strengthens obligations on platforms to protect users, particularly children, imposing significant penalties – of up to £18 million (S$31 million) or 10 per cent of the platform’s qualifying worldwide revenue, whichever is greater – for non-compliance.
These examples offer helpful insights into the value of prioritising user protection and the practicability of different forms of intervention.
The beneficial outcomes seen in other countries lend impetus to Singapore’s establishment of the OSC, which appears to be inspired by the success of Australia’s model.
Using eSafety’s track record in Australia as an indicator, we can envision what success could look like for the OSC. Speaking at the Online Harms Symposium at Singapore Management University in September 2023, Australia’s eSafety Commissioner Julie Inman Grant shared that eSafety has cut its response time to victims to three hours in most cases and has become the first port of call for matters relating to online harms. She estimated that around 90 per cent of the harmful content it flags to social media companies is removed.
Singapore’s experience without an agency like the OSC has been less than ideal so far. The Online Safety Assessment Report 2024 issued by IMDA in February tested how well six major platforms enforced their online safety measures. The findings revealed significant gaps, including X acting on just over half the reported harmful content and taking an average of seven to nine days to remove sexual or self-harm content. Instagram in turn acted on only 2 per cent of user-reported harmful posts.
With the OSC’s establishment, we can expect platforms to prioritise cases that the agency brings to their attention, especially those of a more extreme nature such as abuse of intimate images.
Wide range of harms
The OSC intends to address a wide range of online harms directed at both individuals and groups. These include familiar forms of online harassment such as cyber-bullying, sexual harassment, doxxing and cyber-stalking.
It also extends to more egregious harms such as intimate image abuse and image-based child abuse, and even nascent and emerging harms such as harmful deepfakes and unjustifiable acts of cancellation (that is, online material which instigates the public to act in a disproportionate manner against a victim). The OSC will start with the harms that are most serious and prevalent, and expand its remit in future phases.
Though ambitious, the OSC’s wide-ranging mandate reflects a thoughtful, forward-leaning approach that acknowledges victims’ experiences and offers them greater assurances while avoiding having to play catch-up with rapidly evolving technologies and harms.
Where victims have lodged reports of online harms to the OSC and the threshold for action is met, the OSC will be able to issue legally enforceable directions swiftly. Speed is important here as delays could cause irreparable harm once the content has gone viral.
Such relief would be welcomed by victims such as Jack and Alex, whose experiences with online harms were also shared in the 404 Help Not Found report.
Jack was extorted with AI-generated footage of him engaging in sexual acts. The footage was eventually released and even made its way to his parents. The ensuing fallout resulted in Jack contemplating suicide. Alex had his sex video leaked by a former partner, which was then, to his horror, circulated at his workplace.
Experts have long warned that image-based sexual abuse haunts victims throughout the course of their lives, but with the OSC’s new levers to help stem the harm early, we can be quietly optimistic that things should change for the better.
Where the circumstances require, the OSC may also issue other directions, such as banning a perpetrator’s account for a specified period, shutting down the online group or location involved in the online harm, or requiring that a reply from the victim be posted.
To illustrate, there have been many cases where online vigilantes have got their facts very wrong, resulting in innocent people getting hurt.
One may recall the infamous incident of a Bentley driver threatening to run down a security officer outside a primary school a few years ago. Allegations incorrectly identifying a local businessman as the driver spread like wildfire, and he was inundated with calls and messages, and his family and businesses were also affected.
At the time, he lamented that the removal of the false posts alone would do little to help him, as those who had already read them would still think he was the driver in question. Instead, he had hoped more people would see his clarification.
If the OSC had been around then, it could have issued directions to take down the harassing comments, as well as right-of-reply directions to ensure that the businessman’s side of the story was heard. The latter would require parties such as the communicator or online platform to communicate his account to correct misunderstandings.
Above all, the real promise of the OSC lies in shifting power towards and restoring agency to end users. Today, victims remain at the mercy of platforms or protracted court hearings.
An independent body would tip the scale towards internet users, making safety a foundation rather than an afterthought in the digital ecosystem. Singapore has long prided itself on being a safe and hospitable home for residents of all ages, genders and creeds. The Online Safety Commission will help extend that salubriousness to the virtual world too.
Lim Sun Sun is Lee Kong Chian professor of communication and technology at Singapore Management University. Her latest book is Humanising Technology: Reflections On Design, Ethics And Inclusion.