Singapore is moving to strengthen its regulation of online content with the introduction of a new Online Safety Commission that will have the authority to order social media platforms to block harmful material. The proposal is part of a new online safety bill that was tabled in parliament on Wednesday and is expected to be debated at an upcoming session.

The move follows findings from the Infocomm Media Development Authority (IMDA), which reported in February that more than half of legitimate user complaints about harmful online posts — including those related to child abuse, cyber-bullying, and harassment — were not acted upon promptly by social media companies. The new commission will be established by the first half of 2026 and will have powers to act on user reports involving issues such as online harassment, doxxing, stalking, child pornography, and the abuse of intimate images.
Under the proposed law, the commission will be able to order social media platforms to restrict access to harmful content within Singapore. It will also be empowered to give victims a right to reply and to ban individuals who post harmful content from using certain platforms. The commission will have the authority to instruct internet service providers to block access to specific online locations, such as social media pages or even entire platforms, when deemed necessary.
In addition to these powers, the legislation allows additional forms of online harm to be brought within the commission’s remit in later stages, including the non-consensual sharing of private information and “the incitement of enmity,” expanding its scope of action as it becomes operational.
The establishment of an online safety commission was first discussed during the Ministry of Digital Development and Information’s budget debate in March this year.
Announcing the bill, Minister for Digital Development and Information Josephine Teo said, “More often than not, platforms fail to take action to remove genuinely harmful content reported to them by victims.” She added that the new framework would ensure stronger and faster responses to harmful online behaviour.
The move builds on Singapore’s broader efforts to regulate digital platforms and ensure online accountability. Earlier this year, the government acted under the Online Criminal Harms Act, which came into force in February 2024, to issue its first compliance order against Meta.

In September, the Ministry of Home Affairs warned Meta that it faced a fine of up to SGD 1 million and additional penalties of up to SGD 100,000 per day after the end of the month if it failed to implement stronger safety measures, including facial recognition, to prevent impersonation scams on Facebook.

Authorities have not yet confirmed whether Meta complied with the order, but officials have indicated that Singapore intends to continue tightening enforcement of its online safety rules. Once established, the Online Safety Commission will serve as a central authority for addressing user complaints and enforcing the removal of harmful content from digital platforms operating in the country.