Malaysia’s new Online Safety Act 2024 has been gazetted and will take effect soon. The law is designed to “enhance and promote online safety” by regulating online platforms and protecting users – especially children – from digital harm.
In practice, this means certain content types (child sexual abuse imagery, online scams, obscene or harassing speech, violence, drug promotion, and so on) are defined as “harmful content” that must be controlled or removed.
In particular, child-abuse and financial fraud content are classified as “priority harmful content” requiring immediate action. Under the Act, any platform that hosts or transmits user content has a legal “duty of care” to make its services safer.
Definition of Harmful Content under the Online Safety Act 2024
The Act defines harmful content in its First Schedule, including:
- Content on child sexual abuse material as provided for under section 4 of the Sexual Offences against Children Act 2017
- Content on financial fraud
- Obscene content including content that may give rise to a feeling of disgust due to lewd portrayal which may offend a person’s sense of decency and modesty
- Indecent content including content which is profane in nature, improper and against generally accepted behavior or culture
- Content that may cause harassment, distress, fear or alarm by way of threatening, abusive or insulting words or communication or act
- Content that may incite violence or terrorism
- Content that may induce a child to cause harm to himself
- Content that may promote feelings of ill-will or hostility amongst the public at large or may disturb public tranquility
- Content that promotes the use or sale of dangerous drugs
Priority harmful content, requiring immediate action, includes child sexual abuse material and financial fraud content.
Scope and Applicability
The Act applies to:
- Applications Service Providers (ASPs): Entities providing services like social media or internet messaging that facilitate user communication over the internet.
- Content Applications Service Providers (CASPs): Providers of content services, such as streaming platforms or websites with user-generated content.
- Network Service Providers (NSPs): Entities providing network infrastructure, with minimal obligations under the Act.
Key Duties for Online Platforms
Licensed internet platforms (applications or content service providers under the Communications & Multimedia Act 1998) must implement comprehensive safety measures.
These include:
- issuing clear user guidelines and terms of use;
- providing tools/settings for users to control privacy and safety;
- enabling easy reporting of harmful content;
- offering user-assistance and support channels (especially for vulnerable users);
- embedding safe-design features (age filters, content blockers, etc.);
- and establishing processes to quickly block or remove priority harmful content.
The law also requires each service to prepare and publish an Online Safety Plan detailing these measures.
As one report notes, social media platforms in particular will have three core obligations: ensuring platform safety, protecting children under 13, and limiting access to harmful material.
In sum, platform providers must proactively mitigate risk – for example by using filters or moderation systems – and be prepared to act quickly on user reports of abuse or illegal content.
- Platforms must mitigate exposure to harmful content (e.g. content filtering, AI moderation).
- Platforms must publish user guidelines/terms (so users know rules and rights).
- Platforms must provide user controls (blocking, privacy settings, age restrictions).
- Platforms must offer reporting tools (so users can flag abuse; a minimal sketch of such a flow follows this list).
- Platforms must provide user assistance (readily accessible safety information/help channels).
- Platforms must protect children online (e.g. special safeguards for minors).
- Platforms must restrict priority harmful content (automatic removal of child-abuse or financial fraud material).
- Platforms must publish an Online Safety Plan demonstrating compliance.
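The Act describes these duties at the policy level and does not prescribe any particular technical design. As a rough illustration of the reporting flow referenced in the list above, here is a minimal Python sketch of a report-intake function that escalates the two priority categories (child sexual abuse material and financial fraud) for immediate blocking. Every name in it (`Category`, `Report`, `handle_report`, `block_content`) is a hypothetical assumption for this example, not terminology from the Act, MCMC, or any real platform.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum, auto


class Category(Enum):
    """Harmful-content categories loosely mirroring the First Schedule."""
    CHILD_SEXUAL_ABUSE = auto()   # priority harmful content
    FINANCIAL_FRAUD = auto()      # priority harmful content
    OBSCENE = auto()
    HARASSMENT = auto()
    INCITEMENT = auto()
    SELF_HARM = auto()
    DRUG_PROMOTION = auto()


# The two categories the Act singles out for immediate action.
PRIORITY = {Category.CHILD_SEXUAL_ABUSE, Category.FINANCIAL_FRAUD}


@dataclass
class Report:
    content_id: str
    category: Category
    reporter_id: str
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


def block_content(content_id: str) -> None:
    # Stand-in for the platform's actual takedown mechanism.
    print(f"[takedown] content {content_id} blocked pending review")


def notify_compliance_team(report: Report) -> None:
    # Stand-in for human escalation and any regulator notification.
    print(f"[escalation] {report.category.name} reported on {report.content_id}")


def handle_report(report: Report, review_queue: list[Report]) -> str:
    """Block priority harmful content at once; queue the rest for review."""
    if report.category in PRIORITY:
        block_content(report.content_id)
        notify_compliance_team(report)
        return "blocked"
    review_queue.append(report)  # ordinary harmful content: human review first
    return "queued"


queue: list[Report] = []
print(handle_report(Report("post-123", Category.FINANCIAL_FRAUD, "user-9"), queue))  # blocked
print(handle_report(Report("post-456", Category.HARASSMENT, "user-9"), queue))       # queued
```

The one design point worth copying is the split: priority harmful content is blocked automatically and reviewed afterwards, while everything else is queued for human review first, mirroring the Act’s distinction between priority and ordinary harmful content.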
Legal Repercussions of Failing to Meet Obligations
Service providers face fines of up to RM10 million for failing to meet their duties. If a platform ignores a regulator’s order to remove banned content, it faces a fine of up to RM1 million plus RM100,000 for each day of continued non-compliance.
In short, the law gives authorities new powers (through the Malaysian Communications and Multimedia Commission) to demand takedowns and punish breaches, and it imposes very steep penalties for non‑compliance.
Impact on SMEs Across Industries
The Act’s impact on SMEs varies depending on whether they are licensed ASPs or CASPs or simply users of online platforms.
Impact on F&B Businesses (Cafés, Restaurants, etc.)
Even small cafés and restaurants must take note. Although a typical eatery isn’t itself an online platform, it uses digital channels (websites, social media pages, online ordering apps) to reach customers.
All content on those channels must comply with the new rules. For example, any photos, videos or posts should avoid prohibited themes (no obscene imagery, hate speech, scams, etc.).
Businesses should moderate user interactions – such as customer comments or reviews – removing any harassment or illegal content. Staff should be briefed not to post content that could be flagged (e.g. avoid defaming competitors or sharing violent memes).
If a café runs its own app or community forum, it should implement basic safety features (privacy settings, comment filters, reporting buttons) in line with the law’s “duty of care.”
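For a sense of scale, a basic comment filter does not need to be sophisticated. The Python sketch below holds comments matching a blocklist for human review instead of publishing them; the patterns are invented for this example, and in practice a small business would more likely rely on its platform’s built-in keyword filters or a maintained moderation list.

```python
import re

# Hypothetical blocklist for illustration only; maintain a real list
# (or use the hosting platform's keyword-filter settings) in production.
BLOCKED_PATTERNS = [
    r"\bscam\s+link\b",   # common fraud bait
    r"\bkill\s+you\b",    # threats / harassment
]


def screen_comment(text: str) -> str:
    """Return 'publish' for clean comments, 'hold' for ones needing review."""
    for pattern in BLOCKED_PATTERNS:
        if re.search(pattern, text, flags=re.IGNORECASE):
            return "hold"  # keep it off the public page until a human checks
    return "publish"


print(screen_comment("Great latte, will come again!"))        # publish
print(screen_comment("Click this scam link for free meals"))  # hold
```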
In practice, this means keeping a watchful eye on all online brand content and removing any harmful posts swiftly – effectively applying the platform compliance rules to their own social/media presence.
As Minister Azalina noted, once the law is in force the government can “act swiftly in removing unlawful content”, so businesses must be ready to cooperate.
Impact on Retail (Especially E-commerce)
Retail SMEs – particularly e-commerce shops – will be more directly affected. An online retailer is effectively a content platform (with product listings, user reviews, chats, etc.).
These businesses should scrutinize their product content:
- no sales of illegal or offensive items,
- no ads for adult or illicit products without proper safeguards,
- and no misleading or scam offerings (financial frauds are explicitly banned).
Any user-generated material (user reviews, question-and-answer sections) must be moderated to ensure it does not include harassment, hate speech or other banned content.
Retailers should publish clear terms of service and a privacy/safety policy, making it easy for customers to flag problems.
For customer support or chat functions, training staff to spot and shut down bullying or scam attempts will help comply with the law’s emphasis on user safety.
In short, e-commerce sites must treat their online storefront like a regulated platform – filtering content and responding to reports – to avoid hefty fines; a minimal sketch of such listing checks follows.
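As a hedged illustration of those listing checks, the Python sketch below flags draft listings that touch banned categories before they go live. The keyword lists are made up for the example; real categories and terms should come from legal review of the Act and related laws, not from this snippet.

```python
# Hypothetical banned-category keywords for a small e-commerce shop.
BANNED_KEYWORDS = {
    "drugs": ["ketamine", "syabu"],
    "weapons": ["firearm", "taser"],
    "fraud": ["guaranteed returns", "get rich quick"],
}


def vet_listing(title: str, description: str) -> list[str]:
    """Return the banned categories a draft listing appears to touch."""
    text = f"{title} {description}".lower()
    return [
        category
        for category, words in BANNED_KEYWORDS.items()
        if any(word in text for word in words)
    ]


flags = vet_listing("Investment plan", "Guaranteed returns in 7 days!")
if flags:
    print(f"Listing held for review: {flags}")  # ['fraud']
```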
Impact on Service Industries
Professional service providers (salons, clinics, consultancies, etc.) should likewise assess their online communications.
Any digital interface – booking websites, feedback forms, community forums, or live chat – can expose them to the same risks.
For example, if a salon’s Facebook page allows public comments, toxic or threatening messages posted there could create legal issues. Businesses should moderate and remove any such content promptly.
Marketing materials and images must also steer clear of anything that could be viewed as “obscene” or “incitement” under the Act.
Service SMEs should provide clear customer guidelines (e.g. rules for posting in forums) and a way to report abuses.
Essentially, any service business that engages clients online should implement the same online-safety practices as large platforms: safe design choices and active content moderation to fulfill the law’s duty-of-care expectations.
Impact on the IT Industry
IT and digital services SMEs will face the heaviest compliance burden.
Firms that operate websites, social apps, online communities or messaging services may qualify as “applications service providers” under Malaysia’s laws. These companies should immediately review the new requirements and begin implementing them.
Key steps include obtaining the proper licensing (as recently mandated for large social platforms), drafting an Online Safety Plan, and building in the required features (user tools, reporting functions, etc.).
Tech teams will need to design software with “safe by default” settings – for instance, default privacy protections for under-13 users and filters for graphic content – in line with the Act’s child safety focus.
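As a sketch of what “safe by default” could mean in code, the snippet below applies locked-down defaults to accounts of children under 13. The field names and the exact defaults are assumptions for illustration; the Act sets the policy goal (child protection), not this particular data structure.

```python
from dataclasses import dataclass


@dataclass
class SafetySettings:
    """Per-account defaults; the fields are illustrative, not statutory."""
    private_profile: bool
    messages_from_strangers: bool
    graphic_content_filter: bool


def default_settings(age: int) -> SafetySettings:
    """Start every account safe; lock down further for children under 13."""
    if age < 13:
        return SafetySettings(
            private_profile=True,
            messages_from_strangers=False,
            graphic_content_filter=True,
        )
    # Adults still start with the graphic-content filter on; they can opt out.
    return SafetySettings(
        private_profile=False,
        messages_from_strangers=True,
        graphic_content_filter=True,
    )


print(default_settings(10))  # fully locked down for a child account
```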
IT firms should also follow guidance from regulators: for example, MCMC has published a Code of Conduct (Best Practice) for internet messaging and social media providers, outlining recommended practices to combat harmful content.
In practice, a software SME should audit its products for any gaps (like missing report functions) and train its developers on the new rules. Because IT companies often work with client data, they should ensure their platforms can quickly remove illegal content when requested by authorities.
Penalties for Non-Compliance
The penalties under the Online Safety Act are severe.
As legal experts note, a company that fails to comply with the Act’s obligations can be fined up to RM10 million.
Furthermore, if a platform fails to take down harmful or priority-harmful content after being ordered, it can be prosecuted and fined up to RM1 million plus RM100,000 per day of continued breach. A takedown order ignored for 30 days could therefore attract up to RM4 million in total (RM1 million plus 30 × RM100,000).
These fines apply to service providers – even foreign ones with operations affecting Malaysians (the law is extraterritorial) – so any SME running an online platform must be vigilant.
Although the Act targets platforms rather than individual users, business owners should also remember that existing laws still apply: abusive, harassing or illegal content may attract action under Malaysia’s Penal Code or Communications & Multimedia Act even apart from the new regulations.
In short, the cost of non-compliance is very high, making proactive measures (as outlined above) essential for all SMEs with an online presence.
SME Compliance Checklist by Industry
F&B (Cafés/Restaurants)
– Review all online content (website, menus, posts) to remove profanity, obscenity or hate speech.
– Moderate customer comments/reviews; promptly delete harassment or illegal material.
– Train staff on digital content policies and quick takedown procedures.
– Use safe-default settings (e.g. comment filters) on social media pages.
– Provide clear terms/policies on your site about acceptable user content.
Retail (E-commerce)
– Vet product listings: no illegal or offensive items (drugs, weapons, adult content, etc.).
– Moderate user reviews and Q&A to remove insults or scams.
– Publish clear terms of use and return policies emphasizing safe transactions.
– Implement reporting tools for customers to flag harmful listings or fraud.
– Use age-verification or content filters for sensitive products.
Services (Salons, Clinics, etc.)
– Ensure service descriptions and marketing content are decency-compliant (no obscene images).
– Moderate any client forums or booking chats for harassment or prohibited content.
– Provide a channel (email or form) for users to report abusive communications.
– Display a privacy/safety policy on your website, and enforce respectful behavior rules.
IT & Digital Services
– Obtain any required MCMC licensing (e.g. an ASP class licence for platforms).
– Draft an Online Safety Plan describing your safety measures (publish on your site).
– Build in user safety features: blocking, content filters, reporting buttons, etc.
– Adhere to MCMC guidance (e.g. Code of Conduct for social messaging).
– Train developers on identifying/removing harmful content.
– Keep logs/data ready to share with authorities if required.
Each industry must align its online practices with the Act’s requirements.
SMEs are advised to consult legal experts and stay informed via official channels (MCMC announcements, the Communications Ministry, etc.) to ensure full compliance. By proactively moderating content and providing user safeguards, Malaysian businesses can both protect their customers and avoid the heavy penalties of the new Online Safety Act.
Sources:
Online Safety Bill 2024 – Official Portal of the Parliament of Malaysia
Azalina: Online Safety Act coming into force soon, targets digital harm to children – Malay Mail
An Overview of the Online Safety Bill 2024 – Lexology
Akta Keselamatan Dalam Talian 2024 berkuat kuasa dalam masa terdekat – Azalina (Online Safety Act 2024 to come into force soon – Azalina) – Portal Berita
MCMC publishes code of conduct for social media service providers to address harmful content