UK Online Censorship Laws Set to Impose Substantial Fines on Platforms for Non-Compliance

Britain’s latest online censorship law, the Illegal Harms Code and guidance under the Online Safety Act, came into force on December 16, 2024. Under the guidelines published by Ofcom, the UK’s media and telecommunications watchdog, platforms like X (formerly Twitter), Meta, and Google can be fined up to £18 million or 10% of global revenue, whichever is greater, for failing to remove illegal content. In detailing a range of 130 illegal acts that these platforms must forbid and prevent, the guidelines focus on risks like hate speech, fraud, and online grooming, and they legally obligate online providers to protect users from these harms. Providers must complete a risk assessment for illegal harms by March 16, 2025, and be ready to comply with the safety measures the next day, March 17, 2025.
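For a sense of how that penalty ceiling scales with company size, here is a minimal sketch of the “whichever is greater” calculation described above; the function name and revenue figure are hypothetical illustrations, not anything published by Ofcom.

```python
def max_fine_gbp(global_revenue_gbp: float) -> float:
    """Illustrative only: the ceiling is the greater of a fixed
    18 million GBP or 10% of global revenue."""
    return max(18_000_000, 0.10 * global_revenue_gbp)

# Hypothetical platform with 5 billion GBP in global revenue:
# 10% of revenue (500 million GBP) exceeds the 18 million GBP floor.
print(f"£{max_fine_gbp(5_000_000_000):,.0f}")  # £500,000,000
```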

Key changes introduced by the new guidance focus on risk management, illegal content management, child protection, fraud detection, protecting women and children, and the enforcement of sanctions. Online platforms must carry out risk assessments to identify and mitigate harms, and, for accountability, a senior person must oversee these efforts. Platforms must improve content moderation, reporting tools, and algorithms to curb illegal content. Additionally, they must offer features to protect children, including making profiles invisible, restricting direct messaging, and using automated tools such as hash-matching to detect Child Sexual Abuse Material (CSAM); a simplified sketch of hash-matching follows the quote below. They must also maintain dedicated reporting channels to identify and reduce fraudulent activity, along with measures to block harassment, remove intimate image abuse, and address coercive criminal content. As noted by RT News, Ofcom wrote:

“Some offenses are ‘complex’. They may be more about a series of interactions between users, or may involve behavior that takes place partly offline, or may involve thinking about the nature, identity or age of one or more of the users concerned.”  
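The hash-matching requirement mentioned above amounts to comparing a fingerprint of each uploaded file against a database of fingerprints of known illegal material. The following is a minimal sketch in Python, assuming a purely hypothetical blocklist; it uses an exact cryptographic hash for simplicity, whereas real CSAM-detection systems rely on perceptual hashing (e.g., PhotoDNA) so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical blocklist of known-bad fingerprints; real deployments draw on
# databases maintained by specialist bodies (e.g., IWF or NCMEC hash lists).
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_hash(file_bytes: bytes) -> bool:
    """Return True if the upload's SHA-256 digest appears in the blocklist.

    Simplified on purpose: an exact cryptographic hash only catches
    byte-identical copies, whereas production systems use perceptual
    hashes that survive resizing and re-encoding.
    """
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

# Example: screen an upload before it is published.
if matches_known_hash(b"example upload bytes"):
    print("Block the upload and escalate for reporting")
else:
    print("No match against the known-hash list")
```

The guidance does not prescribe a particular matching technology; the sketch only shows the lookup pattern the requirement implies.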

The Online Safety Act focuses on “user-to-user” (U2U) services, which encompass platforms that enable various forms of interaction and content sharing among users. These include services that support user-generated content (UGC), allowing individuals to upload, share, or post text, images, videos, or audio, as seen on platforms like Instagram, TikTok, YouTube, and Reddit. They also include communication and messaging platforms, such as WhatsApp, Facebook Messenger, Slack, and Discord, which facilitate private or public exchanges, as well as platforms like Elon Musk’s X (formerly Twitter), LinkedIn, and Pinterest, where interactive features such as commenting, liking, and sharing content are core.

Likewise, the focus includes U2U forums and communities like Reddit, Quora, and online gaming communities, which host discussion boards and groups for user engagement, as well as marketplaces like eBay, Etsy, and Facebook Marketplace, which allow users to post listings and engage in transactions for products or services. The guidelines point out that all of these U2U platforms regularly host content that can include illegal or harmful material, such as hate speech and child sexual abuse material. Likewise, Ofcom conveys that because U2U services enable direct interaction, they increase the risk of indisputably harmful activities like cyberbullying, online grooming, and fraud.

To demonstrate compliance with the obligations set out in the new guidance, platforms must have rigid record-keeping and auditing processes in place. Requirements include accurate records of risk assessments, mitigation measures, content moderation logs, user reports and complaints, transparency reports, algorithm testing and adjustments, staff training, and any warnings, enforcement actions, or penalties issued by Ofcom, along with details of the actions taken to address non-compliance. Ofcom notes these measures are critical for transparency and accountability, permitting the agency to assess the effectiveness of safety measures and enforce compliance when necessary.

Ofcom has the power to audit platforms to ensure they follow the Online Safety Act rules. This power can include reviewing any of the records listed above, as well as conducting on-site evaluations. If requested, platforms must provide these records to Ofcom. During audits, Ofcom states it will review whether the platform’s safety measures effectively address risks as outlined in its Codes of Practice. After the review, Ofcom can suggest improvements to help platforms meet their obligations. If non-compliance is found, Ofcom can take enforcement actions, such as issuing fines or demanding corrective steps.

Undoubtedly, many U2U platforms are apprehensive about the new laws because of the many challenges they present. After all, the rules will apply to over 100,000 companies from around the globe, ranging from the largest social media platforms down to “very small” services such as dating, gambling, and other online platforms. Fulfilling these requirements will likely be expensive and complicated, especially for smaller platforms that lack the resources for advanced tools or large content moderation teams.

As evidenced by the suppression of truthful and life-saving information during the COVID-19 pandemic, it will be interesting to see whether platforms can strike a balance between Ofcom’s strict content moderation (there is no doubt some online activity, like cyberbullying, online grooming, and fraud, is horrendous) and maintaining freedom of speech and expression. Likewise, the Act’s expansive definitions and evolving standards make it difficult for platforms to know precisely what is expected of them. The combination of financial risk, operational challenges, and uncertainty about enforcement creates substantial pressure, leaving some platforms worried about whether they can continue to operate effectively under these ever-growing regulations, which are far from over. Ofcom notes it will soon introduce “phase three” of its dominance plan, which will “establish additional requirements for categorized services, focused on bringing an enhanced level of safety, transparency, and accountability to some of the largest service providers operating in the online world.”

Will the burgeoning legally enforceable rules drive some platforms out of business, and is that the goal?

Tracy Beanz & Michelle Edwards

Tracy Beanz is an investigative journalist with a focus on corruption. She is known for her unbiased, in-depth coverage of the COVID-19 pandemic. She hosts the Dark to Light podcast, found on all major video and podcasting platforms. She is a bi-weekly guest on the Joe Pags Radio Show, has been on Steve Bannon’s WarRoom and is a frequent guest on Emerald Robinson’s show. Tracy is Editor-in-chief at UncoverDC.com.