Online Safety Act


⚠️ Article status notice: This article has been marked as incomplete

This article needs additional work for its sourcing and verifiability to meet the wiki's Content Guidelines and be in line with our Mission Statement for comprehensive coverage of consumer protection issues.

This notice will be removed once sufficient documentation has been added to establish the systemic nature of these issues. Once you believe the article is ready to have this notice removed, please visit the Moderator's noticeboard, or post to the #appeals channel on the Discord.

🔄 A merge request has been made for this article

There has been a merge request for this page for the following reason:

Duplicate

Merge with: Implementation of the UK Online Safety Act


Once the merge has been completed, please contact a mod, who can delete the extra pages. Alternatively, edit the unneeded pages such that they contain only a redirect to the merged page.

If you believe the merge has been completed but the page is still not deleted, please contact a moderator either on the Discord or via their talk pages.


The Online Safety Act 2023 is a United Kingdom law designed to regulate online content and impose new safety duties on digital platforms. Enforced by Ofcom, the Act grants the regulator powers to fine, or restrict access to, services that fail to address harmful or illegal online material.[citation needed] Marketed as a measure to protect children and vulnerable users, the Act has attracted significant criticism regarding its impact on consumer rights, free expression, and digital privacy.[citation needed]

Consumer-impact summary

Freedom

The Act requires platforms to proactively monitor and remove "harmful" content. Critics argue this could incentivise excessive content moderation, limiting freedom of expression and legitimate debate online.[citation needed] Civil liberties groups warn that consumers may face reduced access to lawful content due to companies' fear of regulatory penalties.[citation needed]

Privacy

The law empowers Ofcom to compel messaging services to implement scanning tools, including within encrypted channels.[citation needed] Privacy advocates argue this effectively undermines end-to-end encryption, forcing consumers to compromise their security for compliance.[citation needed] Services like Signal and WhatsApp have threatened to withdraw from the UK rather than weaken user privacy protections.[citation needed]

Business model

Compliance requirements are resource-intensive, potentially disadvantaging smaller firms and startups.[citation needed] Critics suggest this entrenches the dominance of large technology companies, limiting consumer choice and concentrating control in fewer providers.[citation needed]

Market control

By vesting Ofcom with extensive enforcement powers, the Act centralises control over digital communication. While the Act is intended to safeguard consumers, opponents argue it risks government overreach and reduces users' autonomy to determine how their data and communications are managed.[citation needed]

Incidents

Encryption controversy (2023)

Major technology firms, including Meta and Apple, publicly opposed provisions requiring proactive content detection in encrypted messaging.[citation needed] They argued the law would compromise encryption or make the UK a nonviable market for secure communications, impacting millions of consumers.[citation needed]

See also

References