Online Safety Act
Latest revision as of 16:29, 30 August 2025
The Online Safety Act 2023 is a United Kingdom law designed to regulate online content and impose new safety duties on digital platforms. Enforced by Ofcom, the Act grants regulators powers to fine or restrict access to services that fail to address harmful or illegal online material.[citation needed] Marketed as a measure to protect children and vulnerable users, the Act has attracted significant criticism regarding its impact on consumer rights, free expression, and digital privacy.[citation needed]
Consumer-impact summary

Freedom
The Act requires platforms to proactively monitor and remove "harmful" content. Critics argue this could incentivize excessive content moderation, limiting freedom of expression and legitimate debate online.[citation needed] Civil liberties groups warn that consumers may face reduced access to lawful content due to companies’ fear of regulatory penalties.[citation needed]
Privacy

The law empowers Ofcom to compel messaging services to implement scanning tools, including within encrypted channels.[citation needed] Privacy advocates argue this effectively undermines end-to-end encryption, forcing consumers to compromise their security for compliance.[citation needed] Services like Signal and WhatsApp have threatened to withdraw from the UK rather than weaken user privacy protections.[citation needed]
Business model
Compliance requirements are resource-intensive, potentially disadvantaging smaller firms and startups.[citation needed] Critics suggest this entrenches the dominance of large technology companies, limiting consumer choice and concentrating control in fewer providers.[citation needed]
Market control
By vesting Ofcom with extensive enforcement powers, the Act centralises control over digital communication. While intended to safeguard consumers, opponents argue it risks government overreach and reduced autonomy for users to determine how their data and communications are managed.[citation needed]
Incidents

Encryption controversy (2023)

Major technology firms, including Meta and Apple, publicly opposed provisions requiring proactive content detection in encrypted messaging.[citation needed] They argued the law would compromise encryption or make the UK a nonviable market for secure communications, impacting millions of consumers.[citation needed]
See also
- Ofcom
- Data Protection Act 2018
- Freedom of expression in the United Kingdom
- General Data Protection Regulation