Online Safety Act

{{Incomplete}}{{MergeRequest|Duplicate|[[Implementation of the UK Online Safety Act]]}}
 
'''The Online Safety Act 2023''' is a United Kingdom law designed to regulate online content and impose new safety duties on digital platforms. Enforced by [[Ofcom]], the Act grants regulators powers to fine or restrict access to services that fail to address harmful or illegal online material.{{Citation needed}} Marketed as a measure to protect children and vulnerable users, the Act has attracted significant criticism regarding its impact on consumer rights, free expression, and digital privacy.{{Citation needed}}




====Freedom====
{{InfoboxCompany
| Name = The Office of Communications
| Type = Statutory corporation
| Founded = 2002 (Office of Communications Act 2002)
| Industry = Broadcasting
| Official Website = http://www.ofcom.org.uk/
| Logo = Ofcom-3424516625.png
}}
The Act requires platforms to proactively monitor and remove "harmful" content. Critics argue this could incentivize excessive content moderation, limiting freedom of expression and legitimate debate online.{{Citation needed}} Civil liberties groups warn that consumers may face reduced access to lawful content due to companies' fear of regulatory penalties.{{Citation needed}}




===Encryption controversy (2023)===
Major technology firms, including [[Meta]] and [[Apple]], publicly opposed provisions requiring proactive content detection in encrypted messaging.{{Citation needed}} They argued the law would compromise encryption or make the UK an unviable market for secure communications, impacting millions of consumers.{{Citation needed}}


==See also==