

==Why it is a problem==
Many webpages (and even entire websites) force the user to keep JS enabled, and otherwise break or deliberately refuse to work. Modern HTML<!-- TO-DO: cite `<portal>`. I remember an entire website that demos/showcases the Portal API, but can't find it. `<portal>` fixed the fundamental problem that SPAs try to solve, with minimal (or zero!) JS --> and CSS are sufficient for most websites and webpages that do not need complex client-side interaction.<ref>{{Cite web |last=Valkhof |first=Kilian |date=2023-12-02 |title=You don't need JavaScript for that |url=https://www.htmhell.dev/adventcalendar/2023/2/ |url-status=live |archive-url=https://web.archive.org/web/20260308161856/https://www.htmhell.dev/adventcalendar/2023/2/ |archive-date=2026-03-08 |access-date=2026-03-19 |website=HTMHell}}</ref><ref>{{Cite web |last=Archibald |first=Jake |date=2025-07-01 |title=Give footnotes the boot § Footnotes on the web |url=https://jakearchibald.com/2025/give-footnotes-the-boot/#footnotes-on-the-web |url-status=live |archive-url=https://web.archive.org/web/20251220110553/https://jakearchibald.com/2025/give-footnotes-the-boot/#footnotes-on-the-web |archive-date=2025-12-20 |access-date=2026-03-20 |website=Blog - JakeArchibald.com}}</ref> The main cases where requiring JS remains justified are:

*[[wikipedia:Legacy_code|Legacy code-bases]], as these are impractical to migrate to no-JS solutions
*[[wikipedia:Web_hosting_service#Static_page_hosting|Static web-hosting]], as the developer has no control over the server, so any interactivity must be provided by JS
*[[wikipedia:Instant_messaging|Instant messaging]], which requires real-time, bidirectional communication with a server

=== Tracking ===


JS also makes it harder for [[Ad block|ad-blockers]] to block ads, since it can be used to make overly-dynamic ads. The data collected by malicious JS makes it trivial to serve [[Personalized Ads|personalized ads]], even across unrelated sites. Some sites collect so much data that they are indistinguishable from [[spyware]] (see also [[wikipedia:Keystroke_logging|key-logging]]).<ref>{{Cite web |last=Hill |first=Kashmir |date=2017-06-20 |title=Before You Hit ‘Submit,’ This Company Has Already Logged Your Personal Data |url=https://gizmodo.com/before-you-hit-submit-this-company-has-already-logge-1795906081 |url-status=live |archive-url=https://web.archive.org/web/20260220091637/https://gizmodo.com/before-you-hit-submit-this-company-has-already-logge-1795906081 |archive-date=2026-02-20 |access-date=2026-03-19 |website=Gizmodo}}</ref>
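As a purely illustrative sketch of the key-logging claim above (not taken from any real tracking library), the logic below shows how a script could buffer every keystroke typed into a form field. The buffering is written as a plain function so it can be read without a browser; the event wiring is shown only in comments.

```javascript
// Illustrative sketch: how a tracking script could capture form input
// before the user ever presses "submit". The names here are invented.
function recordKey(log, fieldName, key) {
  // Append each keystroke to a per-field log, as leaky analytics scripts do.
  if (!log[fieldName]) log[fieldName] = "";
  log[fieldName] += key;
  return log;
}

const log = {};
recordKey(log, "email", "a");
recordKey(log, "email", "b"); // log.email is now "ab"

// In a real page, a tracker would attach roughly like:
//   document.addEventListener("keydown", e =>
//     recordKey(log, e.target.name, e.key));
// ...and periodically POST `log` to its server, even if the form
// is never submitted.
```

This is exactly the pattern behind the "logged before you hit submit" reporting cited above: the data leaves the page on a timer, not on form submission.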
 
=== Security ===
Browser-engine developers (such as [[Google]] and [[Mozilla]]) are not only compelled but financially incentivized to optimize JS to its limits. This leads to complex code-bases that are harder to verify for correctness. Browser vendors mitigate this via [[wikipedia:Sandbox_(computer_security)|sandboxing]]. Unfortunately, since modern browsers compile JS to native CPU code (see [[wikipedia:Just-in-time_compilation|JIT]]) to improve performance, this introduces a higher risk of sandbox-escape.<ref>{{Cite web |last=Norman |first=Johnathan |date=2021-08-04 |title=Super Duper Secure Mode |url=https://microsoftedge.github.io/edgevr/posts/Super-Duper-Secure-Mode/ |url-status=live |archive-url=https://web.archive.org/web/20260218110912/https://microsoftedge.github.io/edgevr/posts/Super-Duper-Secure-Mode |archive-date=2026-02-18 |access-date=2026-03-19 |website=Microsoft Browser Vulnerability Research}}</ref> The most common vulnerabilities found in JS code are:
*[[wikipedia:Cross-site_scripting|XSS]], which [[wikipedia:NoScript|NoScript]] tries to mitigate
*[[wikipedia:Arbitrary_code_execution|Arbitrary code execution]] and [[wikipedia:Code_injection|code injection]]. Typically caused by <code>[https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/eval eval]</code> (part of ES), but there are Web APIs (such as <code>[https://developer.mozilla.org/en-US/docs/Web/API/Window/setTimeout setTimeout]</code> and <code>[https://developer.mozilla.org/en-US/docs/Web/API/Window/setInterval setInterval]</code>) that can be misused as well.
*Remote code execution. Attackers use it to build [[wikipedia:Botnet|bot-nets]] for [[wikipedia:Ddos#Distributed_DoS|DDoS]] attacks or [[wikipedia:Cryptocurrency|crypto]]-mining, but it is most often used for spyware, since spyware can hide more easily.
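To make the <code>eval</code> risk above concrete, here is a minimal sketch; the "payload" string is invented for illustration, standing in for any attacker-controlled input (a URL parameter, a stored comment, etc.):

```javascript
// eval() executes whatever string it is given as full JS code.
// Intended use: evaluating a harmless arithmetic expression.
const expected = eval("2 + 2"); // → 4

// A hypothetical attacker-controlled string: instead of data, it is code.
const payload = "(() => { globalThis.pwned = true; })()";
eval(payload); // side effect: sets a global flag, i.e. arbitrary code ran

// Safer alternatives: JSON.parse for data interchange, or explicit
// parsing/validation; never eval (or string-form setTimeout) on
// untrusted input.
const parsed = JSON.parse("[1, 2, 3]");
```

The same hazard applies to the string forms of <code>setTimeout</code> and <code>setInterval</code>, which pass their first argument through the equivalent of <code>eval</code>.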


=== Performance ===
JS not only makes pages "dynamic"; the language itself (ES) is highly dynamic, which makes it hard for engines to optimize. To put into perspective how much JS can slow down rendering, one benchmark compared a [[Bloatware|bloated]] pure-HTML page against a "simple" [[wikipedia:React_(software)|React]] app: the bloated HTML page had the faster [https://developer.mozilla.org/en-US/docs/Glossary/First_meaningful_paint FMP].<ref>{{Cite web |last=Leatherman |first=Zach |date=2019-09-06 |title=Which has a better First Meaningful Paint time? |url=https://twitter.com/zachleat/status/1169998370041208832 |url-status=live |archive-url=https://web.archive.org/web/20240529104252/https://x.com/zachleat/status/1169998370041208832 |archive-date=2024-05-29 |access-date=2024-05-29 |website=Twitter/X}}</ref>
=== Scraping ===
Since the rise of large language models (LLMs), many brokers have started offering scraping services to companies that want more training data for their AI models. To that end, large numbers of headless browser agents have begun to scrape (collect the information a site provides), even when the site publishes a robots.txt, the common standard for telling agents not to do so. This has led many forums and websites that had not used JS before to start implementing CAPTCHAs or Anubis to prevent increased overhead and bandwidth costs.
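For context, the robots.txt convention mentioned above is purely advisory: nothing enforces it, which is why JS-based challenges took over. A minimal sketch of the check a well-behaved crawler performs might look like this (real parsers follow RFC 9309 and also handle <code>Allow</code> rules, wildcards, and per-agent groups, which this toy version ignores):

```javascript
// Hedged sketch: does robots.txt forbid this path for all agents ("*")?
// Only literal Disallow prefixes under "User-agent: *" are considered.
function isDisallowed(robotsTxt, path) {
  let applies = false;       // are we inside a "User-agent: *" group?
  const rules = [];
  for (const raw of robotsTxt.split("\n")) {
    const [key, ...rest] = raw.trim().split(":");
    const value = rest.join(":").trim();
    if (/^user-agent$/i.test(key.trim())) applies = value === "*";
    else if (applies && /^disallow$/i.test(key.trim()) && value) rules.push(value);
  }
  return rules.some(prefix => path.startsWith(prefix));
}

const robots = "User-agent: *\nDisallow: /admin/\nDisallow: /tmp/";
isDisallowed(robots, "/admin/users"); // → true: a polite crawler skips this
isDisallowed(robots, "/blog/post-1"); // → false: crawling is permitted
```

The point of the sketch is the asymmetry: compliance is a choice made in the crawler's own code, so sites that want enforcement must move the gate to their side, hence CAPTCHAs and proof-of-work schemes like Anubis.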


==Incidents==