JavaScript: Difference between revisions
*'''Degraded accessibility''': Dynamic and/or active content is well known to have poor accessibility for users with visual and/or cognitive impairments. While standards such as [[wikipedia:WAI-ARIA|WAI-ARIA]] were created to mitigate this, they are no silver bullet, especially when developers are unaware of ARIA.
*'''Degraded compatibility''': While HTML and CSS degrade gracefully, meaning a web browser that does not support a certain feature simply ignores it and loads the rest of the page, JavaScript does not. If a script uses a feature the browser does not support and the resulting error is not caught with a <code>try</code>...<code>catch</code> block, the rest of the script does not execute, which usually breaks the entire site when it requires JavaScript for basic functions, as "web apps" usually do. This makes such a website completely inaccessible, rather than merely partially usable, from legacy systems that cannot run recent web browser versions and from minimalist web browsers that challenge the [[Google]]-[[Mozilla]] duopoly.<ref>{{cite web |url=https://jakearchibald.com/2013/progressive-enhancement-still-important/ |title=Progressive enhancement is still important - JakeArchibald.com |date=2013-07-03 |access-date=2026-04-18 }}</ref><ref>{{cite web |url=https://digdeeper.club/articles/browsers.xhtml#minimal |title=How to choose a browser for everyday use? § Why "minimalist" browsers suck. |author=Dig Deeper |access-date=2026-04-22 }}</ref>
*'''Lack of transparency''': To optimize network bandwidth, JS code is typically served in [[wikipedia:Minification_(programming)|minified]] form, which makes it harder for humans to understand. This is particularly problematic when the original source is not publicly [[wikipedia:Source-available_software|available]], which is typically the case with [[wikipedia:Proprietary_software|proprietary software]].<ref>{{Cite web |last=Gross |first=Carson |date=21 Sep 2023 |title=The #ViewSource Affordance |url=https://htmx.org/essays/right-click-view-source/ |url-status=live |archive-url=https://web.archive.org/web/20260228105626/https://htmx.org/essays/right-click-view-source/ |archive-date=28 Feb 2026 |access-date=24 Mar 2026 |website=</> htmx ~ Essays}}</ref>
*'''Excessive tracking''': JS is far more capable of tracking user behavior than HTML and [[CSS]]<!-- See "CSS Exfil": https://www.mike-gualtieri.com/posts/stealing-data-with-css-attack-and-defense/ --> '''combined'''.<ref>https://clickclickclick.click/</ref> JS can communicate with almost any server (limited only by [[wikipedia:Cross-origin_resource_sharing|CORS]]) at any time (limited only by connection availability), using a plethora of protocols. JS can also gather hardware information and compute a [[Device fingerprint|fingerprint of the device]], the user, or both.<ref>https://privacycheck.sec.lrz.de/</ref><ref>https://abrahamjuliot.github.io/creepjs</ref><ref>https://www.deviceinfo.me/</ref><ref>{{Cite web |title=Learn how identifiable you are on the Internet |url=https://www.amiunique.org/ |url-status=live |access-date=19 Mar 2026 |website=Am I Unique ?}}</ref>
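The compatibility point above comes down to progressive enhancement: if a script treats an optional feature as optional and catches the failure, an unsupported browser still gets the core content. A minimal sketch (the <code>enhance</code> helper and its arguments are hypothetical illustrations, not any real site's API):

```javascript
// Sketch only: `enhance` and `page` are hypothetical names. The point is
// that an unsupported feature, when caught, degrades gracefully instead of
// aborting the rest of the script and breaking the whole page.
function enhance(page, feature) {
  try {
    if (typeof feature !== "function") {
      // A legacy or minimalist browser that lacks the feature lands here.
      throw new TypeError("feature not supported");
    }
    page.enhanced = feature();
  } catch (err) {
    page.enhanced = false; // degrade: the core content still works
  }
  return page;
}

// Modern browser: the optional feature runs.
console.log(enhance({ content: "article" }, () => true).enhanced); // true
// Legacy browser: the feature is missing, but the content survives.
console.log(enhance({ content: "article" }, undefined).content); // "article"
```

The same pattern underlies feature detection in real pages (checking that an API exists before calling it); sites that skip it are the ones that fail completely on older or minimalist browsers.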
===Security===
Browser-engine developers (such as Google and Mozilla) not only feel compelled, but are financially incentivized, to optimize JS to its limits. This leads to complex codebases that are harder to verify for correctness. Browser vendors mitigate this via [[wikipedia:Sandbox_(computer_security)|sandboxing]]. Unfortunately, since modern browsers compile JS to native CPU code (see [[wikipedia:Just-in-time_compilation|JIT]]) to improve performance, this introduces a higher risk of sandbox escape.<ref>{{Cite web |last=Norman |first=Johnathan |date=4 Aug 2021 |title=Super Duper Secure Mode |url=https://microsoftedge.github.io/edgevr/posts/Super-Duper-Secure-Mode/ |url-status=live |archive-url=https://web.archive.org/web/20260218110912/https://microsoftedge.github.io/edgevr/posts/Super-Duper-Secure-Mode |archive-date=18 Feb 2026 |access-date=19 Mar 2026 |website=Microsoft Browser Vulnerability Research}}</ref> Some examples of this are as follows:
*[[wikipedia:Cross-site_scripting|XSS]], which [[wikipedia:NoScript|NoScript]] tries to mitigate