*'''Degraded accessibility''': Dynamic and/or active content is notorious for poor accessibility among users with visual and/or cognitive impairments. While standards such as [[wikipedia:WAI-ARIA|WAI-ARIA]] were created to mitigate this, they are no silver bullet, especially when developers are unaware of ARIA.
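As a sketch of the extra work ARIA demands (element ids here are hypothetical): a <code>&lt;div&gt;</code> styled to look like a button is invisible to screen readers and keyboards until the developer re-adds every semantic by hand, all of which a native <code>&lt;button&gt;</code> provides for free.

<syntaxhighlight lang="javascript">
// Minimal attributes needed to make a styled div behave like a button
// for assistive technology. A native <button> provides all of this for free.
function ariaButtonAttrs(label) {
  return {
    role: 'button',      // announce as a button to screen readers
    tabindex: '0',       // make it keyboard-focusable
    'aria-label': label, // give it an accessible name
  };
}

// Browser-only wiring; inert under Node. The "#save" id is hypothetical.
if (typeof document !== 'undefined') {
  const el = document.querySelector('#save');
  for (const [k, v] of Object.entries(ariaButtonAttrs('Save'))) {
    el.setAttribute(k, v);
  }
}
</syntaxhighlight>

Even this sketch is incomplete: a real custom button also needs keyboard handlers for <code>Enter</code> and <code>Space</code>, which is exactly why hand-rolled ARIA so often falls short.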
*'''Degraded compatibility''': While HTML and CSS degrade gracefully (a web browser that does not support a certain feature simply ignores it and loads the rest of the page), JavaScript does not. If a script uses a feature the browser does not support, and the resulting error is not caught with a <code>try</code>...<code>catch</code> block, the rest of the script is not executed, which usually breaks the entire site if it requires JavaScript for basic functions, as "web apps" usually do. This makes such sites completely inaccessible, rather than partially usable, from legacy systems that cannot run recent web browsers, as well as from minimalist web browsers that challenge the Google-Mozilla duopoly.<ref>{{cite web |url=https://jakearchibald.com/2013/progressive-enhancement-still-important/ |title=Progressive enhancement is still important - JakeArchibald.com |date=2013-07-03 |access-date=2026-04-18 }}</ref><ref>{{cite web |url=https://digdeeper.club/articles/browsers.xhtml#minimal |title=How to choose a browser for everyday use? § Why "minimalist" browsers suck. |author=Dig Deeper |access-date=2026-04-22 }}</ref>
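A sketch of the defensive pattern described above, using <code>structuredClone</code> (absent from older engines) as the example feature. Note the limits of this approach: missing ''syntax'' (rather than a missing API) fails at parse time, before any <code>try</code> block runs, so no amount of runtime guarding can rescue it.

<syntaxhighlight lang="javascript">
// Runtime feature detection: fall back instead of letting one missing
// API kill the whole script. structuredClone is absent in older engines.
function safeClone(obj) {
  try {
    return structuredClone(obj); // throws ReferenceError if unsupported
  } catch {
    // Lossy fallback: drops functions, Dates, cyclic references, etc.
    return JSON.parse(JSON.stringify(obj));
  }
}

console.log(safeClone({ a: 1 })); // { a: 1 }
</syntaxhighlight>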
*'''Lack of transparency''': To optimize network bandwidth, JS code is typically served in [[wikipedia:Minification_(programming)|minified]] form, which makes it harder for humans to understand. This is particularly problematic when the original source is not publicly [[wikipedia:Source-available_software|available]], as is typically the case with [[wikipedia:Proprietary_software|proprietary software]].<ref>{{Cite web |last=Gross |first=Carson |date=21 Sep 2023 |title=The #ViewSource Affordance |url=https://htmx.org/essays/right-click-view-source/ |url-status=live |archive-url=https://web.archive.org/web/20260228105626/https://htmx.org/essays/right-click-view-source/ |archive-date=28 Feb 2026 |access-date=24 Mar 2026 |website=</> htmx ~ Essays}}</ref>
*'''Excessive tracking''': JS is far more capable than HTML and [[CSS]]<!-- See "CSS Exfil": https://www.mike-gualtieri.com/posts/stealing-data-with-css-attack-and-defense/ --> '''combined''' at tracking user behavior.<ref>https://clickclickclick.click/</ref> JS can communicate with almost any server (limited only by [[wikipedia:Cross-origin_resource_sharing|CORS]]) at any time (limited only by connection availability), using a plethora of protocols. It can also read hardware information and compute a [[Device fingerprint|fingerprint of the device]], the user, or both.<ref>https://privacycheck.sec.lrz.de/</ref><ref>https://abrahamjuliot.github.io/creepjs</ref><ref>https://www.deviceinfo.me/</ref><ref>{{Cite web |title=Learn how identifiable you are on the Internet |url=https://www.amiunique.org/ |url-status=live |access-date=19 Mar 2026 |website=Am I Unique ?}}</ref>
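To make this concrete, here is a sketch of a tiny slice of the fingerprinting surface: a handful of the hundreds of traits JS can read, hashed into an identifier (FNV-1a is used purely for illustration; real fingerprinters such as CreepJS collect far more and combine traits statistically).

<syntaxhighlight lang="javascript">
// A few of the traits scripts can read without any permission prompt.
// Guards make this also run outside a browser.
const nav = typeof navigator !== 'undefined' ? navigator : {};
const traits = {
  userAgent: nav.userAgent ?? 'n/a',
  language: nav.language ?? 'n/a',
  cores: nav.hardwareConcurrency ?? 'n/a',
  screen: typeof screen !== 'undefined'
    ? `${screen.width}x${screen.height}` : 'n/a',
  timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
};

// Hash the combined traits into one identifier (32-bit FNV-1a).
const s = JSON.stringify(traits);
let h = 0x811c9dc5; // FNV offset basis
for (let i = 0; i < s.length; i++) {
  h = Math.imul(h ^ s.charCodeAt(i), 0x01000193) >>> 0;
}
const fp = h.toString(16);
console.log(traits, fp);
</syntaxhighlight>

Each trait alone reveals little; the combination is often unique per device, which is the whole point of fingerprinting.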
*'''Security risks''': It is well-known that JS is poorly designed,<ref>https://github.com/denysdovhan/wtfjs</ref><ref>https://github.com/brianleroux/wtfjs</ref><ref>https://github.com/Rudxain/ideas/blob/aa9a80252a4b7c9c51f32eda5c716e96220ed96e/software/evar/with_bf.js</ref> and even [[wikipedia:Ecma_International|tc39]] acknowledges this{{Citation needed}}<!-- They do improve (and complicate) it every year, but the fact that `eval` isn't deprecated implies they don't care that much about improving the language -->. This leads programmers, even experienced ones, to accidentally introduce vulnerabilities into their code. That, combined with the fact that ES is [[wikipedia:Turing_completeness|Turing-complete]]<!-- Not typo. ECMAScript alone is TC. No need for extensions --> (both [https://gavinhoward.com/2024/03/what-computers-cannot-do-the-consequences-of-turing-completeness/#mathematical-vs-practical in practice and in theory]), makes [[wikipedia:Debugging|debugging]] and [[wikipedia:Reverse_engineering|reverse-engineering]] impractical in large code-bases. It is worth noting that tooling such as [[wikipedia:TypeScript|TypeScript]] and [[wikipedia:ESLint|ESLint]] exists to substantially reduce the likelihood of [[wikipedia:Software_bug|bugs]].
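A few classic coercion traps from the "wtfjs" genre illustrate the design problem: every result below is well-specified by ECMAScript, yet routinely surprises developers, and surprises are where bugs and vulnerabilities come from.

<syntaxhighlight lang="javascript">
// Each result is per-spec, not a bug in the engine.
console.log([] + []);           // "" (both arrays coerce to empty strings)
console.log([] + {});           // "[object Object]"
console.log('5' - 1);           // 4 (string coerced to number)
console.log('5' + 1);           // "51" (number coerced to string)
console.log(0.1 + 0.2 === 0.3); // false (IEEE-754 rounding; not unique to JS)
// Tools like TypeScript and ESLint flag most of these before they ship.
</syntaxhighlight>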
*'''Degraded performance (web apps)''': "Web apps" load slower than traditional web sites because a large amount of code must be processed by the web browser before any content can appear on screen, placing the content at the very end of the critical rendering path.<ref>{{cite web |title=Critical rendering path – Mozilla Developer Network |url=https://developer.mozilla.org/en-US/docs/Web/Performance/Guides/Critical_rendering_path |access-date=2026-04-18 }}</ref>
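The underlying mechanism can be sketched directly: the JS thread is also the rendering thread, so while a script runs, the page cannot paint. This simulates a render-blocking script (the 50 ms figure is arbitrary, chosen for the demonstration).

<syntaxhighlight lang="javascript">
// While this loop spins, a browser can neither paint nor handle input.
function busyWait(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) {} // nothing else happens on this thread
}

const t0 = Date.now();
busyWait(50);
const blockedMs = Date.now() - t0;
console.log(`main thread blocked for ~${blockedMs} ms`);
</syntaxhighlight>

A "web app" effectively puts many such stalls (downloading, parsing, and executing its bundle) before the first pixel of content.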
==How it works==
Whenever a user visits a webpage, a typical web browser executes any JS code it finds in <code>&lt;script&gt;</code> [[wikipedia:HTML_element|tags]]. This code could do anything from updating part of the [[wikipedia:Document_Object_Model|DOM]] tree only when the user requests it, to showing a [[wikipedia:Pop-up_ad|popup/popunder]].
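The benign end of that spectrum looks like this sketch: fetch data only when the user asks for it, then patch one part of the DOM tree. The endpoint and element ids are hypothetical; <code>renderItems</code> is a pure function, so it also runs outside a browser, while the event wiring is browser-only.

<syntaxhighlight lang="javascript">
// Build list markup from data, escaping it so user data is never
// interpreted as HTML.
function renderItems(items) {
  const esc = (s) => String(s).replace(/[&<>"']/g,
    (c) => `&#${c.charCodeAt(0)};`);
  return items.map((it) => `<li>${esc(it.name)}</li>`).join('');
}

// Browser-only wiring; inert under Node. Endpoint and ids are hypothetical.
if (typeof document !== 'undefined') {
  document.querySelector('#load-more').addEventListener('click', async () => {
    const items = await (await fetch('/api/items?page=2')).json();
    document.querySelector('#item-list')
      .insertAdjacentHTML('beforeend', renderItems(items));
  });
}
</syntaxhighlight>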