== Hello ==
Hi. Most of my wiki experience is from editing on Wikipedia. My background is in computers, computers & society, and technical writing/editing, plus some public health.
At the moment I am mostly using this page to draft pieces for articles.
==Draft for article on Artificial Intelligence==


=== Machine learning (Generative AI is a subset) ===
 
Bias. A machine learning system replicates the patterns and deficiencies of its training data. The bias may be intentional on the part of the trainers, or it may come from patterns in the data set that the trainers did not consider or were unaware of. Examples: an image classifier labeled African-Americans as gorillas; a self-driving car killed a pedestrian (who was walking a bicycle), reportedly because the training set did not include people walking bicycles (what about walkers, canes, unicycles, strollers, ...?); an Amazon hiring program was trained on a primarily male workforce, so it discarded resumes that contained the word "women" (or other markers).
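A minimal sketch of that last mechanism, assuming scikit-learn and an invented four-resume data set (this is not Amazon's actual system): a classifier trained on biased past hiring decisions learns to penalize the token "women" as a proxy, even though it was never told anything about gender.

<syntaxhighlight lang="python">
# Sketch only: invented data, not any real hiring system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented past hiring decisions from a mostly male workforce.
resumes = [
    "captain chess club, java developer",           # hired
    "java developer, sysadmin experience",          # hired
    "captain women's chess club, java developer",   # rejected
    "women's coding society, sysadmin experience",  # rejected
]
hired = [1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The token "women" gets the most negative weight, even though the
# substantive qualifications are the same in both groups.
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:3])
</syntaxhighlight>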




==Why it is a problem==
=== LLM ===
Unreliable. An LLM produces fluent text by predicting statistically likely next tokens, not by checking facts, so it can state falsehoods with complete confidence. There is no known way to make them reliable.
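A minimal sketch of why, using an invented toy distribution rather than a real model: output is sampled token by token according to plausibility, and no step in the loop verifies truth.

<syntaxhighlight lang="python">
# Sketch only: the probabilities below are invented for illustration.
import random

# Toy next-token distribution after "The capital of Australia is ...".
# Suppose "Sydney" is more common in the training text than the
# correct answer, "Canberra".
next_token_probs = {"Sydney": 0.55, "Canberra": 0.35, "Melbourne": 0.10}

def sample(probs: dict) -> str:
    """Draw one token in proportion to its probability."""
    return random.choices(list(probs), weights=list(probs.values()))[0]

# The wrong answer is generated more often than the right one, and
# nothing in the procedure can tell the difference.
draws = [sample(next_token_probs) for _ in range(1000)]
print({tok: draws.count(tok) for tok in next_token_probs})
</syntaxhighlight>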




==Examples (of abuse)==
=== LLM ===
Customer service chatbots present misinformation as fact (a rug pull): for example, they misrepresent prices or misstate policies. Even if the company calls it a mistake when challenged, the company may profit from customers who don't notice, or don't know to challenge it.[Cite burger joint, system capabilities]
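A minimal sketch of the failure mode, with an invented menu and a stubbed-in model reply (no real product or API): nothing ties the chatbot's fluent answer to the company's actual price list unless the deployer builds that check in.

<syntaxhighlight lang="python">
# Sketch only: the menu, prices, and stub reply are all invented.
MENU = {"burger": 5.99, "fries": 2.49}  # the authoritative price list

def llm_answer(question: str) -> str:
    """Stand-in for an LLM reply: fluent, confident, and unchecked."""
    return "A burger is only $3.99 today!"  # plausible-sounding, wrong

def grounded_answer(item: str) -> str:
    """Quotes only the authoritative source, or declines to answer."""
    if item in MENU:
        return f"A {item} is ${MENU[item]:.2f}."
    return "I can't find that item; please ask a human agent."

print(llm_answer("How much is a burger?"))  # misstates the price
print(grounded_answer("burger"))            # matches the real menu
</syntaxhighlight>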


Search summaries, such as Google's "AI Overviews".


Vibe coding. "Vibe" here means incompetent: if you are vibe coding, the AI will not teach you best practices or show you what you are doing wrong. An experienced programmer may fix the problems caused by AI coding, but in studies this takes more time than doing the job without AI.


=== Generative, not just LLMs ===
Providers of "nudify" programs typically do not provide adequate user education on the legal and reputational dangers to users. They also do not adequately protect the photographic subjects (for example, by enforcing that the model be informed and that the user hold a valid release from the model).


==Further Reading==