Google Gemini
Revision as of 08:35, 3 July 2025
❗ Article Status Notice: This article is a stub
This article is underdeveloped and needs additional work to meet the wiki's Content Guidelines and to be in line with our Mission Statement for comprehensive coverage of consumer protection issues.
| Basic Information | |
| --- | --- |
| Release Year | 2023 |
| Product Type | Cloud-based Generative Neural Network Service |
| In Production | Yes |
| Official Website | https://gemini.google.com/ |
Google Gemini is a cloud-based large language model (an AI text generator) that includes image generation features, developed by the American search and advertising giant Google/Alphabet.
Consumer impact summary
- Accusations of exporting American moral codes (e.g. the sexualization of nudity), "woke" sensibilities, and cancel culture
Incidents
This is a list of all consumer protection incidents related to this product. Any incidents not mentioned here can be found in the Google Gemini category.
Over-eager political correctness filter (February 2024)
Several news outlets reported that, when prompted to generate images of German soldiers from the Second World War, Gemini produced not only male soldiers of central European appearance but also historically inaccurate depictions, such as Black male and Asian female soldiers.
Gemini also refused to generate pictures of white couples when instructed to do so, yet had no issues creating images of Black or Asian couples.
It is suspected that this behavior stems from misguided efforts by Google to compensate for ethnic biases in the training data set, to cater to a "woke" zeitgeist, and to support cancel culture: a so-called initial prompt, a hidden fixed text prepended to the user's prompt, instructs the neural network to always generate pictures of people of all genders and ethnicities.
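The prepending mechanism described above can be illustrated with a minimal sketch. Note that the function name, the wording of the hidden instruction, and the wrapper structure are illustrative assumptions, not Google's actual code or prompt text:

```python
# Hypothetical sketch of a hidden "initial prompt": a fixed instruction
# the service prepends to every user prompt before it reaches the model.
# The instruction text below is an assumption for illustration only.
HIDDEN_INITIAL_PROMPT = (
    "When generating images of people, always depict a diverse range "
    "of genders and ethnicities."
)

def build_model_input(user_prompt: str) -> str:
    """Prepend the hidden instruction to whatever the user typed.

    The model never sees the user's prompt alone, which is why the
    steering applies even to requests where it is historically wrong.
    """
    return f"{HIDDEN_INITIAL_PROMPT}\n\nUser request: {user_prompt}"

if __name__ == "__main__":
    print(build_model_input("Draw a German soldier from 1943"))
```

Because the instruction is applied unconditionally, it overrides context (such as a specific historical setting) that would make the steering inappropriate, which matches the behavior reported in this incident.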
Google reacted by displaying a message stating that it was working on improving the depiction of people and would notify users when the feature returned.[1][2]
References
- ↑ Grant, Nico (2024-02-22). "Google Chatbot's A.I. Images Put People of Color in Nazi-Era Uniforms". The New York Times. Archived from the original on 2025-07-02. Retrieved 2025-07-02.
- ↑ Kleinman, Zoe (2024-02-28). "Why Google's 'woke' AI problem won't be an easy fix". BBC. Retrieved 2025-07-02.