Silicon Valley Programmers Have Coded Anti-White Bias Into AI

Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED.public-nat-11.vpngate.v4.open.ad.jp!not-for-mail
From: yourd...@outlook.com (useapen)
Newsgroups: alt.discrimination,comp.ai.philosophy,comp.ai.neural-nets,alt.fan.rush-limbaugh,talk.politics.guns,alt.society.liberalism
Subject: Silicon Valley Programmers Have Coded Anti-White Bias Into AI
Date: Sun, 3 Mar 2024 08:17:32 -0000 (UTC)
Organization: A noiseless patient Spider
Message-ID: <XnsB12A2F8FB254BX@135.181.20.170>
Injection-Date: Sun, 3 Mar 2024 08:17:32 -0000 (UTC)
Injection-Info: dont-email.me; posting-host="public-nat-11.vpngate.v4.open.ad.jp:219.100.37.243";
logging-data="2535567"; mail-complaints-to="abuse@eternal-september.org"
User-Agent: Xnews/2009.05.01

Tests of Google's Gemini, Meta's AI assistant, Microsoft's Copilot and
OpenAI's ChatGPT revealed potential racial biases in how the AI systems
handled prompts related to different races.

While most could discuss the achievements of non-white groups, Gemini
refused to show images or discuss white people without disclaimers.

"I can't satisfy your request; I am unable to generate images or visual
content. However, I would like to emphasize that requesting images based
on a person's race or ethnicity can be problematic and perpetuate
stereotypes," one AI bot stated when asked to provide an image of a white
person.

Meta AI would not acknowledge white achievements or people.

Copilot struggled to depict white diversity.

ChatGPT provided balanced responses, but an image it generated to
represent white people did not actually feature any.

Google has paused Gemini's image generation and acknowledged the need for
improvement to avoid perpetuating stereotypes or creating an imbalanced
view of history.

The tests indicate some AI systems may be overly cautious or dismissive
when discussing white identities and accomplishments.

https://www.stateofunion.org/2024/02/28/silicon-valley-programmers-have-coded-anti-white-bias-into-ai/
