[Lex Computer & Tech Group/LCTG] AI bias

Smita Desai smitausa at gmail.com
Sat Apr 9 07:55:56 PDT 2022


Bias is a major issue in AI and ML algorithms. Often the bias in an
algorithm goes undetected because the data it was trained on was biased to
begin with. Two well-known cases: Google Photos tagging dark-skinned people
as gorillas, and the Apple Card, whose nominally gender-blind algorithm
drew gender-bias allegations against Goldman Sachs.

 

https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/?sh=58c2f0d5713d

 

https://www.washingtonpost.com/business/2019/11/11/apple-card-algorithm-sparks-gender-bias-allegations-against-goldman-sachs/
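
To make the mechanism concrete, below is a rough Python sketch on made-up
data (not any of the systems above, just an illustration): a model fit to
historically skewed hiring decisions reproduces that skew, and simply
hiding the protected attribute does not fix it when a correlated proxy
feature (zip code, school, etc.) is still present.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

qualification = rng.normal(size=n)             # job-relevant signal
group = rng.integers(0, 2, size=n)             # 1 = historically favored group
proxy = group + rng.normal(scale=0.3, size=n)  # correlated stand-in, e.g. zip code

# Historical labels: at equal qualification, the favored group was
# hired far more often -- the bias is already in the data.
hired = (qualification + 1.5 * group
         + rng.normal(scale=0.5, size=n)) > 1.0

# Model 1 sees the protected attribute directly.
m1 = LogisticRegression().fit(np.column_stack([qualification, group]), hired)

# Model 2 is "blind" to the attribute but sees the proxy.
m2 = LogisticRegression().fit(np.column_stack([qualification, proxy]), hired)

# Two applicants with identical qualifications, different groups.
print("direct:", m1.predict_proba([[1.0, 1], [1.0, 0]])[:, 1])
print("blind :", m2.predict_proba([[1.0, 1.0], [1.0, 0.0]])[:, 1])

Both models score the favored-group applicant much higher despite identical
qualifications; dropping the sensitive field is not enough when the rest of
the data still encodes it.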

 

Smita Desai

 

From: LCTG <lctg-bounces+smitausa=gmail.com at lists.toku.us> On Behalf Of john rudy
Sent: Saturday, April 9, 2022 10:47 AM
To: Lex Computer Group <LCTG at lists.toku.us>
Subject: [Lex Computer & Tech Group/LCTG] AI bias

 

I am taking an AI course, and a major issue is the bias built into some AI
systems.  If, for example, you pick new hires to match those who were
successful in the past, you'll mostly hire white males, because that is who
you mostly had in the past.  Here is a fascinating article from the course:

https://www.wired.com/story/excerpt-from-automating-inequality/

 

John Rudy

781-861-0402

781-718-8334 (cell)

John.rudy at alum.mit.edu

 

13 Hawthorne Lane

Bedford, MA  01730-1047



 


