[Lex Computer & Tech Group/LCTG] AI bias

john rudy jjrudy1 at comcast.net
Sat Apr 9 08:03:01 PDT 2022


But the problem is that AI systems are proliferating faster than our understanding of how they operate.  In many cases the system presents an “answer,” but it is impossible to determine exactly how that answer was derived, which makes it very difficult to correct whatever caused the problem.  If the dataset from which Google built its responses was predominantly white faces, one can begin to see how the erroneous conclusion was reached.  That story made the news and, presumably, the problem has been addressed.  In other cases the bias is far less obvious and probably goes uncaught.  I may send out more material from the course.
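
To make the data-imbalance point concrete, here is a minimal sketch (invented for illustration, not from the course) in Python with scikit-learn: a model trained mostly on one group can quietly fail on an underrepresented group, and nothing in the code itself says "biased."

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

def make_group(n, rule):
    # Synthetic two-feature samples; `rule` maps features to labels.
    X = rng.normal(size=(n, 2))
    return X, rule(X).astype(int)

# Assume the two groups follow different feature/label relationships.
rule_a = lambda X: X[:, 0] > 0   # group A: first feature decides
rule_b = lambda X: X[:, 1] > 0   # group B: second feature decides

Xa, ya = make_group(950, rule_a)  # 95% of the training data
Xb, yb = make_group(50, rule_b)   # only 5% from group B

model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Held-out sets, one per group: the model inherits group A's rule.
Xa_t, ya_t = make_group(1000, rule_a)
Xb_t, yb_t = make_group(1000, rule_b)
print("accuracy on group A:", accuracy_score(ya_t, model.predict(Xa_t)))
print("accuracy on group B:", accuracy_score(yb_t, model.predict(Xb_t)))
# Expect something like 0.97 on group A but near chance on group B;
# the imbalance is in the data mix, not anywhere in the code.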

John

 

John Rudy

781-861-0402

781-718-8334 (cell)

John.rudy at alum.mit.edu

 

13 Hawthorne Lane

Bedford, MA  01730-1047



 

From: Robert Primak <bobprimak at yahoo.com> 
Sent: Saturday, April 9, 2022 10:58 AM
To: 'john rudy' <jjrudy1 at comcast.net>; 'Lex Computer Group' <lctg at lists.toku.us>; Smita Desai <smitausa at gmail.com>
Subject: Re: [Lex Computer & Tech Group/LCTG] AI bias

 

That first instance is really embarrassing!

 

-- Bob Primak

 

On Saturday, April 9, 2022, 10:56:04 AM EDT, Smita Desai <smitausa at gmail.com> wrote: 

 

 

Bias is a major issue in AI and ML algorithms. Often the bias in an algorithm is not caught because the data collected is biased to begin with. Among the well-known cases are Google’s photo-tagging algorithm labeling dark-skinned people as gorillas and Apple’s credit card algorithm drawing allegations of gender bias; a sketch of a simple per-group audit follows the links below. 

 

https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/?sh=58c2f0d5713d

 

https://www.washingtonpost.com/business/2019/11/11/apple-card-algorithm-sparks-gender-bias-allegations-against-goldman-sachs/
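
One way such problems do get caught is a per-group audit of a model's outputs. Here is a minimal sketch (my own illustration, not from either article; the predictions are invented) that compares positive-outcome rates across groups and applies the common four-fifths heuristic:

import numpy as np

def selection_rates(predictions, groups):
    # Fraction of positive predictions for each group label.
    return {g: predictions[groups == g].mean() for g in np.unique(groups)}

# Hypothetical predictions for ten applicants in two groups.
preds = np.array([1, 1, 0, 1, 0, 0, 0, 1, 0, 0])
groups = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

rates = selection_rates(preds, groups)
ratio = min(rates.values()) / max(rates.values())
print(rates, "disparate-impact ratio:", round(ratio, 2))
# A ratio well below 0.8 (the "four-fifths rule" heuristic) is a common
# flag that the data or the model deserves a closer look.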

 

Smita Desai

 

From: LCTG <lctg-bounces+smitausa=gmail.com at lists.toku.us> On Behalf Of john rudy
Sent: Saturday, April 9, 2022 10:47 AM
To: Lex Computer Group <LCTG at lists.toku.us>
Subject: [Lex Computer & Tech Group/LCTG] AI bias

 

I am taking an AI course, and a major issue is the bias built into some AI systems.  If, for example, you pick new hires based on who was successful in the past, you will mostly hire white males, because that is mostly who you had in the past.  Here is a fascinating article from the course:

https://www.wired.com/story/excerpt-from-automating-inequality/
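
To see mechanically how that happens, here is a minimal sketch (my own, not from the Wired piece; all numbers are invented): fit a model to past hiring decisions that carried a group penalty, and the model dutifully learns the penalty.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 5000
skill = rng.normal(size=n)            # skill distributed identically
group = rng.integers(0, 2, size=n)    # group membership, independent of skill

# Historical labels: past managers hired on skill MINUS a group penalty.
hired = (skill - 1.0 * (group == 1) + rng.normal(scale=0.5, size=n)) > 0

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)
print("learned coefficients [skill, group]:", model.coef_[0])
# The group coefficient should come out clearly negative: the model has
# faithfully learned to reproduce the old bias, not to measure merit.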

 

John Rudy

781-861-0402

781-718-8334 (cell)

John.rudy at alum.mit.edu

 

13 Hawthorne Lane

Bedford, MA  01730-1047



 

