<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40"><head><meta http-equiv=Content-Type content="text/html; charset=utf-8"><meta name=Generator content="Microsoft Word 15 (filtered medium)"><!--[if !mso]><style>v\:* {behavior:url(#default#VML);}
o\:* {behavior:url(#default#VML);}
w\:* {behavior:url(#default#VML);}
.shape {behavior:url(#default#VML);}
</style><![endif]--><style><!--
/* Font Definitions */
@font-face
{font-family:Helvetica;
panose-1:2 11 6 4 2 2 2 2 2 4;}
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
font-size:11.0pt;
font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:blue;
text-decoration:underline;}
p.yiv4568511074msonormal, li.yiv4568511074msonormal, div.yiv4568511074msonormal
{mso-style-name:yiv4568511074msonormal;
mso-margin-top-alt:auto;
margin-right:0in;
mso-margin-bottom-alt:auto;
margin-left:0in;
font-size:11.0pt;
font-family:"Calibri",sans-serif;}
span.EmailStyle28
{mso-style-type:personal-reply;
font-family:"Arial",sans-serif;
font-variant:normal !important;
color:windowtext;
text-transform:none;
text-decoration:none none;
vertical-align:baseline;}
.MsoChpDefault
{mso-style-type:export-only;
font-size:10.0pt;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
--></style><!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1026" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]--></head><body lang=EN-US link=blue vlink=purple style='word-wrap:break-word'><div class=WordSection1><p class=MsoNormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif'>But the problem is that AI systems are proliferating faster than our understanding of how they operate. In many cases the system presents an “answer,” but it is impossible to determine exactly how that answer was derived, which makes it very difficult to correct whatever caused the problem. If the dataset from which Google built its responses consisted predominantly of white faces, one can begin to see how the erroneous conclusion was reached. That story made the news and, presumably, the problem has since been addressed. In other cases the bias has been far less “obvious” and has probably gone uncaught. I may send out more material from the course.<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif'>John<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif'><o:p> </o:p></span></p><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif'>John Rudy<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif'>781-861-0402<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif'>781-718-8334 (cell)<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif'><a href="mailto:John.rudy@alum.mit.edu"><span style='color:#0563C1'>John.rudy@alum.mit.edu</span></a> <o:p></o:p></span></p><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif'><o:p> </o:p></span></p><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif'>13 Hawthorne Lane<o:p></o:p></span></p><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif'>Bedford, MA 
01730-1047<o:p></o:p></span></p><p class=MsoNormal><img border=0 width=99 height=94 style='width:1.0312in;height:.9791in' id="Picture_x0020_2" src="cid:image001.png@01D84C01.60B347F0"><span style='font-size:10.0pt;font-family:"Arial",sans-serif'><o:p></o:p></span></p></div><p class=MsoNormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif'><o:p> </o:p></span></p><div><div style='border:none;border-top:solid #E1E1E1 1.0pt;padding:3.0pt 0in 0in 0in'><p class=MsoNormal><b>From:</b> Robert Primak <bobprimak@yahoo.com> <br><b>Sent:</b> Saturday, April 9, 2022 10:58 AM<br><b>To:</b> 'john rudy' <jjrudy1@comcast.net>; 'Lex Computer Group' <lctg@lists.toku.us>; Smita Desai <smitausa@gmail.com><br><b>Subject:</b> Re: [Lex Computer & Tech Group/LCTG] AI bias<o:p></o:p></p></div></div><p class=MsoNormal><o:p> </o:p></p><div><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif'>That first instance is really embarrassing!<o:p></o:p></span></p></div><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif'><o:p> </o:p></span></p></div><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif'>-- Bob Primak<o:p></o:p></span></p></div><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif'><o:p> </o:p></span></p></div></div><div id="yahoo_quoted_0472357449"><div><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'>On Saturday, April 9, 2022, 10:56:04 AM EDT, Smita Desai <<a href="mailto:smitausa@gmail.com">smitausa@gmail.com</a>> wrote: <o:p></o:p></span></p></div><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p> </o:p></span></p></div><div><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p> </o:p></span></p></div><div><div id=yiv4568511074><div><div><p 
class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'>Bias is a major issue in AI and ML algorithms. Often the bias in an algorithm is not caught because the data collected was biased to begin with. Among the well-known cases are Google’s algorithm labeling dark-skinned people as gorillas and Apple’s credit card algorithm discriminating by gender. </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'> </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'><a href="https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/?sh=58c2f0d5713d" target="_blank">https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/?sh=58c2f0d5713d</a></span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'> </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'><a href="https://www.washingtonpost.com/business/2019/11/11/apple-card-algorithm-sparks-gender-bias-allegations-against-goldman-sachs/" target="_blank">https://www.washingtonpost.com/business/2019/11/11/apple-card-algorithm-sparks-gender-bias-allegations-against-goldman-sachs/</a></span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p 
class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'> </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'>Smita Desai</span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'> </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><div id=yiv4568511074yqt91533><div><div style='border:none;border-top:solid #E1E1E1 1.0pt;padding:3.0pt 0in 0in 0in'><p class=yiv4568511074msonormal><b><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'>From:</span></b><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'> LCTG <<a href="mailto:lctg-bounces+smitausa=gmail.com@lists.toku.us">lctg-bounces+smitausa=gmail.com@lists.toku.us</a>> <b>On Behalf Of </b>john rudy<br><b>Sent:</b> Saturday, April 9, 2022 10:47 AM<br><b>To:</b> Lex Computer Group <<a href="mailto:LCTG@lists.toku.us">LCTG@lists.toku.us</a>><br><b>Subject:</b> [Lex Computer & Tech Group/LCTG] AI bias<o:p></o:p></span></p></div></div><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'> <o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'>I am taking an AI course, and a major issue is the bias built into some AI systems. If, for example, you pick new hires based on who was successful in the past, you’ll mostly hire white males, because that is mostly who you had in the past. 
Here is a fascinating article from the course</span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'><a href="https://www.wired.com/story/excerpt-from-automating-inequality/" target="_blank">https://www.wired.com/story/excerpt-from-automating-inequality/</a></span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:12.0pt;font-family:"Arial",sans-serif;color:#26282A'> </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif;color:#26282A'>John Rudy</span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif;color:#26282A'>781-861-0402</span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif;color:#26282A'>781-718-8334 (cell)</span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif;color:#26282A'><a href="mailto:John.rudy@alum.mit.edu" target="_blank">John.rudy@alum.mit.edu</a> </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif;color:#26282A'> </span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span 
style='font-size:10.0pt;font-family:"Arial",sans-serif;color:#26282A'>13 Hawthorne Lane</span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Arial",sans-serif;color:#26282A'>Bedford, MA 01730-1047</span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p></div><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><img border=0 width=99 height=94 style='width:1.0312in;height:.9791in' id="yiv4568511074Picture_x0020_1" src="cid:image001.png@01D84C01.60B347F0"></span><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'><o:p></o:p></span></p><p class=yiv4568511074msonormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'> <o:p></o:p></span></p></div></div></div><div id=yqt98388><p class=MsoNormal><span style='font-size:10.0pt;font-family:"Helvetica",sans-serif;color:#26282A'>===============================================<br>::The Lexington Computer and Technology Group Mailing List::<br>Reply goes to sender only; Reply All to send to list.<br>Send to the list: <a href="mailto:LCTG@lists.toku.us">LCTG@lists.toku.us</a> Message archives: <a href="http://lists.toku.us/private.cgi/lctg-toku.us" target="_blank">http://lists.toku.us/private.cgi/lctg-toku.us</a><br>To subscribe: email <a href="mailto:lctg-subscribe@toku.us">lctg-subscribe@toku.us</a> To unsubscribe: email <a href="mailto:lctg-unsubscribe@toku.us">lctg-unsubscribe@toku.us</a><br>Future and Past meeting information: <a href="http://LCTG.toku.us" target="_blank">http://LCTG.toku.us</a><br>List information: <a href="http://lists.toku.us/listinfo.cgi/lctg-toku.us" target="_blank">http://lists.toku.us/listinfo.cgi/lctg-toku.us</a><br>This message was sent to <a 
href="mailto:bobprimak@yahoo.com">bobprimak@yahoo.com</a>.<br>Set your list options: <a href="http://lists.toku.us/options.cgi/lctg-toku.us/bobprimak@yahoo.com" target="_blank">http://lists.toku.us/options.cgi/lctg-toku.us/bobprimak@yahoo.com</a><o:p></o:p></span></p></div></div></div></div></div></body></html>