Bias is a major issue in AI and ML algorithms. Often the bias in an algorithm is not caught because the data collected to train it is biased to begin with. Two well-known cases are Google Photos tagging photos of Black people as gorillas, and the Apple Card algorithm drawing gender-bias allegations after women reported being offered far lower credit limits than men.

https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/?sh=58c2f0d5713d

https://www.washingtonpost.com/business/2019/11/11/apple-card-algorithm-sparks-gender-bias-allegations-against-goldman-sachs/

Smita Desai
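To make the mechanism concrete, here is a minimal Python sketch of the hiring example John gives below. The data is entirely made up and scikit-learn is assumed; the point is only that a model fit to historical hiring decisions that favored one group keeps favoring that group for new, equally qualified candidates.

# Toy illustration (hypothetical, synthetic data): a model trained on
# biased historical hiring decisions reproduces that bias going forward.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# group: 1 = historically favored group, 0 = historically disfavored group
group = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)   # skill is distributed the same in both groups

# Historical hiring depended on skill AND on group membership (the bias).
p_hire = 1 / (1 + np.exp(-(skill + 2.0 * group - 1.0)))
hired = rng.random(n) < p_hire

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Score two new candidates with identical skill who differ only in group.
candidates = np.array([[0.5, 1], [0.5, 0]])
print(model.predict_proba(candidates)[:, 1])
# roughly [0.8, 0.4]: same skill, very different predicted "success"

Note that simply dropping the group column does not necessarily fix this, since other features can act as proxies for group membership.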
From: LCTG <lctg-bounces+smitausa=gmail.com@lists.toku.us> On Behalf Of john rudy
Sent: Saturday, April 9, 2022 10:47 AM
To: Lex Computer Group <LCTG@lists.toku.us>
Subject: [Lex Computer & Tech Group/LCTG] AI bias

I am taking an AI course, and a major issue is the bias built into some AI systems. If, for example, you pick new hires to match the people who were successful in the past, you will mostly hire white males, because that is who you mostly hired in the past. Here is a fascinating article from the course:

https://www.wired.com/story/excerpt-from-automating-inequality/

John Rudy
781-861-0402
781-718-8334 (cell)
John.rudy@alum.mit.edu

13 Hawthorne Lane
Bedford, MA 01730-1047