[Lex Computer & Tech Group/LCTG] AI bias

Adam Broun abroun at gmail.com
Sat Apr 9 10:37:21 PDT 2022


" It wouldn't surprise me at all if criminal records of relatives correlate more strongly with crime.”

Not if a Black kid is charged and a White kid is let off with a warning for the same offense. Or when a poor person has to rely on a public defender and gets a record, and a wealthier person can afford an attorney who gets them acquitted.

https://www.vera.org/downloads/publications/for-the-record-unjust-burden-racial-disparities.pdf <https://www.vera.org/downloads/publications/for-the-record-unjust-burden-racial-disparities.pdf>




> On Apr 9, 2022, at 12:47, Jon Dreyer <jon at jondreyer.org> wrote:
> 
> Not every instance of apparent bias represents bias. John's example seems like an open-and-shut case of bias, but Jerry's may not be. Since zip codes correlate with poverty and poverty correlates with crime, it wouldn't surprise me if zip codes correlate with crime. (Probably not with amount stolen per capita!) It wouldn't surprise me at all if criminal records of relatives correlate more strongly with crime.
> 
> Whether it's ethical to use that data for a given purpose is a separate question.
> 
> -- 
> Jon "Bias Sometime" Dreyer
> Math tutor/Computer Science tutor <http://www.passionatelycurious.com/>
> Jon Dreyer Music <http://music.jondreyer.com/>
> On 4/9/22 11:22, Jerry Harris wrote:
>> John, 
>> You might find the book by Cathy O'Neil, "Weapons of Math Destruction," interesting since it contains more examples in the same vein as the article. The author attended Lexington High School and gave a talk about her book at Follen Church several years ago. Stories such as an algorithm that uses data like zip code and criminal records of relatives to calculate a prisoner's likelihood to commit another crime if released on parole, etc. 
>>     https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815 <https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815>
>> 
>> Jerry
>> 
>> On Sat, Apr 9, 2022 at 10:47 AM john rudy <jjrudy1 at comcast.net <mailto:jjrudy1 at comcast.net>> wrote:
>> I am taking an AI course and a major issue is the bias build into some AI systems.  If, for example, you pick new hires from those who were successful in the past you’ll mostly hire white males because those were who you mostly had in the past.  Here is a fascinating article from the course
>> 
>> https://www.wired.com/story/excerpt-from-automating-inequality/ <https://www.wired.com/story/excerpt-from-automating-inequality/>===============================================
> ::The Lexington Computer and Technology Group Mailing List::
> Reply goes to sender only; Reply All to send to list.
> Send to the list: LCTG at lists.toku.us      Message archives: http://lists.toku.us/private.cgi/lctg-toku.us
> To subscribe: email lctg-subscribe at toku.us  To unsubscribe: email lctg-unsubscribe at toku.us
> Future and Past meeting information: http://LCTG.toku.us
> List information: http://lists.toku.us/listinfo.cgi/lctg-toku.us
> This message was sent to abroun at gmail.com.
> Set your list options: http://lists.toku.us/options.cgi/lctg-toku.us/abroun@gmail.com

-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.toku.us/pipermail/lctg-toku.us/attachments/20220409/ecdee2d5/attachment.html>


More information about the LCTG mailing list