[Lex Computer & Tech Group/LCTG] AI bias
Jerry Harris
jerryharri at gmail.com
Sat Apr 9 11:29:44 PDT 2022
> wouldn't surprise me if zip codes correlate with crime. (Probably not
> with the amount stolen per capita!) It wouldn't surprise me at all if
> criminal records of relatives correlate more strongly with crime
The bias comes into play when this data is used to give people from
certain zip codes or certain families longer sentences, or to assign more
police officers to certain regions, where in turn they make more arrests
for the minor offenses they observe (e.g., a broken tail light) and thus
feed misleading data back into the models.
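
As a rough illustration (a toy sketch of my own, not something from the
book or the article), here is that feedback loop in a few lines of Python:
two zones with identical true offense rates, patrols shifted each year
toward whichever zone has more recorded arrests, and arrest counts that
grow with the number of officers watching.

    import random
    random.seed(1)

    OFFENSE_RATE = 0.05            # same true offense rate in both zones
    POPULATION = 10_000
    patrols = {"A": 50, "B": 50}
    arrests = {"A": 10, "B": 12}   # slightly skewed historical record

    for year in range(1, 11):
        # shift 5 patrols toward the zone with more recorded arrests
        hot = max(arrests, key=arrests.get)
        cold = min(arrests, key=arrests.get)
        shift = min(5, patrols[cold])
        patrols[hot] += shift
        patrols[cold] -= shift

        for zone in arrests:
            # offenses actually committed this year (same process in A and B)
            offenses = sum(random.random() < OFFENSE_RATE
                           for _ in range(POPULATION))
            # more officers present -> more of those offenses get recorded
            seen = patrols[zone] / 100
            arrests[zone] += sum(random.random() < seen
                                 for _ in range(offenses))

        print(year, patrols, arrests)

Within a few years one zone has nearly all the patrols and an arrest count
that looks like proof it "needed" them, even though the two zones never
actually differed. That is the kind of misleading data the model then
learns from.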
Cathy imagines these algorithms being used to make different decisions:
> Imagine if you used recidivist models to provide the at-risk inmates with
> counseling and job training while in prison. Or if police doubled down on
> foot patrols in high crime zip codes -- working to build relationships with
> the community instead of arresting people for minor offenses.
> You might notice there's a human element to these solutions. Because really
> that's the key. Algorithms can inform and illuminate and supplement our
> decisions and policies. But to get not-evil results, humans and data really
> have to work together.
"Big Data processes codify the past," O'Neil writes. "They do not invent
> the future. Doing that requires moral imagination, and that's something
> only humans can provide."
https://money.cnn.com/2016/09/06/technology/weapons-of-math-destruction/index.html
Jerry
On Sat, Apr 9, 2022 at 12:47 PM Jon Dreyer <jon at jondreyer.org> wrote:
> Not every instance of apparent bias represents bias. John's example seems
> like an open-and-shut case of bias, but Jerry's may not be. Since zip codes
> correlate with poverty and poverty correlates with crime, it wouldn't
> surprise me if zip codes correlate with crime. (Probably not with amount
> stolen per capita!) It wouldn't surprise me at all if criminal records of
> relatives correlate more strongly with crime.
>
> Whether it's ethical to use that data for a given purpose is a separate
> question.
>
> --
> Jon "Bias Sometime" Dreyer
> Math tutor/Computer Science tutor <http://www.passionatelycurious.com>
> Jon Dreyer Music <http://music.jondreyer.com>
> On 4/9/22 11:22, Jerry Harris wrote:
>
> John,
> You might find the book by Cathy O'Neil, "Weapons of Math Destruction,"
> interesting since it contains more examples in the same vein as the
> article. The author attended Lexington High School and gave a talk about
> her book at Follen Church several years ago. It includes stories such as an
> algorithm that uses data like zip code and the criminal records of relatives
> to calculate a prisoner's likelihood of committing another crime if released
> on parole.
>
> https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815
>
> Jerry
>
> On Sat, Apr 9, 2022 at 10:47 AM john rudy <jjrudy1 at comcast.net> wrote:
>
>> I am taking an AI course, and a major issue is the bias built into some AI
>> systems. If, for example, you pick new hires from those who were
>> successful in the past, you'll mostly hire white males, because those are
>> mostly who you had in the past. Here is a fascinating article from the
>> course:
>>
>> https://www.wired.com/story/excerpt-from-automating-inequality/
>>