[Lex Computer & Tech Group/LCTG] AI bias
Jon Dreyer
jon at jondreyer.org
Sat Apr 9 09:47:52 PDT 2022
Not every instance of apparent bias represents bias. John's example
seems like an open-and-shut case of bias, but Jerry's may not be. Since
zip codes correlate with poverty and poverty correlates with crime, it
wouldn't surprise me if zip codes correlate with crime. (Probably not
with amount stolen per capita!) It wouldn't surprise me at all if
criminal records of relatives correlate more strongly with crime.
Whether it's ethical to use that data for a given purpose is a separate
question.
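The proxy effect described above can be made concrete with a minimal sketch. This uses made-up synthetic data (not real crime or income statistics): a latent "poverty" factor drives both a hypothetical zip-code score and a crime rate, so the two end up correlated with each other even though neither causes the other.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

n = 10_000
# Hypothetical latent factor: poverty level, one value per area.
poverty = [random.gauss(0, 1) for _ in range(n)]
# Two observed variables, each driven partly by poverty plus independent noise.
zip_score = [p + random.gauss(0, 0.5) for p in poverty]
crime_rate = [p + random.gauss(0, 0.5) for p in poverty]

print("zip vs poverty: ", round(pearson(zip_score, poverty), 2))
print("poverty vs crime:", round(pearson(poverty, crime_rate), 2))
print("zip vs crime:    ", round(pearson(zip_score, crime_rate), 2))
```

With both links this strong, the zip-code score inherits a clear correlation with crime rate despite having no direct connection to it. (Correlation is not transitive in general, which is why the text hedges with "it wouldn't surprise me": weak links can cancel out.)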
--
Jon "Bias Sometime" Dreyer
Math tutor/Computer Science tutor <http://www.passionatelycurious.com>
Jon Dreyer Music <http://music.jondreyer.com>
On 4/9/22 11:22, Jerry Harris wrote:
> John,
> You might find the book by Cathy O'Neil, "Weapons of Math
> Destruction," interesting since it contains more examples in the same
> vein as the article. The author attended Lexington High School and
> gave a talk about her book at Follen Church several years ago. It
> includes stories such as an algorithm that uses data like zip code
> and the criminal records of relatives to calculate a prisoner's
> likelihood of committing another crime if released on parole.
> https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815
>
> Jerry
>
> On Sat, Apr 9, 2022 at 10:47 AM john rudy <jjrudy1 at comcast.net> wrote:
>
    I am taking an AI course and a major issue is the bias built into
    some AI systems. If, for example, you pick new hires from those
    who were successful in the past, you’ll mostly hire white males
    because those are mostly who you hired in the past. Here is a
    fascinating article from the course
>
> https://www.wired.com/story/excerpt-from-automating-inequality/
>