• AI DIY Episode 3 Roboaudit

  • Dec 28 2021
  • Length: 1 hr and 16 mins
  • Podcast


  • Summary

  • Dr Pamela Ugwudike is an expert on the use of AI in justice systems and is particularly interested in how AI and data bias affect criminal justice. She wants to build an AI that can audit other AIs for bias, to make sure that artificial intelligence always operates in the interest of humanity. There is currently a huge amount of interest in AI ethics and bias in the research community, and we are discovering many ways in which our data and our computations can lead to unfairness.


    Perhaps the most famous case of AI bias comes from the field of criminal justice. The COMPAS algorithm (used in US courts to predict whether a defendant would go on to reoffend) was twice as likely to incorrectly label black people as reoffenders as white people. While there is plenty of advice, and there are many programming toolkits, aimed at helping human developers eradicate bias from the AI products they are building, IBM's Watson OpenScale is the only AI platform that claims to detect and correct biases in its own operation. Although is that just IBM's marketing department being a bit biased?
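    The COMPAS finding above boils down to a simple fairness metric: among people who did not reoffend, what fraction were wrongly labelled high risk, and does that fraction differ by group? Here is a minimal sketch of that false-positive-rate comparison. All names and data are made up for illustration; this is not code from the episode or from any real audit.

    ```python
    def false_positive_rate(predictions, outcomes):
        """Fraction of non-reoffenders (outcome 0) wrongly labelled high risk (prediction 1)."""
        flags_for_non_reoffenders = [p for p, o in zip(predictions, outcomes) if o == 0]
        if not flags_for_non_reoffenders:
            return 0.0
        return sum(flags_for_non_reoffenders) / len(flags_for_non_reoffenders)

    # Hypothetical risk labels (1 = predicted to reoffend) and actual
    # outcomes (1 = actually reoffended) for two demographic groups.
    group_a_pred = [1, 1, 0, 1, 0, 0, 1, 0]
    group_a_true = [0, 1, 0, 0, 0, 0, 1, 0]
    group_b_pred = [0, 1, 0, 0, 0, 0, 1, 0]
    group_b_true = [0, 1, 0, 0, 0, 0, 0, 0]

    fpr_a = false_positive_rate(group_a_pred, group_a_true)
    fpr_b = false_positive_rate(group_b_pred, group_b_true)
    print(f"Group A false positive rate: {fpr_a:.2f}")
    print(f"Group B false positive rate: {fpr_b:.2f}")
    ```

    An automated auditor of the kind the episode discusses would compute disparities like this across many metrics and groups, and flag a model when the gap exceeds some threshold.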


    Hosted on Acast. See acast.com/privacy for more information.

