
PyCon 2019 | Measures and Mismeasures of Algorithmic Fairness

Speaker: Manojit Nandi

 

Within the last few years, researchers have come to understand that machine learning systems may display discriminatory behavior with regard to protected characteristics such as gender or race. To combat these harmful behaviors, we have created multiple definitions of fairness intended to promote equity in machine learning algorithms. In this talk, I will cover these different definitions of algorithmic fairness and discuss both the strengths and limitations of these formalizations. In addition, I will cover other best practices for mitigating the unintended bias of data products.
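
As a rough illustration of what such formal definitions look like in practice, here is a minimal Python sketch (not taken from the talk or its slides) of two commonly used criteria, demographic parity and equal opportunity, evaluated on made-up binary predictions; the function names and toy data below are purely illustrative.

import numpy as np

def demographic_parity_difference(y_pred, group):
    # Gap in positive-prediction rates between group 0 and group 1.
    return abs(y_pred[group == 0].mean() - y_pred[group == 1].mean())

def equal_opportunity_difference(y_true, y_pred, group):
    # Gap in true-positive rates (recall) between group 0 and group 1.
    tpr_0 = y_pred[(group == 0) & (y_true == 1)].mean()
    tpr_1 = y_pred[(group == 1) & (y_true == 1)].mean()
    return abs(tpr_0 - tpr_1)

# Hypothetical labels, model predictions, and group membership.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 1])
group  = np.array([0, 0, 0, 0, 1, 1, 1, 1])

print(demographic_parity_difference(y_pred, group))         # 0.25
print(equal_opportunity_difference(y_true, y_pred, group))  # ~0.33

Note that, in general, different fairness criteria cannot all be satisfied simultaneously, which is part of why the talk weighs each formalization's limitations alongside its strengths.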

 

Slides can be found at: https://speakerdeck.com/pycon2019 and https://github.com/PyCon/2019-slides
