PyCon 2019 | Measures and Mismeasures of Algorithmic Fairness

Speaker: Manojit Nandi

 

Within the last few years, researchers have come to understand that machine learning systems can display discriminatory behavior with regard to certain protected characteristics, such as gender or race. To combat these harmful behaviors, researchers have proposed multiple formal definitions of fairness intended to promote equity in machine learning algorithms. In this talk, I will cover these different definitions of algorithmic fairness and discuss both the strengths and limitations of these formalizations. In addition, I will cover other best practices for mitigating the unintended bias of data products.
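As a rough illustration of the kind of formal definition the talk surveys (this sketch is not taken from the talk or its slides), one widely used criterion, demographic parity, asks whether a model's positive-prediction rate is similar across groups defined by a protected attribute. The function and toy data below are hypothetical examples:

import numpy as np

def demographic_parity_gap(y_pred, group):
    """Absolute difference in positive-prediction rates between two groups.

    y_pred: array of 0/1 model predictions.
    group:  array of 0/1 protected-attribute labels.
    A gap near 0 means both groups receive positive predictions at similar rates.
    """
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)
    rate_a = y_pred[group == 0].mean()
    rate_b = y_pred[group == 1].mean()
    return abs(rate_a - rate_b)

# Toy example: predictions for 8 people, 4 in each group.
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
groups = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_gap(preds, groups))  # 0.5 -> group 0 receives far more positive predictions

Other definitions discussed in the fairness literature, such as equalized odds or calibration, condition on the true outcome as well, which is one reason the different criteria can conflict with each other.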

 

Slides can be found at: https://speakerdeck.com/pycon2019 and https://github.com/PyCon/2019-slides
