Practical Fairness

Achieving Fair and Secure Data Models

O'Reilly

15 minute read
6 takeaways
Audio and text

Overview

Software that learns from data increasingly influences daily life, raising profound questions about its assessments and decisions.


Editorial Rating

9

Qualities

  • Scientific
  • Concrete Examples
  • For Experts

Recommendation

Machine learning (ML) is becoming ubiquitous. This branch of artificial intelligence works by teaching a computer program what correct output looks like. Such a powerful method raises questions about fair outcomes for the people ML affects. Software engineer and attorney Aileen Nielsen examines different kinds of fairness and how training data and algorithms can promote them. For those developing machine learning models, she provides useful examples in Python.

Summary

Fairness is about who gets what, and how that’s decided. 

Every new technology creates victims along with progress. Information technology improves life but can prey on users’ time and attention or become a tool for nefarious purposes. Often, unfairness takes the form of violations of community norms. This happens when large-scale use of a technology, from drones to bots targeting dating apps, becomes a nuisance.

Software developers should pay attention to the differences between equity and equality, and between security and privacy. Ignoring fairness exposes companies to legal trouble and consumer backlash. Laws in the United States, Europe and China set standards for some of these aspects. However, a fairness mandate need not dampen innovation; it can stimulate ideas in mathematics, computer science and law.

People tend to prefer equity over equality. Equity implies that people should not receive different treatment because they belong to a certain group (direct discrimination), nor should a specific group disproportionately enjoy or suffer the consequences of a decision (indirect discrimination).
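
As a rough illustration of indirect discrimination, and not an example taken from the book, the Python sketch below measures the gap in favorable-outcome rates between two groups (a common "demographic parity" check). The function name, variable names and toy data are invented for illustration only.

    import numpy as np

    def demographic_parity_gap(y_pred, group):
        """Absolute difference in positive-prediction rates between two groups.

        A large gap can signal indirect discrimination: the model never uses
        the group label directly, yet one group receives favorable outcomes
        far more often than the other.
        """
        y_pred = np.asarray(y_pred)
        group = np.asarray(group)
        rate_a = y_pred[group == 0].mean()  # favorable-outcome rate, group 0
        rate_b = y_pred[group == 1].mean()  # favorable-outcome rate, group 1
        return abs(rate_a - rate_b)

    # Toy loan-approval predictions for eight applicants in two groups.
    approved = [1, 1, 0, 1, 0, 0, 1, 0]
    group    = [0, 0, 0, 0, 1, 1, 1, 1]
    print(demographic_parity_gap(approved, group))  # 0.5 -> a sizable disparity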

But equity ...

About the Author

Software engineer and lawyer Aileen Nielsen combines work at a deep learning start-up with a fellowship in law and technology at ETH Zürich.

