Your Boss Wants to Spy on Your Inner Feelings
Tech companies now use AI to analyze your feelings in job interviews and public spaces. But the software is prone to racial, cultural and gender bias
Scientific American, 2021
© 2021 Scientific American, a division of Springer Nature America, Inc. All rights reserved.
Recommendation
Cameras and sensors that feed vocal tones, facial expressions and body-language data into elaborate “emotion AI” systems are proliferating in today’s society. They are used by marketers, employers, schools and medical institutions. Although the technology has existed for decades, its recent rapid and often unnoticed spread raises concerns because of the gender, racial, cultural and other biases baked into AI training data.
Summary
About the Author
John McQuaid is a journalist and author. He reported this story while a fellow at the Woodrow Wilson International Center for Scholars in Washington, DC. He is currently a PhD student at the University of Maryland’s Merrill College of Journalism.
Related Skills
AI Transformation
Human Resources
Leverage AI for Customer Experience
Harness AI for Marketing
Marketing
Use AI for Workforce Planning
Use AI for Skills Matching
Leverage AI for HR
Use AI for Talent Sourcing
Use AI for Sentiment Analysis
Use AI for Candidate Screening