Algorithms were supposed to make our lives easier and fairer: help us find the best job applicants, help judges impartially assess the risks of bail and bond decisions, and ensure that health care is ...
In 1998, I unintentionally created a racially biased artificial intelligence algorithm. There are lessons in that story that resonate even more strongly today. The systems often fail on women of color ...
Algorithms are a staple of modern life. People rely on algorithmic recommendations to wade through deep catalogs and find the best movies, routes, information, products, people and investments.
For more than a decade, journalists and researchers have been writing about the dangers of relying on algorithms to make weighty decisions: who gets locked up, who gets a job, who gets a loan — even ...
AI is increasingly finding its way into healthcare decisions, from diagnostics to treatment planning to robotic surgery. As I’ve written about in this newsletter many times, AI is sweeping the ...
Data, numbers, algorithms are supposed to be neutral... right? Computer scientist Joy Buolamwini discusses the way biased algorithms can lead to real-world inequality. A version of this segment was ...
This week, the Department of Justice settled a lawsuit with Meta, Facebook’s parent company, over the use of algorithms the government said were discriminatory, serving housing advertisements to ...
Part 2, Digital Inequality Series: Under what conditions can artificial intelligence benefit all of society vs. just a few people? Kalinda Ukanwa, a quantitative marketing scholar at the University of ...
New research by Questrom’s Carey Morewedge shows that people recognize more of their biases in algorithms’ decisions than they do in their own — even when those decisions are the same. Algorithms were ...