We all knew facial-recognition technology was flawed, just perhaps not this flawed.
A new study from the National Institute of Standards and Technology, published on Dec. 19, lays out in painstaking detail how facial-recognition tech misidentifies the elderly, the young, women, and people of color at higher rates than white men. In other words, more at-risk populations are also the ones more likely to suffer false matches and any associated legal troubles that follow.
Just how bad is it? Let's let the NIST study authors explain.
"We found false positives to be higher in women than men, and this is consistent across algorithms and datasets," they wrote. "We found elevated false positives in the elderly and in children; the effects were larger in the oldest and youngest, and smallest in middle-aged adults."
And that's not all. "With mugshot images," the authors continued, "the highest false positives are in American Indians, with elevated rates in African American and Asian populations."
Why does this matter? Well, law enforcement uses the technology, and false positives can lead directly to mistaken arrests and harassment.
This study, which presents what its authors describe as "empirical evidence" of demographic differences, is sure to add support to lawmakers' calls to ban the controversial tech.
"We have started to sound the alarm on the way facial recognition technology is expanding in concerning [ways]," wrote congresswoman Alexandria Ocasio-Cortez in July. "From the FBI to ICE to Amazon, the bar for consent and civil liberties protection is repeatedly violated, and on top of it all has a disproportionate racial impact, too."
She now has additional evidence to back up that latter claim.
Importantly, the congresswoman isn't alone in her concern. In a statement published by the Washington Post, Senator Ron Wyden reacted to the NIST findings by stating that "algorithms often carry all the biases and failures of human employees, but with even less judgment."
A growing number of cities, including San Francisco and Berkeley, recently moved to ban some government use of the tech. Perhaps this study will encourage others to follow suit.