
Face recognition researcher fights Amazon over biased AI

News


2019-04-04 04:20 | Last updated at 04:30

Facial recognition technology was already seeping into everyday life — from your photos on Facebook to police scans of mugshots — when Joy Buolamwini noticed a serious glitch: Some of the software couldn't detect dark-skinned faces like hers.

That revelation sparked the Massachusetts Institute of Technology researcher to launch a project that's having an outsize influence on the debate over how artificial intelligence should be deployed in the real world.

In this Wednesday, Feb. 13, 2019, photo, Massachusetts Institute of Technology facial recognition researcher Joy Buolamwini stands for a portrait at the school, in Cambridge, Mass. Buolamwini's research has uncovered racial and gender bias in facial analysis tools sold by companies such as Amazon that have a hard time recognizing certain faces, especially darker-skinned women. Buolamwini holds a white mask she had to use so that software could detect her face. (AP Photo/Steven Senne)

In this Wednesday, Feb. 13, 2019, photo, Massachusetts Institute of Technology facial recognition researcher Joy Buolamwini stands for a portrait behind a mask at the school, in Cambridge, Mass. Buolamwini's research has uncovered racial and gender bias in facial analysis tools sold by companies such as Amazon that have a hard time recognizing certain faces, especially darker-skinned women. Buolamwini holds a white mask she had to use so that software could detect her face. (AP Photo/Steven Senne)

In this Feb. 22, 2019, photo, Washington County Sheriff's Office Deputy Jeff Talbot demonstrates how his agency used facial recognition software to help solve a crime at their headquarters in Hillsboro, Ore. The image on the left shows a man whose face was captured on a surveillance camera and investigators used the software to scan their database of past mug shots to match that facial image with an identity. (AP Photo/Gillian Flaccus)

Her tests on software created by brand-name tech firms such as Amazon uncovered much higher error rates in classifying the gender of darker-skinned women than for lighter-skinned men.

Along the way, Buolamwini has spurred Microsoft and IBM to improve their systems and irked Amazon, which publicly attacked her research methods. On Wednesday, a group of AI scholars, including a winner of computer science's top prize, launched a spirited defense of her work and called on Amazon to stop selling its facial recognition software to police.

Her work has also caught the attention of political leaders in statehouses and Congress and led some to seek limits on the use of computer vision tools to analyze human faces.

"There needs to be a choice," said Buolamwini, a graduate student and researcher at MIT's Media Lab. "Right now, what's happening is these technologies are being deployed widely without oversight, oftentimes covertly, so that by the time we wake up, it's almost too late."

Buolamwini is hardly alone in expressing caution about the fast-moving adoption of facial recognition by police, government agencies and businesses from stores to apartment complexes. Many other researchers have shown how AI systems, which look for patterns in huge troves of data, will mimic the institutional biases embedded in the data they are learning from. For instance, if AI systems are developed using images of mostly white men, the systems will work best in recognizing white men.
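The dynamic those researchers describe can be seen in a toy simulation. This is a pure-Python sketch with entirely invented numbers, not any vendor's pipeline: a "detector" whose trigger threshold is tuned on training data dominated by one group ends up detecting that group far more reliably than the other.

```python
import random

random.seed(0)

# Toy sketch of data-skew bias; every number here is invented.
# Each "face" is one synthetic feature value; the detector fires when
# the feature clears a threshold learned from the training data.

def sample(group, n):
    # Hypothetical feature distributions: group A centers at 0.7, group B at 0.4.
    center = 0.7 if group == "A" else 0.4
    return [random.gauss(center, 0.1) for _ in range(n)]

# Skewed training set: 95% group A, 5% group B.
train = sample("A", 95) + sample("B", 5)
mean = sum(train) / len(train)
std = (sum((x - mean) ** 2 for x in train) / len(train)) ** 0.5
threshold = mean - 2 * std  # tuned almost entirely by group A's statistics

def detection_rate(faces):
    return sum(f >= threshold for f in faces) / len(faces)

rate_a = detection_rate(sample("A", 1000))
rate_b = detection_rate(sample("B", 1000))
print(f"group A detected: {rate_a:.0%}, group B detected: {rate_b:.0%}")
```

Rebalancing the training set pulls the two rates together, which is essentially why more diverse image datasets improve these systems.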

Those disparities can sometimes be a matter of life or death: One recent study of the computer vision systems that enable self-driving cars to "see" the road shows they have a harder time detecting pedestrians with darker skin tones.

What's struck a chord about Buolamwini's work is her method of testing the systems created by well-known companies: she measures how they perform on faces grouped by a skin-tone scale used by dermatologists, then names and shames those that show racial and gender bias. Buolamwini, who's also founded a coalition of scholars, activists and others called the Algorithmic Justice League, has blended her scholarly investigations with activism.

"It adds to a growing body of evidence that facial recognition affects different groups differently," said Shankar Narayan, of the American Civil Liberties Union of Washington state, where the group has sought restrictions on the technology. "Joy's work has been part of building that awareness."

Amazon, whose CEO, Jeff Bezos, she emailed directly last summer, has responded by aggressively taking aim at her research methods.

A Buolamwini-led study published just over a year ago found disparities in how facial-analysis systems built by IBM, Microsoft and the Chinese company Face++ classified people by gender. Darker-skinned women were the most misclassified group, with error rates of up to 34.7%. By contrast, the maximum error rate for lighter-skinned males was less than 1%.

The study called for "urgent attention" to address the bias.
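The kind of subgroup disparity the study reports comes from a straightforward disaggregation: group each prediction by skin type and true gender, then take the misclassification fraction within each group. A minimal sketch follows; the rows below are invented for illustration and are not the study's data.

```python
from collections import defaultdict

# Hypothetical records: (skin_type, true_gender, predicted_gender).
# Invented for illustration; not the study's dataset.
records = [
    ("darker", "F", "M"), ("darker", "F", "F"), ("darker", "F", "M"),
    ("darker", "M", "M"), ("darker", "M", "M"),
    ("lighter", "F", "F"), ("lighter", "F", "F"),
    ("lighter", "M", "M"), ("lighter", "M", "M"), ("lighter", "M", "M"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for skin, truth, pred in records:
    key = (skin, truth)
    totals[key] += 1
    errors[key] += (truth != pred)  # True counts as 1, False as 0

for key in sorted(totals):
    rate = errors[key] / totals[key]
    print(f"{key[0]:7s} {key[1]}: error rate {rate:.1%}")
```

Reporting only an aggregate accuracy would hide exactly the gap this per-group breakdown exposes.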

"I responded pretty much right away," said Ruchir Puri, chief scientist of IBM Research, describing an email he received from Buolamwini last year.

Since then, he said, "it's been a very fruitful relationship" that informed IBM's unveiling this year of a new 1 million-image database for better analyzing the diversity of human faces. Previous systems have been overly reliant on what Buolamwini calls "pale male" image repositories.

Microsoft, which had the lowest error rates, declined comment. Messages left with Megvii, which owns Face++, weren't immediately returned.

Months after her first study, when Buolamwini worked with University of Toronto researcher Inioluwa Deborah Raji on a follow-up test, all three companies showed major improvements.

But this time they also added Amazon, which has sold the system it calls Rekognition to law enforcement agencies. The results, published in late January, showed Amazon badly misidentifying darker-hued women.

"We were surprised to see that Amazon was where their competitors were a year ago," Buolamwini said.

Amazon dismissed what it called Buolamwini's "erroneous claims" and said the study confused facial analysis with facial recognition, improperly measuring the former with techniques for evaluating the latter.

"The answer to anxieties over new technology is not to run 'tests' inconsistent with how the service is designed to be used, and to amplify the test's false and misleading conclusions through the news media," Matt Wood, general manager of artificial intelligence for Amazon's cloud-computing division, wrote in a January blog post. Amazon declined requests for an interview.

"I didn't know their reaction would be quite so hostile," Buolamwini said recently in an interview at her MIT lab.

Coming to her defense Wednesday was a coalition of researchers, including AI pioneer Yoshua Bengio, a recent winner of the Turing Award, considered the tech field's version of the Nobel Prize.

They criticized Amazon's response, especially its distinction between facial recognition and analysis.

"In contrast to Dr. Wood's claims, bias found in one system is cause for concern in the other, particularly in use cases that could severely impact people's lives, such as law enforcement applications," they wrote.

Amazon's few publicly known clients have defended its system.

Chris Adzima, senior information systems analyst for the Washington County Sheriff's Office in Oregon, said the agency uses Amazon's Rekognition to identify the most likely matches among its collection of roughly 350,000 mug shots. But because a human makes the final decision, "the bias of that computer system is not transferred over into any results or any action taken," Adzima said.

But increasingly, regulators and legislators are having their doubts. A bipartisan bill in Congress seeks limits on facial recognition. Legislatures in Washington and Massachusetts are considering laws of their own.

Buolamwini said a major message of her research is that AI systems need to be carefully reviewed and consistently monitored if they're going to be used on the public. Not just to audit for accuracy, she said, but to ensure face recognition isn't abused to violate privacy or cause other harms.

"We can't just leave it to companies alone to do these kinds of checks," she said.

Associated Press writer Gillian Flaccus contributed to this report from Hillsboro, Oregon.
