Facial Recognition Bias Exposed

A landmark new study has revealed that facial recognition technologies misidentify people from racial and ethnic minority backgrounds at far higher rates than white people.

The study – which analysed technologies from Intel, Microsoft, Panasonic, SenseTime, and Vigilant Solutions – found that Asian and African American people were up to 100 times more likely to be falsely identified than white men.

Women were also more likely to be misidentified than men, with Native Americans experiencing the highest rates of false identification.

The findings are problematic for law enforcement agencies and politicians alike, who have argued that facial recognition technologies are an efficient and effective tool for identifying suspects.

The sweeping study, carried out by the National Institute of Standards and Technology (NIST) in the US, found that the software falsely identified people of colour and women at rates 10 to 100 times higher, depending on the algorithm used.

‘While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,’ Patrick Grother, a NIST computer scientist and the report’s primary author, said.

‘While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.’

The study evaluated facial recognition algorithms submitted by industry and academic developers on their capability to perform different tasks.

In evaluating the algorithms, the NIST team used photographs from databases provided by the State Department, the Department of Homeland Security and the FBI. Overall, 18.27 million images of 8.49 million people were assessed.
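To illustrate the kind of demographic differential the study measures, here is a minimal sketch of comparing false-match rates across groups. The data, group names, and threshold below are entirely hypothetical and are not NIST's actual methodology or results; the sketch only shows how such a rate comparison can be computed in principle.

```python
# Illustrative sketch only: all scores and groups below are made up.
# A "false match" is an impostor comparison (two different people)
# whose similarity score clears the acceptance threshold.

def false_match_rate(scores, threshold):
    """Fraction of impostor comparisons scoring at or above the threshold."""
    if not scores:
        return 0.0
    return sum(s >= threshold for s in scores) / len(scores)

# Hypothetical impostor similarity scores (0..1) for two demographic groups.
impostor_scores = {
    "group_a": [0.12, 0.31, 0.08, 0.22, 0.95, 0.91],
    "group_b": [0.10, 0.05, 0.18, 0.07, 0.09, 0.92],
}

threshold = 0.9  # a comparison at or above this score is accepted as a match

rates = {g: false_match_rate(s, threshold)
         for g, s in impostor_scores.items()}

# A demographic differential shows up as one group's false-match rate
# being a multiple of another's; the NIST study reported factors of
# 10 to 100 for some algorithms.
ratio = rates["group_a"] / rates["group_b"]
```

With the toy numbers above, `group_a` produces twice the false-match rate of `group_b`; in the real evaluation, such ratios are computed over millions of comparisons per group.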

Amazon did not submit its Rekognition algorithm for the study.


The research results come at an inconvenient time for Australian politicians, as some suggest the use of facial recognition technologies to stop children watching online pornography.

Current laws do not stop people under the age of 18 from viewing pornographic material, but the federal government is considering proposals that would require people to prove their age before gaining access to explicit sites.

Peter Dutton’s Department of Home Affairs is at the forefront of this suggestion, but it has been met with backlash amid concerns over privacy and surveillance.

