Huawei Accused Of Testing Racist Facial Recognition Software To Help China Track Muslims
In another black mark against Huawei’s name, the Chinese tech titan is now being accused of testing a facial recognition tool that could identify oppressed Uighur Muslims and alert authorities to their presence.
According to surveillance researchers, Huawei worked with Megvii, one of China’s largest artificial intelligence companies, to validate the “Uighur alarms” as Beijing carried out a repression campaign against the ethnic minority group.
Surveillance research group IPVM unearthed a report from January 2018 which listed the Uighur tool as among the “basic functions” of a facial recognition system powered by both Huawei cameras and Megvii’s software.
The report, which has since been deleted, revealed that the program can identify a person’s ethnicity as part of its “face attribute analysis”.
“Huawei and Megvii’s collaboration on Uighur alarms further proves that many large Chinese video surveillance/face recognition companies are deeply implicated in Uighur repression,” IPVM said in its Tuesday report. “Anyone doing business with these firms should take note.”
Huawei has since claimed the face recognition tool was only tested and “has not seen real-world application”, according to The New York Post.
“Huawei only supplies general-purpose products for this kind of testing,” the company told IPVM. “We do not provide custom algorithms or applications.”
This latest controversy is one of many the Shenzhen-based company has faced in recent years.
Concerns have mounted over security vulnerabilities in Huawei’s equipment, and the company has been barred from 5G rollouts in the US, UK, Australia and a number of European countries.
In late October, Swedish regulators also banned the use of Huawei telecom equipment in the country’s 5G networks.