China tests AI ‘emotion-detection’ software on Uighur Muslims

The revelation has come as a shock to many

Maroosha Muzaffar
Wednesday 26 May 2021 08:43 EDT
A member of the Uighur community holds a placard as she joins a demonstration to call on the British parliament to vote to recognise alleged persecution of China's Muslim minority people as genocide and crimes against humanity in London on April 22, 2021 (Photo by JUSTIN TALLIS/AFP via Getty Images)

China has been testing facial recognition and artificial intelligence camera systems on Uighur Muslims in the Xinjiang region to detect their emotions, a software engineer has revealed.

Speaking to the BBC’s Panorama, the software engineer, who asked to remain anonymous, said he had installed these systems in police stations across Xinjiang.

China has long maintained that surveillance of the region is necessary because separatists seeking their own state have killed hundreds of people in attacks. Xinjiang, home to at least 12 million ethnic minority Uighurs, most of whom are Muslim, has seen massive human rights violations and poor treatment of Uighurs. China has also set up “re-education centres” for Uighurs in the area, which have been criticised over allegations of human rights abuses, mistreatment, rape and torture.

The revelation has shocked many. The Chinese embassy in London maintained that “political and social rights of all ethnic groups are guaranteed” and that “People live in harmony regardless of their ethnic backgrounds and enjoy a stable and peaceful life with no restriction to personal freedom.”

Fearing for his safety, the software engineer also declined to name the company he worked for. He did, however, show photographs of five Uighurs on whom he claimed the government had tested the facial recognition system.

He told the BBC’s Panorama: “The Chinese government use Uighurs as test subjects for various experiments just like rats are used in laboratories.”

“We placed the emotion detection camera 3m from the subject. It is similar to a lie detector but far more advanced technology,” he said.

He also described his role in installing the cameras in police stations in the region.

In Xinjiang, police officers use restraint chairs in which a subject’s wrists and ankles are locked in place by metal restraints, the engineer said. He described how the AI is trained to recognise and analyse “even minute changes in facial expressions and skin pores”.

According to his claims, the software then generates a pie chart, with the red segment indicating a negative or anxious state of mind.

Sophie Richardson, China director of Human Rights Watch, who saw the evidence provided by the software engineer, said: “It is shocking material. It’s not just that people are being reduced to a pie chart, it’s people who are in highly coercive circumstances, under enormous pressure, being understandably nervous and that’s taken as an indication of guilt, and I think, that’s deeply problematic.”

In July 2018, Huawei and the Chinese Academy of Sciences filed a patent describing a facial recognition product capable of identifying people on the basis of their ethnicity. Huawei has said that it does “not condone the use of technology to discriminate or oppress members of any community” and that it is “independent of government” wherever it operates.
