AI emotion-detection software tested on Uyghurs


According to the BBC, a camera system that uses artificial intelligence and facial recognition to reveal states of emotion has been tested on Uyghurs in Xinjiang.

According to a software engineer, such systems were installed in police stations throughout the province.

When shown the evidence, a human rights advocate described it as shocking.

The Chinese embassy in London has not responded directly to the allegations, but has stated that political and social rights are guaranteed to all ethnic groups.

Xinjiang is home to 12 million Uyghurs, the majority of whom are Muslim.

The province's citizens are subjected to daily surveillance. Additionally, the area is home to highly contentious "re-education centers," dubbed high-security detention camps by human rights groups, where over a million people are believed to have been held.

Beijing has long argued that surveillance in the region is necessary because separatists seeking to establish their own state have killed hundreds of people in terror attacks.

Because he fears for his safety, the software engineer agreed to speak to the BBC's Panorama programme on the condition of anonymity. Additionally, the name of the company for which he worked is being withheld.

However, he showed Panorama five photographs of Uyghur detainees he claimed had been subjected to an emotion recognition test.

"The Chinese government uses Uyghurs as test subjects for a variety of experiments, just as laboratories use rats," he explained.

Additionally, he described his role in installing the cameras in the province's police stations: "We placed the emotion detection camera three meters away from the subject. It is comparable to a lie detector but utilizes far more sophisticated technology."

He said that officers placed subjects in "restraint chairs," which are widely used in Chinese police stations.

"Your wrists and ankles are restrained by metal restraints."

He demonstrated how the AI system is trained to recognize and analyze even the smallest changes in facial expressions and skin pores.

The software, he claims, generates a pie chart with the red segment representing a negative or anxious state of mind.

He asserted that the software was designed for "pre-judgment in the absence of credible evidence."

The Chinese embassy in London did not respond to questions about the province's use of emotional recognition software, but stated that "all ethnic groups in Xinjiang have full political, economic, and social rights, as well as religious freedom."

"Regardless of ethnic origin, people live in harmony and enjoy a stable and peaceful existence with no restrictions on personal freedom."

The evidence was presented to Sophie Richardson, Human Rights Watch's China director.

"This is harrowing material. It's not just that people are reduced to pie charts; it's also about people who are subjected to extreme coercion, are under enormous pressure, and are understandably nervous, and this is interpreted as an indication of guilt, which I believe is deeply problematic."

Suspicious behaviour

According to Darren Byler of the University of Colorado, Uyghurs are routinely required to provide DNA samples to local officials, submit to digital scans, and download a government-mandated phone app that collects data such as contact lists and text messages.

"Uyghur life is now about data generation," he explained.

"Everyone is aware that the smartphone is something you must carry with you at all times, and that if you do not, you risk being detained; they are also aware that you are being tracked by it. And they believe there is no way out," he explained.

The majority of data is fed into a computer system known as the Integrated Joint Operations Platform, which Human Rights Watch claims flags allegedly suspicious behavior.

"The system is collecting data on dozens of perfectly legal behaviors, such as whether people were exiting through the back door rather than the front, or whether they were filling up a car that did not belong to them," Ms Richardson explained.

"Authorities are now placing QR codes on the outside of people's doors to easily determine who is supposed to be there and who is not."

Orwellian?

There has long been debate over how closely Chinese technology firms are linked to the state. IPVM, a US-based research organization, claims to have discovered evidence in such companies' patent filings that facial recognition products were specifically designed to identify Uyghur people.

Huawei and the China Academy of Sciences filed a patent in July 2018 for a face recognition product that is capable of identifying people based on their ethnic origin.

Huawei responded by stating that it "does not condone the use of technology to discriminate against or oppress members of any community" and that it was "independent of government" in all countries in which it operated.

Additionally, the group discovered a document indicating that the firm was developing technology for a so-called One Person, One File system.

"The government would keep track of each person's personal information, political activities, relationships, and anything else that might provide insight into how that person would behave and what kind of threat they might pose," IPVM's Conor Healy explained.

"It effectively eliminates any possibility of dissidence and provides the government with true predictability regarding the behavior of their citizens. I doubt that [George] Orwell ever imagined a government capable of this level of analysis."

Huawei did not respond directly to questions about its involvement in developing technology for the One Person, One File system, but reiterated its independence from governments in all countries where it operates.

China's embassy in London stated that it was "unaware" of these programs.

IPVM also claimed to have discovered marketing materials from Chinese firm Hikvision promoting an AI camera capable of detecting Uyghurs, as well as a patent for software developed by another Chinese tech giant, Dahua, that could also identify Uyghurs.

Dahua stated that its patent made reference to all 56 recognized ethnic groups in China and did not specifically mention any of them.

It added that it provided "products and services that help keep people safe" and adhered to "all applicable laws and regulations" in the markets in which it operates, including the UK.

Hikvision stated that the information on its website was incorrect and was "uploaded online without appropriate review," adding that the company did not sell or offer "a minority recognition function or analytics technology" in its product line.

Dr Lan Xue, chairman of China's National Committee on Artificial Intelligence Governance, stated that he was unaware of the patents.

"There are numerous charges of this nature outside of China. Numerous reports are inaccurate and untrue," he told the BBC.

"I believe that the Xinjiang local government owed it to the Xinjiang people to truly protect them... if technology is used in those contexts, that is entirely understandable," he said.

The Chinese embassy in the United Kingdom offered a more robust defense, telling the BBC: "There is no such thing as so-called facial recognition technology that incorporates Uyghur analytics."

Pervasive surveillance

China is believed to be home to approximately half of the world's nearly 800 million surveillance cameras.

Additionally, it has a large number of smart cities, such as Chongqing, where AI is integrated into the urban environment's foundations.

Hu Liu, a Chongqing-based investigative journalist, described his own experience to Panorama: "Once you leave your house and enter the lift, you are captured by a camera. Everywhere there are cameras."

"When I leave the house to go somewhere, I call a taxi, and the taxi company transmits the data to the government. I may then go to a cafe to meet a few friends, and the authorities will be informed of my location via the cafe's camera.

"There have been instances when I've met some friends and shortly thereafter been contacted by someone from the government. 'Do not see that person, do not do this and that,' they warned me.

"With artificial intelligence, there is no place for us to hide," he explained.
