A vision of what life might be like under widespread emotion recognition can be found in China, says Shazeda Ahmed, a researcher at the AI Now Institute who recently co-wrote a report on the technology’s dire implications for human rights in the country. Ahmed found applications ranging from the relatively benign, such as analysing a motorist’s face for signs of distress to improve their driving, to more troubling uses, including monitoring classrooms for signs of student misbehaviour and lie detection during police interrogations. In other cases, the technology simply served as another means of mass surveillance. “There were some examples we found of malls and other retail centres in China, where it is being used, and on hundreds, potentially thousands of people a day,” says Ahmed.