If we train AI surveillance systems on historical footage, such as CCTV recordings or police body-camera video, the biases that already exist in society are likely to be perpetuated.

This process is already taking place in law enforcement, says Meredith Whittaker, co-director of the ethics-focused AI Now Institute at NYU, and it will spread into the private sector. Whittaker gives the example of Axon (formerly Taser), which bought several AI companies to help build video analytics into its products. “The data they have is from police body cams, which tells us a lot about who an individual police officer may profile, but doesn’t give us a full picture,” says Whittaker. “There’s a real danger with this that we are universalizing biased pictures of criminality and crime.”
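To see the mechanism Whittaker is pointing at, consider a minimal toy simulation (not Axon's or any real vendor's pipeline; the group names, rates, and functions below are entirely hypothetical): if two groups behave identically but one is watched far more heavily, the recorded data over-represents that group, and any model trained on those records will score it as higher risk.

```python
# Hypothetical toy simulation of sampling bias in "historical footage" data.
# Both groups have the same true incident rate; only the observation rate differs.
import random

random.seed(0)

GROUPS = ["A", "B"]                       # two hypothetical neighbourhoods
TRUE_RATE = 0.05                          # identical true incident rate in both
OBSERVATION_RATE = {"A": 0.9, "B": 0.3}   # group A is policed/filmed far more

def simulate_records(n=100_000):
    """Generate biased labels: an incident is recorded only if it was observed."""
    records = []
    for _ in range(n):
        group = random.choice(GROUPS)
        incident = random.random() < TRUE_RATE
        observed = random.random() < OBSERVATION_RATE[group]
        records.append((group, int(incident and observed)))
    return records

def naive_model(records):
    """'Train' the simplest possible model: per-group recorded-incident frequency."""
    counts = {g: [0, 0] for g in GROUPS}   # [recorded incidents, total samples]
    for group, label in records:
        counts[group][0] += label
        counts[group][1] += 1
    return {g: inc / total for g, (inc, total) in counts.items()}

print(naive_model(simulate_records()))
# Group A's risk score comes out roughly 3x group B's, despite identical TRUE_RATE:
# the training data encodes who was watched, not who offended.
```

The point of the sketch is that no explicit prejudice needs to be written into the model; the skew in who gets recorded is enough for the system to reproduce it.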
