Government study finds racial, gender bias in facial recognition software

Many facial recognition systems misidentify people of color at higher rates than white people, according to a federal study released Thursday.

The research from the National Institute of Standards and Technology (NIST), a federal agency within the Department of Commerce, comes amid pushback from lawmakers and civil rights groups against the software, which scans faces to quickly identify individuals.

After reviewing 189 algorithms from 99 developers, which NIST said represents a majority of the industry, the researchers found that in one-to-one matching, the type of search typically used to verify an identity, Asian and African American people were up to 100 times more likely to be misidentified than white men.