May 2020 – The U.S. National Institute of Standards and Technology (NIST) recently analyzed 189 software algorithms from 99 developers and found that most of them show differences in accuracy across sex, age and ethnicity.
NIST published the report as part of its facial recognition study. The study tested the algorithms on two tasks: one-to-one matching, confirming that a photo matches a different photo of the same person in a database, and one-to-many matching, determining whether the person in a photo has any match in a database.
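The difference between the two tasks can be sketched in simplified form, assuming face images have already been converted to numerical embedding vectors. All names, values and the similarity threshold below are illustrative assumptions, not details from the NIST study:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(probe, reference, threshold=0.8):
    """One-to-one matching: do two photos show the same person?"""
    return cosine_similarity(probe, reference) >= threshold

def identify(probe, database, threshold=0.8):
    """One-to-many matching: does the probe match anyone in the database?

    Returns the best-matching identity, or None if no score clears
    the threshold.
    """
    best_id, best_score = None, -1.0
    for person_id, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None
```

In this toy setup, `verify` answers a yes/no question about a single pair of photos, while `identify` searches an entire gallery, which is why one-to-many matching has many more opportunities to produce a false positive.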
NIST also measured the algorithms' false positive and false negative rates. Four photo collections containing 18.27 million images of 8.49 million people were drawn from U.S. databases provided by the State Department, the Department of Homeland Security, and the FBI. The photos were accompanied by information on sex, age, and race or country of birth.
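A minimal sketch of how such error rates are computed from match-trial counts may help fix the terms. The counts in the usage example are invented purely for illustration and have nothing to do with NIST's measurements:

```python
def error_rates(false_pos, false_neg, true_pos, true_neg):
    """Compute false positive and false negative rates from match-trial counts.

    A false positive: the algorithm wrongly declares two different
    people to be a match. A false negative: the algorithm fails to
    match two photos of the same person.
    """
    # Share of impostor pairs (different people) wrongly accepted.
    fpr = false_pos / (false_pos + true_neg)
    # Share of genuine pairs (same person) wrongly rejected.
    fnr = false_neg / (false_neg + true_pos)
    return fpr, fnr

# Hypothetical counts: 5 impostor pairs accepted out of 1,000,
# 20 genuine pairs rejected out of 1,000.
fpr, fnr = error_rates(false_pos=5, false_neg=20,
                       true_pos=980, true_neg=995)
```

The two rates matter differently by application: a false positive in a one-to-many search can put an innocent person on a candidate list, while a false negative merely means a genuine match is missed.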
The study highlighted many specific findings across the algorithms. In one-to-one matching, Asian and African American faces had higher false positive rates than Caucasian faces. For algorithms developed in the United States, similarly elevated false positive rates appeared in one-to-one matching for Asians, African Americans and native groups, with the highest false positive rates found in the American Indian population.
“The error rate in facial recognition algorithms carries different weight depending on the application,” Watson says. Desmond Patton has worked on image-based analysis. He says, “algorithms face extreme challenges in detecting context, such as how behavior changes in different places, and what clothing or hand gestures might mean in different situations.”
Patton adds, “Oftentimes the goal is to have the most accurate system, but the most accurate system can also be weaponized against the community it is intended to help. We blindly trust these systems, and repeatedly, when we apply them to the real world, we quickly realize that they have limits. We’re trusting these systems far beyond their capacities.”