
Facing gender bias in facial recognition technology
2020-08-27 05:00

Despite all of the advancements we've seen, many organizations still rely on the same algorithm used by Bledsoe's database, known as "k-nearest neighbors" or k-NN. Since each face is described by multiple coordinates, comparing these distances across millions of facial images requires significant data processing.
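To make that concrete, here is a minimal sketch of k-NN matching over face coordinate vectors, assuming each face has already been reduced to a fixed-length vector (the feature-extraction step is out of scope here). The gallery, labels, and query values are purely illustrative, not from any real system.

```python
import numpy as np

def knn_match(query, gallery, labels, k=3):
    """Return the majority label among the k gallery faces closest to query.

    query   : 1-D array of facial feature coordinates for the probe face
    gallery : 2-D array, one row of coordinates per enrolled face
    labels  : identity label for each gallery row
    """
    # Euclidean distance from the probe face to every enrolled face.
    dists = np.linalg.norm(gallery - query, axis=1)
    # Indices of the k smallest distances.
    nearest = np.argsort(dists)[:k]
    # Majority vote among the k nearest neighbors.
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)

# Toy example: three enrolled faces described by four coordinates each.
gallery = np.array([[0.1, 0.9, 0.4, 0.2],
                    [0.8, 0.1, 0.3, 0.7],
                    [0.2, 0.8, 0.5, 0.1]])
labels = ["alice", "bob", "alice"]
print(knn_match(np.array([0.15, 0.85, 0.45, 0.15]), gallery, labels, k=3))
```

Because every query is compared against every enrolled face, this brute-force approach scales linearly with gallery size, which is why the distance comparisons become expensive at the scale of millions of images.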

Facial recognition also involves finding the location of a feature on a face before evaluating it.
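As an illustration of that localization step, the sketch below uses the dlib library's face detector and its pretrained 68-point landmark model. The model file and input image paths are assumptions (dlib distributes shape_predictor_68_face_landmarks.dat as a separate download), and this is one common approach, not necessarily the one any particular vendor uses.

```python
import dlib

# Face bounding-box detector plus a pretrained 68-point landmark model.
# NOTE: the .dat file is an assumption; dlib ships it as a separate download.
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

image = dlib.load_rgb_image("face.jpg")  # hypothetical input image

# Step 1: locate each face; step 2: locate each feature point within it.
for face_rect in detector(image):
    landmarks = predictor(image, face_rect)
    # Landmark 30 is the nose tip in the 68-point scheme.
    nose = landmarks.part(30)
    print(f"nose tip at ({nose.x}, {nose.y})")
```

Only after features have been located this way can they be compared between faces, which is why errors in localization propagate into the matching step.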

While this may not be a significant problem when matching faces for social media platforms, it can be far more damaging when the facial recognition software from Amazon, Google, Clearview AI and others is used by government agencies and law enforcement.

A smartphone relying on face recognition could wrongly block its owner's access, a police officer using facial recognition software could mistakenly identify an innocent bystander as a criminal, or a government agency might call in the wrong person for questioning based on a false match.

For the data set, we used a database of faces called Labeled Faces in the Wild, and we investigated only faces that matched another face in the database.
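Labeled Faces in the Wild is a public benchmark, and scikit-learn can fetch it directly. Restricting the download to people with at least two images is a sketch of the "matched another face" filter described above, not necessarily the authors' exact pipeline.

```python
from sklearn.datasets import fetch_lfw_people

# Labeled Faces in the Wild, keeping only people who appear at least twice,
# i.e. faces that can match another face in the database. (An approximation
# of the filter described above, not necessarily the authors' exact setup.)
lfw = fetch_lfw_people(min_faces_per_person=2, resize=0.5)

print(lfw.images.shape)        # (n_samples, height, width) grayscale images
print(len(lfw.target_names))   # number of distinct identities retained
```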


News URL

http://feedproxy.google.com/~r/HelpNetSecurity/~3/hO50nXrCpPA/