Biometric Mirror: Reflecting the Imperfections in AI

By: Joseph Sanford


When we see someone for the first time, we make internal snap judgements about them. We can't help it; we're just judgmental like that. After looking at the person for just a few seconds, we might note their gender, race, and age, or decide whether we think they're attractive, trustworthy, or kind.


A new technology has me both fascinated and afraid for the future: a device called Biometric Mirror, currently in development. It scans "a user's physical and personality traits based on a photo of them," then rates and categorizes attributes such as gender, age, ethnicity, happiness, and attractiveness.


To use Biometric Mirror, a person just has to stand in front of the system for a few seconds. It quickly scans their face and then lists their perceived characteristics on a screen. The AI then asks the person to consider how they'd feel if it shared that information with others. How would they feel if they didn't get a job because the AI rated them low on trustworthiness? Or if law enforcement officials decided to target them because they ranked highly for aggression?
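The interaction described above can be sketched in a few lines of Python. This is a hypothetical illustration of the flow only: the trait names, scores, and function names are my assumptions, not the real system's code or output.

```python
# Hypothetical sketch of the Biometric Mirror interaction flow:
# scan a face, look up perceived traits, and show them back with a
# reflective prompt. All values here are illustrative placeholders.

def scan_face(photo: str) -> dict:
    """Stand-in for the facial-analysis step: returns *perceived*
    (not actual) traits as scores from 0.0 to 1.0."""
    # In a real system these would come from a trained model;
    # here they are fixed placeholder values.
    return {
        "happiness": 0.72,
        "trustworthiness": 0.35,
        "aggression": 0.81,
    }

def render_report(traits: dict) -> str:
    """Format the perceived traits plus the system's reflective question."""
    lines = [f"{name}: {score:.2f}" for name, score in sorted(traits.items())]
    lines.append("How would you feel if this were shared with others?")
    return "\n".join(lines)

print(render_report(scan_face("photo.jpg")))
```

The point of the design, mirrored in this sketch, is that the system reports perceptions rather than facts, then turns the question back on the user.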


The AI's ratings for the more subjective categories draw on a database of how 10,000 people rated faces for traits like attractiveness. While this is an interesting development, there are many dangerous avenues that could be explored with a machine that identifies race and rates attractiveness. What do you think about a computer that identifies and rates you?
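The crowd-sourced scoring idea described above amounts to averaging many human judgements into a single number. A minimal sketch, assuming simple averaging (the real system's aggregation method is not specified here), with illustrative sample data:

```python
# Minimal sketch: a subjective trait score as the average of many
# individual human ratings. Data and scale (0-10) are assumptions.
from statistics import mean

def crowd_score(ratings: list[float]) -> float:
    """Average individual human ratings into one crowd score."""
    return round(mean(ratings), 2)

# e.g. a handful of the ratings people might give one perceived trait
sample_ratings = [6.0, 7.5, 5.0, 8.0, 6.5]
print(crowd_score(sample_ratings))  # → 6.6
```

Note what the averaging hides: the output looks like an objective measurement, but it is only a summary of subjective opinions, which is exactly the concern the article raises.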