From the ancient Greeks to the 18th century, history offers examples of people practising physiognomy: judging a person's character or lifestyle from their facial features. A recent study from Stanford University gives us a modern-day version to contemplate: computers determining whether a person is gay or straight through facial-detection technology.
Deep Neural Networks Used to Determine Sexual Orientation in Study
Yilun Wang and Michael Kosinski’s study took more than 35,000 facial images of men and women that were publicly available on a U.S. dating website and found that a computer algorithm distinguishing between straight and gay men was correct 81% of the time, and 74% of the time for women. Accuracy improved to 91% when the computer evaluated five images per person. Humans who looked at the same photos were accurate only 61% of the time.
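The jump from 81% to 91% with five images reflects a general pattern: aggregating several noisy judgments of the same person tends to beat a single one. As a back-of-the-envelope check (not the study's actual method), if five judgments were fully independent and each correct 81% of the time, a simple majority vote would be right about 95% of the time; the observed 91% is lower, plausibly because photos of the same person are correlated:

```python
from math import comb

p = 0.81  # single-image accuracy for men, from the study

# Probability that at least 3 of 5 independent judgments are correct
# (binomial majority vote) -- an idealised upper bound, since real
# photos of one person are not independent samples.
p_majority = sum(comb(5, k) * p**k * (1 - p) ** (5 - k) for k in range(3, 6))
print(f"idealised 5-image majority-vote accuracy: {p_majority:.3f}")  # ~0.949
```

The gap between this idealised 95% and the reported 91% is a reminder that extra images of the same face add less new information than truly independent observations would.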
One pattern the machines detected in the study was that gay women and men typically had “gender-atypical” features, expressions and grooming styles—gay men appeared more feminine and gay women appeared more masculine. Another trend the machines identified was that gay women tended to have larger jaws and smaller foreheads than straight women, while gay men had larger foreheads, longer noses and narrower jaws than straight men. The researchers found that, in determining a person’s sexuality, the computers paid most attention to the neckline, mouth corners, hair and nose for women, and to the chin, eyes, eyebrows, nose, cheeks and hairline for men.
Wang and Kosinski used VGG-Face, an existing deep neural network originally trained for facial recognition by learning to spot patterns in a sample of 2.6 million images. A neural network is a set of algorithms, loosely modelled on the human brain, designed to recognise patterns in large datasets and thereby classify data. Similar AI systems could be trained to spot other human traits such as IQ or political views, Dr. Kosinski suggests.
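The underlying pattern here is transfer learning: a large pretrained network reduces each face to a feature vector (an "embedding"), and a much simpler classifier is then trained on those vectors. The sketch below illustrates that two-stage idea only; it is not the authors' code. VGG-Face is stood in for by a fixed random projection, and all images and labels are synthetic:

```python
# Minimal sketch of the transfer-learning pattern: a fixed "pretrained"
# feature extractor plus a cheap downstream classifier. Everything here
# is synthetic -- the projection stands in for VGG-Face, which is a deep
# non-linear network trained on 2.6 million real images.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def embed(images, projection):
    # Flatten each image and map raw pixels to an embedding vector.
    return images.reshape(len(images), -1) @ projection

n, h, w = 200, 8, 8                          # 200 tiny synthetic "images"
projection = rng.normal(size=(h * w, 128))   # stand-in feature extractor
images = rng.normal(size=(n, h, w))
labels = (images.mean(axis=(1, 2)) > 0).astype(int)  # synthetic labels

embeddings = embed(images, projection)       # shape (200, 128)

# The downstream classifier is cheap to train because the heavy
# pattern-recognition work is delegated to the pretrained extractor.
clf = LogisticRegression(max_iter=1000).fit(embeddings, labels)
accuracy = clf.score(embeddings, labels)
print(f"training accuracy on synthetic data: {accuracy:.2f}")
```

This separation is what makes studies like this one cheap to run: the expensive part (training the deep network) is done once by someone else, and repurposing it for a new classification task needs only modest data and compute.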
Once the study was published in the Journal of Personality and Social Psychology, it began to raise concerns about the potential for this type of profiling to be put to harmful use.
Apprehensive of AI
There are billions of facial images of people publicly available on social media sites and in government databases. This study used existing technology to extract significant meaning from facial characteristics alone, and its authors gave us a glimpse of how powerful that capability can be. One can easily imagine how it could be used for nefarious purposes.
As quoted in an article by The Guardian on the subject, Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of gaydar, said, “It’s certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes.”
This study raises several questions and adds yet another consideration to the list of things we need to navigate as a culture as artificial intelligence expands our capabilities.
First, this study used publicly available images. There are certainly privacy concerns about how facial-detection technology is used. Is it OK to hunt down terrorists, criminals and missing persons, but do we cross a line when we extract information from any face we can capture, without that person’s consent? What are the ethics guidelines around this? How do we ensure this technology isn’t abused for anti-LGBT purposes, or in the future for discrimination based on IQ or political views? How can we strike the right balance of using the insights from this study to inform our AI strategies rather than overgeneralise (e.g. assuming all men with gender-atypical expressions are gay)? As it stands, although the AI was better than humans at distinguishing sexual orientation, it wasn’t 100% accurate. And does anyone really have a right to that information without a person’s consent?
In fact, Dr. Kosinski claims this study was done as a demonstration and to “warn policymakers of the power of machine vision.” In the last U.S. presidential race, the Trump campaign used “psychometric profiling” and models similar to the one used in this study to target voters on Facebook who had certain personality characteristics. As the ever-growing volume of data feeds the machine algorithms of facial-detection programmes, they will improve over time, and the potential uses will grow with them.
Will we be able to keep anything private in the future? Time will tell.