Computer 'Gaydar' Can Determine Your Sexuality By Scanning A Photograph

Photo: Wikimedia (National Institute of Standards and Technology)

In a study from Stanford University, scientists have successfully tested a computer algorithm that could plausibly discern an individual's sexual orientation merely by scanning a photograph of that person.

According to the study findings, artificial intelligence was able to correctly distinguish between gay and straight men 81% of the time, and between gay and straight women 74% of the time.

But in response to the study, people have begun to raise questions about the ethics of facial-detection technology. 

From The Guardian:

The machine intelligence tested in the research, which was published in the Journal of Personality and Social Psychology and first reported in the Economist, was based on a sample of more than 35,000 facial images that men and women publicly posted on a US dating website. The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using “deep neural networks”, meaning a sophisticated mathematical system that learns to analyze visuals based on a large dataset.
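To make the pipeline described above concrete, here is a minimal editorial sketch: a deep network reduces each photo to a numeric feature vector, and a simple linear classifier is then trained on those vectors. This is not the authors' code; the feature vectors below are synthetic stand-ins for real network outputs, and all names and numbers (1,000 faces, 128-dimensional features, the 0.3 class shift) are illustrative assumptions.

```python
# Hypothetical sketch: deep-network "features" followed by a linear
# classifier. Synthetic vectors stand in for real face embeddings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, d = 1000, 128  # assumed: 1,000 faces, 128-dimensional embeddings
labels = rng.integers(0, 2, size=n)
# Simulate embeddings whose mean differs slightly between the two classes.
features = rng.normal(size=(n, d)) + 0.3 * labels[:, None]

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(round(accuracy, 2))
```

The point of the design is separation of concerns: the expensive "deep" step only produces features, while the actual orientation prediction is an ordinary, off-the-shelf classifier.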

The research found that gay men and women tended to have “gender-atypical” features, expressions and “grooming styles”, essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.

Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”, the authors wrote.
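The jump from one photo to five can be illustrated with a simple back-of-the-envelope model. If each photo were classified independently and correctly with probability p, a majority vote over five photos succeeds whenever at least three votes are right. Real photos of the same person are correlated, so the reported gain (81% to 91% for men) is smaller than this independence assumption predicts; the sketch below is illustrative only.

```python
# Illustrative model: accuracy of a majority vote over n independent
# per-photo classifications, each correct with probability p.
from math import comb

def majority_vote_accuracy(p, n=5):
    """Probability that a strict majority of n independent votes is correct."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

print(round(majority_vote_accuracy(0.81), 3))  # roughly 0.95
```

Under full independence, five 81%-accurate looks would combine to roughly 95% accuracy; the study's 91% sits between the single-photo and fully independent cases, as one would expect for correlated images.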

The paper suggested that the findings provide “strong support” for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid.

While the findings have clear limits when it comes to gender and sexuality – people of color were not included in the study, and there was no consideration of transgender or bisexual people – the implications for artificial intelligence (AI) are vast and alarming. With billions of facial images of people stored on social media sites and in government databases, the researchers suggested that public data could be used to detect people’s sexual orientation without their consent.

Even the authors of the study are concerned about the potential for misuse of the technology.

They write:

Some people may wonder if such findings should be made public lest they inspire the very application that we are warning against. We share this concern. However, as the governments and companies seem to be already deploying face-based classifiers aimed at detecting intimate traits (Chin & Lin, 2017; Lubin, 2016), there is an urgent need for making policymakers, the general public, and gay communities aware of the risks that they might be facing already.

Delaying or abandoning the publication of these findings could deprive individuals of the chance to take preventive measures and policymakers the ability to introduce legislation to protect people. Moreover, this work does not offer any advantage to those who may be developing or deploying classification algorithms, apart from emphasizing the ethical implications of their work. We used widely available off-the-shelf tools, publicly available data, and methods well known to computer vision practitioners.

We did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats. We hope that our findings will inform the public and policymakers, and inspire them to design technologies and write policies that reduce the risks faced by homosexual communities across the world.

I think gay men are more likely to have their heads tilted down like a swan; I call it "swan neck." It gives you the appearance of having a narrower jaw, larger forehead and longer nose.


Although 91% is an impressive score, it still falls well below the threshold for significance. We really shouldn't read too much into this. Plus, we should be extra cautious about drawing too many conclusions from lay journalists' reviews of academic articles, especially when they use words like "suggest" and "could support." Those words are pundit/activist/academic BullS*** Speak for "we really want there to be a connection, but we just can't prove it."
