This AI Claims To Be Able To Tell If You’re Gay Or Straight Based On A Picture Of Your Face

by JR Thorpe

Update: On Sept. 8, GLAAD and Human Rights Campaign issued a joint statement calling research from Stanford University that claimed an algorithm can detect a person’s sexuality “dangerous and flawed.” The statement urged Stanford and “responsible media” to “debunk” the study that produced this assertion.

“Technology cannot identify someone’s sexual orientation. What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated,” said Jim Halloran, GLAAD’s chief digital officer, in the statement. Bustle has updated this post to reflect GLAAD and HRC’s guidelines.

Previous: A researcher at Stanford University made headlines today for a study claiming that an AI can determine human sexual orientation, with some degree of accuracy, from pictures of people's faces, a claim that could have huge ramifications for surveillance, safety, and LGBTQ rights across the world. The study has been called out by GLAAD and Human Rights Campaign, however, as inaccurate because of its poor methodology: It did not include any non-white subjects, it assumed there are only two sexualities, gay or straight, and it did not independently verify the sexualities of the subjects. The authors of the study, in turn, released their own statement addressing the points made by GLAAD and HRC. "Any scientific findings can be wrong, but dismissing them and their implications without due consideration could be dangerous and ill-informed," said the study's authors in the statement.

According to the authors of the study, human judges shown images of faces can guess the sexuality of the people in the photos with modest accuracy: 61 percent for men, and 54 percent for women. The AI network was given a far larger set of facial images: 35,326 of them, to be precise. All of the subjects were Caucasian and had posted the photos publicly on a dating site in the United States, and people who identified as heterosexual and as homosexual on the site were represented equally. The AI, like many others in use around the world, had already been trained in facial recognition; the researchers then fed it the images along with the subjects' self-reported orientation and used it to predict whether a given face belonged to a gay or a straight person.
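
To make the general approach a bit more concrete, here is a minimal, purely illustrative sketch of the kind of pipeline described above: a pretrained face-recognition network reduces each photo to a numerical feature vector (an "embedding"), and a simple classifier is then trained on those vectors against the self-reported labels. Everything in it, including the embedding size, the random placeholder data, and the choice of logistic regression, is an assumption made for illustration, not the study's actual code or model.

```python
# Illustrative sketch only: placeholder embeddings stand in for the output of a
# pretrained face-recognition network; nothing here reproduces the study's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Placeholder "embeddings": one 512-dimensional vector per photo (hypothetical size).
n_photos, embedding_dim = 2000, 512
X = rng.normal(size=(n_photos, embedding_dim))
# Placeholder binary labels standing in for the self-reported orientation field.
y = rng.integers(0, 2, size=n_photos)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# A linear classifier on top of fixed embeddings is one common, simple way to
# build such a predictor; the study's own modeling choices may differ.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

probs = clf.predict_proba(X_test)[:, 1]
print("AUC on held-out photos:", roc_auc_score(y_test, probs))
```

One plausible way to arrive at multi-image figures like those quoted below would be to average this kind of per-photo probability across several photos of the same person, though that detail is an assumption here rather than a description of the paper's exact method.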


"Given a single facial image," the researchers wrote, their network claims to "correctly distinguish between gay and heterosexual men in 81 percent of cases, and in 74 percent of cases for women." If they added four more images of each person, that alleged accuracy increased to 91 percent and 83 percent, respectively.

This is problematic for a variety of reasons. One is the question of how the AI managed to pick out heterosexuality or homosexuality from facial features at all. The researchers claim that gay men and women "tended to have gender-atypical morphology, expression, and grooming styles," meaning that, by the study's measures, they looked slightly less "masculine" or "feminine" than their straight counterparts. That is a deeply problematic claim, one that plays into stereotypes about LGBTQ people and can fuel stigma. As the authors of the GLAAD and HRC statement point out, "It is not surprising that gay people (out, white, similar age) who choose to go on dating sites post photos of themselves with similar expressions and hairstyles."

The authors of the study also used their findings to suggest that queer people are physically and genetically different from straight people, and have been since before birth, because not all of the facial features the model tracked could be explained by grooming and social conditioning alone. The idea that homosexuality "starts in the womb," in the words of a 2012 Science examination of the genetics of gayness, is a popular but problematic theory; it suggests that homosexuality and heterosexuality may be partly shaped by hormonal exposure in utero.

The AI itself hasn't been tested by anybody except the Stanford scientists yet, and its claims have faced substantial pushback; dating site photographs, as we well know, are hardly representative of how people look on the street or when they wake up in the morning. The model has also only been applied to white, cisgender faces, which fails to recognize the diversity of the LGBTQ experience. Research like this, which makes monolithic claims about the LGBTQ community, is deeply dangerous.

Editor's note: This story has been updated from its original version.