Row over AI software that claims to identify gay faces
The study created composite faces judged most and least likely to belong to homosexuals
A facial recognition experiment that claims to be able to distinguish between gay and heterosexual people has sparked a row between its creators and two leading LGBT rights groups.

The Stanford University study claims its software recognises facial features relating to sexual orientation that are not perceived by human observers.

The work has been accused of being "dangerous" and "junk science".

But the scientists involved say these are "knee-jerk" reactions.

Details of the peer-reviewed project are due to be published in the Journal of Personality and Social Psychology.

Narrow jaws

For their study, the researchers trained an algorithm using the photos of more than 14,000 white Americans taken from a dating website.

They used between one and five of each person's pictures and took people's sexuality as self-reported on the dating site.

The researchers said the resulting software appeared to be able to distinguish between gay and heterosexual men and women.

In one test, when the algorithm was presented with two photos, one definitely of a gay man and the other of a heterosexual man, it was able to determine which was which 81% of the time.

With women, the figure was 71%.

"Gay faces tended to be gender atypical," the researchers said. "Gay men had narrower jaws and longer noses, while lesbians had larger jaws."

But their software did not perform as well in other situations, including a test in which it was given photos of 70 gay men and 930 heterosexual men.

When asked to pick 100 men "most likely to be gay" it missed 23 of them.
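The two tests described above correspond to standard ways of scoring a classifier: pairwise comparison accuracy and a top-k ranking check. A minimal sketch of both, using made-up scores rather than anything from the study itself:

```python
# Sketch of the two evaluation styles reported in the article.
# All scores and labels below are hypothetical, purely for illustration.

def pairwise_accuracy(pos_scores, neg_scores):
    """Fraction of (positive, negative) photo pairs where the classifier
    scores the positive photo higher - the 81%/71% style of test."""
    correct = sum(1 for p in pos_scores for n in neg_scores if p > n)
    return correct / (len(pos_scores) * len(neg_scores))

def top_k_hits(scored_labels, k):
    """Of the k photos ranked most likely to be positive, how many
    actually are - the 70-among-930 style of test."""
    ranked = sorted(scored_labels, key=lambda x: x[0], reverse=True)
    return sum(label for _, label in ranked[:k])

# Toy example: higher score = classifier thinks "more likely gay".
pos = [0.9, 0.7, 0.4]
neg = [0.6, 0.3, 0.2]
print(pairwise_accuracy(pos, neg))  # 8 of 9 pairs ordered correctly

# Toy ranking test: label 1 = positive; pick the top 2 of 4 photos.
print(top_k_hits([(0.9, 1), (0.8, 0), (0.5, 1), (0.2, 0)], 2))
```

Note that the pairwise test is far easier than real-world use: it guarantees exactly one photo in each pair is of a gay man, whereas the ranking test reflects the rarer-in-practice base rate that made the software's misses more visible.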

In its summary of the study, the Economist - which was first to report the research - pointed to several "limitations" including a concentration on white Americans and the use of dating site pictures, which were "likely to be particularly revealing of sexual orientation".

'Reckless findings'

On Friday, two US-based LGBT-focused civil rights groups issued a joint press release attacking the study in harsh terms.

"This research isn't science or news, but it's a description of beauty standards on dating sites that ignores huge segments of the LGBTQ (lesbian, gay, bisexual, transgender and queer/questioning) community, including people of colour, transgender people, older individuals, and other LGBTQ people who don't want to post photos on dating sites," said Jim Halloran, chief digital officer of Glaad, a media-monitoring body.

"These reckless findings could serve as a weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous."
Campaigners raised concerns about what would happen if surveillance tech tried to make use of the study
The Human Rights Campaign added that it had warned the university of its concerns months ago.

"Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world - and in this case, millions of people's lives - worse and less safe than before," said its director of research, Ashland Johnson.

The two researchers involved - Prof Michael Kosinski and Yilun Wang - have since responded in turn, accusing their critics of "premature judgement".

"Our findings could be wrong... however, scientific findings can only be debunked by scientific data and replication, not by well-meaning lawyers and communication officers lacking scientific training," they wrote.

"However, if our results are correct, Glaad and HRC representatives' knee-jerk dismissal of the scientific findings puts at risk the very people for whom their organisations strive to advocate."

'Treat cautiously'

Previous research that linked facial features to personality traits came unstuck when follow-up studies failed to replicate the findings. This includes the claim that a face's shape could be linked to aggression.

One independent expert, who spoke to the BBC, said he had concerns about the claim that the software involved in the latest study picked up on "subtle" features shaped by hormones the subjects had been exposed to in the womb.

"These 'subtle' differences could be a consequence of gay and straight people choosing to portray themselves in systematically different ways, rather than differences in facial appearance itself," said Prof Benedict Jones, who runs the Face Research Lab at the University of Glasgow.

It was also important, he said, for the technical details of the analysis algorithm to be published to see if they stood up to informed criticism.

"New discoveries need to be treated cautiously until the wider scientific community - and public - have had an opportunity to assess and digest their strengths and weaknesses," he said.