AI That Can Study Your Face To Determine Whether You're Gay Is Technology We Just Don't Need
All it needs is a photo or two and anyone can be outed from halfway across the world.
Artificial intelligence can already do a lot of things better and faster than humans, from sifting through large amounts of data to detecting markers for illness in patient scans. But what about something with a more everyday use?
Well, if everything goes as planned, you may eventually be able to tell which of your crushes are gay or straight based on just their photos. Of course, that's also a huge problem.
Researchers from Stanford University conducted a study which found that an AI "gaydar" could correctly distinguish between gay and straight men 81 percent of the time, with a 74 percent success rate for women. The algorithm was based on simple facial recognition software, which was fed a sample of more than 35,000 faces of men and women. The images were pulled from publicly available photos on a US dating website, complete with the users' stated sexual preference.
Researchers Michal Kosinski and Yilun Wang used the AI to extract common features between people of each sexual orientation, and then programmed it to perform the analysis itself on images it's shown.
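To make the pipeline concrete: this kind of system typically reduces each photo to a numeric feature vector and then trains a simple classifier on those vectors. The sketch below is not the authors' code; it is a hypothetical, minimal illustration of that final step, using logistic regression trained by gradient descent on synthetic 2-D points standing in for extracted facial features.

```python
import math
import random

random.seed(0)

# Synthetic "feature vectors" (made up for illustration): two clusters of
# 2-D points stand in for the features a real system would pull from photos.
X = [(random.gauss(-1, 0.5), random.gauss(-1, 0.5)) for _ in range(100)] + \
    [(random.gauss(1, 0.5), random.gauss(1, 0.5)) for _ in range(100)]
y = [0] * 100 + [1] * 100

def train_logistic(X, y, lr=0.1, epochs=200):
    """Fit a logistic-regression classifier by plain gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = max(-30.0, min(30.0, sum(wj * xj for wj, xj in zip(w, xi)) + b))
            err = 1.0 / (1.0 + math.exp(-z)) - yi  # predicted prob minus label
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Return 1 if the model puts more than 50% probability on class 1."""
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if z >= 0 else 0

w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
```

In the real study the feature-extraction step was done by deep facial-recognition software; the toy clusters here simply make the classification step runnable on its own.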
According to the research, homosexual men and women tend to have "gender-atypical" features and expressions, as well as different grooming styles. Basically, gay men tended to appear more feminine, while lesbian women generally appeared more masculine. In addition, the neural network also spotted trends in the subjects' features: gay men had narrower jaws, longer noses, and larger foreheads, while gay women had the opposite.
Intriguingly, the AI performed much better than human participants when shown photos of people and asked to determine their sexual orientation. Regular people only managed to get it right 61 percent of the time for men and 54 percent for women. The AI, meanwhile, performed better the more images of a person it had to analyse. Five photos of each person gave it a 91 percent and 83 percent success rate for men and women respectively. "Faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain", the authors wrote in their study.
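The accuracy boost from extra photos is a standard ensembling effect: averaging several noisy per-photo judgments of the same person cancels out some of the noise. The toy simulation below (with a made-up per-photo accuracy of 70 percent, not a figure from the study) shows how majority-voting over five photos beats a single photo.

```python
import random

random.seed(1)

PER_PHOTO_ACCURACY = 0.7  # assumed for illustration, not from the study
PEOPLE = 5000

def photo_vote(true_label):
    """One photo's verdict: correct with probability PER_PHOTO_ACCURACY."""
    return true_label if random.random() < PER_PHOTO_ACCURACY else 1 - true_label

def classify(true_label, n_photos):
    """Average the per-photo votes and threshold at 0.5 (majority vote)."""
    score = sum(photo_vote(true_label) for _ in range(n_photos)) / n_photos
    return 1 if score >= 0.5 else 0

def accuracy(n_photos):
    correct = 0
    for _ in range(PEOPLE):
        label = random.randint(0, 1)
        correct += classify(label, n_photos) == label
    return correct / PEOPLE

acc_one = accuracy(1)   # roughly 0.70
acc_five = accuracy(5)  # roughly 0.84: majority of five 70%-accurate votes
```

The gain comes purely from aggregation; no single photo is judged any more accurately, but independent errors rarely all point the same way.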
Indeed, the study even suggests its findings strongly support a current theory: that being gay isn't a choice but rather something you are from birth. The researchers say the similarities in bone structure and facial features among the queer subjects of their analysis indicate that sexual orientation may stem from exposure to certain hormones before birth. In short, the study says we really are born this way.
Of course, the study has plenty of limitations as well. No people of colour were included, so it's unclear whether the trends, or the success rates, would hold across ethnicities. Additionally, only straight and gay people were considered, with no transgender or bisexual subjects. More importantly though, the study raises more concerns than it offers benefits.
With the advent of social media, and the subsequent growth of Big Data, there are billions of photos of people freely available online. The researchers themselves admit the problems this sort of technology could pose, with people unwilling to reveal their sexual identity being "outed" against their will by others. Even more worrisome is the scope for widespread harassment; it's not too hard to imagine a government secretly utilising this technology to target what it considers a "sexually deviant" section of the populace.
The problem here lies in the fact that developing such software is just a short step from where we already are. The technology to build and train AI is freely available, as are the data and image dumps required to teach it. Kosinski and Wang therefore believe it's important to expose these capabilities right now, so the public is aware not only of what can be done, but also of what can be done to stop it.
If you still think privacy isn't important because you "have nothing to hide", here's another reason to think again.