Automated recognition of gender and sexual orientation is scientifically flawed and puts LGBT+ lives at risk. The European Union has the opportunity to ban this technology and prevent such tools from being exported around the world.

"Automatic gender recognition is not just a trans issue, it is a misogyny and patriarchy issue because we've created these narrow boxes around policing male and policing female."

Edited quote from a trans, gender non-conforming white person. (Source)

A growing number of governments and companies use artificial intelligence (AI) systems to identify and track citizens and consumers. Faster check-ins at airports using facial recognition or a seamless and personalized shopping experience are just two of the touted benefits.

But this technology is fundamentally flawed when it comes to recognizing and categorizing human beings in all their diversity.

Many of these AI systems work by sorting people into just two groups – male or female. The consequences can be severe for people whose gender does not match the sex they were assigned at birth. These algorithms also reinforce outdated stereotypes about race and gender that are harmful to everyone.

This kind of technology is built on the mistaken idea that your gender and sexual orientation can be identified by how you look or sound, by how you move, or simply by how “male” or “female” your name is. It is a flaw that can easily fuel discrimination.

Imagine the humiliation of trying to catch your flight, and being stopped by a computer that determines you don’t match the gender marker in your passport. Or imagine not being able to access a public toilet just because a facial recognition algorithm thinks your face doesn’t look male or female enough. Unless we do something to stop it, this could soon be a reality for many people all over the world. And in malicious hands, such as law enforcement in countries with anti-LGBT+ laws, these tools could lead to serious harm for LGBT+ people.

We have a chance to stop the harm caused by this technology before it gets worse. In April 2021, the European Commission, the executive branch of the EU, plans to propose a legal framework – that is, a set of rules and guidelines – to regulate artificial intelligence systems. This is a unique opportunity to ban automated recognition of gender and sexual orientation in the EU and prevent such tools from being exported around the world.

Sign this petition and urge the European Union to ban this technology that harms LGBT+ people around the world.


To: European Commission, European Parliament, and Council of the European Union.

A growing number of governments and companies use artificial intelligence (AI) systems to identify and track citizens and customers. But these systems aren’t designed with queer people in mind and instead reinforce existing biases and discrimination.

We call on you to include a ban of automated recognition of gender and sexual orientation in the upcoming regulatory framework on artificial intelligence systems. This is a step towards ensuring this technology will not be developed, deployed or used by public or private entities across the globe.

"I get misgendered enough by human beings. Why on earth would I want a robot to help with that? Programmatic misgendering just adds to the ocean of constant small comments we all swim in. Misgendering is death by a thousand paper cuts."

Edited quote from a non-binary white person. (Source)

"Automatic gender recognition has no provable meaningful benefit. But it does have provable harm. Trans people should be concerned. It's bad technology, and it doesn't work, but there will always be someone willing to deploy it."

Edited quote from a non-binary white person. (Source)

"There's already so many eyes on every trans person navigating through the world and we all feel those eyes. It's just going to exacerbate what's already there. We don't need to feel another robot overlord set of eyes. If security cameras were constantly on the hunt for my gender, I think that I'd be brutalized. I think that I'd be exposed to a lot of violence that is unnecessary."

Edited quote from a trans feminine Black/mixed-race person. (Source)

FAQ

What is automated recognition of gender and sexual orientation?

Automated recognition of gender and sexual orientation is technology that attempts to identify your gender and sexual orientation by analyzing your name, how you look and sound, or how you move. In reality, this technology can never recognize gender or sexuality, but only make a prediction.

Where is automated recognition of gender and sexual orientation being used?

A growing number of governments and companies use this technology to identify and track citizens and consumers. You can find it in places like airport check-in terminals, retail stores, and even some social media platforms.

Why is automated recognition of gender and sexual orientation a threat to LGBT+ people?

The software often sorts people into just two groups – male or female. The consequences can be severe for people whose gender does not match the sex they were assigned at birth. These algorithms also reinforce outdated stereotypes about race and gender that are harmful to everyone.

Plus, these algorithms replicate biases about how lesbian, gay, and bi people look, reviving the old, mistaken belief that you can predict character from appearance. And in malicious hands, such as law enforcement in countries with anti-LGBT+ laws, these tools could lead to serious harm for LGBT+ people around the world.

What are some examples of the harm automated recognition of gender and sexual orientation can do to LGBT+ people?

  • Trans people could be denied access to gender-specific spaces like bathrooms and locker rooms.

  • Authorities in repressive countries could use facial recognition to analyze security camera footage or social media profiles to track down people they assume to be LGBT+ and arrest them.

  • You could be stopped and interrogated at the airport if the security software determines you don't match the gender marker in your passport.

What is the difference between automated recognition of gender and sexual orientation, and facial recognition?

Automated gender and sexual orientation recognition systems sometimes use facial recognition software (or the analysis of other biometric characteristics like the way that someone walks) to make their predictions. Biometric technologies like facial recognition can also be used by governments or companies to conduct mass surveillance, which threatens the privacy and freedoms of queer people.

Why should the European Union ban automated recognition of gender and sexual orientation?

In April 2021, the European Commission, the executive branch of the EU, plans to propose a legal framework – that is, a set of rules and guidelines – to regulate artificial intelligence systems. This is a unique opportunity to ban automated recognition of gender and sexual orientation in the EU and prevent such tools from being exported around the world.

What can I do to help stop automated recognition of gender and sexual orientation?

Share our petition and ask your friends to call on the European Union to ban this technology that harms LGBT+ people.


This campaign is being run in conjunction with Access Now and Reclaim Your Face.

All Out is a partner of Reclaim Your Face, the European campaign for a legal ban on harmful facial recognition and related biometric mass surveillance technologies in public spaces. By watching and tracking us all the time, these technologies pose a specific threat to the free expression and privacy of LGBT+ people. If you are an EU citizen, you can give your support for LGBT+ lives legal weight by also signing the officially-recognized European Citizens’ Initiative to ban biometric mass surveillance.