Have you ever felt unfairly accused? For instance, in school, when a teacher dismissed your stellar record and punished you for a rule you didn't break? When such things happened, were you told that the world is unfair and that you should just accept it?
Don and Shashank felt the same helplessness and rage when their organisation's Instagram account was repeatedly suspended for violating "Community Guidelines", despite their having shared nothing illegal, hateful, or pornographic. Meanwhile, the private, unverified, anonymous accounts that trolled and harassed their content every day ran scot-free on the Meta platform, they tell us, despite being reported regularly.
Similarly, a queer, brown, body-positivity account on Instagram called tasteofhissalt is used to having its content restricted, but is bewildered that a male-identifying upper body, one that looks different from what Instagram's AI perhaps recognises as male, gets blocked for nudity while countless pictures of cishet male upper bodies run free across the Meta universe.
Randhir Pratap Singh, the founder of the clothing brand Leather Subculture, talked about noticing a pattern in engagement on his business account: photos of women showing skin in certain ways, the kind that can be categorised as queer rather than cishet, gain more engagement than those of men. Performance marketer and queer-identifying Palash Gogoi considers it a matter of fact at this point that queer bodies, and any display of skin by them, are more prone to being shadow banned (if not banned outright) than their cishet counterparts.
Something seems off, doesn't it? Let's dig deeper.
In 2021, the now-famous Meta whistleblower Frances Haugen made waves with complaints filed with American federal authorities, claiming that Facebook's own research showed its algorithm amplifies hate, misinformation, and political unrest, and that it harms teenagers by pushing content on eating disorders and beauty standards that has driven up suicide rates among them.
Since then, much has been talked about, and some of it has even been acted upon. Meta has talked about setting up a third-party fact-checking program to reduce fake news, and Instagram has come up with new features, guidelines, and measures to ensure individual and community safety. Even Haugen has expressed sympathy for Mark Zuckerberg: she sees him not as an active propagator of hate, but as someone who runs a company with misaligned incentives, where engagement takes priority over social responsibility.
A lot of information about the measures Meta has taken since the 2021 fiasco can be found on Meta's blogs. We're picking out particular aspects of Meta's policies, specifically Instagram's, that are relevant to what we want to talk about here: queer erasure.
Take the report Meta released in February this year for the fourth quarter of 2022. In the overview, Meta talked about the various issues highlighted in the report, which it has worked on or started work on, with regard to security, transparency, and clearer guidelines to help users understand the actions and restrictions its team deploys on the platform. The overview included phrases like "policies against mass reporting (or coordinated abusive reporting)". Reading it, one gets the impression that a lot is being done, and being done successfully.
But as Pete Seeger said, education is when you read the fine print. Going through the report, Meta wanted to highlight its progress in the following words:
"Completing the global roll out of new messaging to let people know whether automated or human review led to an enforcement action against their content, providing them with more granular understanding of how our review systems work and how to engage safely on our platforms [...]"
Sounds great, right? This should have resolved Don and Shashank's confusion to some extent, giving a clear answer to the question: "Why are we the ones against whom action is being taken?"
That is, until you read the description they give in one of their appendices of this fully implemented task:
"Following the pilot of this messaging in France, our teams continued launching this messaging in more markets, using findings from these launches to further understand the impact of the message on people's experiences, and improving our design and approach for new markets."
Okay, things are sounding more lukewarm now. But still, they're working on it, right?
Maybe, but it has been months, long enough for Meta to put out a whole new application called Threads. Yet when Himachal Queer Foundation's account was blocked, three days after the previous block was lifted because the Instagram team had found no evidence for the accusations against their content, this is the language of the explanation:
With no other way to reach the team, Don and Shashank talk about the anxiety and stress of waking up every day and waiting for something to happen. Their social media presence is their voice, they say. It is how they fulfil what they set out to do: reach the young, scared, queer people of remote Himachal and provide a safe space for them. It is also their archive, their memory bank, and their history. It is their way of interacting with others who might need their support and help and have no other way to reach them. Their crime in all of this? Posting about the first-ever pride parade they organised in Himachal.
Another good measure the Meta quarterly report pointed out was "Launching an update to our penalty strikes system to improve clarity about the rules that apply to people on our platforms". Yet on reading the transparency page on which Meta claims to have put out this information, one finds caveats all too easily. As the following lines echo through this chamber of secrets, one wonders whether words mean anything at all to the organisation:
"These restrictions generally only apply to Facebook accounts [¡] (Note that while we count strikes on both Facebook and Instagram, these restrictions only apply to Facebook accounts)."
Meta's decisions on shadow banning, restricting, and suspending accounts are based on its AI's understanding of the community guidelines and whether the content in question follows them.
Here is what Meta claims to have completed as a task in its quarterly report:
"Adding clarifying language to our Community Standards to underscore the importance of artistic and creative expression on our platforms [...]"
The most common flag one sees, especially on queer content, is for nudity. Here is what Meta's community guidelines page says it means by nudity:
"This includes photos, videos and some digitally-created content that show sexual intercourse, genitals and close-ups of fully-nude buttocks. It also includes some photos of female nipples, but photos in the context of breastfeeding, birth giving and after-birth moments, health-related situations (for example, post-mastectomy, breast cancer awareness or gender confirmation surgery) or an act of protest are allowed. Nudity in photos of paintings and sculptures is OK too."
Now, it's sensible to expect some mistakes from the technology and hope that Meta fixes them in due time. Sometimes news report videos about harassment are blocked in error, and that's understandable, even preferable, if faces can be seen in the video.
But when the AI they use assumes bodies look a certain way, a cis-heteronormative way, what we have on our hands is unintentional discrimination. In one of his Last Week Tonight segments on YouTube, John Oliver talked about the danger of what AI cannot do: use discretion and understand nuance the way humans can. And this comes down purely to the data that is available to the AI for analysis.
Organisations use AI to work more efficiently, reduce costs, and act faster. But here they are using an AI that is clearly lacking, in situations where its mistakes cause queer erasure and reinforce gender dysphoria, something queer people struggle with on an everyday basis.
What happens when you are banned from your own social media accounts for posting something that Instagram itself allows, something a "cis-bodied" individual could post any day without restriction? Discrimination is a valid answer to that question. It feels alienating, it feels like an attack, and it feels like the world is denying your existence yet again.
Queer artist and filmmaker Raqeeb Raza, whose posts have been removed and blocked various times despite self-censoring to follow these community guidelines, says, "I feel similar work and especially accounts and posts with heterosexual, western category get the green-light while brown, queer bodies remain in the shadows. It almost makes you feel like two white bodies together are aesthetic, but brown bodies together are 'against the guidelines'."
And this is just one aspect of the incompetence of Meta's safety tools. When Himachal Queer Foundation loses its account for violating community guidelines on Human Exploitation, how should the founders take it? Does Instagram agree with the mass, homophobic reporting of their accounts? Do these faceless, nameless profiles hold more credibility than a registered organisation? Or is it a bullying tactic that leaves them no option but to buy the new blue-tick verification?
Despite all of this, the queer individuals we interviewed showed hope. Maybe AI has a long way to go before it can recognise and understand queer bodies and existences, but safeguards are needed in the meantime: special recognition and monitoring for queer accounts, or at the very least easier ways for them to reach out and appeal once blocked. Queer people are vulnerable to social media harassment, especially in India, where no specific legal protections exist for the LGBTQIA+ community. And while Meta has yet to figure out how to recognise regional languages to remove queerphobic trolling, setting up safeguards to protect queer pages and their content is a necessity, and their absence a clear sign of incompetence.
A Meta spokesperson told Indiatimes the following: "At Meta, we recognise the importance of having a safe place to connect online. We partner with LGBTQ+ safety and advocacy organizations around the world to design policies and create tools that foster a safer online environment. This approach is always evolving, and input from the LGBTQ+ community online is critical to informing and continuously improving Meta's technologies and programs. Meta has resources available for the community as well. The LGBTQ+ Safety Policies are a resource for anyone seeking help and support on issues specifically related to online safety for the LGBTQ+ community."
"When we believe a genuine risk of physical harm or a direct threat to public safety exists, we remove content, disable accounts and work with local emergency services. Our policies do not allow content that outs an individual as a member of a designated and recognizable at-risk group or threatens LGBTQ+ safety by revealing sexual orientation or gender identity against their will or without permission," the spokesperson added.
For more stories on the LGBTQIA+ community and queerness in India, keep reading Spectrum on Indiatimes.