Facebook and Instagram May Erase Ban on This Hotly Contested Adult Content

Meta's Oversight Board recommends the company clear up a very confusing community standard

In 1995, a group of women gathered in Times Square made headlines simply because of what they were wearing. Well, more accurately, because of what they weren't wearing. The women were topless, and they placed themselves in one of the city's most highly visible areas to argue that women should have the same (topless) rights as men.

It's a debate that has raged on for nearly three decades since. As social media became a regular part of daily life, the discussion moved online, but it posed the same questions. If men can be topless in public and on social media, why can't women? Why is the female breast so sexualized? And how do these policies apply to trans and nonbinary people?

In the past, Meta Platforms (META) social media sites Facebook and Instagram have been called out by activists for implementing policies that censor female nipples. Under the rallying cry of #FreeTheNipple, women have gone as far as using stickers or digital art programs to cover their own nipples with images of male nipples to illustrate the flaws in the company's policies.

This week, Meta's Oversight Board recommended that the company update its policies to allow more than just men to go topless on its platforms. But the board also cautioned that such an update, without clear criteria, could prove even more confusing than the original policy.

Trans and Nonbinary User Posts Have Been Removed Despite No Community Violation

Meta’s Oversight Board also recommended overturning the company’s original decision, which called for the removal of posts showing transgender and non-binary people with bare chests. The board further recommended that the company “change its Adult Nudity and Sexual Activity Community Standard so that it is governed by clear criteria that respect international human rights standards.”

The decision was handed down after a couple identifying as transgender and non-binary posted images of themselves with bare chests, their hands covering their nipples. The purpose of the post was to share their journey in fundraising for gender-affirming care, including what’s commonly referred to as "top surgery." Top surgery is the term for the surgical removal of breasts.

The posts drew both user reports and "a series of alerts by Meta’s automated systems," and were removed from the site multiple times for violations of Community Standards. At one point, the posts were removed for violating Instagram’s Sexual Solicitation Community Standard, but after the couple appealed to Meta and then to the Oversight Board, the posts were reinstated.

Meta's Oversight Board Says the Company Still Has a Ways to Go

The Oversight Board is an independent group of people who can review content issues that may require a more nuanced, human approach than the site’s internal systems can provide. In this particular case, the Oversight Board found that Meta’s Sexual Solicitation policies were applied more broadly than necessary and, in this instance, infringed on the human rights of its users.

The board went on to explain why reinstating the posts alone still doesn’t address the root flaw in the company’s overall policy regarding bare breasts.

"This policy is based on a binary view of gender and a distinction between male and female bodies. Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale," the board concluded in its statement. 

"The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and non-binary people. Exceptions to the policy range from protests to scenes of childbirth, and medical and health contexts, including top surgery and breast cancer awareness. These exceptions are often convoluted and poorly defined. In some contexts, for example, moderators must assess the extent and nature of visible scarring to determine whether certain exceptions apply. The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice."

The statement also points out that Meta’s automated systems flagged the posts on multiple occasions even though the posts didn’t violate any of the site’s rules. The case itself is a clear example of how the company's current policies leave too much room for active discrimination. Whether or not Meta follows the Oversight Board's advice remains to be seen; the company's policy is unchanged at this time.