Baring All & Social Media Censorship

Social media censorship of breasts shows us why we need to hold Big Tech accountable

By Dr Carolina Are

Dr Carolina Are is the pole dancing academic, activist and content creator behind the @bloggeronpole social media accounts and blog. A platform governance researcher with a PhD in online abuse and conspiracy theories, she is currently working as an Innovation Fellow at Northumbria University’s Centre for Digital Citizens, where she focuses on the way platforms govern bodies and sex. Following her experiences of online censorship, she has been researching algorithmic bias against nudity and sexuality on social media, and has published the first peer-reviewed study on the shadowbanning of pole dancing in Feminist Media Studies. Her work has appeared in The New York Times, The Atlantic, The Guardian, The Conversation, the BBC, Wired and the MIT Technology Review.

Although it may seem unlikely now that most of my breasts have been eaten up by muscle, I used to be known for my big boobs when I was younger. I was the first in my class to get breasts, and while I liked them - they made me feel ‘grown up’ - I was just 11 years old, more interested in games and magic than in dating. Suddenly, I was struggling with boys’ changed attitude towards me: I wasn’t a friend anymore – I was a slut, a name I was called every day at school. Years later, ‘sluts’ would become my inspiration in activism, research and work, and by baring what I used to hide under giant metal band t-shirts through pole dance shows and social media posts, I would somehow mend that complicated relationship with my breasts and my body. Unfortunately, social media platforms had other plans. If social networks’ community guidelines are anything to go by, ‘female-presenting nipples’ (their words, not mine) represent the source of all online harms – and I’ve experienced different levels of censorship for showing even less than that, inspiring me to become a platform governance researcher. Meanwhile, as the world descends into a pit of hate, with barely avoided coups, misinformation and conspiracy theories galore, platforms seem to prefer targeting breasts over violence. How did we get here?

Far from being mere spaces for us to connect, social media platforms are now where we learn about the world and keep up to date with news, as well as a site of expression, work, organising and self-promotion. Because of the crucial role this handful of private companies now plays in our everyday and public life, and due to the exponential growth of their user base and of the content they host, they had to begin governing their spaces. Without content moderation – the practice of deleting and/or censoring online content – platforms would be unusable, full of spam, harmful content and just… chaos. What’s interesting, however, is the content they choose to censor.

To moderate content, platforms rely on a blend of algorithms and underpaid, outsourced, overwhelmed human moderators largely based in the Global South, who have to make split-second decisions over posts they’re not always familiar with. In my research, I have found that platforms’ content moderation systems are far from equal or efficient. They have so far disproportionately targeted marginalised users, over-focusing on nudity and sexuality instead of violence – the fact that the online groups and content used to coordinate the 2021 attack on the United States Congress were left up, while a mere nipple is deleted immediately, is a case in point. The 2018 US law known as FOSTA/SESTA – the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA) – is a large part of the reason behind this moderation discrepancy.

FOSTA/SESTA is an exception to Section 230 of the US Communications Decency Act, which established that social media companies are communication intermediaries, not speakers or publishers – and therefore not legally liable for what is posted on them. Under FOSTA/SESTA, Section 230 still protects platforms, bar one type of content: anything that may facilitate sex trafficking. Great, right? Not so much. Sex trafficking and online child sexual abuse were already illegal. Instead, the new exception lumped sex trafficking (a crime) in with sex work (a job), making sex workers – who found working online improved their safety and autonomy – the demographic most affected by censorship. But it didn’t stop there: as FOSTA/SESTA was pushed through Congress by anti-sex, far-right evangelical groups aiming to banish sex from the internet, its trickle-down effect has been the worldwide censorship of anything related to sex and bodies.

FOSTA/SESTA broke the internet - and not in a good way. A flawed new law generated shambolic platform attempts to moderate bodies at scale to avoid being accused of facilitating trafficking. Platforms left this task to algorithms, and that hasn’t gone well: through shadowbanning – hiding content or excluding it from Explore and For You page recommendations – and de-platforming, or outright content and/or account removal, online sex and bodies are becoming increasingly invisible. As a result of the law, social media platforms now over-censor posts by athletes, lingerie and sexual health brands, sex educators and activists, applying this US law to content worldwide. After facilitating the creation of spaces that busted taboos, social media platforms are now making bodies and sex even more taboo: enter the use of ‘seggs,’ ‘vajayjay’ and so on to prevent algorithms from censoring any sex- and body-related content.

This censorship isn’t applied equally. In 2023, a Guardian investigation found that artificial intelligence (AI) tools used by most tech companies rate women’s images as ‘racy’ by default – even in everyday situations depicting pregnancy, health check-ups or fitness. In my research, I found that users’ accounts were deleted for ‘nudity and sexual activity’ or for ‘sexual solicitation’ even when they were fully clothed, posing with their families, talking about legal and safe abortions or sharing their sexual assault survivor stories. Meanwhile, online violence and harassment against those same users is left up to go viral. Because of Big Tech’s disproportionately male, white, cisgender, heterosexual and able-bodied workforce, the way tech platforms write and apply rules, and the way they use their algorithms, inevitably views anything related to bodies, pleasure, women and LGBTQIA+ people – and especially breasts and nipples – as sexual, and therefore worthy of censorship. And if you’re wondering why celebrities are able to post almost fully nude images, it’s because celebs and public figures have a preferential route in content moderation compared to the rest of us, one that prevents the almost immediate algorithmic take-down that posts showing skin would normally face.
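To make that ‘racy’ rating concrete: the classifiers the Guardian tested include off-the-shelf tools such as Google Cloud Vision’s SafeSearch, which returns a likelihood score for categories like ‘racy.’ Below is a minimal, illustrative sketch – not any platform’s actual pipeline, and the auto-hide threshold is my own assumption – of how a platform could wire such a score straight into moderation:

```python
# Illustrative sketch only: how a platform *could* auto-flag images using
# Google Cloud Vision's SafeSearch 'racy' likelihood. The LIKELY threshold
# is a hypothetical choice, not a documented platform setting.
from google.cloud import vision

def auto_hide(image_path: str) -> bool:
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    # SafeSearch grades each category (adult, racy, medical, violence,
    # spoof) on a scale from VERY_UNLIKELY to VERY_LIKELY.
    # Auto-hiding at LIKELY or above bakes the classifier's bias straight
    # into moderation: if everyday photos of women score higher by
    # default, they get hidden by default.
    return annotation.racy >= vision.Likelihood.LIKELY
```

The point of the sketch is that the bias the Guardian documented lives in the score itself, so any threshold a platform applies on top simply inherits it, no matter how ‘neutral’ the rule sounds.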

When Tumblr introduced a ban on ‘female-presenting nipples’ (something most social networks have since implemented), it became clear that platforms have crowned themselves rulers not just of our bodies, but of what is and isn’t OK to be seen – and of who has and hasn’t got ‘female-presenting nipples,’ something incredibly upsetting for trans and gender non-conforming folks, whose gender identity ends up being determined by an algorithm.

Platforms are fully aware that this is an issue. Last year, after examining a series of content take-downs, even Meta’s independent oversight body, the Oversight Board, found that the company’s current adult nudity and sexual activity rules "reflect a default notion of the sexually suggestive nature of women's breasts," over-sexualising our bodies and resulting in inconsistent, over-enforced censorship that violates users’ human rights.

We cannot let private companies demonise bodies and specific user demographics, dictate what can and cannot be seen, who can and can’t work, what is and what isn’t harmful. If we leave them to it, they will inevitably prioritise their own interests, as they are already doing, hiding content they deem unsavoury to avoid alienating advertisers. So I am going to leave you with some further reading to keep up the fight, and with a quote from the Manifesto for Sex Positive Social Media:

“Social media rules around what can and can’t be posted shape broader attitudes towards sex and nudity, which in turn directly impact on all of our safety and wellbeing. We believe that we’re healthiest and happiest when sex is not a source of shame but accepted as part of human experience.”

Further reading:
