Two years after she was assaulted, Josephine “Jo” Walker felt ready to speak.
She opened TikTok and recorded a video about what had happened to her when she was 19. She called it what it was. Rape.
Within hours, her video was removed for “violating community guidelines.”

“I felt a weight off my shoulders when I finally opened up, only to come back to the app and find out that no one even heard me. It was heartbreaking.”
That was Jo’s first lesson in the digital dystopia survivors now live in, a world where naming your pain is treated as more offensive than the act that caused it.
Censorship
On TikTok, saying “rape” will likely get your content removed. Say “grape,” however, and you might stay visible. Sexual assault survivors have created entire languages out of necessity: “unalived” instead of “suicidal,” “SA” instead of “sexual assault,” “PDF file” instead of “paedophile,” and endless euphemisms that reduce real trauma to algorithm-proof code.
“It’s like talking in riddles about the worst thing that ever happened to you,” Jo says. “It makes me feel like a joke to talk about my trauma that way.”
Jo is not alone in her story. A 2023 Mozilla Foundation report found that content about sexual violence is among the most frequently removed on TikTok, even when it doesn’t violate community guidelines. Survivors, advocates, and even therapists have been shadowbanned or silenced, while misogynistic content, ironically, often slips through the algorithm.
TikTok’s stance
According to TikTok’s Community Guidelines, the platform prohibits content that depicts, promotes, or glorifies sexual violence, but it permits survivors to share their personal stories, as well as educational content about sexual violence, provided the content includes no graphic descriptions or imagery. The reality that Jo and many other creators on the app face, however, is very different.
“I think it’s ridiculous, to be honest,” says Jo. “The amount of times I’ve seen vile comments on the app that don’t get taken down, even when I personally report them. If I want to speak up about my trauma, though, suddenly I’m the problem.”
The reality survivors who speak up face is not just frustrating but dystopian: a world where a survivor is expected to reduce their story to watered-down, bite-sized clips or be erased altogether; a digital courtroom where a survivor is cross-examined by strangers, judged by likes, and sentenced to invisibility if they don’t play by the algorithm’s rules.

“Sometimes I wonder if I’m doing more harm than good by speaking up. Especially if I have to censor myself to do it,” says Jo.
She pauses, then adds: “But then I remember what it was like to feel alone as a survivor. And I don’t want someone else to think they are.”
A problem?
TikTok and other platforms claim their censorship protects users from triggering content. What they fail to consider is that they’re not erasing the problem, only the people who speak about it.
By muting these voices, platforms may be protecting perpetrators more than they protect their viewers.
“It shouldn’t have to be this way. I should be allowed to speak up about something that happened to me without fear of censorship. I wasn’t ‘graped’; I was raped,” Jo says.
“This is a huge fault in the app and I hope they do something to rectify this.”