Paging Dr. Internet, we need a diagnosis. In this series, Mashable examines the online world’s influence on our health and prescribes new ways forward.


When pelvic floor physical therapist Sabrina Baxter (@nosuchthingas_TMI) began posting on TikTok, she noticed the platform would remove certain videos. Those about bladder and bowel problems were fine, but TikTok removed others about pain with intercourse and pelvic health issues, she said.

Even though there was no nudity in the videos — Baxter films fully clothed — the app notified her that they violated its community guidelines due to “nudity and sexual content.” Baxter tried to work around this in subsequent posts by lip syncing to trending sounds, but videos kept getting taken down. Then, TikTok started temporarily barring her from posting.

Baxter is just one of many certified pelvic floor physical therapists, sex educators, and other TikTok creators in this space who say they’re subjected to repeated content removal. TikTok appears to be blocking some essential sex education content from the app, all while misinformation about sex ed and other topics like vaccines runs rampant.

TikTok’s guidelines ban nudity and sexually explicit content (depicting penetrative or non-penetrative sex, sexual stimulation or fetish, or “erotic kissing”), but oftentimes these creators depict neither in their videos. What’s more, the guidelines explicitly state that educational content is an exception to the rules:

We recognize that some content that would normally be removed per our Community Guidelines could be in the public interest. Therefore, we may allow exceptions under certain circumstances, such as educational, documentary, scientific, or artistic content, satirical content, content in fictional settings, counterspeech, and content in the public interest that is newsworthy or otherwise enables individual expression on topics of social importance.

“I started getting banned a lot,” Baxter explained. “I would be banned for a week, then I’d be banned for two weeks, and then I got removed from the creator fund.”

Baxter created her first TikTok account (@nosuchthingastmi) in late 2019. She hit 10,000 followers in March 2021, the threshold to be added to the TikTok Creator Fund, a program through which the app pays users who have at least 10,000 followers and 100,000 video views in the last 30 days.

TikTok removed that account from the fund this summer as a result of the post removals and bans, she explained. Baxter said it was supposed to be “reactivated” in August, but never was. She believes the account was shadowbanned, a term for when a platform quietly suppresses a user’s content without telling them. A spokesperson for TikTok wouldn’t comment on shadowbanning.

“People say shadowbanning is a myth,” Baxter said, “but I just felt like my videos weren’t getting put out. I wasn’t getting a lot of interaction, people weren’t finding my account.”

She briefly left TikTok out of frustration, but made her second and current account later this summer because she loves creating videos. The new account is now a part of the creator fund, but she still worries about video removal and banning.

Baxter banned from posting on TikTok.
Credit: Screenshot: TikTok

Baxter’s content removed for ‘adult nudity and sexual activity.’
Credit: Screenshot: TikTok

Shadowbans and content removal

Like Baxter, sex educator Danielle Bezalel (@sexedwithdrb) believes she’s been shadowbanned by TikTok. Last fall, the app started removing her sex ed videos and messaging her that she wasn’t following community guidelines.

After that, Bezalel said, “it was just so much harder for us to get our content out there to our followers.”

“Over the last six months, our videos aren’t really even breaking 5,000 views with 57,000 followers,” she continued. Bezalel believes something has changed because her videos aren’t even reaching the audience she’s already acquired.

Her account’s growth has also taken a hit. From May to August 2020, her first four months on the app, she gained 40,000 followers. Since then — over a year later — she’s only gained 17,000.


“I would be banned for a week, then I’d be banned for two weeks…”

Fellow sex educator Madeline Gregg (@the.attitude.tok) also believes that video removal has suppressed her reach. “I have a good amount of followers and yet my likes are very very low,” she said. “My engagement is low because they keep taking my [sex ed] videos down.”

Dr. Alicia Jeffrey-Thomas, another pelvic floor PT on TikTok (@scrambledjam), said she’s had three videos as well as several comments about aspects of pelvic health removed. One video, about the Ohnut wearable meant to alleviate painful penetration, highlighted TikTok’s seemingly arbitrary rules.

Last year, Jeffrey-Thomas participated in the #LearnOnTikTok program, an initiative to encourage users to do just that. She shared a video about the Ohnut using the hashtag and said she received positive feedback (from a third-party TikTok partnered with for the program) that it was a great video.

Yet, when she posted a video separate from #LearnOnTikTok, of her putting an Ohnut on a vaginal dilator, she said TikTok removed it. She remade the video, this time just showing the wearable on the dilator, and it stayed up. She said the remake received a fraction of the engagement the original did.

“If you use the wrong gesture with your hand, say just one wrong word, it’s enough to trigger [deletion],” said Jeffrey-Thomas. “It’s mysterious to me…it’s just frustrating.”

One removed video of Jeffrey-Thomas’s, about pelvic floor trigger point wands (tools that can help with pelvic pain), did get reinstated after Jeffrey-Thomas appealed, but she’s still baffled by what she views as the arbitrary nature of removal.

All these creators have tried figuring out what specifically gets videos removed, or how to circumvent the algorithm. They use euphemisms like “seggs” instead of “sex” (Bezalel named her recent sex ed series the “Seggs Ed Show”); they put disclaimers at the start of their videos saying their content is educational and thus shouldn’t be removed (complete with screenshots of the community guidelines — here’s an example of Gregg doing so); they watch their language and even their movements.

But even then, they say their videos get taken down, and usually all for the same reason — adult nudity and sexual content — when there’s no nudity, and the content is allowed per TikTok’s own guidelines under the exception for educational content.

Why is TikTok removing sex ed content?

TikTok and other social apps are quick to take down even remotely sexual content at least in part due to FOSTA-SESTA. The legislation passed in 2018 with the intent of stopping sex trafficking online, but in practice it hurts consenting adult sex workers, educators, and others in the sexuality space.

As a result of FOSTA-SESTA, social media platforms like Instagram have taken to deleting any and all sexual content.

“With FOSTA-SESTA, any type of sexual content that could be misconstrued as solicitation, as trafficking, as something that’s not ‘on the up-and-up,’ can get these platforms in trouble,” explained sex educator Sunny Megatron, who is verified on TikTok.

Tech companies like TikTok are in the business of making money, said Megatron, not benefiting people. “It is not in their best interest to look out for good education being thrown out with something that could potentially be illegal, immoral, unethical or get them in trouble,” she said.

Another reason behind all the video removal, creators say, is mass reporting.



“If somebody doesn’t like something that you’re saying, they can easily report you for whatever they want to — whether it’s spam or bullying or nudity, which I usually get,” said Gregg, “just because they don’t like your content.”

Creators in marginalized communities, such as queer creators, are targeted by mass reporting even more, said Megatron.

TikTok’s spokesperson wouldn’t discuss these individual creators on the record, considering that personal information, but they did share that the app uses a combination of technology and human moderators to detect and remove content that violates community guidelines, and that it relies on user reports as well. They added that TikTok’s safety team reviews reports in accordance with its community guidelines, regardless of the reporters’ intent.

Different accounts, different treatment

Both unverified and verified creators Mashable spoke to believe TikTok holds verified accounts to a different rulebook.

“I got verified and instantly noticed what I had heard,” said Megatron, “that when you’re verified you can get away with a lot more.”

Prior to getting verified, TikTok deleted two of Megatron’s videos, one about a sex toy and the other featuring a vaguely phallic object, she said. After verification, she never had a video removed again. She still believes TikTok curbs her reach due to the nature of the content, but not as much as that of unverified creators.

“My type of content is suppressed, even with that blue checkmark,” she said. “I’m talking about sex and kink and things that the algorithm doesn’t like.”

It’s a double-edged sword, said Megatron, because on one hand, the vast majority of TikTok users out there don’t see her videos, but on the other, that’s not necessarily a bad thing. The users outside her core audience who might mass report her (and get her banned) simply don’t find her.

All the unverified creators Mashable spoke to said they tried and failed to contact TikTok about these issues. Gregg reaches out weekly to no response, and she joked that the platform probably blocked her email by now. Bezalel went so far as to message members of the product team on LinkedIn to no avail.

Verified accounts like Megatron’s, meanwhile, “get away” with more. “I’ve…said things that would be deemed very questionable in TikTok’s eyes on live broadcast,” she said, such as a Q&A about BDSM. “Never had a problem.”

TikTok didn’t respond to questions about privileges of verified users, instead pointing to a blog post that states a verification badge doesn’t signal endorsement from TikTok.

On top of this, misinformation about sex ed and other topics is a huge problem on TikTok. For whatever reason, many of those videos remain up, which vexes qualified educators.

Gregg pointed to a doctor on TikTok claiming to have a cure for herpes. Spam accounts commented about the “cure” on Gregg’s videos, but Gregg said TikTok didn’t remove the comments even after she reported them.

A spam comment on Madeline Gregg’s TikTok claiming there’s a cure for herpes.
Credit: screenshot: tiktok

On the topic of misinformation, TikTok said it partners with third-party, accredited fact checkers, and created in-app videos about media literacy.

How this impacts creators and audiences

This is a problem for both creators and viewers. The former must deal with the frustration of spending hours of research and production on a video no one gets to see. They also take a financial hit.

Gregg, for example, is a member of the creator fund. Accounts receive more fund money the more views and engagement they get. “They’re suppressing my views and not letting me post for days,” she said. “I lose out on my coins.”

“It genuinely is my livelihood,” said Bezalel.

Even ads aren’t immune to content removal. Baxter said videos she made promoting sex toy brand Bellesa and education site Beducated were both removed immediately (though she reposted the former on her original account and, for some reason, it stayed up).

Going forward, Baxter’s not going to work with brands unless TikTok changes something. “I’m already scared that my stuff’s gonna get banned or permanently deleted,” she said. “Why would I risk that for a brand?”

As a result, she’ll lose thousands of dollars in monthly income.

Beyond the creators themselves, removing these videos also hurts TikTok users. Even prior to COVID, not every child in the U.S. received sex education — and the pandemic disrupted sex ed further, to the point where schools that previously offered it may have dropped it from the curriculum to prioritize other subjects.

It’s not just children who need sex and pelvic health education, either. Sex education has historically not measured up, and Megatron has seen people in their 20s through their 50s learning about sex on the internet.

Baxter recalled a patient in her 70s who asked what her clitoris was. “We’re not taught enough about our own anatomy and our reproductive system to begin with,” she said.

This also extends to teachings about sexual abuse.


“I’m honestly scared to use the app to promote pelvic health.”

“Let’s say you are a sexual assault victims advocate,” said Megatron. “Just saying the word ‘sexual’ is enough to get that life-changing, very helpful content taken down because it’s deemed dirty and inappropriate.”

TikTok told Mashable it partnered with the Rape, Abuse & Incest National Network (RAINN) to develop resources for its community.

TikTok creators provide valuable, professional, and free information that our school systems don’t offer. Yet, due to content removal and bannings, Baxter doesn’t even want to provide such content anymore. “I’m honestly scared to use the app to promote pelvic health,” she said.

“TikTok is conflating sexually explicit inappropriate content with absolutely normal, healthy, science-backed, medically-accurate sex education,” said Bezalel.

What can we — and TikTok — do?

In the short term, creators who believe their content was wrongfully removed can appeal to TikTok. On the app’s blog, TikTok’s head of U.S. safety Eric Han acknowledged that they don’t always make the right call and that creators should appeal.

“While we strive to be consistent, neither technology nor humans will get moderation decisions correct 100% of the time, which is why it’s important that creators can continue to appeal their content’s or account’s removal directly in our app,” Han wrote. “If their content or account has been incorrectly removed, it will be reinstated, the penalty will be erased, and it will not impact the account going forward.”

According to TikTok’s Community Guidelines Enforcement Report, over four million videos were reinstated between April and June 2021 after users appealed.

A long term solution isn’t easy, however. Megatron doesn’t believe the current system provides a good one, as we can’t trust platforms to do what’s best for the public good — as evidenced by the recent revelations from Facebook whistleblower Frances Haugen — but sweeping government regulation may only make content blockage worse.

What TikTok could do, Megatron said, is hire people proficient in LGBTQ activism, lifestyle, sex education, and sex positivity to look at reported videos and determine whether they’re inappropriate. These moderators would have a better understanding of what counts as appropriate educational content, and could stop bad actors trying to get posts from marginalized creators removed. Ideally, she continued, these people would also develop more specific guidelines so wrongful removals don’t keep happening.

“Until they consult with us and have us working with them, this will never stop,” she said.

Other creators posed suggestions, as well. Jeffrey-Thomas supports a verification status for people providing educational material so their content doesn’t get flagged for violating community guidelines.

Bezalel, meanwhile, suggested an age filter so creators can make their videos 18+, and children won’t be privy to their content.

Social media platforms are notoriously puritanical, and TikTok is no different. Until these platforms find a way to differentiate actually harmful sexual content from the rest, both creators and audiences will suffer.
