
Terms of (Unequal) Service: The Online War on Women’s Sexual Health

Published April 1, 2025
Troy Ayala

Sexual health censorship online is a growing concern, limiting access to essential discussions and educational content. While some level of moderation is needed to protect young viewers and prevent misinformation, excessive censorship suppresses diverse perspectives and hinders learning. The issue is nothing new: in 2022, BeautyMatter’s LinkedIn article on sexual health censorship was removed from the platform without warning.

Female sexual health content, in particular, is more likely to be censored than male-focused content, according to a recent report from advocacy group the Center for Intimacy Justice, which found that major tech platforms (including Google, Amazon, and TikTok) routinely suppress information vital to women’s health. The report surveyed 160 sexual and reproductive health businesses, brands, and nonprofit organizations globally and documents overwhelming evidence of the economic impact of these biased practices. This censorship reinforces a narrative that male sexual health and pleasure matter more than women’s. Striking a balance between necessary moderation and intellectual freedom is therefore essential to ensure that social media remains a space for education and open discourse.

"Our findings demand a critical examination of how algorithmic biases shape what information is accessible online, particularly regarding sexual and reproductive health for women and people of diverse genders," Jackie Rotman, founder and CEO of Center for Intimacy Justice, said in the report. "There is a critical need for tech platforms to implement content moderation practices that support, rather than suppress, this essential health information."

Meta

Although Meta updated its sexual health policies in 2022, stating that "advertisers can run ads that promote sexual health, wellness, and reproductive products and services” on its platforms (Facebook and Instagram), restrictions persist, particularly for content concerning women. The report highlights two contrasting ads, one removed and one approved: the banned advertisement promoted breast cancer screenings and featured an image of a pink ribbon, while the permitted advertisement promoted Viagra with an image of a man holding a banana to his groin and the caption “the pill helps me stay hard for the touchdown.”

Of the survey respondents posting sexual health content to Meta platforms:

  • 63% had organic content removed
  • 84% of businesses and 76% of nonprofits had paid content/ads rejected by Meta
  • 75% said they believe Meta has shadowbanned their organic content 
  • 51% said Meta has entirely suspended their advertising account at some point
  • 33% said that Meta has suspended their organic account at some point

Of those whose paid content/ads were rejected by Meta:

  • 31% were not informed of any specific policy reference
  • 45% were flagged under “adult product and services” guidelines 
  • 42% were flagged under “adult content guidelines”
  • 38% were flagged under “adult nudity and sexual activity” guidelines 
  • 16% were flagged under “restricted goods and services” guidelines
  • 13% were flagged under “illegal products or services” guidelines
  • 10% were flagged under “personal health and appearance” guidelines

Of those who appealed Meta’s rejection of their ads: 

  • 46% say that none of their appeals have been accepted by Meta
  • 25% say that between 1% and 33% of their appeals have been accepted by Meta
  • 11% say that between 34% and 66% of their appeals have been accepted by Meta
  • 11% say that between 67% and 99% of their appeals have been accepted by Meta
  • 9% say that all of their appeals have been accepted by Meta

TikTok

Across all platforms, but particularly TikTok, survey respondents reported using “self-censorship” when attempting to share sexual and reproductive health information. This practice, sometimes called “algospeak,” includes changing words such as “lube” to “loob” or “vagina” to “v@gin@.”

Additionally, several users on TikTok report having their content removed without explanation. “If you’re going to flag something or say I’m violating something, be very clear on what I’m violating so I can do better in the future,” Shelby Goodrich Eckard, a PCOS support content creator, said in the report. “I get very defeated because I work very hard to make content that is educational and can help a lot of people, but this [censoring without providing reasoning] discourages educational people like us who are trying to do our best from making content.”

Of the surveyed respondents posting sexual health content to TikTok: 

  • 55% had organic content removed
  • 48% had ads rejected or removed
  • 52% say that they believe their organic content had been shadowbanned by TikTok
  • 39% say that their organic content has been labeled “sensitive” on TikTok
  • 62% reported that TikTok did not provide specific reasons for flagging their content, beyond general references to community guidelines 
  • 43% of respondents who use TikTok but do not advertise on the platform say that the expectation their ads would be rejected is a primary reason they do not advertise there

Of those whose paid content/ads were rejected by TikTok:

  • 38% were flagged under “sexual activity and services” standards
  • 28% were flagged under “sexually suggestive content” standards
  • 13% were flagged under “nudity and body exposure” standards
  • 6% were flagged under “regulated goods and services” standards 
  • 6% were flagged under “sexual exploitation and gender-based violence” standards
  • 6% were flagged under “shocking and graphic content” standards

Google

Google further exemplifies the differing treatment of male versus female sexual health content and advertisements. The report outlines that YouTube content creator Leeza’s video “Morning Erection” faced no age restrictions or monetization limits, yet three of her videos about women’s health (covering the vagina, the vulva, and orgasms for women) were restricted to users 18 and over and had their monetization limited.

“What they’re [Google] actually doing by censoring us and claiming we’re spreading dangerous information is endangering people’s lives and curbing people’s ability to exercise bodily autonomy,” said Jennifer Daw Holloway, Communications Director of NPO Ipas, in the report. “It’s not surprising that what gets censored is in the same vein: abortion or reproductive rights and justice, information for people who are LGBTQ+, and information about transgender healthcare.”

Google told Aquafit, a pelvic health company, that Aquafit's advertising keywords could not use "descriptive language”—such as "pain during sex”—and instead must use medical language, such as "vaginismus." However, Google searches for "sex pain" are at least five times as popular as searches for "vaginismus."

Of the respondents who use Google’s platform for paid content/ads: 

  • 66% had ads rejected
  • 58% had ads restricted or age-gated

Of those whose paid content/ads were restricted by Google:

  • 58% said Google classified their ads as “sexual content”
  • 25% said Google classified their ads as “inappropriate content”
  • 25% said Google classified their ads under its “healthcare and medical content” policy
  • 25% said Google did not reference a specific policy

Of those whose paid content/ads were rejected by Google: 

  • 57% said Google classified their ads as “sexual content”
  • 38% said Google classified their ads under its “healthcare and medical content” policy
  • 29% said Google classified their ads as “inappropriate content”
  • 24% said Google did not indicate a specific policy under which their rejected ads had been classified

Amazon

On Amazon, when a product is labeled with an “adult flag,” it becomes unsearchable on Amazon’s homepage. Users can access the product listing only through a direct link, through a Google search, or by navigating to Amazon’s “sexual wellness” subsection, something the report notes users rarely do.

VuVatech, created to soothe pelvic and vaginal discomfort, has repeatedly had product listings removed from Amazon, according to brand founder Tara Langdale-Schmidt. In an even more extreme case, Amazon blocked VuVatech from adding a discount coupon to one of its products because the item was identified as “potentially embarrassing or offensive.”

“We just have to stop this insanity with being embarrassed about things,” Langdale-Schmidt told Wired. “There's no difference from your vagina than your ear, your nose, your mouth. It is another place on your body, and I don't know how we got to this point where it's not OK to talk about it. I just don't get it.”

Of the respondents who posted listings of sexual wellness products to Amazon:

  • 64% said Amazon turned off their product listing at some point 
  • 48% said that Amazon gave their product an “adult flag,” meaning users could only find the product listing through a specific link or Google search because it becomes unsearchable on Amazon’s homepage 
  • 34% had accounts suspended, meaning they could no longer sell on the platform 

Of those whose Amazon accounts were suspended: 

  • 55% were cited for violating the product guidelines for “adult products and explicit sexual imagery”
  • 25% were cited for violating the community guidelines for “sexual content”
  • 25% say Amazon did not indicate which of the platform’s policies, if any, had been violated at the time of their account suspension
  • 10% were cited for violating “SEO search policy: offensive terms” 
  • 94% say they have seen their direct competitors allowed to list or promote products on Amazon even when they have been restricted for those same activities

Revenue Losses and Impact on Funding

Concerningly, censorship prevents sexual health and wellness brands from reaching their full revenue potential. “I cannot grow my company or hire more help because, at any minute, Amazon can shut my entire account down. Amazon sales are 50% of my business; my account and/or listings have been shut down multiple times, and not being able to advertise on any social media, paired with the cost of goods doubling, has left VuVatech in a bind. We just scrape by every month and have not grown sales-wise in three years,” Langdale-Schmidt added.

  • 85% of respondents who have sought to fundraise for their business or organization believe digital suppression has negatively impacted their ability to do so 
  • Respondents estimated annual revenue losses ranging from $20,000 to $500,000 (per company) due to Meta’s removals and restrictions of their organic content and advertisements
  • Respondents estimated annual revenue losses ranging from $10,000 to $1,000,000 (per company) due to Amazon’s rejections of and restrictions on their products

The growing censorship of women’s sexual health content is not just a matter of policy oversight; it is a deliberate suppression of essential health information, arguably driven by societal and political biases that are becoming more prominent in the Western world. This pattern of content moderation reveals a deeply ingrained inequality that extends beyond digital spaces. It is important to challenge these biases wherever possible while advocating for policies that prioritize public health over outdated taboos.
