Clinics, advocacy groups and people who share abortion-related content online say they're seeing informational posts taken down even when the posts don't clearly violate the platforms' policies.
The groups, in Latin America and the United States, are denouncing what they see as censorship even in places where abortion is legal. Companies like Meta say their policies haven't changed, and experts attribute the takedowns to over-enforcement at a time when social media platforms are cutting spending on content moderation in favor of artificial intelligence systems that struggle with context, nuance and gray areas.
But abortion advocates say the removals have a chilling effect even when they're later reversed, and navigating platforms' complex appeals systems is often difficult, if not impossible.
“The goal of it was to better understand the breadth of the problem, who’s affected, and with what consequences. Obviously, then once we had a better understanding of the trends, we hope to call attention to the issue, demand accountability and increase transparency in the moderation practices and ultimately, help stop the platforms from censoring this essential, sometimes life-saving information,” said Jennifer Pinsof, staff attorney at EFF.
The group says it received close to 100 examples of content takedowns from abortion providers, advocacy groups and individuals on Meta platforms such as Instagram and Facebook, as well as TikTok and even LinkedIn.
It's not clear whether the takedowns are increasing or whether people are simply posting more about abortion, particularly abortion medication such as mifepristone, since the Supreme Court overturned Roe v. Wade in 2022.
Brenna Miller, a TikTok creator who often posts about abortion and works in reproductive health care, said she made a video unboxing an abortion pill package from the nonprofit carafem, in which she talked about what was in the package and discussed the process of taking the pills at home.
She posted the video in December. It was up for at least a week before TikTok removed it, saying it violated the platform's community guidelines.
The video was eventually restored in May with no explanation.
“I work in public health in my 9-to-5 and we’re seeing a real suppression of public health information and dissemination of that information, particularly in the reproductive health space. And people are scared,” Miller said. “It’s really important to get people this medically accurate information so that they’re not afraid and they actually can access the health care that they need.”
TikTok does not generally prohibit sharing information about abortion or abortion medication. However, it does regulate the advertising and marketing of drugs, including abortion pills, and it prohibits misinformation that could harm people.
On Facebook, the Red River Women's Clinic in Moorhead, Minnesota, put up a post saying it offers both surgical and medication abortion after it heard from a patient who didn't realize it offered medication abortion. The post included a photo of mifepristone. When the clinic tried to turn the post into an ad, its account was suspended. The clinic says that because it does not offer telehealth services, it was not trying to sell the medication. The clinic appealed the decision and won a reversal, but the account was suspended again shortly after. Eventually, the clinic was able to resolve the issue through a connection at Meta.
“We were not trying to sell drugs. We were just informing our followers about a service, a legal service that we offer. So that’s alarming that, you know, that was flagged as not fitting into their standards,” said clinic director Tammi Kromenaker. “To have a private company like Meta just go with the political winds and say, we don’t agree with this, so we’re going to flag these and we’re going to shut these down, is very alarming.”
Meta said its policies and enforcement regarding medication-related abortion content have not changed and were not affected by the changes announced in January, which included the end of its fact-checking program.
“We allow posts and ads promoting health care services like abortion, as well as discussion and debate around them, as long as they follow our policies — and we give people the opportunity to appeal decisions if they think we’ve got it wrong,” the company said in a statement.
In late January, Emory University's Center for Reproductive Health Research in the Southeast, or RISE, put up an Instagram post about mifepristone describing what it is and why it matters. In March, its account was suspended. The group appealed the decision, but the appeal was denied and its account was deleted permanently. The decision was later reversed after the group was able to connect with someone at Meta. Once the account was restored, it became clear that the suspension happened because the account had been flagged for attempting to “buy, sell, promote or exchange illegal or restricted drugs.”
“Where I get concerned is (that) with the increased use of social media, we also have seen correspondingly an increased rise of misinformation and disinformation on social media platforms about many health topics,” said Sara Redd, director of research translation at RISE and an assistant professor at Emory University. “One of (our) main goals through our communications and through our social media is to promote scientifically accurate evidence-based information about reproductive health care, including abortion.”
Laura Edelson, assistant professor of computer science at Northeastern University, said that at the end of the day, while people like to debate platforms' policies and what those policies should be, what matters is people's “experiences of sharing information and the information (they) are able to get and they’re able to see.”
“This is just a policy that is not being implemented well. And that, in and of itself, is not all that surprising because we know that Meta has dramatically reduced spending on content moderation efforts,” Edelson said. “There are fewer people who are spending time maintaining automated models. And so content that is even vaguely close to borderline is at risk of being taken down.”
