The first time my Instagram account, @soaking_wet_angel, was disabled was in March of 2021. Like other meme admins, I make memes as well as curate and repost images I find elsewhere on the web. We all face the constant threat of having our accounts disabled.

Originally, two of us ran the account, but as you’ll see, it was surprisingly high-stress for something seemingly so simple. My friend is on an indefinite break from the page, but posted as “cake admin” during her tenure. I posted under “arab admin,” and that’s the pseudonym I still use on @soaking_wet_angel_2, @soaking_wet_angel_3, and finally @soaking_wet_angel_4.

My accounts get disabled a lot. My posts have been described as “irreverent, chaotic and capable of bridging zoomers and millennials during nuclear war.” I think that’s a fair assessment.


The first time my account was disabled was for posting a satirical mockup of Coachella’s lineup poster that featured other meme pages instead of the usual musical acts. Different versions of it were posted repeatedly across Instagram and most accounts were left untouched, but mine was disabled for over a month for “solicitation.” I’m not sure what I was supposedly soliciting, but nevertheless, the account I had built up to 21,000 followers in a year was inexplicably derailed, and my dreams of starting a merch line were, too.

Disabled accounts on Instagram are essentially in a limbo period that may never end. If your account gets disabled, you can appeal it, but oftentimes those appeals go unanswered. If Instagram decides to delete your account, you cannot get it back.

Since March, my accounts have been disabled five more times. One backup was actually deleted while I wrote this article, for an innocent image of a man kissing a baby’s head. I’m not kidding. Adiós to another 14,000 followers, I guess. No matter how many appeals I send, nothing happens. I never fully understand why my account gets disabled, but I always try to play by Instagram’s distinctly vague community guidelines. Instagram did not reply to multiple requests for comment on this story, but when they do talk to the press, they usually say some version of, “Instagram has a responsibility to keep people safe.” While that may be true, how exactly does disabling an account for posting a Coachella meme keep anyone safe?

My experience is not singular: A website called “Deleted in 2020” (NSFW) displays a large collection of images submitted by Instagram users whose accounts have been deleted due to apparent violations. After just a few seconds of scrolling, you recognize how flawed the violation system is. The images range from full nudity to neutral pictures of vases. In short, it’s all over the place and nonsensical.

If you’ve been paying attention to meme pages or other creators, you’re likely familiar with this story. Most meme pages have a “backup” account listed in their bio because of how often they’re targeted, and while it may seem silly to try to hold on to a follower base, don’t forget that many of the creators behind these pages monetize their output to some degree.

I spoke with Krister Larson, a 28-year-old tattoo artist based in Berlin who posts memes on @neurodivergent_bussy. He has had two other accounts disabled, @girl_storage and @girl_storag3, cumulatively losing 40,000 followers. Larson said the deletions have affected his real-life business. He shares his tattoo work on his meme accounts, encouraging followers to engage with it, and they do.

“Luckily, my tattoo account hasn’t been deleted, but I have heard horror stories of other tattoo artists’ accounts being shut down for images of their clients’ nipples being present with lots of confusion around Instagram’s ‘nudity in art’ guideline,” he said.

Like me, he’s contacted Instagram, filed appeals, requested reviews, and heard nothing.

In August, an Instagram spokesperson spoke to BuzzFeed News about the banning of the account for Julia Rose’s magazine, Shag Mag. The spokesperson claimed that Instagram is actively trying to improve its internal review system to ensure bans are issued more fairly. It’s clear this has been a problem for a while, and it’s equally clear that Instagram is acutely aware of the hardships it’s causing some of its high-profile users. However, that same month, the company announced it would enact stricter penalties for accounts that send abusive direct messages. Protecting people from online bullying and abuse is important, but these efforts are only as helpful as Instagram’s willingness to define abuse. By not divulging how they define it, they leave all accounts in the dark and therefore subject to being disabled.

Before Larson’s @girl_storag3 account was disabled, he posted a meme that said, “I am not your bestie. I am a random meme page admin that you have never met.” It received a violation for “hate speech and bullying.” The problem, once again, lies in Instagram’s vague community guidelines. What exactly constitutes bullying and what does not? If they are not defining it, anything and everything could be included. Instagram’s August announcement about abuse violations suggests the company knows this, yet it intentionally leaves the definition open.

Interestingly, despite Instagram’s strict and confusing violation algorithms, some meme accounts are left unscathed. One, @patiasfantasyworld, posted a meme earlier this month that said, “Potheads will find ANY REASON to smoke.. ‘Damn that bitch ugly, let me roll up.'” I posted that same meme a month ago and it was taken down for “harassment and bullying.” I even tried to appeal that violation but it was denied. However, the meme on their account remained up. An admin from @patiasfantasyworld did not respond to my requests for comment.

I also spoke with Simon Jackson, the Montreal-based curator of @Our.Community.Guidelines. Simon agreed with Larson, saying, “Instagram’s guidelines are obviously written to protect their business interests at the expense of users. The guidelines are absurd, as are their interpretation and application. I named my account @our.community.guidelines to highlight how ridiculous it is to try to get corporate lawyers to interpret and restrict visual symbolism.”

While some creators struggle financially and depend on Instagram referrals for art sales, merch, or other money-making endeavors, never forget that, as Simon pointed out, Instagram is worth over $100 billion.

The effects of constant deletion aren’t only financial. Instagram is a social media app. Simon said, “I depend on Instagram for access to a lot of my friends, conversations, opportunities, and self-expression, making the dependence seem even more sinister… If they own our friendships, they have incredible leverage over users when they give their advertisers ready access to our data and wallets.”

Until Instagram clearly defines its community guidelines and explains why some users are penalized for content that others post without repercussions, meme admins should consider exploring other applications. A mass meme exodus might be the only way to get Instagram to take note of its users’ long-standing grievances.

Samantha Nazzi is a meme administrator based in Brooklyn, New York. You can follow her on Instagram @soaking_wet_angel_3, @soaking_wet_angel_4, and @thank_u_for_shopping.