Close Friends isn't as close as you may think
You may think the posts on your Close Friends Story on Instagram are just for your closest friends. But it’s being monitored, so post carefully.
In a world in which the Venn diagram of our physical and digital selves approaches a circle, it’s important to feel in charge of how we separate those identities. You can give most people access to your hiking photos while reserving the humiliating act of flirting online, or your true feelings about Glee, for a select group. One of my friends only posts about her yoga classes on her Close Friends Story — an attempt to go against Taylor Swift’s graduation advice and flee potential cringe.
Another uses her Close Friends Story as a pseudo OnlyFans, collecting money each month in return for access to her Stories, which include some particularly scandalous shots unfit for her typical audience. Megan Rudy, a 26-year-old in Arizona, uses her Close Friends Story, and the story on her finsta, as a way to resist the boredom of sending nudes to her boyfriend of two years. And Instagram sees all of it.
“There’s only so much of him being like, ‘wow,'” Rudy told Mashable. “And I’m like, ‘I worked really hard to take this picture. I did my makeup. I’m looking great. And I love you so much, but your one fire emoji is not doing it for me.’ I need more validation that this is a good pic.”
Instead, she posts them — with her nipples or labia censored in order to comply with Instagram’s nudity guidelines — on her Instagram Close Friends Stories or on her finsta’s Stories. She sold some pictures of her feet on Instagram once, but, typically, she just posts artful semi-nudes on her finsta’s Stories.
“I like the attention,” Rudy said. “I like my friends being like, ‘oh cute.’ Like I guess I just got tired of sending nudes to the same person for two years in a row.”
Rudy isn’t alone in posting nudes on her Close Friends Stories — plenty of users do it, not only for the attention, but also as a way to flirt, plug their work on another platform like OnlyFans, or even to make money. But as Close Friends Stories become our flirty, NSFW finstas, it’s worth taking a look at how Instagram regulates content there. Because we aren’t just getting the attention of our Close Friends — our content is also getting clocked by Instagram.
Who can see your Close Friends Story
The nice part about a finsta, or a Close Friends Story, Rudy says, is that you can control who sees what you post. Despite wanting a larger audience to catch a glimpse of her tasteful nudes, she doesn’t want just anyone to see them.
“For my regular Instagram Stories, I’m like, ‘oh, we’re out at happy hour;’ ‘Oh, I’m sharing this giveaway. Maybe I’ll win a free charcuterie board;’ ‘Anyway, here’s my boob,'” Rudy joked. “The vibe on [my finsta Stories] is just slightly more chaotic. It’s like, ‘I’m mentally ill. Here are my nudes.'”
So she’s curated a highly specific group of viewers for her nudes. Users tend to assume Close Friends works just that way: the audience is distilled down to the people we deem our closest friends, so it’s unlikely they’ll report your photos for going against the platform’s guidelines if you accidentally slip a nip. But it’s being watched. The people behind the scenes at Instagram can still see everything you post. And all the rules still apply.
Stephanie Otway, a Meta spokesperson, told Mashable over email that the company relies “on both proactive technology and reports from the community to find and remove content that contains nudity.” Otway added that in the last quarter, Meta “proactively found 94.3 percent of content containing nudity on Instagram before it was reported to us.”
“Our proactive technology detects violating content across Instagram, including in Close Friends,” Otway said.
How content is censored
Instagram will take your content down if you’re violating its rules — whether you’re violating them on your public Instagram account, your finsta, or your Close Friends Story. But you’re also at risk of having your photos reported, censored, or taken down even if you aren’t technically doing anything wrong.
“Obviously not all bodies are treated equal on Instagram,” Dr. Carolina Are, an innovation fellow at the Centre for Digital Citizens at Northumbria University, told Mashable. “There have been stories that show that algorithms pick up, for instance, Black plus-size bodies more than white skinny bodies.”
Not all bodies are censored consistently. In October 2020, Instagram changed its nudity policy on breast squeezing to combat the censorship of plus-size Black women on the platform. But as recently as March 2021, fat bodies were still being unfairly censored: a Toronto woman, surnamed Konstantopoulos, told Vice that Instagram censored her posts and shut down her account because she is fat. She said she paid an Instagram insider $2,500 to get it restored.
“The big issue that I’m really speaking up about is you see all these thin women that are posting pretty much full nudity and their posts are staying up,” Konstantopoulos told Vice. “Anyone that happens to be in a marginalized body, whether that be fat, (BIPOC) etc., we’re being censored, we’re being removed and ultimately silenced on this platform.”
Should we be censored at all?
Moderation is an important part of keeping social media safe for all users. When it works, it stops disinformation from spreading and keeps hate speech at bay.
But Are, who has a PhD in the moderation of online abuse and conspiracy theories and is also a pole dancer and instructor, argues that moderating content that can fuel hate and censoring human bodies aren’t comparable at all — and she’s encouraging the platform to stop censoring nude bodies. Last year, she started a petition urging Instagram to change its nudity rules after Instagram made it against the rules to link to OnlyFans or any other site that allows users to solicit explicit content.
“[Instagram seems] to incredibly threaten the lives and livelihoods of people like us, of people who worked through their bodies or express themselves through their bodies on social media,” Are told Mashable.
It makes Instagram a frustrating — and dangerous — place to exist for some users.
“Users who post bodies have to exist within an ecosystem where their bodies are already taboo and they’re already dangerous,” Are said. “So it’s an extremely shaky place to be in, because your livelihood, your income, your network, your source of self expression, and your source of information can be deleted at the flick of a switch.”
People who use Instagram for their job are left flailing, their livelihood subject to the whims of a platform that doesn’t seem to care much about its users at all. Meta’s own internal research, surfaced by the Wall Street Journal’s Facebook Files, showed that Instagram users are at risk of increased body image issues, social comparison, loneliness, stress, and depression.
In theory, many of us use Instagram for fun. But as our world relies more and more on our digital footprints, that’s becoming less true. Instead, we’re stuck with platforms like Instagram for socializing, for commerce, and sometimes even for work. Meanwhile, the companies own what we produce — our data — without giving users even a seat at the table.
It’s only natural to want a respite from the inherently overwhelming experience of being on Instagram. Just know that if that respite is a cute nude posted to Close Friends, the platform itself is watching, too.