Instagram Now Lets You Control Your Bully's Comments

Adam Mosseri, head of Instagram, speaks about the social media platform's anti-bullying efforts at the F8 developers conference in San Jose, Calif., on April 30.
Bloomberg via Getty Images

Blocking a bully on social media doesn't always bring an end to online abuse. And in some cases, it can make face-to-face interactions with the bully even worse.

That's what Adam Mosseri, the head of Instagram, said he found out when he started talking to teenagers about their experiences with bullying on the platform.

"Most of it seems to happen between people who know each other in real life ... and teenagers are often reluctant to report or block their peers who bully them online," Mosseri said in an interview last month with NPR's Audie Cornish. "The controls that we had before were insufficient."

That's why this week Instagram announced a new anti-bullying feature called Restrict.

Teenagers told Mosseri and other Instagram researchers that they often didn't block bullies on the platform for two reasons. First, blocking a bully — something that the bully is aware of — can actually escalate the situation and result in more abuse on the platform or elsewhere.

Here's the other reason: When you block a bully, you become invisible to that person, but you also give up your ability to see what the bully is doing. To counter abuse, you often have to know what is happening.

Instagram says Restrict addresses these concerns by taking a more nuanced approach. If someone is bullying you on the platform — posting mean comments on your photos or sending you offensive messages — the new feature allows you to restrict the person's actions.

Once you've restricted a user, comments on your posts from that person require your approval. You can see the comment and so can the bully, but unless you choose to release it, no one else can.

Messages from the restricted user will be sent to a separate inbox, and you can choose whether or not you want to read them. If you do, the bully won't receive a read receipt.

To restrict someone, you can swipe left on the person's comment, use the privacy tab in settings or go directly to the profile of the account you want to restrict.

Once you've restricted an Instagram user, comments on your posts from that person require your approval. You can see the comment and so can the bully, but unless you choose to release it, no one else can.
Instagram

"You need, as a target of bullying, to see what the actor is actually doing," Mosseri said. "[Restrict] gives a target of bullying a bit more power over the experience."

A lot of bullying happens on Instagram, and Mosseri, who started as head of the platform a little over a year ago, said he is committed to changing that.

"What we aspire to do — and this will take years, I want to be clear — is to lead the fight against online bullying," Mosseri said at Facebook's annual F8 developers conference last April.

In an interview with NPR's Cornish, Mosseri discussed being bullied as a child.

"I had like Coke-bottle Corbusier glasses at 5 years old that made my eyes as big as ... lemons. I had a haircut that made me look like Harry Potter long before Harry Potter existed or was cool," said Mosseri. "It was not a good look. I was made fun of a lot. I probably would have been made fun of on Instagram."

Instagram began publicly testing Restrict over the summer, offering the feature to a small percentage of English-speaking users. During the test period, Instagram found that Restrict significantly reduced the number of unwanted interactions and experiences of bullying. For the most part, users restricted people whom they also knew outside the platform.

Based on these findings, Instagram decided to roll out the feature to all users globally.

Fifty-nine percent of American teens have been bullied or harassed online, according to a 2018 survey by the Pew Research Center. Instagram is one of the most popular social media networks among teenagers and a likely place for teens to be bullied.

A recent survey by the investment bank Piper Jaffray found Instagram to be the second most popular social media platform among teenagers: 35% of teens surveyed said Instagram is their favorite platform, compared with 41% who preferred Snapchat.

But another Piper Jaffray study, from 2018, found that Instagram had a slight edge over Snapchat in monthly usage: 85% of teens surveyed said they used Instagram at least once a month, while 84% said the same of Snapchat.

Instagram has been criticized for providing a unique set of tools that enable bullying. It's easy to set up anonymous profiles that can then be used to troll others. The scale of the platform allows hurtful comments or harassing posts to go viral. And while parents and teachers may be able to observe and stop bullying that happens face to face, online bullying is often hidden.

In addition to Restrict, Instagram has taken other steps to prevent bullying and make the platform a safer and more welcoming place.

It has banned content that shows self-harm and has made an effort to provide resources for people actively searching for that content. There are also new restrictions on posts related to diet products and cosmetic surgery.

For the past few years, Instagram has been using artificial intelligence to detect bullying and other types of harmful content in comments, photos and videos.

Over the summer, it started rolling out a new feature that notifies people if the comment they are about to post may be considered mean-spirited.

Instagram says that this gives users a chance to "reflect and undo their comment," and it found in early tests that some people were encouraged to edit their comments to make them less hurtful.

Over the summer, Instagram started rolling out a new feature that notifies people if the comment they are about to post may be considered mean-spirited.
Instagram

Instagram hasn't released data on this, but Mosseri said that a "significant minority of people" rewrite their comments and, when they do, rewrite them in a "much more pleasant way."

"I think [flagging comments] is going to be somewhat effective but not very effective," said Mosseri. "I actually think that's true of all of the work that we're going to do. ... But sometimes people get caught up in the moment, and a lightweight reminder can actually help them rethink what they're doing or what they're saying. And so it's not supposed to be the solution that we hang our hat on for all of bullying. It's supposed to be one of many tools to try to prevent bullying from happening in the first place."

Along with Facebook and YouTube, Instagram has also been experimenting with hiding "like" counts in an effort to reduce the negative pressure that comes from constantly striving for social media popularity.

In these tests, users could still see how many likes their post had received, but the count was hidden from everyone else.

In July, Instagram rolled out the feature to test groups in seven countries — Australia, Brazil, Canada, Ireland, Italy, Japan and New Zealand. It was met with a mixed response, especially from influencers who rely on Instagram to make money and often use like counts as a tool for tracking their value on the platform.

Mosseri said that while the decision to hide the counts could be bad for Instagram in the short term — leading influencers to use other platforms instead — he thinks the long-term benefits would be worth it.

"If we make decisions that are bad for business but that keep people safe and are good for well-being more broadly, I have to believe that those are going to be good for the business over the long run," Mosseri said. "The idea with making like counts private is to try and depressurize the experience a bit."

The feature is still being tested, and there is no guarantee that it will be rolled out to all users. Before that could happen, Mosseri said, Instagram has to work through some challenges, like finding a new way to measure how relevant influencers are. But Mosseri said he's "bullish" when it comes to eliminating public like counts from the platform.

He said that he has heard directly from influencers on this, and the response has been mixed. While some are concerned about the potential change, others have called to thank him.

"I've actually had a lot of actors, musicians reach out to me and say ... 'What can I do to help?' Because they also talk about experiencing some of the same feelings that the rest of us feel on the platform," Mosseri said. "The platform can make them feel anxious, make them feel like it's a popularity contest that they can't win."

Before coming to Instagram, Mosseri spent 10 years at Facebook, now Instagram's parent. He helped create Facebook's News Feed, which would later be used to spread misinformation during the 2016 presidential election season. He said this experience has taught him to approach his work with caution.

"Technology is not inherently good, and it's not inherently bad," Mosseri said. "For those of us who work in the industry, it's our responsibility to magnify the good and address the bad as effectively as we can."

He said that in the early days of Facebook, they were so excited and optimistic about the "value that comes from connecting people" that they were "underfocused and somewhat naive about the negative consequences of connecting people at scale."

At Instagram, he said, he's committed to thinking about not just the good that can come from something but "how the idea might be abused."

And when a platform or a particular feature is being misused — whether that be by high school bullies or Russian trolls — Mosseri said it's the developer's responsibility to fix the problem.

"I think one of the big experiences that the industry as a whole ... has had over the last few years is to really become more aware of the negative consequences of what we do and to try and proactively embrace that responsibility and address those issues," Mosseri said.

Editor's note: Facebook is among NPR's financial sponsors.

Copyright 2023 NPR. To see more, visit https://www.npr.org.

Aubri Juhasz is a news assistant for NPR's All Things Considered.