
Instagram, Facing Pressure Over Child Safety Online, Unveils Sweeping Changes

The app, which is popular with teenagers, introduced new settings and features aimed at addressing inappropriate online contact and content, and improving sleep for users under 18.

NY Times

Instagram unveiled a sweeping overhaul on Tuesday to beef up privacy and limit social media’s intrusive effects for users who are younger than 18, as the app faces intensifying pressure over children’s safety online.

Instagram said the accounts of users younger than 18 will be made private by default in the coming weeks, which means that only followers approved by an account-holder may see their posts. The app, owned by Meta, also plans to stop notifications to minors from 10 p.m. to 7 a.m. to promote sleep. In addition, Instagram will introduce more supervision tools for adults, including a feature that allows parents to see the accounts that their teenager recently messaged.

Adam Mosseri, the head of Instagram, said the new settings and features were intended to address parents’ top concerns about their children online, including inappropriate contact, inappropriate content and too much screen time.

“We decided to focus on what parents think because they know better what’s appropriate for their children than any tech company, any private company, any senator or policymaker or staffer or regulator,” he said in an interview. Instagram’s new effort, called “Teen Accounts,” was designed to “essentially default” minors into age-appropriate experiences on the app, he said.

The changes are among the most far-reaching measures an app has undertaken to address teenagers’ use of social media, as scrutiny over young people’s experiences online has ramped up. In recent years, parents and children’s groups have warned that Instagram, TikTok, Snapchat and other apps have regularly exposed children and teenagers to bullying, pedophiles, sexual extortion and content promoting self-harm and eating disorders.

In June, Dr. Vivek Murthy, the U.S. Surgeon General, called for cigarette-like labels on social media to warn of the potential mental health risks. In July, the Senate passed bipartisan legislation called the Kids Online Safety Act to impose safety and privacy requirements for children and teenagers on social media. And some states have passed social media restrictions.

Mark Zuckerberg, Meta’s chief executive, has faced particular criticism over social media’s risks to young people. Dozens of state attorneys general have filed lawsuits against his company, accusing Meta — which also owns Facebook and WhatsApp — of knowingly hooking children on its apps while playing down the risks. At a Congressional hearing on child online safety in January, lawmakers urged Mr. Zuckerberg to apologize to families whose children had killed themselves after social media abuse.

“I’m sorry for everything you have all been through,” Mr. Zuckerberg told the families at the hearing.

How effective Instagram’s new changes will be is unclear. Meta has promised to protect minors from inappropriate contact and content since at least 2007, when state attorneys general warned that Facebook was rife with sexually explicit content and had enabled adults to solicit teenagers. Since then, Meta has introduced tools, features and settings to foster youth well-being on its social networks — with varying degrees of success.

In 2021, for instance, Instagram announced that it would make new accounts opened by those who indicated they were younger than 16 private by default. At the time, the app allowed younger teenagers to simply switch the default to public accounts.

This time, 16-year-olds and 17-year-olds will be able to opt out of the default privacy settings by themselves. But Instagram said users younger than 16 will now need a parent’s permission to make their accounts publicly viewable.

Dr. Megan Moreno, a pediatrics professor at the University of Wisconsin School of Medicine who studies adolescents and problematic social media use, said Instagram’s new youth default settings were “significant.”

“They set a higher bar for privacy and confidentiality — and they take some of the burden off the shoulders of teens and their parents,” she said.

Yet the changes do not directly address a glaring problem: young people who lie about their age when they register for Instagram. The new settings and features are set to automatically kick in for account holders who self-identify as minors. And while Instagram’s terms of service prohibit children under 13 from using the app, “Teen Accounts” is not designed to search for and remove underage users.

Instagram said it removes underage accounts when it learns of them. It said it would require teenagers to verify their ages if they tried to circumvent the new privacy defaults by creating new accounts with an adult birth date. The company is also working on technology to allow it to proactively find teenagers who have set up accounts posing as adults.

Several children’s groups said Instagram’s announcement, which came as Congress was poised to take up children’s online safety legislation on Wednesday, seemed to be an attempt to ward off new federal protections for young people online.

“These are long overdue features that Instagram should have put in place years ago to keep young people safe online,” said Jim Steyer, the chief executive of Common Sense Media, a children’s advocacy and media ratings group. “They’re only acting now because they’re under pressure from lawmakers, advocates and a groundswell of public opinion.”

While the overhaul may be well received by parents, some teenagers — who are an important part of Instagram’s user base — may be less pleased. Teenage influencers who keep their accounts public to gain new followers could balk at the changes. Nearly half of U.S. teenagers ages 13 to 17 use Instagram at least once a day, according to a survey last fall by the Pew Research Center, making it the fourth most popular social network among young people in America, after YouTube, TikTok and Snapchat.

The safety moves could hurt Meta’s business in the short term, since the company needs new users to grow and young users to remain relevant. But by making these changes now, Instagram is also attempting to court the next generation of young people to use social media while trying to reduce the risks they can face online.

Mr. Mosseri acknowledged that the new safety measures could affect Meta’s bottom line and popularity among teenagers.

“It’s definitely going to hurt teen growth and teen engagement, and there’s lots of risk,” he said. “But fundamentally, I want us to be willing to take risks, to move us forward and to make progress.”

Other social media apps have also made changes for younger users. In 2021, TikTok made accounts private by default for users ages 13 to 15. It also disabled direct messages for those younger teenagers.

Instagram’s latest settings and features will begin rolling out Tuesday, with new accounts registered by people who identify themselves as minors automatically being put into private mode. The app said it would also soon begin switching existing accounts of minors in the United States, Canada, Australia and Britain to private.

Meta said it would continue to prevent teenagers on Instagram from sending direct messages to people they do not already follow. The company said it would also show them less content in the main Instagram feed from people they do not follow and prevent them from being tagged by accounts they are not connected with.

The new options give parents who oversee their teenagers’ accounts more insight into how their children use the app, Instagram said. That includes a feature enabling a parent to see the topics of posts their child has chosen to see more of, as well as the accounts of the people their child recently messaged. To protect user privacy, though, parents will not be able to view the content of their children’s messages.

While parents might use the information to start important conversations with their children, experts said the feature could also create tensions for vulnerable teenagers whose politics or gender identities may be at odds with their parents’ views.

Dr. Moreno, who is also the medical co-director of the American Academy of Pediatrics’ Center of Excellence on Social Media and Youth Mental Health, said she was looking forward to seeing teenagers’ reactions to Instagram’s changes. Many young people might be relieved that their accounts are being made private, she noted, while others may find getting a parent’s permission to change default settings too burdensome.

“Their voices will be really important in determining how meaningful these changes are,” she said.

Mr. Mosseri said developing the new features was tricky because the company had to balance safety concerns with personal privacy.

“The thing for me about this whole world of safety online and well-being and social media is that there are trade-offs,” he said. “We think we’ve found a decent balance. But I’m sure we’re going to get a bunch of feedback.”

