Instagram Debuts New Safety Settings For Teenagers

Instagram is rolling out new safety settings for young users: it's making new accounts private by default for kids under 16, blocking some adults from interacting with teens on its platforms, and restricting how advertisers can target teenagers.

The changes come as the Facebook-owned photo-sharing app is under pressure from lawmakers, regulators, parents and child-safety advocates worried about the impact of social media on kids' safety, privacy and mental health.

"There is no magic switch that makes people suddenly aware of how to use internet",Karina Newton, Instagram's head of Public Policy told NPR. She said the changes announced on Tuesday are aimed at creating "Age-appropriate experiences" and helping young users navigate the social network.

"We want to keep young people safe, we want to give them safe good experiences, and we want to help teach them, as they use our platforms, to develop health and quality habits when are using internet and apps and social medias", She said.

Like other apps, Instagram bans kids under 13 from its platform to comply with federal privacy law. But critics say the law leaves older teens exposed. Some members of Congress have called for expanding privacy protections up to age 15.

Starting this week, when someone under 16 joins Instagram, their account will be made private automatically, meaning their posts will be visible only to people they allow to follow them. (The default private setting will apply to people under 18 in some countries.)

Teens who already have public Instagram accounts will see a notification about the benefits of a private account and how to switch. The company said that in testing, about 80% of young people kept the private setting when signing up.

Instagram is also taking steps to prevent what it calls "unwanted contact from adults." It says adults who, while not breaking Instagram's rules, have shown "potentially suspicious behavior," such as being blocked or reported by young people, will have limited ability to interact with and follow teens.

For example, they won't see teenagers' posts among the recommendations in Instagram's Explore and Reels sections, and Instagram won't suggest they follow teens' accounts. If these adults search for specific teens by username, they won't be able to follow them, and they will be blocked from commenting on teens' posts.

Instagram has come under fire in the past over how it handles young users. It started asking users for their birth dates only in 2019. Before then, it simply asked them to confirm that they were at least 13.

Newton says Facebook and Instagram already use artificial intelligence to scan profiles for signals suggesting whether a user is younger or older than 18. That includes looking at what people say when wishing users a happy birthday. Now the company is expanding that technology to estimate the ages of younger users.

She acknowledged that determining age is a "complex challenge" and that Instagram has to balance privacy concerns when using technology to determine users' ages.

"This is an are that isn't foolproof", she said. Facebook and Instagram are in discussions with other tech groups, including makers of browsers and smartphones operating systems, on sharing information in "Privacy-preserving-way" to help determine if the user is old enough for an online account, she said.

The changes "appear to be a step in the right direction", said Josh Golin, Executive director of the children's advocacy Nonprofit fairplay, formerly called the campaign for a Commercial -Free Childhood, which is one of the groups pushing to extend legal privacy protections for teens.

"There has been such a groundswell to do more to protect teen safety, but also around manipulative behavioural advertising ", he told NPR. "So its good that they are responding finally to that pressure".

But Golin said his group would continue pushing for tighter regulation of how tech companies can use kids' data.

"The idea that you shouldn't be allowed to use child's data in a way that's harmful to them is something that we absolutely want to see", he said. "In no other context that I'm aware of do we treat a 13-year-olds like they are adults".

As scrutiny of powerful technology companies has grown in Washington, their impact on kids has emerged as a bipartisan area of concern.

"Your platforms are my biggest fear as a parent", Rep. Cathy McMorris Rogers, told the CEOs of Facebook, Google and Twitter at a congressional hearing  in March. "Its a battle for their development, a battle for their mental health, and ultimately the battle for their safety", She said, pointing to research linking social media to depression among teens.

That hearing came shortly after Instagram sparked fresh controversy with news that it is working on a version of the app for kids under 13. The project has drawn opposition from child-safety groups, members of Congress and 44 attorneys general, who urged Facebook to scrap the idea entirely.

But Facebook CEO Mark Zuckerberg has defended the idea, saying that kids under 13 are already using Instagram, so it would be better to provide them a dedicated version.

Newton echoed that point, saying that young people are online despite age limits: "A lot of times they're using products that weren't built or designed for them."

On Tuesday, Pavni Diwanji, Facebook's vice president of youth products, wrote in a blog post that the company is continuing to develop "a new Instagram experience for teens" that would be managed by parents or guardians.

"We believe that encouraging them to use an experience that is age appropriate and managed by parents is the right path", said Diwanji.