Instagram Set to Roll Out Teen Accounts To Protect Kids On Its Platform

Finally, some guardrails from Instagram to protect kids. The massive social media platform is rolling out new, separate accounts specifically for users under 18, aiming to make the platform safer for young people amid increasing concerns over social media’s impact on mental health.

About time.

Get this: starting Tuesday in the U.S., U.K., Canada, and Australia, new users under 18 will be automatically placed into these more restrictive accounts, and existing teen accounts will be transitioned over the next 60 days. Similar changes will be implemented for teens in the European Union later this year.

Meta is also trying to get ahead of kids circumventing the incoming protections. The company acknowledges the possibility of teenagers lying about their age and has stated it will require more frequent age verification. It is also developing technology to identify accounts where teens might be pretending to be adults, automatically placing them into restricted teen accounts. These accounts will be private by default, and direct messages will be limited to people teens already follow. Additionally, “sensitive content” will be restricted, and teens will receive notifications if they spend more than 60 minutes on the platform. A “sleep mode” feature will disable notifications and send auto-replies between 10 p.m. and 7 a.m.

What’s more, teens aged 16 and 17 can opt out of some of these restrictions, while users under 16 will need parental permission to do so. Naomi Gleit, Meta’s head of product, stated, “The three concerns we’re hearing from parents are that their teens are seeing content that they don’t want to see or that they’re getting contacted by people they don’t want to be contacted by or that they’re spending too much time on the app. So teen accounts are really focused on addressing those three concerns.”

Timing-wise, this move comes as Meta faces lawsuits from multiple U.S. states alleging that the company has contributed to the youth mental health crisis by creating features that are addictive to children. New York Attorney General Letitia James called Meta’s actions “an important first step,” but emphasized that more needs to be done to protect children from social media harm. While Meta has introduced features like time-limit notifications in the past, these have been criticized for not being strict enough, as teens can easily bypass them without parental supervision.

Basically, with the new teen accounts, Meta is giving parents more tools to oversee their children’s activity on Instagram. If parental supervision is enabled, parents can limit their teens’ screen time and see who is messaging their child, offering them a chance to engage in conversations about online safety. Gleit added, “Parents will be able to see, via the family center, who is messaging their teen and hopefully have a conversation with their teen.”

This is a good thing no matter how you slice it.
