Technology

Do your kids spend a lot of time on Instagram? Meta wants to help

Meta announced Thursday that Instagram will start sending “nightly reminders” to teen users who spend more than 10 minutes on the app late at night, part of a new effort by the company to help young people manage their social media use amid pressure to impose greater restrictions and parental controls.

The notifications will read: “Time to take a break? It’s getting late. Consider closing Instagram for the evening,” according to the company’s post.

Meta told tech website TechCrunch that teen users won’t be able to turn off the notifications, which begin after 10 p.m., though they can choose to ignore them.

Meta said in its announcement that it is introducing the reminders because “sleep is important, especially for young people.”

Instagram requires users to be at least 13 years old and verifies age by asking users to provide a date of birth, a photo ID or a video selfie.

This isn’t the first change Meta has made this year to the content teens see. On January 9, the company announced that it would begin “hiding more types of content” from teens on Instagram and Facebook and would automatically place teen accounts under the strictest content-control settings on both platforms.

Meta cited posts discussing struggles with self-harm as an example of content that may be hidden from teens going forward. The company said that while such discussion is important and the content can help combat stigma, “this is a complex topic and may not be appropriate for all young people.”

An October Pew Research Center survey found that 81% of adults favor requiring parental consent for minors to use social media, while only 46% of teenagers supported such a requirement.

DON’T MISS: Meta announces it will restrict more content for teens on Instagram and Facebook

Key context

In 2021, The Wall Street Journal published an investigation that found Facebook knew Instagram was “harmful” to teen girls, as internal researchers who “did a deep dive into teen mental health” found the app worsened body image issues for one in three teen girls.

The researchers also found that “teenagers blame Instagram for rising rates of anxiety and depression,” according to the newspaper.

The investigation led several state attorneys general to open probes into Meta and brought renewed attention across the country to social media’s impact on young people. In response, the company denied that Facebook conducts research and then ignores the results “if they are inconvenient for the company,” describing The Wall Street Journal’s investigation as a “mischaracterization of our work.”

The nationwide scrutiny isn’t focused solely on Meta, however: in March 2022, several state attorneys general announced an investigation into TikTok’s potential harm to teenagers. Several states have since considered banning TikTok over its potential impact on teenagers and related safety concerns, and Montana even passed such a ban, though it was later ruled unconstitutional.

In response to growing concerns, social media companies have reiterated their commitment to keeping apps safe for young users and rolled out a slew of new features.

In November, Meta launched “monitoring tools” on Facebook, Instagram, Messenger and Horizon Worlds that let parents see how much time their children spend on apps and manage how much time they are allowed online.

TikTok, for its part, launched its own “notifications” feature last March aimed at limiting how much time teens spend on the app. According to TechCrunch, the feature sets an automatic screen-time limit of 60 minutes per day for accounts belonging to users under 18 and requires a passcode to keep using the app after the time is up.

This article was originally published in Forbes USA
