Meta expands 'Teen Accounts' to Facebook, Messenger amid children's online safety regulatory push
Meta Platforms is rolling out its “Teen Accounts” feature to Facebook and Messenger on Tuesday as it faces sustained criticism that it is not doing enough to protect young users from online harms.
The enhanced privacy and parental controls, which were introduced on Instagram last year, will address concerns about how teens are spending their time on social media, the company said.
WHY IT'S IMPORTANT
Meta's expansion of safety features for teens comes as some US legislators say they plan to press ahead with proposed legislation, such as the Kids Online Safety Act (Kosa), seeking to protect children from social media harms.
Meta, ByteDance's TikTok and Google's YouTube already face hundreds of lawsuits filed on behalf of children and school districts over the addictive nature of social media.
In 2023, 33 US states including California and New York sued the company for misleading the public about the dangers of its platforms.
CONTEXT
Meta said teens under 16 will need parental permission before they can go live or disable a feature that automatically blurs images suspected of containing nudity in direct messages.
“We will start including these updates in the next few months,” the company said.
In July 2024, the US Senate advanced two online safety bills — Kosa and the Children and Teens' Online Privacy Protection Act — that would force social media companies to take responsibility for how their platforms affect children and teens.
The Republican-led House declined to bring Kosa up for a vote last year, but lawmakers suggested at a committee hearing late last month that they still plan to press ahead with new laws to protect children online.
Top platforms, including Facebook, Instagram and TikTok, allow users 13 years of age and above to sign up.
Reuters