Meta is bringing its "Teen Accounts" to Facebook and Messenger. As on Instagram, the company will automatically move younger teens to the new accounts, which come with mandatory parental control features and restrictions on who they can message and interact with.
The company first introduced the feature on Instagram, where 54 million teens now use the more locked-down accounts. (Instagram requires teens aged 13 to 15 to use a Teen Account and has built-in tools intended to catch those who lie about their age.) Teen Accounts on Facebook and Messenger will work the same way. Teens won't be able to interact with unknown contacts or change certain privacy settings unless a parent approves the action. Parents will also be able to monitor their child's metrics and friends list.
Meta is also adding new safety features to Teen Accounts on Instagram. With the change, teens under 16 will need parental permission to start a live broadcast. The app will also prevent younger teens from turning off nudity protection – a feature that automatically blurs images in direct messages that contain "suspected nudity" – unless they get parental approval.
These may seem like obvious safeguards (they are), but they at least show Meta closing obvious gaps in its teen-focused safety features. The company has faced intense scrutiny in recent years over the effect its apps, particularly Instagram, have on teens. Dozens of states are currently suing the company over alleged harms to younger users.
This article originally appeared on Engadget.