What are these new “teen accounts” on Instagram?

Instagram is once again in the spotlight. For months, the famous social network's app has been in the crosshairs of advocacy groups and authorities, who accuse it of harming the mental health of 13-to-15-year-olds. In an attempt to respond to the criticism and protect its youngest users, the Meta group – parent company of Facebook, Instagram, WhatsApp and Messenger – announced on Tuesday the creation of "Teen Accounts". How do they work?

Safeguards in place

"This is an important update, designed to give parents peace of mind," Antigone Davis, the Californian group's vice president in charge of safety issues, told our colleagues at AFP. In practice, with the "Teen Accounts" designed by Meta, users aged 13 to 15 will now have private accounts by default, with safeguards on who can contact them and what content they can see.

Teens who want a public profile and fewer restrictions – because they want to become influencers, for example – will need to get permission from their parents. This applies whether they are already registered or new to the platform. "Adults will be able to supervise their children's activity on the social network and take action accordingly, including blocking the application," the Meta executive assures, adding: "If a teenager tries to change their date of birth, we will ask them to prove their age."

A question of age

Pressure has been mounting for a year on Meta, the world's number two in digital advertising, and its competitors. Last October, some forty American states filed a complaint against Meta's platforms, accusing them of harming the "mental and physical health of young people" through the risks of addiction, cyberbullying and eating disorders.

From Washington to Canberra, elected officials are working on bills to better protect children online. Australia is expected to soon set the minimum age for using social networks at somewhere between 14 and 16. Meta, led by Mark Zuckerberg, currently refuses to verify the age of all its users, in the name of privacy. "We don't want to force 3 billion people to provide an ID," says Antigone Davis.

“So much more to do”

It is not certain, however, that this reinforcement of existing protections will be enough to reassure governments and concerned organizations. "Instagram is addictive. The app leads children into vicious spirals, where they are shown not what they want to see, but what they cannot look away from," argues Matthew Bergman.

In 2021, the lawyer founded an organization to defend "victims of social media" in court. Among others, it represents 200 parents whose children died by suicide "after being encouraged to do so by videos recommended by Instagram or TikTok." As for Meta's "Teen Accounts"? "These are small steps in the right direction, but there is so much more to do," the lawyer believes.

In his view, it would be enough for these groups to make their platforms less addictive – "and therefore a little less profitable" – without losing what makes them valuable to users, whether for communicating or exploring interests. In June, the US surgeon general called for requiring social networks to display warnings about the dangers they pose to minors, like the prevention messages on cigarette packs.

