Youth protection
Not a Hotdog: How Instagram wants to protect users from nude pictures in direct messages

Instagram now warns when receiving nude pictures – and when sending them

Sending nude pictures is still widespread – even among young people. On Instagram, they are now to receive special protection from the risks involved. But all other users benefit from the new feature as well.

The pictures are sent quickly – and often have dramatic consequences. It’s not just Instagram that has a problem with nude pictures, which are frequently sent by users who are still very young. Now the photo network wants to protect its users from this – in several ways.

The network announced the change in a blog post: if nude images are detected in direct messages, minors receive a warning both as sender and as recipient. In addition, received images are initially hidden and only displayed at the user’s explicit request.

Instagram: Protection against nude images – for both sides

The latter in particular will be welcomed by many users. After all, unwanted penis photos, so-called dick pics, are part of everyday social media life, especially for female users. Instead of being confronted with the image directly, recipients can now simply ignore it – or choose to display it.

Instagram’s main concern, however, is the first aspect: the careless sending of such pictures. According to the blog post, young people are often talked into sending nude photos – and are then blackmailed with them. This doesn’t just happen among acquaintances: organized gangs are said to professionally monetize the victims’ naivety.

The new warning is intended to put a stop to this, says the company, which belongs to Facebook parent Meta. It reminds users of the possible consequences of sending a nude picture and stresses that they should not let themselves be pressured into doing so. Anyone who still wants to send the image can do so.

AI-powered

To recognize nude images, Instagram relies on artificial intelligence. Detection takes place directly on the device, the blog post emphasizes, which means it also works in encrypted chats.
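How such an on-device check might work can be sketched roughly. The following Python snippet is purely illustrative – Instagram has not published its detector, so the function names, the threshold and the naive skin-tone heuristic standing in for the real trained model are all assumptions of this sketch:

```python
# Illustrative sketch of an on-device flow: score an incoming image locally,
# then blur it until the recipient explicitly asks to see it.
# NOT Instagram's actual detector; the real system uses a trained ML model.
from PIL import Image, ImageFilter

NUDITY_THRESHOLD = 0.35  # assumed cutoff, not a documented value


def toy_nudity_score(img: Image.Image) -> float:
    """Toy stand-in for the classifier: fraction of skin-toned pixels,
    based on a classic RGB rule. A real detector would be a neural network."""
    pixels = list(img.convert("RGB").getdata())
    skin = sum(
        1
        for r, g, b in pixels
        if r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15
    )
    return skin / len(pixels)


def prepare_incoming_dm(path: str) -> Image.Image:
    """Runs entirely on the device: the image never has to be
    inspected by a server, so encrypted chats stay encrypted."""
    img = Image.open(path).convert("RGB")
    if toy_nudity_score(img) >= NUDITY_THRESHOLD:
        # Hide the content; only a deliberate tap reveals the original.
        return img.filter(ImageFilter.GaussianBlur(radius=40))
    return img
```

Because the scoring happens locally before the image is shown, no server ever needs to look at the picture – which is why the approach remains compatible with encryption.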

The protective measure is to be activated automatically for all accounts of under-18s and will also be available to all other users as an opt-in setting. However, the company has not yet given an exact timetable for the feature.

Protective measure – for Instagram itself, too

Beyond nude-image detection, Instagram announced several other protective functions for minors’ accounts. The aim is to limit contact options for accounts the young people are not already connected to. The network also tries to proactively identify scammers and sexual extortionists based on their behavior and, where there is suspicion, warns against contact with those accounts. If the suspicion hardens, the accounts can be removed from the platform entirely.

One aspect of the protective measures is likely to be the platform’s self-protection as well: sending nude pictures of minors is illegal in many countries – even when the minors send pictures of themselves. Instagram therefore has its own interest in reducing the number of such images as far as possible.

However, this group is no longer particularly large on Instagram. Although the network has a relatively young audience compared to Facebook and admits users aged 13 and over, the share of underage users is only just under eight percent. For comparison: at TikTok, a competitor particularly popular with young people, underage users make up 25 percent.

Sources: Instagram, New York Times, Hootsuite
