
Abuse of teenagers
Sexual images and videos: distinguishing young people from adults poses problems for Facebook

Moderators who review images for Facebook don’t have an easy job. (symbolic image)

© Dima Berlin / Getty Images

Facebook reported 27 million sexual images of minors to the authorities last year. But it is not always clear how old the people in the pictures are. Decision-makers quickly find themselves in a dilemma.

When it comes to possible sexual images of minors, no website reports as many suspected cases to US authorities as Facebook. Every day, thousands of moderators review pictures and videos, looking for violations of the platform’s rules or of applicable law. The decisions these reviewers have to make are sometimes extremely difficult, because the age of the people depicted is not always clear. New guidelines are meant to help: as the New York Times reports, moderators are instructed to assume, when in doubt, that the people shown are of legal age.

For Facebook’s parent company Meta, this apparently amounts to a choice between two evils. As Meta’s head of safety, Antigone Davis, told the Times, “the sexual abuse of teenagers online is despicable,” but the consequences for people falsely accused of uploading underage sexual material could be life-destroying. The guidelines are therefore intended to protect those who post legal erotic material on the platform.

Many cases go undetected

In the eyes of child protection advocates, however, this could leave a large number of minors without protection when explicit material of them is published. Studies have shown that young people develop physically faster today than in the past: puberty begins considerably earlier, particularly among Black and Hispanic adolescents, than among adolescents of European descent. Combined with the relaxed reporting rules, this leaves an entire group of young people unprotected, says Lianna McDonald, executive director of the Canadian Centre for Child Protection.

In addition, there are internal problems with reporting content, say employees who are barred from speaking publicly by non-disclosure agreements. Companies are legally obliged to report prohibited material immediately, yet they enjoy little protection from lawsuits if a report turns out to be erroneous and damages the reputation of the people concerned.

Four former employees told the New York Times how this affects everyday work: moderators reportedly faced negative performance reviews if they made too many wrong calls. As a result, when in doubt, they shied away from reporting pictures and videos whose legality they were not entirely sure of.

Overburdened authorities are no reason to look the other way

Apple, Snap and TikTok, for their part, told the New York Times that they handle the question differently: when in doubt, they report the images. But Meta argued that this cannot be the last word either, because too many reports would only worsen the real bottleneck, the overloaded authorities. “When the system is too crowded with things that aren’t useful,” Davis said, “then that’s a real drain.”




American authorities, however, side with the companies that report all suspected cases as a precaution. Relieving the authorities’ workload could never be a reason not to report cases, according to the clearinghouse that reviews reported content and alerts police when it involves illegal media. Investigators see it the same way. “Nobody should choose not to report a possible crime, especially a crime against a minor, because they think the police are too busy,” said Chuck Cohen, who led a child exploitation task force in Indiana for 14 years.

Ultimately, the companies on both sides of the question seem intent above all on protecting themselves and avoiding risk. A major problem is that US courts and laws remain vague about what criteria define suspect material and what standards companies should apply when assessing sexual images. “I couldn’t find a court that came close to answering the question of how to strike that balance,” said Paul Ohm, a former prosecutor in the Justice Department’s computer crimes division.

Source: New York Times
