Network column: Aesthetics of Censorship – Culture

Carolina Are is a cultural researcher and a pole dancer. She does both so well that she teaches them. But while her academic work goes unchallenged, her dance videos keep running into trouble on Instagram and TikTok, most likely because she usually performs lightly dressed. Reason enough for the platforms to block the content.

The project The Unseen Ares is now showing her images alongside those of hundreds of other people whose accounts and content have been banned for similarly trivial-seeming reasons. The aim is “to show the extent and the human consequences of unfair censorship practices and to advance the discussion about solutions in such a way that the voices of those affected are heard,” according to the website. The images that triggered the bans can also be seen. Could a kind of aesthetics of the ban be derived from the blocked material?

You can see a lot of naked bodies, sometimes photographed, sometimes merely drawn, but also plenty of other motifs that, even at second glance, do not violate the platforms’ prudish criteria: crying children, black-and-white portraits of wrinkled old men, not exactly subtle tampon sculptures, or computer animations of vaguely humanoid aliens on a tanning bed. You would have to be very delicate indeed to take offense.

The reasons for blocking content are often not communicated at all, or only insufficiently

In addition to an army of at least 15,000 underpaid and overworked moderators who review and block conspicuous content, automated AI systems bear most of the responsibility for the censorship. The puritanism instilled in the review algorithms, and the simple triggers that set them off, would be almost ridiculous were it not for the consequences: the sanitized visual landscape familiar from Instagram. The bans disproportionately affect people who are not white, not slim, or who otherwise do not conform to mainstream ideals of beauty.

The project also features interviews with users affected by the censorship, among them many full-time photographers as well as activists and private individuals. They speak of existential fears over lost client commissions, and of how it feels when years of work are undone by a computer’s decision.

The reasons for blocking content are often not communicated at all, or only inadequately. Users have no choice but to piece together the motives and mechanisms themselves. Many report, for example, that their posted images are affected by a so-called “shadowban”: content that the automated algorithms find ambiguous is not explicitly blocked, but its public visibility is reduced. The company has never admitted using this feature. Users can nevertheless see in their statistics that the content in question performs worse than it should.

The problem is not new. The community guidelines of the Instagram parent company Meta that deal with publicly displayed nudity now fill entire pages. In the past there have been numerous hashtag campaigns, such as “Free The Nipple” or “Don’t Delete My Body”, that rebelled against arbitrary censorship. But public pledges by Instagram boss Adam Mosseri to address the problem of algorithmic bias have so far had no major consequences.

The company guidelines state as their top priority that Instagram should “continue to be a safe place for inspiration and expression.” The word “authentic”, which appeared there until a while ago, has wisely been deleted. And the rest does not seem to apply to everyone either.
