Media column: Off button within reach


A while ago, a nice little game made the rounds on Twitter: users were asked to type the words “It’s the left’s fault that …” and then let their smartphone’s automatic word suggestions complete the sentence. The results were, on the one hand, Dadaist garbage and, on the other, astonishingly clairvoyant summaries of domestic political discourse.

For a few days there was much amusement and giggling, but what got lost in the fun was that the words the software suggests are also drawn from the user’s own frequency of use. The autocomplete miniatures are far less a clever commentary on the zeitgeist than the opposite: they say more about the user than about society. What emerges is not an objective, if arbitrary, statement about the state of the left or the German political landscape, but a record of who communicates in a meandering or a determined manner.

“Tiktok knows me better than most of my friends.”

The moment when people feel recognized and seen by learning machines has long been the subject of self-help columns about modern life. Journalist Kaitlyn Tiffany recently wrote in the magazine The Atlantic that she is afraid of who Tiktok thinks she is, because the content it suggests to her is “totally repulsive”: shallow, sexist, laden with questionable worldviews, at best good for a quick laugh.

Tiffany’s unease stems mainly from the fact that the short-video platform’s recommendation algorithms are notorious for being extremely precise and for deciphering users’ preferences in very little time. Since spending so much time with the app during the pandemic, writes Australian author Louis Hanson, he has become convinced that “Tiktok knows me better than most of my friends”.

The stock phrase of technology criticism is that platforms amplify whatever conditions in their audiences are profitable for them: they give shape and direction to people’s unspoken dissatisfaction, boredom and loneliness without actually resolving any of it. But perhaps that assessment is far too deterministic. Once an algorithmic feed comes into play, the content appears the way it does for one reason only: the user.

Is it really that easy to see through?

What does it mean when it feels as if computer systems and algorithms know more about users than they know about themselves? Self-doubt grows loud. Are you really that simple and easy to see through? Are your own interests and sensitivities really that generic? Doesn’t all this fear of the self being decoded reveal a strangely simple, externally determined image of the human being? And what actually prevents users from evading the portals’ behaviorist mechanisms? The off button is still within reach.

You could, of course, turn the tables and treat the algorithms’ calculations as a distorting mirror, as a dark digital twin worth confronting. Take the opportunity to question yourself. Do you go online to be part of society or to complain about it? Do you stand by all the needs you act out on the net, lying alone on the couch in the dark? Or have you had the impression for far too long that you could be a better version of yourself? “Whoever stares into Tiktok, Tiktok stares back into them,” as tech analyst Eugene Wei once nicely put it. The only question is whether you like what you see there.
