Until recently, @dieworkwear was just a Twitter account, run by someone called Derek Guy, which tweeted often-niche jokes about menswear and clothing more generally. Well, actually: it is still just that – but what exactly that is has changed considerably over the past few weeks. Last month, after Elon Musk made some tweaks to the algorithm that runs Twitter, everyone seemed to start getting those tweets, and complaints about the unavoidability of "menswear guy" became as unavoidable as the account itself. (It's not clear why it happened: Mr Guy told GQ that "my guess is as good as anybody else's" when asked why he was suddenly everywhere.)
Initially, I didn't see any tweets from "menswear guy". Whether that was because I had been automatically profiled as not being stylish enough, or something else, was as unclear as anything else involved in the story. But I saw the posts about him – mostly from New York media types – and eventually found his account to see what this mysterious man was talking about.
And then it happened. I couldn't escape his posts. His wry observations about fashion brands were ineluctably pouring into my "for you" feed. Simply by viewing his account, I had triggered a process that I couldn't possibly hope to reverse. I – like everyone else – was now a reader of the posts of "menswear guy" whether I liked it or not.
This is increasingly happening across the internet, as every app adjusts to become a kind of AI-powered, hyper-specific feed of things it is convinced you like. When Elon Musk introduced the "for you" feed and made it quickly responsive to anything you looked at, he was really just following TikTok, which has used that same mechanic to become the vast entertainment behemoth it is today.
There's a musician that I find hazily annoying, for example. (It's no doubt more to do with me than it is with them, so I won't mention them by name.) TikTok has nevertheless become convinced that I want to see posts about every single thing they do. I have taken whatever steps I can to ban them from my feed: muting their name, clicking "not interested" every time one of their posts appears. And it appeared to work – until recently, when a friend sent me a post that included them, and it started all over again.
The desperate attempt to teach the algorithm has sometimes driven me to extreme lengths. I recently saw a tweet suggesting that TikTok's algorithm is built to ensure you stay on the platform – and that the best way to instruct it in what you like is therefore to shut the app when it gets things wrong. Being ignored is, after all, the ultimate punishment for the attention seeker. And so I can often be found aggressively shutting the app after seeing things I don't like. In recent times I've even taken to deleting the app entirely.
I'm blessed in that I'm only irritated by this stuff. On other platforms, such as YouTube, the endless recommendations can quickly become not just annoying but dangerous. Once, when Jordan Peterson was on the rise, I clicked on a couple of videos, watched them – and quickly found my "up next" column filling up with increasingly negative videos. To the recommendation algorithm, intrigue or curiosity is the same as fandom, since you watch either way – and so the response is the same, too.
Such content can also feel psychologically damaging, when the algorithm takes you to places you never wanted to go. During the search for Nicola Bulley, it was easy to idly leave one video of amateur sleuths open, perhaps clicking on the comments to try to find some explanation of what was actually going on or whether their claims had been debunked, and then find yourself getting video after video of people grimly trudging around fields in the hope of solving what they appeared to see as a kind of very immersive mystery show.
This week, a TikTok user admitted to having tricked police so that he could hide in a field and film as officers took Bulley's body out of the river in which it was found. It was ghoulish, as everyone said, but he was no doubt encouraged to do so by an algorithm that actively promoted such posts to people – whether they wanted to see them or not. Just as it can't tell the difference between idle intrigue and committed interest, the algorithm doesn't know when a view of a video is motivated by banal prurience.
Grappling with questions of complicity in what we see is not new, of course. No doubt soon after the first image was drawn, people wondered what it meant to view it. And in recent decades the internet has only made that questioning more intense and immediate. Growing up around the early web, you would regularly be shown videos – of people dying, of other kinds of gore, and more besides – that made you feel a kind of impotent shame for having watched them at all. And I was also a part of the early use of file sharing, which often saw people told they were destroying the viability of the music and films they loved by getting them for free.
But there is something more urgent about the particular kind of complicity you have when using these kinds of feeds, if only in the sense that the consequences are instantly felt by you, the viewer. Watch something horrible, whether you choose to or not, and you won't just feel bad – you'll be subjected to more and more of it, each video you are forced to watch seemingly serving as another signal to the algorithm that you did want to see it after all, and that it should show more. It's like being pulled into quicksand.
In physics, there is the idea that observing something necessarily changes that thing: that it causes a disturbance in the system. On the internet, the same principle is true, and we can't pretend that we as observers are not having an effect on that which we observe. But it's not only us observing that disturbs the system; it's disturbing us right back.