If you use social media, you’ve probably done a manoeuvre called the ‘algo swerve’ without even realising it.
Let me tell you a story: a few months back, I was scrolling through my For You page on TikTok.
I came across a hilarious video of an Australian bloke riding around his farm in his ute with a sheep sitting upright in the seat next to him.
Great content, right? So I did what I’ve been trained to do. Like a mouse in a scientific experiment pressing a button to get more cheese, I smashed that like button to reward the creator for a great vid.
Unfortunately, what I didn’t think about was the sheer power of TikTok’s algorithm.
For weeks and weeks afterwards, I was stuck in what I called ‘Farmboy TikTok’. The app served up, over and over again, videos of guys doing burnouts in utes, people shearing sheep, even some bloke holding a beer bottle while cracking a whip at his mate.
Now, I may be just a city slicker, but I don’t mind a bit of farm culture. Even so, I soon tired of the farmboy antics. After liking a few more videos, I realised the only way to escape this corner of TikTok was to teach the algorithm that I was sick of it.
I stopped liking. I stopped following. Hell, I even had to tear my eyes away from enticing-looking videos of men chasing cows because I didn’t want it thinking I wanted to see more of them.
This, anecdotally, is a common experience. People who’ve clicked on a link for a bed are haunted for weeks by advertisements for mattresses, bed frames and pillows. Others use Spotify incognito to stop their weekly suggested playlist from recommending nothing but the Frozen soundtrack because they let their children choose the music.
You accidentally watch one Jordan Peterson video out of a sick curiosity? Well, buckle up for years of being suggested ‘Canadian Professor Humiliates Cucked Lib’ by YouTube.
It’s a distinctly modern phenomenon, but one that seems relatively common. A colleague of mine coined a term for this deliberate dodging of content you don’t want to be shown more of: the algo swerve.
More than just a part of everyday life now, it’s also a symptom of a broader issue.
We’re told that technology written by the best and brightest is powerful and sophisticated. In 2015, Cambridge researchers showed the world that a computer analysing your Facebook Likes could judge your personality more accurately than your friends could. And increasingly, algorithms are being put in charge of making important decisions about the world.
But in many cases they’re wrong. They misinterpret signals. Often the result is funny or unimportant. But sometimes these decisions affect people’s lives. Australia’s robodebt saga is an example of an algorithm that people claim cost lives.
The algo swerve, on a minor scale, shows that people inherently know algorithms are fallible. Let’s hope the people making and implementing them realise that too.
In a world increasingly controlled by algorithms that are supposed to be shaped by their users, people are changing their behaviour to avoid shaping the algorithm.