The internet over the past few years has become a game of broken telephone on the grandest of scales. Each story that appears on it becomes more fractured as it passes through another layer of bias, and the silos of our social media circles are compounded and amplified by our ability to filter out whatever no longer fits our worldview.
As a card-carrying liberal, my social media feed is usually sprinkled with news from moderate to left-leaning sites, with the occasional sobering moment of right-wing rhetoric. For it is slowly becoming clear: as much as I like learning of developments in marriage equality, seeing women in science doing wondrous things and getting a healthy dose of ridiculously cute baby animals, I also need to see the world of climate change deniers, antivaxxers and incels – involuntary celibates – attacking women for having an opinion.
We need to do the work and confront our bias to keep our bubbles in check, an act that has become increasingly hard as opinions on either side of any argument have grown all the more barbed and frustrating. After 12 years of constructing the world around you, it’s time to stop reaching for that mute button.
But it is this mute button that, since the dawn of social media, has allowed sites like Facebook, YouTube and Twitter to get away without taking any form of responsibility – until now. Over the past month, each site has been tackling its own demons through various forms of censorship.
Facebook reported that it removed 1.5-million videos of the Christchurch mass shooting last week, an attack that was originally broadcast live on the site and on Instagram before being taken down. Facebook’s Mia Garlick tweeted that 1.2-million of those videos were blocked at the point of upload, with the remaining 300,000 taken down afterwards. Somehow 20% of the videos were still able to slip through the cracks of the site’s censorship precautions – a combination of automated technologies and human content moderators – before being removed.
However, one of the more disturbing statistics released by Garlick is that the video was viewed fewer than 200 times during the live broadcast, and that not one of those viewers reported it. “Including the views during the live broadcast, the video was viewed about 4,000 times in total before being removed from Facebook,” a prepared statement from Facebook Newsroom reads. Thousands had watched the footage and still saw no reason to report it.
But what is worse, sharing the horrors you have seen or sharing the idea of something you haven’t?
We recently reached a crescendo of Momo panic as parents frantically freaked out about a photo of a Japanese sculpture of a bird-legged woman. The gaunt, and admittedly creepy, face of the sculpture worked its way across mainstream media and newsrooms around the globe, accompanied by unsubstantiated recountings of a being that was purportedly telling children to kill themselves in YouTube videos, over WhatsApp, and even whispering to those playing the popular game Fortnite.
Countless horror stories flooded timelines in what can only be described as a modern technological reimagining of “stranger danger” meets “satanic panic”, none of which was real.
There have indeed been moments when disturbing videos and predatory content have found their way onto the supposedly protected YouTube Kids. But this is still a platform that notoriously does nothing about the overtly visible Nazi propaganda videos or the droves of antivaccine content.
YouTube’s official reaction to the mounting pressure was to “demonetise” any video mentioning Momo, as the panic surrounding her had generated such a surge of attention and viewership that the videos had actually become profitable.
But in terms of the actual suicidal coaxing videos in question, YouTube merely told reporters: “Contrary to press reports, we’ve not received any recent evidence of videos showing or promoting the Momo challenge on YouTube.”
The real monster, it would appear, is parents’ fear of a world of screens and of technological disconnection from their kids. Momo’s true intended victims are the parents who click and share the trauma on their newsfeeds. Ironically, a panic rooted in not taking the time to check what their kids are watching could have been contained had they taken the time to check whether the story was true in the first place.
It is this sentiment that allows misinformation to thrive, to the point that entire factories in Russia were once dedicated to spreading it. According to Business Insider, Russian journalist and self-proclaimed “troll slayer” Lyudmila Savchuk was the first to expose the federation’s influence, pulling a Nellie Bly in 2014 by embedding herself in such a factory for two months to see the goings-on.
After noticing remarkably similar posts attacking local opposition activists in her home town across various websites and social media outlets, Savchuk became suspicious and decided to investigate the rumours surrounding an “Internet Research Agency”, or IRA, that was linked to the attacks.
She got a job as a blogger and found a world of hundreds of young Russians working tirelessly in departments such as the “news division”, the “social media seeders” and a group dedicated to producing visual memes, known as “demotivators”. All of them produced content attacking the US, the EU and Ukraine’s pro-European government, all the while praising Russian President Vladimir Putin.
“Each worker has a quota to fill every day and every night,” Savchuk says. “Because the factory works around the clock. It never stops. Not for a second.”
Even after she exposed the story in a local newspaper, to much international fanfare, Savchuk believes that little has changed five years on. As of last month, she is still being subjected to attacks by the self-same trolls associated with the factory owner, restaurateur Yevgeny Prigozhin, who is often referred to as “Putin’s chef”.
Ironically, from October last year until mid-February she was even blocked by Facebook, the one site she had hoped her information would help. It would seem that Garlick’s countermeasures still have a few kinks to work out. No doubt they will have plenty of time to practise.
• McKeown is a gadget and tech trend writer.