Facebook adopts a new troll battling technique — but it’s got bigger problems from its past
Aggressive step or a late reaction?
Facebook says it now uses the same techniques it relies on for bot removal to combat human trolls and spammers. The company told Reuters that this new approach is similar to what its security team uses to shut down networks of Russian troll farms.
The social network says it's cracking down on notorious groups that engage in false mass reporting of posts or accounts to get them removed by moderators. It's also targeting networks that try to cause 'social harm,' both on and off the platform.
The company removed one such group based in Germany this week, which had been spreading misinformation and pushing conspiracy theories about the country's COVID-related restrictions.
The Reuters report noted that the company is looking to find the core of these campaigns and their network effects, rather than targeting individual posts.
After the tumultuous 2016 US presidential election, Facebook was accused of playing a massive role in facilitating the misinformation campaigns that helped propel Donald Trump to victory. In the aftermath, the company changed its tactics for handling groups that spread misinformation and push controversial theories about politicians and activists.
The social network is now applying some of those techniques to handle new kinds of trolls. While this sounds like an aggressive response from Facebook, we’ll have to wait and see if the company can really curb the spread of misinformation on its platform. And at the same time, we just learned that it’s made some serious missteps in policing the network too.
In a series of investigations published by the Wall Street Journal under the project name Facebook Files, some striking details have emerged about how the company is faltering in its efforts to make the platform healthier.

One of the reports cites an internal document describing how Facebook is ignoring employees' flags on pages belonging to or run by drug cartels, human traffickers, and arms dealers. It notes that while some of these pages operating in developing countries have been removed, many more operate freely, and the social network is not paying attention.
Over the years, Facebook has grown into a behemoth of a social network with the power to shape social and political currents in different parts of the world. While it's not easy to weed out harmful content and groups among millions of posts, the company needs to put its billions of dollars in revenue to use to maintain a healthy network.
Time and time again, internal documents and investigations have shown that in several instances, the company was simply sitting around, waiting for someone to point out that it had faltered. Right now, it seems to be doing a better job of announcing its remedial actions than of following through on them.