The Listening Post

Facebook and the ethics of moderation

We examine Facebook’s challenge in moderating content. Plus, the people monitoring the social web.

On The Listening Post this week: With two billion users and 1.3 million posts a minute, Facebook’s content moderation challenges are huge. Plus, we look at the people monitoring and moderating the social web.

Facebook’s status: Tech or media company?

This week, the British newspaper The Guardian revealed hundreds of internal Facebook documents that outline the site's ground rules for content moderators. From violence to racism, revenge porn to child abuse, the picture that emerges is one of a company struggling with its responsibilities as a media platform and with how to cope with more than 1.3 million posts per minute, in multiple languages.

Contributors: 

Alex Hazlett, deputy managing editor, Mashable
Olivia Solon, senior technology reporter, Guardian US
Jennifer Pybus, senior lecturer, London College of Communication
Richard Millington, founder, FeverBee
Nitasha Tiku, Silicon Valley reporter, Wired

On our radar:

  • Turkish journalist Murat Celikkan has been sentenced to 18 months in prison for producing “propaganda for a terrorist organisation”. He was one of 56 people who took part in an “editor for a day” solidarity project in 2016 at a pro-Kurdish paper, Ozgur Gundem.
  • It’s been open season on journalists in Mexico and yet another has been abducted there. Salvador Adame Pardo, director of television station 6TV in the central city of Nueva Italia, was captured by armed men and forced into an SUV on May 18. He has not been heard from since.
  • In the Gulf state of Qatar, the government says the state news agency, QNA, was hacked this past week after a fake statement ascribing false comments to the country's ruler was published on its website. Qatar says it has opened an investigation into the hack and "will hold all those involved accountable".

Scrubbing the net: The content moderators

Many social media users assume that content moderation is automated: that when an inappropriate image or video is uploaded to the net, a computer removes it. In reality, there are thought to be more than 150,000 content moderators working around the world today. It is unpleasant but necessary work, and many social media companies based in the West now outsource it to countries such as the Philippines and India.

But the question is: do they do so responsibly? Or do they simply take advantage of cheap labour, with little consideration for the people doing the work?

The Listening Post’s Nic Muirhead reports on the invisible workers who decide what you see, and don’t see, on the web.

Contributors:
Sarah Roberts, assistant professor, UCLA
Ben Wells, attorney
Ciaran Cassidy, filmmaker
Suman Howlader, CEO, Foiwe Info Global Solutions