
Inside Facebook: Secrets of the Social Network

From violence to hate speech, self-harm to child abuse, is Facebook putting profit before safety?

Editor’s note: This film is no longer available to view online.

From violence to hate speech, self-harm to child abuse, this documentary gives undercover access to how Facebook handles extreme content and asks if the company is putting profit before safety. 

There are over 1.47 billion daily active users and more than 2 billion monthly active users on Facebook. With 100 million hours of video watched every day on Facebook, how does the world’s biggest social media platform decide what can and can’t be posted on its site?

If users find content they think is inappropriate, they can report it to Facebook. A content moderator will then decide whether or not it breaks the platform’s rules. Each of these reports is called a ticket and the tickets build up in queues that the moderators work through. 


Throughout the film, a reporter poses as one of Facebook’s content moderation trainees. What he finds striking is that graphic images are often allowed to remain on the site, reinforcing suspicions that Facebook’s business model rewards extreme content.

A moderator explains to the undercover reporter, “If you start censoring too much, then people lose interest in the platform. It’s all about making money at the end of the day.”

When a page has a high number of followers and a high level of engagement, it can be shielded from removal even if it contains hateful messages. The justification given is the importance of free speech.

When Britain First’s Facebook page was eventually taken down, the decision had to be made at the highest levels of the organisation because, despite the page’s eight or nine violations of Facebook’s policies, it had a large following and was generating significant revenue for the company.

For one of the company’s early investors, Roger McNamee, the decision to keep sensitive content on the platform is a deliberate part of the business model.

“From Facebook’s point of view…[this is] the crack cocaine of their product,” says McNamee, a former mentor to Facebook CEO Mark Zuckerberg.

“It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform…Facebook has learned that the people on the extremes are the really valuable ones because one person on either extreme can often provoke 50 or 100 other people and so they want as much extreme content as they can get.”

In the documentary, Richard Allan, Facebook’s head of public policy, apologises for abusive content that went undeleted but denies that the rules are driven by revenue. “If the content is indeed violating it will go… This is not a discussion about money. This is a discussion about political speech.”

But is the discussion that simple for a billion-dollar company that relies on digital advertising as its main source of income?

With billions of pieces of content on Facebook, deciding what can stay on the social platform and what cannot has far-reaching consequences.

Just recently, Facebook said it discovered a security breach affecting nearly 50 million user accounts. News broke earlier this year that Cambridge Analytica, a data analytics firm that once worked for US President Donald Trump’s campaign, had gained access to personal data from millions of user profiles.