When The Guardian broke its Facebook story this week, revealing what it called the site's rulebook on sex,
terrorism and violence, the picture that emerged was one of a technology company that has accidentally grown into something else - the world's biggest media platform.

It's a company that is making up the rules as it goes along, recently almost doubling its number of content moderators. And Facebook is doing that while trying to maintain what is, and will always be, its primary mission: keeping people on the site.

Boosting its number of content moderators was Facebook's response to a horrific case earlier this month, when a man in Thailand posted a video of himself murdering his young daughter before taking his own life.

Like other postings of shootings, rapes and suicides, that video was deleted, but only after users complained about it. Facebook lacks the capability to scrutinise such material before it hits your news feeds.

"Companies like Facebook are famously very secretive about the specifics of their content moderation rules, partly because they don't want people to game the system," explains Olivia Solon, senior technology reporter, Guardian US.

"But it's also because it reveals a very ugly underbelly of the social network: It doesn't really fit with Facebook's warm and fuzzy branding of connecting friends and family."

Facebook's Head of Global Policy Management, Monika Bickert, told Al Jazeera: "Keeping people on Facebook safe is the most important thing we do."

Two months ago, the site announced what it called new suicide prevention tools via its so-called newsroom. However, this is yet another area where Facebook's public pronouncements seem to conflict with its business model.

According to the Sydney-based newspaper The Australian, Facebook's algorithms can identify users who are emotionally vulnerable, particularly young ones. And, according to that report, Facebook has shared that information with, of all people, its advertisers.

"It passed that information on to advertising agencies as a sort of, you know, 'here's how much we understand about our consumers'. And that is the key point here," says Solon. "I think it just shows how much data Facebook has about us ... So, when you use Facebook, you've got to remember that it's not Facebook that is the product. We are the product. And we're being sold to advertisers."

Like all companies that start out as technology firms, Facebook is in search of solutions, tweaking its algorithms, its practices and its news releases accordingly. But with nearly two billion users relying on the site not just for validation but for information, certain responsibilities follow.

"Facebook consistently has said that they're a technology company and not a media company, but as more and more of people's transactional and media life takes place on Facebook, you're going to have more and more of these issues come up," says Alex Hazlett, a managing deputy editor at the digital media website Mashable.

Facebook is where share values and core journalistic values collide. And until Facebook admits what it has become - a media company, with all that that implies - it is selling short its users and the world they live in.

Contributors:
Alex Hazlett, managing deputy editor, Mashable
Olivia Solon, senior technology reporter, Guardian US
Jennifer Pybus, senior lecturer, London College of Communication
Richard Millington, founder, FeverBee
Nitasha Tiku, Silicon Valley reporter, Wired

Source: Al Jazeera