New Zealand PM to discuss attack’s livestreaming with Facebook

A livestream of Friday’s massacre was available to watch on various social media platforms hours after the incident.

Experts say tech companies such as Facebook have a 'content-moderation problem' [Dado Ruvic/Illustration/Reuters]

New Zealand Prime Minister Jacinda Ardern has said she wants to discuss Facebook's livestreaming facility with the company after the attacker broadcast the Christchurch massacre live on the platform.

Ardern said on Sunday she would be looking for answers from social media firms about how the mosque attack, which killed 50 people on Friday, was livestreamed on their platforms.

Using a GoPro camera, suspect Brenton Tarrant broadcast extremely graphic footage of himself shooting worshippers at Christchurch's Al Noor mosque via Facebook Live.

The distressing 17-minute livestream was available to watch on social media for hours after the attack that also left 34 people wounded.

Ardern said there were “further questions to be answered” by the social media sites.

“We did as much as we could to remove, or seek to have removed, some of the footage that was being circulated in the aftermath of this terrorist attack,” said Ardern.

“But ultimately it has been up to those platforms to facilitate their removal. I do think that there are further questions to be answered.

“I have had contact from Sheryl Sandberg [Facebook COO]. I haven’t spoken to her directly but she has reached out, an acknowledgement of what has occurred here in New Zealand,” Ardern said at a media conference when asked if Facebook should stop livestreaming.

On Sunday, Facebook said it removed 1.5 million videos of the Christchurch shootings “in the first 24 hours”.

“We continue to work around the clock to remove violating content using a combination of technology and people,” Mia Garlick, who works for Facebook in New Zealand, said on Twitter, adding that of the removed videos, 1.2 million were “blocked at upload”.

“Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content,” she said. 

Hours after the attack, New Zealand police said they were working to have the footage removed while urging people not to share it. 

Tech companies “have a content-moderation problem that is fundamentally beyond the scale that they know how to deal with,” Becca Lewis, a researcher at Stanford University and the think-tank Data & Society, was quoted as saying by the Washington Post.

“The financial incentives are in play to keep content first and monetization first.”

On Friday, YouTube tweeted it was “working vigilantly to remove any violent footage” while Twitter said it suspended the account of one of the suspects. 

Source: Al Jazeera, News Agencies