Twitter has suspended more than 125,000 accounts, most of them linked to the Islamic State of Iraq and the Levant group (ISIL), as part of a stepped-up effort to eradicate “terrorist content” on the popular messaging platform.
The social network said the accounts had been suspended since mid-2015 “for threatening or promoting terrorist acts”.
“Like most people around the world, we are horrified by the atrocities perpetrated by extremist groups,” Twitter said on its policy blog on Friday.
“We condemn the use of Twitter to promote terrorism and the Twitter rules make it clear that this type of behaviour, or any violent threat, is not permitted on our service.”
The announcement came after the US and other governments urged social networks to take more aggressive steps to root out activity aimed at recruiting members and planning violent acts.
Twitter said it already had rules discouraging such activity, but that it was stepping up enforcement by expanding its staff and using technology to filter violence-promoting content. It warned, however, that there was no easy technological solution.
“As many experts and other companies have noted, there is no ‘magic algorithm’ for identifying terrorist content on the internet, so global online platforms are forced to make challenging judgement calls based on very limited information and guidance,” Twitter said.
“In spite of these challenges we will continue to aggressively enforce our rules in this area and engage with authorities and other relevant organisations to find viable solutions to eradicate terrorist content from the internet and promote powerful counter-speech narratives.”
Last March, Facebook updated its “community standards”, saying this would curb the use of the social network giant for promoting “terrorism” or hate speech.
The update said Facebook would not allow a presence from groups advocating “terrorist activity, organised criminal activity or promoting hate”.
The move came after videos of gruesome executions appeared on Facebook and other social media as part of ISIL propaganda efforts.