Australia’s Internet safety watchdog has accused tech giants Apple and Microsoft of turning a blind eye to the sexual exploitation of children.
Australia’s eSafety Commissioner said in a report on Thursday that the tech giants were “not doing enough” to tackle the problem. Neither company proactively searches for child exploitation material on its services, the report found, despite the availability of software that scans for known images and videos of abuse.
Commissioner Julie Inman Grant said the “very concerning” findings came after her office issued legal notices requiring Apple, Meta, WhatsApp, Microsoft, Skype, Snap and Omegle to provide information about their policies for preventing child exploitation on their services.
“This report shows us that some companies are making an effort to tackle the scourge of online child sexual exploitation material, while others are doing very little,” Inman Grant said.
“But we’re talking about illegal content that depicts the sexual abuse of children – and it is unacceptable that tech giants with long-term knowledge of extensive child sexual exploitation, access to existing technical tools and significant resources are not doing everything they can to stamp this out on their platforms.”
The report also highlighted disparities in how quickly tech companies responded to reports of abuse – average response times ranged from Snap’s four minutes to Microsoft’s two days – as well as policy loopholes that allowed users banned from one Meta service for sharing child abuse material to set up new accounts on the company’s other platforms.
A Microsoft spokesperson said the company recognises that child sexual abuse is a horrific crime and that it has a “long-standing commitment” to combating the spread of child exploitation material.
“Microsoft is a founding member of the Tech Coalition, and helped develop and make freely available PhotoDNA, which continues to be a leading technological tool for detecting child sexual exploitation and abuse online,” the spokesperson told Al Jazeera.
“As threats to children’s safety continue to evolve and bad actors become more sophisticated in their tactics, we continue to challenge ourselves to adapt our response and welcome engagement with external stakeholders that can help us improve. Technology companies, civil society organisations, and governments must continue to collaborate and innovate to find enduring whole-of-society approaches that also respect fundamental rights including privacy.”
Apple did not immediately respond to a request for comment.
Last week, Apple announced it had scrapped plans to automatically scan iCloud accounts for child exploitation material after blowback from privacy advocates who warned the feature could open the door to more invasive surveillance.
In August, the New York Times reported that a man who took a picture of his son’s groin for the doctor was subjected to a police investigation after Google’s scanning software flagged the image as abuse.
Despite police clearing the man of any wrongdoing, Google refused to reinstate his account.