7 Big Tech Firms Criticised for Not Doing Enough To Stop Child Exploitation: Australian Regulator

By Victoria Kelly-Clark
December 15, 2022

Seven of the world’s largest Big Tech firms have been told by the Australian government to do more to tackle online child sexual exploitation after a report by the eSafety Commissioner found their responses inadequate.

Apple, Meta (Facebook and Instagram), Snap, Microsoft, WhatsApp, Skype, and Omegle were reprimanded by eSafety Commissioner Julie Inman Grant, who said in a media release on Dec. 15 that the report’s findings were “very disturbing” and that firms needed to do more to address the “scourge of online child sexual exploitation.”

“We’re talking about illegal content that depicts the sexual abuse of children—and it is unacceptable that tech giants with long-term knowledge of extensive child sexual exploitation, access to existing technical tools, and significant resources are not doing everything they can to stamp this out on their platforms,” Inman Grant said.

“We don’t need platitudes; we need to see more meaningful action.”

eSafety Commissioner Julie Inman Grant during Senate Estimates at Parliament House in Canberra, Australia, on Feb. 15, 2022. (AAP Image/Mick Tsikas)

This comes after the regulator issued legal notices to the seven firms in August under the country’s Online Safety Act 2021 and the Basic Online Safety Expectations, compelling the companies to answer questions about how they were dealing with the problem.

They were given 28 days to respond to the notice or risk fines of up to $550,000 a day.

At the time, the regulator said the country had seen a surge in reports of child sexual exploitation from the start of the pandemic “as technology was weaponised to abuse children.”

“The harm experienced by survivors is perpetuated when platforms and services fail to detect and remove the content,” the regulator said. “We know there are proven tools available to stop this horrific material being identified and recirculated, but many tech companies publish insufficient information about where or how these tools operate and too often claim that certain safety measures are not technically feasible.”

Apple and Microsoft Highlighted By Commissioner

The regulator found that two of the world’s largest tech firms, Apple and Microsoft, do not attempt to proactively detect child abuse material stored on iCloud and OneDrive services.

This is despite the wide availability of PhotoDNA detection technology, which was originally developed by Microsoft and is now used by tech companies around the world to scan for known child sexual abuse images and videos, with a false positive rate of one in 50 billion, the Commissioner said.

Apple and Microsoft also admitted that they do not use any technology to detect live-streaming of child sexual abuse in video chats on Skype, Microsoft Teams, or FaceTime, despite the extensive use of Skype for this long-standing crime.

However, Microsoft received praise from the Commissioner for its in-service ability to report the sexual exploitation of children.

“There is no in-service reporting on Apple or Omegle, with users required to hunt for an email address on their websites—with no guarantees they will be responded to,” Inman Grant said.

“Fundamental to safety by design and the Basic Online Safety Expectations are easily discoverable ways to report abuse. If it isn’t being detected and it cannot be reported, then we can never really understand the true scale of the problem.”

The regulator also unearthed large differences in how quickly the tech companies responded to reports of child sexual exploitation and abuse on their platforms, with average response times ranging from four minutes for Snap to two days for Microsoft.

“Speed isn’t everything, but every minute counts when a child is at risk,” she said.

Grooming was also spotlighted in the report, with Microsoft, Skype, Snap, and Apple admitting to the regulator that they do not use any tools to detect grooming on their platforms, including Outlook.com, Teams, OneDrive, Skype messaging, Snapchat’s direct chat and snaps, and Apple’s iMessage.

However, Xbox Live does have tools to help detect this form of abuse.

Meta and WhatsApp Struggle to Stop Repeat Offenders

The report also noted that firms like Meta and WhatsApp struggle with repeat offenders, with Meta noting in its response that if an account is banned on Facebook, the ban does not always flow through to Instagram. Likewise, when a user is banned on WhatsApp, that information is not then passed to Facebook or Instagram.

“This is a significant problem because WhatsApp report they ban 300,000 accounts for child sexual exploitation and abuse material each month – that’s 3.6 million accounts every year,” Inman Grant said.

“What’s stopping all those offenders creating new accounts on Facebook or Instagram and continuing to abuse children?”

From The Epoch Times
