X Says Platform Now Scanning All Videos, GIFs for Child Abuse Material in Safety Update

By Katabella Roberts
December 29, 2023 | US News
A young man types on an illuminated computer keyboard typically favored by computer coders in Berlin on Jan. 25, 2021. (Sean Gallup/Getty Images)

Elon Musk’s X is now scanning videos and GIFs uploaded to the platform for child sexual abuse materials (CSAM) as part of wider efforts to crack down on child sexual exploitation online.

The social media site announced the new measure in a Dec. 28 blog post in which its safety team stressed that tackling child sexual exploitation (CSE) online is a top priority.

“At X, we have zero tolerance for child sexual exploitation, and we are determined to make X inhospitable for actors who seek to exploit minors in any way,” the company said.

The platform’s safety team also noted it has “strengthened enforcement and refined detection mechanisms” throughout the year as part of efforts to crack down on child sexual abuse materials on the site, and has increased the number of referrals sent to the National Center for Missing and Exploited Children (NCMEC).

From January to November of 2023, X permanently suspended over 11 million accounts for violations of its CSE policies, the company said.

In 2022, prior to Mr. Musk's takeover of the platform, which was then called Twitter, the site suspended just 2.3 million accounts, officials said.

In the first six months of this year, X sent a total of 430,000 reports to the NCMEC CyberTipline compared to just over 98,000 reports sent in 2022, according to Thursday’s blog post.

New Measures Help Tackle Child Exploitation

“Not only are we detecting more bad actors faster, we’re also building new defenses that proactively reduce the discoverability of posts that contain this type of content,” the company said. “One such measure that we have recently implemented has reduced the number of successful searches for known Child Sexual Abuse Material (CSAM) patterns by over 99 percent since December 2022.”

That measure, a “Search Intervention for CSE Keywords,” allows the platform to entirely block search results for certain terms. Since December 2022, X has added more than 2,500 CSE keywords and phrases to that blocklist to prevent users from searching for common CSE terms, officials said.
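The blog post does not describe how the search intervention is implemented. The sketch below shows, in principle, how a keyword-based intervention could suppress results for blocklisted queries, assuming a simple normalized-substring lookup; the term set, function names, and matching logic are illustrative placeholders, not X's actual system.

```python
# Minimal sketch of a keyword-based "search intervention."
# Assumption: a plain blocklist lookup over a normalized query string.
# The entries below are harmless placeholders, not real CSE terms.

BLOCKED_TERMS = {"blocked phrase one", "blocked phrase two"}  # hypothetical list


def is_blocked(query: str) -> bool:
    """Return True if the normalized query contains any blocklisted phrase."""
    normalized = " ".join(query.casefold().split())
    return any(term in normalized for term in BLOCKED_TERMS)


def search(query: str, index: dict[str, list[str]]) -> list[str]:
    """Return no results at all for blocked queries, instead of ranking matches."""
    if is_blocked(query):
        return []  # block the search entirely, as the blog post describes
    return index.get(query.casefold(), [])


if __name__ == "__main__":
    demo_index = {"cats": ["post-1", "post-2"]}
    print(search("cats", demo_index))                # ['post-1', 'post-2']
    print(search("Blocked Phrase One", demo_index))  # []
```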

Additionally, the social media platform has rolled out “Expanded Hash Matching” for videos and GIFs, whereby all videos and GIFs uploaded to the site are checked against hashes of known CSAM.

Since introducing that feature in July 2023, X said it has matched over 25,000 pieces of media.
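The post does not specify the matching technology behind this feature. The following is a minimal, hypothetical sketch of the general hash-matching idea using exact SHA-256 fingerprints; production systems typically rely on perceptual hashes (such as PhotoDNA or PDQ) that tolerate re-encoding and cropping, which this simplified example does not model.

```python
# Minimal sketch of media hash matching against a database of known fingerprints.
# Assumption: exact SHA-256 digests stand in for the perceptual hashes real
# systems use; the database contents and function names are hypothetical.

import hashlib

# Hypothetical set of known-bad fingerprints supplied by a clearinghouse.
KNOWN_HASHES: set[str] = set()


def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest used as the media fingerprint."""
    return hashlib.sha256(media_bytes).hexdigest()


def matches_known_material(media_bytes: bytes) -> bool:
    """Check an uploaded video or GIF against the known-hash database."""
    return fingerprint(media_bytes) in KNOWN_HASHES


if __name__ == "__main__":
    KNOWN_HASHES.add(fingerprint(b"example known file"))
    print(matches_known_material(b"example known file"))  # True
    print(matches_known_material(b"new upload"))          # False
```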

Along with implementing various measures aimed at cracking down on child abuse materials on the site, X said it is also consistently working with trusted organizations to tackle emerging threats and behaviors. These include NCMEC; the Tech Coalition, a global alliance of technology companies working to fight child abuse online; and WeProtect, which brings together government and private-sector experts working to protect children from sexual abuse online.

The platform is also working in close coordination with law enforcement, according to the blog post.

A photo illustration of the X (Twitter) logo in London, England, on July 24, 2023. (Dan Kitwood/Getty Images)

‘Critical’ That Bad Actors Are Brought to Justice

“Ultimately, it is critical that the bad actors be brought to justice and that law enforcement has the tools and information they need to prosecute these heinous crimes. X cooperates with law enforcement around the world and provides an online portal to submit removal requests, information requests, preservation requests, and emergency requests,” the company said.

“Ongoing dialogue and knowledge sharing with law enforcement is key to achieving our mission,” it added.

The update from X comes after the Australian eSafety Commissioner (eSafety) began civil penalty proceedings against X for failing to comply with government requirements regarding child sexual exploitation materials.

The commission alleges the social media giant failed to hand over information about how it is addressing child sexual exploitation and abuse materials and activities on the site, specifically by not complying with a transparency notice and not preparing a report in the required manner and form.

A month earlier, in September, the commission fined the platform AU$610,500 (US$414,400) over its alleged failure and gave it 28 days to either pay the fine or provide the information requested.

In its blog post on Thursday, X said putting an end to child sexual exploitation will remain a top priority in the year ahead.

“Until child sexual exploitation is brought to an end, our work will never stop,” the social network said. “In 2024, we will continue our strong investment in this critical area and expand our efforts to educate our users about the importance of helping us to combat child sexual exploitation online.”

“We are committed to making X a place where freedom of expression and users’ safety are not compromised in the public conversation,” it concluded.

From The Epoch Times
