According to a report from The New York Times, child sexual abuse material (CSAM) still persists on Twitter, despite Elon Musk stating that cracking down on child exploitation content is “priority #1” for the company.
Working with the Canadian Centre for Child Protection, which helped match abusive images against its CSAM database, the Times says it found content across Twitter that had previously been flagged as exploitative, as well as accounts offering to sell more.
During its search, the Times says it found images of 10 child abuse victims in 150 instances “across multiple accounts” on Twitter. Meanwhile, the Canadian Centre for Child Protection reported similarly disturbing results, finding 260 of the “most explicit videos” in its database on Twitter, which garnered over 174,000 likes and 63,000 retweets in total.
Twitter reportedly promotes CSAM through its recommendation algorithm
According to the Times, Twitter actually promotes some of the images through its recommendation algorithm, which surfaces suggested content for users. The platform reportedly only took down some of the content after the Canadian center notified the company.
Earlier this month, Twitter said it’s “proactively and severely limiting the reach” of CSAM content and that the platform will work to “remove the content and suspend the bad actor(s) involved.” The company claims it suspended around 404,000 accounts that “created, distributed, or engaged with this content,” a 112 percent increase since November.
“The volume [of CSAM] we’re able to find with a minimal amount of effort is quite significant,” Lloyd Richardson, the Canadian center’s technology director, tells the Times. “It shouldn’t be the job of external people to find this sort of content sitting on their system.”