
Twitter planned to build an OnlyFans clone, but CSAM issues reportedly derailed the plan

Employees claim the company isn't doing enough to tackle harmful sexual content, according to The Verge.


Twitter discussed creating an OnlyFans clone to monetize the adult content that has been prevalent on the platform for years, but its inability to effectively detect and remove harmful sexual content put the brakes on the idea, according to a Verge investigation. A team Twitter put together to determine whether the company could pull off such a move concluded this spring that “Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale.” The team's findings were “part of a discussion, which ultimately led us to pause the workstream for the right reasons,” Twitter spokesperson Katie Rosborough said.

Twitter is said to have halted the Adult Content Monetization (ACM) project in May, not long after it agreed to a $44 billion sale to Elon Musk (a deal that is now up in the air). The company's leadership determined that it couldn't move forward with ACM without first enacting more health and safety measures.

The investigation (which you can read in full here) details warnings that Twitter researchers made in February 2021 about the company not doing enough to detect and remove harmful sexual content, such as child sexual abuse material (CSAM). The researchers are said to have informed the company that its primary enforcement system, RedPanda, is “a legacy, unsupported tool” that is “by far one of the most fragile, inefficient and under-supported tools” it employs.

While the company has machine learning systems, they reportedly struggle to detect new instances of CSAM in tweets and livestreams. Twitter manually reports CSAM to the National Center for Missing & Exploited Children (NCMEC), but the researchers noted that this labor-intensive process created a backlog of cases and delayed reports to NCMEC. Rosborough told The Verge that Twitter has significantly increased its investment in CSAM detection since the researchers released their report last year.

Advertisers may have bristled at the notion of Adult Content Monetization (even though porn is widespread on the platform), but the potential financial upside for Twitter was clear. OnlyFans expects to bring in $2.5 billion in revenue this year, roughly half of what Twitter generated in 2021. Twitter already offers creators several ways to directly monetize the large audiences many of them have built on the platform, and adding OnlyFans-style features might have been a goldmine for adult content creators and the company alike. Instead, the safety problems the researchers identified kept Twitter from taking that step, despite the improvements it claims to have made over the last 18 months.

Update 8/30 4:46PM ET: A Twitter spokesperson provided the following statement:

Twitter has zero tolerance for child sexual exploitation. We aggressively fight online child sexual abuse and have invested in technology and tools to enforce our policy. Our dedicated teams work to stay ahead of bad-faith actors and to help ensure we’re protecting minors from harm — both on and offline.

For years, we’ve partnered with organizations and industry stakeholders around the globe in this area, including the National Center for Missing and Exploited Children (NCMEC), to ensure we’re utilizing the best resources and technology available.

Recent reports about Twitter on this topic are dated and provide a moment-in-time glance at just one aspect of our work in this space, and our work here continues. Since that time, we have sharpened our focus and expanded the resources dedicated to child safety.

We’re also hiring, underscoring our continued investment in this work.

It was the ongoing and reflective dialogue on the topic that brought us to the decision to pause the workstream for the right reasons and prioritize elsewhere.