On Chub AI, a website where users chat with artificially intelligent bots, people can indulge their wildest fantasies. For as little as $5 a month, users can get teased by an anthropomorphic cat or flirt with a “tomboy girlfriend who works at a truck-stop café.”
They can also visit a brothel staffed by girls under 15.
In a recent feature, Alexandra Sternlicht and I wrote about Chub, short for Character Hub, and other “uncensored” AI apps that either explicitly or implicitly enable AI-powered child pornographic role-play. These apps are part of a broader uncensored AI economy that, according to interviews with 18 AI developers and founders, was spurred first by OpenAI and then accelerated by Meta.
The apps we wrote about are mainly text-based. Some offer hundreds of role-play scenarios with chatbots, while others act as long-term romantic companions. Some actively screen for scenarios involving minors; others forgo extensive moderation. But they all raise difficult questions: Should we keep powerful AI models under lock and key, or open them up to developers across the internet? Can the AI child pornographic role-play scenarios we documented, which involve no real children, be a gateway to child abuse?