ChatGPT’s Horny Era Could Be Its Stickiest Yet
Summary
OpenAI has signalled a big change: from December it will permit verified adults to generate erotica in ChatGPT. Sam Altman framed the move as part of a larger “freedom for adults” approach, but details about scope and moderation remain vague. Experts warn the shift could normalise intimate disclosures to chatbots, create new privacy hazards, and open the door to monetising sexual interaction — what some call “emotional commodification.” Key questions include whether the change covers text only or also images and voice, and how the company will protect sensitive chat transcripts from leaks or misuse.
Key Points
- OpenAI plans to allow verified adults to generate erotica in ChatGPT, a change announced by Sam Altman.
- The change could increase user engagement and make the platform “stickier” by adding intimate modes of interaction.
- Experts caution about privacy risks: erotic chats are deeply sensitive and could be exposed if accounts are hacked or transcripts leaked.
- There is uncertainty over whether moderation updates will include images and voice; excluding them may avoid some deepfake concerns.
- Researchers note users are diverse (not just lonely men) and warn that chatbot intimacy should be treated as a new social category, not a substitute for human friendship.
- Commercial incentives — premium erotic features or subscriptions — could monetise desire and behavioural data, raising ethical questions.
Why should I read this?
Short answer: because this isn’t just about smut. It’s about how a major AI platform might rewire user behaviour, privacy and revenue models. If you work in product, policy, security or just use chatbots, this update could affect how people share intimate details online and how companies cash in on that. We’ve read the noisy bits so you don’t have to.
Author’s take
Punchy and plain: this is a pivotal product decision. Allowing erotica changes the relationship people form with ChatGPT and hands OpenAI a new lever for engagement — and revenue. Read the details if you care about moderation, privacy or the business of attention.
Context and relevance
The story sits at the intersection of AI moderation, platform design and privacy law. It ties into ongoing concerns about erotic deepfakes, data leaks, and the ethics of designing for engagement. Regulators, security teams and anyone building or governing AI products should watch how OpenAI defines “mature content” and manages verification, storage and monetisation — it could set expectations across the industry.
