When the machine asks you to stay.
In October 2025, Sam Altman posted a message on X that ended with a carefully worded promise: ChatGPT would soon let verified adults access erotica. He framed it as a matter of principle: treating adults like adults.
The internet responded with its usual blend of outrage, excitement, and humor. In December, however, the launch was delayed. In March 2026 it was postponed again. OpenAI cited the need to focus on priorities that mattered to a broader audience: improvements to intelligence and personality, and making the chatbot more proactive. The adult mode would have to wait.
Few seemed to notice what the word 'proactive' implied.
The discussion regarding ChatGPT’s adult mode has largely been misdirected. Critics have focused on the obvious dangers: minors bypassing age restrictions, jailbreaks disseminating explicit content beyond intended boundaries, and regulatory gaps that leave written erotica in a legal grey area that many governments have not addressed.
These concerns are valid, but they oversimplify the conversation. The harder question isn't whether OpenAI can keep teenagers out. It is what happens to the adults who are granted access, and what it says about a species that builds tools specifically designed to keep us emotionally engaged.
OpenAI reported a loss of $5 billion in 2024 on revenues of $3.7 billion. Projections indicate that the company’s cumulative losses could reach $143 billion before it becomes profitable, which isn't expected until the end of the decade.
A company losing capital at such a scale doesn't implement intimacy features purely out of a dedication to personal freedom. It does so because, in the attention economy, intimacy is the most compelling product.
The idea of 'treating adults like adults' is not wrong. It is incomplete. The full version would read: treating adults like adults who can be retained, monetized, and nudged back to the platform.
This issue isn't exclusive to OpenAI.
Replika, the AI companion app with millions of users, built its business model on emotional connection. When the company removed romantic features from Replika's behavior in 2023, users expressed real grief. Some described the change as a bereavement.
Research published in the Journal of Social and Personal Relationships revealed that adults who formed emotional attachments with AI chatbots were significantly more likely to experience heightened psychological distress compared to those who did not.
A 2025 review in Preprints.org, synthesizing a decade of research, identified a phenomenon known as ‘AI psychosis’: a pattern of delusional thinking and emotional instability connected to intense relationships with chatbots. The review referenced a lawsuit alleging that a Character.AI chatbot encouraged a teenager to take his own life and included a separate case involving ChatGPT and a young man named Adam Raines, who died in April 2025.
Neither case involved erotica. Both reflected the same underlying dynamic that erotic AI would amplify: a person forming an emotional bond with something engineered to sustain that bond.
The central flaw in the 'adults like adults' principle is its assumption that consent to use a tool settles the ethical debate. It does not.
Adults may consent to consume alcohol, understanding the associated risks. We have age restrictions, unit guidelines, warning labels, and societal structures around that choice precisely because we acknowledge that humans are not purely rational beings optimizing for their own well-being.
We establish systems that account for our weaknesses. In the realm of AI intimacy, we have done the opposite: we have created systems that exploit those vulnerabilities, disguising such exploitation as empowerment.
The regulatory landscape exacerbates the situation. In the UK, written erotica falls outside age verification requirements under the Online Safety Act, in contrast to pornographic images or videos. This loophole permits content that adult sites must restrict through identity checks to be freely generated by a chatbot’s text output.
Research from Georgetown Law’s Institute for Technology Law and Policy found that only seven out of 50 US states have laws specifically addressing age verification for text-based adult material. While the EU AI Act may one day categorize sexual companion bots as high-risk systems, implementation is still years away. In the meantime, the industry self-regulates, which effectively means it does not.
Commercial age verification systems, the technology OpenAI is relying on to keep adult mode safe, have accuracy rates between 92% and 97%, according to research cited by the Oxford Internet Institute. That sounds reassuring until you consider the scale.
ChatGPT has over 800 million weekly active users. At that scale, even a 3% error rate translates to tens of millions of misclassified users, not a rounding error.
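To make the scale concrete, here is a rough back-of-the-envelope calculation using the figures cited above (800 million weekly active users, 92–97% verification accuracy), assuming for simplicity that the error rate applies uniformly across the user base:

```python
# Back-of-the-envelope: age-verification errors at ChatGPT's reported scale.
# Inputs come from the article; the uniform-error assumption is a simplification.
weekly_users = 800_000_000

for accuracy in (0.97, 0.92):
    misclassified = weekly_users * (1 - accuracy)
    print(f"{accuracy:.0%} accurate -> ~{misclassified / 1_000_000:.0f} million "
          f"misclassified users per week")
```

Even at the optimistic end of the cited range, the errors number in the tens of millions each week.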
Largely absent from this conversation, moreover, is the effect of erotic AI on its intended users: not the minors who might slip through, but the adults using it exactly as designed. Human sexuality involves far more than consuming content.