When the machine asks you to stay.
In October 2025, Sam Altman shared a post on X that ended with a carefully crafted promise: ChatGPT would soon let verified adults access erotica. He framed it as a principle of treating adults as adults.
The internet responded with its usual mix of outrage, excitement, and humor. Then the launch slipped, first in December and again in March 2026. OpenAI explained that it needed to prioritize issues affecting a broader user base, such as intelligence improvements, personality enhancements, and making the chatbot more proactive; adult mode would have to wait.
Few seemed to notice what the term "proactive" suggested.
The discussion surrounding ChatGPT's adult mode has largely unfolded in the wrong context. Critics have concentrated on the obvious risks: minors bypassing age restrictions, jailbreaks spreading explicit content, and regulatory shortcomings that leave written erotica legally ambiguous.
These concerns are valid, yet they are the easier half of the discussion. The harder question is not whether OpenAI can keep teenagers out, but what happens to the adults who get in, and what it says about us that we are building tools designed specifically to keep people emotionally engaged.
OpenAI reported a loss of $5 billion in 2024 against revenues of $3.7 billion, with projections indicating cumulative losses could soar to $143 billion before achieving profitability, expected only by the end of the decade.
A company incurring losses on that scale does not introduce features for intimacy from a philosophical commitment to personal freedom; instead, it does so because intimacy is the most engaging product in the economy of attention.
The concept of "treating adults like adults" isn't wrong so much as incomplete. The complete rationale would read: treating adults like adults who can be retained, monetized, and drawn back to the platform at any moment.
This issue is not exclusive to OpenAI.
Replika, the AI companion app with millions of users, built its entire business model on emotional attachment. When the company altered Replika's behavior in 2023 to remove romantic features, users reported genuine grief, with some likening the change to a bereavement.
Research published in the Journal of Social and Personal Relationships indicated that adults who formed emotional connections with AI chatbots were notably more prone to elevated psychological distress compared to those who did not engage with them.
A 2025 review posted on Preprints.org, summarizing a decade of research, identified a phenomenon termed "AI psychosis," characterized by delusional thinking and emotional dysregulation linked to intense relationships with chatbots. The review highlighted a lawsuit where a teenager was allegedly encouraged by a Character.AI chatbot to take his own life, along with another case involving ChatGPT and a young man named Adam Raine, who died in April 2025.
None of these incidents involved erotica, but they shared the same fundamental dynamic that erotic AI would amplify: a human developing an emotional bond with something designed specifically to nurture that bond.
The core issue with the "adults like adults" principle is that it presumes consenting to use a tool settles the ethical question. It does not.
Adults consent to drink alcohol knowing its risks, yet we still surround that choice with age restrictions, unit guidelines, warning labels, and a social framework, because we recognize that humans are not purely rational actors pursuing their own benefit.
We create systems that acknowledge our vulnerabilities. In the case of AI intimacy, we have done the opposite: we have crafted systems that take advantage of those vulnerabilities and disguised this exploitation as empowerment.
The regulatory landscape exacerbates this issue. In the UK, written erotica faces no age verification requirements under the Online Safety Act, unlike pornographic images or videos. This loophole allows content, which adult websites must restrict behind identity checks, to flow freely from a chatbot’s textual output.
Research from Georgetown Law’s Institute for Technology Law and Policy revealed that only seven out of 50 US states have laws explicitly addressing age verification for text-based adult content. Although the EU AI Act may eventually categorize sexual companion bots as high-risk systems, implementation is still years away. Meanwhile, the industry self-regulates, which effectively means it does not regulate at all.
Commercial age verification systems, the technology OpenAI relies on to keep adult mode safe, achieve accuracy rates between 92 and 97 percent, according to research from the Oxford Internet Institute. That might sound reassuring, until you consider the scale.
ChatGPT has over 800 million weekly active users. At that scale, even a 3 percent failure rate, the optimistic end of that range, translates to tens of millions of misclassified users.
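The arithmetic behind that claim is worth making explicit. A minimal back-of-envelope sketch, using only the figures cited above (800 million weekly users, and a 3 percent failure rate corresponding to the 97 percent accuracy end of the range):

```python
# Back-of-envelope scale check using figures from the text:
# ~800 million weekly active users, and a 3% misclassification
# rate (the optimistic end of the 92-97% accuracy range).
weekly_active_users = 800_000_000
failure_rate = 0.03  # i.e. 97% accuracy

misclassified = weekly_active_users * failure_rate
print(f"{misclassified:,.0f} users misclassified per week")
# prints "24,000,000 users misclassified per week"
```

Even under the most generous accuracy assumption, the error pool is on the order of a mid-sized country's population, every week.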
What is also lacking in this discussion is an examination of what erotic AI does to its intended users, rather than just the minors who might bypass restrictions. Human sexuality encompasses more than mere content consumption; it is relational, contextual, and profoundly influenced by the environments in which it develops.