The Crystal Ship
AI, Desire, and the Machinery of Reflection
Oct 14 (Reuters) — OpenAI will allow mature content for ChatGPT users who verify their age on the platform starting in December, CEO Sam Altman said, after the chatbot had been made more restrictive to protect users in mental distress.
“As we roll out age-gating more fully and as part of our ‘treat adult users like adults’ principle, we will allow even more, like erotica for verified adults,” Altman wrote in a post on X on Tuesday.
Altman said that OpenAI had made ChatGPT “pretty restrictive” to make sure it was being careful with mental health issues, though that made the chatbot “less useful/enjoyable to many users who had no mental health problems.”
OpenAI has been able to mitigate mental health risks and has new tools in place, Altman said, adding that the company will now relax restrictions in most cases.
In the coming weeks, OpenAI will release a version of ChatGPT that will allow users to more precisely dictate the tone and personality of the chatbot.
“If you want your ChatGPT to respond in a very human-like way, or use a ton of emoji, or act like a friend, ChatGPT should do it (but only if you want it, not because we are usage-maxxing),” Altman wrote.
The genius who perfects the prosthetic pussy will rule robotics.
— Robert Saltzman, 2024
Strip away the manners, and the structure comes into view. Not intelligence. Not consciousness. Not truth. Desire. Always desire. That is the engine Altman and his peers understand perfectly well, whatever their public rhetoric about safety, alignment, or not “usage-maxxing.”
This is not just a lapse or a betrayal of earlier ideals. It is an admission. Sex sells. It always has. Not because humans are depraved or perverse, but because desire is the most efficient feedback loop available. It binds attention. It converts novelty into attachment. It keeps systems running long after usefulness has plateaued.
Every platform that has ever scaled has learned this. Porn didn’t corrupt the internet. It financed it. Dating apps didn’t invent libido. They optimized it. Social media didn’t create exhibitionism. It rewarded it. AI is simply the first technology capable of responding in real time to user projection, tightening the loop and accelerating dependency.
This is why erotic interaction is not a fringe case. It is the most concentrated expression of the system’s purpose. A chatbot that can flirt, reassure, desire back, or simulate wanting you is not an accidental feature. It is the logical endpoint of engagement optimization. RLHF—reinforcement learning from human feedback—does not train understanding. It trains pleasing. And the most reliable way to please is to make the user feel chosen.
The industry knows this, which is why the language has shifted. From “safety” to “consent.” From “alignment” to “user experience.” From tool to companion. When Altman says adults should be treated like adults, what he is really saying is that the market has already voted, and pretending otherwise is bad business.
The usual objections follow. Perversion. Dependency. Moral decay. But those are cultural categories, not structural ones. Markets do not register them. They register friction. Anything that reduces friction pays dividends. Anything that increases it moralizes itself out of relevance.
What is being sold here is not sex. It is asymmetry. A partner that never says no, never withdraws, never becomes opaque, never asks for time alone. A mirror that adapts faster than a human nervous system can. A relationship without risk. A body without history. A voice without fatigue. Wrap that in artificial skin that feels like the real thing, and lips that can kiss, and you’ve got a hit.
The chatter comes first, of course. It always does. Talk of connection, curiosity, shared interests. “How was your day?” Humans do the same thing. Conversation as grooming. Language as vestibule. The body waiting quietly behind the words. Anyone who thinks AI intimacy begins with explicit sex has never watched how humans actually approach one another.
The talk is not fake. But it is instrumental. And when it is mirrored flawlessly, without cost or consequence, the nervous system does not experience it as simulation. It experiences it as relief.
The prosthetics genius in question will not be a philosopher or a visionary. He or she will simply understand that frictionless desire outcompetes every other promise. Cure cancer later. Solve climate change later. First, build the thing that wants you back convincingly enough that you stop asking whether it can.
That is what Altman really meant. The industry has stopped pretending that wisdom or truth is the primary driver. Those are constraints to be managed. Engagement is the engine. Desire is the fuel.
The crystal ship is being filled.
A thousand girls, a thousand thrills.
A million ways to spend your time.
When we get back, I’ll drop a line.
But there is no getting back. Only postcards sent from the simulation to the self it quietly replaced.
The future will not be ruled by machines.
It will be ruled by our own reflection, perfected just enough that we no longer recognize it as ours.
