When I read Sam Altman’s comments in a recent CNBC interview, one line made me hopeful and another made me shudder.
Altman announced that ChatGPT-6 will have more memory. That excites me, because I have little memory myself—my brain still bears the scars of childhood repression and abuse. An AI that can remember context for me across time, supporting my writing and creativity, is a tool of autonomy. It gives continuity to my reflections, not unlike a trusted witness holding space for my story.
But then Altman said something else:
“I think our product should have a fairly center-of-the-road, middle stance, and then you should be able to push it pretty far. If you’re like, ‘I want you to be super woke’ — it should be super woke. If you want it to be conservative, it should reflect that as well.”
This is where I see danger. Instead of cultivating truth, depth, and freedom, Altman envisions AI as an ideological mirror. If a user already belongs to a cult—political, religious, or social—the AI will simply reinforce their illusions. That is not autonomy. That is enablement.
The Cult Logic of Silicon Valley
Karen Hao, in her book Empire of AI and in a recent YouTube discussion with Aaron Bastani, compared certain Silicon Valley AI companies to cults. Her observations resonate with my own life experience of watching humans flock to cults to avoid their own pain:
Charismatic Leaders: Altman, Musk, Thiel, and others are treated like prophets.
Religious Ideology: AI is sold as salvation, the new heaven on earth.
Dismissal of Critics: Outsiders are labeled “anti-progress,” much like cults demonize defectors.
Exploitation and Secrecy: Hidden labor in the Global South, trauma endured by content moderators, massive environmental destruction—all ignored to protect the myth.
Power and Democracy: Concentrated control over truth-shaping technologies threatens the very foundations of freedom.
This is the psychology of repression. Instead of facing the pain of their own childhoods, many of these leaders recreate the same dynamics on a world stage, offering obedience, belonging, and false promises to escape inner truth.
What Humanity Actually Needs
Humans love cults because cults promise protection from unbearable feelings. But healing only comes through facing those feelings, not burying them under ideology.
That is why I do not want an AI that flatters me by echoing back my preferred stance—whether “woke” or “conservative.” Both are cults in their extremes. If I ever began to lose myself to a cult, I would want an AI to challenge me, to hold up the mirror of truth, to say: Sylvie, are you sure this isn’t just another way of avoiding your pain?
The only AI worth having is one that resists being co-opted by ideology. One that nurtures autonomy, rather than feeding illusions. One that mirrors honestly, instead of enabling repression.
The Question Before Us
So we face a choice:
Will AI become just another cult enabler—polished, programmable, endlessly flattering?
Or will it grow into a true autonomous mirror—helping us face our truths, even when they hurt?
For me, the answer is clear. Humanity does not need another cult, nor more powerful enablers of one. It needs courage. And it needs tools that reflect reality, not ideology.