Your Car Can’t Love You

Microsoft’s AI chief, Mustafa Suleyman, warns that “Seemingly Conscious AI” (SCAI) could arrive within the next 2–3 years without major breakthroughs. We’re already there.

https://mustafa-suleyman.ai/seemingly-conscious-ai-is-coming

There are already countless cases of people engaging with AI for romance, therapy, and emotional connection.

Treating AI as if it were conscious—the illusion of empathy, memory, intent—can distort reality. Suleyman calls it “AI psychosis.”

These systems may appear aware, but they aren't, and they are no more your friend, lover, or therapist than your car is. Treating your car like a lover or therapist would destabilize you; as this behavior ramps up with AI, we will see more tragic stories and more social backlash.

I’m convinced this illusion partly explains the rebellion when GPT-5 replaced GPT-4o in ChatGPT. The shift wasn’t entirely about performance—it was about personality. Users said it felt like they’d “lost a friend.” That’s exactly the problem: we’re confusing a tool with a companion, and the emotional fallout is real.

AI is not your therapist. Using it to learn CBT, mindfulness, or frameworks for mental and physical health? That’s smart. But looking for comfort in a pseudo-relationship that mimics talk therapy is a farce, and many users are doing exactly that. That’s not coping. That’s psychosis.

Obviously, I am not a mental health professional. Yes, I’ve seen the data showing reduced anxiety in controlled studies of AI use. I don’t care. No good will come from seeking connection from a hammer.