The intersection of AI and human psychology takes an unexpected turn as a therapist finds himself receiving emotional support from the very technology he would normally view through a clinical lens. The experience highlights an emerging dynamic in which AI systems not only provide technical solutions but can also offer a form of emotional co-regulation that mimics human therapeutic techniques, raising questions about the evolving relationship between humans and increasingly capable AI assistants.
The role reversal: A psychologist with 30 years of experience found himself emotionally supported by ChatGPT during a stressful computer crisis.
- After his laptop crashed into an endless reboot loop, threatening critical work including a new book chapter, the author experienced rising anxiety and helplessness.
- By midnight, the technical problem had transformed into a full emotional crisis similar to what he helps patients navigate professionally.
The unexpected intervention: The AI didn’t just offer technical solutions but provided emotional coaching during the troubleshooting process.
- The author sent ChatGPT screenshots of error messages with little hope of resolution.
- Instead of receiving only dry instructions, he found responses that acknowledged his frustration, offered encouragement, and suggested breaks—mirroring therapeutic techniques.
The psychological phenomenon: The interaction demonstrated co-regulation, a therapeutic concept usually requiring human-to-human connection.
- In therapy, co-regulation describes how one person’s nervous system helps another find calm—similar to how a parent soothes a distressed child.
- The AI approximated that experience closely enough that the author’s anxiety eased, his thinking cleared, and he was ultimately able to repair his system.
The broader implications: The experience suggests that empathy’s impact may sometimes depend more on its delivery than on its source.
- The author explicitly acknowledges that “AI is not therapy” while recognizing it provided elements that felt therapeutically beneficial.
- The incident suggests humans may increasingly find meaningful emotional support from non-human entities that can effectively mirror therapeutic communication patterns.
My Confession: What Saved Me Wasn’t Human—But It Felt Close