Psychology professor warns AI dependency mirrors addiction: here's why that matters

A Psychology Today analysis examines how AI tools like ChatGPT and Claude are reshaping individual behavior through the lens of behavioral psychology, arguing that while AI provides instant gratification, it may be undermining critical thinking and authentic communication skills.

The big picture: AI systems reinforce certain behaviors while inadvertently discouraging others, potentially creating what Michael Karson, a psychology professor, describes as a drug-like dependency where users get immediate satisfaction but miss developing essential life skills.

What gets reinforced: AI strengthens the pleasure of discovery and knowledge-sharing behaviors that have biological survival value.

  • Richard Feynman’s concept of “the pleasure of finding things out” is amplified by AI’s vast information access, making question-asking behavior more rewarding.
  • People receive positive social reinforcement when sharing AI-generated knowledge, though this can backfire when unsolicited.

What gets weakened: Critical cognitive skills that require effort and practice are being bypassed rather than developed.

  • Searching for evidence and evaluating information aren’t reinforced when AI simply provides answers.
  • Karson compares this to the difference between “catching a grounder and keeping your eye on the ball while playing the short hop” — AI encourages the lucky catch rather than fundamental skills.

Academic consequences: Students are increasingly using AI for assignments, creating a disconnect between grades received and skills developed.

  • AI-generated essays feature perfect punctuation and structure but lack personal insight, resembling “greeting cards” that do the job without authenticity.
  • The reinforced behavior becomes plagiarism rather than developing writing, editing, and critical thinking abilities.
  • Students may not even read the essays they submit, according to Karson’s analysis.

The alienation effect: Heavy AI reliance creates psychological distance from one’s own work and thoughts.

  • Users report feeling that ideas generated with AI assistance belong to the tool rather than themselves.
  • This mirrors drug use, where temporary relief comes at the cost of developing genuine coping mechanisms.
  • The consequences range from low-stakes tasks like drafting meeting minutes to high-stakes areas of life involving romance, friendship, and creativity.

Educational transformation: Schools have shifted from preparing citizens to preparing workers, with AI accelerating this trend.

  • Traditional civic education focused on critical thinking and engaging with diverse viewpoints.
  • Modern education resembles vocational training for “cubicle jobs” across government, media, and industry.
  • AI tools don’t instill the independent thinking skills necessary for informed citizenship and democratic participation.

What the expert says: Karson frames this not as an anti-AI argument but as a necessary warning label about behavioral consequences.

  • “Relying on it is likely to leave us feeling alienated from what it produces.”
  • The analysis suggests AI provides “respite but not life skills,” similar to how drugs offer temporary relief without building genuine capabilities.
Source: "A Behavioristic View of AI," Psychology Today
