Is ChatGPT a Drug? Metaphors Show What Students Think of AI

Students compare AI writing tools to everything from high heels to performance-enhancing drugs, revealing complex and nuanced relationships with this technology in academic settings. A recent study asked international postgraduate students in the UK to create metaphors describing how they use generative AI in their writing, uncovering patterns in adoption and highlighting both benefits and concerns. These metaphorical frameworks offer valuable insight into how the next generation of professionals conceptualizes AI’s role in knowledge work at a time when institutions are still establishing ethical guidelines.
The big picture: Researchers from the University of Hong Kong studied how international postgraduate students in the UK use and think about AI writing tools by asking them to describe the technology through creative metaphors, including spaceships, performance-enhancing drugs, and even Spider-Man.
- Students from 14 regions, including China, Pakistan, France, Nigeria, and the US, participated; the study focused specifically on those using ChatGPT-4 in their academic work.
- The research comes at a critical time when educational institutions are still developing frameworks for ethical AI use in academic settings.
Key patterns identified: Researchers categorized four distinct ways students incorporate AI into their academic writing workflows.
- The “technical support” category included using AI for grammar checking and reference formatting, with students comparing AI to aesthetic enhancements or mechanical tools.
- More involved was the “text development” category, in which AI helped organize the logic of students’ writing, assisted with literature reviews, and served as a brainstorming guide.
Cross-cultural benefits: Students highlighted AI’s role in overcoming language and cultural barriers in academic writing.
- International students found AI particularly valuable for expressing nuanced concepts in English and navigating complex academic information.
- One Chinese student compared AI to high heels that make writing “look noble and elegant,” while acknowledging occasional academic stumbles.
Concerns and cautions: Through revealing metaphors, many students also expressed worries about the potential drawbacks of AI use.
- Some participants compared AI to drugs, suggesting addictive properties or performance-enhancing qualities that might undermine authentic learning.
- Concerns included the potential for shallow understanding and a lack of innovation when relying too heavily on AI tools.
Why metaphors matter: Harvard technology researcher Emily Weinstein noted that these linguistic frameworks shape how we understand and discuss emerging technologies.
- The metaphors people use reflect their relationship with technology and influence broader public discourse.
- The study suggests that effective AI integration in education requires more classroom discussion and a recognition that different assignments may call for different levels of AI use.