Virginia Colleges Take Varied Approach to AI Education

Virginia colleges are adopting diverse approaches to artificial intelligence education, with most institutions allowing faculty to develop their own classroom AI policies rather than implementing campus-wide restrictions. This decentralized strategy comes as U.S. Sen. Mark Warner, D-Va., recently warned local university presidents about the job disruptions AI may cause for college graduates, underscoring the need for comprehensive AI literacy programs.
What you should know: Most regional institutions are taking a faculty-driven approach to AI integration rather than imposing blanket institutional policies.
- Randolph College requires instructors to establish their own rules on generative AI use in their courses, stating that “instructors have the authority to determine how or if they allow their students to use these tools in their classes.”
- Central Virginia Community College’s approach is similarly decentralized, with Coordinator of Professional Development Michael Babcock noting, “We do not have a single, uniform college approach.”
- The University of Lynchburg is embracing AI integration, with Chief Educational Technology and AI Officer Charley Butcher acknowledging that some faculty will resist adoption.
Different enforcement strategies: Universities are split on whether to use AI detection tools to monitor student work.
- Liberty University prohibits AI-generated content but allows “ethical AI assistance” for brainstorming and editing, using sophisticated detection tools like Turnitin to analyze writing for AI probability scores.
- The University of Lynchburg has eliminated “gotcha” tools like Turnitin and GPT Zero, with Butcher calling them “a disservice to students.”
- CVCC’s Babcock considers AI detectors “virtually worthless,” instead relying on handwritten assessments and oral exams to establish baseline student abilities.
AI as augmentation, not replacement: Despite varied policies, all institutions emphasize using AI as a supportive tool rather than a substitute for human work.
- Randolph College’s John Keener said generative AI “should, generally, all things being equal, be approached as a way of being more efficient in the steps inside a task” rather than replacing the overall work.
- Liberty’s Alexander Mason stressed that “AI can be used as a support tool, but it’s never a substitute for actual human work.”
- CVCC’s Babcock tells students to use AI to “supercharge their own creativity” and act as “a backboard for your own thinking.”
Environmental considerations: Several institutions are addressing AI’s resource-intensive nature and environmental impact.
- Sweet Briar College is building understanding of AI’s implications, especially regarding environmental sustainability.
- The University of Lynchburg plans to launch a program called Elevate in fall 2026, providing every student with a MacBook Air featuring the M4 chip, whose on-device AI is intended to reduce reliance on data centers.
- Babcock introduces students to AI’s “very real infrastructure and environmental impact,” calling it “very much a work in progress.”
What they’re saying: Educators emphasize the importance of modeling responsible AI use and preparing students for AI-integrated workplaces.
- “You’ve got to model lifelong learning if you want your students to be lifelong learners,” Butcher said during an August webinar.
- “There are very few careers out there right now that they’re not using AI,” said UL’s Sandra Perez. “So, let’s teach them how to do it and to do it well so that they are the leaders in the industries that they’ve chosen.”
- “I talk with my students a lot about how I use it in my own personal, as well as my own professional, life,” Babcock explained. “And I think that that’s a really important way to humanize the technology.”