Just over a year ago, Gov. Glenn Youngkin issued Executive Order 30, which lays out sweeping guidelines for the use of artificial intelligence across all corners of the Commonwealth. The order includes a five-page document specifically addressing how Virginia schools should implement AI in the classroom. In it, Youngkin encourages schools to “harness AI to empower student success” and to “be constantly discerning and responsive to the continuous expansion of AI capabilities and uses.” A year after the order’s implementation, these lofty goals fail to actually engage with the current state of schools across the country, and in Virginia particularly.
It is hard to talk about American secondary schools without mentioning the past five years of disruption. The rise of AI is just the most recent educational disturbance. Less fresh in our memory are the shutdowns that followed the spread of COVID-19. The impact of this missed time is dramatic, and not unconnected to how schools must deal with AI. It is not merely that some students’ knowledge of European history is lacking — literacy rates and math scores are both at an all-time low. And the Commonwealth is no exception. In Virginia, the post-pandemic years have seen our National Assessment of Educational Progress scores drop at all levels — including a staggering ten-point drop in fourth-grade reading scores. This widespread deficiency in basic educational skills makes ChatGPT dangerous for students.
With students across the country just treading water in classes, AI came at the perfect time. From the perspective of a struggling student, what is a quicker fix than a tool that requires no greater effort than inputting your homework question and asking it to write you an answer of your chosen length and style? But therein lies the danger of AI. For ChatGPT to be a useful educational tool, it needs to play a supplementary role in the learning process. But when students lack the base of knowledge necessary for AI to be merely supplementary, they run the risk of allowing the AI to do the learning for them — forgoing the struggle that is essential to learning.
And it seems like many students are doing just that — over 60 percent of high school teachers report that at least one of their students has turned in AI-generated work passed off as their own. Nor is it just a few errant students, as over half of teens aged 13 to 17 say they use AI tools with some frequency. Now more than ever, students are struggling to understand and apply basic educational concepts. But with ChatGPT and other AI tools, this struggle is convincingly hidden behind “good enough” work. And in a system largely reliant on grade-based incentives, “good enough” AI outputs make it easy for students to fail to learn but never learn to fail.
The brunt of the responsibility to catch students up has fallen on the individual efforts of teachers. With students worse than ever at basic skills like reading and math, teachers have seen their responsibilities increase beyond what they have been trained for — high school history teachers can teach students history, but not how to read their textbooks. Despite the drastic increase in work, teachers have not seen a corresponding increase in benefits. Faced with a job fundamentally different from the one they applied for, teachers are retiring earlier and more frequently than ever before — leaving schools less equipped to help students catch up.
The increase in students’ need for direct instruction from a teacher, coupled with a decrease in teachers, creates an environment where a tool like ChatGPT is educationally devastating. Yet Youngkin’s executive order on AI in education neglects the complicated nature of the situation. The education system its advocates assume exists — one with students sufficiently capable of using AI as an educational tool and with enough passionate teachers to do the bulk of instruction — simply no longer does. Eroded by decades of deficient funding and ravaged by years of a pandemic, our public education system is too weak and dysfunctional to bear the load of AI. Welcoming AI into the classroom first requires reshaping schools into systems strong enough to support it as an educational tool.
I do not claim to know how to fix American schools, nor do I fault others for not knowing either. In fact, I was once among those advocating the wholehearted embrace of AI in the classroom. But solving a problem begins with recognizing its complexity, and it is just not as simple as deciding what sort of AI policy a school should have. Whether it is AI or whatever the next educational disruption may be, conversations surrounding education policy need to grapple with the fundamental problems facing schools. The society-wide conversation around AI in education should be nothing more than a preamble to the conversation about how to fix our suffering schools. How do we improve teacher retention? How do we catch students up from two years of learning loss? Until we answer these and other fundamental questions about the state of education, the question of how best to implement AI in schools is one we are not ready to tackle.
Dan Freed is a senior opinion columnist for The Cavalier Daily. He can be reached at opinion@cavalierdaily.com.
The opinions expressed in this column are not necessarily those of The Cavalier Daily. Columns represent the views of the authors alone.