The Trump administration wants to bring artificial intelligence into K-12 classrooms. At first glance, this isn’t a terrible idea. Used well, AI can be a patient tutor. It doesn’t get frustrated. It doesn’t lose focus. It doesn’t roll its eyes, check the clock or give up.
AI could help personalize learning, diagnose learning disabilities, ease administrative burdens and free teachers to spend more time doing what only humans can do: connect, mentor, care. But such outcomes aren’t automatic. They depend on thoughtful design, clear oversight and shared values.
And yet the federal government still isn't allowing most of its own workforce to use generative AI tools, citing both security concerns and a lack of clear policy. These are trained adults: scientists, analysts, engineers. Many of them are still waiting for guidance on how (or whether) they can use the same tools we now propose putting in front of third graders. Let's take the time to ensure we get this right by aligning educators and tech experts around what matters most: student outcomes.
We can't just bolt AI onto the current system. We have to rethink our educational values: not just efficiency and test scores, but also ethical tech use and human connection. That kind of shift must be designed openly, with the people who know students best: teachers.
The truth is, AI is already in the classroom. More than half of U.S. K-12 teachers reported using AI tools in 2024, double the previous year's figure. Yet a recent Pew survey found that one in four teachers believes AI in education does more harm than good; another third say its impact is mixed.
The risks are real: biased algorithms, privacy breaches, overreliance on automation. But so are the possibilities. Done thoughtfully, AI could restore something our schools desperately need: time. Time for students to go deeper. Time for teachers to be present and coach students' thinking. Time to spark curiosity and build trust; time for learning to become more human, not less.
But to reap these benefits, we should not make AI the next Google — adopted first, questioned later, if at all. We must build an ethical framework alongside the tools. We must pilot, assess and revise before we deploy at scale. And we must create space for teachers, parents and students to shape these decisions — not just companies and politicians.
This is a moment for humility, not hype. The question isn't whether AI belongs in the classroom. It's whether we are ready to make it serve the people in the classroom. If we let AI reshape education without purpose or care, companies will keep building the algorithms, and when those algorithms fail, our students will feel the cost.
That is disruption paired with negligence, leaving our teachers, not the tech companies, to deal with the fallout.
We’ve been here before. Just ask Google.
Over the past decade, schools across the country quietly embraced Google's suite of tools. Google Docs, Gmail, YouTube: these products now form the digital backbone of American classrooms. During the pandemic, their adoption accelerated. In the U.S., more than 30 million students were using Google's education apps as early as 2017. Globally, that number has since ballooned to more than 150 million students, teachers and administrators, according to the company itself. In many districts, Chromebooks, which run Google's operating system, are standard issue.
But this embrace came with few questions asked: Who owns the data? What’s being tracked? Who profits? We didn’t stop to ask the hard questions about letting a big tech company mediate so much of the learning experience — and now, we’re scrambling to catch up.
We'd be wise to learn from that experience. If we fail to build guardrails now, we risk letting AI flatten education, turning schools into testing labs for corporate algorithms rather than communities for human growth. Done right, AI would be designed with and for teachers, not as a shortcut around them. It would focus on tasks that free teachers to do what only humans can do.
Imagine a chatbot that gives students real-time feedback as they draft an essay, flagging confusing phrases so ideas keep flowing and confidence grows, without a days-long wait for corrections. Or an exam platform that doesn't just mark wrong answers but explains why they are wrong, helping students learn from their mistakes while the memory is fresh.
In both cases, AI isn’t replacing a teacher’s work — it’s reinforcing it, turning feedback loops into learning loops.
Take the calculator. When it entered classrooms, many worried that it would destroy basic math skills. Today, we allow students to use calculators — even on the SAT — but with clear standards. We treat them as assistants, not replacements.
AI poses a bigger challenge than calculators ever did — but the lesson endures: the right design, the right rules and the right purpose can make any new technology a tool for deeper learning.
We remember the teachers who challenged us, who believed in us — not the calculator they taught us to use. If we get this right, AI will stay in the background, and the human moments will shine.
Michael Goergen is a writer and policy professional focused on sustainability, technology and ethics. He has worked in science and government and believes the future of learning depends on remembering what makes us human.
This story about AI in education was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.