By Steve Caplan
Credit: Matheus Bertelli / Pexels
A recent New York Times investigation revealed OpenAI’s ambition to make artificial intelligence the “core infrastructure” of higher education. In California, that vision is already a reality: The California State University system has committed $16.9 million to provide ChatGPT Edu to 460,000 students across its 23 campuses. But this massive investment misses a crucial opportunity to develop the strategic thinking capabilities that make students genuinely valuable in an AI-augmented workplace.
The irony is striking. OpenAI helped create the problem of students outsourcing critical thinking to chatbots, and now presents itself as the solution by making that outsourcing even more seamless. Recent research reported in Psychology Today found a negative correlation between frequent AI use and critical thinking abilities, particularly among younger users. When students delegate decision-making and problem-solving to AI, they bypass the very mental processes that build strategic capabilities.
California State University’s investment in ChatGPT Edu is significant and potentially transformative. But spending almost $17 million on AI tools without a strategic framework is like buying students calculators without teaching them mathematics. The investment itself is sound; what’s missing is instruction in directing these powerful tools strategically rather than depending on them.
Students in the CSU system already possess remarkable strategic thinking skills that traditional academic metrics don’t capture. Here are a few examples. Working multiple jobs while attending school requires sophisticated resource optimization. Supporting families demands stakeholder management and priority balancing. Navigating complex bureaucracies develops systems thinking. Translating between different cultural communities builds pattern recognition across domains.
These aren’t just life experiences — they’re strategic capabilities that, when developed and articulated, become powerful career advantages in an AI-augmented workplace. The goal should be to help students recognize and leverage these skills, not replace them with chatbot dependency.
European business schools are already proving that the strategy-focused approach works. At Essec Business School, outside of Paris, executive education programs focus on developing “strategically fluent leaders” who use AI as a strategic tool rather than a replacement for thinking. Students learn to maintain strategic direction while leveraging AI capabilities — exactly what CSU students need. When executives can apply strategic frameworks to AI integration, they don’t merely use the technology better; they direct it toward genuine business value.
A recent University of Chicago Law School study found that even AI systems trained on specific course materials made “significant legal errors” that could be “harmful for learning.” This isn’t about AI’s current limitations; it’s about the fundamental difference between tactical execution and strategic judgment. AI excels at processing information within defined parameters, but strategic thinking requires the uniquely human ability to see patterns across domains, understand complex motivations, and envision new possibilities.
The democratization of AI tools actually creates unprecedented opportunities for students from diverse backgrounds to translate their strategic insights into career success. But only if we teach strategic frameworks, not just tool usage.
In my courses at the University of Southern California’s Annenberg School — spanning advertising, social media, public relations and political communications — I’m developing approaches that emphasize strategic thinking alongside AI capabilities. Rather than teaching tool proficiency alone, I focus on helping students build frameworks for directing these tools effectively. The goal isn’t AI literacy — it’s strategic literacy enhanced by AI capabilities.
Rather than criticizing CSU’s AI investment, we should help the system maximize its value. Imagine courses that help students identify their strategic thinking patterns from real-world experience, develop frameworks for human-AI collaboration, and practice directing AI capabilities toward strategic goals. Students would graduate not as AI users, but as strategic directors of AI — exactly what employers need, and exactly what justifies CSU’s significant investment.
This isn’t about rejecting AI in education. It’s about ensuring that as AI handles tactical execution, we develop the strategic thinking capabilities that become more valuable, not less. CSU students bring strategic insights from lived experience that no chatbot can replicate. The question is whether we’ll help them recognize and develop these capabilities, or teach them to depend on tools instead.
We don’t need AI-native universities. We need strategic-thinking native students who can direct AI capabilities toward human purposes. That’s the transformation worth investing in.
•••
Steve Caplan teaches strategic communications at USC’s Annenberg School for Communication and Journalism and is the author of “Strategy First: Thriving in the Face of Technological Disruption.”
The opinions expressed in this commentary represent those of the authors. EdSource welcomes commentaries representing diverse points of view. If you would like to submit a commentary, please review our guidelines and contact us.