Throughout the 2024-25 academic year, Ithaca College has placed an increased focus on artificial intelligence initiatives and AI education across campus, with the Presidential Working Group on AI formulating recommendations and guidelines for the use and implementation of AI. It has been working on a plan of six deliverables, which will be presented to the Board of Trustees for approval in May.
The Presidential Working Group on AI is a group of faculty, staff and students that has been working since July 2024 to create the plan and principles for further integrating AI into the college. In the last few months, the group has visited various representative bodies — Student Governance Council, Faculty Council and Staff Council — for feedback.
Dave Weil, vice president and chief information and analytics officer, said AI implementation must be strategic and appropriate, especially for a small institution. Weil said the focus of AI at a smaller institution should be to reduce the time spent on tedious tasks and enhance interpersonal experiences.
“Things you might spend half an hour doing … the AI tool can do it in a minute,” Weil said. “Now you have 29 more minutes in your day to meet with people or engage with [other] things.”
Weil said there are three categories of AI use to consider on campus: productivity AI like ChatGPT or Microsoft Copilot; embedded AI, or AI that comes bundled with a program, like ZoomAI; and AI programs developed at IC, like the AI support for the Ithaca College Awareness, Response, and Education Team (ICare). ICare is a work group dedicated to quickly responding to referrals that are sent for students exhibiting concerning behavior relating to mental health. The AI for ICare support was built to automate the process of organizing and summarizing student information for the ICare staff before a student meets with them, a process the staff performed manually before October 2024. The AI system was built by staff in the Office of Information Technology and Analytics, who continue to work with ICare to maintain it.
Danni Klein, care manager in the Office of ICare and Student Support, said via email that the AI only uses de-identified information from Homer Connect, and it does not use any outside information. Klein said the AI has transformed her team’s workflow and helped them spend more time with students.
“We can now conduct more meetings and spend less time on documentation,” Klein said.
Jenna Linskens, director of the Center for Instructional Design and Educational Technology, said students need to be familiar with AI because of how it is changing the job market.
“Digital literacy is not a new thing,” Linskens said. “What it is though, is an evolving thing. We need to look at adding AI into that digital literacy and preparing our students for the workforce.”
Linskens said AI, especially generative AI, is not perfect in its responses and the programs are often biased. Bias becomes a part of AI at various stages, from data collection to model training, as the computer is still subject to the opinions and ideas of those who make and program it.
“Helping students to understand [bias in AI] … is part of that literacy piece, understanding you can’t rely on the information that’s provided,” Linskens said. “You still have to do your own work.”
At the beginning of Spring 2025, Ann-Marie Adams, assistant professor in the Department of Media Arts, Sciences and Studies, received an Artificial Intelligence Mini-Grant from the Center for Instructional Design and Educational Technologies. The mini-grant, which began in Fall 2024, is a $600 stipend for developing and integrating AI into course curriculum. Adams said she has been integrating AI into the classwork for her course in media law, even specifically instructing students to use AI on assignments.
The shifting focus on AI in education is not solely in higher education. According to a 2024 AIPRM report, 54% of grade school students use AI tools for schoolwork with around 60% of grade school teachers claiming to integrate it into their teaching plans.
Matthew Clauhs, associate professor in the Department of Music Education, said he offered the use of AI tools in a songwriting class he taught. He said students could prompt the program with a feeling and it would offer chord progressions and instrumental ideas to match that.
“The students in my songwriting class didn’t seem to want to use it at all, so it was more available as an option,” Clauhs said. “I think a lot of our students have, understandably, a degree of skepticism when working with these tools, especially if they are creative artists.”
Clauhs said faculty have both ethical and philosophical concerns when it comes to AI, some even outright rejecting it.
“There’s no way to ban AI or AI tools,” Clauhs said. “It’s absolutely essential that we figure out how to work with [AI] and how to work with it ethically. All of our fields are going to require AI literacy for our students in the future and if our students don’t have AI literacy, they won’t be competitive for their own careers.”
Since the generative AI boom in 2023, more data has been uncovered about the environmental effects of AI. Susan Allen, professor and chair of the Department of the Environment, said AI requires immense computing power, consuming large amounts of electricity and fresh water and causing significant carbon emissions. AI programs require data centers, temperature-controlled buildings dedicated to housing large computer systems.
A recent preprint by researchers at the University of California, Riverside, the University of Houston and the University of Texas at Arlington found that by 2027, AI data centers will consume as much fresh water as half the United Kingdom. By 2026, global data centers are also estimated to rank as the world's fifth-largest electricity consumer, with usage falling between that of Russia and Japan.
Allen said it would be very difficult to measure the environmental impact of AI use on campus, and it likely will not appear on a sustainability report. Scott Doyle ’98, director of Energy Management and Sustainability, said there are currently no measures to capture AI's impact on sustainability reports, but it will become easier to measure as the energy management industry continues to learn more about it.
“I think there’s actually a ton of interesting research in this [industry] that thinks about, ‘Hey, … what’s the best way we can use this for advancing our goals and not going backwards?’” Doyle said.
Doyle said there are ways that AI can assist sustainability and that the college works with Revert Technologies, a Cornell University tech startup. The company makes an AI-powered smart plug that helps assess idle power waste and manage energy costs.
Allen said she believes the school should pursue AI initiatives to keep up with the latest educational technologies. However, she still has concerns regarding the environment and the impact on students’ ability to learn and think for themselves.
“Are we moving to a society where no one has to know anything because you can look it all up?” Allen said. “What does it mean in terms of our ability to communicate with each other?”
Junior Kevin Yang works in the AI Exploratorium, a space where all members of the campus community can come and experiment with premium AI software like Microsoft Copilot, ChatGPT premium and Anthropic Claude. The lab also serves as a site for AI experimentation and research.
Yang said generative AI is a powerful tool for research and task management when used properly, but responsible use is fully up to the discretion of professors and students. Yang said he feels it is better when courses have AI integration, as it becomes harder for people to use it to generate full assignments. He said his Advanced Decision Making with Analytics course integrates AI well.
“I think a lot of professors don’t really understand how easy some of their [assignments] are to have AI do,” Yang said. “[My] professor actually encourages us to use AI … and [in] that class it’s way harder to use AI in a negative way.”
Weil said it is important to understand that AI has limitations but also to acknowledge its potential. He said continued experimentation, staying informed and continuing conversations regarding it is the best way to create guidelines and principles for AI use at the college.
“We have to acknowledge the reality that the world is embracing AI so we can’t ignore it,” Weil said. “In fact, if we choose to ignore it, we will fall behind [and] we will not provide the type of experiences our students want. Ithaca College is an institution dedicated to empowering people through theory, practice and performance. … We are thinking deeply and very intentionally about the use of these tools for Ithaca.”