
According to Inside Higher Ed’s forthcoming survey of campus chief technology/information officers, the top reported barrier to granting students access to generative AI tools is cost.
Transformative. Disruptive. Game-changing. That’s how many experts continue to describe, without hyperbole, generative AI’s impact on higher education. Yet more than two years after generative AI went mainstream, half of chief technology officers report that their college or university isn’t granting students institutional access to generative AI tools, which are often gratis and more sophisticated and secure than what’s otherwise available to students. That’s according to Inside Higher Ed’s forthcoming annual Survey of Campus Chief Technology/Information Officers with Hanover Research.
There remains some significant—and important—skepticism in academe about generative AI’s potential for pedagogical (and societal) good. But with a growing number of institutions launching key AI initiatives underpinned by student access to generative AI tools, and increasing student and employer expectations around AI literacy, student generative AI access has mounting implications for digital equity and workforce readiness. And according to Inside Higher Ed’s survey, cost is the No. 1 barrier to granting access, ahead of lack of need and even ethical concerns.
Ravi Pendse, who reviewed the findings for Inside Higher Ed and serves as vice president for information technology and chief information officer at the University of Michigan, a leader in granting students access to generative AI tools, wasn’t surprised by the results. But he noted that AI prompting costs, typically measured in units called tokens, have fallen sharply over time. Generative AI models, including open-source large language models, have proliferated over the same period, meaning that institutions have increasing—and increasingly less expensive—options for providing students access to tools.
‘Paralyzed’ by Costs
“Sometimes we get paralyzed by, ‘I don’t have resources, or there’s no way I can do this,’ and that’s where people need to just lean in,” Pendse said. “I want to implore all leaders and colleagues to step up and focus on what’s possible, and let human creativity get us there.”
According to the survey—which asked 108 CTOs at two- and four-year colleges, public and private nonprofit, much more about AI, digital transformation, online learning and other key topics—institutional approaches to student generative AI access vary. (The full survey findings will be released next month.)
Some 27 percent of CTOs said their college or university offers students generative AI access through an institutionwide license, with CTOs at public nonprofit institutions especially likely to say this. Another 13 percent of all CTOs reported student access to generative AI tools is limited to specific programs or departments, with this subgroup made up entirely of private nonprofit CTOs. And 5 percent of the sample reported that students at their institution have access to a custom-built generative AI tool.
Among community college CTOs specifically (n=22), 36 percent said that students have access to generative AI tools, all through an institutionwide license.
Roughly half of institutions represented do not offer student access to generative AI tools. Some 36 percent of CTOs reported that their college doesn’t offer access but is considering doing so, while 15 percent said that their institution doesn’t offer access and is not considering it.
Of those CTOs who reported some kind of student access to generative AI and answered a corresponding question about how they pay for it (n=45), half said associated costs are covered by their central IT budget; most of these are public institution CTOs. Another quarter said there are no associated costs. Most of the rest of this group indicated that funding comes from individual departments. Almost no one said costs are passed on to students, such as through fees.
Among CTOs from institutions that don’t provide student access who responded to a corresponding question about why not (n=51), the top-cited barrier from a list of possibilities was costs. Ethical concerns, such as those around potential misuse and academic integrity, factored in, as well, followed by concerns about data privacy and/or security. Fewer said there is no need or insufficient technical expertise to manage implementation.
“I very, very strongly feel that every student that graduates from any institution of higher education must have at least one core course in AI, or significant exposure to these tools. And if we’re not doing that, I believe that we are doing a disservice to our students,” Pendse said. “As a nation we need to be prepared, which means we as educators have a responsibility. We need to step up and not get bogged down by cost, because there are always solutions available. Michigan welcomes the opportunity to partner with any institution out there and provide them guidance, all our lessons learned.”
The Case for Institutional Access
But do students really need their institutions to provide access to generative AI tools, given that rapid advances in AI technology also have led to fewer limitations on free, individual-level access to products such as ChatGPT, which many students have and can continue to use on their own?
Experts such as Sidney Fernandes, vice president and CIO of the University of South Florida, which offers all students, faculty and staff access to Microsoft Copilot, say yes. One reason: privacy and security concerns. USF users of Copilot Chat use the tool in a secure, encrypted environment to maintain data privacy. And the data users share within USF’s Copilot enterprise functions—which support workflows and innovation—also remains within the institution and is not used to train AI models.
There’s no guarantee, of course, that students with secure, institutional generative AI accounts will use only them. But at USF and beyond, account rollouts are typically accompanied by basic training efforts—another plus for AI literacy and engagement.
“When we offer guidance on how to use the profiles, we’ve said, ‘If you’re using the commercially available chat bots, those are the equivalent of being on social media. Anything you post there could be used for whatever reason, so be very careful,’” Fernandes told Inside Higher Ed.
In Inside Higher Ed’s survey, CTOs who reported student access to generative AI tools by some means were no more likely than the group over all to feel highly confident in their institution’s cybersecurity practices—although CTOs as a group may have reason to worry about students and cybersecurity generally: Just 26 percent reported their institution requires student training in cybersecurity.
Colleges can also grant students access to tools that are much more powerful than freely available and otherwise prompt-limited chat bots, as well as tools that are more integrated into other university platforms and resources. Michigan, for instance, offers students access to an AI assistant and another conversational AI tool, plus a separate tool that can be trained on a custom dataset. Access to a more advanced and flexible tool kit for those who require full control over their AI environments and models is available by request.
Responsive AI and the Role of Big Tech
Another reason for institutions to lead on student access to generative AI tools is cultural responsiveness, as AI tools reflect the data they’re trained on, and human biases often are baked into that data. Muhsinah Morris, director of Metaverse programs at Morehouse College, which has various culturally responsive AI initiatives—such as those involving AI tutors that look like professors—said it “makes a lot of sense to not put your eggs in one basket and say that basket is going to be the one that you carry … But at the end of the day, it’s all about student wellness, 24-7, personalized support, making sure that students feel seen and heard in this landscape and developing skills in real time that are going to make them better.”
The stakes of generative AI in education, for digital equity and beyond, also implicate big tech companies whose generative AI models and bottom lines benefit from the knowledge flowing from colleges and universities. Big tech could therefore be doing much more to partner on free generative AI access with colleges and universities, and not just on the “2.0” and “3.0” models, Morris said.
“They have a responsibility to also pour back into the world,” she added. “They are not off the hook. As a matter of fact, I’m calling them to the carpet.”
Jenay Robert, senior researcher at Educause, noted that the organization’s 2025 AI Landscape Study: Into the Digital AI Divide found that more institutions are licensing AI tools than creating their own, across a variety of capabilities. She said digital equity is “certainly one of the biggest concerns when it comes to students’ access to generative AI tools.” Some 83 percent of respondents in that study said they were concerned about widening the digital divide as an AI-related risk. Yet most respondents were also optimistic about AI improving access to and accessibility of educational materials.
Of course, Robert added, “AI tools won’t contribute to any of these improvements if students can’t access the tools.” Respondents to the Educause landscape study from larger institutions were more likely than those from smaller ones to report that their AI-related strategic planning includes increasing access to AI tools.
Inside Higher Ed’s survey also reveals a link between institution size and access, with student access to generative AI tools through an institutionwide license, especially, increasing with student population. But just 11 percent of CTOs reported that their institution has a comprehensive AI strategy.
Still, Robert cautioned that “access is only part of the equation here. If we want to avoid widening the digital equity divide, we also have to help students learn how to use the tools they have access to.”
In a telling data point from Educause’s 2025 Students and Technology Report, more than half of students reported that most or all of their instructors prohibit the use of generative AI.
Arizona State University, like Michigan, collaborated early on with OpenAI, but it has multiple vendor partners and grants student access to generative AI tools through an institutionwide license, certain programs and custom-built tools. ASU closely tracks generative AI consumption in a way that allows it to meet varied needs across the university in a cost-effective manner, as “the cost of one [generative AI] model versus another can vary dramatically,” said Kyle Bowen, deputy CIO.
“A large percentage of students make use of a moderate level of capability, but some students and faculty make use of more advanced capability,” he said. “So everybody having everything may not make sense. It may not be very cost-sustainable. Part of what we have to look at is what we would describe as consumption-based modeling—meaning we are putting in place the things that people need and will consume, not trying to speculate what the future will look like.”
That’s what even institutions with established student access are “wrestling with,” Bowen continued. “How do we provide that universal level of AI capability today while recognizing that that will evolve and change, and we have to be ready to have technology for the future, as well, right?”