When it comes to artificial intelligence, faculty members across disciplines have had a demanding year. Many have redesigned assignments and developed new course policies in response to generative AI tools. At conferences and in idle moments, some have pondered what makes prose human. (One possible answer: burstiness.) Others have designed, delivered or participated in workshops focused on AI in teaching and learning, with or without institutional support. One sent students a message that he will “not grade this chat Gpt shit.” (No doubt the fallout required time.)
Amid 2023’s AI disruption, professors have also grappled with a paradox. In one narrative, large language models offer much-needed assistance to students by bolstering their creative, research, writing and problem-solving skills. Students with disabilities also benefit from AI tools that, for example, provide executive function support. But in another narrative, algorithms reproduce systemic bias and hold potential to widen the education gap like no other tech before them.
“There are two schools of thought,” Belkacem Karim Boughida, dean of libraries at Stony Brook University, said, adding that he is committed to a middle ground. “All data are biased, and you can approach tools in an ethical and responsible way.”
An AI Literacy Divide
Some students already possess sophisticated skills in prompt engineering—the art of crafting questions for natural language processing tools to get better results. Others have scant experience in conversing with machines.
“What’s happening is the rich—so to speak—are getting richer,” said Lewis Ludwig, professor of mathematics and director of the Center for Teaching and Learning at Denison University. “Somebody who knows what they’re doing can really make this thing sing, and those who don’t know how to use it are kind of left in the dust.”
Many instructors are working to help students navigate this AI divide, but such efforts demand nuanced calibration. In an ideal world, students would find their way to a prompt-engineering sweet spot—one in which they leverage AI tools for learning without hindering their personal or academic growth.
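What might that sweet spot look like in practice? The sketch below is a hypothetical illustration of the difference prompt engineering can make; the prompts are invented for this comparison and are not drawn from anyone quoted in this article.

```python
# Hypothetical illustration of prompt engineering: the same question, asked
# two ways. Neither prompt comes from the sources quoted in this article.

# An underspecified prompt, likely to yield a generic answer:
novice_prompt = "Explain photosynthesis."

# A refined prompt that sets a role, an audience, a length and a format,
# the kind of context that typically steers a chat bot toward a more
# useful response:
practiced_prompt = (
    "You are a biology tutor for first-year college students. "
    "Explain photosynthesis in about 200 words, define each technical "
    "term on first use, and end with two self-check questions."
)

print(novice_prompt)
print(practiced_prompt)
```

The refinement requires no technical skill, only an awareness that these tools respond to context and constraints, and that awareness is precisely what Ludwig suggests is unevenly distributed.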
Students sometimes feel more confident about the output of an AI tool than their own work, said Laura Dumin, professor of English and director of the technical writing program at the University of Central Oklahoma. As a result, some may invest less effort in shaping the output of an AI tool.
When ChatGPT was released in late 2022, many students were midway through academic years at institutions they knew well. But students will scatter to new campuses this fall at the start of the new academic year. Some may arrive with savvy knowledge of AI tools. Others, from largely rural U.S. states including West Virginia, Alaska, Mississippi and Arkansas, may have limited digital literacy given the poor internet quality, high latency and spotty access in their regions, according to a 2023 report.
Even students arriving from states that rank high for broadband may be digitally disadvantaged. In California, for example, 40 percent of Latino students lack reliable broadband access.
Also, some incoming freshmen will have graduated from high schools that banned AI-powered tools, while others will have extensive classroom experience with the technology. In Australia, for example, most public schools banned ChatGPT, raising concerns about a digital divide between public and private school students in that country. Even in the United States, many students have quick access to ChatGPT-like tools on the cellphones in their pockets, while others own neither a laptop nor a mobile phone.
“We’re going to see a stratification of who already knows how to play around the AI and who doesn’t, who knows more of the dark side and who is just coming in for the first time,” Dumin said. “That’s a big equity concern.”
AI Tools—for a Price
“The most robust AI is behind paywalls,” Emily Isaacs, executive director for the Office for Faculty Advancement and professor of writing at Montclair State University, told Inside Higher Ed. At Montclair, nearly half of students are recipients of Pell Grants—a marker of low-income status—and nearly half of the faculty members are contingent, Isaacs said. “Both our faculty and students have financial stressors. We’re hearing from students, ‘Is this [AI tool] something I should purchase?’”
Students’ unequal access to premium versions of educational technology mirrors the inequities Isaacs observes every day at what she calls “the commuter school.”
“The student who has a car is able to get to campus much more easily than the student who takes three buses to get from their home in Newark to our campus,” Isaacs said. To foster equity, Isaacs would like to see AI-powered educational products offered as open educational resources. Until then, she wonders whether the cost could be folded into tuition and fees, just as the university already provides all Montclair State students with access to other proprietary software.
In the absence of equity-minded policies, some academics see roles that their units or departments may play on the path forward.
“Academic libraries will likely pick up the slack,” Boughida said. “It will be the same as managing journal licensing … Some colleges won’t have the means to subscribe. Hence, the digital divide.”
Until colleges develop policies addressing inequities, professors who encourage the use of AI tools should be mindful of costs they ask students to incur, Isaacs said.
Culturally Insensitive Chat Bots
Colin Bjork, senior lecturer at Massey University in New Zealand, recently spoke with a Microsoft executive who called non-English languages, including oral and Indigenous languages, “edge cases”—a term for uncommon inputs that perplex computers. That’s because large language models are trained on internet text, whose data sets skew heavily toward standard American English. As a result, AI outputs often fail to reflect the depth and breadth of many students’ multicultural and multilingual experiences.
“We try to teach students to find their voice,” Dumin said. Students who speak African American Vernacular English or Appalachian English, she offered as examples, may view generic AI-generated text as more valid than their own writing. “We might lose some of the diversity of writing and of sound. That would be really sad.”
“Black English matters,” Chi Luu, JSTOR Daily’s resident linguist, recently wrote. Luu makes the case for its “almost relentless creativity” in linguistic innovations, even as it accommodates rich regional and class differences. Its innovations tell stories of migration and movement and have made a mark on standard English by “sliding seamlessly into the language of art, music, poetry, storytelling and social media.” But its marginalization, at times, can have negative impacts on those who speak it, including in job interviews, apartment rentals and interactions with the police, Luu wrote.
Many academics are concerned that large language models are trained on the internet, where bias is well established.
“It’s going to normalize the expectation that everybody else needs to change to look more like this machine’s generative AI language bias default,” said Lance Eaton, director of digital pedagogy at College Unbound, a postsecondary institution that focuses on adult learners who have faced significant barriers to attending college. College Unbound’s bachelor’s degree program is designed around a personalized, interest- and project-based curriculum.
Chat Bots That Empower
Language learners can use AI writing tools to learn vocabulary, genres, idioms, grammar and more. Individuals whose neurodivergence leaves them struggling in social settings, as well as those who fear judgment from their peers, may also benefit. Chat bots are adept and willing conversational partners, and conversing with them offers a low-stakes way to experiment. That, in turn, can support learners’ self-confidence.
“Being able to ask questions about anything is really powerful,” Eaton said. “You don’t feel judgment from the computer.”
Practicing the art of conversation can enhance social mobility, especially for those with communication disabilities. For example, Fiona Given, a lawyer who lives with cerebral palsy, often kept her messages short and economical when composing them with pre-ChatGPT assistive technologies, according to an article in The Conversation. Still, she was concerned that her minimalist replies might be perceived as “curt, if not rude.” ChatGPT, Given found, helps add the polite parts of emails, which saves her time and projects professionalism.
AI-Assisted Career Boosts
Many employers require job applicants to submit cover letters, often even when the skills a job actually demands have little overlap with cover letter–writing skills.
“The ways you have to present yourself, the language, the nuance you’re expected to use … [most] of the time has nothing to do with the actual job you’re applying for,” said Ludwig of Denison.
During his 21 years at the Ohio liberal arts college, Ludwig has participated in numerous faculty hiring committees, particularly for the math department. In that time, he and his colleagues have placed a premium on applicants’ cover letters.
“Maybe we’ve been doing some things wrong,” Ludwig said. “If folks just aren’t good at writing or kind of missed the mark … they don’t make the cut.”
Shiladitya (Raj) Chaudhury, executive director of the Innovation Learning Center at the University of South Alabama, has also participated in search committees in which he perceived a bias among evaluators that was driven partly by language.
AI writing tools “can help ameliorate the effects of somebody expressing something where the content is good but the vehicle of the language for that particular purpose can be standardized,” Chaudhury said.
For now, many employers persist in requiring cover letters for open positions. AI assistance with them “could open doors for people who may have been excluded in the past from even getting an interview,” Dumin said.
Yet human resources professionals are divided on the use of AI assistance in job applications. Some deem it a “marketable skill,” while others see it as a “deal breaker.”
AI Time Savers
Few college professors receive formal training in pedagogy. For new faculty members with minimal teaching experience, generative AI tools could help fill that gap.
“This could increase what new teachers’ lesson plans look like and make them stronger,” Dumin said. “If you have a lesson plan that feels solid, that’s built on something that you feel is good pedagogy, then you go to class feeling a little more confident.”
Dumin has seen AI-generated lesson plans, including from colleagues who swear by the premium version of ChatGPT, which they report is “leaps and bounds better than the free version.” Dumin is “critically optimistic” and “not unimpressed” with the results—phrases that make her laugh and hint at the ambivalence surrounding conversations about AI in education.
For students or teachers who struggle with executive function skills or are short on time, AI tools can assist with prioritizing tasks, organizing information or creating schedules. AI could also reallocate 20 to 30 percent of faculty members’ time away from routine administrative tasks and toward activities that support student learning, according to a World Economic Forum report. In a field where practitioners cite exhaustion and burnout due to unmanageable administrative tasks, that could make a noticeable difference.
In the meantime, educators and policy makers are proceeding with caution in efforts designed to maximize equity and minimize bias in AI. The White House’s Office of Science and Technology Policy released a Blueprint for an AI Bill of Rights in 2022, which envisions a future in which equity metrics are built into algorithms.
Then, last month, the U.S. Department of Education’s Office of Educational Technology released a report offering insights and recommendations on AI in teaching and learning. The report is 71 pages, but if pressed, its message might be reduced to five words contained within it: “Emphasize humans-in-the-loop.”