My IHE colleague John Warner took to Twitter to express his frustration with a correspondent who drew a parallel between using ChatGPT for writing and using a calculator in math and applied math classes.

Folks on Twitter responded vigorously, often making the point that writing involves thinking, synthesizing and creating something new. In most contexts, the papers that students are asked to produce aren’t assigned in the hopes of generating new knowledge for the public record; they’re assigned to give students practice in doing the work of putting disparate thoughts together into a coherent and readable whole.

To which I say, yes.

I recall a frustrating exchange in graduate school in which a professor asked us to submit a paper outline a month before the paper was due. I tried to explain that I don’t know where the paper is going until I’m actually writing it; any coherence is typically added later. I’ve heard songwriters say something similar about songs: they go where they go, and the job of the songwriter is both to get out of the way and to supply just enough order to make it work. If you reduce them to formulas, they sound formulaic.

I fully agree with Warner’s statement that “we have to fundamentally rethink what we ask students to write,” though I might substitute “produce” for “write.” Moving from the tasks of absorption and recollection to the act of creation requires students to draw on (and develop) different skills; helping them develop those skills at a high level is a challenge that higher education tends to undervalue. Writing is one form of creation, though not the only one.

The challenge of AI, in part, is that it may make distinguishing actual student work from automated production much more difficult. For students who are intrinsically motivated and who have the time and resources to devote to original production, that may not matter much. But enough students are opportunistic in taking shortcuts that colleges will need to confront the question directly.

Last week’s piece about ChatGPT forgetting a half dozen former vice presidents who became presidents, including Joe Biden, contained an embarrassing mistake: I neglected to fact-check the names ChatGPT did provide. One, Franklin Pierce, had never been a vice president.

Sigh.

As the saying goes, artificial intelligence is no match for natural stupidity. That one’s on me.

I have a bit of travel early this week, so the blog will skip a couple of days. It’ll be back later this week.
