I was in my first year of law school when the internet became public. Professor Peter Martin, erstwhile dean, was already way ahead of the curve. He established Cornell's Legal Information Institute, and he shaped the legal research and writing curriculum to take into account the new tools the internet offered law students and everyone else. In those early years, using electronic search to find and Shepardize cases was a great boon. An assignment that formerly would have taken a law clerk a few hours became a matter of minutes. I dazzled supervisors with how fast I returned with results, until I confessed that it was not me but the machine doing the work. Moving forward, no one looked back.
I taught before I went to law school (Ph.D. in history), and I have taught a lot since, mostly in the area of computer information science (CIS), with a specialty in internet law and policy, now being refashioned as culture, law and politics of information policy. In fact, that is the name of a course I am teaching this semester at the Brooks School of Policy at Cornell. In the early years, the pedagogical trick for evading academic integrity violations was to avoid multiple-choice tests, because they were stolen, copied and shared among students. I think the last one I ever administered was at the University at Buffalo, when I taught the survey course in American history to a large class of 125 students. That was in 1991. I was pregnant with my now 31-year-old son, and the Buffalo Bills were on their way to the Super Bowl.
Since then I have chosen take-home essays that comprehensively synthesize the materials and resources of the course, including readings and class discussions. Depending on the type of course, I also emphasize participation and have used a variety of means for students to contribute beyond speaking in class (for example, discussion boards in learning management systems). A long time ago, when I was teaching history, I had academic integrity cases that usually revolved around plagiarism in the course of writing papers on generic topics. I stopped giving those kinds of assignments, especially as I moved into CIS, and have not had a case since.
Enter ChatGPT. Today, The New York Times reports that this new artificially intelligent tool is upending testing as we know it in higher education. Already we learn that the largest public school system in the country, New York City's, has banned its use. Colleges and universities are smart not to go the censorship route, for many reasons, not least the practical: students can use Starbucks Wi-Fi or any other connection to access it even if their institution blocks it on its networks. The question remains, however, what to do about academic integrity in light of this new development.
As I put together the syllabus for the new course, one of the administrators who assists faculty (to be sure it meets New York State standards) suggested that I include something specific about ChatGPT. I broadened it out to artificial intelligence generally, and here is what I added.
A Special Note About the Use of Artificial Intelligence for Coursework
Originality is the cornerstone to all academic endeavors. We stand on the shoulders of those who have come before us to teach and learn, research and analyze to produce newly insightful work. The expectation of this course and its instructor is that all work produced for a grade will be the sole product of a student’s endeavors to meet those academic goals.
Students are encouraged to use artificial intelligence among many other (re)search resources if a student finds the resources a useful tool. Students must not substitute the substance of their work with the results of such (re)search tools, however, as that act would contravene the rules of academic integrity and the academic values underlying them.
For undergraduates, please note that exams will ask you to synthesize readings, lectures and class discussion. The assignment is intentionally designed to stimulate critical thinking and individual innovation. For graduate students, please take careful note of the instructions above that the report must be written from the perspective of the particularized learning within this course. Again, this assignment is designed to hone your academic abilities to interpret book-length materials in the context of particularized queries, thought and research.
Will this be enough to put students on notice about potential academic integrity violations through the use of artificial intelligence tools? I don't know, but it is a start. I heard Martin in the back of my mind as I wrote it. Paraphrasing from memory, it goes something like this: "Education and learning will no longer be about how much an individual can absorb and retain but about how to find information." In other words, how to search. And then, of course, do research in a meaningful way no matter the enterprise, whether academic, scientific, technological or workplace. Make that search applicable to the question or project at hand. That is why I call for synthesized analysis in final papers. I want to see what a student can do with information, how they think about it, and not just whether they can spit stuff back.
We are at another crisis/opportunity moment in the realm of technology and academic integrity. While some colleges and universities perceive crisis, I can't help but see opportunity. It provides us with the chance to examine ourselves as those entrusted with the privilege of imparting "knowledge" and "learning." Are we doing our best? Are we keeping up with the technological advances that young people will have to contend with as they move through life and work, in a manner that genuinely helps them, or are we holding on to older methodologies that would only hold them back? Because I do not know K-12 education, I should be cautious in making a statement, but going out on a limb I will suggest that if ChatGPT or any other program can get us away from tendencies to "teach to the test," then artificial intelligence has brought upon us a really good thing.
Mis- and disinformation now makes up about a third of the material of this new course. For the last couple of semesters, I have been inching my way toward including that topic. Given the political landscape, globally as well as in the United States, it could, and should, be its own course. Artificial intelligence will certainly play a role in that sphere as it emerges in all significant walks of life; take a look, for example, at the essay Bruce Schneier published just yesterday in The New York Times on the subject of lobbying and political influence. We must deal with it. Panic will not help. Tried-and-true methods of critical inquiry look both forward and back as the foundation of real learning. Martin's point was not just finding the material but what you did with it. Our challenge in higher education confronting artificial intelligence is not to capitulate to the robotic, the wonders of machine learning notwithstanding, but to think in ways that not only maximize innovation but speak to ethics. Now there's a thought. A quality that is distinctly human. Why don't we use this juncture to focus on that?