
[Image: Scientists walking in a corridor, viewed from above. As researchers grapple with how to engage with generative artificial intelligence, Cornell University researchers and faculty offer insights. Credit: sanjeri/E+/Getty Images]

A recent report from a Cornell University task force on AI identifies a framework and perspectives on how generative AI can aid or influence academic research.

The report, “Generative AI in Academic Research: Perspectives and Cultural Norms,” was published Dec. 15 and highlights best practices in the current landscape, how university policies impact the Cornell community and considerations for other faculty members or researchers navigating the new tech tools.

The background: The university developed the Cornell AI Initiative in 2021 to lead institutionwide collaboration around AI development and application. The group published a report in early fall 2023 identifying policies for generative AI in higher education and pedagogy for faculty members.

This most recent report was authored by a task force of researchers, faculty members and staff and led by Krystyn Van Vliet, vice president for research and innovation.

Providing guidance on generative AI is a challenge given the technology’s rapid evolution, and this report is not the be-all and end-all but rather the first step in a larger conversation, Van Vliet says. The authors intend to revisit its guidance annually but are offering the report now to the Cornell community and the larger higher education space to kick-start conversations about research and AI tools.

Report’s purpose: The December report identifies four stages of research, how generative AI can be applied to each stage, the perspectives and cultural norms around generative AI, and discussion questions for stakeholders.

There are two different groups of researchers engaging with generative AI: those developing AI-based tools by creating models and algorithms, and those using or adapting those tools, Van Vliet says. “We wanted to speak to both of those communities’ research practices.”

Within the four stages of research practice (conception and execution, dissemination, translation, and funding/funding agreement compliance), the report lists specific situations in which generative AI should or should not be applied.

“In all of these research stages, we consider GenAI to be a useful research tool that can and should be explored and used to great scholarly advantage,” the report says. “As with all tools, the user is responsible for understanding how to use such tools wisely.”

Key concerns: Some highlighted concerns that carry across research disciplines include:

  • Data privacy. Tools like ChatGPT are not secure or private, and researchers should therefore be aware of the risks of using generative AI tools with their intellectual property or data. The report advises against researchers entering their first-draft research ideas into ChatGPT, for example, because that information could be stored on the provider’s servers and used to train the language model, potentially surfacing for other users. “It’s easier for more people to inadvertently put some of that data prematurely out onto forums where other people have access to it if you are not thoughtful and careful,” Van Vliet says.
  • Transparency. Disclosure of generative AI usage is also critical because the tools are so novel. The report’s authors acknowledge that generative AI tools could eventually become as widely accepted in research work as a spell-checker or graphing calculator, but researchers should still be candid about their use to preserve the community’s ability to replicate and test studies.
  • User responsibility. The principal investigator, or the person in a similar research role, is responsible for validating research outputs created with the help of generative AI and should not be complacent about or overly trusting of the tools. All available generative language models have blind spots and biases rooted in their training data, which makes them fallible.

The report encourages researchers to refer to existing policies and practices given by their research communities and institutions to guide their best practices. Cornell, for example, has a University Privacy Statement and policy on research data retention, which apply to data shared with generative AI models.

For faculty members: While today’s researchers are grappling with how to engage with generative AI tools, faculty members and instructors are engaging future researchers in the classroom.

“I think it will become part of the toolbox of the next generation of researchers,” Van Vliet says.

Cornell’s report offers an appendix of questions for faculty to engage with students about generative AI use and their perspectives. Each question comes with a sample response as well. Just as technology advances research, it also shifts how researchers engage in their work, making it critical for higher education professionals to lead and engage in thoughtful discussions about tools and research processes, Van Vliet says.

“Those are discussion starters that researchers, including the leaders of research teams, can use: you can think about them for yourself, discuss them with your team in meetings or discuss them with your department leadership,” Van Vliet says. “This is a great place to start and then to sharpen your own thinking and learn more within your community.”

Some questions include:

  • What are the conditions, if any, under which a researcher should not use AI to generate research ideas? Examples may vary by research field, by the sources of information included in a prompt (such as FERPA- or HIPAA-protected data) and by the perspectives of collaborating or sponsoring organizations.
  • Should the costs of generative AI be charged to a research account, assuming this is not disallowed by the corresponding funding agreement (i.e., not disallowed by a research sponsor)?
  • When and how should research group leaders (e.g., faculty) communicate these expectations of appropriate/ethical/responsible use of generative AI in research to researchers who are undergraduate students? Graduate students? Postdoctoral researchers? Other research staff?
