Faculty Members Weigh How To Enter the Brave New World of AI

January 31, 2023 | By Matt Kelly, mkelly@virginia.edu

“Is it live or is it Memorex?”

That advertising slogan from the early 1980s hinted that the quality of a Memorex audio cassette would be so good that users couldn’t tell if they were listening to something live or something on tape.

Fast forward about 40 years and the same question could apply to writing generated by an AI program called ChatGPT. The game-changing app was released in November and is now so popular it’s sometimes impossible to get onto its website.

Faculty members this winter are coming to grips with artificial intelligence applications that can draft student papers seemingly flawlessly – or even with flaws, if requested. Instead of copying existing works, these AI applications can create a new work, using a text-generation model trained on massive online datasets.

“The tools are leveraging what is known as generative AI,” said Michael Palmer, director of the University of Virginia’s Center for Teaching Excellence. “That means they’re not making guesses. It’s not a predefined set of responses. It’s not pulling a line of text out of the web. It’s generating responses from massive datasets. And it’s just based on probability and patterns, so it can write fluent, grammatically correct, mostly factual text – instantly. All the user has to do is give it a few prompts, which the user can refine with a series of follow-up prompts.”

Palmer made a distinction between generative and predictive AI. Predictive AI guesses what the writer wants to say, based on word patterns and previous behavior. Google’s Gmail application, which anticipates what you want an email to say next, is an example, as are text programs that guess how you want to finish a sentence. But generative AI creates from scratch, using algorithms that have learned from data drawn from the internet.
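In code terms, the distinction looks roughly like this. The sketch below is a toy illustration only – the function names and the tiny lookup table are invented stand-ins, not any real product’s API:

```python
# Toy contrast between predictive and generative AI.
# Everything here is an illustrative stand-in, not a real system.

# Predictive AI: given the words typed so far, suggest likely next words,
# the way Gmail's autocomplete does.
def predict_next_words(text_so_far: str) -> list[str]:
    # A real predictor ranks candidates by probabilities learned from
    # word patterns and past behavior; this lookup just fakes the idea.
    canned = {"Thanks for your": ["help", "time", "email"]}
    return canned.get(text_so_far, [])

# Generative AI: given a prompt, compose a whole new passage from
# patterns learned from internet-scale data.
def generate(prompt: str) -> str:
    # A trained model would sample fluent new text here; this stub only
    # marks where that generation step happens.
    return f"<model-generated text responding to: {prompt!r}>"

print(predict_next_words("Thanks for your"))  # ['help', 'time', 'email']
print(generate("Write a short outline on AI in the classroom"))
```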

Andrew Pennock believes there are opportunities to teach students how to think about artificial intelligence. (Photo by Dan Addison, University Communications)

“It can create any sort of text from your prompts – outlines, essays, poetry, song lyrics, computer code … you name it, it can do anything you ask it to do,” said Andrew Pennock, an associate professor of public policy at UVA’s Frank Batten School of Leadership and Public Policy. “It does it lightning fast. And it does it in a very conversational way. Once you start a conversation, it remembers the context in which that conversation started. I can tell it to introduce errors, I can tell it to write as a fifth-grader, I can tell it to write as a graduate student.”
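The “remembers the context” behavior comes from the application resending the running conversation to the model on each turn. Here is a minimal sketch of that loop, assuming a hypothetical model_reply function in place of a real generative model:

```python
# Minimal sketch of how a chat interface keeps conversational context:
# the full message history is passed back to the model on every turn.
# `model_reply` is a hypothetical stand-in for a real generative model.

def model_reply(history: list[dict]) -> str:
    # A real model would generate a response conditioned on the whole
    # history; this stub just reports how much context it was given.
    return f"<reply conditioned on {len(history)} prior messages>"

history: list[dict] = []

def chat(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    reply = model_reply(history)  # the model sees the full history
    history.append({"role": "assistant", "content": reply})
    return reply

chat("Draft a paragraph on honor codes.")
chat("Now rewrite it as a fifth-grader would.")  # "it" resolves via history
```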

Before classes started this semester, Palmer and Pennock held an online workshop with faculty members on the artificial intelligence application and its implications.

“There are faculty who are completely unaware of the tools,” Pennock said. “There are faculty who know about the tools and are scared and thinking, ‘What’s the point of my class or will my assignments make sense in a ChatGPT world?’ And then there are faculty that are both curious and excited about the possibilities.”

Pennock suggested the best approach would be to learn to work with the AI applications.

“It’s a losing battle to be adversarial with students about generative AI,” Pennock said. “There are opportunities here to teach students how to think about the benefits of AI and the limits of it.”

How ChatGPT and other AI programs will be viewed at a University with a heralded honor code remains to be seen. The student-led Honor Committee began tackling that question this semester, but has not made any formal recommendations.

Professor Michael Palmer suggested that writing assignments encourage students to incorporate personal experiences, since the AI does not know about the individual writers. (Photo by Dan Addison, University Communications)

Some professors have already started working with students on how to approach the artificial intelligence program. David Mick, the Robert Hill Carter Professor of Marketing at UVA’s McIntire School of Commerce, studied the application and added a paragraph to his syllabus cautioning students against claiming the AI’s work as their own.

“I prohibit the use of these AI tools to produce content that they can then put into an exercise or a term paper, which they have to produce in my class, and then hand it in as though it was their work,” he said during the workshop. “I stressed the words ‘content’ and ‘text.’ You cannot generate content and text, and then use it as if it were yours.

“However,” he said, “if you can use these tools to find sources of insights, which you can then use – tables, figures, opinions or whatever – that you would quote, you can do that and then put the reference the way you would do any other time you write a paper, so that we know the source of where that idea came from.”

Sarah Ware, co-director of Legal Research and Writing at the School of Law, thinks the application could be used in drafting legal documents.

“There is a lot of legal writing on the internet that AI could pull from,” Ware said. “If it becomes a reliable tool for drafting, it could change how lawyers work. And that could change how we teach – thinking about how to use it skillfully, how to use it ethically. But it’s not there yet. I’ve played with it a bit. The results looked good, but were shallow and sometimes wildly wrong.”

Pennock said there are beneficial uses for the AI applications, such as creating PowerPoint presentations and distilling complex technical text into more easily understood prose.

“How many of our students find the readings, the journal articles, we give them confusing?” said Pennock, who noted that an application named “Explainpaper” allows a user to “upload a paper, highlight the text that you find confusing, and it will give you a plain English version.”

Pennock highlighted a 537-word page of journal text, and the AI program reduced it to 170 words.

“Students can read much more quickly as they read this,” Pennock said of the shorter version. “And you can imagine if students work together, they could create very short versions that are maybe dumbed down, but very quick reads of the journal articles that we’re asking them to engage in. It's missing all the nuance that I find helpful and engaging and wonderful about this article. But to a student who’s entering the conversation without a lot of scaffolding, this might allow them to get there without the nuance.”
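The workflow behind that demonstration amounts to a single instruction plus the passage itself. Here is a sketch of the pattern, with a hypothetical llm_complete function standing in for whatever model a tool like Explainpaper actually calls:

```python
# Sketch of the plain-English summarization pattern described above.
# `llm_complete` is a hypothetical stand-in; the article does not say
# which model or prompts a tool like Explainpaper actually uses.

def llm_complete(prompt: str) -> str:
    # A real generative model would return the rewritten passage here.
    return "<plain-English version of the excerpt>"

def plain_english_version(passage: str, target_words: int = 170) -> str:
    prompt = (
        "Rewrite the following journal excerpt in plain English, "
        f"in roughly {target_words} words:\n\n{passage}"
    )
    return llm_complete(prompt)

# e.g., a 537-word excerpt goes in, a roughly 170-word version comes out.
excerpt = "..."  # stands in for the highlighted page of journal text
print(plain_english_version(excerpt))
```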

“[ChatGPT] can draft emails for you, it can draft the meeting minutes,” Palmer said. “However, the application doesn’t know anything about you. So anytime you ask students to reflect on their own experience, it’s not going to be able to do that. So far, ChatGPT knows nothing, or very little, after 2021. That will change. At some point, it will be almost instantaneously updated.”

Mick questioned the ethics of working with artificial intelligence applications, since they learn through interaction.

Historian Jennifer Sessions agreed.

“I feel very torn because we have a responsibility to learn about what this thing is and how it works because some of our students will be experimenting with it,” Sessions said during the workshop. “And I am resentful that there’s no way to do that without contributing to it.”

Sessions said she was also concerned about data privacy and the opacity of the entities and business models behind these tools.


“What do the creators and operators of these tools intend to do with the user data they collect?” she said.

Another discussion revolves around how and what students learn, and what expectations future employers may have about students’ understanding and use of artificial intelligence.

“The employer is going to expect that they know how to use AI tools effectively to do their job better, quicker, faster,” Palmer said. “Some jobs are probably going to become obsolete or change significantly.”

“A dimension we need to add is the ethics of using these tools,” Pennock said. “One aspect is the traditional norms of authorship and academic integrity. Another is thinking about how these programs scrape the internet for data without consent or attention to copyright infringement or harmful content.

“Generative AI is not something that we can just wish away. We will have to learn to work with it and live with it. And I think it can have beneficial implications for how we teach and how students learn. This will play out over time as the technology matures and we grapple with it in our courses,” Pennock said.

“For now, there’s value in engaging students in conversations about the ethics of a generative AI program producing content without a transparent process for knowing where the underlying data came from and how it was assembled into the text it produces.”

Media Contact

Matt Kelly

University News Associate, Office of University Communications