ChatGPT is forcing universities to rethink plagiarism

By Microsoft

After hearing his colleagues rave about the generative AI tool ChatGPT, Cobbs decided to play with the chatbot while writing an essay on the history of capitalism. The tool is best known for its ability to generate long-form written content in response to user input prompts, so Cobbs expected it to produce a nuanced and thoughtful response to his specific cues. Instead, his screen filled with a generic, poorly written document he would never dare claim as his own.

“The quality of the writing was appalling. The phrasing was clunky and lacked complexity,” says Cobbs. “Logically, I can’t imagine a student using ChatGPT-generated writing for an article or anything when the content is just plain bad.”

Not everyone shares Cobbs’ contempt. Since OpenAI launched the chatbot in November, educators have struggled with how to handle a new wave of student work produced with the help of artificial intelligence. While some public school systems, such as New York City’s, have banned the use of ChatGPT on school devices and networks to curb cheating, universities have been reluctant to follow suit. In higher education, the introduction of generative AI has raised thorny questions about the definition of plagiarism and academic integrity on campuses where new digital research tools come into play all the time.

Make no mistake, the arrival of ChatGPT does not mark the start of concerns about the misuse of the Internet in academia. When Wikipedia launched in 2001, universities nationwide were scrambling to work out their own research philosophies and understandings of honest academic work, pushing policy boundaries to keep pace with technological innovation. Now the stakes are a little more complex, as schools figure out how to handle work produced by bots rather than thorny questions of attribution. The world of higher education is playing a familiar game of catch-up, adjusting its rules, expectations, and perceptions as other professions adjust as well. The only difference now is that the Internet can think for itself.

According to ChatGPT itself, the definition of plagiarism is the act of using someone else’s work or ideas without giving due credit to the original author. But when the work is generated by something rather than someone, that definition is hard to apply. As Emily Hipchen, a board member of Brown University’s Academic Code Committee, puts it, students’ use of generative AI creates a critical point of contention. “If [plagiarism] is stealing from a person,” she says, “then I don’t know if we have a person being stolen from.”

Hipchen is not alone in her speculation. Alison Daily, chair of the Academic Integrity Program at Villanova University, is also grappling with the idea of classifying an algorithm as a person, particularly if that algorithm generates text.

Daily believes that eventually professors and students will have to understand that digital tools that generate text, rather than simply gathering facts, will have to fall under the umbrella of things that can be plagiarized.

While Daily acknowledges that this technological growth raises new concerns in academia, she doesn’t find it entirely uncharted territory. “I think we’ve been in a version of this territory for a while already,” says Daily. “Students who commit plagiarism often borrow material from a ‘somewhere,’ such as a website that doesn’t have clear authorial attribution. I suspect the definition of plagiarism will expand to include things that self-generate.”

Ultimately, Daily says, a student who uses text from ChatGPT will be seen as no different from one who copies and pastes chunks of text from Wikipedia without attribution.

Students’ opinions about ChatGPT are another matter entirely. There are those, like Cobbs, who can’t imagine putting their name on anything bot-generated, but there are others who see it as just another tool, like spellcheck or even a calculator. For Brown University sophomore Jacob Gelman, ChatGPT exists simply as a convenient research assistant and nothing else.

“Calling the use of ChatGPT to pull reliable sources from the Internet ‘cheating’ is preposterous. It’s like saying using the Internet to conduct research is unethical,” says Gelman. “To me, ChatGPT is the research equivalent of [typing assistant] Grammarly. I use it out of practicality and that’s really all.” Cobbs expressed a similar sentiment, comparing the AI bot to “an online encyclopedia.”

But while students like Gelman use the bot to speed up research, others take advantage of its high-capacity prompt processing to generate entire assignments for submission. It might seem obvious what qualifies as cheating here, but different schools around the country offer conflicting interpretations.

According to Carlee Warfield, chair of Bryn Mawr College’s Student Honor Board, the school considers any use of these AI platforms to be plagiarism. The tool’s novelty simply demands more focus on assessing the intent behind student violations. Warfield explains that students who turn in essays entirely produced by AI are categorically different from those who borrow from online tools without knowledge of standard citation practices. Because the ChatGPT phenomenon is still new, student confusion about the ethics is understandable. And it’s unclear which policies will remain in place once the dust settles, at any school.

In the midst of fundamental change in both the academic and technological spheres, universities are being forced to reconsider their definitions of academic integrity to reasonably reflect the circumstances of society. The only problem is that society shows no signs of slowing down.

“Villanova’s current code of academic integrity will be updated to include language that prohibits the use of these tools to generate text that students then represent as text they have independently generated,” Daily explained. “But I think it’s an evolving thing. And what it can do and what we’re going to need to keep tabs on is also going to be kind of a moving target.”

Amid the increasingly complex questions about whether ChatGPT is a research tool or a plagiarism engine, there is also the possibility that it could be used for learning. In other educational settings, teachers see it as a way to show students the shortcomings of AI. Some instructors are already changing the way they teach by giving students assignments bots couldn’t complete, such as those that require personal details or anecdotes. There’s also the matter of detecting AI use in student work, which is a burgeoning cottage industry all its own.

Ultimately, says Daily, schools may need rules that reflect a number of variables.

“My guess is that there will be some general policy development that essentially says, unless you have permission from a professor to use AI tools, using them will be considered a violation of the code of academic integrity,” says Daily. “That then gives faculty broad freedom to use it in their teaching or assignments, as long as they explicitly state that they allow it.”

As for ChatGPT, the program agrees. “Advances in fields like artificial intelligence are expected to drive significant innovation in the coming years,” it says, when asked how schools can combat academic dishonesty. “Schools should constantly review and update their academic honor codes as technology evolves to ensure they are addressing the current ways in which technology is being used in academic settings.”

But then, a bot would say that.
