
What happens when plagiarism goes digital?

‘A robot ate my homework’ won’t cut it as an excuse for tardy assignment delivery. Technological advances such as generative AI and essay mills bring new ethical questions to bear on the higher education sector

The rise of artificial intelligence requires educators to gain a proper understanding of how students are using it and how the sector needs to keep up.

If you could cheat in your exams — and get away with it — would you do it?

This is one of the ethical dilemmas facing college students today. Over the past decade or more, third-level lecturers have become wise to the use of so-called “essay mills”, which involve students paying a company to write an essay for them. But students had to quite actively seek these companies out and pay them.

The rise of artificial intelligence (AI), however, presents novel temptations to cheat. Students can easily find the software online, and anti-plagiarism technologies are struggling to keep up.

Dr Robert Mooney is a lecturer in sport and exercise science at Atlantic Technological University (ATU) Galway. He has a background in electrical and electronic engineering and carried out his PhD at the University of Galway, on the topic of wearable technology.


He wanted to get a proper understanding of how students are using artificial intelligence, not just as a plagiarism tool, but also as a legitimate study tool.

“It has been pretty clear that students have been using AI engines more and more in the past couple of years,” he says.

“The knee-jerk reaction from us as staff is to penalise this as a deliberate act of plagiarism. I feel that this approach is flawed, as it makes us look like old-fashioned teachers cracking the whip and, also, it will become increasingly difficult to detect as these engines continue to improve.

“This is where the idea of the project came about — in our reaction we seemed to forget to ask the students for their opinion, so that is what I attempted to do. We are all looking for solutions and it seems obvious that the end-user voice should be part of this too.”


Mooney applied for funding through the National Technological University Transformation for Recovery and Resilience (N-TUTORR) programme, which is a collaboration between the technological universities and two remaining institutes of technology to explore and improve their student experiences.

“I was really excited when I saw the N-TUTORR project advertised,” he says.

“I felt it was a great way to engage with our students and gain some insight into their own perceptions and practices when it comes to AI and their education. I applied for the funding based on this idea and was fortunate to be selected.

“The aim of the project was to give the students a chance to speak up.

“Twenty students from the department of sport, exercise and nutrition at ATU Galway were involved. We met and discussed how best we can gather some data and then developed a questionnaire to send out across the ATU student body. The students were excellent and really helped to frame the course of this work.”

Using Microsoft Forms, Mooney aimed to survey as many students as possible.


The survey was anonymous, but Mooney had to overcome some initial suspicion among students that their responses could get them in trouble, reassuring them that the information would not be used to catch cheating.

“There are ways that we pick up plagiarism, particularly as we know our students here and will pick up on sudden changes in their quality, writing style or grades,” Mooney says. “It may be that one paragraph stands out.”

Mooney says that there can be a preconceived idea that the tools are for assessment alone, but that students are also building them into their learning.

“They are using them for study notes, for supplementing learning from lecturers and to gather information. AI is also being used so that students can generate a model answer for past exam papers, which they can use to revise.”

The rise of AI is increasingly pushing lecturers away from older models of assessment, primarily exams and essays, says Mooney.

“My colleagues and I are moving away from traditional essays, and more towards oral and video presentations, question-and-answer sessions to assess understanding, and integrating chatbots such as ChatGPT into assessments. For instance, lecturers may generate an essay using ChatGPT and then get students to critique it. They can learn about their subject, and also see what AI is getting right and wrong.

“I think it is a really strong idea, but a big concern I have is that academic writing is such a key skill we aim to develop in our students, who are training to be scientists. I am not concerned that this area is going to run away from us, as everyone is approaching the issue positively and looking for solutions to embrace new technology.”

Mooney says he was not surprised to find out that so many students were using AI, but it is now helpful to have objective measures.

“Seventy-five per cent of survey respondents have used an AI engine during their studies, and 48 per cent used it during the previous month. Many others are using AI on a weekly basis.”

He was surprised, however, by how students are using AI tools.

“My naive opinion was that it was all for the purposes of cheating, but that wasn’t necessarily the case,” he says.

“On the downside, it is now almost seen as something everyone should do, and that they’re missing out if they’re not using it.”

Mooney says that this project could be scaled up and carried out in other subject areas at other third-level institutions.

“Students have found ways to integrate these tools into their learning, using AI to help with grammar, to generate study notes and to research topics.

“This was always about hearing the student voice, and including the students in this collaboration was key for me.

“It seems that our students are much more clued into the potential positives associated with these new technologies and that perhaps it is us as educators who need to give them a bit more credit, and catch up,” he says.

What ATU students think of AI
  • While students should still learn to write well, having more practicals, presentations, and exams may prevent AI from completing the degree in the place of the student.
  • I think [AI] can be useful for idea generation but should be carefully used. Over-reliance may lead to poor critical thinking and brainstorming abilities.
  • AI detectors are unreliable.
  • I would not directly copy from AI or submit references it gives me, but I would use it to help generate ideas and answers for my assessments.
  • AI technology should be used as a tool to further help the student and lecturers; with all new technology, it should be embraced and (people should) see the potential benefits from it, rather than the old-school mentality of, ‘it’s new, it shouldn’t be used’.
  • Sometimes it is incredibly hard to find the words to rewrite something to avoid plagiarism, especially after reading what you want to explain in a journal. It helps the clarity of the work. AI is always going to be around and develop.
  • While AI engines are not 100 per cent reliable, for courses such as software it is extremely helpful as it can search the web and give you documentation that is easier to read, understand and implement. As AI gets more reliable, I believe it should be integrated to some degree into courses and colleges as I find it helps a lot if you’re struggling with certain topics.