In our newest "how can we help you?" thread, a reader writes:
My employer uses Turnitin and asks us to investigate every single case where Turnitin shows any percentage of an assignment as AI-generated. I think this has gotten a bit out of hand, despite my constantly warning my students about the policy. (Students who have something detected also tend to conclude their assignments with “some experts say X, some experts say Y, and this is an important issue, so we need to investigate further and have more discussions.”)
I think this has substantially increased my workload and that of the casual staff who work with me. Is this getting more common? How have others dealt with the increased workload and the potential unpaid work for casual staff?
Good questions. I haven't had a major surge in investigations or made any major changes to how I teach yet, but I recently spoke to a friend who has, and who told me that they are now simply having students do all of their work by hand in class.
What about the rest of you? Have you experienced a big surge in academic misconduct cases because of AI? If so, how are you dealing with it? And whether or not you have experienced a surge, have you adopted any teaching strategies to prevent AI-based misconduct?
Last term I had no plagiarism cases to deal with; this term I had five, all ChatGPT-related.
Posted by: Le Chat Demoniaque De GPT | 06/26/2023 at 11:35 AM
Besides the fact that it's time-consuming to investigate every case, it's also very hard to prove plagiarism in these cases, and detectors are not reliable enough to provide proof. I think the best way to deal with this is just to have all work done in class under supervision.
Posted by: ehz | 06/26/2023 at 12:53 PM
Double the amount of confirmed plagiarism in Spring 2023 compared to prior semesters. Triple the amount of suspected plagiarism in Spring 2023 compared to prior semesters. All instances of confirmed plagiarism in Spring 2023 were AI-related; previously, students copied from essay mills and other sources without attribution.
Posted by: Assistant Prof | 06/26/2023 at 01:02 PM
I had a couple of extra cases just before the summer; I assume students didn't really know about it yet. The baseline rate of cheating here is quite high, however.
My partner just taught a summer course here and nabbed 30 of her 35 students. Judging from her experience, it's easier to tell who's plagiarized now (because it's all from the same source and essentially the same sentences), but harder to prove definitively.
Thankfully, we don't need to report each and every case. But really, we should. The academic culture here is broken.
Posted by: Michel | 06/26/2023 at 02:06 PM
I've never had a plagiarism case (that I know of) until last term, when I had 2 confirmed AI cases. The class size was 40, meaning 5% of the class copied straight from ChatGPT. Lots more essays were flagged by the detectors, but only in small chunks and without the obvious signs (made-up quotations, etc.), so I'm assuming I just found the tip of the iceberg. I plan to replace the standard essays with in-class assignments of some variety.
Posted by: Elizabeth | 06/26/2023 at 03:28 PM
I had four confirmed cases of AI plagiarism this past semester that I sent to the Provost. (They were confirmed in the sense that I formally accused the students after my investigation, and they admitted to using ChatGPT or ChatGPT+Quillbot.)
Most AI detectors were very poor at detection earlier this year, but in my own experiments I have recently found Winston AI to be a reliable detector. However, there is still a gap between detection and proof, since detectors like Winston AI's are black boxes, just like the LLMs themselves.
Shameless plug:
In part because of my experiences, I recently started a blog/newsletter on tech (read: mainly AI) in higher ed, called 'AutomatED'. The web version is located here: https://automated.beehiiv.com/. Since I am a philosopher, you all will notice that many of the pieces I write are somewhat philosophical, but we also do a fair bit of empirical investigation into aspects of the issue, like our AI-immunity challenge, where we try to crack professors' take-home assignments with AI tools in under an hour each (and explain how we did it and what the takeaways are).
Posted by: Graham Clay | 06/26/2023 at 05:08 PM
It might be helpful when reporting on this trend if people can say what kind of class they are teaching. For example, is it first year or upper level? Is it a compulsory course for some or all students? Is it largely/entirely philosophy majors or lots of people taking it as an elective? And so on.
The reason I ask is that the underlying problem, it seems to me, is the motivation of the students taking these courses. Why are they cheating? Philosophy degrees aren’t very useful for getting jobs (I can understand cheating in business school; I’m glad I don’t teach in one). So why bother taking philosophy just to cheat? Are you in the US or somewhere else?
We won’t beat this technology, except perhaps by going back to all in-person assessment (and that has many, many other downsides). It seems better, I think, to try to work out why so many students are cheating and how we can motivate them not to.
Of course, that is a long term project. And obviously we need to do something now about the students who are cheating. So I’m not against discussing strategies that help right now. But while we do that I think we should be looking to the future and starting to work on ways to treat the cause not merely the symptoms.
Posted by: Future thinking because I have no immediate teaching requirements | 06/27/2023 at 06:09 AM
Absolutely. I'm catching roughly 10 cases a semester (or more even), with 80-100 students per semester. In the fall I'm going to experiment with no papers. Just quizzes, exams, and in-class participation. I know it's not the best method, pedagogically, but my salary loses purchasing power every year. I'm an employee, not a martyr.
Posted by: Assc prof | 06/27/2023 at 08:59 PM
When ChatGPT first came out, I wasn't sure how to handle things yet, so I switched from papers to essay exams in my classes. In one case I was teaching a senior seminar for humanities students (outside of my philosophy department), which is supposed to be a writing-intensive seminar, the point of which is to make sure students do a substantial writing project in their final year. Because students in this class are now just writing essay exams, they are not really getting the full value of the seminar. It's as if students don't realize that this software means they are literally paying the same price for less education. I haven't figured out how to address this situation yet, but I think it's pretty unfortunate.
Posted by: AnonymousL | 07/04/2023 at 12:24 AM