From the article: The fear that generative AI tools such as ChatGPT would lead to a generation of students cheating and plagiarizing work has come to pass. The situation is so bad that educators are now looking at multiple ways to stop the problem, or at least make the practice much more difficult. Ironically, one of them is to use AI.
Speaking about students who cheat with AI, Gary Ward, a teacher at Brookes Westshore High School in Victoria, British Columbia, told Business Insider, “Some of the ones that I see using it all the time – I think if it wasn’t there, they would just sit there looking blindly into space.”
There were warnings about AI cheating being endemic in education last year. Now, Ward says that “literally” all students are doing it.
One of the ways Ward is trying to combat the problem is to turn the AI against the cheaters: he asks ChatGPT to help him develop assignments that would be difficult for students to complete simply by feeding them into a large language model.
Richard Griffin, a lecturer in the business faculty at Manchester Metropolitan University in Manchester, England, is also using AI to make life harder for the AI cheats. The university has developed an in-house system that can be fed assignments. The system then summarizes how difficult it would be to use AI to complete the work and recommends ways to make doing so more challenging.
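The article doesn’t describe how Manchester Metropolitan’s in-house system actually works. As a purely hypothetical illustration of the idea, a crude version could scan an assignment brief for requirements that anchor the work to the individual student (every marker phrase and weight below is invented for the sketch, not taken from MMU):

```python
# Toy heuristic: score an assignment brief for how easily an LLM
# could complete it outright. NOT the real MMU system.
AI_RESISTANT_MARKERS = [
    "in class", "your own experience", "interview", "field notes",
    "oral defence", "this week's discussion", "data you collected",
]

def ai_vulnerability_score(brief: str) -> float:
    """Return a rough 0..1 score: 1.0 means nothing in the brief ties
    the work to the student, so an LLM could likely do it all."""
    text = brief.lower()
    # Count how many anchoring requirements the brief contains.
    hits = sum(marker in text for marker in AI_RESISTANT_MARKERS)
    # Each anchoring requirement lowers the estimated vulnerability.
    return max(0.0, 1.0 - 0.2 * hits)
```

A generic essay prompt scores 1.0, while a brief requiring an interview and a tie-in to class discussion scores lower; a real system would presumably use an LLM’s own judgment rather than keyword matching.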
Baruch_S on
Now we wait for the armchair quarterbacks to come in here and tell us that teachers just need to teach kids to use AI responsibly.
JCPLee on
There was a good reason we moved away from pure exams and embraced coursework, projects, and other flexible forms of assessment, especially to support different learning styles and reflect real-world skills. That model, however, has been increasingly open to abuse since the internet became ubiquitous, and it is now cracking under the added weight of AI.
Over the past decade, the potential for abuse and misrepresentation of student work has skyrocketed. And now, with generative AI tools widely available, students can produce “original” assignments with almost no effort or understanding. The barrier to submitting plausible-looking work is basically gone. This is especially true at the high school level, where research depth and analytical skills are less developed.
At this point, we may need to take a step back and consider that the only reliable way to assess what a student actually knows may be through in-person, supervised testing or live presentations. It’s not ideal for everything, but at least it ensures the person being assessed is the one doing the work. Otherwise, we risk turning education into a credential mill for those who are best at prompt engineering.
JayList on
As if we hadn’t already deteriorated the education system in the US to the point that most kids can’t read or think critically. Even a decade ago professors were saying these things. AI isn’t the problem here; the problem is that we have been teaching people to take shortcuts their whole lives, and then we handed them the ultimate shortcut.
thisisjustintime on
I wonder how many teachers are running essays through GPT to correct them. Basically AI teaching AI through human drones.
Dentrius on
In my country, oral tests, written tests, and exams are the only things that matter in school when it comes to grades. Homework is mostly just an indicator of how lazy someone is, and it can only drop your grade if you don’t do it. Getting 100% on essays doesn’t matter if you get 30% on tests.
Academics don’t really involve much homework. All grading is done in exercises and tests.
But what do I know, I’m not in a first world country.
H0vis on
The problem here is that until every child has access to an AI assistant the kids that do are going to vastly outperform the ones that don’t.
So make sure everybody has access to one.
And then what you do is you raise the standards accordingly.
It’s like, Okay Little Jimmy, you want AI to write your essay? Then it better be *extremely fucking good*. Like, I would be expecting the definitive thesis on what happens when you give a moose a muffin.
RandeKnight on
Or an end-of-year, in-person, handwritten exam worth 90% of the mark?
My last year of high school finals were 100% of the mark. You could do resits, but you’d have to pay the exam fee again.
xavez on
The problem is the teachers who don’t know how to use AI, not the kids who do.
AvailableDirt9837 on
I’ve never understood why the essay problem can’t be solved… couldn’t they just develop a word processing app that monitors for human input? Like limit copy/paste and have the app monitor for normal writing behavior like human input speed, stopping and starting, going back and rephrasing etc? It really doesn’t seem like a very hard problem to solve, can somebody tell me why my idea wouldn’t work?
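As a rough sketch of the monitoring this comment describes, an editor could log input events and flag ones that don’t look like human typing: large single insertions (likely pastes) or long runs of characters arriving faster than anyone types. The thresholds below are illustrative guesses, not validated values, and the whole design is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    timestamp: float   # seconds since the session started
    chars_added: int   # characters inserted by this event

def looks_pasted(events, burst_chars=80, min_interval=0.05):
    """Return one bool per event: True if the event inserts a large
    chunk at once, or multiple characters arrive implausibly fast
    after the previous event. Thresholds are illustrative only."""
    flags = []
    prev_t = None
    for e in events:
        # A single event inserting a big chunk suggests a paste.
        big_chunk = e.chars_added >= burst_chars
        # Multi-character input with a near-zero gap suggests automation.
        too_fast = (prev_t is not None
                    and (e.timestamp - prev_t) < min_interval
                    and e.chars_added > 1)
        flags.append(big_chunk or too_fast)
        prev_t = e.timestamp
    return flags
```

Even so, this only raises the effort bar: a student can retype AI output by hand, and legitimate tools (dictation, accessibility software, autocorrect) would trip naive heuristics, which may be part of why such apps aren’t standard.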
SemiDiSole on
I say it every time: School is not about learning, it’s about passing specific, time-based performance checks. In the past we used Ritalin and bulimia-studying to pass those; now the kids use AI, but the core problem is exactly the same!
AI will be what forces educators worldwide to rethink our entire education system and how we approach schooling.
ChocolateGoggles on
I think this really pulls to the forefront one thing that we desperately have needed in schools for a long time now (I’m in Sweden, so speaking from this perspective and what little I’ve heard about the USA school system):
Practical implementation of knowledge.
My friends and I used to talk about this, and we still do. There should be much more space given to understanding the society you’re about to enter, way beyond the theoretical level. If we developed practical ways of having students use their knowledge to interact with real-world scenarios, there would be a WANT to learn something, not just get it out of the way because you HAVE TO.
I have no doubt that there are actual winners among the schools in this environment, and it simply has to be those that have already designed their whole system to teach students the value of learning something, and to have them actually experience the “I want to learn this because of x, y and z.”
Bierculles on
Just do in person exams for grading like the rest of the world? This is really not much of an issue in places where having exams year round is the norm. The solution is obvious and we know it works.
YsoL8 on
I don’t really see how this would work?
Anyone determined to take the lazy route can just transcribe from the screen to the paper.
And no current AI system any ordinary person can use has the first idea how to set AI-proof assignments. It doesn’t reason like that.
Jamhead02 on
Workplaces are using more and more AI, and schools are trying to prevent it. Why not use AI, teach kids to use it, but also teach them to think critically about what AI has produced?
augustfolk on
My only hope in all of this is that educators will be discouraged from assigning homework from now on.