Students at the Heart Conference: ChatGPT: Problem child or stimulating plaything?
Universities are at a crossroads with ChatGPT, as illustrated by an eagerly awaited presentation by Martin Hanneghan of the School of Computer Science and Mathematics at the SATH Conference.
“As a computer scientist I’ve rarely been so excited, but as an academic it can seem rather terrifying,” Dr Hanneghan began.
Much newsprint has been used up attempting to capture the impact of the technology, prompting howls of plagiarism and cheating alongside equal measures of realism that the technology is not going to be put back in its box.
One hundred per cent of students will have tried it, says Martin; not all will know why, and certainly few will have bad intentions when they do.
But it’s up to educationalists globally, at all levels, to decide whether to embrace it or obstruct it. At the very least, they must formulate new policies and guidelines.
And quickly! According to Martin, generative AI may seem new, but in computer science terms it’s already been around ‘for a lifetime’.
So what is it?
“Generative AI learns and mimics patterns in data, like any other machine learning tool,” says Martin, a software engineer. “It’s a predictor, essentially: it predicts what words come next from prompts.
“It is so powerful because it has so much data to draw from; the latest 4.0 version has been trained on 100 trillion parameters.
“You can ask it to generate a presentation or essay and it’ll do it. You can ask it to generate your slides in the form of a poem and it’ll do it.
“It can personalise learning materials, generate new research ideas and make students more effective learners.”
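To picture the “predictor” Martin describes, here is a deliberately tiny, illustrative Python sketch. It is not taken from the talk and bears no resemblance to ChatGPT’s actual scale or architecture; it simply counts which words follow which in a made-up corpus and then extends a prompt one most-likely word at a time, which is the same basic idea of next-word prediction carried out by real systems with vastly more data and far more sophisticated models.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the vast text data a real model is trained on.
corpus = (
    "generative ai learns patterns in data . "
    "generative ai predicts the next word . "
    "students use generative ai to draft essays . "
    "students learn patterns in data ."
)

# Count how often each word follows each other word (a simple bigram model).
follows = defaultdict(Counter)
tokens = corpus.split()
for current_word, next_word in zip(tokens, tokens[1:]):
    follows[current_word][next_word] += 1

def predict_next(prompt: str) -> str:
    """Return the word most often seen after the prompt's last word."""
    last = prompt.split()[-1].lower()
    candidates = follows.get(last)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

def continue_text(prompt: str, n_words: int = 5) -> str:
    """Repeatedly append the most likely next word.

    Loosely analogous to how a chat model generates text, though real
    models use learned probabilities over huge vocabularies and contexts.
    """
    words = prompt.split()
    for _ in range(n_words):
        words.append(predict_next(" ".join(words)))
    return " ".join(words)

print(continue_text("students use generative"))
```

Run on the toy corpus above, the sketch continues the prompt word by word; the point is only that “generation” here is prediction learned from patterns in data, exactly as Martin describes.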
But it can’t think critically, and you wouldn’t trust it to mark your students’ exams! It also has flaws, such as in-built bias (most of its data is provided by white, middle-class men), consent (it draws on copyrighted sources) and security.
Where the jury is out, says Martin, is how it contributes to assessment. After all, if a student uses ChatGPT or a rival AI to do their exam, is that student leaving uni with the right knowledge?
On the other hand, if the world of work is using generative AI tools, don’t students need to know how to use them?
And we come back to the perennial question that has prompted a decade or more of innovation in teaching and learning: what’s the absolute best way to put our students on the road to success?