Why ChatGPT may be about to revolutionise education for the better
The long-awaited move from information regurgitation to know-how application
There’s a lot of talk about ChatGPT, the newly released artificial intelligence chatbot which can not only respond to many kinds of questions in a plausible and well-informed fashion but also assemble things in new ways which can even look creative. The world seems full of people trying it out; one of my favourites is British physicist and broadcaster Jim Al-Khalili asking it to write a poem on the two-slit experiment in quantum mechanics in the style of Robert Burns, with delightful results.
It can be self-referential too. Hindu theologian Akhandadi Das asked it to script his Radio 4 ‘Thought For The Day’ slot on 12th January 2023 in the style of himself, referencing the Bhagavad Gita. Again, the result was convincing. Hear him quoting from it (but not using it completely!).
Others have started attempting to use it as an essay-writing tool. There are reports that it can produce exam-passing texts in US business schools. To counter this, the makers of ChatGPT have released a tool for spotting AI written text, and others are already onto this new and potentially lucrative market.
The future of exams
What does this new development mean for the future of exams? The prestigious Alleyn's School in Dulwich, south London is onto ChatGPT's case. They are already seeing a different future for examinations with less solo essay-writing and more oral exams, group work and ‘flipped learning’.
Is this a bad thing? Emphatically NOT, in my view. It’s a great thing. Let’s have a think about exams, how they usually work and what they achieve.
We all have an image of exams. They usually look like rows of individual desks in a big cold hall, students working in silence, an invigilator prowling the aisles and the clock ticking. What will come up? Did you revise that one?
This old-fashioned format tests a certain kind of ability. What IS it testing? Co-operation? No. Research? Not a bit. Information sorting? Not really. This is about a certain kind of monastic application, solo writing, under time pressure, with no more resources than can be summoned up ‘in the head’ (actually in the body as well) of the candidate. It’s basically information regurgitation, with the most accurate puker carrying the day.
I still recall my own exams at school and university. I did English Literature for O-level at the age of 15, where we were tested on Shakespeare’s play Richard II, Chaucer’s The Nun’s Priest’s Tale (in Middle English) and Thomas Hardy’s short story The Withered Arm. We had to write essays on each of these, including quotes from the text to support our position.
I recall railing at the way we were supposed to talk about allusions and metaphors that the authors ‘meant’, even though we had no chance to quiz them or investigate in a proper way (so thought the young physicist-to-be! I am now, of course, a recovering physicist). What really got me, though, was that we were supposed to achieve all this without the text in hand. We were supposed to learn a whole bunch of quotes and then deploy them, presumably in some previously imagined way, without even the book to refer to or indeed to gather more useful passages. I passed with a C grade, to everyone’s amazement, and was put off Shakespeare for over 30 years.
Know-what and know-how
What are we trying to do in an exam or assessment anyway? The kind of exam above is basically testing ‘know-what’. Can you give the ‘right’ answer, written down in black and white on paper, under pressure? It’s about the facts and setting them out in a passable fashion. What is Boyle’s Law? (The less the external pressure, the greater the volume of hot air, as Flanders & Swann memorably declared.) What were the fourteen causes of the First World War (listed with small roman numerals)? What were Liz Truss’s greatest flaws as a Prime Minister?
It’s about information recall and answer construction, sitting isolated in a room with no stimulation or other sources. British comedy actor Harry Enfield knows about know-what; in a memorable sketch his character interrupts Hamlet’s soliloquy “To be or not to be, that is the question…” with his response “Ooh, you don’t want that to be the question. What is the capital of Peru? That’s a proper question!” And it’s a perfect know-what question, a defined and parrotable answer whether you’ve been anywhere near Peru or not.
Know-how, on the other hand, connects to practical abilities which can be learned and shown. Change the washer on this tap. Get to Peru without using an aeroplane. And yes, write an essay in a cold room in three hours without books, colleagues, the internet or even a cup of coffee.
Classical exams are a proper test, but a test of exam-passing skills. I am lucky to have these skills, and even refined them as I went along. If you don’t have them, tough luck. It’s curious (not really) that those who favour ‘the discipline of traditional proper examinations’ are invariably those who passed.
Accelerated Learning
In the 1990s I was seen as an expert in applying ‘Accelerated Learning’ methods to business and corporate training. The idea was to find ways to harness all kinds of ability and intelligence, not just fusty essay-writing, list-making and memory, to make lessons and courses active, fun, memorable and engaging in ways which meant that everyone could learn and succeed. These methods are still out there and it’s a bit baffling to me that they are not more widely used. If you want to catch up I would recommend the late Dave Meier’s Accelerated Learning Handbook as a good place to start. (You will find me listed in the appendix of AL people at the back!) Sharon Bowman continues this tradition with her splendid ‘Training From The Back Of The Room’ work.
This kind of approach typically involved what is now known as ‘flipping the classroom’. In conventional teaching the lesson is used to impart information to the learners, who then go away and put it to use in their homework outside the classroom. Flipped learning sees this turned on its head; the learners gather the information before the lesson, and classroom time is spent applying it together in a variety of ways, groupings and processes. This is exactly what ChatGPT can’t do.
Show You Know
Another key part of this work is building in a ‘Performance’ stage or similar, where the learner ‘Shows They Know’ as part of the training or learning, not as an afterthought. Designing a ‘Show You Know’ activity is, to me, a really key part of a good training course; everything flows towards it. I found and used a set of six key principles based on Authentic Assessment (lots of references online including here and here) in thinking about how to make this a useful part of the learning, not just a ticking up of right and wrong answers. These were:
Assessments are integrated into classroom/course work (not an extra afterwards)
This is a great way to get ongoing feedback and encourage learners, rather than a high-stakes shootout at the end of the course.
Based around constructing responses to real situations (not artificial ‘case studies’)
This is key. It helps learners use their know-how rather than simply restate it, and can also help them build useful progress in other areas of their lives. I did an MBA in the early 1990s and was horrified that many business schools of the day (and still do) put so much faith in case-study learning, where we were presented with some facts and figures about a fictional company and were supposed to come up with an analysis of the problems and routes to solving them – without ever clapping eyes on it, talking to the people, sniffing the air or getting engaged. (Of course this would be hard with fictional organisations! Real ones are so much more interesting AND so much more than their accounts and balance sheets too.)
Have clear criteria for succeeding/passing. What is the minimum requirement? What would be fantastic?
This can be surprisingly controversial. I remember one training manager in the nuclear power industry (where I did quite a lot of work) saying:
“But if I told them the criteria, they’d all pass!”
He was perhaps unwittingly pointing to one of the other goals of traditional exams: the sorting out of the sheep from the goats, the bright from the dim, the officers from the men, the successful from the also-rans. There may be some need for this at certain points, but to do it with the ability to write well in a cold room under time pressure seems to me to be a profoundly 14th-century solution.
Allows for differing views, where appropriate. Also allows learners to include their own experiences.
There’s always a risk that learning can turn into ‘remember what the teacher said’ rather than examining that in a wider context. Good assessments allow for such divergence, as long as it can be supported or presented suitably.
Teaches the learner to evaluate their own work (and the work of others).
If you can mark someone else’s work, you have to understand it yourself! And it’s much more interesting to share feedback with and from peers than the teacher always having to do it. Can anyone show me a teacher who LIKES marking papers?
Ideally, offers the learners choice in how they demonstrate their know-how.
Written essays may be OK, but what about other options – graphics, dramas, poems, examples, personal reflections, songs, objects…
I think that, taken as a whole, this is a superb list for getting away from ChatGPT-type written answers. I still use it on all my long courses (like Solution Focus Business Professional, my 16-week SF certificate course with the University of Wisconsin Milwaukee). I think that applying know-how to real, novel (and perhaps unseen) situations is a wonderful way of showing learning. In this course, participants use a case or situation of their own to show what they have learned, with choice about format and style.
Particularly in corporate training and adult learning, there is usually much more emphasis on ‘have you learned this’ than ‘are you in the top 10%’. When I helped the nuclear power people revamp their long (and rather boring) six week course into a three-and-a-half week much more lively version, everyone DID pass. Not only that, nobody dropped out either, which had been a problem in the past. When people dropped out they typically didn’t make further career progress and weren’t seen at the training centre again; a poor result for all concerned. And yes, on that occasion we had to use the original assessment because it was approved by the regulator.
Critical thinking
The world is getting clearer about what ChatGPT can do and what it can’t, as this very entertaining recent paper A Computer Wrote this Paper: What ChatGPT Means for Education, Research, and Writing by Lea Bishop shows. Plausible information regurgitation and even re-combination (physics in the style of Robert Burns) it can do. Weighing up complex issues, making decisions, implementing ideas, working with other people it can’t.
Where we are assessing learning in the future, these are the kinds of tasks we’ll need to use. Being clear about what’s required, addressing real (and potentially unknown) situations, offering learners choice in how they show what they need to show… and then getting them to evaluate each other’s work too.
This would be a huge step towards a more humane way of organising learning. It allows people to use their skills, talents and resources, helps find new angles, makes the learners think rather than regurgitate, AND is much more interesting for all concerned.
From the horse’s mouth
And of course I had to ask ChatGPT what it thought. Here’s the answer:
Mark: Is ChatGPT a threat to academic examinations?
ChatGPT: No, ChatGPT is not a threat to academic examinations. It is simply a tool that can provide information and answer questions, but it does not have the ability to replace human knowledge and critical thinking skills that are essential for academic success. Academic examinations assess a student's understanding and mastery of a subject, and their ability to apply that knowledge in real-world situations. ChatGPT can assist in the learning process, but it cannot replace the hands-on, problem-solving skills that are tested in exams.
I would add that there are many other ways to test and show hands-on, problem-solving skills (in ChatGPT’s words) than conventional exams. Yes, I know about coursework, projects and so on. But now it’s time to get really creative. How can we rethink the world of learning and assessment to not just resist ChatGPT and its friends but maybe even incorporate them? After all, sitting alone in a cold room having to write something you didn’t know about beforehand is utterly unlike any real-world 21st century job.
What ChatGPT shows is that knocking out a plausible 800 words on some topic is now the work of a few seconds. Perhaps we can hope that this will be the end for journalists-turned-politicians who are over-paid for the writing task, and utterly incapable of actually governing in real life.
News updates
I am leading a rare Hosting Generative Change course online for four weeks starting Tuesday 21 March 2023, 1pm-5pm UK time. More details here.
The AI stuff I have seen (MidJourney) and read (ChatGPT etc) is fundamentally derivative, rather than genuinely creative. Burns was a genius with a unique perspective. The poetry that Jim Al-Khalili generated via his specific request was a good parody - entertaining - but ultimately lacking what one might dare to describe as 'soul'. ChatGPT will drive creativity, not stifle it.
Wow, very interesting. Being sensibly smart and coming up with right answers is now an automatic, instant job for machines. So what we can do as human beings may be making mistakes, which is hard for machines to do ;-) If we don't have to strive to come up with ready-made right answers and are free of that kind of stress, I am curious what will come out of us!?