Redesigning assessments for a world with artificial intelligence
Marieke Guy, Head of Digital Assessment, University College London (UCL)
QAA Annual Conference, The Future of Quality: What’s Next?
Wednesday 13 September 2023
1. QAA ANNUAL CONFERENCE, THE FUTURE OF QUALITY:
WHAT’S NEXT?
WEDNESDAY 13 SEPTEMBER 2023
Redesigning assessments for a world
with artificial intelligence
Marieke Guy, Head of Digital Assessment
2. University College
London (UCL)
• 11 faculties, 60+ departments
• 43,800 students, 14,300 employees, 440 undergraduate programmes, 675 postgraduate programmes
• 53% international students, 150+ nationalities
• c. 520,000 student-to-assessment instances
• Variety of assessment types, but still predominantly exams and coursework
3. UCL AI Expert Group
• Cross-institutional AI scoping group (senior
leaders, academic experts, operational
staff)
• UCL approach to embrace AI and plan for
future
• Sector leader with early guidance
• New set of supporting resources recently
released
• 4 stream areas:
1. Academic skills
2. Assessment design
3. Policy development
4. Opportunities
4. Six changes you can
make now
• ‘Designing assessment for an AI-enabled world’ page
• Small-scale adaptations to current assessments that can be integrated into existing module descriptions
• Videos created by Isobel Bowditch & the Digital Assessment Team
1. Discuss Academic Integrity and AI with your
students
2. Feedback and formative assessment
3. Revise exam questions
4. Revise essay questions
5. Convert generic questions to scenario-based
questions
6. Upgrade your Multiple Choice Questions (MCQs)
https://www.ucl.ac.uk/teaching-learning/designing-assessment-ai-enabled-world
5. Plan for larger
changes
• Assessment Menu for larger changes
• 40+ cards, inspired by Lydia Arnold’s Top Trumps
• Supported by assessment process and review FAQs
• Jisc have now released a searchable, interactive PowerPoint version
https://www.ucl.ac.uk/teaching-learning/designing-assessment-ai-enabled-world
6. Other resources released by UCL
• Main AI hub page
• Guidance on AI categorisation
• Guidance on acknowledging and referencing AI
• Student-facing AI skills development module
• Template slide decks for teaching staff and for Professional Services staff to offer sessions
• Update to existing Understanding academic integrity course
• Changes to regulations by the Academic Integrity Task and Finish Group
• LinkedIn Learning playlist
• Case studies
• Intellectual property guidance
Professor Mike Sharples & Union Affairs Officer
Mary McHarg in conversation about Generative AI
at UCL Education conference
https://www.ucl.ac.uk/teaching-learning/generative-ai-hub
Students and staff at the Russell Group Collective
event where they co-designed a set of AI
principles
7. AI providers as
Criminal Essay Mills
• By Noëlle Gaumann, Research Assistant, & Dr Michael Veale, Associate Professor in Digital Rights and Regulation, Faculty of Laws, UCL
• Covers how AI service providers interact with
the laws criminalising essay mills and contract
cheating services
• Looks at social media advertising for services designed to help students write essays with correct references
• Provides a set of recommendations
https://doi.org/10.31235/osf.io/cpbfd
8. QAA ANNUAL CONFERENCE, THE FUTURE OF QUALITY:
WHAT’S NEXT?
WEDNESDAY 13 SEPTEMBER 2023
Thanks!
Marieke Guy (m.guy@ucl.ac.uk)
Speaker notes
Hello, I’m Marieke Guy, Head of Digital Assessment at UCL. I head up a team of learning technologists and digital assessment advisors who support the institution and faculties with their digital assessment needs and long-term strategy.
In case people are unaware, UCL is a large and broad institution with 11 faculties, covering areas from brain sciences and mathematical and physical sciences to social and historical sciences and the built environment, as well as countless departments, some of which are research-only.
There are almost 45 thousand students, a large number of whom are postgraduates.
We are very much an international institution with over 150 nationalities represented.
All of this means that we are very sensitive to scale and large cohorts, and have a lot of assessment instances. We are also still exam- and coursework-heavy, though keen to diversify more in the future.
In the picture you can see Jeremy Bentham, the philosopher who devised the doctrine of utilitarianism, arguing that the ‘greatest happiness of the greatest number is the only right and proper end of government’.
He supported the idea of equal opportunity in education, and his ideas contributed to the foundation of University College London in 1826. Bentham left his body to medical science and requested that it be preserved and gifted to UCL. Today Bentham sits in UCL’s South Cloisters, dressed in his own clothes and sitting in his chair.
On 30 November 2022, ChatGPT was released for public use. The first meeting of our AI Expert Group took place in January 2023, chaired by Professor Kathy Armour, Vice-Provost (Education & Student Experience).
The group brings together experts from across the institution, and we are lucky enough to have the UCL Centre for Artificial Intelligence. We got some initial communications out early on.
We then set up four stream areas. I am co-lead on the assessment design theme, along with a Professor of Educational Assessment at IOE, UCL's Faculty of Education and Society.
UCL’s approach from very early on was one of acceptance and engagement, including the Russell Group Collective event. This led to the Russell Group principles: use of GenAI would not be banned at an institutional level, and students would be encouraged to incorporate it into their learning and assessment where appropriate.
We recognised that we have to work with these tools and adapt the ways we teach, learn and assess to make best use of them.
We have tried to make our initial resources address two different time frames: what can I do now, and what can I do in the future?
We have a page in our AI hub area on Designing assessment for an AI-enabled world. It looks at tweaks and small-scale adaptations that can be made under the existing module descriptions. There are some first steps for staff, which include trying out a generative AI tool themselves and taking a critical approach to their own assessments: does the current design really measure what you want it to measure? How else might you ascertain students’ learning?
There is then a series of six videos to guide changes to policy and practice at programme or module level. One area we are very keen to focus on is how this forms part of an ongoing dialogue with students.
Key points include
Tell students when and how use of AI is permitted in assessment
Signpost where to find guidance (see below under 'Additional resources')
Explore the capabilities and fallibilities of AI with students and colleagues in a context of transparency
Build critical AI literacy
The videos are practical and actionable.
The page then moves on to think about the future. My colleague Isobel Bowditch developed a set of cards with assessment inspiration, based on Lydia Arnold’s Top Trumps. Each card has been rated in terms of authenticity, challenge, product, learning, staff demand and sustainability.
These cards have now been taken forward by Jisc, who have created a searchable, interactive PowerPoint.
We are keen to bust myths about the assessment change process, so have developed a set of FAQs covering how change can and should happen, be it changing an essay title or completely redesigning your assessment approach and turning it into a podcast, viva or group project.
These resources are part of a wider collection of resources on Generative AI and teaching and learning.
For example we have some guidance giving a three-tiered categorisation approach outlining how AI can be used in an assessment. The categories are:
Category 1: AI tools cannot be used - The purpose and format of these assessments makes it inappropriate or impractical for AI tools to be used
Category 2: AI tools can be used in an assistive role - Students are permitted to use AI tools for specific defined processes within the assessment.
Category 3: AI has an integral role - AI can be used as a primary tool throughout the assessment process.
The categories focus on module learning outcomes and the type of skills required from the assessment. We anticipate staff having a dialogue with students to ensure clear AI requirements for particular assessments. Everyone is on the same page and understands expectations.
This is supported by guidance on acknowledging and referencing AI from our library team, as well as other guidance on areas including research, intellectual property and the misconduct process.
There is also training for staff and students to follow, and a series of interesting case studies on how academics are using AI in their learning and teaching, plus support slides for academic and professional staff to use and adapt. These will also form a staff toolkit.
At the top of the page you can see a picture of our UCL Education conference held earlier this year. It shows Professor Mike Sharples from the Open University & Union Affairs Officer Mary McHarg in conversation about Generative AI. There are some really great videos of that session if anyone is interested. It is always fantastic to have a student perspective on this. Mary and I were also interviewed for a Jisc podcast on reimagining feedback.
At the bottom of the page you can see the Russell Group Collective event I mentioned earlier, where students and staff co-designed a set of AI principles covering relevance, literacy, rigour, clarity, fairness and human-centredness.
Quote from Chris Thomson from Jisc: “The students’ accounts made it clear that the genie is out of the bottle; AI is now so deeply integrated into their learning experience that it would be futile and dangerous to resist the change. For many, AI has become a ‘lifechanging’ educational companion, offering a level of support that is impossible to ignore. As such, the students argued, returning to traditional exam halls or engaging in an AI detection arms race would be detrimental to their future employability and wellbeing.”
I finally wanted to mention a recent report released by colleagues from the Faculty of Laws. The paper looks at social media advertising databases, including TikTok and Instagram, and analyses a huge range of emerging apps, websites and services designed to help students write essays with correct references, to turn ChatGPT text into text with references, to ‘humanise’ and rewrite it so it doesn’t trigger AI or plagiarism detectors, and to provide access to the detectors that universities use so students can assure themselves they are getting clean scores.
The paper gives recommendations for institutions and the sector, and also explores the differences between contract cheating and AI. As it says:
Essay mills and contract cheating services are companies whose sole purpose is to facilitate academic cheating. The legislation addressing such services works because it targets services that only fulfil that purpose. General-purpose AI providers are different. They fulfil multiple purposes and have not been designed with the objective of facilitating academic misconduct. It is therefore questionable whether it is desirable to criminalise the acts of a service that only inadvertently facilitates academic cheating.
All of our resources and information are a work in progress. It’s important to say that we have also been actively working with other universities to share best practice and discuss approaches.
This has been a very hectic couple of months, but we feel that we are emerging with a better understanding of what good assessment looks like now and will look like in the future. As QAA put it, we are using generative artificial intelligence as a catalyst for enhancement. Naturally, in the short term there will be some move back to unseen invigilated exams, but we hope that will be temporary and that in the long term we will move to more diverse forms of assessment.