AMBA's Ambition magazine: Issue 63, May 2023

OPINION 

We believe that formal policies can only go so far in deterring misconduct: more fundamentally, it is incumbent on all educators to instil a deep understanding of the value of academic study. Students need to appreciate that, in seeking to bypass some of the learning and research that assessments are designed to confirm, they would be risking much more than being found in breach of school policies. Ultimately, they would fail to develop the understanding and skills that lead to rewarding careers.

Moreover, before we can adapt our existing policies, we need to decide the point at which unacknowledged use of generative AI tools becomes problematic. Some universities have already announced that they wish to ‘ban’ their use, but it is difficult to see how such a ban could be enforced. Should we instead recognise these tools as useful research and writing aids, which might even level the playing field for those with language difficulties (non-first-language English speakers or those with dyslexia, for example) who could use them to improve the readability of their assignments? Students are already asking what is ‘allowed’, so we need to decide on our responses quickly, aiming for some level of agreement across the higher education sector.

We know from talking to students facing the most serious misconduct investigations that deliberate cheating is often a last resort, reached when fear of the consequences of failure outweighs the perceived risk of being caught. In a world where assignments can be bought readily and cheaply from unscrupulous companies that promise to thwart plagiarism-detection tools, will students add generative AI writing tools to their arsenal of back-up plans when failure threatens? Perhaps we should be encouraged that most students asking us to confirm our policies are seeking not to establish what they can get away with, but to reassure themselves that their peers will not gain an unfair advantage through dishonest approaches.

Adapting our approach

So, while fears around cheating have hit the headlines, there is also considerable focus on adapting our approaches to assessment and, more fundamentally, on reshaping the intended learning outcomes of our courses. In recent decades, learning and assessment in universities have become much more focused on developing critical-thinking and problem-solving skills than on knowledge acquisition alone. If, as expected, AI tools improve their ability to generate reasonably competent responses to some of our existing assignment questions, we need to change what and how we assess.

For some subjects, this will mean increased use of invigilated exams, despite genuine concerns that these raise different questions around reliability and fairness. For others, it is an opportunity to move towards more ‘authentic’ assessments – something that business school educators have been experts in for many years. By focusing as much on the learning as on the outcomes – by engaging in assessment for learning rather than assessment of learning – we can create more innovative, personalised assessment tasks that foreground the process of applying knowledge, solving problems and developing skills. Digital tools can be scrutinised as part of the learning process, preparing students for working lives that will constantly evolve in response to emerging technologies.

As the initial scaremongering around ChatGPT dies down, many educators are already integrating these tools into their teaching practice. Rather than banning them, we should lead the way in encouraging our students to recognise their benefits and dangers. What better way to teach critical thinking than to ask students to evaluate the reliability and quality of AI-generated outputs and to consider the development process that sits behind the technology?

At Exeter, we have developed new modules that encourage students to consider the ongoing opportunities and challenges presented by emerging technologies. One of these, Digital Technologies and the Future of Work, helps learners to appreciate the impact of new technologies across diverse career paths by drawing on collaborative learning via Teams and Zoom. Its portfolio assessment format also allows them to reflect on the value of this approach in achieving the module’s intended learning outcomes. As we review our modules in response to the arrival of easy-to-access generative AI tools, we would do well to adopt similarly reflective, personalised approaches that highlight the very tools that risk challenging the integrity of traditional modes of assessment.

The implications of recent developments for the whole education ecosystem are clearly profound and stretch beyond universities themselves. Yet the sector has been adapting to new technologies for a long time; this is partly why business schools are such exciting environments in which to work and study. Our responsibility as educators is to continually align our practice with emerging challenges. If, by instilling an awareness of the power (and risks) of generative AI, we can ensure our degree programmes reliably prepare students for an unpredictable future of work, that can only be a good thing.

Alison Truelove is director of the Centre for Innovation in Business Education, associate professor in critical practice and lead academic tutor at the University of Exeter Business School. She offered her expertise as a panel member at Studiosity’s recent symposium on UK Higher Education’s Thoughtful Response to Robot Writing.
