by Dr Sam Hemsley, Academic Developer, University of Sussex
The second meeting of the Teaching and Learning with Generative Artificial Intelligence Community of Practice (or “AI CoP”) is taking place next month and will focus on approaches to talking with students about Generative AI (GenAI).
But why this topic?
The discussion of GenAI in Higher Education (HE) is both necessary and challenging. One reason is that many of us don’t feel very knowledgeable or confident about the use and capabilities of such tools, or about their impact on students’ education and future careers. Another is the recent update to the Sussex AI and Academic Integrity page, which adds standardised statements on the use of AI in assessments that module convenors can include in their assessment information. However, tutors still need to make time in class to talk with students about both the rationale for, and implications of, such permissions, and about how students might use these tools for self-directed learning. Our second AI CoP meeting therefore seeks to provide a space to explore and share how such conversations might be initiated and managed, and how they can support GenAI and assessment literacy among students.
The outline of the meeting is as follows:
- Introductions and updates from Educational Enhancement on GenAI in Sussex and the sector.
- A lightning talk from Dr Andres Guadamuz, Reader in Intellectual Property Law (LPS), on his approach to having mature conversations with students about AI.
- An interactive segment with discussion will form the bulk of the session and provide space to explore and share your challenges and practices.
- Finally, a roundup (on the hour) followed by time for informal conversation and networking.
The importance of this focus has been further reinforced in recent weeks, not least by the publication in February of a Higher Education Policy Institute (HEPI) Policy Note on students’ attitudes to AI. Among the most attention-grabbing statistics from its poll of 1,250 students: 53% use GenAI to explain concepts in assessments, 13% use it to generate text they later edit for assessments, and 5% use it to write assessments for them. What I find concerning is that, in addition to the emergence of a digital divide in the use (legitimate or not) of GenAI, over a third of students aren’t aware of how often such tools ‘hallucinate’ facts, citations or statistics (creating, as my colleague put it in her recent blog, “data mirages”). This is a concern shared by our headline speaker, Andres Guadamuz, as quoted in this Guardian report on the HEPI findings.
Similarly, staff may see the potential of AI marking tools to improve the speed, quality and consistency of marking and feedback, freeing them up for more student engagement and support. But, in conversations with students, we hear concerns that such marking would feel outsourced and impersonal, and would represent poor value for money.
The importance of having open conversations with students about GenAI was further reinforced in multiple sessions of the Quality Assurance Agency (QAA) Quality Insights Conference last week. The comment that resonated most with me came from the consistently insightful Martin Compton, of King’s College London, who emphasised that students are typically reluctant to engage actively or critically with AI unless they are exposed to deliberate, open, meta-cognitive discussions. In other words, students need to be encouraged and supported to think about, and reflect critically on, GenAI tools and the impact these have on their own learning. He also emphasised that academics don’t need to be experts in AI to enable such conversations. Rather, they can use such conversations to learn with students in a spirit of shared curiosity and critical engagement. This is the bread and butter of academic enquiry and something we all, as HE educators, have the skills to support.
Finally, last week’s Heducationist blog provided additional nuance and insight into this kind of collaborative critical engagement with students around the use of GenAI. The blog summarised some interim observations from a range of projects at KCL, many run in collaboration with students, exploring the possibilities of using AI within their disciplines and teaching:
“several projects identify how students are sceptical of the usefulness of gAI to them and in some that scepticism grows through the project. In some ways this is quite pleasing, as they begin to see gAI not as a panacea, but as a tool. They’re identifying what it can and can’t do, and where it is and isn’t useful to them. We’re teaching about something (or facilitating), and they’re learning.”
So, please sign up to join us (in person or listening along online) and continue the conversation at the second Teaching and Learning with AI CoP on Monday 18th of March from 14:00 to 15:30 in the University of Sussex Library Open Learning Space.
We hope to see you there!
Can’t attend?
- Join our AI CoP mailing list for updates by emailing EducationalEnhancement@sussex.ac.uk
- Explore and share great links via our shiny, new-look University of Sussex Teaching with AI collaborative Padlet.
- Submit a Case Study of your practice to the ‘Learning Matters’ blog (it’s quick and easy and supported by us).
Additional links:
- UPDATED! Sussex AI and Academic Integrity webpage: New recommended AI use statements for modules and assessment briefs.
- NEW! Sussex AI tools webpage: Find out about the types of generative AI tools available and how to access them.