
When does assistive technology become prohibitive technology?

By Dan Axson (Learning Technologies Manager)

In May 2024, I wrote a post titled ‘What if our students with disabilities didn’t have to jump through hoops to have an equitable experience’. You’ll be forgiven for not having read it: it never saw the light of day (hence the very un-catchy title). Amongst the noise of a never-ending flurry of AI-related content, I never quite got it finished. In it I had a meandering wander through my thoughts and experience of the onerous admin, lack of agency and impacts on cognitive load a student with disabilities may face. Naturally, given we’re in the ‘AI is mentioned in all the things’ era, the topic came up. The post made the case that university policies and guidance on the use of generative AI for teaching, learning and assessment may inadvertently put students using software funded by the Disabled Students’ Allowance (DSA) at odds with academic integrity policies.

Fast forward ten months and the topic, to nobody’s surprise, is not only still very relevant but becoming more so. Then an email thread shared with me recently resurrected the ideas. The thread discussed the complexities of approving software in a space where the software companies themselves can add AI ‘features’ without notice, where students’ understanding or awareness of institutional policies can’t fairly be relied upon, and where a DSA assessor’s ability to find those policies is an ‘us’ problem, not a ‘them’ problem. So here is that post, dusted off and made fit for 2025.

 Robot holding out hand in supportive manner. Hand fills frame. 
Photo by Possessed Photography on Unsplash 

From dependent to independent

As anyone in the Learning Technology game will tell you, conversations are largely dominated by generative AI. Recently, we have been testing a tool called Jamworks. In short, the tool generates a transcript of a recording, for example a recorded lecture, then uses that transcript to drive a number of very well-thought-out AI features such as summaries, key points and flashcards. Part of our conversations about using such tools in the classroom was how hugely valuable this could be to someone who relies on a notetaker. The notes are generated almost immediately, there is no waiting for the notetaker to send them on, and, unlike with a human notetaker, the student can ask questions of the notes; a human notetaker may not be a subject expert. In other words, the technology enables much more autonomy and independence than existing approaches with human notetakers. This is of course just one example; there are many.
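To make the pipeline concrete, here is a toy sketch of the transcript-to-study-aids idea. This is purely illustrative: it is not how Jamworks actually works (real tools use large language models, not the naive heuristics below), and the lecture text and function names are invented for the example.

```python
# Toy sketch of a transcript -> key points -> flashcards pipeline.
# Illustrative only: real tools like Jamworks use LLMs, not these heuristics.

def key_points(transcript: str, top_n: int = 2) -> list[str]:
    """Naive stand-in for AI summarisation: pick the longest sentences."""
    sentences = [s.strip() for s in transcript.split(".") if s.strip()]
    return sorted(sentences, key=len, reverse=True)[:top_n]

def flashcards(points: list[str]) -> list[tuple[str, str]]:
    """Turn each key point into a simple question/answer pair."""
    return [(f"What did the lecturer say about: '{p[:30]}...'?", p) for p in points]

# Hypothetical lecture transcript for the example.
lecture = ("Photosynthesis converts light energy into chemical energy. "
           "Chlorophyll absorbs light. Plants release oxygen as a by-product.")
points = key_points(lecture)
cards = flashcards(points)
```

The point the sketch makes is the one in the paragraph above: once a transcript exists, study aids can be generated immediately rather than waiting on a human notetaker.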

We know assistive technologies provided through the DSA assessment process are essential for students. However, in an increasingly complex technology environment, how do we ensure the technology doesn’t put students at odds with academic integrity?


It’s a trap

As noted, advances in technology have huge potential to make things better for people with disabilities; we ignore and suppress them at our peril. A concern, though, is how easy it might be to create barriers indirectly through the wording and enforcement of academic integrity policies, or directly through mitigations. For example, it would be possible for someone relying on a tool such as Grammarly to fall foul of the proof-reading policy. Further, we’ve seen an increase in inaccessible content, like photos of text, being uploaded to quizzes to prevent AI tools copying the text, overlooking the fact that optical character recognition (OCR), the ability for software to read text in an image, is getting the same AI love as everything else. This is simply not an acceptable approach, but it highlights HE’s immediate response to advances in technology: pulling up the drawbridge and bolstering defences by compromising on accessibility. And if you’re caught on the other side, you’re kind of on your own… for now.

However, for the most part our students do not fall foul of policy by using these tools routinely for teaching and learning. Our Skills Hub guidance on Grammarly states:

‘Can I use Grammarly? It’s fine to use Grammarly to improve basic grammar, spelling and punctuation errors. However, you should not use Grammarly’s generative AI features to meaningfully change your work. The same restrictions apply to Grammarly as they do to a proofreader, and you should think carefully about which changes you accept, as it remains your responsibility to ensure the accuracy of your work.’

Sam Hemsley, Academic Developer, follows up:

‘As above, just using AI in an assistive capacity isn’t contravening any rules/isn’t academic misconduct (for students with DSA provided assistive tech and those using genAI tools in general). There is nothing, inherently, wrong with AI being built into assistive tech. As with Grammarly, I suspect the onus is currently on the student to ensure use of such tools doesn’t contravene academic integrity policy and/or explicit permissions from module convenors, should they allow use of AI in the creation of assessed submissions.’


Whose problem is it anyway?

Given the demands already placed on this group of students, having them police the software they are recommended is perhaps a stretch too far. As Learning Technologist Helen Morley points out:

‘It wouldn’t be appropriate to place the onus on the students to police the assistive technology (AT) they have been given […]. As for AT they’ve sourced/bought themselves, I’d be wary of the providers advising them of any changes in a way which is clear and accessible enough. Of course, students use AT for a variety of reasons but for those who already managing pressure on their executive function, energy levels, stress levels etc. this is an unreasonable expectation and turns something from assistive to prohibitive.’

I have to agree, and whilst we need to look at reducing this extraneous demand, I strongly believe one way we can do so is by (yes, you guessed it) building the digital capabilities of our staff and student body. Being able to have an informed discussion with your assessor doesn’t place the onus on you, but enables you to feel more at ease with the solutions provided. Equally, teaching staff will be more familiar with the tools, their capabilities and the needs they address, and between them all, know where the boundaries are and how to avoid crossing them. Sounds too good to be true? Maybe, maybe not, but it has to be worth a try.

I don’t think it’s a stretch to say improvement in skills equates to an improvement in equity. Yes, I can hear you saying what about curriculum design, and you’re right, it’s a key element. Frameworks such as Universal Design for Learning and the university’s own curriculum review will address some of the challenges we’re seeing, but my view is that to engage meaningfully with the ‘threat or opportunity’ of advances in technology it’s fundamentally a skills question.

Skills led equity

There are a number of barriers to mass adoption of generative AI skills development: resources, content, training, time, willingness, strategic direction and language, to name just a few. So where do you start with addressing these challenges?

I suggest language is the starting point: a common language of digital skills. What’s the difference between a chatbot and an LLM? How do I talk to my students about ‘hallucinations’? What about relying on summarised text? Does everyone use the same terms in the same way? When a DSA assessor is looking for institutional policy on the use of generative AI tools, wouldn’t it be easier if the terminology were the same across institutions? A reach, I know, but it makes the point.

So, what do we do about it?

My call to action here is to think explicitly about the digital skills you and your students have, and where you might like to improve confidence, to ensure a fair, equitable and barrier-free approach to the use of assistive technology for those who need it. Start by talking to the students who are using these tools and find out how they are using them. It may then follow that when a student is assessed, they are better able to understand the implications of certain technologies for your teaching and assessment.

Then, find opportunities to engage with the conversation, whether it be the internal AI summit (deadline 28th March 2025 for contribution), the AI in Education Community of Practice, or other development opportunities on these topics. We’re all learning as we go on this stuff, come join us.

Posted in Active learning, Uncategorized

Coming back to Educational Enhancement

By Katie Turner Educational Enhancement Coordinator

I recently returned to Educational Enhancement after a secondment in Communications and completing my apprenticeship in Digital Marketing. If I were a student, I would describe it as an exchange trip: going away to gain new experiences and skills. This time away has given me further knowledge and understanding of different areas that I hope to bring back to EE.

The importance of communication

Communication matters, whether that is with our colleagues, staff or students. I had the opportunity to work on campaigns for different projects with staff and students. The way we talk about subjects to audiences can have a significant impact on the message we are trying to convey, as can the channel we choose: we should write for that channel. This means not only writing but other media forms such as video, animation and photo galleries.

Engagement

Whilst doing my digital marketing apprenticeship I had the opportunity to look at data analytics, which can be used to capture engagement across different channels. This is used heavily in Communications, for every article or video launched in a campaign. I hope to use these data-collection skills to capture engagement with the different events and blogs the EE team send out. One subject that really stood out on my apprenticeship is understanding your customer’s journey.

What is a customer journey?

A customer journey refers to the series of steps a customer takes when interacting with a product or business, from first awareness of a product or service through to post-purchase interactions and loyalty. Understanding this journey helps a business or department see where each step works well and where it breaks down.

Imagine students or staff trying to gather a piece of information. A good exercise set out on the course was to step into the shoes of a persona, an academic or a student, and map that customer’s journey as they gather information or pursue professional development. To do this we created a persona, mapped out what information they might want to reach, and then followed the journey, noting how they would retain the information and their feelings at each ‘moment of truth’, whether it was a happy or a bad experience.
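The exercise above can be sketched as a simple data structure. This is a hypothetical illustration of journey mapping in general, not a tool we use: the persona, touchpoints and feelings below are all invented for the example.

```python
# Hypothetical sketch of the journey-mapping exercise: a persona,
# the steps they take, and the 'moment of truth' feeling at each step.

from dataclasses import dataclass, field

@dataclass
class Step:
    touchpoint: str   # where the interaction happens
    goal: str         # what the persona is trying to do
    feeling: str      # 'happy' or 'bad' moment of truth

@dataclass
class Persona:
    name: str
    role: str
    journey: list[Step] = field(default_factory=list)

# Invented example persona and journey.
academic = Persona("Dr Example", "academic")
academic.journey += [
    Step("EE webpages", "find guidance on AI in assessment", "happy"),
    Step("Canvas self-study course", "develop skills in own time", "happy"),
    Step("search engine", "locate institutional AI policy", "bad"),
]

# A pain point is any step with a 'bad' moment of truth.
pain_points = [s.touchpoint for s in academic.journey if s.feeling == "bad"]
```

Writing the journey down like this makes the pain points explicit, which is exactly what the mapping exercise is for.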

In Educational Enhancement the team focuses on assisting academics with teaching and learning in all its aspects, from the technology side of providing online classrooms to the pedagogy of teaching. The guidance for staff has been considered from every angle and is kept up to date with newer trends such as the use of AI technology. Our main sources of information can be found on the Educational Enhancement webpages. The team are dedicated to helping academics in schools and run online and in-person workshops on topics relating to enhancing teaching and learning. Staff who want to study in their own time can use a self-study course on Canvas.

I am excited to bring this knowledge back to Educational Enhancement and hope to combine my newly gained skills with the team’s expertise to help academics and students in a more meaningful way.

Posted in Educational Enhancement, Professional Development

DeepSeek, AI advancement and the path forward for AI in education

The world of AI is a fast-moving place, shown very clearly by the arrival a few weeks ago of a new AI model from a Chinese company called DeepSeek. The whale in the company’s logo is very fitting, as the release of their V3 model certainly made a splash!

Although perhaps a shark logo would have been more appropriate given the impact over the last few weeks. The release of the V3 model immediately wiped almost $600bn (£482bn) off Nvidia’s market value, the biggest one-day loss in US history, and set off panic amongst American AI companies and policymakers. But why the fuss, and how does this relate to the future of AI in education?

Why is it such a big deal?

There were a few reasons this release was such a big deal, including its low cost, its power usage and the fact it’s been made open source. Let’s explore what each of these points means.

Cost

DeepSeek was reportedly trained on a budget of $6 million, a fraction of the cost of the American models. DeepSeek V3 is comparable to the most powerful models that competitors like OpenAI have developed and, more importantly, DeepSeek gives access to its model for free. You would have to pay almost $200 a month for access to ChatGPT’s comparable model.

From testing, the results are essentially the same. OpenAI offers a few more features, but the models’ processing and speed are roughly on par.

Power usage

AI models need a lot of processing power to train. The best way is to use GPUs (graphics processing units), which can do advanced processing of data. The USA has restricted Chinese access to Nvidia’s H100 GPUs because of their usefulness in training AI models. By restricting them, American policy aimed to prevent China developing strong AI models… or so it was thought!

But necessity is the mother of invention, and with Chinese companies like DeepSeek being forced to use much weaker GPUs (they used Nvidia’s much slower H800s), they worked out ways of working smarter, programming models that got the same result using much less processing power. DeepSeek researchers then published a paper showing how they achieved this result and how others can do the same.

Open source

DeepSeek’s model has also been released as open source. Anyone can run the model for free locally, meaning it runs entirely on an individual’s own computer (there is a cut-down model that can be hosted on a reasonably powerful computer and a full model that requires a very powerful one; still out of reach for most people, but closer than it was). This open-source approach stands in opposition to other AI models, such as OpenAI’s, Google’s and Microsoft’s, which are closed: you cannot run your own version, only use it from those companies’ servers.

Running your own model means you have complete control over how your data is used, and you aren’t beholden to a company, which has clear advantages – we’ll cover that later.

Should we be using it?

There’s no real reason to be less trustful of DeepSeek than of any other model. DeepSeek was created in China and so has some censorship built in because of the Chinese government; however, American companies’ models have strong connections to the current American government and their own biases and potential data issues.

You could argue that DeepSeek is currently the most ‘trustworthy’ (used loosely) of the big models, or at least the best of a bad bunch, given its release of an open-source model and free sharing of its research compared with the American companies’ secrecy. It is also, at the time of writing, the only one not to have a contract with defence or defence-adjacent weapons companies (Google has recently withdrawn its policy against developing weapons systems). DeepSeek also uses the least power to run queries, making it the most ‘sustainable’ (again, used loosely) of the current crop of models.

We should apply the same critical thinking to any AI model, regardless of which company it comes from. This means you shouldn’t put any private or personal information into any AI model that hasn’t been approved for use by the institution. Don’t download any AI tools directly to a computer unless they’ve been approved. Assume all models have biases and may be aligned to certain views and values. And don’t assume anything a model tells you is automatically true – it’s healthy to apply this principle to any AI model you use, regardless of where it comes from.

The ideal path forward for AI models at Universities 

DeepSeek does, however, point us in the direction of a more ethical and safer future for AI: smaller, more efficient open-source models that are locally hosted and run by individuals or institutions, entirely under their own control, without the data being taken or used by an outside company. This would also allow for more ethical models with less bias and censorship. Currently there are cost barriers, but a model like DeepSeek can run on a computer that costs a fraction of what ChatGPT and similar models take to run.

The advantages of these models would be multifaceted. First, a decoupling from what are largely unethical and unsustainable companies, meaning that students and staff could use AI models without worrying that their data is being harvested. Secondly, models could be tuned to be freer of bias and censorship.

Of course, for this to come to fruition we’d need a few more technological leaps to make even smaller models that use much less power, but DeepSeek’s progress is an encouraging step in the right direction and more competition will hopefully breed more innovation.  

For better or worse we cannot avoid the proliferation of AI into our society and institutions, and so, taking an optimist’s view, I’d like to hope we can move away from a corporate ownership model and put such tools in the hands of universities that can use them for the public good. We can but hope!

Posted in AI, Educational Enhancement

Academic Developers February round up

a picture of a megaphone, a mobile phone, a lightbulb, a magnifying glass and the YouTube logo on a pink background, to indicate announcing information

Welcome to the February 2025 Academic Developers round up, where we share information, events and news.

Sussex Education Festival 

The Sussex Education Festival is back for its third year! On Friday 2 May, colleagues from across the university will come together to share their experiences, research and reflections on teaching and learning here at Sussex. Further information, and a Call to Participation is now live on the Staff Hub. 

New and updated Educational Enhancement Advice and Guidance web pages 

A new Assessment Equivalencies page provides guidelines for determining student assessment workloads and equivalences.  

Our Curriculum design step by step page has improved guidance on writing or reviewing course and module learning outcomes. See also our updated and expanded guidance on oral assessments, which now includes lots of ideas for oracy based teaching and assessment activities. 

The page on AI and academic integrity now includes general guidance on what to do if AI use is suspected, and the Assessment in an AI world page now lists assessment types in order from ‘highly vulnerable to academic misconduct using AI’ to ‘inherently resilient’.

In other AI news, see our most recent Spotlight on AI in Education: January 2025, join the newly launched Teaching and Learning with AI Community of Practice Teams space, and find out more about the Sussex AI in Education Summit planned for April.

New on Learning Matters is Episode 5 of the podcast, featuring Dr Sophie Anns (Associate Professor of Psychology) and her work on creating an autism-friendly university. If you have a spare 30 minutes, give it a listen (or read the transcript). It’s really interesting!

Events and workshops organised by Educational Enhancement

Learning Technologist workshops are now available: free staff development workshops to support staff using technology in teaching and learning, bookable via the University of Sussex Staff Hub.

Enhancing assessment and feedback: A case study compendium (Published in October by Advance HE) features a case study from Angela Gao (USBS) titled: “Changing the paradigm: rethinking assessment in the AI era”. 

Employability 

Between 250-300 students per year benefit from the employability-focused Business Law and Practice module taught in the School of Law, Politics and Sociology, which has been shortlisted in the Lexis Nexis Legal Awards University Commercial Impact Award category. Read on Learning Matters about (1) the module convenors’ approach to working with students to develop the module, and (2) the module content, delivery and student feedback. 

Newly published: The Advance HE 2025 employability case study compendium. Nineteen case studies from across the sector. 

Posted in Uncategorized

Spotlight on AI in Education: January 2025

Welcome to January’s Spotlight on AI in Education bulletin. With how fast things are moving, this will help you cut through the noise and catch what’s important. The bulletin highlights on-the-ground practice, institutional perspectives and trends in generative AI use across the sector and beyond. We hope you find this useful.

If you have anything you’d like to contribute or see in this bulletin please email EE@sussex.ac.uk

On-the-ground at Sussex

Workshop: A throwback to our second CoP and talking to students about GenAI

Read this if: You’re keen to address the hallucinating elephant in the room with your students.

Given it’s the start of term, we thought you’d like to be reminded of the work of Dr Andres Guadamuz, Reader in Intellectual Property Law (LPS), who spoke at the second CoP about the value of talking with students about AI.

Read more about our previous AI CoPs on the blog.


Institutional Perspective

Don’t miss out on having your voice heard for the AI Summit

Read this if: You want to have a say in developing institutional principles on the use of Generative AI in teaching, learning and assessment.

Look out for invitations to get involved and feed in your views. Each area (e.g. Faculty or Division) will complete a series of questions on this reporting form. We welcome responses from all areas of the university, as well as individual responses. Please respond via the form directly if you are unable to attend scheduled events – all submissions should be entered by Friday 28 March.

Find out more and book your place on the Summit.


Across the Sector

Jisc – Trends in assessment in higher education: considerations for policy and practice

Read this if: You’re interested in ’emerging trends in assessment and feedback within the rapidly evolving landscape of higher education’.

From the introduction: ‘As part of this initiative, insights were gathered from our assessment and feedback working group and key stakeholders across the sector. This work aims to provide valuable recommendations for policy and practice, supporting institutions in creating effective, inclusive, and innovative assessment strategies’.

Find out more and read the report.


Further Afield

US chatbot dominance takes a hit

Read this if: You’re interested in the landscape of Generative AI tools, notably the Chinese made, DeepSeek.

To say it caused a stir may be a contender for the understatement of 2025! Within days, DeepSeek was top of the Apple App Store, some stock prices plummeted, and then it had to limit sign-ups due to a cyber-attack. Since then, Italy has launched a GDPR query and OpenAI claims to have evidence DeepSeek used its work; it’s been quite a week for DeepSeek. As of writing, the latest is that DeepSeek will soon be coming to Windows Copilot+ PCs as well. But what is it, and why has it made waves? Check out the link below to find out more.

This BBC article has a good explainer and overview.


In case you missed it

Other links on the topic of AI in teaching and learning you may have missed.

A disclaimer on any tools not supported at Sussex: please do not share Sussex, student, colleague, sensitive or personal data via these platforms. ‘Not supported’ means they have not passed stringent data protection assessments, and using them could put you in breach of policy and legislation. For a list of supported platforms for teaching and learning, please visit the Educational Enhancement website.

This was a Spotlight on AI in Education update from Educational Enhancement

Posted in AI

31 Years at Sussex

by Terry Bryan, Online Distance Learning Coordinator

About me

It all began on the 14th April 1993 …. but before we get to Sussex, a little about me …. I was born in Wrexham, North Wales in 1970 (now famous for the Hollywood-owned football club). When I was in my late teens I would go hitchhiking with a friend, with no particular plan of where we were going. We once ended up camping in a cemetery on a hill overlooking Lyme Regis bay – we only left when the Rector from St Michael’s church asked us to leave because he needed the space for a burial plot! We arrived in Brighton after hitching a lift from some guys driving a hippy van. I loved Brighton so much I vowed to return one day.

Where it began and the journey

After moving to Brighton in Dec 1992, I saw a job ad in the Argus newspaper for a Secretary at the University of Sussex in the School of Cultural and Community Studies (Coordinators were called Secretaries in those days). I applied for the post and to my surprise got the job with no previous office experience (I did complete a PA course in Wrexham, which equipped me for office work). I was over the moon and very proud to tell my friends I worked at The University of Sussex.

I began work on 14 April 1993 as Music and Media Studies Secretary in CCS. Media Studies was in its infancy, and they needed admin support (hence me). Roger Silverstone (Silverstone Building) was Head of Media Studies and Jonathan Cross was Head of Music (Jonathan now teaches at Oxford). I was a jack of all trades, working for both Music and Media Studies, collecting exam scripts, organising parties, giving Music & Media tours to students and new Faculty, issuing harmony tests to wide-eyed musicians, taking Music students to Glyndebourne opera house (the list goes on).

One of my first tasks in the early nineties was to investigate a new electronic mail system called ‘email’.  If you wanted to reach someone across campus you would either phone or send a paper memo via staff pigeonholes. We had a very basic email system called SOLX1, so under the guise of research, I sent an email to a student called Craig at Hofstra University in Long Island, New York (this was revolutionary at the time), and I received a reply! I rushed to my line manager and said it works! I’m still friends with Craig from Hofstra on Facebook after 31 years. Another task was to set up the first Music webpage, which was very basic, with gaudy flashing icons (akin to a casino website!).

I left the Media Dept in approximately 2000 after a major restructuring and then solely supported the Music Department. I had additional roles such as Visiting and Exchange Student Coordinator, Student Attendance Monitor, and Coordinator for Faculty in Anthropology, Geography, Cultural Studies, Drama and English.

Working for Online Distance Learning

In August 2023 I left the Music Department after 30 years and started working as Online Distance Learning (ODL) Coordinator, which is part of the Educational Enhancement (EE) Department headed by Katie Piatt and Mellow Sadik, based in the Main Library. Many people at Sussex have no idea ODL exists, or how it works. The small team are amazing, friendly and do a terrific job. It’s like a mini-University where we deal with payments, admissions, teaching, statistics, enrolment, support etc. The EE team are also brilliant, and I’ve gained lots of knowledge regarding AI in teaching and learning.

People often ask: how could you stay in the same job for so long? My answer is always the same … working at Sussex isn’t the same every day … it’s varied and interesting. I’ve got to know thousands of students over time, many interesting Faculty from all disciplines, along with wonderful professional services colleagues who’ve come and gone over the years. I’ve also been through 3 major restructurings, moved office 6 times, witnessed 3 Chancellors, 5 Vice-Chancellors and countless line managers, and seen the landscape of campus change dramatically. And I’m still proud to say I work at Sussex University!

Posted in Educational Enhancement, Online Distance Learning (ODL)

Level Up Your Teaching: Video Games Aren’t Just for Gamers!

“If Pac-Man had affected us as kids, we’d all be running around in dark rooms, munching pills and listening to repetitive electronic music.”

Marcus Brigstocke

An image from the game Pacman. The yellow circle character of pacman is eating white dots followed by colourful ghosts.

Learning through how things work

Despite their commercial success, games are often still seen by some as trivial and lacking in legitimacy compared to more traditional media. Video games have often been dismissed as mindless fun, but they’re so much more than that. In fact, they could be the perfect teaching tool you didn’t know you needed. Bogost (2007, 2021) argues that videogames’ legitimacy as a medium requires more robust analysis than comparisons to other media and biding time until they gain acceptance. Indeed, Bogost suggests that “videogames open a new domain for persuasion, thanks to their core representational mode, procedurality”. “Procedural rhetoric” in video games involves designing the rules, mechanics and interactions of a game to create specific experiences and foster types of thinking. By engaging with these systems, players don’t just passively receive information; they actively learn by doing (an embodied experience). This makes video games uniquely suited to teaching complex, dynamic concepts that are difficult to convey through static media. Games break down abstract concepts and give students the freedom to experiment: fail, try again and succeed, while developing critical thinking and problem-solving skills (Tannahill, Tissington and Senior, 2012).

It’s About Immersive Learning

Imagine taking your students from passively reading about history to actively experiencing it. Games like Massira, which takes players through the refugee experience, allow students to engage with content on a deeper emotional level. This is “embodied learning”, where students learn by doing (Gee, 2008). They don’t just read about problems; they live them in a safe, simulated environment, making the experience much more impactful. Whether it’s exploring complex problems in economics, tackling ethical dilemmas in healthcare, or diving into the physics of space travel, video games offer an interactive and engaging approach to learning (Bogost, 2007).

Students can better understand the causes and effects of ideas and contexts when thrown into a game where their actions have consequences, learning through experimentation. This makes video games especially useful for teaching complex systems, which students can actively explore and manipulate. Cram, Hedberg, Gosper and Dick (2011) state that “when social actors experience a higher level of embodied interaction, they more effectively encode, convey, and decode individual and collective communicative acts”. Video games involving complex scenarios require strategic thinking and problem-solving, skills to be fostered in Higher Education.

Building Empathy Through Gameplay

In a classroom, you can explain concepts like empathy, but how do you make students feel it? Video games have the potential to evoke emotional responses that help players develop empathy. Procedures and processes that govern our experiences can feel too abstract or distant for us to truly understand and empathise with. The makers of the game ‘The Walking Dead’ studied the impact of activating mirror neurons, those responsible for understanding others’ emotions (Madigan, 2012). They were inspired by a study of chimpanzees and how they reacted to facial expressions, which led them to focus on the detail expressed on characters’ faces in their game design. That attention to facial cues paid off: players reported higher levels of empathy and emotional reaction to the plight of the characters, a success attributed to the triggering of mirror neurons. This suggests that by stepping into someone else’s shoes, or being exposed to other people’s experiences and feelings, students learn to see and feel the world from new perspectives.

Image from the game ‘The Walking Dead’: two characters with concerned looks on their faces.

Problems, solutions & reflection

Video games offer a playground in which to experiment in a situational context, take on ‘roles’ and simulate solutions from different perspectives, leading to better understanding and a more empathetic approach to effective problem-solving (Cram, Hedberg, Gosper and Dick, 2011). Not all problems are created equal: some have solutions that are correct and knowable, so a process can be followed objectively to a solution. At the other end of the spectrum, conflicting evidence, opinions and assumptions leave us with competing solutions, and our attitudes, emotions and values shape our ideas. A game can let you play with those conflicting solutions and challenge your own assumptions and biases. This intention does not always go to plan, as Bogost (2021) describes in a scenario where a tutor used a game in which students assumed the role of running a McDonald’s franchise – “a scathing critique of the multinational fast-food industry”. The tutor hoped that students would reflect on the corruption, environmental impact and questionable employment practices a large multinational organisation adopts in the name of profit. Instead, students reported increased empathy for the challenges faced by CEOs.
Effective use of video games in education doesn’t stop at just playing them. The most effective learning happens when students reflect on what they’ve done. After a game session, set aside time for discussions, problem-solving exercises, or even debates. What strategies worked? What didn’t? Reflection helps students solidify their understanding and apply it to real-world situations (Doney, 2019).

Takeaways for Your Teaching

Is it time to start taking video games seriously as an educational tool, or to rethink how we use them? They offer immediate feedback, encourage critical thinking, and make learning fun. Games provide a risk-free environment where students can explore, fail and try again – perfect for subjects where practical experience is key but real-world stakes are high. Next time you’re planning a lesson, why not consider a video game? Whether you’re teaching economics, history, healthcare or physics, video games have something to offer. Give them a try and watch your students level up their learning!

Places to go to get started

Test Tube Games “Bringing Science to life”.

Games 4 Change

Minecraft Education – The university has access to the library of pre-made (adaptable) Minecraft lessons covering a range of disciplines.

References

Ahn, S. J., Bessarabova, E., Bogost, I., Burgoon, J., Deen, M., Dunbar, N. E., Elizondo, J., Ferri, G., Flanagan, M., Grace, L. D., Hera, T. de la, Jacobs, R., Jansz, J., Jensen, M., Kaufman, G., Ketel, C., Kors, M., Lee, Y.-H., Miller, C. H., … Wilson, S. (2021) Persuasive gaming in context (J. Raessens, B. Schouten, J. Jansz and T. de la Hera, eds.). Amsterdam: Amsterdam University Press. Available at: https://doi.org/10.1515/9789048543939

Bogost, I. (2007) Persuasive Games: The Expressive Power of Videogames. Cambridge, MA: MIT Press.

Cram, A., Hedberg, J. G., Gosper, M. and Dick, G. (2011) ‘Situated, embodied and social problem-solving in virtual worlds’, Research in Learning Technology, 19(3). Available at: https://doi.org/10.3402/rlt.v19i3.17114

Doney, I. (2019) ‘Research into effective gamification features to inform e-learning design’, Research in Learning Technology, 27. Available at: https://doi.org/10.25304/rlt.v27.2093 (Accessed: 4 October 2024).

Gee, E., & Gee, J. P. (2017). Games as Distributed Teaching and Learning Systems. Teachers College Record (1970), 119(12), 1–22. https://doi.org/10.1177/016146811711901202

Gee, J. P. (2008) What Video Games Have to Teach Us About Learning and Literacy. 2nd edn. New York: Palgrave Macmillan.

Carolan, G. (2021) ‘“Papers, Please” – Using a Video Game to explore Experiential Learning and Authentic Assessment in Immigration and Asylum Law’, Irish Journal of Academic Practice, 9(2). Available at: https://doi.org/10.21427/PC79-AN45

Madigan, J. (2012) ‘The Walking Dead, mirror neurons, and empathy’, Psychology of Games, 7 November. Available at: https://www.psychologyofgames.com/2012/11/the-walking-dead-mirror-neurons-and-empathy/ (Accessed: 4 October 2024).

Tannahill, N., Tissington, P. and Senior, C. (2012) ‘Video Games and Higher Education: What Can “Call of Duty” Teach Our Students?’, Frontiers in Psychology, 3, pp. 1–10. Available at: https://doi.org/10.3389/fpsyg.2012.00210 (Accessed: 4 October 2024).


Posted in Educational Enhancement, Learning Design

Spotlight on AI in Education: December 2024

Welcome to December’s Spotlight on AI in Education bulletin. Given how fast things are moving, this bulletin will help you cut through the noise and catch what’s important. It highlights on-the-ground practice, institutional perspectives and trends in generative AI use across the sector and beyond. We hope you find it useful.

If you have anything you’d like to contribute or see in this bulletin, please email EE@sussex.ac.uk

On-the-ground at Sussex

Workshop: Introduction to Generative AI within teaching

Read this if: You’d like to learn how to use AI tools within your teaching at an introductory level.

Educational Enhancement are once again running this popular workshop on January 28th at 2pm. You will have the opportunity to get some hands-on experience using AI tools such as Microsoft Copilot, as well as learning the basic skills of prompt engineering and considering when and how to use AI to assist with your teaching.

Find out more and book on our events page. Book early to avoid disappointment.


Institutional Perspective

AI in Education at Sussex

Read this if: You want to have a say in developing institutional principles on the use of Generative AI in teaching, learning and assessment.

A series of Faculty and Division workshops will be held in the New Year to capture challenges, solutions and thoughts on AI in Education at Sussex. These will culminate in a cross-university summit in April. The output of the summit will be a set of institutional principles on AI in Education, as well as an action plan for the ongoing development of staff and student facing resources.

Save the date: Friday 11th April, 1pm–5pm, in person (spaces limited) or online. Registration will open in the new year.


Across the Sector

Generative AI strategies for Australian higher education: Emerging practice

Read this if: You’re interested in a comprehensive analysis by TEQSA (Australia’s quality assurance agency for HE) of action plans addressing the risk generative AI poses to academic integrity.

From the introduction ‘This toolkit has been informed by an analysis of the information institutions provided in response to our request. It seeks to support institutions in further developing and implementing effective strategies for meaningful and ethical integration of gen AI tools into teaching and learning practices, while also mitigating the risk gen AI poses to award integrity.’ It’s a meaty document, but thoughtfully laid out and includes suggested actions for institutions.

Read the document.


Further Afield

Some festive fun

Read this if: You want to try and spot the fake in ‘AI or REAL’ on BBC Bitesize.

We know by now that generative AI can be used to create plausibly human-created text, images and video. Until now, though, it has often been quite easy to spot something generated by AI; that is becoming more difficult. Here’s a light-hearted quiz to see if you can spot what’s real. How did you do?

Take the quiz on BBC Bitesize


In case you missed it

Other links on the topic of AI in teaching and learning you may have missed.

  • On the EE blog: Do not worry if you missed the latest Teaching and Learning with Generative Artificial Intelligence Community of Practice. There will be a write up in the new year. So make sure you’re subscribed to our blog to get the latest straight to your inbox.
  • Elsewhere online: We’ve been exploring Google’s NotebookLM*, but for those of you who are Spotify users, you may have experienced another use for the powerful platform. Yes, your Spotify Wrapped has been ‘enhanced’ by AI. Powered by NotebookLM, Spotify provides you with a personalised ‘My Wrapped AI Podcast’ with two ‘hosts’ discussing your listening habits.

*If you explore platforms not supported at Sussex, please do not share Sussex, student, colleague, sensitive or personal data via these platforms. Not being supported means they have not passed stringent Data Protection assessments, and using them in this way could put you in breach of policy and legislation. For a list of supported platforms for teaching and learning, please visit the Educational Enhancement website.

This was a Spotlight on AI in Education update from Educational Enhancement

Posted in AI, AI CoP

About our blog

We are the Educational Enhancement team at the University of Sussex. We publish posts each fortnight about the use of technology to support teaching and learning. Read more about us.
