Digifest 2025: Where today meets tomorrow!

Image by ChatGPT

The ICC in Birmingham played host to Digifest 2025, Jisc’s two-day flagship learning and technology conference. With 900 in-person attendees and a further 1,700 joining online, this year’s event boasted its biggest attendance yet.  

Digifest’s full programme featured over 60 sessions, including keynotes, panels, workshops, and breakout sessions, across several halls and lecture theatres, plus more than 40 exhibitors. There was no shortage of things to see and engage with, and I must confess I visited the E-Sports stand more often than I needed to (for work purposes and a shot at the leaderboard).   

Here are four of the main themes that I took away with me. 

Theme 1 – AI 

AI = Acronym Impossible  

One of the repeated themes throughout the conference relating to AI was reframing the acronym itself to be more helpful in understanding its use. Professor Paul Iske’s take on AI was that it should be called Assistance Intelligence, challenging us to rethink the role of technology not as a replacement, but as a tool to enhance human creativity and problem-solving. In a different session, AI was described as a sidekick rather than a replacement for teachers. The hopes were that AI could help identify struggling students, automate administrative tasks, and personalise learning experiences.  

Dr Sana Khareghani, meanwhile, asked us to think of AI as a practical tool, not a magical fix, preferring to call it artificial assistance rather than artificial intelligence. Sana insisted that AI is only as powerful as the intent and integrity behind its implementation, and that what we need for the future of AI is stronger foundations in infrastructure, talent, and data governance. 

Investment  

In the session “AI in Education: From Hype to Impact”, the speakers discussed the lack of time for staff and students to ‘properly’ engage with AI tools. They likened AI to any other hardware or software: “When you first start using it, you need to read the manual.” People, they felt, are not being given the resources to play with AI and learn how to be effective and critical in its use. With AI literacy in mind, one speaker advocated for an “AI-way Code”: a generic AI-literacy approach supporting staff and students to use any AI tool of their choice. One of their suggestions was to offer an optional 10-credit unit on all courses in which students would assess and academically scrutinise the output of AI. This would be completed at the start of a course and could then be applied to future studies. 

In several sessions, the discussion turned to apprehension that investing in AI would mean a loss of jobs. While acknowledging these concerns as valid, some panellists felt they had not seen actual job losses in their own experience. Others see AI as an opportunity to reduce some administrative tasks, “freeing us to concentrate on ‘raw’ teaching and the things that we really enjoy.” One panellist made the point that “We should aim to do less with less, instead of doing more with less… Why do we need to do more?” One example of recent change driven by AI automation is recruitment chatbots, which are already proving a valuable tool on some institutions’ websites. 

As with any technology, it is the implementation that often proves most difficult, and AI is no different. In his keynote, Paul Iske spoke of the “final mile” being where we see a lot of failure. However great a technology’s affordances, and with all the best intentions in the world, if the adoption process does not go well then the project can still fail. Several talks stressed “buy-in”, particularly from the top down. Investment in the implementation of AI needs to come from all levels, especially the most senior levels of management, for it to succeed. A senior champion who advocates for adoption is a powerful thing, I heard in several panels across the two days. 

Ethics  

The ethical implications of AI were discussed in relation to the data that users hand over. The panel felt that free access comes at a cost in real terms: people’s personal data and the information fed into an LLM is the sacrifice that is knowingly made. Both speakers talked about their institutions’ recommended AI tools (like Copilot at UoS), but noted there are no rules or policies for students using any other tools.  

Several panels reported the positive impact that equitable access had on students and staff. It presented opportunities for them to express themselves in different media and allowed for the adaptation of some assessments. However, the cost of site licences is extremely high, and not all tools are created equal. The question, therefore, is how to offer fair access to AI at the university without asking people to forgo their privacy.  

To further compound the issues facing AI use, Dr Khareghani’s keynote addressed information bias in AI, a reflection of the data and lived experiences being fed into it (mainly from the western, developed world), and urged institutions to move from being passive users to active creators: “Be an AI Maker, not an AI Taker.”  

A key takeaway was “AI is not magic dust, it requires investment in infrastructure, training, and ethical considerations to be truly transformative…” This seemed to be the consensus across the event. 

Theme 2 – Failure  

FAIL – First Attempt In Learning  

Professor Paul Iske, Chief Failure Officer (CFO), delivered an enlightening opening keynote, “Brilliant failures: working together, failing together, learning together”. As the founder of the Institute for Brilliant Failures, he emphasised the value of learning from setbacks.   

As part of the opening section of his presentation, the audience was asked to confess their errors, lapses, and failures in just three words, which were then projected in a gigantic word cloud on the main hall screen for all to see. This underscored just how universal the experience of failure is (I had bought a train ticket to Birmingham International instead of New Street). Iske used this to argue that if we can accept our failings in everyday or mundane things, then perhaps we can do the same for far more complex and multifaceted undertakings, such as trying to improve education systems. Maybe failing at that is not such a bad thing after all.  

A brilliant failure can be described as a well-prepared attempt to create value that has a different outcome than what was expected, and with that comes an opportunity to learn. Therefore, we can reframe the cost of failure by looking at the value in failing well. Iske calls this “failing forward”, which is a combination of social intelligence, pattern recognition and creativity leading to failure intelligence (failing well). Using failure intelligence, you are planning for success.  

Archetypes of failure 

Iske briefly touched on his methodology for failing and its sixteen archetypes. Here are just a few: 

  1. The Junk: Persisting with a failing project due to prior investments, leading to further losses.  
  2. The Elephant: Overcomplicating solutions, making them unwieldy and ineffective.  
  3. The Banana Peel: Small oversights or errors that lead to significant negative outcomes.  
  4. The Right Brain Hemisphere: Ignoring creative or intuitive insights, resulting in missed opportunities.  
  5. The Empty Spot at the Table: Excluding key stakeholders from decision-making processes results in a lack of support or unforeseen issues. 
Image by ChatGPT

His methodology looks really interesting and is something I am keen to explore further in the future (a potential blog post). When we plan a project, does it ever finish exactly as we planned, or do we ‘fail’ and create new ideas and possibilities along the way through divergent thinking? Improvisation and creativity are what separate humans from technology. Iske mentioned overhearing a conversation before the conference in which one of the participants said, “An AI knows everything but understands nothing”.  

Theme 3 – Accessibility & Inclusion 

The 4Cs 

Kellie Mote’s fireside chat on accessibility was, for me, one of the most impactful sessions of the event. Kellie was in conversation with Piers Wilkinson, Director of the National Association of Disability Practitioners (NADP), who immediately emphasised that all solutions and policies need to follow the 4 Cs approach: 

  • Co-produce 
  • Co-create 
  • Co-design 
  • Consultation 

Wilkinson is consistently exasperated at accessibility adaptations being made retrospectively to new spaces rather than embedded in the original design process. This is partly, he explained, because architects are not legally required to make buildings accessible. Another particularly striking example he gave was the irony of conducting digital poverty surveys online, further highlighting the need for more inclusive design thinking.  

Other key takeaways included consulting with students prior to an assessment instead of after they fail. One solution Wilkinson put forward for this was to pay disabled interns to help design accessible courses and assessments during the summer (a win-win solution, he claimed, as there are generally limited summer jobs for disabled people). 

The legal precedent set by the Abrahart v University of Bristol case, one I was not aware of at the time, emphasises the responsibilities facing all university staff. We were reminded that disability is one of the most diverse characteristics, meaning staff have a lot to learn. How confident do we feel in supporting and helping students with all disabilities?  

With regard to training, Wilkinson argued that guided tours are more important than immersive experience training. Being in situ and knowing where the issues are and what to do is far more helpful to disabled people needing support. Knowing where the fire doors are and how to open them is far better than spending the morning pretending to be disabled, he explained.  

Throughout this chat, it was clear that Piers Wilkinson is someone who does not mince his words, and this was further confirmed by his assertion that one of the key things for everyone to do when discussing DEI is to have blunt conversations. “It is better to say something imperfectly and learn than to stay silent…” he urged. “Develop and empower people to talk about DEI even though they have no lived experience”.  

What I came away with from this session, aside from an acknowledgement of my current limitations and low confidence in this important and diverse area, was how everything being said led back to one of Iske’s sixteen archetypes of failure, “The Empty Spot at the Table”. The 4 Cs are imperative to make sure we don’t fail those whom this impacts most.   

Theme 4 – Esports (for the win) 

When I read the title of this presentation, “If you don’t have an esports curriculum, why not?”, I thought I knew the answer. However, just fifteen minutes later, having resolved my cognitive dissonance over cost and implementation, I had no idea why the University of Sussex hadn’t embraced Esports. The British Esports Federation offers qualifications from level 4 to 7 on Business courses (I would imagine they could merge into some Informatics courses too). They were showing how Esports can 

“…transform your digital strategy and innovate your pedagogical approach and curriculum design to support the future workforce, to address skills gaps and engage a tech-agile, digital generation of young people that are in our classrooms.” 

Big Business 

The gaming industry’s value surpasses that of film and music combined, I was told. Pastorally, Esports present an opportunity for Sussex to compete nationally and internationally: the championships attract millions of viewers both in person and online, and the inaugural Esports Olympics will be held in Riyadh in 2027, all giving credence to the idea that Esports means business. Having just heard a panel discussion about investment in pastoral care to help secure the future of HE, this talk possibly resonated more strongly with me than it might have done at another time. The prospect of being part of the journey in educating and supporting the future creative workforce behind this industry is an enticing one. Or you could say their sales pitch worked on me. 

Final Thought 

Digifest 2025 was a whirlwind of ideas, challenges, and opportunities, from embracing failure to reimagining our biological potential. One thing was abundantly clear: no one has yet got to grips with the current disruption felt by AI.  

“Be part of the disruption of AI or prepare to be disrupted by AI”  

Khareghani (2025) 

The conversations around digital transformation, accessibility, and the future of AI will shape how institutions evolve in the coming years, but it is people who will lead that change. Maybe I am wrong in my assumption, but thankfully, I have never been happier with the idea of making a mistake and learning from it.  

Here’s to the next wave of brilliant failures, bold experiments, and transformative learning experiences. Until Digifest 2026! 

References 

Iske, P., 2021. Institute of Brilliant Failures: Make room to experiment, innovate, and learn. Amsterdam: BIS Publishers. 

Jisc, 2025. Digifest. [online] Available at: https://jisc.ac.uk/digifest [Accessed 27 March 2025]. 

Equality and Human Rights Commission. University of Bristol v Abrahart: equality watchdog responds to judgment in landmark case. [online] Available at: https://www.equalityhumanrights.com/university-bristol-v-abrahart-equality-watchdog-responds-judgment-landmark-case


Academic Developers June round up

a picture of a megaphone, a mobile phone, a lightbulb, a magnifying glass and the YouTube logo on a pink background, to indicate announcing information

Welcome to the June 2025 edition of the Academic Developers round up.

Sussex Academic Framework

Most of our focus at the moment is on supporting the implementation of the new Sussex Academic Framework. This includes providing direct support and guidance to our faculties, via bespoke workshops. 

We’re also helping to develop guidance and resources on the Sussex Academic Framework support site on Canvas, and adapting and updating our own resources so they better align with the Sussex Academic Framework and the Curriculum Design Principles within it.  

If you have any questions regarding this, please email your Academic Developer or Amanda Bolt. 

New web guidance 

We have a new guidance page on how to embed experiential learning into your curriculum.  We’ve also, this week, updated the Curriculum design step-by-step page. 

New on Learning Matters 

The Learning Matters site has a new look. We have a new banner image, and the tags have been streamlined to make it easier to search for case studies, particularly those aligned to Sussex curriculum design principles, framework and strategic priorities. If you would like to contribute to Learning Matters, there are various ways to do so; please visit our About page to find out more. 

A new Learning Matters article, What I have learnt from grading students on their participation, by Paven Basuita (Assistant Professor in Law) reflects on her experience of grading students on participation in Clinical Legal Education. She argues that assessing participation can offer a holistic, inclusive, and continuous evaluation of student learning. However, she also acknowledges the challenges of such an assessment and provides practical recommendations for others looking to assess participation in their own teaching. 

Elsewhere in HE 

Josh Fleming, the Office for Students’ Director of Strategy and Delivery, published a blog post last week, Embracing innovation in higher education: our approach to artificial intelligence, explaining the OfS position on artificial intelligence. 

Edinburgh Napier University have launched a new (free) Coursera course: Transforming Higher Education with GenAI: Enhancing Teaching, Learning and Student Engagement. 

Designed for educators, administrators, and technologists, the course explores how GenAI tools can support teaching, learning, and institutional practices – while critically engaging with their limitations, biases, and ethical dilemmas. 

Rather than offering simple solutions, the course focuses on helping you reflect on your own context, experiment safely, and develop tangible resources like lesson plans, policy frameworks, or teaching strategies that you can take back to your work. 

Educational Enhancement Office Move! 

Educational Enhancement will be saying goodbye to our home in the library this week, as we are moving to Level 1, Bramber House. 


Three Quick Fixes for Common Canvas Issues

Students said… you can… 

It’s that time of year again when we start thinking about setting up our Canvas module sites for the next academic year.  

Earlier this year members of Educational Enhancement worked with student connectors on a project to review and improve Canvas module templates. 

While there will be a more in-depth blog post later with more detail of the whole project, we wanted to share some tips for improving your Canvas modules, based on feedback from students and colleagues as part of the project, to help you when setting up your Canvas module sites for 25/26.  

Accessible PowerPoints 

Students said: “Some uploaded PowerPoints on Canvas have vital information hidden behind animations or off-screen” 

If PowerPoint files include animations or transitions, these will not work in the on-screen preview within Canvas. 

  • If animations are included, ensure students are instructed to download the file, or create a version of the slides which does not include animations. 

Clear Assignment Signposting 

Students said: “Assessment information and assessments are in different locations which can be irritating” 

Include a link to the module’s Assignments screen on your Assessment Information page (Note – the page which provides assessment information to students may be labelled differently, depending on which school template you are currently using): 

  • Open Assessment information page 
  • Click Edit in the top right 
  • Type “Submit assignment here” below each assessment detail 
  • Highlight the text 
  • Change the formatting so that it is obvious amongst other text on the page 
  • With the text still highlighted, click the link icon on the toolbar 
  • Select Module link 
  • Select Module Navigation from the pane on the right 
  • Select Assignments from the list of options 
  • Ensure this is labelled and formatted in such a way to make its purpose obvious to students (e.g. “Submit your assignment here” in a large bold font) 

Adding files to pages 

Academic Colleagues said: “I wish it was easier to add files to a page without first uploading to the Files area.” 

See guidance on how to upload files directly within a page 

  • Upload documents within Rich Content Editor 
  • Drag & Drop from File Manager 
  • Copy & Paste the file from file manager 

Follow the set-up instructions! 

And one last extra tip from us. 

Follow the module set-up guidance for your school when setting up your 25/26 modules. There are some important changes this year, due to the Student Information System project to replace Sussex Direct and our central student database. 

If you have any questions about setting up your 25/26 Canvas module site, please contact your Learning Technologist via educationalenhancement@sussex.ac.uk

For more information about the Student Information System project, visit the Student Information System project website. 


Academic Developers March round up


Inclusivity week – 31 March – 4 April 

Talks and workshops are taking place throughout the week highlighting some of the excellent inclusive teaching and assessment practices at Sussex, and there’s still time for you to sign up. Review the programme and sign up now.  

The Sussex Education Festival 

The Sussex Education Festival is back for its third year and will be held on Friday 2 May (9:30am–3:30pm) in the Woodland Rooms at the Student Centre. 

Registration is now open! We’d like to encourage all colleagues involved in teaching and learning to register to attend. The Festival will consist of a number of different session types from speakers from all faculties. This will include lightning talks, case studies and panel discussions, which are focused around themes such as inclusion and belonging, student engagement, generative AI, embedding employability, group work, and outdoor learning. The full programme will be available shortly. 

We’re excited to celebrate and reflect on all the amazing work that goes into teaching and learning here at Sussex. We hope to see you there! 

Register your place 

Learning Matters 

Read the latest posts on the Learning Matters blog

We welcome all colleagues at the University to write a blog or an article, present a case study, or collaborate on a podcast around their teaching and/or student learning at Sussex. If you would like to contribute, or would like to find out more, please contact k.r.horne@sussex.ac.uk

Gen AI Practical tips for teachers

The popular University of Kent Digitally Enhanced Education webinars last week focused on AI in Assessment – How Universities Are Responding and Innovating. The recordings are now available if you’d like to catch up or share them with colleagues. 

You can also register to join their second webinar on the same theme on Wednesday 2 April, 14:00–16:30 (BST).  

The Advance HE ‘Let’s Talk About Student Success’ Podcast

Join the Advance HE monthly conversations with higher education’s thought leaders – a space where insights flow as freely as coffee, and where experience meets innovation in supporting student success. 

Participate in a research project  

Take part in a project being conducted by colleagues from a number of UK Universities into personal tutoring and academic advising practices within UK HE. The study consists of a short survey which will take approx. 15 minutes to complete. 


When does assistive technology become prohibitive technology?

By Dan Axson (Learning Technologies Manager)

In May 2024, I wrote a post titled ‘What if our students with disabilities didn’t have to jump through hoops to have an equitable experience’. You’ll be forgiven for not having read it; it never saw the light of day (hence the very un-catchy title). Amongst the noise of a never-ending flurry of AI-related content, I never quite got it finished. In it, I took a meandering wander through my thoughts and experience of the onerous admin, lack of agency and impacts on cognitive load that a student with disabilities may experience. Naturally, given we’re in the ‘AI is mentioned in all the things’ era, the topic came up. The post made the case that university policies and guidance on the use of generative AI technology for teaching, learning and assessment may inadvertently put students with Disabled Students’ Allowance (DSA) funded software at odds with academic integrity policies.

Fast forward ten months and the topic, to little shock to anyone, is not only still very relevant but becoming more so. Then an email thread shared with me recently resurrected the ideas. Discussed in the email were the complexities of approving software in a space where the software companies themselves can update tools to add AI ‘features’ without notice, where students’ understanding or awareness of institutional policies can’t fairly be relied on, and where the DSA assessor’s ability to find those policies is an ‘us’ problem, not a ‘them’ problem. So here is that post, dusted off and made fit for 2025.

 Robot holding out hand in supportive manner. Hand fills frame. 
Photo by Possessed Photography on Unsplash 

From dependent to independent

As anyone in the Learning Technology game will tell you, conversations are largely dominated by generative AI. Recently, we have been testing a tool called Jamworks. In short, the tool generates a transcript of a recording, for example a recorded lecture, then uses that transcript for a number of very well-thought-out AI features such as summaries, key points and flashcards. Part of our conversations around the use of such tools in the classroom was how hugely valuable this would be to someone who relies on a notetaker. The notes are generated almost immediately; there is no waiting for the notetaker to send them on; and, unlike with a human notetaker (who may not be a subject expert), the student can ask questions of the notes. In other words, the technology enables much more autonomy and independence than existing approaches with human notetakers. This is of course just one example; there are many.

We know that assistive technologies provided through needs assessment are essential for students. However, in an increasingly complex technology environment, how do we ensure the technology doesn’t put students at odds with academic integrity?


It’s a trap

As noted, advances in technology have a lot of potential to make things better for people with disabilities; we ignore and suppress them at our peril. A concern, though, is how easy it might be to create barriers indirectly, through the wording and enforcement of academic integrity policies, or directly, through mitigations. For example, it would be possible for someone relying on a tool such as Grammarly to fall foul of the proofreading policy. Further, we’ve seen an increase in inaccessible content, like photos of text, being uploaded to quizzes to prevent AI tools copying the text, overlooking the fact that optical character recognition (OCR), the ability for software to read text in an image, is getting the same AI love as everything else. This is simply not an acceptable approach, but it highlights HE’s immediate response to advances in technology: pulling up the drawbridge and bolstering defences by compromising on accessibility, and if you’re caught on the other side, you’re kind of on your own… for now.

However, for the most part our students do not fall foul of policy by using these tools routinely for teaching and learning. Our Skills Hub guidance on Grammarly states:

‘Can I use Grammarly? It’s fine to use Grammarly to improve basic grammar, spelling and punctuation errors. However, you should not use Grammarly’s generative AI features to meaningfully change your work. The same restrictions apply to Grammarly as they do to a proofreader, and you should think carefully about which changes you accept, as it remains your responsibility to ensure the accuracy of your work.’

Sam Hemsley, Academic Developer, follows up:

‘As above, just using AI in an assistive capacity isn’t contravening any rules/isn’t academic misconduct (for students with DSA provided assistive tech and those using genAI tools in general). There is nothing, inherently, wrong with AI being built into assistive tech. As with Grammarly, I suspect the onus is currently on the student to ensure use of such tools doesn’t contravene academic integrity policy and/or explicit permissions from module convenors, should they allow use of AI in the creation of assessed submissions.’


Whose problem is it anyway?

Given the demands already placed on this group of students, having them police the software they are recommended is perhaps a stretch too far. As Learning Technologist Helen Morley points out:

‘It wouldn’t be appropriate to place the onus on the students to police the assistive technology (AT) they have been given […]. As for AT they’ve sourced/bought themselves, I’d be wary of the providers advising them of any changes in a way which is clear and accessible enough. Of course, students use AT for a variety of reasons but for those who [are] already managing pressure on their executive function, energy levels, stress levels etc. this is an unreasonable expectation and turns something from assistive to prohibitive.’

I have to agree, and whilst we need to look at reducing this extraneous demand, I strongly believe one way we can do so is by (yes, you guessed it) building the digital capabilities of our staff and student body. Being able to have an informed discussion with your assessor doesn’t place the onus on you, but enables you to feel more at ease with the solutions provided. Conversely, teaching staff will be more familiar with the tools, their capabilities, and the needs they are addressing; between them all, they will know where the boundaries are and how to avoid crossing them. Sounds too good to be true? Maybe, maybe not, but it has to be worth a try.

I don’t think it’s a stretch to say that an improvement in skills equates to an improvement in equity. Yes, I can hear you saying “what about curriculum design?”, and you’re right, it’s a key element. Frameworks such as Universal Design for Learning and the university’s own curriculum review will address some of the challenges we’re seeing, but my view is that engaging meaningfully with the ‘threat or opportunity’ of advances in technology is fundamentally a skills question.

Skills led equity

There are a number of barriers to the mass adoption of generative-AI-associated skills development: resources, content, training, time, willingness, strategic direction and language being just a few. So where do you start with addressing these challenges?

I suggest language is the starting point: a common language of digital skills. What’s the difference between a chatbot and an LLM? How do I talk to my students about ‘hallucinations’? What about relying on summarised text? Does everyone use the same terms in the same way? When a DSA assessor is looking for institutional policy on the use of generative AI tools, wouldn’t it be easier if the terminology were the same across institutions? A reach, I know, but it makes the point.

So, what do we do about it?

My call to action here is to think explicitly about the digital skills you and your students have, and where you might like to improve confidence, to ensure a fair, equitable and barrier-free approach to the use of assistive technology for those who need it. Start by talking to your students who are using these tools and find out how they are using them. It may then follow that, when a student is assessed, they are better able to understand the implications of certain technologies for your teaching and assessment.

Then, find opportunities to engage with the conversation, whether through the internal AI summit (contribution deadline 28 March 2025), the AI in Education Community of Practice, or other development opportunities on these topics. We’re all learning as we go; come join us.

Posted in Accessibility, Uncategorized

Coming back to Educational Enhancement

By Katie Turner, Educational Enhancement Coordinator

I recently returned to Educational Enhancement after a secondment in Communications and completing my apprenticeship in Digital Marketing. If I were a student, I would describe it as an exchange trip: going away to gain new experiences and skills. This time away has given me further knowledge and understanding of different areas that I hope to bring back to EE.

The importance of communication

Communication matters, whether with our colleagues, staff or students. I had the opportunity to work on campaigns for different projects with staff and students. The way we talk about subjects to different audiences can have a significant impact on the message we are trying to convey, and the channel we use shapes how we write for it. This means not only writing but other media forms such as video, animation and photo galleries.

Engagement

Whilst doing my digital marketing apprenticeship, I had the opportunity to look at the data analytics that can be used to capture engagement across different channels. These are used heavily in Communications for every article or video launched as part of a campaign. I hope to use these data-collecting skills to capture engagement with the different events and blogs the EE team sends out. One subject that really stood out on my digital marketing apprenticeship was understanding your customer’s journey.

What is a customer journey?

A customer journey refers to the series of steps a customer takes when interacting with a product or business, from first awareness of a product or service through to post-purchase interactions and loyalty. Understanding this journey helps a business or department see its services through its audience’s eyes.

Consider students or staff trying to find a piece of information. A good exercise from the course was to step into the shoes of a persona, in this case an academic or a student, and roadmap their attempt to gather that information or professional development. To do this we created a persona, mapped out what information they might want to reach, and then followed their journey, noting how they would find it and how they felt at each ‘moment of truth’, whether the experience was a good or a bad one.
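For anyone who likes to think in code, the exercise above can be sketched as a small data structure. This is only a toy illustration (the persona, stages and touchpoints are invented for this example, not part of any real journey map):

```python
from dataclasses import dataclass, field

@dataclass
class Touchpoint:
    stage: str    # e.g. "Awareness", "Search", "Resolution"
    action: str   # what the persona does at this step
    feeling: str  # "happy" or "frustrated" at this moment of truth

@dataclass
class JourneyMap:
    persona: str
    touchpoints: list = field(default_factory=list)

    def pain_points(self) -> list:
        # Moments of truth that went badly are the candidates for improvement
        return [t for t in self.touchpoints if t.feeling == "frustrated"]

journey = JourneyMap(persona="Academic looking for AI guidance")
journey.touchpoints += [
    Touchpoint("Awareness", "Hears about AI guidance at a workshop", "happy"),
    Touchpoint("Search", "Cannot find the relevant web page", "frustrated"),
    Touchpoint("Resolution", "Is pointed to the EE guidance pages", "happy"),
]

for t in journey.pain_points():
    print(f"{t.stage}: {t.action}")
```

Walking a persona through each touchpoint and flagging the frustrating ones is exactly the mapping exercise described above, just written down explicitly.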

In Educational Enhancement the team focuses on assisting academics with teaching and learning in all its aspects, from the technology side (such as providing online classrooms) to the pedagogy of teaching. The guidance for staff has been thought about from every angle and is kept up to date with newer trends such as the use of AI. Our main sources of information can be found on the Educational Enhancement webpages. The team is dedicated to helping academics in schools and runs online and in-person workshops on topics relating to enhancing teaching and learning. Staff who want to study in their own time can use a self-study course on Canvas.

I am excited to bring this knowledge back to Educational Enhancement and to combine my new skills with the team’s expertise to help academics and students in a more meaningful way.

Posted in Educational Enhancement, Professional Development

DeepSeek, AI advancement and the path forward for AI in education

The world of AI is a fast-moving place, shown very clearly by the arrival a few weeks ago of a new AI model from a Chinese company called DeepSeek. The whale in the company’s logo is very fitting, as the release of their V3 model certainly made a splash!

Although perhaps a shark logo would have been more appropriate given the impact over the last few weeks. The release of the V3 model immediately wiped almost $600bn (£482bn) off Nvidia’s market value, the biggest one-day loss in US stock market history, and sparked panic among American AI companies and policymakers. But why the fuss, and how does this relate to the future of AI in education?

Why is it such a big deal?

There were a few reasons this release was such a big deal, including its low cost, its low power usage and the fact it’s been made open source. Let’s explore what each of these points means.

Cost

DeepSeek was reportedly trained on a budget of around $6 million, a fraction of the cost of the American models. DeepSeek V3 is comparable to the most powerful models that competitors like ChatGPT have developed and, more importantly, DeepSeek gives access to its model for free. You would have to pay almost $200 a month to get access to ChatGPT’s comparable model.

From testing, the results are essentially the same. OpenAI has a few more features, but the models’ processing quality and speed are roughly on par.

Power usage

AI models need a lot of processing power to train. The best way is to use GPUs (graphics processing units), which can do advanced processing of data in parallel. The US has restricted Chinese access to Nvidia’s H100 GPUs because of their usefulness in training AI models. By restricting them, American policy aimed to prevent China from developing strong AI models… or so it was thought!

But necessity is the mother of invention, and with Chinese companies like DeepSeek forced to use much weaker GPUs (they used Nvidia’s much slower H800s), they worked out ways of working smarter, programming models that got the same result using much less processing power. DeepSeek researchers then published a paper showing how they achieved this result and how others can do the same.

Open source

DeepSeek’s model has also been released as open source. The model can be used for free by anyone locally, which means it runs entirely on an individual’s own computer (there is a cut-down model that can be hosted on a reasonably powerful computer and a full model which requires a very powerful computer to run; it’s still out of reach for most people, but closer than it was). This open-source approach is in opposition to other AI models, such as OpenAI’s, Google’s and Microsoft’s, which are closed: you cannot run your own version, only use them via those companies’ servers.

Running your own model means you have complete control over how your data is used, and you aren’t beholden to a company, which has clear advantages, as we’ll cover later.
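To make this concrete, here is a minimal sketch of what talking to a self-hosted model can look like. It assumes you have a local model server such as Ollama running a DeepSeek model on its default port; the model name and endpoint are illustrative, not a recommendation:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a chat payload in the format used by Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response rather than a token stream
    }

payload = build_chat_request("deepseek-r1:7b", "Explain open-source licensing in one sentence.")

# Nothing leaves your machine: the request goes to a server you run yourself.
# Uncomment to send it to a locally running Ollama instance:
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["message"]["content"])
```

The point is the address: `localhost` rather than a company’s cloud. Your prompts and the model’s answers never touch an outside server.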

Should we be using it?

There’s no real reason to be less trustful of DeepSeek than of any other model. DeepSeek was created in China and so has some censorship built into it by the Chinese government; however, American companies’ models have strong connections to the current American government and come with their own biases and potential data issues.

You could argue that DeepSeek is currently the most ‘trustworthy’ (a term used loosely) of the big models, or at least the best of a bad bunch, given its release of an open-source model and free sharing of its research compared to the American companies’ secrecy. It is also, at the time of writing, the only one not to have a contract with defence or defence-adjacent weapons companies (Google has recently withdrawn its policy against developing weapons systems). DeepSeek also uses the least power to run queries, making it the most ‘sustainable’ (again, used loosely) of the current crop of models.

We should apply the same critical thinking to any AI model, regardless of which company it comes from. This means you shouldn’t put any private or personal information into any AI model that hasn’t been approved for use by the institution. Don’t download any AI tools directly to a computer unless they’ve been approved. Assume all models have biases and may be aligned to certain views and values. Don’t assume anything a model tells you is automatically true; it’s healthy to apply these principles to any AI model you use, regardless of where it comes from.

The ideal path forward for AI models at Universities 

DeepSeek does, however, point us in the direction of a more ethical and safer future for AI: smaller, more efficient open-source models that are locally hosted and run by individuals or institutions, entirely under their own control, without the data being taken or used by an outside company. This would also allow for more ethical models with less bias and censorship. Currently there are cost barriers, but a model like DeepSeek can run on a computer that costs a fraction of what ChatGPT and similar models take to run.

The advantages of these models would be multifaceted. First, a decoupling from what are largely unethical and unsustainable companies, meaning that students and staff could use AI models without worrying that their data is being harvested. Secondly, models could be tuned to be freer of bias and censorship.

Of course, for this to come to fruition we’d need a few more technological leaps to make even smaller models that use much less power, but DeepSeek’s progress is an encouraging step in the right direction and more competition will hopefully breed more innovation.  

For better or worse we cannot avoid the proliferation of AI into our society and institutions, and so, taking an optimist’s view, I’d like to hope we can move away from a corporate ownership model and put such tools in the hands of universities that can use them for the public good. We can but hope!

Posted in AI, Educational Enhancement

Academic Developers February round up

[Image: a megaphone, a mobile phone, a lightbulb, a magnifying glass and the YouTube logo on a pink background, indicating announcements]

Welcome to the February 2025 Academic Developers round up, where we share information, events and news.

Sussex Education Festival 

The Sussex Education Festival is back for its third year! On Friday 2 May, colleagues from across the university will come together to share their experiences, research and reflections on teaching and learning here at Sussex. Further information, and a Call to Participation is now live on the Staff Hub. 

New and updated Educational Enhancement Advice and Guidance web pages 

A new Assessment Equivalencies page provides guidelines for determining student assessment workloads and equivalences.  

Our Curriculum design step by step page has improved guidance on writing or reviewing course and module learning outcomes. See also our updated and expanded guidance on oral assessments, which now includes lots of ideas for oracy-based teaching and assessment activities.

The page on AI and academic integrity now includes general guidance on what to do if AI use is suspected, and the Assessment in an AI world page now lists assessment types in order from ‘highly vulnerable to academic misconduct using AI’ to ‘inherently resilient’.

In other AI news, see our most recent Spotlight on AI in Education (January 2025), join the newly launched Teaching and Learning with AI Community of Practice Teams space, and find out more about the Sussex AI in Education Summit planned for April.

New on Learning Matters is Episode 5, featuring Dr Sophie Anns (Associate Professor of Psychology) and her work on creating an autism-friendly university. If you have a spare 30 minutes, give it a listen (or read the transcript). It’s really interesting!

Events and workshops organised by Educational Enhancement

Learning Technologist workshops are now available: free staff development workshops to support staff using technology in teaching and learning, bookable via the Staff Hub.

Enhancing assessment and feedback: A case study compendium (Published in October by Advance HE) features a case study from Angela Gao (USBS) titled: “Changing the paradigm: rethinking assessment in the AI era”. 

Employability 

Between 250 and 300 students per year benefit from the employability-focused Business Law and Practice module taught in the School of Law, Politics and Sociology, which has been shortlisted in the LexisNexis Legal Awards University Commercial Impact Award category. Read on Learning Matters about (1) the module convenors’ approach to working with students to develop the module, and (2) the module content, delivery and student feedback.

Newly published: The Advance HE 2025 employability case study compendium. Nineteen case studies from across the sector. 

Posted in Uncategorized

About our blog

We are the Educational Enhancement team at the University of Sussex. We publish posts each fortnight about the use of technology to support teaching and learning. Read more about us.
