
The ICC in Birmingham played host to Digifest 2025, Jisc’s two-day flagship learning and technology conference. With 900 in-person attendees and a further 1,700 joining online, this year’s event boasted its biggest attendance yet.
Digifest’s full programme featured over 60 sessions, including keynotes, panels, workshops, and breakout sessions, across several halls and lecture theatres, plus 40+ exhibitors. There was no shortage of things to see and engage with, and I must confess I visited the Esports stand more often than I needed to (for work purposes and a shot at the leaderboard).
Here are four of the main themes that I took away with me.
Theme 1 – AI
AI = Acronym Impossible
One of the repeated themes throughout the conference was reframing the AI acronym itself to be more helpful in understanding its use. Professor Paul Iske’s take was that AI should be called Assistance Intelligence, challenging us to rethink the role of technology not as a replacement, but as a tool to enhance human creativity and problem-solving. In a different session, AI was described as a sidekick rather than a replacement for teachers. The hope was that AI could help identify struggling students, automate administrative tasks, and personalise learning experiences.
Dr Sana Khareghani, meanwhile, asked us to think of AI as a practical tool, not a magical fix, preferring to describe it as artificial assistance rather than artificial intelligence. She insisted that AI is only as powerful as the intent and integrity behind its implementation, and that the future of AI requires stronger foundations in infrastructure, talent, and data governance.
Investment
In the session “AI in Education: From Hype to Impact”, the panel spoke about the lack of time for staff and students to ‘properly’ engage with AI tools. They likened AI to any other hardware or software: “When you first start using it, you need to read the manual.” The feeling was that people are not being given the resources to play with AI and learn how to be effective and critical in its use. With AI literacy in mind, one speaker advocated for an “AI-way Code”: a generic AI-literacy approach supporting staff and students in using any AI tool of their choice. One suggestion was to offer an optional 10-credit unit on all courses in which students would assess and academically scrutinise the output of AI. This would be completed at the start of a course and could then be applied to future studies.
In several sessions, the discussion turned to the apprehension that investing in AI would mean a loss of jobs. While acknowledging these concerns as valid, some panellists noted that they had not seen actual job losses in their own experience. Others see AI as an opportunity to reduce administrative tasks, “freeing us to concentrate on ‘raw’ teaching and the things that we really enjoy.” One panellist made the point that “We should aim to do less with less, instead of doing more with less… Why do we need to do more?” One example of recent change driven by AI automation is recruitment chatbots, which are already proving a valuable tool on some institutions’ websites.
As with any technology, it is the implementation that often proves most difficult, and AI is no different. In his keynote, Paul Iske spoke of the “final mile” being where we see a lot of failure. However great the technology’s affordances, and with all the best intentions in the world, if the adoption process does not go well, the project can still fail. Several talks stressed the importance of “buy-in”, often framed as a top-down approach: investment in the implementation of AI needs to come from all levels, especially the most senior levels of management, for it to succeed. As I heard in several panels across the two days, a senior champion who advocates for adoption is a powerful thing.
Ethics
The ethical implications of AI were discussed in relation to the data that users hand over. The panel felt that free access comes at a real cost: people’s personal data and the information fed into an LLM are the sacrifice that is knowingly made. Both speakers talked about having their own recommended AI tools (like Copilot at UoS), but there are no rules or policies for students using other tools.
Several panels reported the positive impact that equitable access had on students and staff. It presented opportunities for them to express themselves in different media and allowed some assessments to be adapted. However, the cost of site licences is extremely high, and not all tools are created equal. The question, therefore, is how to offer fair access to AI at a university without asking people to forgo their privacy.
Further compounding the issues facing AI use, Dr Khareghani’s keynote addressed information bias in AI, which reflects the data and lived experiences fed into it (mainly from the western, developed world). She urged institutions to move from being passive users to active creators: “Be an AI Maker, not an AI Taker.”
A key takeaway was “AI is not magic dust, it requires investment in infrastructure, training, and ethical considerations to be truly transformative…” This seemed to be the consensus across the event.
Theme 2 – Failure
FAIL – First Attempt In Learning
Professor Paul Iske, Chief Failure Officer (CFO), delivered an enlightening opening keynote, “Brilliant failures: working together, failing together, learning together”. As the founder of the Institute for Brilliant Failures, he emphasised the value of learning from setbacks.
As part of the opening section of his presentation, the audience was asked to confess their errors, lapses, and failures in just three words, which were then projected in a gigantic word cloud on the main hall screen for all to see. This underscored just how universal the experience of failure is (I had bought a train ticket to Birmingham International instead of New Street). Iske used this to pose a question: if we can accept our failings in everyday, mundane things, can we do the same for far more complex and multifaceted challenges, such as trying to improve education systems? Maybe failing at that is not such a bad thing after all.
A brilliant failure can be described as a well-prepared attempt to create value that has a different outcome than expected, and with that comes an opportunity to learn. We can therefore reframe the cost of failure by looking at the value in failing well. Iske calls this “failing forward”: a combination of social intelligence, pattern recognition, and creativity that leads to failure intelligence (failing well). With failure intelligence, you are planning for success.
Archetypes of failure
Iske briefly touched on his methodology for failing and its sixteen archetypes. Here are just a few:
- The Junk: Persisting with a failing project due to prior investments, leading to further losses.
- The Elephant: Overcomplicating solutions, making them unwieldy and ineffective.
- The Banana Peel: Small oversights or errors that lead to significant negative outcomes.
- The Right Brain Hemisphere: Ignoring creative or intuitive insights, resulting in missed opportunities.
- The Empty Spot at the Table: Excluding key stakeholders from decision-making, resulting in a lack of support or unforeseen issues.

His methodology looks really interesting and is something I am keen to explore further (a potential blog post). When we plan a project, does it ever finish exactly as planned, or do we ‘fail’ and create new ideas and possibilities along the way through divergent thinking? Improvisation and creativity are what separate humans from technology. Iske mentioned overhearing a conversation before the conference in which one participant said, “An AI knows everything but understands nothing”.
Theme 3 – Accessibility & Inclusion
The 4Cs
Kellie Mote’s fireside chat on accessibility was, for me, one of the most impactful sessions of the event. Kellie was in conversation with Piers Wilkinson, Director of the National Association of Disability Practitioners (NADP), who immediately emphasised that all solutions and policies need to follow the 4 Cs approach:
- Co-produce
- Co-create
- Co-design
- Consultation
Wilkinson is consistently exasperated by accessibility adaptations being made retrospectively to new spaces rather than embedded in the original design process. He explained that this is partly because architects are not legally required to make buildings accessible. Another particularly striking example he gave was the irony of conducting digital-poverty surveys online, further highlighting the need for more inclusive design thinking.
Other key takeaways included consulting with students before an assessment instead of after they fail. One solution Wilkinson put forward was to pay disabled interns to help design accessible courses and assessments over the summer (a win-win, he claimed, as summer jobs for disabled people are generally limited).
The legal precedent set by Abrahart v University of Bristol, a case I was not aware of at the time, emphasises the responsibilities facing all university staff. We were reminded that disability is one of the most diverse characteristics, meaning staff have a lot to learn. How confident do we feel in supporting students with every kind of disability?
With regard to training, Wilkinson argued that guided tours are more important than immersive ‘experience’ training. Being in situ and knowing where the issues are and what to do is far more helpful to disabled people needing support. Knowing where the fire doors are and how to open them, he explained, is far better than spending the morning pretending to be disabled.
Throughout this chat, it was clear that Piers Wilkinson is someone who does not mince his words, and this was further confirmed by his assertion that one of the key things for everyone to do when discussing DEI is to have blunt conversations. “It is better to say something imperfectly and learn than to stay silent…” he urged. “Develop and empower people to talk about DEI even though they have no lived experience”.
What I came away with from this session, aside from an acknowledgement of my current limitations and low confidence in this important and diverse area, was how everything being said led back to one of Iske’s sixteen archetypes of failure, “The Empty Spot at the Table”. The 4 Cs are imperative to make sure we don’t fail those whom this impacts most.
Theme 4 – Esports (for the win)
When I read the title of this presentation, “If you don’t have an esports curriculum, why not?”, I thought I knew the answer. However, just fifteen minutes later, having talked myself past my reservations about cost and implementation, I had no idea why the University of Sussex hadn’t embraced Esports. The British Esports Federation offers qualifications from Level 4 to 7 on business courses (I would also imagine they could merge into some Informatics courses too). They were showing how Esports can
“…transform your digital strategy and innovate your pedagogical approach and curriculum design to support the future workforce, to address skills gaps and engage a tech-agile, digital generation of young people that are in our classrooms.”
Big Business
The gaming industry’s value surpasses that of film and music combined, I was told. Pastorally, Esports present an opportunity for Sussex to compete nationally and internationally; the championships attract millions of viewers both in person and online. The inaugural Esports Olympics will be held in Riyadh in 2027, all giving credence to the idea that Esports means business. Having just heard a panel discussion about investment in pastoral care to help secure the future of HE, this talk possibly resonated more strongly than it might have done at another time. The prospect of being part of the journey in educating and supporting the future creative workforce behind this industry is an enticing one. Or you could say their sales pitch worked on me.
Final Thought
Digifest 2025 was a whirlwind of ideas, challenges, and opportunities, from embracing failure to reimagining our biological potential. One thing was abundantly clear: no one has yet got to grips with the current disruption felt by AI.
“Be part of the disruption of AI or prepare to be disrupted by AI”
Khareghani (2025)
The conversations around digital transformation, accessibility, and the future of AI will shape how institutions evolve in the coming years, but it is people who will be leading that change. Maybe I am wrong in my assumption, but thankfully, I have never been happier with the idea of making a mistake and learning from it.
Here’s to the next wave of brilliant failures, bold experiments, and transformative learning experiences. Until Digifest 2026!
References
Iske, P., 2021. Institute of Brilliant Failures: Make room to experiment, innovate, and learn. Amsterdam: BIS Publishers.
Jisc, 2025. Digifest. [online] Available at: https://jisc.ac.uk/digifest [Accessed 27 March 2025].