What Use are QUESTIONS to a Learning Technologist?

by Dan Axson, Learning Technologies Manager, University of Sussex

A 3D rendered image of a pile, similar to jigsaw puzzle pieces of question marks. On top of the pile sits the word 'what' in capital letters.

In my previous posts I put forward the first two of (what I believe are) three key attributes of any Learning Technologist (LT): kindness to yourself and others and (using the analogy of active noise cancellation) the ability to make sense of trends in technology and their impact on teaching, learning and assessment. 

In what I guess is now a series, I ask what’s next and what comes after attitude and knowledge? For me, the answer is tools, and one tool in particular – The Question.

Is asking questions the most essential tool an LT has available?

The ability to ask the right ones at the right times to the right people is a skill and, like (nearly) any skill, it can be acquired, practised and applied in any scenario facing an LT. To explore the utility of questions, I’ll use three themes of LT work: technology, pedagogy and strategy.

Technology: What happens when we don’t know the answer?

A common feeling amongst LTs (and likely those in similar roles) when we first start is one I’m sure you’ll all recognise: frustration at not knowing all the answers. I’ve been there, I know our team of LTs have been there and I have little doubt future LTs in EE (Educational Enhancement) will have the same feeling. There’s only so much onboarding and official training one can do; sometimes you just need to get stuck in and give it a go. Embrace the chaos, remember?

…if you think you’ve asked enough, ask another…

The problem is, how do we help someone when we don’t know the answer?

The solution: Ask questions. Lots of them. And if you think you’ve asked enough, ask another. Ask many questions – of your colleagues, of yourself, of the internet, of the people coming to you with the query… In scenarios where we need to support someone on an unfamiliar process, we can still probe the issue and gather intel to help us ask the right questions of our experienced colleagues.

For example:

  • When does the issue happen?
  • Is it happening to everyone?
  • Can you give me an example?
  • Has it happened before?
  • Who reported it?

These questions don’t require any knowledge of the process or the platforms, but they’ll get you closer to understanding where an answer might be.

Pedagogy: How can we clarify what’s needed?

Imagine you’re asked to support a module convenor in improving student satisfaction on their module. They want to improve engagement during contact hours and create a better sense of community amongst the students. There is a lot to unpack here, so where do you start?

With questions of course! Are students saying they feel a lack of community or are they not turning up to lectures (which is often – rightly or wrongly – used as an indicator of engagement)? Or is this being assumed through stats on VLE usage, for example? Each one of these has different routes to explore potential resolutions but only by asking clarifying questions can we know which route to take.

Even then, we still need to find out more before we can start to suggest solutions. We need the parameters for any intervention, and a sense of the skills that the convenor and their students already have. We might need to come back to the LOs (learning outcomes). What are the students saying about the module? Is this written down somewhere? What has been tried already? Is there anything they’ve seen elsewhere they’d like to try? And so on.

Strategy: When is the right time to question everything?

When it comes to conversations around conventions, processes, updates, initiatives, future developments, or (perhaps especially) when you hear the phrase, “but this is how we’ve always done it”, typically the only answer is to question everything, starting with my two personal favourites: “Why?” and “What if?”

To be clear, I don’t mean super broad, super long-term organisational strategy here (though the principle applies there too). We’re still focusing on the day-to-day of a Learning Technologist and our direct areas of influence: the times when we get involved in updating processes, sharing best practice, curriculum design, making the best use of the VLE, that kind of thing.

When horizon scanning for opportunities as technologies develop, it is useful to revisit these conversations constantly with a “What if?” or a “Why?” For example, we wanted our VLE (Virtual Learning Environment) to be as accessible, easy to navigate and inclusive as possible for students, but we also wanted it to be easy for staff to update and manage. To achieve this we have module templates, but there is still an inconsistent experience across schools and devices, and some elements are difficult to update.

So our LTs – ever excellent and ever questioning – noted the ability to auto-apply templates in Canvas and, as a team, asked “Why can’t we force-apply the template to all modules? What if all schools had a consistent, easy-to-update, accessible template?” This in turn led to a review of the Canvas pages, prompting further questions such as “Why do we need all that stuff on the home page?” Such questions also help us identify the opportunity costs of inaction: what if we don’t make our sites more consistent and more accessible?

Is it that simple?

Yes, I think it is. It is likely your LT is asking more questions of you than you are of them. Whilst that might feel counter-intuitive, it tells you that they are doing their job very well.

So, how good are questions? In my view they are very good indeed. After all, questions help us:

  • Find a way forward when we don’t know the answer, giving us agency in situations where we might feel helpless.
  • Unpack complex, ambiguous queries, enabling us to provide more sustainable and more appropriate guidance.
  • Interrogate existing practice as and when new technologies arise – something that should be a constant activity in the mind of an LT.

Kindness, sense-making and the ability to ask the right questions. That right there, in my humble opinion, is the secret sauce of a great LT. 

Are there any you would add? Do you disagree with any of these? I’d be very interested in your thoughts; let us know in the comments. Remember – be kind…

Posted in Educational Enhancement, Learning Technologies, Professional Development

The University of Sussex Artificial Intelligence Community of Practice Launch Event

by Helen Morley, Learning Technologist, University of Sussex

Few things have dominated the conversations in Higher Education like the emergence of Generative Artificial Intelligence has, and the past eighteen months have been ringing with the topic. From the whats, to the whys (and why nots), through the hows and the whens, so many of us want to know more about these exciting tools and if – or rather how – they will impact our work.

An image created with generative artificial intelligence, showing slightly uncanny-looking university students in a futuristic campus. Large drones fly in the sky overhead and a robot is in the foreground, being studied by a student with a clipboard.
George Robinson used Adobe Firefly to create this AI image using the prompts: university, AI, community.

The Educational Enhancement team at Sussex has been a key contributor to these conversations; we have researched and reported on the opportunities and challenges across the university and have delivered sessions to familiarise colleagues with the tools, including how they could be adopted to benefit staff and students.

In December we launched our Community of Practice with an in-person event (live streamed for colleagues who required it) on campus. It was a drizzly day and we were overjoyed to have so many people join us. The launch was so popular we actually ran out of tea, which is a misdemeanour we promise never to repeat!

We started with some time to mingle over the mince pies before George Robinson (Senior Learning Technologist) and Dr Sam Hemsley (Academic Developer) introduced the project. They shared information about what the EE team had been doing, what we were witnessing in other universities, and what Sussex colleagues had told us about their own perceived AI literacy. Professor Michael Luck (Deputy VC and Provost) then set some further context by giving both a whistlestop tour of recent AI history and his own impressions of how Sussex was embracing the technology, from his perspective as a new member of our community with particular interest and expertise in the field. I think many of us were surprised to learn just how rooted in AI the University of Sussex has been over the decades, including hosting the third international conference on Simulation of Adaptive Behaviour back in 1994.

A group of participants in the AI Community of Practice launch event, photographed from behind. At the top-right of the photo, Professor Luck is standing and presenting. A slide projected on the wall shows a newspaper headline that reads "A.I. 'Could Wipe Out Humanity'".
Dozens of Sussex staff attended the launch of the AI Community of Practice held in the Open Learning Space in the Library in December 2024. Here is Professor Michael Luck, Deputy Vice-Chancellor and Provost at Sussex and founding Director of King’s College London’s Institute for Artificial Intelligence, giving a whistle-stop tour of the recent history of Artificial Intelligence.
Photo: Dr Katie Piatt, Educational Enhancement.

We were really fortunate to have members of teaching staff volunteer to share their experiences and ideas of using AI in their work, and we heard from three of them, representing a good range of subject areas and focus. These “lightning talks” were:

  • “Teaching about AI in society” by Chirantan Chatterjee;
  • “Using generative AI in assessment” by Giovanni Contreras Garcia; and
  • “Exploring the role of AI tools and their potential impact on international students” by Hengyi Wang.

Professor Chirantan Chatterjee teaches Artificial Intelligence and Policies for Technological Revolutions in the University of Sussex Business School; his talk gave us a view into how the ethics of AI are taught which gave all attendees food for thought regarding our own use of the technology.

Dr Giovanni Contreras Garcia teaches Product Design in the School of Engineering and Informatics. His talk explained how students use AI to create sketches for their projects, which they then evaluate and adapt to suit their purpose. This is an interesting point to consider in a subject like product design, where a lot of time can be spent drafting and tweaking ideas at a much slower rate than the ideas come. It seems that using AI to support this process allows the students to think more dynamically.

Hengyi Wang is the EDI Champion in USBS and a Senior Academic Success Advisor. Her talk was a concise exploration of how international students in particular are using AI to access course material and improve their work for submission. Hengyi relayed how, for many students, the use of assistive AI is considered the norm and the tools are not used subversively or dishonestly. This valuable insight was deservedly one of the final words on the subject that day: the students are already making use of AI to supplement their studies, and it’s time we learned how to do so too.

The next University of Sussex AI CoP meeting will be on 18th March. Sign up here to let us know you’re coming. The meeting will focus on sharing approaches to talking with students about AI for learning and assessment. Email EE@sussex.ac.uk if you’d like to give a short (5 minute) lightning talk on the topic!

We asked what people wanted from the CoP and what they could bring: see the Reflective Padlet with responses from attendees. See also our Collaborative Padlet with examples, articles and ideas for AI in Higher Education.

The AI and academic integrity guidance page has been updated with new module- and assessment-level statements on the permissible use of generative AI outputs in assessment submissions.

Educational Enhancement have AI and education related workshops scheduled over the next few weeks, notably ‘Assessment in an AI World’ and ‘Talking with Students about AI’.

To find out more about Educational Enhancement, please visit: staff.sussex.ac.uk/teaching/enhancement

Posted in AI, Educational Enhancement, Learning Technologies, Technology Enhanced Learning

Digital Superpowers: A Tale of Tech Guardians and Code Suppressors

by Rachael Thomas, Learning Technologist, University of Sussex

In the heart of Sussex something extraordinary is happening. A few times each term a unique assembly occurs. Learning technologists and digital team members from various educational establishments across the region gather not just to meet but to embark on a mission. Their quest? To harness technology’s full potential in supporting teaching and learning in higher education. 

At a recent Christmas meetup hosted by Brighton Metropolitan College, I had the privilege of leading a discussion unlike any other. Our topic: how to elevate digital skills among academic staff. In the dynamic landscape of higher education this isn’t merely a nice-to-have. It’s the lifeline of modern academia, ensuring its vibrancy and relevance in the digital age. As online and blended learning become more prevalent, mastering digital skills is akin to wielding a compass in uncharted territories. 

Mastering digital tools is akin
to wielding a compass in uncharted territories

Rather than opting for the usual panel discussion, I decided to add a twist. Imagine a world where digital skills are the beacon of progress and empowerment, now threatened by a sinister plot. Enter our narrative: The Code Suppressors, supervillains whose diabolical plan involves eradicating digital literacy, leaving humanity at their mercy through a device called the Mind Matrix Inhibitor. 

Opposing them are the Tech Guardians: superheroes with unique digital-skill superpowers, embarking on a quest to reverse the Inhibitor’s effects. The fate of humanity lies in this epic struggle.

As the group gathered, each participant chose a superhero mask. Those donning red, yellow, or white masks became Tech Guardians; the rest, adorned in black, green, blue, or pink, transformed into Code Suppressors. Each team had 15 minutes to identify their superpowers – for the Tech Guardians, it meant skills, tools, and knowledge to develop digital skills; for the Code Suppressors, it involved identifying barriers to digital skills enhancement. 

The Tech Guardians boasted powers like the Digital Accessibility Wand and the Cloak of Simplicity, which grants its wearer the ability to craft simple, easy-to-understand instructions. They also wielded the Good Practice Generator, inspiring others with effective use cases. 

Meanwhile, the Code Suppressors had their own arsenal: The Lurgy Laser inducing staff illness, the Time-sucker Black Hole representing the ever-present challenge of insufficient time, and The Habitual Cyclone, symbolising resistance to change. 

What ensued was an epic battle…

What ensued was an epic battle. Each time the Code Suppressors attacked with a barrier, the Tech Guardians countered with a digital skill solution. Of course, this interactive duel wasn’t just about winning; it was a creative way to foster discussion, share experiences and seek solutions collaboratively to enhance digital skills. 

Through this engaging and imaginative approach, we not only identified strategies to overcome common obstacles but also realised that we were all facing similar challenges. The feedback was overwhelmingly positive. Participants found it a fun and engaging way to promote discussion, felt more connected to the community, and left with actionable suggestions to enhance digital skills in their respective institutions. 

In the end, our gathering was more than just a meeting; it was a vivid illustration of how creativity and collaboration can make the journey of learning and adapting to new technologies in higher education not just necessary, but exciting and deeply rewarding too. As we continue to navigate the evolving digital landscape of higher education, let’s remember the lessons learned from our superhero journey: with the right mindset, tools, and collaborative spirit, we can transform challenges into opportunities for growth and innovation. 

If you would like support to improve your Digital Skills, please contact EducationalEnhancement@sussex.ac.uk 

We acknowledge the use of ChatGPT https://chat.openai.com/ on 26/01/2024 to generate materials that were included in this post in modified form. The prompts used include expanding and enhancing provided text. The output from these prompts was used to improve the language of the article. 

Posted in AI, Educational Enhancement, Learning Technologies

Academic Developers: February Roundup

Pink background, with a megaphone, mobile phone, pencil, magnifying glass, lightbulb and the YouTube logo

Curriculum Framework ready for consultation

A Curriculum Framework for Sussex is in development as part of Curriculum Reimagined. The working document has been created from stakeholder engagement, dedicated open meetings and working groups.  There are elements that will need to be added as Curriculum Reimagined continues (particularly in relation to assessments).

We would encourage everyone to take a look, particularly at the draft Curriculum Principles from page 6, which will guide future course design and review. Is there anything missing? Can these principles be contextualised to your disciplines? Please provide feedback to the Curriculum Reimagined team – curriculumreimagined@sussex.ac.uk

The framework can be found on the Teaching pages (just underneath the white boxes): Teaching : Staff Hub : University of Sussex.

Workshops and web guidance

Each month we feature a workshop from our menu of staff development workshops, all of which can be tailored for course teams, departments and schools. If you want us to facilitate a bespoke workshop to support making changes to your programme, or to try a new teaching or assessment approach in your module team, email us to find out more.

This month’s featured workshop is ‘Alternative Assessments’. 

This workshop:

  • Explores the benefits and challenges of designing and implementing alternative assessments;
  • Presents a range of options for diversifying assessments; and
  • Considers how these changes may improve the student experience and make assessments more AI-resilient.

Other Educational Enhancement (EE) workshops scheduled for Semester 2 include ‘Applying for promotion on the scholarship track’ (Thursday 8 February, 11:00 until 12:30). Book your place via Eventbrite.

Go to our events, workshops and seminars page to find out more and register for upcoming workshops.

We’ve also added new guidance to our website on:

Teaching Methods:

  • Teaching sensitive subjects
  • Building student engagement
  • Inclusive teaching
  • Supporting academic writing

Curriculum Design:

  • Education for Sustainable Development

Sussex Education Festival 2024

Following the success of last year’s inaugural Education Festival here at Sussex, we are now planning this year’s event. We’re scheduling a two-day programme with talks, workshops and interactive sessions from Wednesday 10th to Thursday 11th of July. Watch this space for a Call for Papers, and save the date now!

Sussex Education Awards 2024

Voting is now open for the Education Awards, which are an opportunity for staff and students to recognise members of Professional Services or academic staff who have had a positive impact on our community. Don’t delay, vote today!

2nd AI CoP: Monday 18th March, 14:00-15:30

The next meeting of the Teaching and Learning with AI Community of Practice (AI CoP) will focus on sharing approaches to talking with students about AI for learning and assessment. Save the date for now and look out for details and joining instructions.

Learning Matters

We published three new case studies this month on Learning Matters that explore student engagement, authentic assessment and optionality in assessment modes. We also featured a short article by Dr Jo Wilson that discusses the pedagogical rationale for flexible assessment.

Introducing optionality in assessment

Using Vevox in the classroom

Developing authentic assessment for learning

“It equals the playing field” student reflections on introducing optionality

Posted in Academic Development, Educational Enhancement, Monthly Round-ups

Charting Success: The Transformative Role of the Educational Enhancement Coordinator

by George Robinson, Senior Learning Technologist, University of Sussex


One of the cornerstone roles in the Educational Enhancement team is that of the E.E. Co-ordinator. It is noteworthy that each individual who once held the co-ordinator position has not only chosen to remain part of the University of Sussex community but has also successfully transitioned into one of a range of new roles across the institution. This retention and diversification of talent speak volumes about the skills growth and experiences that this role, and similar roles across the university, can provide.

In this post, we meet the six past and present holders of the Educational Enhancement Co-ordinator role, hear how the role shaped them, and look at their career journeys. We will showcase the breadth of opportunities available for career progression within the University of Sussex.

Kitty Horne

Academic Developer

“My first role in Educational Enhancement, then named Technology Enhanced Learning, was as a graduate intern. This was a fixed term role and I was then lucky enough to stay on in the team as the first co-ordinator between 2015 and 2017. During this time I was given the opportunity to contribute to the team blog, plan and present workshops and to organise our seminar series.

“It was great to work in a team that was so encouraging and who always made sure to include me in new workshops they were planning or research into emerging technologies. This helped me to develop the skills needed then to progress into the role of Learning Technologist and in turn into the role of Academic Developer that I hold now.”

George Robinson

Senior Learning Technologist

“I was the team’s co-ordinator from 2017 to 2019. I’d been working at the University of Brighton beforehand and had discovered the field of Learning Technology, which I really wanted to enter as it combined my two passions: education and technology. When a job came up for the role of Learning Technologist at Sussex I went for it… and didn’t succeed, but the team saw potential in me and so asked me if I’d interview for the co-ordinator position.

“The role was fantastic as not only did I gain a comprehensive understanding of what it means to be a Learning Technologist, but I also had the creative freedom to design workshops and develop resources. This hands-on experience was instrumental, enriching my knowledge and skills, and preparing me for my next big leap two years later, when the opportunity arose once again for a Learning Technologist role. This time, armed with experience and a deeper understanding of the field, I succeeded!

“Since then, I’ve evolved to become a Senior Learning Technologist within the team; in my seven years here, the team has grown from six members to over 25. A lot has changed over the years but E.E. has continued to be an amazing place to work!”

Faye Tucknott

Student Engagement Manager

“I was the team co-ordinator from October 2019 until February 2021. I studied my undergraduate degree at Sussex and loved my time here as a student, so I was really excited by the opportunity to be part of a team who worked to enhance teaching and learning at the university. I must admit I was a little apprehensive about the technology part; I was the go-to tech person in my family for fixing Nan’s Sky box or sorting Mum’s computer, but would I be “tech savvy” enough for the team? Of course, the wonderful Educational Enhancement colleagues supported the skills I already had and helped me build on them, guiding my development throughout my time in the role. I’m sure anyone who has worked with the E.E. team will have experienced their problem-solving expertise and innovation, but they will also know that it comes with a tremendous amount of kindness and understanding.

“Starting my Sussex career in E.E. has encouraged me to be less fearful of trying things out and testing new ways of working for the benefit of students and staff alike, be that by using technology or trialling new initiatives. The E.E. Co-ordinator role encourages problem-solving, creativity, excellent organisation and administration skills, curiosity, collaboration and so much more. Over the last four years it has stood me in excellent stead to continue growing and developing within the university, at first within the E.E. team as the Online Distance Learning Officer (Feb 2021 to June 2022), and since then as the Executive Logistics Officer for the Division of Student Experience.

“I have loved continuing to explore how to enhance education for students, and do so now working with Student Connectors to deliver events and initiatives to support student belonging, enhance their skills, and develop their interests. I am delighted this has led me to the next chapter of my Sussex journey, as I joined the Student Engagement and Enhancement team as the Student Engagement Manager (Spirit of Sussex Award) in late November 2023.”

Katie Turner

Change Communications Officer

“I worked for Educational Enhancement from 2020 to 2021 as the team co-ordinator.

“I was intrigued by the role as it included event planning and communications, but I was keen to learn about the digital aspect which included updating webpages. The role allowed me to build up my knowledge for using HTML coding for updating the teams’ webpages and supporting E.E. with events. Since then, I have moved into a communications role as a Change Communications Officer, where I have worked on webpages and communication channels across the university for our Capital Programme. I am currently learning more about digital communications by completing an apprenticeship in Digital Marketing through Creative Digital Process.”

Keira Thomas

Success Programmes Assistant

“I was welcomed with such friendliness into the E.E. Team, as the team co-ordinator. The role allowed me to expand my organisation and communication skills, whilst giving me the opportunity to learn about and build relationships within my team, and other areas of the university with whom E.E. work closely and collaborate. I was able to engage in a range of training and development opportunities, which has increased my knowledge and confidence in a range of topics. I have developed skills in finance, comms, and various platforms, which have been valuable assets within the role that I have progressed to, as a Success Programmes Assistant within Student Engagement and Enhancement.

“Being part of an inclusive and supportive team in E.E. encouraged me to develop within my additional (volunteer) roles, as an Equality, Diversity and Inclusion (EDI) Champion for the Division of Student Experience; a Co-chair of the LGBTQ+ Staff Network; and a Mental Health First Aider. These roles are very close to the heart for me as someone who identifies as a lesbian, and who is extremely passionate about EDI practices, which were nurtured in the team. E.E. were such a positive start to my journey within the University, so I am truly grateful for my time and what I learned from being a member of the team. Once in E.E., always in E.E.”

Simon Overton

Educational Enhancement
Co-ordinator

“For most of my career I have been a primary school teacher. The idea of teaching a roomful of children scares a lot of people but I always loved it. A primary classroom is a whirlwind of noise, activity, behaviour, needs and peculiarity. It is a force of nature.

“So (as I discovered) is a university.

“On returning to the U.K. after 15 years away, I decided to change direction a bit and try my hand in higher education. I liked the Educational Enhancement department immediately – literally during the interview I felt comfortable with my potential new colleagues and the kind of tasks I was being asked to prove that I could co-ordinate.

“I was tremendously proud to be offered the role and my first few days went well. I could find my way to the office and back – no mean feat in the University of Sussex library – and I understood the work I needed to do. The E.E. team has as many people in it as a medium-sized classroom but they were all grown-up and none of them needed their noses wiping or shoelaces tying.

“It was at the Student Experience Division “Meet & Greet” that I really got a sense of how I fit into the university as a whole. The division is enormous (at least, to me it is) and the possible connections between departments, teams and individuals seem to be like those brain cell network diagrams you see. All of this spread across a campus that is like a small city. It is complex and amazing. A force of nature.

“But it isn’t scary. In fact, it is thrilling and challenging and mysterious and inspiring to me. What could I do with my role and where can it lead me? I have met so many people from within our team and around our division, and each one has a role that is profound and remarkable and the passion with which they perform their roles and how excited they are to explain them to me is amazing. I feel like my role as E.E. Co-ordinator places me at the start of a journey; and I have loved the journey so far.”

Find out more about Educational Enhancement, including information about our events, workshops and seminars, by visiting staff.sussex.ac.uk/teaching/enhancement.

Posted in Case Study, Educational Enhancement, Technology Enhanced Learning

Your students don’t know what a RUBRIC is – and maybe you don’t either?

by Sam Hemsley, Academic Developer, University of Sussex

One of the first pieces of work I took on when joining the Educational Enhancement team as an Academic Developer in September 2021 was to contribute to the content of a guidance page on the ‘Principles of rubrics and grading forms’. At the time I was grappling with a severe case of imposter syndrome and felt that if I didn’t immediately know or fully understand something then the problem was with me, not the ‘thing’.

A photo of a young man, head in his hands in front of a laptop, representing a student feeling frustrated with an assignment.
Photo by Tim Gouw on Unsplash

So, at the time, although I was uncomfortable with using the rather obscure term ‘rubric’ to describe what is basically a ‘grading form’ (see also ‘marking grid’), I didn’t challenge it. Not least because it was used by everyone around me, both at Sussex and at my previous institution and in the sector more widely. Also, it is the term used by Turnitin and Canvas for their in-built grading forms. Certainly there is plenty of interesting debate about the value and limitations of rubrics, but what does the term actually describe, and should we be using it at all?

What does the term “rubric” actually describe
and should we be using it at all?

Since that time, I have (mostly) overcome my imposter syndrome. However, I have noticed three problems with the term “rubric” which can make its use a barrier to understanding and, therefore, to inclusion, accessibility and basic good practice:

Inconsistent use

I noticed that in some schools, courses and modules at Sussex, the term rubric is used to describe:

  1. Grading forms – both those set up within Turnitin or Canvas and those provided separately;
  2. Assignment instructions – both assignment briefs (task descriptions) and completion and submission instructions (e.g. for exams); and
  3. A combination of the two.

When I asked the Educational Enhancement team, many had also noticed such inconsistent use. Other, equally obfuscatory, terms were also identified by the team; such as the use, in some schools, of ‘instrument’ to describe assignment briefs.

Of course, I’m not the first to recognise this inconsistency. Cooper and Gargan, writing in 2011, note that “the term, apparently, can refer to almost anything: rule, guide, criterion, or description that is used to assess the progress of students in their academic subjects, as well as the grading system for assessing each criterion.” In 2015, Phillip Dawson from the Centre for Research in Assessment and Digital Learning noted that “since the beginning of its use in education, ‘rubric’ has not been a particularly clear term”.

Inconsistent definitions

The problem also goes beyond how ‘we’ use the term to the fact that it is defined differently by ‘others’. For example, if you Google something like ‘Rubric meaning’, the top result will be the definition provided by Google’s own Dictionary, informing you that a rubric is “A heading on a document. A set of instructions or rules”. You’ll probably also see a link to Collins Dictionary, stating “A rubric is a set of rules or instructions, for example the rules at the beginning of an examination paper”. This is because the term was originally used to refer to red text, specifically instructions in a liturgical book on how a church service should be conducted.

The Oxford English Dictionary also provides this usage, along with other uses of the term to refer both to chapter headings and to instructions or explanatory notes such as those introducing an exam paper. However, in common with a number of other dictionaries, the OED doesn’t mention grading forms, marking grids or any such definition.

However, if you ask ChatGPT the same question you will likely receive, like me, a response describing a rubric as “…a scoring guide or a set of criteria used to assess and evaluate the quality or performance of a task, project, or assignment”.

Now, consider the implications of this lack of consistency for students who, faced with the novel term ‘rubric’, ask Google or ChatGPT and are given a definition which doesn’t match their experience of its use. Or worse, the term has been used inconsistently (as above) and they are trying to tease out its ‘correct’ meaning. Less confident students, particularly those dealing with their own case of imposter syndrome, will likely think it is their fault that they can’t reconcile the ‘thing’ with the definition(s) they are presented with. Although the ChatGPT definition aligns with the Canvas and Turnitin use of the term, wherein students are directed to ‘view rubrics’ to access a breakdown of their marks and/or feedback, they don’t come across this in every module and, if they do, it won’t be until they are receiving grades on an assignment.

Unnecessary use

Ultimately, I think that the use of the term ‘rubric’ is a manifestation of academic gatekeeping, or the ‘hidden curriculum’: the unspoken or assumed rules and norms that exist in educational settings. Indeed, writing in 1997, Popham observed that 20 years previously,

“…rubric began to take on a new meaning among educators. Measurement specialists who scored students’ written compositions began to use the term to describe the rules that guided their scoring. They could have easily employed a more readily comprehensible descriptor, such as scoring guide, but scoring guide lacked adequate opacity. Rubric was a decisively more opaque, hence technically attractive, descriptor.”

As such, its use is a barrier to learning and increases the cognitive load and stress for students who are simply trying to understand what is being asked of them as they attempt to navigate the world of higher education.

Its use is a barrier to learning and
increases the cognitive load and stress
for students.

What should we call these things instead?

We have, at our disposal, a wide range of easily understandable terms and phrases to describe grading forms, marking grids or assignment briefs. Even if we are inconsistent in the exact use of these terms, they still clearly describe the ‘thing’ we are referring to. So why not use them? Yes, ‘rubric’ is a term students see, and need to understand, when accessing their grades and feedback in Canvas and Turnitin. Yes, it’s used with wild abandon across Higher Education. Therefore, we’ve agreed in Educational Enhancement that we will:

  1. Always clarify the definition when using it with staff (and maybe link to this blog?);
  2. Question and challenge its use for assignment briefs or exam instructions;
  3. Question and challenge its use without further definition in student facing communication; and
  4. Suggest to schools that they agree to do the same.

This is a first step in our ongoing discussion in Educational Enhancement on enabling clarity and transparency in the communication of assessment requirements and marking criteria.

So, the question remains, will you join us?

If you would like to contact the Educational Enhancement team about best practice in developing assignment briefs, or would like to know more about using rubrics (grading forms), please find us at educationalenhancement@sussex.ac.uk

Posted in Academic Development, Accessibility, Canvas, Educational Enhancement, Inclusive teaching, Marking and assessment, Technology Enhanced Learning

Five Tips for Using PORTFOLIOS in Assessment for Learning

by Laura Clarke, Academic Developer, University of Sussex

A stock image of miniature figures ascending stairs to different levels of education, from a pencil and calculator to a mortarboard. It is rendered in shades of blue and represents the typical learning journey.

When I taught in my previous institution, we used portfolios as a summative assessment for students as well as a means to evaluate instruction. We scored the portfolios based on a rubric and quantitative data was collected for various stakeholders. While this approach to using portfolios was useful for the institution, I felt that it missed a valuable opportunity to engage students in their learning.

While this approach to using portfolios was useful …
I felt that it missed a valuable opportunity
to engage students in their learning.

Portfolios can serve a dual purpose, catering to both Assessment of Learning—a conclusive assessment focused on summarising and evaluating overall student performance—and Assessment for Learning—an interactive and formative student-centred approach that prioritises ongoing assessment. Paulson and Paulson (1994) delineate two categories of portfolios that correspond to the objectives of Assessment of Learning and Assessment for Learning: positivist and constructivist, respectively. Positivist portfolios “assess learning outcomes and those outcomes are, generally, defined externally,” which “assume[s] meaning is constant across users, contexts, and purposes”. This approach views the portfolio as “a receptacle for examples of student work used to infer what and how much learning has occurred” (p.8). By contrast, the constructivist portfolio is “a learning environment in which the learner constructs meaning. It assumes that meaning varies across individuals, over time, and with purpose.” Here, “the portfolio presents process, a record of the processes associated with learning itself” (pp.8-9).

Positivist portfolios emphasising Assessment of Learning are usually structured around a set of outcomes, goals, or standards and are used to make high-stakes decisions. Within constructivist portfolios, students choose specific artefacts to narrate the tale of their learning journey, and these portfolios are employed primarily for formative rather than summative assessment purposes.

With Curriculum Reimagined, the University of Sussex is exploring new forms of authentic and flexible assessment so that students are given accessible opportunities to explore their ideas, interests and preferences within their learning. Portfolios can be a powerful vehicle for learning if they transfer responsibility for learning from the teacher to the learner. If we want to develop inclusive assessments that give students the opportunity to take ownership of their learning, it is important to balance both product- and process-oriented portfolios.

Portfolios can be a powerful vehicle for learning if they transfer responsibility for learning from the teacher to the learner.

Here are some recommendations for how to leverage portfolios to enrich the Assessment for Learning process: 

  1. Give learners the opportunity to choose between assessment modes and to decide which artefacts best serve as evidence of their learning and development. Keep in mind that it is important to consider the equity of effort, standards, and feedback when offering students choice in assessment modes. 
  2. Include a reflective component that encourages students to reflect on how the evidence they have selected demonstrates an evolution in learning. As reflection poses a new and challenging skill for many students, it is advisable to provide them with sample frameworks for reflective writing well in advance. 
  3. Encourage students to incorporate proof of their ongoing learning process. This may involve submitting earlier drafts of finished assignments, engaging in a thoughtful review and reflection on feedback received for these drafts, and providing an explanation of the steps taken to incorporate feedback and enhance their knowledge and comprehension. 
  4. Motivate students to repurpose the content of their portfolio for integration into a different kind of portfolio, like one designed to bolster a job application. Discuss with them the unique purposes such a portfolio would serve in the context of employment. 
  5. To ensure that students recognise the value of creating a portfolio, it should be easily adaptable. Ideally, the portfolio should remain accessible to learners even after they have completed their time at an academic institution. 

Educational Enhancement’s Academic Developers are developing resources to help instructors implement portfolios in their courses and modules. Please contact your School’s Academic Developer for support in developing and implementing portfolios that balance the goals of Assessment of Learning and Assessment for Learning. 

Posted in Educational Enhancement, Learning Design, Learning theory, Marking and assessment

Fakery, Fallacy and Faults, but the AI is no Fraud

by Helen Morley, Learning Technologist, University of Sussex

How should we speak about the processes of Large Language Models and other AI? Language, as humans and other animals use it, is a voluntary and intentional tool which is used to aid interaction. Can we say the same for AI generated text?

Figure 1: Image generated by Dall-E (OpenAI) on 4 Jan 2024 using HM’s prompt: “a great ape and a robot talking to each other”.

In November 2023, Professor Gilly Forrester delivered a lecture entitled “Hand to Mouth: The Language Puzzle” at the University of Sussex. I was excited to attend. It was a fascinating account of her studies into the correlation between anatomy and language skills, in particular how humans (and other great apes) demonstrate links between our manual dexterity and our communication with each other. Professor Forrester spoke about some of the neurobiology of language and the criteria some use to determine what language is. The lecture began with Professor Forrester asking us all how we would define language; for me, the answer is “a tool we use to help us understand what’s going on!”

That same month, I joined a webinar hosted by the American University in Cairo with Anna Mills as guest speaker. The topic was (as much of my 2023 also had been) “Artificial Intelligence in Education”, with a particular focus on how AI can be used in writing classes. Mills’s observations, and those in the chat, turned at points to the phenomenon of AI generating false text. Words used to describe this included “fabrication” and the more popular “hallucination”. I have long been an advocate of the precise use of lexis and, while I don’t particularly like either of these suggestions, I felt it necessary to explain why – for me – “fabrication” was not appropriate, and to bring the conversation to readers of this blog to see if we can agree on a more suitable term. In other words, can we make use of this language tool of ours to understand WHAT’S GOING ON?

I railed against “fabrication” because to fabricate something is to make it, and to do so with intention. Early uses of the word alluded to skill, and the Latin root refers to craftsmanship and purpose. Today we use “fabricate” for the process of making something with a purpose; it is also used as something of a euphemism for dishonesty. Neither of these definitions is appropriate for describing AI programs generating inaccurate text: Large Language Models have no sense of purpose and they’re neither honest nor dishonest!

I railed against “fabrication” because to fabricate something is to make it, and to do so with intention.

I was thrilled to see Mills take this point to X (as @EnglishOER), where she asked for more ideas about what terms we could use to describe what the AI is doing in these situations. It is not sentient, conscious, creative, benevolent or malicious. It is not “making it up” any more than any other occasions when it strings words together into what we accept as a sentence. So what is it doing?

The TL;DR is that LLMs predict the next most likely word. They do this by “reading” existing texts and spotting the patterns. The sky is…, the dog goes…, my old man’s a…. It is this huge corpus of texts which is responsible for the biases we see in LLM output and for the erroneous accusations that some students’ work is AI-written when they’re actually just formulaic writers. The point is, ChatGPT does not think about what to output – it doesn’t think at all!
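The idea of “spotting the patterns” can be sketched as a toy bigram model: count which word most often follows each word in a corpus, then predict the most frequent follower. This is a deliberately simplified stand-in (real LLMs use neural networks over vast token corpora, and the corpus and phrases below are made up for illustration), but it captures the “statistically likely next word” mechanism:

```python
from collections import Counter, defaultdict

# Toy corpus, pre-split into words. The model "reads" it once and
# records which word follows which.
corpus = "the sky is blue . the sky is clear . the dog goes woof .".split()

follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1  # count each observed word pair

def predict_next(word):
    """Return the word most frequently seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("sky"))  # → is
```

No understanding, no intent: the model simply emits whichever continuation the counts favour, which is also why a biased or limited corpus produces biased or false output.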

Not thinking, and not being capable of thought, also rules out “hallucination”, which I originally preferred to “fabrication”, as at least hallucination is involuntary. It’s still not quite right and, in anthropomorphising AI, it risks confusing the matter further. When Mills put the question to her followers on X, her criterion was: “the word shouldn’t imply conscious experience or intent”. The responses included: concoction (which was for a while her favourite); debris (too random and not representative of the coherent nature of LLM output); SLU, which stood for “Statistically Likely Utterances” (dreamed up by Edward O’Neill, with the enviable X handle of @learningtech); phantasma; and my contribution, “jibbering“.

As the year drew to a close, Mills did what so many of us spent 2023 doing: she turned to the tech. She gave ChatGPT itself the criteria, expanded to include that the term mustn’t imply intent or conscious experience; it must imply untruth or unreality; it should reflect patterns from training data but go beyond said data; and it must be accessible without having to be explained, catchy and memorable. I’ll let you decide if it’s disconcerting or not that ChatGPT output one of the best suggestions of them all. It called the product of its plausible, informed but false jibberings a “data mirage”.

I’ll let you decide if it’s disconcerting or not that ChatGPT output one of the best suggestions of them all.

Figure 2: The image Dall-E generated for Anna Mills using the prompt “data mirage” from ChatGPT’s response to Mills’s query. Image used with her permission.

That’s it. Data Mirage. The data is real but what we experience is not. The unreal generation we witness is neither the fault nor the design of anything else, and it falls to us to determine whether what we are witnessing is to be trusted.

After all, we’re the Great Apes that are supposed to understand what’s going on!

More: Anna has compiled a list of suggestions she received here: https://bit.ly/HallucinationAlternatives
Watch Anna’s webinar and read about her work here:
https://learnhub.aucegypt.edu/digitaltoolkit/index.php/2023/10/30/generative-ai-activities-for-the-writing-language-classroom-anna-mills/

Posted in AI, Educational Enhancement, Learning Technologies, Technology Enhanced Learning
