Encouraging attendance and engagement through portfolio assessment

Lynne Murphy is Professor of Linguistics at the University of Sussex. Her research and teaching concerns lexicology, lexical semantics, and pragmatics, and transatlantic Englishes. Twice a National Endowment for the Humanities Public Scholar, she is currently writing a book about small words.

A classroom filled with prepared students, interested in the subject and eager to talk with each other about it. This is the goal. This is a classroom in which people learn. This is a classroom in which teachers enjoy teaching.

But these days, it feels like the world is conspiring against such classrooms. Standing in the way are the cost-of-living crisis, the mental-health crisis, the perennial academic-timetabling crisis, not to mention the after-effects of pandemic lockdowns. Students, for the most part, want to participate in their learning—but it’s easy for reading, attendance, and course enrichment activities to fall down the list of priorities behind pulling a work shift, saving a bus fare, or staying under a warm duvet.

We tell students that active engagement in their course will be worth their while, but the evidence we give for that claim is fuzzy. The rewards of engagement are not necessarily immediate or immediately perceptible. And it’s easy to understand how the intention to participate falls down. I know that physical exercise will be worth my while, but it’s a slog. It’s also hard to know where to start (weights? cardio? flexibility?). So, it goes lower and lower on my to-do list while I do the easier things first.

My solution for the exercise problem is to make myself accountable to others: to book a spot on a scheduled class or arrange a walk with a friend. The exercise gets crossed off the list. And that’s what I try to do for students: to make the individual rewards of course engagement more concrete, so it goes up the to-do list. Then the whole class benefits from an engaged studentship. Portfolio assessment makes this very doable.

What is a portfolio?

A portfolio is a collection of work (i.e. more than one piece), related to a theme (i.e. the module topic), produced over a period of time (i.e. the semester) (University Modes of Assessment).

The key here is that the contents of a portfolio are not prescribed. The types of work involved can vary across modules. A portfolio for one module might involve learning journals and a podcast. For another it might be multiple drafts of an essay or two. How portfolios are assessed can vary too.

Incorporating engagement into the portfolio

Of course, the main part of a portfolio must be academic work that tests the learning outcomes of the module. Engagement activities should relate to these learning outcomes as well, but should focus more on taking part than on mastering academic skills/content. In my modules, these activities are called participation (and so, from here on I use participation as a synonym for engagement). In my first-year modules, participation is 20% of the portfolio mark, in order to instil good engagement practices from the start. From second year, it goes down to 10%. Appendix 1 below this post gives first- and final-year examples.

For the assessment-period portfolio submission, students submit a ‘participation record’ that indicates which activities they did during the term (first- and final-year examples in Appendix 2 below this post). (Not shown, but available on request: the Canvas information pages that make clear what each of the participation activities involves.)

Portfolio-friendly engagement activities

Engagement activities in the portfolio should:

  1.  Set clear expectations.
    Students should know what counts as participation and when their deadlines for it are.
  2.  Have a virtual paper trail.
Anything on the participation record should be independently verifiable through Canvas or Sussex Direct. That is, either the student should be submitting something to Canvas, or the tutor should be counting something on one of those platforms.
  3.  Avoid any potential for bias.
    In particular, staff should not be grading students on the frequency or quality of contributions to seminar discussions, as our perceptions of who’s said what/how much are unreliable (and there is no paper trail).
  4.  Offer choice / be inclusive.
    Not all students can or will participate in the same ways. It should be possible to get a very good participation score without attending extra events or speaking in front of class.

And, of course, the module convenor should consider the workload their activities create for themselves—e.g. what expectations to set about feedback on these activities.

Potential engagement activities include:

  • doing assigned formative work for feedback
  • participating in activities in the classroom
    • e.g. quizzes on the week’s reading, unassessed presentations, writing up ‘minutes’ of seminars for posting on Canvas
  • reflecting on the teaching material or the process of learning
    • e.g. learning journals
  • engaging with tutors or peers outside the classroom
    • e.g. attending student hours, forming/attending study groups, contributing to Canvas discussions
  • doing extension activities beyond the classroom
    • e.g. attending research seminars, Skills Hub events
  • doing extra module work
    • e.g. taking online quizzes, doing supplementary assignments
  • attendance at teaching sessions.

No portfolio should try to include all of these! The nature of the subject, the level, the tutor’s workload, and the module learning outcomes (see below) should come into consideration.

Some of these are more about engaging with the subject or learning processes individually; others are about building community among the cohort—including the tutors. Some are controversial—many believe the last one in particular is undoable. So that one gets two further sections:

Assessing attendance: directly

I asked Sussex Academic Developer Sarah Watson to review the ‘legality’ of how I treat attendance in portfolio assessment. She wrote:

Currently, there is no University policy to say that we can’t grade attendance, though it is of course a contested issue due to cost of living, caring responsibilities etc. With this in mind, it is recommended that students should not be penalised for not attending their lectures and seminars. However, you offset this by:

1. having attendance as only one aspect of the participation mark

2. allowing students to get the grade if they have informed you that they cannot attend

In other words, it’s OK to consider attendance as part of an engagement/participation mark because (1) students who fail to attend can ‘make up’ for poor attendance by doing more of other activities, and (2) ‘notified’ absences don’t count against anyone. Take the example of a student who attended 11/22 sessions (lecture and seminar) but emailed the tutor about each of the absences when they happened; that student would have a 100% attendance record (22/22). If the same student had not emailed the tutor, then they would have achieved 50% attendance. Emailing is certainly not the same as attending, but keeping in touch with the tutor at least shows continued engagement in the module while acknowledging that perfect attendance is often not possible.

In recent years, I have treated attendance as worth up to 10 or 20 participation marks (see appendices), relying on those percentages. In a class where it’s worth 10, then, the 50% attender gets 5 points toward participation. Another approach is categorical marking: awarding a set number of points for meeting a certain attendance threshold. Those who don’t meet that threshold will know they should make it up with other kinds of participation.
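The attendance arithmetic above can be sketched in a few lines of Python (a hypothetical illustration only; the function name and the 10-point weighting are assumptions for the example, not part of any University system):

```python
def attendance_points(attended, notified_absences, total_sessions, max_points=10):
    """Attendance contribution to the participation mark.

    A notified absence counts the same as attending, reflecting the
    rule that 'notified' absences don't count against anyone.
    """
    effective = attended + notified_absences
    return (effective / total_sessions) * max_points

# Student attending 11/22 sessions who emailed about all 11 absences:
print(attendance_points(11, 11, 22))  # 10.0 (treated as 22/22 attendance)

# The same attendance with no emails sent:
print(attendance_points(11, 0, 22))   # 5.0 (50% of the 10 available points)
```

The categorical alternative would simply replace the final line with a threshold check (e.g. full points for 80%+ attendance, zero otherwise).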

Assessing attendance: indirectly

Another way to ensure attendance is to have participation activities that happen during class time. Our first-year modules have reading quizzes at each session, ‘played’ like pub quizzes in teams. Those quiz scores contribute to the portfolio mark. Zero scores resulting from notified absences are removed from the quiz average. (For what it’s worth, these quizzes are very popular; they are often requested in student evaluations of other modules.)
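Removing notified-absence zeros from the quiz average can be sketched as follows (again a hypothetical illustration; the function name and the scores are invented for the example, and this is not the actual Canvas calculation):

```python
def quiz_average(weekly_scores, notified_weeks):
    """Average quiz score with zeros from notified absences removed.

    weekly_scores: one quiz score per teaching week.
    notified_weeks: indices of weeks where an absence was notified.
    """
    kept = [score for week, score in enumerate(weekly_scores)
            if not (score == 0 and week in notified_weeks)]
    return sum(kept) / len(kept)

# Four weeks of team quizzes; the zero in week 1 was a notified absence:
print(quiz_average([8, 0, 7, 9], notified_weeks={1}))  # 8.0
```

An unnotified zero, by contrast, stays in the denominator and pulls the average down.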

The maths

The total participation ‘points’ available should add up to at least 100, so they resemble a percentage mark that can easily be figured into the portfolio. (In some of my modules, students can get more than 100 participation points, and so some students’ marks are lifted considerably by participation.) 

Students are told to strive to do at least slightly better on their participation mark than they expect to do on the rest of the portfolio, so that the participation helps their grade.
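Under stated assumptions (a 20% participation weighting, as in the first-year modules; a cap at 100; and hypothetical function and variable names), the weighting arithmetic looks like this:

```python
def portfolio_mark(academic_mark, participation_points, participation_weight=0.20):
    """Overall portfolio mark with a weighted participation element.

    Participation points resemble a percentage but can exceed 100 on
    modules offering extra activities, lifting the overall mark; the
    cap at 100 here is an assumption, not a stated rule.
    """
    mark = ((1 - participation_weight) * academic_mark
            + participation_weight * participation_points)
    return min(mark, 100)

# A student expecting around 62 on the academic work who earns
# 80 participation points on a first-year (20%-weighted) module:
print(round(portfolio_mark(62, 80), 1))  # 65.6
```

This shows the point of the advice above: whenever the participation score exceeds the academic mark, the weighted total lands above the academic mark alone.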

There is an aspect of ‘the rich get richer/the poor get poorer’. Students who are already well-organised and keen are the most likely to do the most participation work. Students who don’t engage enough to even know about the participation opportunities are likely to have their mark taken down further by lack of participation. But in the middle, I see students who might be struggling (whether with the material or with the social aspects of learning) putting themselves into a place where learning is more active and possible.

Incidentally, having a participation element in the module does not seem to result in rampant grade inflation. Average marks on my portfolio-assessed modules are in the low-mid 60s, like the marks for other modules I’ve taught.

Learning Outcomes/Resits

The portfolio as a whole must assess whether the student has achieved the module’s learning outcomes (LOs). But because portfolios submitted in the re-sit period generally cannot involve participation activities, no learning outcomes can explicitly demand engagement/participation.

So, I treat the participation element of the portfolio as other means of engaging with the content/skills LOs. That engagement may be direct (as when students submit the assigned formative work), supplemental (as when they go to events related to the module content or skills development), or indirect (as through attendance, where they get opportunities to develop and show learning).

Give it a try!

I am an evangelist for portfolio development, and I’d be happy to talk with any Sussex colleagues about their portfolio ideas. Contact me at m.l.murphy@sussex.ac.uk.


Appendix 1

Appendix 2

Posted in Blog

Developing a feedback policy for Life Science

Dr Joanna Richardson

Dr Joanna Richardson, Senior Lecturer in Biochemistry, explains how the School of Life Sciences developed and implemented its feedback policy.  

What we did: 

Over the summer of 2021 I led a working group in the Life Sciences tasked with developing a new School assessment and feedback policy. This was in response to student feedback, via module evaluations and NSS responses, that assessment and feedback needed improvement in the School. While the University of Sussex has a marking, moderation and feedback policy, the principles set out within it are necessarily broad. Having a school policy would, we hoped, provide a guide for staff on how to translate policy into action effectively, setting out more detailed principles for Life Sciences and, therefore, helping to improve feedback clarity and consistency for students and staff.

How we did it: 

The School didn’t have its own assessment and feedback policy so I had to start from scratch. Having brought together a working group,* I convened a few initial meetings to discuss and agree the purpose of the policy, i.e. the problems the policy sought to address, and what should be included within it.

I looked at examples of practice in other schools and universities and teased out common themes. I also dipped into the pedagogic literature, notably the HEA feedback toolkit and the National Forum for the Enhancement of Teaching and Learning in Higher Education. I took a lot of inspiration from the University of Sussex Business School because they had worked with student Connectors to develop clear assessment criteria.

I then drafted a policy for discussion and further refinement with the working group, boards of study (department) meetings, the School education committee and, importantly, some of our students. Happily, the proposals were, bar a few suggestions for refinement, well received and the resulting policy articulates and supports three key principles: 

  1. Transparency: Clear communication of practices to staff and students 
  2. Consistency: Practices are applied consistently and fairly across the School 
  3. Relevance: Students can use assessment and feedback for effective learning 

The policy also clarifies the expectation that feedback should refer to the assessment criteria and their qualitative adjectives (e.g. ‘Good’), and that it should be constructive. It also provides an indication of reasonable length for written feedback (100–200 words).

Finally, to help the translation of policy into action, the key points of the policy (including a link to the whole document) are articulated on our staff-facing web page. We also created student guidance, which is linked to from all Canvas sites via our School template.

Impact and feedback 

One of the key action points for staff was to use a 3-point “feedback template” when writing feedback for students, to standardise and simplify feedback. In response to staff comments, this was subsequently reduced to a 2-point template:

  • Areas in which you did well in this assessment 
  • Areas in which you did less well in this assessment, and guidance for improvement 

The external examiners have commented very favourably on the use of the template as an example of good practice, and our NSS scores in this area have improved.

I think it’s important that policies like this one are responsive and can change. Of course, ensuring consistency in the application of any policy can be a challenge. Nevertheless, staff have appreciated clear guidance on the content and quantity of feedback, including the realization that it isn’t necessary to write reams of text to provide effective feedback. This has saved some markers time and is especially useful guidance for new members of faculty and our doctoral tutors.   

Top Tips  

  • Make sure your policy isn’t left on the digital shelf – it has to be visible, findable, publicized, accessible and responsive to feedback. 
  • When developing such a policy, convene a group of experts (people with significant teaching and assessment experience) and look at what other institutions are doing (your Academic Developer can also help with this!) 
  • Consult students at every stage. 

* This working group included Joanna Richardson (Biochemistry, lead), Dr Jenna Macciochi (Biochemistry/BMS), Dr Zahid Pranjol (BMS/DoSE), Dr Louise Newnham (Genome), Dr Camilla Tornoe (Neuroscience), Dr Valentina Scarponi (EBE), Dr Shane Lo Fan Hin (Chemistry), Dr Claire May (Pharmacy), Dr Lorraine Smith (Foundation), Dr Christina Magkoufopoulou (TEL), and Ms Amy Horwood (Deputy School Administrator) 

Related links 

See Educational Enhancement guidance on: 

Posted in Case Studies, Uncategorised

Sussex Education Festival, 10–11th July 2024 – Call for Participation

Sarah Watson, Mellow Sadik and Kelly Coate at last year’s Education Festival

Following the success of last year’s inaugural Education Festival, we’re excited to announce a Call for Participation for the Sussex Education Festival 2024. Hosted over two days (10th July in person and 11th July online), the festival will provide a space for colleagues from across the University to share their experiences, insights and innovation in teaching, learning and assessment.  

The festival will have the three drivers of change from the upcoming Sussex 2035 strategy as its core themes: Human Flourishing, Environmental Sustainability and Digital and Data Futures. We have a variety of presentation and discussion formats to choose from, and we particularly encourage presentations co-delivered with students. We have some student participation vouchers we can offer – please get in touch with EE for more information. 

The three core themes of the festival can be interpreted as broadly as possible. Some suggestions for potential topics could be: 

Human Flourishing:  

  • Building student belonging 
  • Supporting inclusive learning communities 
  • Social justice pedagogies, decolonising the curriculum 
  • Student creativity and self-expression 
  • Student resilience and wellbeing 


Environmental Sustainability: 

  • Education for Sustainable Development 
  • Authentic assessment and feedback literacy 
  • Community engagement and co-creation 
  • Pedagogies of hope 
  • Learning through the landscape 

Digital and Data Futures: 

  • Generative AI in teaching and assessment 
  • Digital innovations in teaching and learning 
  • Embedding learning technologies  
  • Accessible and inclusive online teaching 

We’re excited to celebrate and reflect on all the amazing work that goes into teaching, learning and assessment here at Sussex. We hope the festival will appeal to colleagues who would like to share their experiences and reflections at any stage of their projects. To reflect that aim, we’re asking for contributions in a variety of formats. 

Choose your format: 

Work-in-progress lightning talks will last 7 minutes, providing short reflections on current practice, or a pedagogic development you would like to make.  

The 30-minute interactive sessions can be run in any way you’d like; they could be used to demonstrate a new tool or teaching technique, or workshop an idea or challenge with fellow colleagues interested in teaching and learning. 

The 60-minute facilitation slots are open for colleagues to suggest longer workshops and discussions on a dedicated topic. Do you have a ‘wicked problem’ within teaching and learning you’d like to dissect with colleagues in a solution room, a provocation to push our boundaries and thinking on current practice, or a wider theme you would like to explore through a global café? Please let us know on the CFP form – the Academic Developers would be very happy to help you plan and facilitate a session. 

Any of these sessions can be presented in person or online. Please note that the majority of the in-person content will take place at the Student Centre on Wednesday 10th July, and the online content will be hosted on Zoom on Thursday 11th July.  

Please submit your ideas through the Call for Participation Form by Friday 19th April. If you would like a document version of the CFP, or if you have any questions, please contact the team.

Posted in Blog, Uncategorised

Closing the ‘feedback gap’ for international postgraduate students: An embedded writing approach

Dr Martin Brown

Dr. Brown embarked on his teaching journey in outdoor education during the 1980s before moving on to teach Geography at secondary school level. In 2004, he left the classroom to become an Educational Advisor for Learning and Teaching Scotland (now Education Scotland), collaborating with various national stakeholders. From 2004 to 2010, he served as a National Assessor for both the General Teaching Council for Scotland and the Chartered Teacher Programme. Since 2020, Dr. Brown has been teaching international students in ESW.

Since 2020, I have developed two courses for Academic Skills Support for postgraduate international students. In the Autumn Term, I teach student essay writing and in the Spring Term, I teach dissertation writing. My aim was to close an identified attainment gap in three MA programmes: International Education and Development (MAIED), Education (MAED) and Childhood and Youth (MACY). These courses are ‘embedded’ in so far as they focus on Social Science research, theories, and concepts, and aim to demystify essay writing for assessment by aligning seminar teaching with national learning outcomes, local assessment criteria and tutor feedback. I model different types of language and student voice in essays, portfolio submissions, and dissertations. Over ten weeks, I endeavour to build relationships through conversations about writing. I focus my efforts on developing conversations from free writing activities, and I offer technical solutions for improving academic writing in a Social Sciences genre. In this space, I aim to move beyond teaching with slides; instead, I focus on peer conversation, student writing, tutor observation, and tutor feedback.

In my writing workshops, I’m constantly aware of a sort of chronic anxiety among international students. Occasionally I think this type of stress is culture shock, but more often it is, in my view, due to “a lack of predictive information” (Sapolsky, 2004). There is an assumption that postgraduate students have fewer problems in negotiating the learning environment in a university, but this is not true for many international students or older students (like me) who return to study after many years of absence (Brown, 2014). Postgraduate students may find difficulty where there is a lack of clarity or consistency in assessment guidelines, or a lack of alignment between learning outcomes, assessment rubrics, and assignment titles (Evans, 2013). My workshops are built on conversations with international students, and we talk about their experience of writing for assessment. Critically, my feedback to them includes my reflections on marking student submissions and the most common problems I see. These feedback exchanges influence my seminar planning and allow me to improve formative writing activities that are designed to address common problems.

One of these common problems is accidental plagiarism, otherwise described as weak paraphrasing and poor citation, although often referred to by markers as a lack of author voice. For me, this problem often demonstrates weak English language skills, but it can also reflect a lack of self-confidence to offer personal opinions. Accidental plagiarism is most often rooted in excessive description and a failure to move from description to discussion; unfortunately, students are often completely unaware that they have adopted another author’s voice. I offer a space to discuss this problem and to strengthen the international student’s voice through reflective writing. I focus on improving student self-confidence by creating a space to discuss ideas of self and positionality. In other words, I address a known problem: most postgraduate students do not know how to describe themselves or their opinions in an essay. Moreover, they do not know how to reflect on their identity, or justify their beliefs and opinions (Holmes, 2020). To help students to think, talk, and write about themselves, I encourage regular free writing because it removes barriers and fosters self-expression and discovery.

These embedded writing courses give me the opportunity to clarify or demystify assignment guidance for students and narrow “the feedback gap” (Evans, 2013). This is the gap between what tutors say or understand about the feedback they offer and what students say or understand about the feedback they have been given. I do this by describing specific assessment criteria and by interpreting the advice tutors offer about their assignment titles. Once essay assignments have been marked by tutors, I interpret the summative feedback students are given by their assessors. Additionally, I advise students on how to edit their essays for resubmission. These courses also give me the space to offer empathetic and directive solutions to a variety of technical problems, such as: how to write a meaningful essay title, how to structure an essay, how to signpost an argument or citation, how to separate the broad context of an essay from the field of research in the literature review (a problem of breadth), how to demonstrate critical and reflective thinking in a paragraph, how to  speak to the marker and when to use ‘I’, and how to edit an essay for better internal alignment and flow.  

Good teaching is labour intensive in so far as it is about collecting and interpreting evidence to inform teaching and feedback exchanges. However, it is also important to create activities that challenge students. To that end, in my writing courses, I offer both cognitive and socio-constructive approaches, with clear instructions, creative tasks, and contextual feedback (Evans, 2013; Brown, 2014). This feedback should be about four things (but not all at the same time): explaining how to begin and complete a task, activity, or assignment; describing how the student can look forward to the next task and proceed confidently; giving technical advice about ‘self-regulation’ or metacognitive skills; and describing aspects of ‘self’ in terms of the attributes or capacities of the individual student in relation to the task, activity or assignment (Evans, 2013, p. 72). In this way students can become less stressed about assessment and more confident in their ability to reflect on their learning. In other words, tutor and peer feedback is primarily “a crucial way to facilitate students’ development as independent learners who are able to monitor and regulate their own learning” (Ferguson, 2011, in Evans, 2013, p. 72). Overall, an embedded writing approach reduces students’ academic workload, lowers their stress levels and strengthens their self-efficacy.


Evans, C. (2013) ‘Making sense of feedback in Higher Education’, Review of Educational Research, vol. 83, no. 1, 70–120. https://www.jstor.org/stable/41812119

Brown, S. (2014) ‘What are the perceived differences between assessing at master’s level and undergraduate level assessment? Some findings from an NTFS-funded project’, Innovations in Education and Teaching International, vol. 51, no. 3, 265–276. https://doi.org/10.1080/14703297.2013.796713

Holmes, A. (2020) ‘Researcher Positionality – A Consideration of Its Influence and Place in Qualitative Research – A New Researcher Guide’, International Journal of Education, vol. 8, no. 4, 1–10. https://doi.org/10.34293/

Black, P. and Wiliam, D. (1998) ‘Assessment and Classroom Learning’, Assessment in Education: Principles, Policy & Practice, vol. 5, no. 1, 7–74.

Sapolsky, R. M. (2004) Why Zebras Don’t Get Ulcers, 3rd edition. New York: St Martin’s Press.

Posted in Blog

Strategies for making quantitative assessments more AI resilient

Dr Myrna Hennequin

In this case study, Dr Myrna Hennequin, Lecturer in Economics, shares her strategies for making online quantitative assessments more AI resilient. 

What I did  

I redesigned the online assessments for the module Quantitative Methods for Business and Economics: Maths (QMBE A). This introductory maths module for Foundation Year students is assessed by several Canvas quizzes. My goal was to create computer-markable exams that are varied, balanced and more resilient against academic misconduct through AI or collusion. 

Why I did it   

Creating effective online assessments for quantitative modules poses several challenges. With the rapid rise of AI tools such as ChatGPT, I was concerned about cheating through the use of AI. At the same time, timetabling issues meant that the window during which the in-semester tests were available had to be extended from one hour to four hours, increasing the opportunity for collusion. 

In response to these issues, I aimed to make the quizzes more resilient. As the current free version of ChatGPT (GPT-3.5) does not allow for uploading images, I ensured that each quiz included one or more questions involving graphs.

I also took advantage of Canvas’s built-in maths editor, which puts formulas in LaTeX format. It is currently not possible to directly copy these formulas into ChatGPT, which deters students from quickly copying an entire question including formulas.

Another strategy I use is randomisation. Canvas allows for randomising questions as a whole by using question groups or setting up numerical questions with random numbers using formula questions. To prevent collusion, I ensured that nearly all quiz questions were randomised in some way.

Lastly, I created a varied and balanced exam by using different question types. Even within the MCQ assessment mode, there are many ways to ask questions beyond the standard multiple-choice question consisting of a prompt and a set of answer options. I make use of a range of question types available in Canvas quizzes: 

  • ‘Formula questions’ are numerical questions with randomised numbers. 
  • ‘Fill in multiple blanks’ can be used to let students construct a step-by-step solution to a problem. 
  • ‘Multiple answers’ are multiple-choice questions where there might be more than one correct answer. 
  • ‘Multiple dropdowns’ are essentially fill-in-the-blank questions with predetermined answer options. 
  • ‘Matching’ questions ask students to match a series of statements to a fixed set of answer options. 

Combining various question types makes it possible to test different aspects of learning and get students to demonstrate depth of knowledge in a computer-markable format. 


Designing robust quizzes and AI-proofing questions requires time and creativity. It can be time consuming to plan new approaches and to test how well ChatGPT responds to each type of question. However, the Canvas instructor guide proved very useful as it introduced me to the different possibilities within Canvas quizzes. 

Impact and student feedback 

I always provide practice quizzes containing the same type of questions as the official assessments, so that students know what to expect and can prepare in advance. Students are happy with this combination of summative and formative assessments: on this year’s mid-module evaluation, 21 students (out of 62 respondents) mentioned the practice quizzes and/or the official online tests as a positive point in their written comments. 

In terms of results, the distribution of marks for the assessments this year was very much in line with previous years. This suggests that the marks were not inflated by the use of AI tools or other misconduct. 

Future plans 

While my approach makes the quizzes more robust for the moment, AI tools are rapidly evolving. I would expect AI to get better at dealing with images (e.g. graphs) and mathematical formulas. This will be an additional challenge for the future. 

Top 3 Tips 

  1. Think creatively: can you ask your quiz questions in a different way? Consider making use of the various question types available in Canvas quizzes. 
  2. Include images (e.g. graphs) to make it harder to answer questions using AI (*). Use Canvas’s built-in maths editor for any formulas so that they cannot directly be copied. 
  3. Use randomisation to deter collusion. You can randomise questions as a whole by using question groups, or set up numerical questions with random numbers using formula questions. 

(*) NB: Don’t forget that some students may have visual impairments and require screen readers. You could try to add alt text to the image in such a way that the quiz question can still be answered, but without giving away too much information. 

Posted in Case Studies

Student engagement is key for inclusive curriculum (re)design

Katerina Psarikidou and Alejandro Luna

Science Policy Research Unit, University of Sussex Business School 

Katerina Psarikidou is Lecturer in Sustainable Development at the Science Policy Research Unit. She is the University of Sussex Business School Champion for PRME, a UN initiative for Responsible Management Education. She is a Fellow of the Higher Education Academy and the Sussex Sustainability Research Programme. She is principal and co-investigator on three scholarship and research grants engaging in processes of co-creation with student and citizen communities. Her pedagogic scholarship is published in the international journals Postdigital Science and Education and International Med Education. 

Alejandro Luna is a Lecturer in Sustainability, Innovation and Energy Policy in the Science Policy Research Unit (SPRU). He is convenor of the MSc course Science and Technology Policy and one of the Business School UN PRME Champion Institution Co-Leads. He is an investigator on a USBS-funded scholarship grant. Alejandro sits on the Resilience Frontiers Technology Advisory Group (part of the UNFCCC secretariat). He is a Fellow of the Higher Education Academy.  


As part of our university lives, we have all encountered ideas of ‘inclusive curriculum design’ as a key objective and imperative of teaching and learning in Higher Education Institutions (HEIs). But what does it mean? And what is the role of students in it? 

As early as 2011, Advance HE published its commissioned report ‘Inclusive Curriculum Design in Higher Education’, providing definitions and guidance, both generic and specific to subject or disciplinary areas (Morgan and Houghton, 2011). According to this report, “an inclusive curriculum design approach is one that takes into account students’ educational, cultural and social background and experience as well as the presence of any physical or sensory impairment and their mental well-being” (Morgan and Houghton, 2011, p.5). As also highlighted in the report, “it is an imperative on institutions that they design their curriculum in such a way as to promote success among all students” (ibid: 5). 

From the above, we understand that students are central to the mission of (re)designing university curriculums in more inclusive ways. What remains to be clarified, however, is the role that students can play in that process.  

Inclusive curricula and co-creation

It is often the case that inclusive curriculums are designed by university educators themselves without engaging students in the formulation of ideas and proposals for more inclusive teaching, learning and assessment practices. As also discussed in relevant scholarship literature, student engagement can be key for student learning – not only in terms of securing the effectiveness of the (re)designed curricula, but also in terms of enhancing students’ belonging to the university, and therefore their overall levels of retention and academic performance (Zhao and Kuh, 2004; O’Keeffe, 2013). 

This was also our experience and understanding from discussions we had with our students in late 2023, as part of our USBS-funded scholarship project titled ‘Co-creating a USBS staff-student community on innovative teaching and learning’. 

In December 2023, as part of our scholarship grant, we conducted two co-creation workshops with students from two ODL modules that each of us convened and had recently delivered. The ODL MSc Sustainable Development course provides fertile ground for (re)designing curricula in more inclusive ways: both in terms of the diversity of the student cohort, who come from all parts of the world and from a range of age groups and ethnic, professional and disciplinary backgrounds, and in terms of the challenges of inclusion that can sometimes emerge in relation to online teaching and learning (MacKenzie et al, 2022). 

As also indicated in the title of our project, the aim of our workshops was to engage students in co-designing innovative methods for teaching, learning and assessments that would put students’ learning needs centre stage.  

In both workshops, students highlighted inclusion as a key objective as well as a challenge for (re)designing the curriculum, especially in online environments. As one student put it:  

“I also just wanted to pick up on the interesting on the diversity and inclusion…that was what triggered my thinking around how I think it’s a real challenge when you’re doing online learning” (Student 1, Workshop 2, 2023) 

Students as active agents in their own learning

As students explained, for them, ‘inclusion’ referred to a diversity of both teaching and learning methods that can cater to people’s different learning needs, but also to methods that can make students more active agents in their own learning.   

For example, students from both workshops proposed student-led seminars for case-based learning. They proposed the organisation of cross-module seminars that would be led and delivered by students, in which students could present cases related to their own professional, cultural and disciplinary backgrounds and experiences.  

As students explained, these seminars would also be important for enabling students to advance their knowledge by “putting theory into practice” (Student 1, Workshop 1, 2023). However, they also stressed the importance of these seminars in creating new, more inclusive spaces of learning: both in terms of students being actively involved in running the seminars and in terms of fostering social interaction and community belonging, amongst themselves and to the university. 

However, what students also underlined was that what they found “most useful” were “conversations like these” (Student 2, Workshop 1, 2023) facilitated by our workshops: processes in which students feel included in discussions about (re)designing the curriculum in more inclusive ways.   

Thus, as evidenced in our workshops, questions of inclusion were central both to students’ discussions and to their proposals going forward. Students talked about inclusion as a key challenge, especially in the context of online teaching and learning, but also as a key solution for enhancing their own experience and learning. For them, (re)designing the curriculum in ways that support the development of student-centred, student-led learning communities is key to supporting their learning in more inclusive ways. What students also valued, and would like to see more of, is their own engagement in processes of (re)designing the curriculum, in the way our workshops made possible! 


The discussion above raises some broader questions about inclusive curriculum design. From our co-creation workshops, we learned that inclusive curriculum design is not just about methods of teaching in class and how to make them more inclusive. A lot of learning also takes place outside the class, and it is for us, as educators, to support student learning by facilitating the development of student-led communities of learning. Finally, and very importantly, it is also very much about making the process of curriculum design itself more inclusive, in order to design methods of teaching and learning that are truly inclusive. Here, student engagement is key. And, as students commented, our workshops can be an example of how this can be done! 


  • MacKenzie A, Bacalja A, Annamali D, Panaretou A, Girme P, Cutajar M, Abegglen S, Evens M, Neuhaus F, Wilson K, Psarikidou K, Koole M, Hrastinski S, Sturm S, Adachi C, others (2022) Dissolving the dichotomies between online and campus-based teaching: A collective response to the manifesto for teaching online, Postdigital Science and Education, 4(2): 271-329.  
  • Morgan, H. and Houghton, A-M. (2011) Inclusive curriculum design in higher education: considerations for effective practices across and within subject areas, Advance HE, available online at https://s3.eu-west-2.amazonaws.com/assets.creode.advancehe-document-manager/documents/hea/private/resources/introduction_and_overview_1568037036.pdf . Retrieved on 8 February 2024.  
  • O’Keeffe, P. (2013). A Sense of Belonging: Improving Student Retention, College Student Journal, 47(4): 605-613.  
  • Zhao, C.M. and Kuh, G.D. (2004). Adding Value: Learning Communities and Student Engagement. Research in Higher Education 45, 115–138. 
Posted in Blog, Uncategorised

Using Vevox in the classroom

Dr Seun Osituyo

In this case study, Seun Osituyo, Deputy Director of Student Experience and Director of MSc Accounting and Finance at USBS, explains how she uses Vevox to increase student engagement. 

What I did 

One of the tools I have used to promote active participation during my teaching is Vevox. I set multiple choice questions on Vevox, based on a previous lecture or seminar. Then students are provided with the meeting ID and asked to attempt the questions on Vevox at the beginning of the lecture or seminar.  

Why I did it 

I am always interested in improving student engagement during my lectures and seminars. Many students have smartphones, and I thought it would be useful to engage them using the device that they are already familiar with. Some students also bring other devices such as laptops and tablets to the classroom. Students can access the Vevox site with these devices and do not need to create an account to use the tool. One benefit of Vevox is that it allows anonymity. Students’ responses on Vevox can be completely anonymous (unless they choose to indicate their names in their responses). Vevox is incredibly helpful when delivering a module for the first time as it allows me to check that students have understood the concept(s) I am teaching. I also use Vevox to collect informal feedback on my teaching delivery. This feedback helped me know, for example, that I was speaking too fast during the first lecture and that helped me improve in other lectures I had with the same cohort. Many students are motivated when they are listened to and using Vevox helped me to build a comfortable environment for students to achieve the learning outcomes. 


I did not encounter any challenges when using Vevox. The process was really simple – I only had to create the quiz and share the session ID so that students could access it during the lecture. 

Impact and student feedback 

Students who participated in the Vevox exercises found them useful. One student mentioned Vevox in their comment on the best aspects of the module, explaining that they enjoyed being given formative assessments regularly. I used Vevox every week on that module. This student wrote: “The way that the lectures are taught like workshops and how there is a test in the seminar. This motivates you to go away and learn the lecture content ready for the test in the seminar.”  

Future plans 

I have continued using Vevox to check students’ understanding of concepts. In my most recent module, I used it every week during the synchronous lecture to assess students’ understanding of the pre-lecture videos. This helped me identify the areas in the pre-lecture videos that needed to be revisited. In terms of teaching delivery, it helps me to know what works for the students. I will continue to use this tool and others, where it is practical to do so. 

Top 3 tips 

  • Give students instructions about how to access and complete the quiz in advance. Ideally, you should add the Session ID to your lecture PPT slides and publish it before the lecture. 
  • Give students a few minutes when they access Vevox for the first time. 
  • This should be obvious, but explaining the purpose of the quiz often improves engagement. 
Posted in Case Studies, Uncategorised

Developing authentic assessment for learning

Dr Verona Ní Drisceoil

Dr Verona Ní Drisceoil, Reader in Legal Education at the University of Sussex Law School, explains how she developed a new Case Briefing Assessment for her Year 1 core law module to promote inclusivity and foster transferable skills.  

What I did 

As an educator, I am motivated by assessment ‘for’ learning and not just ‘of’ learning (see further Sambell et al (2012)). In this respect, I endeavour to provide modes of assessment that offer all students approaches through which to grow and feel empowered. In my year 1 core law module, instead of requiring students to write a case note essay, I developed a new assessment that asks students to take on the role of a Trainee Solicitor and prepare a case briefing presentation for their supervising solicitor and other partners of the ‘law firm’. Students pre-record the case briefing presentation and submit it for marking. A particularly novel element of the assessment is that students are expected to include an evaluative judgement (see further Boud et al. (2018); Tai et al. (2018)) of their performance. This is pitched as ‘you stay online to discuss the briefing with your supervising solicitor’. They are, in this respect, also assessed on self-reflection.   

To develop this new assessment, I adopted a backward design approach. Backward design requires one to work back from the assessment, ensuring that students receive ample opportunities to practise and develop the skills required to excel in that assessment. Students have 7 seminars in this module. Working backwards from the assessment, the focus for the seminars is as follows: 

Seminar 7: Digital Literacy, exemplars and how to upload the recording. 

Seminar 6: OSCOLA Referencing  

Seminar 5: Voice Work 2 

Seminar 4: Voice Work 1  

Seminar 3: The Doctrine of Precedent and Human Rights 

Seminar 2: Write a Case Summary 

Seminar 1: Introduction to Law 

The Brief: 

You are a trainee solicitor at the law firm Clyde & Clyde, London. Your supervising solicitor, Valerie Adebisi, asks you to research the Supreme Court case of AM (Zimbabwe) (Appellant) v Secretary of State for the Home Department (Respondent) [2020] UKSC 17, [2020] 2 WLR 1152 and prepare a legal case briefing presentation of 10 minutes (max) for your supervising solicitor and the Head of the Human Rights Division at the firm. 

The client is unable to attend. You have been asked to record the presentation so that it can be added to the client’s file. Once you complete the presentation, you stay online to have a reflective debrief with your supervising solicitor where she asks you to respond to a number of questions. 

Why I did it   

The primary motivator and guiding rationale for changing the mode of assessment in this core module was, and is, widening participation. I believe that all students should have the opportunity to develop voice work skills. Whilst we, as a law school, offer a range of extracurricular opportunities to support students to develop voice work and advocacy skills, many of our students cannot take part due to a restraint on numbers and other responsibilities they might have, including caring responsibilities or work. For that reason, I sought to embed this key transferable skill, and opportunity, into a core law module so that all students could take part and could develop these skills. 


The most challenging aspect of supporting students with this assessment mode is the additional IT element. It is important to ensure that students have the required digital literacy to complete and submit confidently. I now feel that students have all the support they require but it has taken much trial and error and exploration of the best platform to use (e.g. Panopto, Zoom, or alternative) and how to best relay guidance to students. 

This year, the final seminar of the module gave students an opportunity to practise recording and uploading a short video. The purpose of this was so that students could test the software and become familiar with the submission process on Canvas Online. 

Impact and student feedback 

Feedback from students has been overwhelmingly positive and shines through in the evaluative judgement aspect of the module itself. Students speak about enjoying the challenge of a ‘differing approach’, having the opportunity ‘to practice and develop their advocacy’, ‘to apply the law as a trainee solicitor’, to ‘overcome fears’ and to ‘feel proud’. The feedback and reflections that speak to empowerment and growth are particularly rewarding for me as a teacher. 

Future plans 

I have no intention to move away from this mode of assessment, but consideration of how to improve will, of course, always continue. Building on this teaching practice and approach, I am co-authoring a book, alongside Dr Jo Wilson and Jeanette Ashton, on ‘How to Design and Embed Authentic Assessment in Law’ (Edward Elgar, 2025). 

Top 3 tips 

  1. Work with your Learning Technologist from the start of the design process.  
  2. Ensure the module supports the new mode of assessment. Remember the importance of backward design. 
  3. If you are adding a different IT component, ensure your guidance is clear. Remember that students will be using different devices, and they may not have the software they require for the assignment. 


Boud, D. et al. (2018) Developing Evaluative Judgement in Higher Education: Assessment for Knowing and Producing Quality Work. Abingdon, Oxon: Routledge. 

Sambell, K. et al. (2012) Assessment for Learning in Higher Education. Abingdon, Oxon: Routledge. 

Tai, J. et al. (2018) ‘Developing evaluative judgement: enabling students to make decisions about the quality of work’, Higher Education, 76(3), pp. 467–481. doi:10.1007/s10734-017-0220-3. 

Posted in Case Studies, Uncategorised

Introducing optionality in assessment modes

Dr Jo Wilson

In this case study, Dr Jo Wilson, Senior Lecturer in Commercial Law, talks about how she introduced optionality in assessment modes in her final year module, Advanced Contract Law in Practice, to create more inclusive and accessible assessment practices. 

What I did 

Law students are predominantly assessed through critical essay writing in their final year at Sussex. As such, I wanted to give students the opportunity to try alternative assessment modes while incorporating student choice. To do this, I ask students to take on the role of Trainee Solicitor, and they can choose between producing either a poster presentation or a pre-recorded oral presentation on a commercial contract clause/phrase of their choice. Accordingly, this assessment incorporates optionality, both in terms of mode of assessment, and subject matter. Students are given the following instructions: 

You are a Trainee Solicitor at Carlill & Partridge LLP. Your firm is hosting its annual research seminar, the theme of which this year is ‘Drafting Commercial Contracts’. You have been asked to pick any legal/commercial/boilerplate clause/phrase that features in commercial contracts and produce EITHER a poster presentation, OR a 10-minute pre-recorded oral presentation with accompanying visuals. Your presentation should cover: 

  • What the clause/phrase is; 
  • The function of the clause/phrase; 
  • How such clause/phrase should be drafted/written; 
  • (If applicable) relevant litigation relating to drafting issues of the clause/phrase; 
  • Critique of the commercial/legal issues raised by the clause/phrase; 
  • References. 

When offering optionality in assessment, it is imperative that there is transparency and clarity regarding how assessments are marked. To ensure this, I created a set of bespoke marking criteria for each assessment mode. I also created two visuals on Canva which summarise in simple terms how the marking criteria apply to each of the assessment options.  

Further, and most importantly, I embedded into the module design a two-hour seminar dedicated to preparing for the presentation assessment. In terms of timing, the presentation is due for submission in Week 9, and this seminar takes place in Week 7. This timeframe was chosen deliberately so that students get the information at a crucial time, but also have some breathing space to work on their presentations, rather than having to use that time to prepare for another substantive seminar. 

The seminar is broken down into two parts. First, I provide students with information regarding the expectations for the assessment, and then we go through the marking criteria and look at exemplars. Second, students are put into pairs/small groups with other students who are working on a different clause/phrase, and they are asked to present and give feedback to each other on their work in progress. 

Why I did this     

It was the benefits of authenticity and optionality that were the driving forces behind the adoption of this approach. First, in relation to authenticity, I chose this approach so that students could develop skills that will benefit them in their lives beyond university, including their ability to be creative, to present information clearly and succinctly, and to deliver their ideas orally. By giving the students a role to play, and by giving the assessment an authentic purpose, I found that students were much more engaged in the learning process, because they were able to apply their knowledge in a meaningful way to a real-world context (Mueller 2005). 

In terms of optionality, the Quality Assurance Agency for Higher Education (QAA) has recently highlighted the need for the higher education sector to design more inclusive, accessible, and flexible assessment choices (Firth et al 2023). Waterfield and West (2006) define inclusive assessment practice as ‘a flexible range of assessment modes made available to all, capable of assessing the same learning outcomes in different ways.’ In recognition of this, I was keen to incorporate optionality on this module, so that students can take control of their learning and pick a mode of assessment that lends itself to their strengths (O’Neill 2011). 


The assessment design itself was fairly straightforward. However, time and resources were required to develop the material to support the students in the planning and execution of this non-traditional, authentic mode of assessment.  

Impact and student feedback  

The student response to optionality has been overwhelmingly positive. At the end of the module, I invited students to complete a short survey. Of the 23 students that responded, 100% agreed or strongly agreed that optionality creates a more inclusive learning environment. Unsurprisingly, then, the qualitative comments very much spoke to the benefits of this approach in terms of accessibility and inclusivity: 

  • “It equals the playing field” 
  • “Students can choose the best suited assessment mode to get a fairer academic assessment of them” 
  • “As different students have their strengths and weaknesses, optionality allows more inclusivity and room to prove your best effort.” 
  • “People are able to be assessed in formats they are more comfortable with and suit their learning style.” 
  • “Students are allowed to choose a mode that tailors to their strengths” 
  • “Means we can play to our strengths and lets people with other skills (such as creativity) succeed.” 

Future plans 

This approach worked incredibly well, and I received excellent feedback from students, both formally and informally, so I intend to continue with it, just as it is! 

Top 3 tips 

  1. Embed a teaching session that is dedicated to preparing for the assessment. 
  2. Create bespoke marking criteria for each assessment mode for transparency. 
  3. Think carefully about the different assessment modes and the skills that will be required of students. 


Firth, M. et al. (2023) ‘Optionality in Assessment: A cross-institutional exploration of the feasibility, practicality & utility of student choices in assessment in UK higher education’ (QAA, Oct 2023). 

Mueller, J. (2005) ‘The authentic assessment toolbox: Enhancing student learning through online faculty development’, Journal of Online Learning and Teaching, 1. 

O’Neill, G. (Ed) (2011) ‘A Practitioner’s Guide to Choice of Assessment Methods within a Module’ (Dublin: UCD Teaching and Learning), http://www.ucd.ie/teaching/resources/assessment/howdoyouassessstudentlearning/ 

Waterfield, J. and West, B. (2006) ‘Inclusive Assessment in Higher Education: A Resource for Change’ (Plymouth: University of Plymouth). 

Posted in Case Studies, Uncategorised

“It equals the playing field”: Student reflections on introducing optionality as an accessible and inclusive assessment practice

Dr Jo Wilson



Optionality in assessment has recently come under the spotlight, with the QAA highlighting the need for Higher Education to develop more inclusive, accessible, and flexible assessment choices. In response, the University of Manchester recently led a QAA-funded project which gathered insights from over 1,200 academic and professional services staff and students across a number of UK higher education institutions regarding the expectations and challenges of providing flexible assessment. In their final report, published on 31 October 2023, they set out the key findings and themes that emerged from that research and make a series of recommendations to the sector (Firth et al 2023). In this article I consider some of those findings and recommendations in the context of my own experiences of, and student reflections on, the introduction of optionality in a final year law module at the University of Sussex.

What is optionality in assessment?

Waterfield and West (2006) define inclusive assessment practice as “a flexible range of assessment modes made available to all, capable of assessing the same learning outcomes in different ways.” Adopting a slightly broader approach, Firth et al (2023) define optionality in their report as giving “some level of control over student decision-making about when, how, and in what format they submit assessments, and whether this is individual or collaborative.” As such, this flexibility could relate to, for example, the subject-matter, mode, word length, weighting, and timing of the assessment, or whether it is an individual or group effort (see, e.g. Wanner et al 2021; O’Neill 2017).

Introducing student choice in Advanced Contract Law in Practice

Advanced Contract Law in Practice is a 15-credit, final year optional module which I convene at the University of Sussex. This module, which ran for the first time in the 22/23 academic year, is broadly split into two parts: drafting commercial contracts, and interpreting commercial contracts. The assessment in question relates to the former topic.

In terms of the assessment brief, students take on the role of a Trainee Solicitor and are asked to produce, for the purposes of their employer’s annual research seminar, either a poster presentation or a pre-recorded oral presentation on a commercial contract clause/phrase of their choice. Below are the instructions students receive:

You are a Trainee Solicitor at Carlill & Partridge LLP. Your firm is hosting its annual research seminar, the theme of which this year is ‘Drafting Commercial Contracts’. You have been asked to pick any legal/commercial/boilerplate clause/phrase that features in commercial contracts and produce EITHER a poster presentation, OR a 10-minute pre-recorded oral presentation with accompanying visuals. Your presentation should cover: 

  • What the clause/phrase is; 
  • The function of the clause/phrase; 
  • How such clause/phrase should be drafted/written; 
  • (If applicable) relevant litigation relating to drafting issues of the clause/phrase; 
  • Critique of the commercial/legal issues raised by the clause/phrase; 
  • References.

I designed the assessment in this way for two reasons. First, I wanted the assessment to be authentic, so that students could develop skills that will benefit them in their lives beyond university (McArthur 2023), including their ability to be creative, to present information clearly and succinctly, and to deliver their ideas orally. By giving the students a role to play, and by giving the assessment an authentic purpose, I found that students were much more engaged in the learning process, because they were able to apply their knowledge in a meaningful way to a real-world context (Mueller 2005).

Second, and most importantly, I wanted to embed an element of student choice, both in terms of subject matter and mode of assessment. I chose these particular aspects of flexibility because I wanted to maximise engagement, and give students the opportunity to tailor the assessment to their strengths as a learner (O’Neill 2011). These benefits are also recognised in Firth et al’s (2023) report, which highlights greater inclusivity, supporting diverse learning styles, and enhancing the student learning experience as some of the positive traits of student choice in assessment (p2).


At the end of the Spring term in 2023, I invited students on the module to complete a short survey on their experiences and perceptions of optionality in assessment practice. Of the 40 students enrolled on the module, 23 completed the survey. In terms of results, 100% of the participants agreed or strongly agreed that optionality creates a more inclusive learning environment and, interestingly, 100% also agreed or strongly agreed that optionality makes a final year module more attractive. To gain a deeper insight into these perspectives, I then asked students to comment first on the benefits of optionality, and then on the potential drawbacks. The discussion that follows will analyse this data in the context of some of the findings and recommendations of Firth et al’s (2023) report, focusing first on two key benefits, and then on two key drawbacks and the steps I have taken to mitigate them.

Benefits of optionality

Inclusivity was one of the key themes to emerge from Firth et al’s (2023) study in relation to the benefits of optionality. Both staff and students agreed that assessment optionality could enhance inclusivity, though it was important that students could access the relevant resources, and be supported in the development of the skills necessary to complete the different assessment formats (p17). Accordingly, Firth et al (2023) recommended that:

“Educational institutions should prioritise the introduction of diverse assessment formats to explicitly address accessibility and concerns about fairness, ensuring access to necessary resources and skills development to prevent the unintentional widening of awarding gaps.” (p17)

Unsurprisingly, inclusivity was most commonly cited as the key benefit of optionality by the students completing my end-of-module survey, mirroring many of the sentiments in Firth et al’s (2023) report. Students recognised that they have diverse learning styles, with different strengths and weaknesses, and that allowing student choice in assessment levelled the playing field, in the sense that it gave them the opportunity to pick an assessment mode or subject matter that played to their strengths. One student summarised the benefits of student choice well, stating that:

“It does not limit students. Everyone (especially those with learning disabilities like myself) has different strengths and ways of learning. Through optionality, everyone is given the chance of success when they might have previously been limited. It equals the playing field.”

These notions of inclusivity and fairness, I argue, are the key driving forces behind the adoption of optionality in assessment; we must give our students the opportunity to demonstrate the knowledge and skills they have gained in a way that makes sense to them as a unique learner.

Another key theme to emerge from Firth et al’s (2023) study in terms of the benefits of optionality, and something that is inherently linked to inclusivity, is the impact of choice on student outcomes. In their study, students argued that by allowing them to select an assessment method that aligns with their backgrounds, abilities and skills, they are able to tailor their learning experience and potentially improve their academic performance (p18).

In my own survey, many students made the same connection between choice and improved student outcomes. One student commented that “People do better in different types of assessments so they are more likely to get higher grades in an assessment type that they prefer” and another argued that it “Allows individuals the opportunity to attempt the assessment mode they feel most comfortable with, which consequently could help them to achieve the best grade possible.”

Interestingly, one student stated that optionality “Means we can play to our strengths and lets people with other skills (such as creativity) succeed.” This is a particularly insightful observation, and it demonstrates the importance of giving students the opportunity to be assessed in non-traditional ways.

As a word of caution, Firth et al (2023) found that academic staff were concerned that students might consistently opt for assessments they felt more comfortable doing, and potentially miss out on valuable learning experiences and skills development (p20). I share this concern and think it highlights the need for conversations and planning around assessment choices to take place at year or course level. This will ensure a holistic approach in which students have the opportunity to explore and develop a range of skills across their modules.

Drawbacks of optionality

One of the key concerns demonstrated in both Firth et al’s (2023) study, and my own survey, related to perceptions of fairness between different types of assessment. Firth et al (2023) reported that students thought assessment methods should be fair, and that no method should be punitive or disadvantageous. From an academic perspective, concerns were raised regarding the perceived differences in the difficulty of various assessment types, and the need to maintain trust in the assessment process (p18).

Similar views were expressed by the students completing my survey, many of whose responses focused on the potential inequity across different modes of assessment and the role of the teacher in ensuring a consistent approach to marking. One student commented that it is “Difficult to assess both modes in the same way”, while another was concerned that students might be “marked to different standards…if one option was easier than the other.” Similarly, one student argued that optionality “requires the professor to be extremely aware of how to even out the playing field between the two assessments so that marking is equivalent across the board.” Finally, another student responded that “The assessment modes may vary in difficulty which might make the module a bit more unfair than if there was only one mode of assessment.”

At first glance, these responses are concerning, and could serve to undermine the core aim of optionality, which is to create a fair and inclusive assessment environment. However, the anecdotal evidence from my own module is that students have very different views on which assessment modes are more difficult than others. Many students commented that the poster presentation was the easier option, whereas others argued the same to be true of the oral presentation. This observation feeds usefully into the narrative regarding the diverse skills, experiences and capabilities of our students and, I argue, actually reaffirms the importance of optionality in giving students the opportunity to play to their individual strengths.

The other key concern raised by students related to feeling confused and overwhelmed by the options. The burden of choice was discussed in Firth et al’s (2023) literature review, though it did not emerge as a key theme in their final report, which is interesting given that many students raised it as a concern in my survey. For example, one student commented that “Students may feel overwhelmed with the choice they have to make”, and another commented that “It may be difficult and sometimes overwhelming to decide what to choose.” This reflects the findings of Brown et al (2020), who highlight that some students can find greater choice time-consuming, overwhelming and challenging.

In response to both of these drawbacks, it is imperative that 1) the marking criteria are tailored and clear, and 2) students are fully supported in making their assessment choices.

Regarding the marking criteria, Firth et al (2023) recommend that:

“When offering students the option to choose their assessment format, academics should prioritise transparency and consistency. This means creating and communicating well-defined grading criteria that align with learning outcomes. This approach ensures that students will have a clear grasp of expectations and how their work will be assessed.”

I agree that transparency regarding the marking criteria, and how they apply to the different modes of assessment, is key. For my module, I have created a set of bespoke marking criteria for each assessment mode – the poster presentation and the oral presentation – so that there is transparency and clarity regarding how the assessments are marked. To accompany this, I created two visuals on Canva which summarise in simple terms how the marking criteria apply to each of the assessment options.

Further, and most importantly, I think it is essential to dedicate time within the teaching framework to supporting and advising students in their assessment choices. As such, I have embedded into the module design a two-hour seminar dedicated to preparing for the presentation assessment. The seminar is broken into two parts. First, I discuss with the students the expectations for the assessment; we go through the marking criteria and together we look at exemplars. Second, students are put into pairs or small groups with other students who are working on a different clause/phrase, and they are asked to present and give feedback to each other on their work in progress. This session was very well received, with students commenting that it helped to clarify expectations and relieve assessment anxiety.


Conclusion

Firth et al’s (2023) research provides valuable insights and recommendations regarding inclusive assessment practices in Higher Education. My own experiences of introducing optionality, and the results from my own student survey, reflect and build on those findings. While the benefits of student choice are clear, some of the drawbacks could potentially undermine those benefits. However, with careful planning, a transparent approach and adequate support, I argue that those concerns can be largely mitigated.

Dr Jo Wilson is Senior Lecturer in Commercial Law at the University of Sussex


Brown, N., Morea-Ghergu, D. & Onwuka, N. ‘Assessments: letting students decide’ in: Mawani, S., & Mukadam, A. (eds). Student Empowerment in Higher Education: Reflecting on Teaching Practice and Learner Engagement (Berlin: Logos Verlag 2020)

Firth, M. et al ‘Optionality in Assessment: A cross-institutional exploration of the feasibility, practicality & utility of student choices in assessment in UK higher education’ (QAA, Oct 2023)

McArthur, J. ‘Rethinking authentic assessment: work, well‑being, and society’ (2023) 85 Higher Education 85

Mueller, J. ‘The authentic assessment toolbox: Enhancing student learning through online faculty development’ (2005) Journal of Online Learning and Teaching 1

O’Neill, G. (Ed) ‘A Practitioner’s Guide to Choice of Assessment Methods within a Module’ (2011, Dublin: UCD Teaching and Learning), available at: http://www.ucd.ie/teaching/resources/assessment/howdoyouassessstudentlearning/

O’Neill, G. ‘It’s not fair! Students and staff views on the equity of the procedures and outcomes of students’ choice of assessment methods’ (2017) 36(2) Irish Educational Studies 221

Wanner, T., Palmer, E., & Palmer, D. ‘Flexible assessment and student empowerment: advantages and disadvantages – research from an Australian university’ (2021) Teaching in Higher Education 1
