By Dr Dave Smalley
Student perceptions of written feedback are a surprisingly under-researched area, given that universities typically struggle disproportionately with the Assessment and Feedback questions in the National Student Survey (NSS). We know that students value feedback highly, but we also know, both from peer-reviewed research and from simply asking students, that they often find it hard to act on their feedback, and that they are frustrated by what they perceive as inconsistencies in the quality and quantity of feedback across markers. With that in mind, I set out to explore student perceptions of feedback further, in the hope of developing our systems and improving our students’ experience of receiving feedback on their work.
Focus groups conducted in 2020 revealed that students wanted more guidance to help them understand the essay marking criteria. It is quite common for students to think they understand what they are supposed to be doing in a particular element of essay writing (e.g. structuring an essay), only to be marked down on that element in their next submission. Part of the issue, it seems, is that marking criteria can be vague when it comes to describing specific elements of essay writing. This leads to an incomplete understanding of what the marker is looking for and, subsequently, confusion when interpreting feedback. So how can we remedy this? I propose a more specific and structured framework of marking criteria that identifies the individual elements of essay writing that matter (e.g. signposting the reader effectively through paragraph structure). For this to be effective, students need sufficient guidance to understand what each element means and, crucially, to be able to recognise what it looks like in an essay when that element is done well or done inadequately.
Students in the same focus groups unanimously agreed that they wanted consistency in their feedback, particularly in how useful it is. Students want practical suggestions for how to improve an area of their essay writing, and these, they said, were in short supply. I would argue that giving meaningful practical tips to help students improve their essay writing is genuinely hard to do. In my experience, even excellent essay writers struggle to explain exactly what it is they do that makes them excellent; they have simply learned how to do it. What we need, therefore, are experienced educators who have acquired a toolbox of tips and tricks for improving student essay writing. The problem is that there are not enough of them to cover the sheer volume of scripts that need marking. A solution: complement our structured and detailed framework of marking criteria with a set of specific, practical suggestions compiled by experienced educators, each linked to a specific element of essay writing.
So this is what I did. I started by creating a 15-item rubric that breaks down and details the key elements of essay writing identified in the existing marking criteria. When essays are marked, the marker links each comment to one of these elements, so the student knows exactly what they did that was ‘good’ or ‘needs attention’. Each item in the rubric is explained in detail in a series of marking criteria videos, in which I use previously marked essays to demonstrate what effective and not-so-effective practice looks like. Next, I created a supporting feedback guidance document that exhaustively lists the issues markers observe in student essays, organised by the 15 criteria of the rubric. Issues are colour-coded using a traffic-light system so that students can see how severely each issue affects their grade, and next to each issue are practical suggestions for avoiding it in the next essay submission. The magic of the approach is that markers can simply link an in-essay comment to the relevant issue in the guidance document. This leaves less room for inconsistency across markers, and gives markers more time to focus on making their more individualised in-essay comments extra clear.
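For readers who think in code, the linking mechanism could be sketched roughly as follows. This is purely illustrative: the issue codes, criterion names, and function names here are my own invented examples, not part of the actual guidance document or marking system.

```python
from dataclasses import dataclass
from enum import Enum


class Severity(Enum):
    """Traffic-light coding for how severely an issue affects the grade."""
    GREEN = "minor impact"
    AMBER = "moderate impact"
    RED = "major impact"


@dataclass
class Issue:
    criterion: str    # one of the 15 rubric elements, e.g. "signposting"
    description: str  # what the marker observed
    severity: Severity
    suggestion: str   # practical tip for the next submission


# A hypothetical extract from the guidance document, keyed by issue code
GUIDANCE = {
    "SIG-01": Issue(
        criterion="signposting",
        description="Paragraph openings do not indicate the point being made",
        severity=Severity.AMBER,
        suggestion="Open each paragraph with a sentence stating its main claim",
    ),
}


def link_comment(comment: str, issue_code: str) -> str:
    """Attach a guidance-document issue to an in-essay marker comment."""
    issue = GUIDANCE[issue_code]
    return f"{comment} [{issue.criterion} | {issue.severity.name}: {issue.suggestion}]"
```

In this sketch, a marker writes a short individualised comment and tags it with an issue code; the criterion, severity, and practical suggestion are then pulled in automatically, which is where the consistency gain comes from.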
Evaluation of this new approach is in its infancy, but early indications are that it has been very well received by students and markers alike. We know that feedback is an essential component of the learning cycle, so fingers crossed we have just succeeded in oiling the wheels a little!