Internal Student Surveys at Sussex

The University carries out internal surveys of students to measure their satisfaction with their courses, to identify common issues around key themes across the University, and to give all students the opportunity to make their opinions heard. This blog post outlines some of the changes to internal surveys at Sussex over the past few years, and details the new system of internal surveys launching this year.

Surveys until 2015 – Module Evaluation Questionnaires

The previous system for internal surveys was module-specific, known as Module Evaluation Questionnaires (MEQs). These questionnaires were offered towards the end of teaching each term, with a separate questionnaire per module, meaning that students were encouraged to complete around four near-identical surveys each term to give feedback on their experience of each module. Crucially, these questionnaires took place before the final assessments, so students were unable to comment on the full assessment and feedback process for the module. Module leads were able to respond to students based on the feedback received, although this response would often come around a month after the end of the module.

The questionnaires included a number of standard questions for each module:

  • the module was well organised and ran smoothly
  • the module matched my expectations; the content was in line with the module description
  • lecturers were good at explaining things
  • feedback on my work has helped me clarify things I did not know
  • clear information about the assessment of this module and the marking criteria was provided
  • the learning resources for this module, such as Study Direct, the Library and any special equipment and facilities, met my needs
  • overall, I was satisfied with the quality of this module

The system of Module Evaluation Questionnaires had several major drawbacks, namely:

  • the questionnaires were repetitive, asking many of the same questions in multiple surveys, leading students to report a feeling of “survey fatigue”
  • the questionnaires offered no insights into the student experience for the whole course as the questions only related to each individual taught module
  • response rates for the questionnaires varied but were generally poor, typically falling between 5% and 30%
  • the questionnaires were conducted during the module, meaning that the final elements of the module – the final teaching sessions and assessments – were not considered
  • as the student responses were not considered until the end of the module, any changes introduced did not help the current students

The Module Evaluation Questionnaires were considered by the University’s Teaching and Learning Committee and one of its sub-committees, the Student Engagement Working Group. This group recommended two major changes to the surveys which have been developed over the past two years: the use of mid-term module feedback, and a new course-level questionnaire.

Surveys from 2015 onwards

1) Mid-term Module Feedback

The ability to gather feedback from students while a module is running was noted as highly important for teaching staff, as it enables changes to be introduced that benefit the current student cohort. Tools were developed to support teaching staff in conducting mid-term module feedback in a variety of ways, including:

  • using clickers or in-class polling tools
  • using tools which are built into the online virtual learning environment, Study Direct
  • holding focus groups with students

The use of mid-term module feedback is still encouraged across the University, and a number of Schools have reported great success in their use of these feedback methods.

2) Course Evaluation Survey (2016 pilot)

The second element of the new surveys process was the introduction of a newly designed course survey, which asked all students a number of standard course-level questions and gathered a satisfaction rating for each taught module.

The survey was developed during the 2015-16 academic year and a pilot survey was launched in May 2016. This Course Evaluation Survey was delivered through Qualtrics, an external survey system offering a number of attractive design features, such as icons for satisfaction scales and a mobile-friendly layout.

The survey asked a number of questions gathering feedback on students’ experience of the course overall, such as the workload on the course, skills development, and the organisation of the course. Students were also asked for a satisfaction score for each module and were able to include free-text comments on both the modules and the course in general.

The pilot was conducted with all first- and second-year students, a total population of 6,545 students. It remained open for around six weeks, closing on 13 June 2016 with 1,618 responses, giving a response rate of 25%. Although this response rate was lower than hoped, the quiet launch with little promotion and the timing of the survey, which took place during the end-of-year assessment period, meant that the pilot was deemed a success.

There were a number of challenges and issues with the Course Evaluation Survey, which were identified through the pilot:

  • the reporting functionality in Qualtrics was not suited to reporting at module and course level. While the results were available, the process of filtering and extracting results for each module for dissemination was particularly difficult. The solution during the pilot was to export the complete set of results and process the data using spreadsheets and code written in Visual Basic for Applications, with the reports generated and output as PDF documents (a simple sketch of this kind of per-module processing is included after this list)
  • the survey was conducted during the summer but asked for a satisfaction score for autumn term modules, which had taken place over six months before. Students reported difficulty in accurately remembering their experiences of these modules in order to provide comments
  • there was no feedback mechanism for the survey, meaning that academic staff were unable to directly respond to the comments left by students and have this form part of the survey results
  • there was no simple distribution method for the results. The Module Evaluation Questionnaires were delivered through the University’s main staff/student portal, Sussex Direct, meaning that staff could simply log in to see the survey results for their modules. In the 2016 pilot, the reports for each School of Study were uploaded to a password-protected webpage
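
The first bullet above mentions exporting the complete set of results and processing the data offline; the pilot itself used spreadsheets and VBA. Purely as an illustration of that kind of per-module post-processing, the sketch below summarises a hypothetical CSV export of responses using Python and pandas. The file name and column names are assumptions, not the actual Qualtrics export format.

    # Illustrative sketch only: the pilot used spreadsheets and VBA, not this script.
    # Assumes a hypothetical CSV export with columns: module_code, satisfaction (1-5), comment.
    import os
    import pandas as pd

    os.makedirs("reports", exist_ok=True)
    responses = pd.read_csv("course_evaluation_export.csv")  # hypothetical export file

    # One summary row per module: number of responses and the mean satisfaction score.
    summary = (
        responses.groupby("module_code")["satisfaction"]
        .agg(responses="count", mean_satisfaction="mean")
        .round(2)
    )
    summary.to_csv("reports/summary_by_module.csv")

    # One file per module so results can be passed to module leads for dissemination
    # (the pilot produced PDF reports; plain CSV files keep this sketch short).
    for module_code, module_responses in responses.groupby("module_code"):
        module_responses.to_csv(f"reports/{module_code}.csv", index=False)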

The results of the Course Evaluation Survey pilot, along with the feedback from staff and students, were considered by the Student Engagement Working Group. Changes were developed over the autumn term of 2016, including the use of questions from the new National Student Survey (NSS). The ADQE Office worked with colleagues in ITS to address some of the issues identified during the pilot through the development and testing of a new system.

End-of-term questionnaires (2017)

This year, the University is launching the new End-of-term questionnaires, which will ask all taught students about their experiences on their modules. All non-finalist undergraduate students will also be asked some course-level questions aligned to the new National Student Survey.

The end-of-term questionnaire is split into two sections:

  1. the first section asks for a module satisfaction score for each module taken during the previous term, and also invites free-text comments for these modules. All taught students will receive this part of the questionnaire
  2. the second section of the questionnaire asks a subset of the National Student Survey questions each term, with half of the NSS questions asked in the autumn questionnaire and the remaining half asked in the spring questionnaire. Due to the rules governing the National Student Survey, any student who would be eligible to complete the NSS will be excluded from this section of the end-of-term questionnaire (a simple sketch of this question-split and eligibility logic is given after this list)
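
To make the structure above concrete, here is a minimal sketch, in Python, of how the two sections might be assembled for an individual student. The function, field names and question identifiers (for example is_nss_eligible and the NSS question lists) are hypothetical and are not taken from the actual Sussex Direct implementation.

    # Minimal sketch of the two-section structure described above; all names are
    # hypothetical and this is not the actual Sussex Direct implementation.

    # Hypothetical split of the NSS question bank across the two questionnaires.
    NSS_QUESTIONS_AUTUMN = ["nss_q01", "nss_q02", "nss_q03"]  # first half of the NSS questions
    NSS_QUESTIONS_SPRING = ["nss_q14", "nss_q15", "nss_q16"]  # remaining half of the NSS questions

    def build_questionnaire(modules_last_term, term, is_nss_eligible):
        """Assemble the end-of-term questionnaire for one student.

        modules_last_term -- module codes the student took in the previous term
        term              -- "autumn" or "spring"
        is_nss_eligible   -- True for students due to complete the real NSS,
                             who are excluded from the NSS-style section
        """
        questionnaire = {
            # Section 1: a satisfaction score and free-text comments per module,
            # received by all taught students.
            "module_questions": [
                {"module": code, "satisfaction_scale": "1-5", "free_text": True}
                for code in modules_last_term
            ]
        }

        # Section 2: half of the NSS questions per term, skipped for NSS-eligible students.
        if not is_nss_eligible:
            questionnaire["nss_questions"] = (
                NSS_QUESTIONS_AUTUMN if term == "autumn" else NSS_QUESTIONS_SPRING
            )

        return questionnaire

    # Example: a second-year student (not NSS-eligible) in the autumn questionnaire.
    print(build_questionnaire(["MOD101", "MOD102"], "autumn", is_nss_eligible=False))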

Using NSS questions within the second section of the questionnaire will help us to build a picture of student satisfaction and engagement in key areas, such as assessment and feedback, at different stages of the student journey. With this information available, we should be better able to enhance the student experience while students are still studying at Sussex.

The questionnaire is delivered and reported through Sussex Direct, meaning that staff and students will be able to access the survey and its results in a familiar environment. The questionnaire has also been designed to work on mobile devices, which should allow students to complete the questionnaire at a time, or on a device, they are more comfortable with.

Some final testing is taking place at present to ensure that all elements of the system work as intended. Discussions are also taking place about making the institutional data available for analysis as early as possible, so that the University-level results can be considered as soon as possible after the questionnaire closes.

The survey is due to launch following the deadline for feedback on mid-year assessments, to ensure that students have experienced the entire module before providing a satisfaction score and qualitative comments. The results will be available to module convenors once vetting of free-text comments has taken place, which is expected to take one week after the survey closes.

The surveys are being announced through a news item published on Monday 30 January.

External Surveys

The internal surveys described above will complement the external surveys of students at Sussex, such as the National Student Survey (NSS).

One comment on “Internal Student Surveys at Sussex”
  1. Matthew Tiernan says:

    As a follow-up to this post, we launched the new internal surveys as the “End-of-term questionnaires” in February 2017.

    Further details on the final format and guidance for staff and students are available at: http://www.sussex.ac.uk/adqe/enhancement/studentengagement/studentvoice/endoftermqs
