{"id":112,"date":"2021-02-12T10:00:00","date_gmt":"2021-02-12T10:00:00","guid":{"rendered":"http:\/\/blogs.sussex.ac.uk\/daretotransform\/?p=112"},"modified":"2026-03-26T10:44:02","modified_gmt":"2026-03-26T10:44:02","slug":"meq","status":"publish","type":"post","link":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/2021\/02\/12\/meq\/","title":{"rendered":"How can we improve module evaluation questionnaires?"},"content":{"rendered":"\n<p>by Junko Winch<\/p>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img loading=\"lazy\" width=\"260\" height=\"346\" src=\"http:\/\/blogs.sussex.ac.uk\/learning-matters\/files\/2021\/02\/Junko-Winch.jpg\" alt=\"Photograph of Junko, who is Lecturer of Japanese at the University of Sussex. \" class=\"wp-image-119\" srcset=\"https:\/\/blogs.sussex.ac.uk\/learning-matters\/files\/2021\/02\/Junko-Winch.jpg 260w, https:\/\/blogs.sussex.ac.uk\/learning-matters\/files\/2021\/02\/Junko-Winch-225x300.jpg 225w, https:\/\/blogs.sussex.ac.uk\/learning-matters\/files\/2021\/02\/Junko-Winch-100x133.jpg 100w, https:\/\/blogs.sussex.ac.uk\/learning-matters\/files\/2021\/02\/Junko-Winch-150x200.jpg 150w, https:\/\/blogs.sussex.ac.uk\/learning-matters\/files\/2021\/02\/Junko-Winch-200x266.jpg 200w\" sizes=\"(max-width: 260px) 100vw, 260px\" \/><figcaption>Junko Winch is a Lecturer of Japanese at the School of Media, Arts and Humanities at the University of Sussex.<\/figcaption><\/figure><\/div>\n\n\n\n<p>Module Evaluation Questionnaires (MEQs) are an important source of student feedback on teaching and learning. They are also often relied upon as evidence cases for promotion for tutors. However, in their current form they suffer from low response rates reducing their usefulness and validity. Local practices have grown to address the need for feedback, but they are inconsistent year on year or across the University. 
Existing research on teaching evaluations indicates that they are subject to several sources of bias and suggests careful design of MEQs.<\/p>\n\n\n\n<h2>The MEQ project<\/h2>\n\n\n\n<p>The MEQ project was undertaken to inform the University of Sussex&#8217;s policy and practice. The output was presented to the University\u2019s Surveys Group to inform the strategic direction of the University.<\/p>\n\n\n\n<h2>Method<\/h2>\n\n\n\n<h3><strong>Research Question 1: Purpose of MEQs<\/strong><br><strong>Method:<\/strong> Literature Review<\/h3>\n\n\n\n<p>A literature search was conducted using a total of 79 journal publications from the fields of psychology, education, business management, and sociology. The publication dates ranged from 1978 to 2020. The keywords used in the search were <em>student evaluation of teaching (SETs)<\/em>, <em>validity<\/em>, <em>assessment<\/em>, and <em>evaluation<\/em>. All selected publications underwent content analysis.<\/p>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h3><strong>Research Question 2: Analysis of the MEQ\u2019s Seven Statements<\/strong><br><strong>Method:<\/strong> Qualitative<\/h3>\n\n\n\n<p>This stage involved a qualitative analysis of the seven core statements in the Module Evaluation Questionnaire (MEQ). The analysis focused on the following aspects:<\/p>\n\n\n\n<ol><li>The purpose of the University MEQ<\/li><li>Identified weaknesses<\/li><li>Suggestions for improvement<\/li><\/ol>\n\n\n\n<h2>Literature review<\/h2>\n\n\n\n<p>The literature review revealed tutor-related and student-related biases in MEQs. Bias is a source of unreliability, which in turn threatens validity. 
Validity and reliability are defined in various terms, but for the purpose of this report, validity is defined as \u201cthe general term most often used by researchers to judge quality or merit\u201d (Gliner et al., <a href=\"https:\/\/www.routledge.com\/Research-Methods-in-Applied-Settings-An-Integrated-Approach-to-Design-and-Analysis-Third-Edition\/Gliner-Morgan-Leech\/p\/book\/9781138852976?srsltid=AfmBOoowwaH-TwfAQA_aCyiMUFNQWgIXDAPBxtSk9gkI6tmmMIxhmWod\" data-type=\"URL\" data-id=\"https:\/\/www.routledge.com\/Research-Methods-in-Applied-Settings-An-Integrated-Approach-to-Design-and-Analysis-Third-Edition\/Gliner-Morgan-Leech\/p\/book\/9781138852976?srsltid=AfmBOoowwaH-TwfAQA_aCyiMUFNQWgIXDAPBxtSk9gkI6tmmMIxhmWod\">2009<\/a>, 102) and reliability as \u201cconsistency with which we measure something\u201d (Robson, <a href=\"https:\/\/archive.org\/details\/realworldresearc0000robs\" data-type=\"URL\" data-id=\"https:\/\/archive.org\/details\/realworldresearc0000robs\">2002<\/a>, 101).<\/p>\n\n\n\n<h3>Tutor-related biases<\/h3>\n\n\n\n<p>The literature highlighted that tutor-related biases must be taken into account:<\/p>\n\n\n\n<ul><li>Students who have learned more in class will receive higher grades and will naturally rate the professor more highly because of the knowledge they have gained on the course (Patrick, <a href=\"https:\/\/doi.org\/10.1080\/02602930903308258\" data-type=\"URL\" data-id=\"https:\/\/doi.org\/10.1080\/02602930903308258\">2011<\/a>, 242).<\/li><li>Tutors who give higher grades receive better evaluations (Carrell and West, <a href=\"https:\/\/doi.org\/10.1086\/653808\" data-type=\"URL\" data-id=\"https:\/\/doi.org\/10.1086\/653808\">2010<\/a>).<\/li><\/ul>\n\n\n\n<h3>Student-related biases<\/h3>\n\n\n\n<p>Student biases may arise from a wide range of factors, including the weather, time of day, personality traits, gender, racial stereotypes, the tutor\u2019s physical attractiveness, student anxiety, and more.<\/p>\n\n\n\n<h2>The 
findings and recommendations<\/h2>\n\n\n\n<h3>1. The purpose of MEQs<\/h3>\n\n\n\n<p>MEQs have three purposes: institutional, teaching and academic promotion. To help reduce the bias effects outlined in the literature, full MEQs and other teaching-related data should be provided to promotion panels to avoid the cherry-picking of comments or data by applicants. For example, quantitative data such as the class average attendance rate and the average, minimum and maximum marks, together with an analysis of the qualitative responses, would help build a more accurate overall picture of the class.<\/p>\n\n\n\n<h3>2. Analysis of MEQs<\/h3>\n\n\n\n<p>The student biases mentioned in the literature make it difficult to rely on MEQs as a sole instrument. Furthermore, the current MEQ statements may confuse students due to their content and wording.<\/p>\n\n\n\n<p>The following points are suggested:<\/p>\n\n\n\n<ul><li>The purpose and goal of the questionnaire should be clearly stated. Stakeholders\u2019 purposes should be taken into account when designing the MEQs to ensure that the intended MEQ purpose is achieved.<\/li><li>Some statements ask two questions in one. Students may not necessarily answer both, which affects validity.<\/li><li>Consideration should be given to words such as \u2018satisfied\u2019, which might have different connotations depending on cultures and individuals.<\/li><\/ul>\n\n\n\n<h3>Recommendations<\/h3>\n\n\n\n<p>Carefully developed MEQs have the potential to offer valuable insights to all stakeholders. The primary recommendation is to undertake a staff-student partnership to agree the purpose of the MEQs and co-design a revised instrument that meets that stated purpose.<\/p>\n\n\n\n<h2>Reflections<\/h2>\n\n\n\n<p>I engaged with this project as part of my Continuing Professional Development and appreciate the various opportunities it has given me. For example, I was given the opportunity to write this blog. 
Furthermore, giving a presentation to the University Surveys Group reminded me of my doctoral viva, as the Group included the Pro Vice Chancellor for Education and Students, the Associate Dean of the Business School and the Deputy Pro Vice Chancellor of Student Experience. When answering their questions, I learned how difficult it is to meet the needs of different perspectives and cultures. For example, I was asked a question from a quality assurance perspective, which was unexpected, as I wrote the report from a teaching staff perspective. The Surveys Group also included the Students\u2019 Experience team, which made me consider yet another perspective on MEQs. Furthermore, working with my colleague from the Business School made me realise the cultural differences between that department\/academic discipline and my own (the School of Media, Arts and Humanities). Looking back, this was a very valuable experience for me and, I hope, for the institution.<\/p>\n\n\n\n<h2>References<\/h2>\n\n\n\n<p>Carrell, S.E. and West, J.E. (2010) \u2018Does professor quality matter? Evidence from random assignment of students to professors\u2019, <em>Journal of Political Economy<\/em>, 118, pp.409\u2013432. <a href=\"https:\/\/doi.org\/10.1086\/653808\">https:\/\/doi.org\/10.1086\/653808<\/a><\/p>\n\n\n\n<p>Gliner, J.A., Morgan, G.A. and Leech, N.L. (2009) <em>Research methods in applied settings: An integrated approach to design and analysis<\/em>. New York: Routledge.<\/p>\n\n\n\n<p>Patrick, C.L. (2011) \u2018Student evaluations of teaching: effects of the Big Five personality traits, grades and the validity hypothesis\u2019, <em>Assessment &amp; Evaluation in Higher Education<\/em>, 36(2), pp.239\u2013249. <a href=\"https:\/\/doi.org\/10.1080\/02602930903308258\">https:\/\/doi.org\/10.1080\/02602930903308258<\/a><\/p>\n\n\n\n<p>Robson, C. (2002) <em>Real world research<\/em>. 2nd edn. 
Oxford: Blackwell.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>by Junko Winch Module Evaluation Questionnaires (MEQs) are an important source of student feedback on teaching and learning. They are also often relied upon as evidence cases for promotion for tutors. However, in their current form they suffer from low<span class=\"ellipsis\">&hellip;<\/span><\/p>\n<div class=\"read-more\"><a href=\"https:\/\/blogs.sussex.ac.uk\/learning-matters\/2021\/02\/12\/meq\/\">Read more &#8250;<\/a><\/div>\n<p><!-- end of .read-more --><\/p>\n","protected":false},"author":172,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"spay_email":""},"categories":[71757],"tags":[123700,123702],"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/posts\/112"}],"collection":[{"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/users\/172"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/comments?post=112"}],"version-history":[{"count":4,"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/posts\/112\/revisions"}],"predecessor-version":[{"id":1317,"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/posts\/112\/revisions\/1317"}],"wp:attachment":[{"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/media?parent=112"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/categories?post=112"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.sussex.ac.uk\/learning-matters\/wp-json\/wp\/v2\/tags?post=112"}],"curies"
:[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}