In last week’s blog post Pete Sparkes shared some different approaches to help Moodle users rapidly create quizzes for use in course sites, overcoming some of the perceived usability challenges associated with the Quiz activity.
In this article I want to focus on the effectiveness of the questions within the quizzes we create, an aspect of the process that, as anyone who has previously created an online quiz will appreciate, is challenging and anything but rapid.
An online quiz – be that in Moodle, another learning management system or assessment platform – is typically a form of objective test, that is to say a test which consists of questions to which there is a specific correct or ‘best’ answer. Critics of objective testing argue that it is difficult to design questions which assess at appropriate levels. While it is certainly the case that it is easier to write questions which primarily test factual recall, it is also quite possible (though admittedly harder) to design questions which test comprehension (requiring some form of interpretation/comparison), application (requiring application of knowledge, typically in a sequence of questions or through the computation of answers) or analysis (requiring the distinction between inferences and facts).
I have long argued, however, that online testing has value, particularly as a formative component of an assessment strategy: automated marking brings efficiencies, question/test scores can be subjected to statistical analysis, and targeted feedback based on defined answers is useful to both lecturer and learner.
Anatomy of a Question
Before looking at some of the dos and don’ts of effective online question design it’s helpful to understand the anatomy of a question. An objective test question (e.g. a multiple choice question) generally has the following basic structure:
- Stem – the question wording, often including a lead-in statement
- Choices/distracters – options available to select from
- Key – the correct answer(s)
- Feedback – the explanation presented to the learner (important)
There are of course more complex variations of this format and questions can be further augmented with inclusion of scenarios (vignettes), images, data, graphs or other relevant ancillary material.
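The basic anatomy above can be sketched as a simple data structure. This is purely my own illustration in Python (the class, field names and sample question are invented, not part of any Moodle API), but it makes the stem/choices/key/feedback relationship concrete:

```python
from dataclasses import dataclass, field

@dataclass
class QuizQuestion:
    """One objective-test item: stem, choices, key and per-choice feedback."""
    stem: str                    # the question wording, often with a lead-in statement
    choices: list[str]           # the options available to select from (key + distracters)
    key: set[str] = field(default_factory=set)           # the correct answer(s)
    feedback: dict[str, str] = field(default_factory=dict)  # choice -> explanation

    def is_correct(self, answer: str) -> bool:
        return answer in self.key

# A sample item, with feedback written for wrong answers as well as the right one.
q = QuizQuestion(
    stem="Which HTTP status code indicates 'Not Found'?",
    choices=["200", "301", "404", "500"],
    key={"404"},
    feedback={
        "404": "Correct: the server cannot find the requested resource.",
        "200": "200 signals success, not a missing resource.",
    },
)
```

Note that feedback is attached per choice rather than per question, which anticipates the point made later: an explanation of why each distracter is wrong is far more useful than the correct answer alone.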
Now we have our basic structure in mind what practical steps can you take to improve the effectiveness of your questions?
- Ensure content validity and alignment with learning outcomes – consider the purpose of the question, what you are trying to test.
- Frame the problem (the question) clearly within the stem – be concise and avoid jargon.
- Only include plausible distracters – if the incorrect choices (the distracters) are easily discounted then the difficulty of the question is diminished and the likelihood of answering the question correctly by guessing greatly increases (a concept known as the question guess factor).
- Focus on the general rather than the specific – avoid asking questions which emphasise small details e.g. Who invented…?
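The guess factor mentioned above is just the chance of answering correctly at random, and it rises sharply as distracters become easy to discount. A quick back-of-envelope sketch (my own illustration; the function name and the idea of counting "discountable" distracters are assumptions for the example):

```python
def guess_factor(num_choices: int, implausible: int = 0) -> float:
    """Probability of guessing correctly when `implausible` distracters
    can be discounted outright, leaving fewer live options to pick from."""
    live = num_choices - implausible
    if live < 1:
        raise ValueError("at least one option must remain")
    return 1 / live

# Four choices, all plausible: a 25% chance of guessing correctly.
print(guess_factor(4))                  # 0.25
# Two distracters easily discounted: the odds jump to 50%.
print(guess_factor(4, implausible=2))   # 0.5
```

In other words, a four-option question with two weak distracters is effectively a true/false question as far as guessing is concerned.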
Having started with good practice (positive before a negative) let’s look at some of the common mistakes often made in the writing of online questions:
- The question is poorly framed, the stem contains superfluous text and/or jargon.
- Information within the stem is repeated within choices
- Choices are not mutually exclusive yet only one correct answer permitted
- Grammatical clue within the question stem indicates correct answer or aids in the discounting of other choices
- The most detailed choice (or, in some tests, consistently the shortest choice) is the correct answer
- Information within stem of one question provides answer to a subsequent question
“Aspects of assessment design including clarity of instructions, question wording and question weighting…affected student understanding and on-task processes. Where questions were found to be vague or ambiguous, students reported spending disproportionate amounts of time trying to interpret the meaning or trying to make visual judgements on the differences in spelling between distracters.”
Walker, D., Topping, K. & Rodrigues, S. (2008)
Returning to Rapid Design
There’s clearly much to consider when writing online quiz questions – particularly if they are to be in any way effective. This takes time, but to return to our previous post on the rapid production of quizzes, one technique that can make the question writing process quicker is Item Modelling.
Item Modelling is an approach for quickly developing a large bank of questions. The technique involves taking a high-quality, high-performing question, identifying keywords or figures within the stem and then varying these (e.g. changing the age of a patient, quantity of materials or drug dosage level). Ideally one of the existing choices will fit the new stem, but it may be necessary to modify the question wording and/or the choices to ensure an appropriate fit.
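A minimal sketch of Item Modelling, assuming a numeric question whose key and distracters can be recomputed from the varied figures (the drug-dose scenario, template and distracter rules here are invented for illustration, not taken from a real item bank):

```python
import random

# Parent item: a high-performing question with its variable figures identified.
TEMPLATE = ("A patient weighing {weight} kg is prescribed a drug at "
            "{dose} mg per kg of body weight. What total dose is required?")

def make_variant(weight: int, dose: int) -> dict:
    """Clone the parent item with new figures, recomputing the key and
    the distracters so they still fit the new stem."""
    key = weight * dose
    # Distracters modelled on plausible slips: adding instead of multiplying,
    # halving, doubling.
    distracters = [weight + dose, key // 2, key * 2]
    return {
        "stem": TEMPLATE.format(weight=weight, dose=dose),
        "choices": sorted(distracters + [key]),
        "key": key,
    }

# Generate a small bank of variants from the one parent item.
bank = [make_variant(random.choice([50, 60, 70, 80]),
                     random.choice([2, 4, 5])) for _ in range(10)]
```

The generated variants still need the same scrutiny as hand-written items – in particular, checking that the recomputed distracters remain plausible for the new figures.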
Whatever approach you take to question writing my advice is to seek peer feedback wherever possible before deploying and to review the performance of questions after their use with students.
Finally a word on Feedback
One of the key features of online testing is the ability to provide instant feedback to students; however, this element is often omitted, even where the test is being used formatively. It’s arguable that a formative assessment without feedback is not formative at all. When writing feedback it’s good practice to avoid simply giving the correct answer (though admittedly this is quick). It’s far more useful to explain why an answer is incorrect and also why an answer is correct (for those students who got the question right by guessing!).
You might also take the opportunity to include links to other useful learning resources. In my own experience use of online quizzes by students tends to be higher when questions include feedback. If you’re going to invest the time in writing valid, concise, clearly worded questions which are pitched at the correct level then you’re clearly going to want students to engage with them.
If you’d like further information or guidance on how to design questions or structure quizzes for use in Moodle (Study Direct) then staff at Sussex can contact their School Learning Technologist.
Walker, D., Topping, K. & Rodrigues, S. (2008). Student reflections on formative e-assessment: expectations and perceptions. Learning Media and Technology, 33(3), 221-234.