Instructors and students alike often express a certain "distaste" for multiple-choice (MC) exams. While MC exams have the advantage of being easy to grade, the items themselves are frequently described as tricky and unnecessarily difficult. Students may feel that the exam did not test their knowledge of the course material. Instructors complain that items tend to be "low-level" and test recall of material rather than critical thinking. Is it possible to construct a fair, yet challenging MC exam?
Absolutely! The key to an effective MC exam is to construct high-level items that actually reflect the course material. High-level items are those that tap cognitive skills from the top categories of Bloom's (1956) classic taxonomy. Bloom described a six-layer hierarchy of cognitive skills, arrayed in the following order: knowledge, comprehension, application, analysis, synthesis, and evaluation. At the bottom of the hierarchy, questions require students simply to define, recall, or identify information. This is not a complex cognitive activity and leads to only a surface understanding of the material. If we want students to engage in critical thinking, then questions must be structured to tap more complex processing, e.g., comparing sources of information, extracting the appropriate information, formulating a new approach, and evaluating how well it worked. This can be done on an MC exam, but care must be taken when constructing the items. Good examples of questions at each level of the hierarchy can be found in the sources listed at the end of this article.
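The hierarchy above can be sketched as a simple ordered list. In the sketch below, the sample question stems are our own illustrative assumptions, not Bloom's original wording, and the `is_higher_order` helper is hypothetical:

```python
# Bloom's six levels, ordered bottom to top. The question stems are
# illustrative assumptions, not part of Bloom's taxonomy itself.
BLOOM_LEVELS = [
    ("knowledge",     "Which of the following is the definition of X?"),
    ("comprehension", "Which statement best summarizes the passage?"),
    ("application",   "Given the data below, which formula applies?"),
    ("analysis",      "Which two sources contradict one another?"),
    ("synthesis",     "Which plan best combines approaches A and B?"),
    ("evaluation",    "Which solution best meets the stated criteria?"),
]

def is_higher_order(level, threshold="comprehension"):
    """Return True if `level` sits above `threshold` in the hierarchy."""
    order = [name for name, _ in BLOOM_LEVELS]
    return order.index(level) > order.index(threshold)
```

Under a sketch like this, an item writer could tag each draft question with its intended level and flag any exam draft dominated by items at or below the chosen threshold.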
A good way to begin writing your exam is by constructing an exam blueprint. This is simply a cross-tabulated matrix or spreadsheet that indicates, for each topic to be covered on your exam, the number of items devoted to each level of Bloom's taxonomy. For example, let's say that my final exam is supposed to cover chapters 10, 11, and 12 of the text plus lecture material that may not appear in the text. My exam blueprint might look something like this:
| Level | Chapter 10 | Chapter 11 | Chapter 12 | Lecture |
|-------|------------|------------|------------|---------|
This exam consists of 67 questions, fairly evenly distributed across the topics, but with slightly less emphasis on the lecture material. We can also see that the emphasis in chapter 10 is more toward the lower end of the taxonomy, while chapters 11 and 12 focus on higher-level material. This scheme may, in fact, reflect the orientation of the chapters (10 is more directly tied to content, while 11 and 12 discuss broader implications) and the kind of knowledge that I, as the instructor, expect the student to master. An exam blueprint helps the instructor identify where higher-level items are needed and serves to validate the learning objectives for the course. It is a good idea to share the exam blueprint with your students: it will help them assess the kind of knowledge they need to acquire, and it further reinforces the learning objectives. If I want my students to be critical thinkers, my exams should be designed to assess this ability.
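A blueprint like this is easy to keep as a small data structure and to check against your targets. In the sketch below, the item counts are illustrative assumptions chosen only to match the totals described in the text (67 items, lighter lecture coverage, chapter 10 skewed toward lower levels); they are not the original table's values:

```python
# A minimal exam blueprint sketch: blueprint[level][topic] = item count.
# The counts are illustrative assumptions, not the article's actual table.
TOPICS = ["Chapter 10", "Chapter 11", "Chapter 12", "Lecture"]

blueprint = {
    "knowledge":     {"Chapter 10": 5, "Chapter 11": 2, "Chapter 12": 2, "Lecture": 2},
    "comprehension": {"Chapter 10": 5, "Chapter 11": 3, "Chapter 12": 2, "Lecture": 2},
    "application":   {"Chapter 10": 3, "Chapter 11": 3, "Chapter 12": 3, "Lecture": 2},
    "analysis":      {"Chapter 10": 2, "Chapter 11": 4, "Chapter 12": 4, "Lecture": 2},
    "synthesis":     {"Chapter 10": 1, "Chapter 11": 3, "Chapter 12": 4, "Lecture": 2},
    "evaluation":    {"Chapter 10": 1, "Chapter 11": 3, "Chapter 12": 4, "Lecture": 3},
}

def topic_totals(bp):
    """Column sums: total items planned for each topic."""
    return {t: sum(bp[level][t] for level in bp) for t in TOPICS}

def level_totals(bp):
    """Row sums: total items planned at each Bloom level."""
    return {level: sum(cells.values()) for level, cells in bp.items()}

def exam_total(bp):
    """Overall number of items on the exam."""
    return sum(level_totals(bp).values())
```

Computing the row and column totals this way makes it immediately visible where higher-level items are missing, before any questions are written.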
Are there any guidelines for writing items? There are several good sources on the web (see the list at the end of this article), but here are a few tips. Note: these are general guidelines and do not necessarily apply to all questions at all times. For example, there are occasions when "none of the above" is a reasonable alternative to include. However, you should not make "none of the above" an alternative simply because you need something for option "e" and cannot think of anything else. Another common pitfall is phrasing an item in the negative:
Which of the following is not north of Detroit, Michigan?
a. Windsor, Ontario
b. London, Ontario
c. Buffalo, New York
d. Halifax, Nova Scotia
e. Vancouver, British Columbia
This question (which is at the knowledge level) is made more difficult by phrasing it in the negative ("not north"): once again, we have an increase in cognitive load that is unrelated to the course content. If you want to know which city is south of Detroit, ask it that way.
A related problem is the undefined superlative:

Which of the following is the best car?

The answer depends on how we define the word "best". Do you mean "most economical", "safest", "most luxurious", or perhaps something else? Be clear about what you are asking.
Multiple-choice exams can be every bit as challenging and thought-provoking as essay exams, but it takes time to construct a really good test. When students complain that items are "picky" or that the instructor was just trying to trick them, this usually reflects a test that was not well constructed. Be open to these comments and be willing to examine your items after the exam has been administered. We are usually very open to constructive criticism in our scholarly work, and we should take the same stance when it comes to our pedagogy.