Frequently Asked Questions About Licensing Exams
CLEAR Exam Review
Eric Werner, M.A.
Question: From time to time our multiple-choice licensing test has been criticized for not having an established grade-level readability index. Some of those who train candidates for the vocational subject matter tested by our exam have argued that because a high school education is not required to enter training, the readability of our exam should be well below the high school graduation level. Our item writers are carefully instructed to produce test questions that are easy to read and understand. We want to test subject matter, not reading comprehension. Are we on thin ice without having a readability index to show to those who criticize us?
Answer: I do not think that you are on either legal or psychometric thin ice for not regularly computing readability, but you definitely should document the procedures you use to assure that the language of your test is well within the reading ability of your candidates. Computing a readability index is not likely to do any harm. However, before you decide whether to measure your test's readability, consider the issues discussed below. There are several alternative approaches to measuring the readability (or so-called reading ease) of a test. The SMOG, FOG, and Flesch formulas are among the better known (a few references follow). Although these differ in the extent to which they consider such text characteristics as sentence length, sentence complexity, and word complexity (e.g., syllables per word), they have been shown to yield indices that correlate highly with one another. Some of the formulas are more complex than others, but computer programs can handle most of the computational work.
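For illustration, the Flesch Reading Ease computation can be sketched as follows. This is a minimal sketch, not production software: it assumes a rough vowel-group heuristic for counting syllables rather than a dictionary, so its counts will not always match a careful hand count.

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count vowel groups, with a silent-'e' adjustment.
    This heuristic is an assumption; it overcounts or undercounts some words."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text: str) -> float:
    """Standard Flesch Reading Ease:
    206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words).
    Higher scores mean easier text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short sentences of one-syllable words score very high (above 100), while long polysyllabic sentences can score far lower, which is the behavior the formula is designed to capture.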
My concerns with applying readability formulas to multiple-choice tests are that they (a) can be awkward to apply; (b) do not address the complexity and comprehensibility of the ideas behind the words analyzed; (c) do not, without special modification, give appropriate consideration to technical terms that licensees must know; and (d) express readability results in ambiguous terms.
First, consider awkwardness. To use a computer program that applies the Flesch formula to a question whose stem is a partial sentence completed by each option, you will usually have to use a word processor to combine the question stem and the correct response option into one sentence. This procedure gives you a sentence to analyze, but it disregards the remaining options even though they affect the readability of the overall question. If you include the distractors by repeating the procedure for each of them, your readability index will be overly influenced by the question stem, which will have been evaluated four or five times (depending on the number of response options). Items in which the stem is a complete question and the options are one- or two-word choices also create obvious problems. Most of these problems can be overcome or handled in ways that minimize their impact on your results. My point is not that computerized readability calculations will not work, but that they cannot be applied as efficiently to a multiple-choice test as to a paragraph of text.
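The combine-and-score procedure just described might be sketched like this. The Gunning FOG stand-in scorer, the vowel-group heuristic for identifying complex words, and the function names and example item are all illustrative assumptions, not a prescribed method; note how the stem's words enter the count once per option, producing the over-weighting discussed above.

```python
import re

def grade_level(text: str) -> float:
    """Stand-in readability score: Gunning FOG index,
    0.4 * (average words per sentence + percent of complex words).
    'Complex' here means three or more vowel groups -- a rough
    stand-in for three or more syllables."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()] or [text]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words
                     if len(re.findall(r"[aeiouy]+", w.lower())) >= 3]
    return 0.4 * (len(words) / len(sentences)
                  + 100 * len(complex_words) / len(words))

def item_readability(stem: str, options: list[str]) -> float:
    """Join the stem with each option in turn and average the scores.
    The stem's words are re-counted for every option, so the stem
    dominates the result -- the distortion described in the text."""
    scores = [grade_level(f"{stem} {opt}.") for opt in options]
    return sum(scores) / len(scores)
```

A usage example: `item_readability("The capital of France is", ["Paris", "Lyon", "Nice", "Marseille"])` scores four assembled sentences, each repeating the five stem words, and averages them.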
Second, readability formulas ignore the ideas and concepts behind the words. Consider the sentence "I think; therefore I am." The SMOG index estimates the reading level of this sentence at the third grade. There is clearly more to the comprehensibility of some text than is revealed by word and syllable counts alone.
Third, some polysyllabic technical terms must be known by most licensed persons, regardless of the amount of education required for professional entry. The SMOG and FOG formulas analyze the word electroencephalogram as an eight-syllable word that can greatly increase the grade-level indices they produce. However, few nursing candidates should or would have trouble reading and understanding this word. Special modifications of readability assessment procedures are necessary to accommodate required technical terms so that the formulas do not give a distorted picture of reading difficulty for the candidate population.
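One possible modification along these lines is to exclude an explicit list of required technical terms before counting polysyllables, as in this sketch of the SMOG computation. The whitelist and the vowel-group syllable heuristic are assumptions for illustration, and the published formula presumes a 30-sentence sample; the 30/sentences factor rescales shorter passages.

```python
import math
import re

# Hypothetical whitelist of technical terms all candidates must know.
REQUIRED_TERMS = {"electroencephalogram"}

def smog_grade(text: str, whitelist: set[str] = frozenset()) -> float:
    """SMOG grade: 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291.
    Words on the whitelist are never counted as polysyllabic, so required
    technical vocabulary does not inflate the index.  'Polysyllabic' is
    approximated as three or more vowel groups."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = [w.lower() for w in re.findall(r"[A-Za-z']+", text)]
    poly = [w for w in words
            if w not in whitelist
            and len(re.findall(r"[aeiouy]+", w)) >= 3]
    return 1.0430 * math.sqrt(len(poly) * 30 / len(sentences)) + 3.1291
```

With the whitelist applied, a sentence such as "The electroencephalogram showed normal waves." contains no countable polysyllables and the index bottoms out near 3.13; without it, the single eight-syllable term pushes the same sentence several grade levels higher.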
Finally, who in these times can explain to us in terms that are unambiguous and useful to our practical purposes the meaning of, say, "twelfth-grade reading level"?
To help assure readable tests, break longer sentences into shorter ones, use simple words in place of complex words, and avoid unnecessary phrases. By carefully reviewing and editing your questions before they are administered, you will identify and solve many problems of this kind.
© 2002 Council on Licensure, Enforcement and Regulation