Winter 2019, Volume 29, No. 2
- Abstracts and Updates, by George Gray- Dr. Gray first directs us to an introductory paper that provides an overview of standardized testing written for non-psychometricians. Next, he summarizes articles covering a varied assortment of topics, including the use of data analysis to detect cheating on exams, evaluating alignment in support of test validity, issues in equating small-volume exams, the use of augmented subscores to enhance value in subscore reporting, the relative cost-effectiveness of automated item generation versus manual item writing, and considerations in selecting appropriate scales for a task analysis survey. Three articles on standard setting are reviewed: one on the tendency of standard setters to regress toward moderate difficulty in Angoff ratings, one related to the precision/rounding of Angoff ratings, and one focused on the proficiency of the experts serving as standard setters. Two publications describe multistage testing (MST), a module-based form of adaptive testing. The first describes a top-down methodology for developing multistage exams, and the second compares routing strategies used in MST. Finally, two articles from the context of medical licensing are summarized. One examines the impact of time constraints on test performance; the other describes the procedures followed in redesigning the test blueprint for a national licensure examination in osteopathic medicine.
- Legal Beat, by Dale Atkinson- Dale Atkinson presents a case that illustrates how test preparation services, if short on scruples, can “threaten the security of the examination program and undermine the integrity of the licensure process.” A test preparation service deliberately acquired items from a licensure examination and provided its clients with numerous items, including translations of the items into Chinese, along with the answers. The organization admitted to harvesting and disseminating the items, yet attempted to deny wrongdoing. The exam owners sued on the basis of copyright infringement and breach of contract. This case demonstrates the importance of establishing strong test security practices that include copyright protection and confidentiality agreements with examinees. It also serves as a reminder that not everyone respects the integrity of the licensure process or the importance of demonstrating a level of competence prior to practicing a regulated profession. High-stakes examinations, including those used in licensure and certification, require diligent protection.
- What Makes a Difference for Candidates Taking Computer-based Tests? Issues Surrounding Device Comparability and User Interface Modifications, by Paula Lehane- Paula Lehane tackles the topic of online testing, reviewing the literature related to differences in test-taking experiences and outcomes that result from allowing the use of non-standardized devices and interfaces. Does it matter whether a candidate takes an examination on a smartphone or a desktop computer? What factors affect the comparability of scores on exams taken on different devices? This article presents important considerations not only for online testing but for anyone pondering the electronic administration of examinations with non-standardized devices and/or software.
- A Review of Certain Biases in Testing Processes, by Peter Mackey and Chris Wiese- Involvement of subject matter experts (SMEs) is crucial to the quality and validity of a credentialing examination program. We rely heavily on the judgments of such experts in assessing candidates for professional practice. In the final article in this issue, Peter Mackey and Chris Wiese remind us that SMEs, as human beings, are susceptible to bias. This article addresses areas in the examination process where unintentional bias may occur, particularly in standard setting. Findings from a survey of 20 credentialing organizations are presented, summarizing varied experiences with SME bias and suggestions for avoiding and/or minimizing its influence. Even for those who do not work directly with SMEs, this article can enhance understanding of why validity is described as multifaceted and why establishing validity requires more than mechanically following prescribed steps.
Design and composition of CLEAR Exam Review is underwritten by Prometric, a leading provider of technology-enabled testing and assessment solutions to many of the world’s most recognized licensing and certification organizations. Learn more at Prometric.com.