Reference Guide for Auditing a Credentialing Examination Program
by Barbara Showers
"Audit" means "to examine with intent to verify," according to Webster’s. It also means, "a methodical examination and review."
In the context of a credentialing examination program, there are four general purposes of an audit:
- to verify compliance with generally accepted professional testing standards
- to verify compliance with the goals of the test user, such as measuring competence to protect the public
- to verify whether the program is accomplishing specific goals and objectives specified in a contract, request for proposals, or an organizational mission statement
- to verify whether appropriate inferences are made from test results.
This guide can be used to conduct a self-evaluation by examination program staff, or to guide an evaluation by an external audit team that is neutral to the interests of the examination provider. The external audit team can be hired by the test provider or by an external body that may contract for examination services, such as an association of regulatory boards.
Verification is best established by developing audit questions, collecting and evaluating specific written documentation that responds to the audit questions, and summarizing the findings in a written report. An office site visit may be needed to verify resources and operations, and a meeting with the provider after initial review of documents may be helpful to clarify questions that arise. In addition, a visit to a test administration site might be appropriate.
Persons conducting the audit should be objective and knowledgeable about psychometrics and credentialing testing, and should be prepared to spend significant time evaluating the information provided and writing the report. The scope of the audit and the timeline of activities should be agreed upon in advance. The audit may take weeks, months, or even a year, depending on the size and complexity of the testing program.
The most commonly accepted professional testing standards against which a high-stakes testing program can be measured are the Standards for Educational and Psychological Testing of the American Psychological Association, the American Educational Research Association, and the National Council on Measurement in Education. (See Appendix A, Bibliography, for additional sources of information regarding professional testing standards.)
The sponsor of the audit should consider in advance how the audit report will be distributed and used. A procedure for resolution of any negative findings should be determined.
The following pages detail questions that should be asked by organizations wishing to conduct an audit of a credentialing examination program. The purpose of this publication is to provide a framework and suggestions for planning and conducting the audit process.
The questions were compiled by a subcommittee of the Examination Resources and Advisory Committee of the Council on Licensure, Enforcement and Regulation (CLEAR). Committee participants were:
Barbara Showers, Ph.D., Chair
Director, Office of Examinations, Wisconsin Department of Regulation and Licensing
James D. Blum, CPA, Ph.D.
Director, Examinations Division, American Institute of Certified Public Accountants
F. Jay Breyer, Ph.D.
Managing Principal, The Chauncey Group International
Tadas Dabsys, B.A.
Manager of Operations, PSI Examination Services
Chuck Friedman, Ph.D.
Assistant Vice President, ACT, Inc.
Sandra Greenberg, Ph.D.
Vice President, Public Service Activities, Professional Examination Service
Steven Nettles, Ed.D.
Vice President, Research and Development, Applied Measurement Professionals, Inc.
Rae Ramsdell, M.A.
Examination Specialist, Michigan Department of Consumer and Industry Services
Kara Schmitt, Ph.D.
Director of Testing Services, Michigan Department of Consumer and Industry Services
Lee Schroeder, Ph.D.
President, Schroeder Measurement Technologies
Rina Sjoland, M.A.
Assistant Vice President, ACT, Inc.
Anthony Zara, Ph.D.
Senior Director of Testing and Research Services, National Council of State Boards of Nursing
Sherry Young, M.A.
Senior Manager, Cooperative Personnel Services
Jim Zukowski, Ed.D.
Director, Professional Licensing and Certification Division, Texas Department of Health
HOW TO STRUCTURE THE AUDIT PROCESS
The following issues and decisions should be considered prior to the beginning of the audit. For an audit with external auditors, an engagement letter, or letter of understanding, is helpful to document in advance the scope and timeline of the audit, and the responsibilities and expectations of the parties. Elements of an engagement letter are suggested in Appendix B.
A statement of compliance should be signed by the auditors. This statement acknowledges what types of materials the auditor is obligated not to disclose (such as secure examination questions), and it verifies that the auditor does not have conflicts of interest, such as working for a review course provider.
A letter of representation may also be obtained from the organization being audited, in which the organization acknowledges its responsibility for the testing program, and agrees to make available the necessary documents for the audit, to accurately disclose all relevant information known to it, and not to interfere with the audit.
1. What is the purpose of the audit?
- Verify test meets professional testing standards?
- Verify test meets user goals?
- Verify contract performance?
- Verify inferences made from test results?
2. Who should the auditors be?
- Important characteristics of the audit team include:
- Independent of the audited organization
- Knowledgeable of the basic audit process
- Knowledgeable of psychometrics, or have access to independent psychometric assistance
- Multiple auditors to aid objectivity, reliability and breadth of perspective
3. What should be expected of the auditors?
- Confidentiality regarding secure information
- Objective review of the information presented
- Audit limited to factual findings
- Recommendations for improvement stated separately from findings
- Written formal report provided
4. What types of information should be made available to the auditors?
The following information should be available in written documentation. Verbal statements without written verification indicate a weakness in the program:
- Job analysis report
- Test specifications
- Forms used in performance or oral examinations
- Item writing training materials and any written procedures for item writing and item review
- List of references used in item writing
- Description of type of scoring used (scales, weights, norm- or criterion-referenced)
- Rationale for type of scoring used
- Access to written tests, provided to auditors under secure conditions
- Item analyses
- Report of how passing scores were determined; actual passing scores for individual forms
- Reliability statistics; validity reports
- Test equating reports (design, methodology)
- Bias sensitivity review procedures
- Sample score reports, school reports, any reports published
- Candidate appeal process
- Candidate handbook describing the responsibility of the certifying organization, examination content information, exam construction and purpose, dates, fees, results reporting, and due process procedures.
- Standardized test administration procedures and instructions
- Security policies and procedures for handling test materials and for test administration
- Proctor selection criteria and training materials
- Pass/fail statistics
- Examination administration reports
- Incident reports of problems at testing sites
- Test provider bylaws, organizational chart, budget/revenue accounting statements, annual reports
- Test provider contract and request for proposals (RFP), if applicable
5. What are the audit procedure considerations?
- Pre-audit: Meeting to establish audit scope and questions, timetable, responsibilities, costs, letter of engagement and letter of representation.
- Audit: Is a site visit necessary? Is a meeting with the test provider to answer questions necessary?
- Report: Auditors must draft a written report of findings. Auditors will need to confer and agree via meetings, e-mail and/or conference calls. The report should be cross-checked by multiple auditors before release. Will the draft report be reviewed with test provider before a final report is issued? What will be the method of test provider response to audit findings? How will differences between auditors and test provider be resolved? What will be the organization’s action plan and timeline to address the findings of the audit?
6. How will the findings be distributed and used?
- Who will see the auditors’ report?
- Who will receive the management report?
- Who will have access to the auditors’ workpapers?
- Will the documents be public?
- How will negative findings be resolved?
BACKGROUND INFORMATION ABOUT THE TEST PROVIDER
Credentialing examinations are often provided through the efforts of more than one organization. For example, a credentialing body with a governing board might contract with a testing company to develop a credentialing examination. The credentialing body might establish certain purposes and standards for the credentialing examination. The testing company might consult with the credentialing body regarding technical testing issues, and might also be responsible for carrying out the test development activities requested in accord with its own mission. The testing company or the credentialing body might contract with another company for the administration of the examination.
A comprehensive audit would include an analysis of all parties that contribute to the final credentialing examination; alternatively, an audit might be conducted of one or more of the parties.
1. What is the mission of the test provider organization(s) being audited?
- Is the organization established to develop an assessment tool for a particular profession or occupation?
- Does the organization have other functions in connection with the profession or occupation?
- Do any organizational functions conflict with the assessment purpose?
- What is the organization’s legal status? Is it independent of other agencies? Does it have control of all testing issues and processes?
- What are the anticipated outcomes of the testing program?
- Who are the key stakeholders of the organization?
- What are the key values and philosophy of the organization?
- How does the organization differ from other organizations with the same or similar goals?
2. What resources are available to the test provider organization(s)?
- Who is responsible for the testing program?
- Are staff trained in testing? If staff are not trained, are there consultants available who are trained in testing?
- What are the credentials of the people responsible for the development of the test?
- What physical resources (computers, security, etc.) are available to the staff and consultants?
- What legal and psychometric support is available in the event of a lawsuit?
3. What is the purpose of the test? Is there a difference between the stated goals and the documentation?
4. Who are the users of the test? For what purpose do they use it? Is there an opportunity for user involvement or input into the testing process?
5. What level of performance is the examination designed to measure (e.g., minimum competency or specialized skill mastery)? What is the expected education and experience preparation of eligible candidates? Are eligibility criteria consistently applied?
6. Who has decision-making authority for the examination process?
7. What are the names and qualifications of the people responsible for:
- the development of the test
- the scoring of the test
- the psychometric aspects of the examination
- the administration of the examination
- the security of the examination
8. What is the financial status of the organization?
- How is the testing program financed?
- Is there a reasonable expectation of stability? (Revenues meeting expenses?)
- Do examination fees fund other programs?
- Are fees reasonable compared to other similar examinations? Are fees appropriate to the profession? Is justification provided when fees are increased?
9. What information is provided to the public (candidates and users) about the testing program?
- Applicant handbook
- Test content outline
- Score interpretation
- Sample test questions
- Appeals process
- Performance statistics (e.g., pass rates, reliabilities, cut scores)
- Web site
- Telephone inquiries
- Other outreach
AUDIT OF TEST DEVELOPMENT
The following questions are designed to investigate elements of test development procedures which affect the validity and quality of the resulting examination. The Standards for Educational and Psychological Testing are a particularly relevant resource for audit of test development.
1. Is the test or evaluation instrument based on a job analysis?
- Does the report include the survey, the results, recommendations, committee members and their qualifications?
- If no job analysis, what is the basis for the test?
2. Do test specifications exist?
- Are test specifications linked to the job analysis? By what process?
- If no job analysis, how were the test specifications developed?
- How was the test format (written, oral, practical) determined?
- How were dimensions of complexity or cognitive level established and supported?
3. Are the test items consistent with test specifications and cognitive level?
4. How are the items developed?
- Who are the item writers and what are their qualifications?
- How are the item writers selected?
- How are the item writers trained?
- How are the items initially reviewed and approved?
- Is the review process objective and based on consensus of experts?
- Is each item supported by a text reference or written documentation of some sort?
- Is each item then linked with the test blueprint and job analysis?
- Who has the authority to edit items?
5. How are items introduced into production?
- Are items pre-tested in some manner?
- Are item statistics evaluated against standards for item difficulty and discrimination?
- Are results of item bias and adverse impact studies available?
6. How are test forms assembled?
- What is the percentage of old (used) items on the test?
- What is the percentage of new items?
- Are pretest items included?
- Who reviews final test form for content and correctness of answer key?
- How many forms of the test need to be developed for an administration or a specified time period?
7. Is the bank of items large enough?
- Are there sufficient items to develop several examination forms?
- If the test is computer adaptive, are there enough items at each difficulty level to administer a test that can make an accurate assessment of an individual’s level of proficiency?
- How often are items re-used?
- Are there enough items in the bank to replace a stolen or compromised form?
8. How is the passing score established?
- What method is used to establish a passing score? (One common judgmental method is sketched after this list.)
- How is the standard of competence defined?
- Is it criterion-referenced or norm-referenced?
- Does the method incorporate the systematic judgment of a representative group of content experts?
- How are the experts selected, and what are their qualifications?
- What is the reasoning for the standard that was selected?
- Is the passing standard the same for all forms of the test?
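One common criterion-referenced approach is a modified Angoff procedure, in which content experts estimate, for each item, the probability that a minimally competent candidate would answer correctly, and the passing score is derived from those ratings. The following minimal sketch uses hypothetical ratings from three judges and shows only the basic arithmetic an auditor might expect to see documented; it is not a complete standard-setting procedure.

```python
# Minimal sketch of a modified Angoff calculation (illustrative data only).
# Each row holds one judge's ratings: the estimated probability that a
# minimally competent candidate answers each item correctly.
judge_ratings = [
    [0.60, 0.75, 0.40, 0.90, 0.55],   # Judge 1
    [0.65, 0.70, 0.50, 0.85, 0.60],   # Judge 2
    [0.55, 0.80, 0.45, 0.95, 0.50],   # Judge 3
]

num_items = len(judge_ratings[0])

# Average the ratings for each item across judges, then sum over items
# to obtain the recommended raw passing score.
item_means = [
    sum(judge[i] for judge in judge_ratings) / len(judge_ratings)
    for i in range(num_items)
]
raw_passing_score = sum(item_means)

print(f"Recommended passing score: {raw_passing_score:.1f} of {num_items} items")
```

An audit would look for documentation of how the judges were selected and trained, how their ratings were combined, and how the final passing score was adopted, not merely the arithmetic itself.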
9. How often is test content updated? What procedures are in place to assure currency of test content?
10. What procedures assure that test content is secure during the test development process? Consider office security, electronic security, and security measures for at-home writing of questions, if used.
11. If there are essay, practical or oral questions:
- How was it determined that this was the best way to measure the competence domain?
- Is the variety of question content broad enough to provide a generalizable result?
- Have objective scoring benchmarks been established prior to the administration of the test?
- What are the qualifications of the examiners?
- What training do the examiners receive?
- Is the examiner performance evaluated or calibrated for reliability and accuracy?
12. If the test is now administered in a computer format, but is intended to be comparable to a test previously administered in paper and pencil format, are data available to demonstrate that the results are comparable?
13. If simulations are used, what support is there for the validity and accuracy of the simulation?
AUDIT OF TEST ADMINISTRATION
The following questions are designed to investigate elements of test administration which affect fairness, due process, security, and ultimately score validity.
1. What is included in the candidate information bulletin?
- Look for examination application requirements, test specifications, examination procedures and schedule, location of test site(s), passing score requirement, suggested references for preparing for the test, sample items, examinee responsibilities, review/appeal procedures, and how to apply for testing accommodations for candidates with disabilities or for translated forms of the exam.
2. How is the test administered: paper and pencil, computer, interview, take-home, etc.?
- Is the format appropriate to the purpose of the test?
3. What procedures are established to maintain security of the test materials?
- Who prints the test booklets and materials?
- What directions are given to the printer regarding printing and security?
- How are test booklets stored after printing?
- What happens to test materials after they have been used? What are the procedures for destroying secured materials?
- What measures are taken to assure secure and traceable shipping, or secure electronic delivery of materials?
- What measures are taken to maintain security of the tests at the examination site before, during and after the test administration?
- How are test booklets distributed to administration staff?
- How are test booklets stored by administration staff?
- If a computer-delivered test, what security measures have been established for on-site storage and delivery?
- If a computer-delivered test, what security measures have been established to monitor the testing site and the candidates?
- How are examination materials returned to the test developer if the administration staff is not part of the test development staff?
4. Are there standardized written procedures and instructions for test administration? Does administration staff receive training? Are the procedures followed? Procedures include:
- Controlled admittance of candidates (ID verification, limited room access, seat assignments)
- Controlled distribution and collection of secure materials, including before and after inventories
- Standard dates, start times, end times, policies for supervision of candidate breaks
5. Is an emergency plan established in case there are problems at the test site such as noise, fire, flood, or other interruptions?
6. What are the procedures if someone is suspected of cheating at the test site?
- Is there a policy that defines the types of behavior that appear to be cheating?
- Are there procedures detailing what to do with someone suspected of cheating?
- Are statistical analysis procedures available to evaluate the probability of response pattern similarities when candidates are suspected of cheating? (A minimal illustration follows this list.)
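Published answer-copying statistics model the probability that two candidates would share so many identical responses by chance. As a minimal, purely illustrative sketch of the underlying idea, the code below counts identical incorrect answers shared by a hypothetical pair of candidates; an operational analysis would apply a formal statistical index rather than a raw count.

```python
# Minimal sketch of response-similarity screening (illustrative data only).
# Counts identical incorrect responses shared by two candidates; formal
# answer-copying indices model the probability of such agreement by chance.
answer_key = ["A", "C", "B", "D", "A", "B", "C", "D"]
candidate_1 = ["A", "C", "D", "D", "B", "B", "C", "A"]
candidate_2 = ["A", "C", "D", "D", "B", "B", "C", "D"]

identical_wrong = sum(
    1
    for key, r1, r2 in zip(answer_key, candidate_1, candidate_2)
    if r1 == r2 and r1 != key
)

print(f"Identical incorrect responses shared: {identical_wrong}")
```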
7. What is the procedure for providing modifications to administration for people with disabilities?
8. What is the procedure for providing feedback to the credentialing authority regarding irregularities in administration, such as cheating?
- Are irregularities evaluated for impact on validity of test scores before scores are released to candidates and others?
- Is there a policy for withholding scores when validity is at issue due to an irregularity?
AUDIT OF STATISTICAL ANALYSES AND SCORING
The following questions are designed to investigate elements of statistical analysis and scoring which affect the validity and quality of the resulting examination. The Standards for Educational and Psychological Testing are a relevant source of requirements in this area.
1. Are item statistics calculated?
- Are the item statistics analyzed? What criteria are applied to keep or reject items? (Two commonly used indices are sketched after this list.)
- What happens to items that are deemed inadequate or flawed?
- Are the statistics stored for use in item revision?
- Are statistics generated for essay, oral or practical examinations?
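Two classical item statistics an auditor will commonly encounter are the difficulty index (the proportion of candidates answering the item correctly) and a discrimination index such as the point-biserial correlation between the item score and the total test score. The sketch below uses hypothetical 0/1 scored responses to illustrate both calculations; operational analyses often correlate each item with the total score excluding that item, and may also report item response theory statistics.

```python
import math

# Minimal sketch of two classical item statistics (hypothetical 0/1 scores).
# Each inner list is one candidate's scored responses; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 1, 1, 1, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
]

n = len(responses)
totals = [sum(person) for person in responses]

def mean(values):
    return sum(values) / len(values)

def item_difficulty(i):
    """Proportion of candidates answering item i correctly (the p-value)."""
    return mean([person[i] for person in responses])

def item_discrimination(i):
    """Point-biserial correlation between item i and the total test score."""
    item = [person[i] for person in responses]
    mi, mt = mean(item), mean(totals)
    cov = sum((x - mi) * (t - mt) for x, t in zip(item, totals)) / n
    sd_item = math.sqrt(sum((x - mi) ** 2 for x in item) / n)
    sd_total = math.sqrt(sum((t - mt) ** 2 for t in totals) / n)
    return cov / (sd_item * sd_total)

for i in range(len(responses[0])):
    print(f"Item {i + 1}: difficulty = {item_difficulty(i):.2f}, "
          f"discrimination = {item_discrimination(i):.2f}")
```

An audit would compare the documented keep/reject criteria (for example, acceptable ranges of difficulty and minimum discrimination values) against statistics like these for recent test forms.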
2. Are reliability statistics available?
- On what sample group were the statistics calculated?
- Is the exam reliable enough to support making pass/fail decisions? (A sketch of one common reliability index follows this list.)
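For dichotomously scored tests, internal-consistency reliability is often summarized with the KR-20 coefficient (equivalent to Cronbach's alpha for 0/1 items). The following minimal sketch, again using hypothetical scored responses, shows the calculation; a full audit would also look for the standard error of measurement and evidence of decision consistency at the passing score.

```python
# Minimal sketch of a KR-20 reliability estimate (hypothetical 0/1 item scores).
responses = [
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0, 0, 1, 0],
    [0, 1, 1, 1, 1, 1, 1, 1],
    [1, 1, 0, 0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1, 1, 1, 1],
]

k = len(responses[0])                     # number of items
n = len(responses)                        # number of candidates
totals = [sum(person) for person in responses]

mean_total = sum(totals) / n
var_total = sum((t - mean_total) ** 2 for t in totals) / n   # population variance

# Sum of item variances: p * (1 - p) for each dichotomous item.
sum_pq = 0.0
for i in range(k):
    p = sum(person[i] for person in responses) / n
    sum_pq += p * (1 - p)

kr20 = (k / (k - 1)) * (1 - sum_pq / var_total)
print(f"KR-20 reliability estimate: {kr20:.2f}")
```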
3. Is the test equated so that reported scores on different forms will be a result of candidate ability and not item difficulty?
- What method is used to equate the tests? (One simple method is sketched after this list.)
- Why was that method selected?
- On what sample group was the equating performed?
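The simplest illustration of the idea is linear equating, which maps a raw score on a new form to the reference-form score that lies the same number of standard deviations from the mean. The sketch below uses hypothetical form statistics; operational programs frequently use more elaborate designs, such as common-item equating or item response theory methods, so this is only a schematic illustration.

```python
# Minimal sketch of linear equating (hypothetical summary statistics).
# A raw score on Form B is mapped to the Form A scale by matching
# standardized scores: (x - mean_B) / sd_B = (y - mean_A) / sd_A.
mean_a, sd_a = 72.0, 8.0    # reference form
mean_b, sd_b = 68.0, 7.0    # new form

def equate_b_to_a(score_b):
    """Return the Form A equivalent of a Form B raw score."""
    return mean_a + sd_a * (score_b - mean_b) / sd_b

for raw in (60, 68, 75):
    print(f"Form B score {raw} -> Form A equivalent {equate_b_to_a(raw):.1f}")
```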
4. What are the pass/fail rates, and the sample groups on which they were calculated? Are the pass/fail rates reviewed for expected outcomes, given the sample group?
5. If the test comprises more than one scored part, are the scores combined into one total score, or must the parts be passed separately? (Both decision rules are sketched after this list.)
- Is there a study or other documentation which supports the decision whether to combine scores or keep them separate?
- If scores are combined, what is the method of combining them?
- If scores are separate, are they each sufficiently reliable? Do the parts represent separate content?
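The two basic decision rules are compensatory (part scores are combined, so strength in one part can offset weakness in another) and conjunctive (each part must be passed separately). The following sketch contrasts the two rules with hypothetical part scores, weights, and passing points.

```python
# Minimal sketch of compensatory vs. conjunctive pass/fail rules
# (hypothetical part scores, weights, and passing points).
part_scores = {"written": 78, "practical": 64}
weights = {"written": 0.6, "practical": 0.4}
part_cut_scores = {"written": 70, "practical": 70}
total_cut_score = 70

# Compensatory: the weighted total must meet the overall passing score.
weighted_total = sum(part_scores[p] * weights[p] for p in part_scores)
compensatory_pass = weighted_total >= total_cut_score

# Conjunctive: every part must meet its own passing score.
conjunctive_pass = all(part_scores[p] >= part_cut_scores[p] for p in part_scores)

print(f"Weighted total = {weighted_total:.1f}")
print(f"Compensatory decision: {'pass' if compensatory_pass else 'fail'}")
print(f"Conjunctive decision:  {'pass' if conjunctive_pass else 'fail'}")
```

The documentation reviewed in the audit should support whichever rule is used, including the rationale for any weights and for separate passing scores.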
6. How are tests scored? What quality control procedures assure scoring accuracy?
7. How are individual candidate scores reported?
- Are they reported promptly?
- Do they indicate what scoring rules are used?
- Are candidate scores reported as subscale and total scores?
- Are numeric scores issued or just pass/fail status?
- If candidates fail, do they receive any diagnostic information?
8. What types of reports are generated: school results, committee reports, etc.?
9. Are examination reviews or challenges allowed? If candidates can contest individual items on the test, what is the procedure for the protest? If not, what is the reason for not allowing appeals?
10. What complaints have been expressed about the current test? What responses have been generated by the testing agency?
AUDIT REPORTS AND MANAGEMENT LETTER
In its briefest form, an audit report can be a short attestation by the auditors describing the audit scope, the audit method, and whether, in the auditors’ opinion, the scores of the examination can be relied upon. The auditors’ report should include:
- Date – the last date on which fieldwork was completed;
- Addressee – the primary parties to whom the report is directed;
- Audit period – period of time that the audit includes;
- Scope – areas of the examination process the auditor audited;
- Opinion – results of the audit (whether the scores can be relied upon); and
- Signature – the auditors’ signatures.
A management letter is a more informative document, detailing the scope and method, the findings of the auditors, and their recommendations to management regarding areas of improvement. The auditors must report any conditions that they consider to be significant deficiencies in meeting recognized technical standards, such as the Standards for Educational and Psychological Testing, or deficiencies in meeting user goals or contract requirements, if these are part of the purpose of the audit. This letter may be presented to the board of directors of the test provider organization. In accounting audit practice, this letter may be designated as solely for the information and use of the board of directors, management and others within the entity.
Credentialing examinations are often used by state governments to screen candidates for state-regulated credentials. Unlike in the private sector, there may be a presumption that results of audits of government activities must be available to the public as part of government’s accountability to the public. In the broadest sense, the public is the "board of directors." Therefore, a report of audit results for credentialing examinations used by states may be expected to be made public, as are results of audits of other government functions.
If the management letter is to be available to the public, the public report can incorporate management’s response to the audit findings and recommendations. The management response can be separate from the management letter, or can be incorporated into the public report after each major section of findings and recommendations. The management response may include a plan and timetable for resolution of any identified problems.
For example, a public report of audit could be structured as follows:
- Audit Objectives and Scope
- Audit Methods
- Auditors’ Finding #1
- Auditors’ Recommendations
- Management Response
- Auditors’ Finding #2
- Auditors’ Recommendations
- Management Response
[And so forth]
APPENDIX A
BIBLIOGRAPHY AND RESOURCES
American College Testing Program. (1990). ACT Elements of a Request for Proposal. Iowa City, IA.
American College Testing Program. (1995). ACT Elements of a Sound Certification Testing Program. Iowa City, IA.
American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1985). Standards for Educational and Psychological Testing. Washington, D.C.: American Psychological Association, Inc.
AICPA Audit and Accounting Manual, Section 3175, Sample Engagement Letters, June 1997.
AICPA Audit and Accounting Manual, Section 7400, Representation Letters, June 1997.
AICPA Audit and Accounting Manual, Section 7500, Communication with Audit Committees, June 1997.
Downing, S.M., & Haladyna, T.M. (Spring, 1996). A Model for Evaluating High-Stakes Testing Programs: Why the Fox Should Not Guard the Chicken Coop. Educational Measurement: Issues and Practice, 5-12.
Educational Testing Service. (1987). ETS Standards for Quality and Fairness, Princeton, NJ.
National Association of State Boards of Accountancy. (April, 1998). Report of the CPA Examination Review Board on the 1997 Uniform CPA Examinations and IQEX. Nashville, TN.
National Commission for Certifying Agencies. (1995). Standards for Accreditation of National Certification Organizations. Washington, D.C.
National Council of State Boards of Nursing. (February, 1998). NCLEX® Comprehensive Evaluation Guide Report By Evaluator. Chicago, IL.
Office of the State Auditor. (1996). An Audit Report on Management Controls at the Texas Department of Health. The State of Texas, Austin, TX.
Office of the State Auditor. (1997). An Audit Report on the Compliance and Effectiveness of the Texas Board of Nursing Facility Administrators. The State of Texas, Austin, TX.
Office of the State Auditor. (1996). An Audit Report on Management Controls at the Department of Health’s Licensing and Certification Division. The State of Texas, Austin, TX.
Professional Examination Service. PES Guidelines for the Development, Use, and Evaluation of Licensure and Certification Programs, adopted by the Board of Directors in 1995. New York, NY.
Professional Examination Service. (Winter 1998-1999). Methods for Evaluating and Auditing Credentialing Programs. PES News, Volume XIX, Number 1. New York, NY.
State of Florida, Department of Business and Professional Regulation, Bureau of Testing. (1995). Examination Evaluation Questionnaire. Unpublished.
State of Michigan, Department of Consumer and Industry Services, Office of Testing Services. Survey of Counseling Examinations. Unpublished.
State of Wisconsin. Department of Regulation and Licensing, Office of Examinations. (1997) Essential Elements of a Credentialing Examination. Unpublished.
The Council on Licensure, Enforcement and Regulation. (1993). Development, Administration, Scoring and Reporting of Credentialing Examinations: Recommendations for Board Members. Lexington, KY.
The Council on Licensure, Enforcement and Regulation. (1993). Principles of Fairness: An Examining Guide for Credentialing Boards. Lexington, KY.
Contacts for Documents Referenced Above:
American College Testing Program
PO Box 168
Iowa City, IA 52243
American Psychological Association, Inc. (publishers of Standards for Educational and Psychological Testing)
1200 Seventeenth Street, NW
Washington, DC 20036
American Institute of Certified Public Accountants
Harborside Financial Center
201 Plaza Three
Jersey City, NJ 07311-3881
Educational Testing Service
Princeton, NJ 08541
National Association of State Boards of Accountancy
150 Fourth Avenue North Suite 700
Nashville, TN 37219-2417
National Commission for Certifying Agencies
1200 19th Street, NW, Suite 300
Washington, DC 20036-2422
National Council of State Boards of Nursing
676 N. St. Clair, Suite 550
Chicago, IL 60611-2921
Office of the State Auditor
Two Commodore Plaza
206 East Ninth Street, Suite 1900
Austin, TX 78701
Professional Examination Service
475 Riverside Drive
New York, NY 10115-0089
State of Florida
Department of Business and Professional Regulation
Bureau of Testing
1940 North Monroe Street
Tallahassee, FL 32399-0791
State of Michigan
Department of Consumer and Industry Services
Office of Testing Services
PO Box 30018
Lansing, MI 48909
State of Wisconsin
Department of Regulation and Licensing
Office of Examinations
PO Box 8935
Madison, WI 53708
The Council on Licensure, Enforcement and Regulation (CLEAR)
403 Marquis Avenue, Suite 100
Lexington, KY 40502
APPENDIX B
ELEMENTS OF AN ENGAGEMENT LETTER FOR AUDIT OF A CREDENTIALING EXAMINATION PROGRAM
- What the auditors will audit (scope)
- How the audit will be conducted
- Standards used
- Evidence examined (list of documents to be provided to auditors)
- Procedures for audit
- Topics to be included in the audit report
- Responsibilities of the auditee
- Responsibility for management of the testing program
- Accurate representation of relevant facts to auditors
- Responsibilities of the auditors
- Design audit to provide reasonable assurance of detecting significant weakness in the program
- Issue a written report upon completion of the audit
- Time Schedule
- Costs, fees, clerical assistance to auditors
- Ownership of audit report
- Ownership of auditors working papers
- Publication of results (Who will receive the results)
- Indemnification of Auditors (optional)
- ("The client agrees to release, indemnify and hold us and our partners and our heirs, executors, personal representatives, successors , and assigns harmless from any liability and costs from knowing misrepresentations by management.")
Note: Ideas for this appendix were taken from sample engagement letters found in the American Institute of Certified Public Accountants Audit and Accounting Manual, Section 3175. However, the information contained here has been adapted to audits of testing programs rather than finances, and is the product of the authors of this reference guide.
COPYRIGHT 2000. Rights to copy and distribute this publication are hereby granted to members of the Council on Licensure, Enforcement and Regulation (CLEAR), providing credit is given to CLEAR and copies are not distributed for profit.