Writing exam questions is one of the most important parts of teaching nursing, and nurse educators need a clear roadmap for what to include when developing exams. This presentation provides direction on how to develop a test blueprint and how to revise questions.
2. Topics
• Test Blueprinting
• Item Alignment
– Alignment with Nursing Process
– Ensuring Cognitive Levels
– Alignment of Course and Unit Objectives
– Cross Walking with National Standards
• NCLEX Client Needs
• NLN End of Program Competencies
• AACN Essentials
• QSEN Competencies
• Testing Policy
• Question Revisions
4. Blueprint
• A framework for the structure of the test.
• Outlines the scope and content of skills being measured by
the test and the importance of the content
• The way in which the content is taught may affect the level of
test items
• Tool for assuring that sufficient items are developed at the
appropriate level
• Used as evidence for judging validity of the resulting test
scores
• Ensures course and unit objectives are being met
• A method for aligning standards with test items
5. Purpose of Blueprinting
• Ensures course objectives are being met
• Meets accreditation standards and requirements
• Helps to design instructional strategies as outlined in the curriculum and
then assess the expected outcomes.
• Assists in ensuring test questions are at appropriate cognitive level
• Ensures the questions meet
– NCLEX Competencies
– NLN End of Program Competencies
– AACN Essentials
– QSEN Competencies
• Ensures the questions are valid and reliable
• Helps to distribute questions across topics
6. Types of Blueprinting
May include:
• Levels of Cognitive Difficulty (Bloom’s Level)
– Program – Same standard across all courses
– Course – Same standard within a course
– Exam – Same standard within exam
• Alignment with objectives
– Each objective is aligned with exam questions
• NCLEX Categories – Matching Categories and subcategories
– Each course
– Across the program
7. Types of Blueprinting (con’t)
May include: (con’t)
• National Standards
– National League of Nursing End of Program Competencies
(NLN EOP)
– AACN Essentials
– QSEN Competencies
8. Types of Blueprinting
Levels of Cognitive Difficulty (Bloom’s Level)
• Program – Same standard across all courses
• Course – Same standard within a course
• Exam – Same standard within exam
Cognitive Level   Comprehension   Application   Analysis   Evaluation
Term 1            40%             20%           30%        10%
Term 2            35%             30%           25%        10%
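Percentage targets like the rows above can be translated into per-exam item counts. A minimal sketch in Python (the 50-item exam length is an assumption for illustration):

```python
# Hypothetical sketch: turning blueprint percentages into item counts
# for one exam. Percentages are taken from the Term 1 row above.

TOTAL_ITEMS = 50  # assumed exam length

term1 = {"Comprehension": 0.40, "Application": 0.20,
         "Analysis": 0.30, "Evaluation": 0.10}

# Round each level's share to a whole number of questions
counts = {level: round(TOTAL_ITEMS * pct) for level, pct in term1.items()}
print(counts)  # {'Comprehension': 20, 'Application': 10, 'Analysis': 15, 'Evaluation': 5}
assert sum(counts.values()) == TOTAL_ITEMS
```

With uneven percentages the rounded counts may not sum exactly to the exam length, so the final assertion is a useful check when building a real blueprint.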
9. Types of Blueprinting
Alignment with Objectives
• Each course and/or unit objective is aligned with exam
questions
Question   Program Outcome   Course Objective   Unit Objective
1          5                 3                  3-5
2          3                 2                  2-8
10. Types of Blueprinting
NCLEX Categories – Matching categories and subcategories
• Each course
• Across the program
Question   Learning Outcome   Nursing Process   NCLEX     Cognitive Level   Difficulty Level (After Exam)   Discrimination (After Exam)
1          5                  Planning          SECE-1*   Application       -                               -

*Safe and Effective Care Environment (SECE)
11. Types of Blueprinting (con’t)
National Standards
• National League of Nursing End of Program Competencies
(NLN EOP)
• AACN Essentials
• QSEN Competencies
Question   NLN EOP            AACN                      QSEN
1          Nursing Judgment   Evidence-Based Practice   Safety
12. Types of Blueprinting – All Types
Question   Program Outcome   Course Objective   Unit Objective   Cognitive Level   NCLEX
1          5                 3                  3-2              Application       SECE-1*

Question   Nursing Process   NLN EOP            AACN                      QSEN     Difficulty Level (After Exam)   Discrimination (After Exam)
1          Planning          Nursing Judgment   Evidence-Based Practice   Safety   -                               -

*Safe and Effective Care Environment (SECE)
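One way to make a combined blueprint like the one above machine-checkable is to store each row as a record. A hypothetical sketch (the field names and coverage check are illustrative assumptions, not part of any standard):

```python
from dataclasses import dataclass

@dataclass
class BlueprintRow:
    """One exam question's alignment row, mirroring the table above."""
    question: int
    program_outcome: int
    course_objective: int
    unit_objective: str
    cognitive_level: str
    nclex: str
    nursing_process: str
    nln_eop: str
    aacn: str
    qsen: str

blueprint = [
    BlueprintRow(1, 5, 3, "3-2", "Application", "SECE-1",
                 "Planning", "Nursing Judgment",
                 "Evidence-Based Practice", "Safety"),
]

# Simple coverage check: which course objectives have at least one aligned item?
covered = {row.course_objective for row in blueprint}
print(covered)  # {3}
```

Stored this way, a faculty team could verify before an exam that every course objective and every NCLEX category in the blueprint is covered by at least one question.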
16. Testing Policy
• Ensures consistency
• Sets student expectations regarding the frequency and number of questions
• Outlines when exams and questions are revised or exchanged
• Identifies who should be revising exams
17. Testing Policy (con’t)
• Testing occurrences and procedures
– Testing procedures, such as location and proctoring
– Awarding points after testing or throwing out questions
– Student absence policy and make up tests
– Acceptable Discrimination Level used to determine if
question is thrown out
– Which questions are mastery items
18. Testing Policy (con’t)
• Policy on when a test must be reviewed with a faculty member
• Process for reviewing a test (completion of a form), providing a rationale with a page number from the reference
• Review after testing by faculty and students
• Screen for offensive content
• Review regularly and run analytics to determine whether questions are appropriate
• Analysis levels for discrimination (point biserial) guidelines
• Test Security Policy
20. Parts of the Test Question
• Stem - The initial part of the item, which poses a question or an incomplete statement the student must answer
• Options - All of the answer choices for an item
• Key/Answer - The correct option
• Distractors - The incorrect options
• Rationales - One per distractor and answer
• Standards – Alignment of Components
21. Test Bank Usage
• Test Bank Revisions
– Caution (they are all online)
– Use for ideas for questions
– Do Not use questions as written in test banks (students
have them)
– Only use as part of the exam, maybe even as low as 10% of
the questions from test banks
– Randomize the questions, if possible
– Give more than one version of the exam
• Bristol, T. J. (2017). Examination test banks at risk. Teaching and Learning in Nursing. https://doi.org/10.1016/j.teln.2017.09.009
22. 10 Steps to Item Revision
• Step 1 – Read the stem.
• Step 2 – Review the question alignment in the test blueprint.
• Step 3 – Is the alignment acceptable or do you need to revise the
question or delete it?
• Step 4 – If you keep the question, read all of the options. Determine if
test analysis is favorable. Decide if you are going to keep question.
• Step 5 – If you keep the question, verify that one option is clearly correct. Are there errors?
• Step 6 – Are other options plausible?
• Step 7 – Decide to keep or reject question.
• Step 8 – If you keep the question, keep as written or revise it.
• Step 9 – Rewrite Question and align standards.
• Step 10 – Repeat Step 1 for next question.
23. Analyzing Items
• Difficulty (p-value): Percentage of students who got the answer correct
• Item discrimination: Point-biserial correlation between how well students did on the item and their total test score.
• Reliability coefficient: Measure of the amount of measurement error
associated with a test score.
• Distractor Evaluation: Distractor quality can influence student
performance on an exam
Zimmaro, D. W. (2016). Writing good multiple-choice exams. The University of Texas at Austin Faculty Innovation Center. http://learningsciences.utexas.edu/teaching/assess-learning/question-types/multiple-choice
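The difficulty and discrimination statistics above can be computed directly from a scored response matrix. A minimal sketch in Python, using made-up data (the `responses` matrix and function names are illustrative, not from the presentation):

```python
import statistics

# rows = students, columns = items (1 = correct, 0 = incorrect); fabricated data
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

def p_value(item):
    """Difficulty: fraction of students who answered the item correctly."""
    col = [row[item] for row in responses]
    return sum(col) / len(col)

def point_biserial(item):
    """Discrimination: point-biserial correlation of item score with total score."""
    totals = [sum(row) for row in responses]
    col = [row[item] for row in responses]
    p = sum(col) / len(col)
    right = [t for t, c in zip(totals, col) if c == 1]  # totals of those correct
    wrong = [t for t, c in zip(totals, col) if c == 0]  # totals of those incorrect
    s = statistics.pstdev(totals)
    return (statistics.mean(right) - statistics.mean(wrong)) / s * (p * (1 - p)) ** 0.5

print(p_value(0))                    # 0.6
print(round(point_biserial(0), 2))   # 0.54
```

A positive point-biserial means stronger students tended to get the item right; values near zero or negative flag items to review. (This simple version assumes at least one student got the item right and one got it wrong.)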
24. Checklist for Reviewing Items
• Does the item include only one correct or clearly best
answer?
• If multiple select (multiple response), ensure every correct answer appears among the options.
• Has the answer been randomly assigned to one of the
alternative positions?
• Is the item laid out in a clear and consistent manner?
• Any errors?
• Any unnecessarily difficult vocabulary?
25. Checklist for Reviewing Items (con’t)
• Matches one unit objective
• Has specific problem and stated clearly
• Stem includes as much of the item as possible
• Stem stated in a positive form; avoid "except," "all of the above," "none of the above," and both "a" and "b"
• Distractors:
– Avoid opposites
– Avoid answers that are too easy, obvious, or simple
– Use distractors drawn from similar topics
26. Writing Items
• Ideas for exams
– Create items as you start to work on the course
– Note questions students ask in class
– Pull information from homework or clinical
• Guidelines
– Build questions based on objectives and balance topics across the exam
– Only one best answer
– Write distractors vertically
– No errors in stems or distractors/answer
– Avoid trick or “cute or funny” items
27. Writing Items
• Stems
– Clearly have a question or completion
– Question should not be in the distractors or answer
– No blanks in beginning or middle of stem
– Avoid unneeded wording
– Avoid “NOT”, “Never”, “Except”
• Distractors/Answers
– List answers in logical, chronological, or alphabetical order
– Make distractors/answers same length
– Avoid all of the above, none of the above, etc.
– Use plausible distractors
28. Time and Expertise
• Time to write one exam question is usually 45 minutes to 1 hour.
• Practice is needed to become proficient at writing good questions
• A standard approach is helpful
• Using a box of cards or an online system
• Each IntelliStem Writer kit includes 800+ stems for multiple choice and
multiple select (response) questions organized by Nursing Process. Each
stem is:
– Educationally sound: 96% of the stems are written at the application
or higher cognitive level.
– Aligned: All are aligned to established standards such as NCLEX Client
Needs, Nursing Process, Cognitive Level, NLN End of
Program Competencies, AACN Essentials, and
QSEN Competencies
29. Using IntelliStem WriterTM
• Question Development Process:
– Select a topic from your test blueprint.
– Determine the key point to be tested.
– Keep the nursing process being tested in mind.
– Select a stem from the set of cards.
– Review the National Standards on the back and determine if the stem meets your test blueprint
30. Using IntelliStem WriterTM
• Question Development Process (con’t):
– Add a client situation or description to the stem by
changing content found within the brackets [ ].
– Decide on the answer(s) and write the rationale.
– Write the distractors and rationales for each.
– Since each stem is already aligned with established
standards, each final question will have a stem that is
aligned to established standards, correct answer(s) and
believable distractors.
– Usually construction time is cut by 50-75%.
32. Topics
• Test Blueprinting
• Item Alignment
– Alignment with Nursing Process
– Ensuring Cognitive Levels
– Alignment of Course and Unit Objectives
– Cross Walking with National Standards
• NCLEX Client Needs
• NLN End of Program Competencies
• AACN Essentials
• QSEN Competencies
• Testing Policy
• Question Revisions
33. References
• Patil, S. Y., Gosavi, M., Bannur, H. B., & Ratnakar, A. (2015). Blueprinting in assessment: A tool to increase the validity of undergraduate written examinations in pathology. International Journal of Applied and Basic Medical Research, 5(Suppl 1), S76–S79. http://doi.org/10.4103/2229-516X.162286
• Zimmaro, D. W. (2016). Writing good multiple-choice exams. The University of Texas at Austin Faculty Innovation Center. http://learningsciences.utexas.edu/teaching/assess-learning/question-types/multiple-choice
Note: Students can pass your unit exams yet be unable to pass the NCLEX. Test banks are compromised, and students will be tempted to use them to study.
Notes on item difficulty (p-value):
• The percentage of students who answered the item correctly.
• Range is 0% to 100%, often written as 0.00 to 1.00; the higher the value, the easier the question.
• To determine difficulty: number of students who answered the question correctly divided by the number of students who answered it.
• Above 0.90, the question is very easy and should be revised or discarded.
• Below 0.20, the question is very difficult and should be reviewed for possible revision because it may be unclear or confusing.
• The ideal p-value is slightly higher than midway between chance (1.00 divided by the number of choices) and a perfect score (1.00); for a four-option item this is approximately 0.60–0.63.
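The difficulty guidelines above can be expressed as a small helper. A sketch under those assumptions (function names and cutoffs mirror the notes and are illustrative):

```python
def ideal_p(num_choices):
    """Midpoint between chance (1/num_choices) and a perfect score (1.0)."""
    chance = 1.0 / num_choices
    return (chance + 1.0) / 2

def classify(p):
    """Flag items whose p-value falls outside the guideline range."""
    if p > 0.90:
        return "too easy - revise or discard"
    if p < 0.20:
        return "too hard - review for clarity"
    return "acceptable"

print(ideal_p(4))      # 0.625 for a four-option item
print(classify(0.95))  # too easy - revise or discard
print(classify(0.55))  # acceptable
```

Note that `ideal_p` depends on the number of options: a true/false item has a chance score of 0.50 and an ideal p-value of 0.75, while a four-option item sits near 0.625.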