1. The Systematic Design of
Instruction
Developing Assessment Instruments
By: Angel Jones
2. Developing Assessment Instruments: Background
Assessment is a broad term that includes all types of activities effective for
demonstrating learners' mastery of new skills.
Learner-centered assessments should be criterion-referenced. They are used for
evaluating both learners' progress and instructional quality.
The results of a criterion-referenced test indicate to the instructor exactly how well
learners were able to achieve each objective, and indicate to the designer which
components of the instruction worked well and which need to be revised.
Aids learners in becoming ultimately responsible for the quality of their work.
3. Objectives
I can describe the purpose of criterion-referenced tests.
I can describe how entry skills tests, pretests, practice tests, and posttests are used by
instructional designers.
I can name four categories of criteria for developing criterion-referenced tests and list
several considerations within each criterion category.
Given a variety of objectives, I can write criterion-referenced, objective-style test items
that meet quality criteria in all four categories.
I can develop directions for product development, live performance, and attitude
assessments, and develop rubrics for evaluating them.
I can evaluate instructional goals, subordinate skills, learner and context analyses,
performance objectives, and criterion-referenced test items for congruence.
5. Types of Criterion-Referenced Tests
Entry Skills Test:
Administered before learners begin instruction.
Assesses learners' mastery of prerequisite skills.
Learners lacking these skills will have great difficulty with the instruction.
Pretest:
Administered before learners begin instruction.
Used to profile the learners with regard to the instructional analysis.
Used to determine whether learners have previously mastered some or all of the skills.
Used to develop instruction most efficiently for a particular group.
Practice Test:
Administered during instruction.
Enables learners to rehearse new knowledge and skills and to assess themselves.
Used to identify errors and misconceptions.
Posttest:
Administered following instruction.
Parallel to the pretest.
Measures objectives included in the instruction.
Evaluates instructional effectiveness and learner knowledge.
7. Designing and Developing a Criterion-Referenced Test
Verbal Information Domain
Objective-style test items: short answer, alternative response, matching, and multiple-choice items.
Learners either recall the information or they don't.
Intellectual Skills Domain
Complex.
Objective-style test items, the creation of a product, or a live performance.
Create a rubric to assess.
Attitudinal Domain
Complex.
Learners state their preferences, or their behavior is observed by the instructor.
Psychomotor Domain
A set of directions on how to demonstrate the task.
The learner usually performs a sequence of steps that represent the instructional goal.
A checklist or rating scale is used to assess.
9. How do you determine the mastery level?
Norm-Referenced:
Mastery = the level of performance normally expected from the BEST learners.
Statistical:
Mastery = sufficient opportunities to perform the skill should be provided so that it is nearly impossible for
correct performance to be the result of chance alone.
Explicit Level of Performance:
Mastery = the level needed both to evaluate the performance at that point in time and to enhance the learning of
subsequent, related skills.
Mastery = the level required to be SUCCESSFUL on the job.
11. Test Item Writing Techniques
Goal-Centered Criteria: Test items and tasks should reflect the objectives.
Learner-Centered Criteria: Test items and assessment tasks must be tailored to the characteristics and needs of the learners.
Context-Centered Criteria: Test items must be designed with the performance setting and the learning and classroom environment in mind.
Assessment-Centered Criteria: The design and creation of the test items themselves.
13. What is the proper number of items needed to determine
mastery of an objective?
Include several parallel test items for the objective if the item requires a response
format that will enable the student to guess the answer correctly.
If the likelihood of guessing is slim, then you may decide that one or two items
are sufficient to test the student's ability.
To assess intellectual skills, provide three or more opportunities to demonstrate
the skill.
For verbal information, only one item is needed.
For a psychomotor skill, ask the student to perform the skill under several different
conditions.
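The guessing guideline above can be made concrete with a small calculation (an illustrative sketch, not from the text): the probability that a learner answers every parallel item correctly by pure chance falls off quickly as items are added.

```python
# Illustrative sketch: why guessable formats need several parallel items
# per objective. Computes the chance of a learner answering every item
# correctly by random guessing alone.

def chance_of_guessing_all(num_items: int, options_per_item: int) -> float:
    """Probability of answering every item correctly by pure chance."""
    return (1 / options_per_item) ** num_items

# One 4-option multiple-choice item: a 25% chance of a lucky pass.
print(chance_of_guessing_all(1, 4))   # 0.25
# Three parallel items: the chance drops to about 1.6%.
print(chance_of_guessing_all(3, 4))   # 0.015625
```

This is the statistical mastery idea from the earlier slide: enough opportunities that correct performance is nearly impossible by chance alone.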
17. Objective tests include test items that are easy for learners to complete and for
designers to score. The answers are short, and they are scored as correct or incorrect.
Objective formats include completion, short answer, true/false, matching, and
multiple choice.
19. Sequencing Items
Sequencing is a common type of test item in which the learner is required to
place a number of items into the 'correct' order. This order may be temporal,
preferential, logical, or some other order, and must be clearly specified in the question. As
with multiple-choice questions, this type of question works best with a reasonable
number of possible answers, say at least five items to be sequenced.
Hand-scoring strategy: Cluster items for one objective together, regardless of item
format.
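As a hypothetical illustration of scoring a sequencing item (the partial-credit rule and the writing-process example below are assumptions, not from the text), one simple approach awards credit for each element placed in its correct position:

```python
# Hypothetical sketch: partial-credit scoring for a sequencing item,
# one point for each element placed in its correct position.

def sequencing_score(learner_order: list, answer_key: list) -> float:
    """Fraction of elements the learner placed in the correct position."""
    matches = sum(a == b for a, b in zip(learner_order, answer_key))
    return matches / len(answer_key)

# Example: the learner swaps two adjacent steps in a five-step sequence.
key = ["plan", "draft", "revise", "edit", "publish"]
print(sequencing_score(["plan", "revise", "draft", "edit", "publish"], key))  # 0.6
```

All-or-nothing scoring (1 only when the whole order matches) is equally defensible; the choice should be stated in the test directions.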
20. Writing Directions
Tests should include clear, concise directions:
1. The test title suggests the content to be covered.
2. A brief statement explains the objective or performance to be demonstrated and the
amount of credit that will be given.
3. Learners are told whether they should guess if unsure about an answer.
4. Directions specify whether words must be spelled correctly to receive full credit.
5. Learners are told whether they should use their names or identify themselves as group
members.
6. Time limits, word limits, or space limits are spelled out. Learners should be informed
whether they need anything special to respond to the test.
22. Steps in developing an instrument to measure performances,
products, or attitudes:
1. Writing directions
2. Developing the instrument
3. Identifying, paraphrasing, and sequencing elements
4. Developing the response format
5. Establishing the scoring procedure
23. Developing the Instrument
Develop a rubric to guide your evaluation of performances, products, or attitudes.
1. Identify the elements to be evaluated.
2. Paraphrase each element.
3. Sequence the elements on the instrument.
4. Select the type of judgment to be made by the evaluator.
5. Determine how the instrument will be scored.
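The five steps above can be sketched in code (a minimal, hypothetical example; the rubric elements and the yes/no judgment format are assumptions for illustration):

```python
# Hypothetical sketch of steps 1-5: a minimal rubric represented as
# (element, paraphrased label) pairs, judged yes/no and scored as a
# proportion.

rubric = [
    # Steps 1-3: elements identified, paraphrased, and sequenced.
    ("Introduction states the topic", "Topic stated"),
    ("Points follow a logical order", "Logical order"),
    ("Conclusion summarizes the points", "Summary given"),
]

def score_performance(judgments: list) -> float:
    """Step 5: score as the proportion of elements judged acceptable."""
    return sum(judgments) / len(judgments)

# Step 4: the evaluator records a yes/no judgment for each element.
print(score_performance([True, True, False]))  # ~0.67
```

A rating scale (e.g. 1-5 per element) could replace the yes/no judgment in step 4 without changing the overall structure.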
25. Portfolios
Portfolios are collections of criterion-referenced assessments that illustrate
learners' work.
Portfolios may include assessments of learners' attitudes, objective-style tests,
progress from pretest to posttest, products that learners develop during
instruction, and live performances.
Each assessment is evaluated as it is completed, and the overall assessment of the
portfolio is carried out at the end of the process using rubrics.
26. Criteria for designing a quality portfolio assessment
1. The instructional goal and objectives should be important enough to warrant
the increased time required for this assessment.
2. The work samples must be anchored to specific instructional goals and
performance objectives.
3. The work samples should be the criterion-referenced assessments that are
collected during the process of instruction.
4. The assessments are the regular pretests and posttests.
5. Each regular assessment is accompanied by its rubric, with the student's
responses evaluated and scored to indicate strengths.
28. Procedures for Evaluating the Design
1. Organize and present the materials to illuminate their relationships.
2. Judge the congruence between the information and skills in the instructional goal
analysis and the materials created.
3. Judge the congruence between the materials and the characteristics of the
target learners.
4. Judge the congruence between the performance and learning contexts and the
materials.
5. Judge the clarity of all the materials.
29. Structure of the Design Evaluation Chart

Subskill           | Performance Objective | Sample Assessment
-------------------|-----------------------|------------------
1                  | Objective 1           | Test Item
2                  | Objective 2           | Test Item
3                  | Objective 3           | Test Item
Instructional Goal | Terminal Objective    | Test Item
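One way to operationalize this chart (a hypothetical sketch; the row names and function below are illustrative, not from the text) is to represent each row as data and check it for congruence, flagging any subskill that lacks a performance objective or a sample assessment:

```python
# Hypothetical sketch: the design evaluation chart as data, with a
# simple congruence check that flags incomplete rows.

chart = {
    "Subskill 1": {"objective": "Objective 1", "assessment": "Test Item"},
    "Subskill 2": {"objective": "Objective 2", "assessment": "Test Item"},
    "Instructional Goal": {"objective": "Terminal Objective",
                           "assessment": "Test Item"},
}

def missing_entries(chart: dict) -> list:
    """Return the rows that lack an objective or an assessment item."""
    return [row for row, cells in chart.items()
            if not cells.get("objective") or not cells.get("assessment")]

print(missing_entries(chart))  # [] when every row is congruent
```

An empty result means every skill in the analysis is paired with both an objective and a test item, which is the congruence the evaluation procedures above call for.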
30. Change Agent, Reflective Practitioner, or Lifelong Learner
Change Agent: People who act as catalysts for change.
Reflective Practitioner: Someone who, at regular intervals, looks back at the
work they do, and the work process, and considers how they can improve. They
'reflect' on the work they have done.
Lifelong Learner: The ongoing, voluntary, and self-motivated pursuit of
knowledge for either personal or professional reasons.