Advanced Assessment and Evaluation
PRESENTED BY
MUHAMMAD MUNSIF
MPHIL EDUCATION (EVENING)
MUNSIFSAIL@GMAIL.COM
TEST CONSTRUCTION
• STEPS INVOLVED IN TEST CONSTRUCTION
• PLANNING THE CLASSROOM TEST
• DEVELOPING TEST SPECIFICATIONS
• RULES FOR CONSTRUCTING TEST ITEMS
Steps Involved in Test Construction
Effective test development/construction requires twelve steps:
1. Overall Plan
2. Content definition
3. Test specifications
4. Item development
5. Test design and assembly
6. Test production
7. Test administration
8. Scoring test responses
9. Passing scores
10. Reporting test results
11. Item banking
12. Test technical report
Step 1: Overall Plan
• What construct is to be measured?
• What score interpretations are desired?
• What test format or combination of formats (selected
response or constructed response/performance) is most
appropriate for the planned assessment?
• What test administration modality will be used (paper and
pencil or computer based)?
Step 2: Content Definition
•What content is to be tested?
•Content-defining methods vary in rigor, depending on the purpose of the
test, the consequences of decisions made from the resulting test scores, and
the degree of defensibility those decisions require.
•For lower-stakes achievement tests, content-defining methods may be very
simple and straightforward.
•For high-stakes achievement tests, content-defining methods must be
systematic, comprehensive, and defensible.
Step 3: Test Specifications
• The test specifications, or test blueprint, identify the objectives and
skills to be tested and the relative weight given to each on the test.
Test specifications must describe:
 The type of testing format to be used (selected response or
constructed response/performance)
 The total number of test items to be created or selected for the
test, as well as the type or format of test items
Step 3: Test Specifications
 The cognitive classification system to be used (e.g., modified
Bloom’s taxonomy)
 Whether or not the test items or performance prompts will contain
visual stimuli (e.g., photographs, graphs, charts)
 The expected item scoring rules (e.g., 1 point for correct, 0 points
for incorrect)
 How test scores will be interpreted (e.g., norm or criterion
referenced)
 The time limit for each item.
Step 4: Item Development
•Item development can proceed only when a clearly agreed-upon set of
objectives is available. The item format chosen depends on decisions made in
the test plan (e.g., purpose, audience, method of administration, scoring).
•Methods should be used to systematically develop selected-response items:
Creating effective test items (at the appropriate cognitive level)
Choosing which test item formats to use for the proposed examination
Step 5: Test Design and
Assembly
• Assembling a collection of test items into a test or test form is a critical
step in test development.
• The specific method and process of assembling test items into final test
forms depends on the mode of examination delivery.
Test Assembly Rules for Essay Tests.
1. All examinees must take the same items. Do not give them a chance to
choose which items they want to answer.
2. Meaningful comparisons normally can be made only if all examinees
take the same test.
Step 5: Test Design and
Assembly
Test Assembly Rules for Multiple-Choice Tests.
1. Set the number of items so that at least 95 percent of the examinees
can answer all items.
2. The correct choice should appear about an equal number of times in
each response position.
3. Directions to examinees should be written on the test to indicate
whether guessing is permitted or not.
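Rule 2 above (the key should occupy each response position about equally often) can be checked mechanically when assembling a form. A minimal sketch, with a hypothetical 12-item answer key invented for illustration:

```python
from collections import Counter

def key_position_counts(answer_key):
    """Count how often each response position (A, B, C, ...) holds the
    correct answer. A roughly uniform distribution satisfies the
    assembly rule that the key should appear about equally often in
    each position."""
    return Counter(answer_key)

# Hypothetical answer key for a 12-item multiple-choice form:
key = ["A", "C", "B", "D", "A", "B", "C", "D", "B", "A", "D", "C"]
counts = key_position_counts(key)
print(counts)  # each of A-D appears 3 times: well balanced
```

A large imbalance (e.g., "C" holding half the keys) would signal that the form should be reassembled before administration.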
Step 6: Test Production
•The production, printing, or publication of examinations in test
development is often overlooked with respect to its validity
aspects.
•Test production activities and their validity implications apply
equally to performance tests and selected-response tests.
•Test production security standards and policies must be developed and
implemented.
•Printers usually can provide some type of off-press copy for
final review by test development staff.
Step 7: Test
Administration
•The administration of tests is the most public and visible aspect of testing.
•Major validity issues are associated with test administration, because much
of the standardization of testing conditions relates to the quality of test
administration.
•Security is a major concern for test administration. For most examinations (e.g.,
large-scale, high-stakes tests), the entire test development process is highly
secure, with extremely limited access to test materials.
•For paper-and-pencil test administration, proctoring is one of the most critical
issues.
•For large-scale computer-based tests, proctoring is generally delegated to the
agency providing the computer test administration network.
Step 8: Scoring Test
Responses
•Test scoring is the process of applying a scoring key to examinee responses to the test
stimuli.
•Many fundamental validity issues are associated with test scoring. The most
obvious issue relates to the accuracy of scoring.
•Scoring can be extremely simple or very complex, depending on the type of
test item or stimuli.
•Responses to single-best-answer multiple-choice items are easily scored by
computer software, whereas responses to complex computer simulation problems
can be more challenging to score reliably.
•Final scoring of the examinee responses follows the preliminary scoring and
key validation procedures. The final answer key must be carefully proofread
and quality-controlled for absolute accuracy.
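For selected-response items, "applying a scoring key" with the dichotomous rule mentioned in Step 3 (1 point correct, 0 incorrect) reduces to a comparison of responses against the key. A minimal sketch, with a hypothetical 5-item key and examinee response set invented for illustration:

```python
def score_responses(responses, answer_key):
    """Apply a dichotomous scoring key to one examinee's responses:
    1 point for each match with the key, 0 otherwise; return the raw score."""
    return sum(1 for given, correct in zip(responses, answer_key)
               if given == correct)

answer_key = ["B", "D", "A", "C", "B"]        # hypothetical 5-item key
examinee   = ["B", "D", "C", "C", "A"]        # examinee's selected options
print(score_responses(examinee, answer_key))  # -> 3
```

Real scoring pipelines add exactly the quality controls the slide describes: key validation against preliminary item statistics before final scores are released.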
Step 9: Passing Scores
•Tests require some type of cut score (passing score) or
performance standard.
•The methods and procedures used to establish cut scores are a major source
of validity evidence and a fundamental part of the test development process.
•Methods and procedures used to establish passing scores may
take place at several different stages of test development,
depending on the methods used and the overarching philosophy of
standard setting adopted by the test developers and users.
Step 10: Reporting Test
Results
•Score reporting is an important, often complex, step in test
development.
•Examinees have a right to an accurate, timely, meaningful, and
useful report of their test performance.
•The reporting of test scores to examinees is an extremely
important step for all types of test development projects.
Step 11: Item Banking
•The process of securely storing test items for potential future use is typically
referred to as item banking.
•Because effective test questions are difficult to develop and consume
considerable resources, it is sensible to store items that perform well,
together with all their relevant performance data, for reuse on future forms
of the examination.
•An item bank can be as simple as a secure file cabinet holding paper copies
of examination questions, with the appropriate identifying information and
usage data (such as item difficulty and item discrimination).
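The usage data mentioned above have standard classical definitions: item difficulty is the proportion of examinees answering correctly (the p-value), and a simple discrimination index contrasts that proportion in high- and low-scoring groups. A minimal item-bank sketch; the item text, identifier, and response data are invented for illustration:

```python
def difficulty(correct_flags):
    """Classical item difficulty (p-value): proportion of examinees correct."""
    return sum(correct_flags) / len(correct_flags)

def discrimination(correct_flags, total_scores):
    """Simple upper-lower index: p(top half) - p(bottom half),
    with examinees ordered by total test score."""
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    half = len(order) // 2
    lower, upper = order[:half], order[-half:]
    p = lambda group: sum(correct_flags[i] for i in group) / len(group)
    return p(upper) - p(lower)

# 1 = correct, 0 = incorrect on one item, for six examinees,
# alongside those examinees' total test scores:
flags  = [1, 0, 1, 1, 0, 1]
totals = [38, 12, 35, 30, 15, 40]

item_bank = {
    "ITEM-0042": {                       # hypothetical identifier
        "stem": "Which step of test development produces the blueprint?",
        "key": "C",
        "difficulty": difficulty(flags),                 # 4/6 ≈ 0.67
        "discrimination": discrimination(flags, totals), # 2/3 ≈ 0.67
    }
}
print(item_bank["ITEM-0042"])
```

Storing these statistics with each item is what makes later form assembly and reuse defensible.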
Step 12: Test Technical Report
•A testing program with meaningful consequences for examinees should be
systematically documented and summarized in a technical report describing all
important aspects of test development, administration, scoring, reporting,
and test analysis and evaluation.
•Test developers are often unwilling to fully document their testing programs.
•Time and effort spent on such documentation are rewarded by the ease of
retrieval of important validity data and by their support of research efforts
on tests.
•Technical reports preserve essential validity evidence for the historical
record, including any recommendations for future testing program improvement.
Planning the Classroom Test
Defining Targets
Aims, Goals and Objectives
Taxonomies of Instructional Objectives
Bloom’s Taxonomy
SOLO Taxonomy
Defining Targets
A target specifies and unpacks the objective and spells out what students
will be able to do during and after the lesson or lesson series.
Aims, Goals and Objectives
•Aims: general statements that provide direction to
educational action.
•Goals: generalized statement about what is to be
learned.
•Objectives: specific statements of intended educational outcomes.
Taxonomies of Instructional Objectives
Taxonomy: an arrangement into groups.
Bloom’s Taxonomy
•Bloom's Taxonomy was created in 1956 under the
leadership of educational psychologist Dr. Benjamin
Bloom in order to promote higher order thinking in
education, rather than just remembering facts (rote
learning). It is most often used when designing
educational, training, and learning processes.
Bloom’s Taxonomy (Cognitive
Domain)
•Bloom’s taxonomy is a classification system for defining and
distinguishing different levels of human cognition.
•The committee identified three domains of educational activities
or learning (Bloom, et al. 1956):
Cognitive Domain: mental skills (knowledge) (Bloom’s
Taxonomy)
Affective Domain: growth in feelings or emotional areas (attitude
or self) (Krathwohl’s Taxonomy)
Psychomotor Domain: manual or physical skills (skills) (Harrow’s
Taxonomy)
SOLO Taxonomy
•SOLO stands for “Structure of Observed Learning
Outcome”.
•Developed by John Biggs and Kevin Collis in 1982.
•It describes levels of increasing complexity in students' understanding of
a subject through five stages/levels. Not all students progress through all
five stages, which shows differentiation.
Level of SOLO Taxonomy
1. Pre-structural: incompetence.
2. Uni-structural: one relevant aspect.
3. Multi-structural: several relevant and independent aspects.
4. Relational: aspects integrated into a structure.
5. Extended abstract: generalized to a new domain.
BLOOM'S TAXONOMY LEVELS    SOLO TAXONOMY LEVELS
Knowledge                  Uni-structural
Comprehension              Multi-structural
Application                Multi-structural
Analysis                   Relational
Synthesis                  Relational
Evaluation                 Extended Abstract
Developing Test Specification
•The first column lists the content areas to be assessed.
•The rest of the columns are devoted to levels of the objectives
assessed by individual items according to Bloom’s categories.
•The body of the table consists of the number of each item in
every category.
•A final column lists the totals of items in each content area.
•This table can then be converted into percentages, which
some experts prefer.
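The table described above can be sketched directly: rows are content areas, columns are Bloom's levels, cells hold item counts, and the row totals convert to the percentage form some experts prefer. The content areas and counts below are invented for illustration:

```python
# Table of specifications: content area -> {Bloom level -> item count}
blueprint = {
    "Fractions": {"Knowledge": 4, "Comprehension": 3, "Application": 3},
    "Decimals":  {"Knowledge": 2, "Comprehension": 4, "Application": 4},
}

total_items = sum(sum(row.values()) for row in blueprint.values())  # 20

# Final column (row totals) and the percentage version:
for area, row in blueprint.items():
    row_total = sum(row.values())
    print(area, row_total, f"{100 * row_total / total_items:.0f}%")
```

Seeing the percentages makes it easy to confirm that each content area carries the weight the test plan intended.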
Developing Test Specification
 A table of specifications should help you ensure that you cover the
curriculum content and include items of varying complexity.
 After designing the test, you can determine how you will interpret the
scores:
Norm-referenced scores: a student's performance is interpreted relative to
how other students performed (e.g., percentile ranks and standard scores).
Criterion-referenced scores: absolute, representing how well the individual
student has mastered the content (e.g., percent correct and
mastery/non-mastery scores).
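The two interpretations can be contrasted on the same raw score. A minimal sketch, using one common definition of percentile rank (percent of the group scoring below; other definitions count half of ties) and a hypothetical class of ten scores:

```python
def percentile_rank(score, all_scores):
    """Norm-referenced: percent of the group scoring below the given score."""
    below = sum(1 for s in all_scores if s < score)
    return 100 * below / len(all_scores)

def percent_correct(raw_score, n_items):
    """Criterion-referenced: absolute proportion of the content mastered."""
    return 100 * raw_score / n_items

group = [12, 15, 18, 20, 22, 25, 28, 30, 33, 35]  # hypothetical class scores
print(percentile_rank(28, group))   # 60.0 -> outscored 6 of 10 classmates
print(percent_correct(28, 40))      # 70.0 -> mastered 70% of a 40-item test
```

The same raw score of 28 yields two different statements: a relative standing (norm-referenced) and an absolute level of mastery (criterion-referenced).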
Rules/Guidelines for Constructing Test
Items
 Fixed-choice items (MCQs & alternative response)
 Supply-type items (restricted response, essay response, and extended
response)
Fixed-Choice Items (MCQs & Alternative Response)
This is the most common objective-type item. A multiple-choice item is a test
question with several alternative choices from which the examinee is to select
the correct answer. It is generally recommended to use 4 or 5 choices per
question whenever possible; using fewer alternatives often results in items
with inferior characteristics. The item choices are typically identified on
the test copy by the letters A through E.
Stem: This is the part of the item in which the problem is stated for the examinee. It
can be a question, a set of directions or a statement with an embedded blank.
Options/Alternatives: These are the choices given for the item.
Key: This is the correct choice for the item.
Distractors: These are the incorrect choices for the item.
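The item anatomy above (stem, options, key, distractors) maps naturally onto a small data structure. A minimal sketch; the sample item content is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str            # the part stating the problem for the examinee
    options: dict        # response position letter -> option text
    key: str             # letter of the correct choice

    @property
    def distractors(self):
        """All options other than the key, i.e. the incorrect choices."""
        return {pos: text for pos, text in self.options.items()
                if pos != self.key}

item = MultipleChoiceItem(
    stem="A test blueprint is produced during which step of test development?",
    options={"A": "Item banking", "B": "Test specifications",
             "C": "Test production", "D": "Scoring"},
    key="B",
)
print(sorted(item.distractors))  # ['A', 'C', 'D']
```

Separating key from distractors in the structure makes it easy to apply the writing rules that follow (e.g., checking that distractors are homogeneous with the key in form and length).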
Rules/Guidelines for Fixed
Choice items
The general rules used for writing multiple-choice items are described below.
Recognize that these are general rules; not all rules will be applicable to all types of
testing.
1. The stem should contain the problem and any qualifications. The entire stem must
always precede the alternatives.
2. Each item should be as short and verbally uncomplicated as possible. Give as much
context as is necessary to answer the question, but do not include unnecessary
information.
3. Avoid negatively stated items. If you have to use this kind of item, emphasize the fact
by underlining the negative part, putting it in capital letters or using italics.
4. If one or more alternatives are partially correct, ask for the "best" answer.
Rules/Guidelines for Fixed
Choice items
5. Try to test a different point in each question. If creating item clones, sufficiently change
the context, vocabulary, and order of alternatives, so that students cannot recognize the
two items as clones.
6. If an omission occurs in the stem, it should appear near the end of the stem and not at the
beginning.
7. Use a logical sequence for alternatives. If two alternatives are very similar (cognitively or
visually), they should be placed next to one another to allow students to compare them
more easily.
8. Make all incorrect alternatives (i.e., distractors) plausible and attractive. It is often useful
to use popular misconceptions and frequent mistakes as distractors.
9. All alternatives should be homogeneous in content, form and grammatical structure.
Rules/Guidelines for Fixed
Choice items
10. Make all alternatives grammatically consistent with the stem.
11. Use 4 or 5 alternatives in each item.
12. Avoid repeating words between the stem and key. It can be done, however, to make
distractors more attractive.
13. Alternatives should not overlap in meaning or be synonymous with one another.
14. Avoid terms such as "always" or "never," as they generally signal incorrect choices.
15. Do not use "none of the above" as a last option when the correct answer is simply the
best answer among the choices offered.
16. Try to avoid "all of the above" as a last option. If an examinee can eliminate any of the
other choices, this choice can be automatically eliminated as well.
Supply Type (Essay Response)
•Essay items are useful when examinees have to show how they arrived at an
answer.
•Writing ability is a good example of the kind of test that should be given in an
essay response format.
•Essays are difficult to score reliably and can require a significant amount
of time to grade.
•Grading is often affected by the verbal fluency in the answer, handwriting,
presence or lack of spelling errors, grammar used and the subjective
judgements of the grader.
•Training of graders can require a large amount of time and needs to be
repeated at frequent intervals throughout the grading.
Rules/Guidelines for Supply Type
Items
The following rules may be useful in developing and grading essay questions:
1. The shorter the answer required for a given essay item, generally the
better: more objectives can be tested in the same period, and factors such as
verbal fluency, spelling, etc., have less opportunity to influence the grader.
2. Help the examinees focus their answers by giving them a starting sentence
for their essay.
3. Make sure questions are sharply focused on a single issue.
4. Avoid humorous items. Classroom testing is very important and humorous items may cause
students to either not take the exam seriously or become confused or anxious.
5. Items should measure only the construct of interest, not one's knowledge
of the item context.
6. Write items to measure what students know, not what they do not know.
 

Kürzlich hochgeladen (20)

An Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdfAn Overview of Mutual Funds Bcom Project.pdf
An Overview of Mutual Funds Bcom Project.pdf
 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
 
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
SECOND SEMESTER TOPIC COVERAGE SY 2023-2024 Trends, Networks, and Critical Th...
 
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
Ecological Succession. ( ECOSYSTEM, B. Pharmacy, 1st Year, Sem-II, Environmen...
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Gardella_Mateo_IntellectualProperty.pdf.
Gardella_Mateo_IntellectualProperty.pdf.Gardella_Mateo_IntellectualProperty.pdf.
Gardella_Mateo_IntellectualProperty.pdf.
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
Mattingly "AI & Prompt Design: Structured Data, Assistants, & RAG"
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Making and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdfMaking and Justifying Mathematical Decisions.pdf
Making and Justifying Mathematical Decisions.pdf
 

Test construction

  • 1. Advanced Assessment and Evaluation PRESENTED BY MUHAMMAD MUNSIF MPHIL EDUCATION (EVENING) MUNSIFSAIL@GMAIL.COM
  • 2. TEST CONSTRUCTION • STEPS INVOLVED IN TEST CONSTRUCTION • PLANNING THE CLASSROOM TEST • DEVELOPING TEST SPECIFICATIONS • RULES FOR CONSTRUCTING TEST ITEMS
  • 3. Steps Involved in Test Construction. Effective test development/construction requires twelve steps: 1. Overall plan 2. Content definition 3. Test specifications 4. Item development 5. Test design and assembly 6. Test production 7. Test administration 8. Scoring test responses 9. Passing scores 10. Reporting test results 11. Item banking 12. Test technical report
  • 4. Step 1: Overall Plan • What construct is to be measured? • What score interpretations are desired? • What test format or combination of formats (selected response or constructed response/performance) is most appropriate for the planned assessment? • What test administration modality will be used (paper and pencil or computer based)?
  • 5. Step 2: Content Definition • What content is to be tested? • Content-defining methods vary in rigor, depending on the purpose of the test, the consequences of decisions made from the resulting test scores, and the degree of defensibility required for those decisions. • For lower-stakes achievement tests, content-defining methods may be simple and straightforward. • For high-stakes achievement tests, content-defining methods must be systematic, comprehensive, and defensible.
  • 6. Step 3: Test Specifications • The test specifications, or test blueprint, identify the objectives and skills to be tested and the relative weight given to each on the test. Test specifications must describe:  The type of testing format to be used (selected response or constructed response/performance)  The total number of test items to be created or selected for the test, as well as the type or format of test items
  • 7. Step 3: Test Specifications  The cognitive classification system to be used (e.g., modified Bloom’s taxonomy)  Whether or not the test items or performance prompts will contain visual stimuli (e.g., photographs, graphs, charts)  The expected item scoring rules (e.g., 1 point for correct, 0 points for incorrect)  How test scores will be interpreted (e.g., norm or criterion referenced)  The time limit for each item.
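The blueprint logic on these two slides can be sketched as a small data structure; the content areas, cognitive levels, and item counts below are invented for illustration:

```python
# Hypothetical test blueprint: content areas crossed with cognitive levels.
# Cell values are planned item counts; relative weights follow from row totals.

blueprint = {
    #  content area:     (knowledge, application, analysis)
    "Fractions":         (4, 3, 1),
    "Decimals":          (3, 2, 1),
    "Percentages":       (2, 2, 2),
}

total_items = sum(sum(row) for row in blueprint.values())

for area, counts in blueprint.items():
    row_total = sum(counts)
    weight = 100 * row_total / total_items  # relative weight as a percentage
    print(f"{area:12s} items={row_total:2d} weight={weight:.0f}%")

print("total items:", total_items)
```

Converting the row totals to percentages, as the later "Developing Test Specification" slide suggests, is a one-line derivation from the same table.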
  • 8. Step 4: Item Development • Item development can proceed only when a clearly agreed-upon set of objectives is available. The item format chosen depends on decisions made in the test plan (e.g., purpose, audience, method of administration, scoring). • Methods should be used to develop selected-response items systematically: creating effective test items at the appropriate cognitive level and deciding which item formats to use for the proposed examination.
  • 9. Step 5: Test Design and Assembly • Assembling a collection of test items into a test or test form is a critical step in test development. • The specific method and process of assembling test items into final test forms depends on the mode of examination delivery. Test Assembly Rules for Essay Tests: 1. All examinees must take the same items; do not give them a choice of which items to answer. 2. Meaningful comparisons normally can be made only if all examinees take the same test.
  • 10. Step 5: Test Design and Assembly Test Assembly Rules for Multiple-Choice Tests: 1. Set the number of items so that at least 95 percent of the examinees can attempt all items. 2. The correct choice should appear about an equal number of times in each response position. 3. Directions written on the test should indicate whether guessing is permitted.
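The key-balancing rule (rule 2) can be checked mechanically; the answer key below is hypothetical:

```python
from collections import Counter

# Hypothetical answer key for a 20-item multiple-choice form.
key = list("ABCDDCBAABCDDCBAABCD")

# How often each response position (A-D) holds the correct answer.
positions = Counter(key)
print(positions)

# Rule of thumb from the slide: each position should be used about equally.
expected = len(key) / len(positions)
balanced = all(abs(n - expected) <= 1 for n in positions.values())
print("key positions roughly balanced:", balanced)
```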
  • 11. Step 6: Test Production • The production, printing, or publication of examinations is an often-overlooked aspect of test development with respect to validity. • Test production activities and their validity implications apply equally to performance tests and selected-response tests. • Test production security standards and policies must be developed and implemented. • Printers usually can provide some type of off-press copy for final review by test development staff.
  • 12. Step 7: Test Administration • The administration of tests is the most public and visible aspect of testing. • Major validity issues are associated with test administration, because much of the standardization of testing conditions depends on the quality of test administration. • Security is a major concern for test administration. For most examinations (e.g., large-scale, high-stakes tests), the entire test development process is highly secure, with extremely limited access to test materials. • For paper-and-pencil test administration, proctoring is one of the most critical issues. • For large-scale computer-based tests, proctoring is generally delegated to the agency providing the computer test administration network.
  • 13. Step 8: Scoring Test Responses • Test scoring is the process of applying a scoring key to examinee responses to the test stimuli. • Many fundamental validity issues are associated with test scoring; the most obvious relates to accuracy of scoring. • Scoring can be extremely simple or very complex, depending on the type of test item or stimuli. • Responses to single-best-answer multiple-choice items are easily scored by computer software, whereas responses to complex computer simulation problems can be more challenging to score reliably. • Final scoring of the examinee responses follows the preliminary scoring and key-validation procedures. The final answer key must be carefully proofread and quality-controlled for absolute accuracy.
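Scoring single-best-answer items against a key is straightforward; a minimal sketch with invented responses:

```python
# Applying a scoring key (1 point correct, 0 incorrect) to examinee responses.
# The key and responses are invented for illustration.

key = ["B", "D", "A", "C", "B"]

responses = {
    "examinee_1": ["B", "D", "A", "C", "A"],
    "examinee_2": ["B", "C", "A", "C", "B"],
}

def raw_score(answers, key):
    """Number of responses matching the key."""
    return sum(a == k for a, k in zip(answers, key))

for name, answers in responses.items():
    print(name, raw_score(answers, key))
```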
  • 14. Step 9: Passing Scores • Tests require some type of cut score (passing score) or performance standard. • The methods and procedures used to establish cut scores are a major source of validity evidence and are a fundamental part of the test development process. • Methods and procedures used to establish passing scores may take place at several different stages of test development, depending on the methods used and the overarching philosophy of standard setting adopted by the test developers and users.
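One widely used standard-setting method is the Angoff procedure, in which each judge estimates, per item, the probability that a minimally competent examinee answers correctly; the cut score is the sum of the mean ratings. A sketch with invented ratings:

```python
# Minimal Angoff-style cut-score sketch. The judge ratings are invented;
# each value is a judge's estimated probability that a minimally competent
# examinee answers that item correctly.

judge_ratings = [
    # item1 item2 item3 item4 item5
    [0.7, 0.5, 0.9, 0.6, 0.8],  # judge A
    [0.6, 0.6, 0.8, 0.5, 0.7],  # judge B
    [0.8, 0.4, 0.9, 0.7, 0.6],  # judge C
]

n_items = len(judge_ratings[0])
item_means = [sum(judge[i] for judge in judge_ratings) / len(judge_ratings)
              for i in range(n_items)]
cut_score = sum(item_means)  # recommended passing score in raw-score points
print(f"cut score: {cut_score:.2f} out of {n_items}")
```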
  • 15. Step 10: Reporting Test Results •Score reporting is an important, often complex, step in test development. •Examinees have a right to an accurate, timely, meaningful, and useful report of their test performance. •The reporting of test scores to examinees is an extremely important step for all types of test development projects.
  • 16. Step 11: Item Banking • The process of securely storing test items for potential future use is typically referred to as item banking. • Because effective test questions are difficult to develop and require substantial resources to produce, it is sensible to store items that perform well, together with all their relevant performance data, for reuse on a future form of the examination. • Item banks can be as simple as a secure file cabinet containing paper copies of examination questions, with the appropriate identifying information and test-usage data (such as item difficulty and item discrimination).
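The usage data mentioned above can be derived from a response matrix; this sketch computes item difficulty (proportion correct) and a simple upper-lower discrimination index from invented 1/0 scores:

```python
# Invented response matrix: rows = examinees, columns = items (1 = correct).

scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]

n = len(scores)
totals = [sum(row) for row in scores]

# Split examinees into upper and lower halves by total score.
ranked = sorted(range(n), key=lambda i: totals[i], reverse=True)
upper, lower = ranked[: n // 2], ranked[n // 2:]

item_bank = {}
for j in range(len(scores[0])):
    p = sum(row[j] for row in scores) / n                 # difficulty (p-value)
    d = (sum(scores[i][j] for i in upper) -
         sum(scores[i][j] for i in lower)) / (n // 2)     # discrimination index
    item_bank[f"item_{j + 1}"] = {"difficulty": round(p, 2),
                                  "discrimination": round(d, 2)}

print(item_bank)
```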
  • 17. Step 12: Test Technical Report • A testing program with meaningful consequences for examinees should be systematically documented and summarized in a technical report describing all important aspects of test development, administration, scoring, reporting, and test analysis and evaluation. • Test developers are often reluctant to fully document their testing programs. • Time and effort spent on such documentation are rewarded by ease of retrieval of important validity data and by the support they lend to research efforts on tests. • Technical reports preserve essential validity evidence for the historical record, including any recommendations for future testing-program improvement.
  • 18. Planning the Classroom Test Defining Targets Aims, Goals and Objectives Taxonomies of Instructional Objectives Bloom’s Taxonomy SOLO Taxonomy
  • 19. Defining Targets A target specifies and unpacks the objective and spells out what students will be able to do during and after the lesson or lesson series.
  • 20. Aims, Goals and Objectives • Aims: general statements that provide direction to educational action. • Goals: generalized statements about what is to be learned. • Objectives: focused statements of educational outcomes.
  • 21. Taxonomies of Instructional Objectives. Taxonomy: an arrangement into groups.
  • 22. Bloom’s Taxonomy • Bloom's Taxonomy was created in 1956 under the leadership of educational psychologist Dr. Benjamin Bloom in order to promote higher-order thinking in education, rather than mere memorization of facts (rote learning). It is most often used when designing educational, training, and learning processes.
  • 23. Bloom’s Taxonomy (Cognitive Domain) •Bloom’s taxonomy is a classification system for defining and distinguishing different levels of human cognition. •The committee identified three domains of educational activities or learning (Bloom, et al. 1956): Cognitive Domain: mental skills (knowledge) (Bloom’s Taxonomy) Affective Domain: growth in feelings or emotional areas (attitude or self) (Krathwohl’s Taxonomy) Psychomotor Domain: manual or physical skills (skills) (Harrow’s Taxonomy)
  • 25. SOLO Taxonomy • SOLO stands for “Structure of Observed Learning Outcome”. • Developed by John Biggs and Kevin Collis in 1982. • It describes levels of increasing complexity in a student’s understanding of a subject through five stages/levels. Not all students pass through all five stages, which shows differentiation.
  • 26. Levels of SOLO Taxonomy 1. Pre-structural: incompetence. 2. Uni-structural: one relevant aspect. 3. Multi-structural: several relevant but independent aspects. 4. Relational: aspects integrated into a structure. 5. Extended abstract: understanding generalized to a new domain.
  • 28. Mapping Bloom’s Taxonomy levels to SOLO Taxonomy levels: Knowledge → Uni-structural; Comprehension and Application → Multi-structural; Analysis and Synthesis → Relational; Evaluation → Extended Abstract.
  • 31. Developing Test Specification •The first column lists the content areas to be assessed. •The rest of the columns are devoted to levels of the objectives assessed by individual items according to Bloom’s categories. •The body of the table consists of the number of each item in every category. •A final column lists the totals of items in each content area. •This table can then be converted into percentages, which some experts prefer.
  • 32. Developing Test Specification  A table of specifications should help you ensure that you cover the curriculum content and include items of varying complexity.  After designing the test, you can determine how you will interpret the scores: Norm-referenced scores: a student’s performance is interpreted relative to how other students performed (e.g., percentile ranks and standard scores). Criterion-referenced scores: absolute; they represent how well the individual student has mastered the content (e.g., percent correct and mastery/non-mastery scores).
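The two interpretations can be computed from the same raw scores; a sketch with invented examinees (50-item test, hypothetical 80% mastery cut; the percentile rank here uses a simple percent-below form):

```python
# Criterion-referenced vs norm-referenced views of the same raw scores.
# Examinee names and scores are invented for illustration.

raw_scores = {"Ali": 42, "Sara": 35, "Omar": 47, "Hina": 35, "Zain": 28}
n_items = 50

report = {}
for name, raw in raw_scores.items():
    percent_correct = 100 * raw / n_items                  # criterion-referenced
    below = sum(1 for s in raw_scores.values() if s < raw)
    percentile_rank = 100 * below / len(raw_scores)        # norm-referenced (simple form)
    report[name] = {
        "percent_correct": percent_correct,
        "mastery": percent_correct >= 80,                  # hypothetical 80% cut
        "percentile_rank": percentile_rank,
    }

for name, r in report.items():
    print(name, r)
```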
  • 33. Rules/Guidelines for Constructing Test Items  Fixed-choice items (MCQs and alternative response).  Supply type (restricted response, essay response and extended response).
  • 34. Fixed-Choice Items (MCQs and Alternative Response) This is the most common objective-type item. A multiple-choice item is a test question offering several alternative choices from which the examinee is to select the correct answer. It is generally recommended that one use 4 or 5 choices per question whenever possible; using fewer alternatives often results in items with inferior characteristics. The item choices are typically identified on the test copy by the letters A through E. Stem: the part of the item in which the problem is stated for the examinee. It can be a question, a set of directions, or a statement with an embedded blank. Options/Alternatives: the choices given for the item. Key: the correct choice for the item. Distractors: the incorrect choices for the item.
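The anatomy described above (stem, options, key, distractors) maps naturally onto a small data structure; the item itself is invented, using facts from slide 23:

```python
# One way to represent a multiple-choice item's parts. The item is invented.

item = {
    "stem": "Krathwohl's taxonomy classifies objectives in which domain?",
    "options": {
        "A": "Cognitive",
        "B": "Affective",
        "C": "Psychomotor",
        "D": "Interpersonal",
    },
    "key": "B",
}

# Distractors are simply the options that are not the key.
distractors = [label for label in item["options"] if label != item["key"]]
print("key:", item["options"][item["key"]])
print("distractors:", distractors)
```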
  • 35. Rules/Guidelines for Fixed Choice items The general rules used for writing multiple-choice items are described below. Recognize that these are general rules; not all rules will be applicable to all types of testing. 1. The stem should contain the problem and any qualifications. The entire stem must always precede the alternatives. 2. Each item should be as short and verbally uncomplicated as possible. Give as much context as is necessary to answer the question, but do not include unnecessary information. 3. Avoid negatively stated items. If you have to use this kind of item, emphasize the fact by underlining the negative part, putting it in capital letters or using italics. 4. If one or more alternatives are partially correct, ask for the "best" answer.
  • 36. Rules/Guidelines for Fixed Choice items 5. Try to test a different point in each question. If creating item clones, sufficiently change the context, vocabulary, and order of alternatives, so that students cannot recognize the two items as clones. 6. If an omission occurs in the stem, it should appear near the end of the stem and not at the beginning. 7. Use a logical sequence for alternatives. If two alternatives are very similar (cognitively or visually), they should be placed next to one another to allow students to compare them more easily. 8. Make all incorrect alternatives (i.e., distractors) plausible and attractive. It is often useful to use popular misconceptions and frequent mistakes as distractors. 9. All alternatives should be homogeneous in content, form and grammatical structure.
  • 37. Rules/Guidelines for Fixed-Choice Items 10. Make all alternatives grammatically consistent with the stem. 11. Use 4 or 5 alternatives in each item. 12. Avoid repeating words between the stem and key. It can be done, however, to make distractors more attractive. 13. Alternatives should not overlap in meaning or be synonymous with one another. 14. Avoid terms such as "always" or "never," as they generally signal incorrect choices. 15. Do not use "none of the above" as a last option when the correct answer is simply the best answer among the choices offered. 16. Try to avoid "all of the above" as a last option. If an examinee can eliminate any of the other choices, this choice can be automatically eliminated as well.
  • 38. Supply Type (Essay Response) • Essay items are useful when examinees have to show how they arrived at an answer. • Writing ability is a good example of the kind of skill that should be tested in an essay-response format. • Essay items are difficult to score reliably and can require significant time to grade. • Grading is often affected by verbal fluency in the answer, handwriting, presence or absence of spelling errors, grammar, and the subjective judgements of the grader. • Training graders can require a large amount of time and needs to be repeated at frequent intervals throughout the grading.
  • 39. Rules/Guidelines for Supply-Type Items The following rules may be useful in developing and grading essay questions: 1. The shorter the answer required for a given essay item, generally the better: more objectives can be tested in the same period, and factors such as verbal fluency and spelling have less opportunity to influence the grader. 2. Help the examinees focus their answers by giving them a starting sentence for their essay. 3. Make sure questions are sharply focused on a single issue. 4. Avoid humorous items. Classroom testing is very important, and humorous items may cause students either to not take the exam seriously or to become confused or anxious. 5. Items should measure only the construct of interest, not one's knowledge of the item context. 6. Write items to measure what students know, not what they do not know.