HSD2 Assessment Question Creation
What to Assess?
- Standards
- Frameworks
- Pace
Assessment Question Type

Grade Level/Content                                          | Question Type                                              | Assessment
K-12 Specials & Electives (Visual Arts, Performing Arts, PE) | Multiple Choice                                            | CBM 2
K-12 English/Language Arts (Reading)                         | Multiple Choice (frameworks that are not passage specific) | CBM 2
K-12 English/Language Arts (Writing)                         | Multiple Choice & SCR (K-3)/ECR (4-12) Prompt              | CBM 2
K-12 Math                                                    | Multiple Choice & CR                                       | CBM 2
K-5 Science                                                  | Multiple Choice                                            | CBM 2
6-12 Science                                                 | Multiple Choice & CR                                       | CBM 2
6-12 Social Studies                                          | Multiple Choice & SCR                                      | End of Course Exam
Developed from How to Prepare Better Multiple-Choice Test Items: Guidelines for Teachers (Brigham Young University Testing Services and the Department of Instructional Science)
Anatomy of a Multiple-Choice Item
Reliability
Difficulty of Construction
Guidelines for Constructing Multiple-Choice Items
2. Base each item on a specific problem stated clearly in the stem using proper grammar, punctuation, and spelling.
The stem may consist of either a direct question or an incomplete sentence, whichever presents the problem more clearly and concisely.
3. Include as much of the item as possible in the stem, but do not include irrelevant material.
Excess material in the stem that is not essential to answering the problem increases the reading burden and adds to student confusion over what he or she is being asked to do.
4. State the stem in positive form (in general).
5. Word the alternatives clearly and concisely. Clear wording reduces student confusion, and concise wording reduces the reading burden placed on the student.
6. Keep the alternatives mutually exclusive.
7. Keep the alternatives homogeneous in content.
8.1 Keep the grammar of each alternative consistent with the stem.  Students often assume that inconsistent grammar is the sign of a distractor, and they are generally right.
 
8.2 Keep the alternatives parallel in form.  If the answer is worded in a certain way and the distractors are worded differently, the student may take notice and respond accordingly.
8.3 Keep the alternatives similar in length.  An alternative noticeably longer or shorter than the others is frequently assumed to be the answer, and not without good reason.
8.4 Avoid verbatim textbook phrasing.  If the answer has been lifted word-for-word from the pages of the textbook, students may recognize the phrasing and choose correctly out of familiarity rather than achievement.
8.5 Avoid the use of specific determiners.  When words such as never, always, and only are included in distractors in order to make them false, they serve as flags to the alert student.
8.6 Avoid including keywords in the alternatives.  When a word or phrase in the stem is also found in one of the alternatives, it tips the student off that the alternative is probably the answer.
8.7 Use plausible distractors.  For the student who does not possess the ability being measured by the item, the distractors should look as plausible as the answer. Unrealistic or humorous distractors are nonfunctional and increase the student’s chance of guessing the correct answer.
9. Avoid the alternatives “all of the above” and “none of the above” (in general).
10. Include one and only one correct or clearly best answer in each item.   When more than one of the alternatives can be successfully defended as being the answer, responding to an item becomes a frustrating game of determining what the teacher had in mind when he or she wrote the item.
11. Present the answer in each of the alternative positions approximately an equal number of times, in a random order.
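If items are assembled electronically, this balancing can be automated. The sketch below is a minimal illustration, not part of the original guidelines; the function name and the four-alternative format are assumptions. It shuffles the answer in among the distractors so that, over many items, the answer lands in each position about equally often:

    import random
    from collections import Counter

    def randomize_answer_position(answer, distractors, rng=random):
        """Shuffle the answer in among the distractors; return the
        alternatives in presentation order plus the answer's letter."""
        alternatives = [answer] + list(distractors)
        rng.shuffle(alternatives)
        key = "abcd"[alternatives.index(answer)]
        return alternatives, key

    # Sanity check: across many items, each position holds the answer ~25% of the time.
    counts = Counter(randomize_answer_position("right", ["w1", "w2", "w3"])[1]
                     for _ in range(10_000))
    print(counts)  # roughly equal counts for 'a', 'b', 'c', and 'd'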
12. Avoid using unnecessarily difficult vocabulary.
13. Avoid Question Bias – Be aware of questions that can: a) be construed as offensive to particular groups of individuals, b) portray groups of individuals unfavorably, or in a stereotypical fashion, c) be advantageous to one group, and/or disadvantageous to another, or d) be unfamiliar to certain groups of individuals.
Guidelines for Writing Content SCRs (Short Constructed Responses)
Example of Content Scoring Rubric
Constructed Response Questions
Example of CR Scoring Guide
What to Assess?
- Standards
- Frameworks
- Pace
Assessment Question Type

Grade Level/Content                                          | Question Type                                              | Assessment
K-12 Specials & Electives (Visual Arts, Performing Arts, PE) | Multiple Choice                                            | CBM 2
K-12 English/Language Arts (Reading)                         | Multiple Choice (frameworks that are not passage specific) | CBM 2
K-12 English/Language Arts (Writing)                         | Multiple Choice & SCR (K-3)/ECR (4-12) Prompt              | CBM 2
K-12 Math                                                    | Multiple Choice & CR                                       | CBM 2
K-5 Science                                                  | Multiple Choice                                            | CBM 2
6-12 Science                                                 | Multiple Choice & CR                                       | CBM 2
6-12 Social Studies                                          | Multiple Choice & SCR                                      | End of Course Exam
How To Add Images To Questions You Submit
You Are Ready to Submit

More Related Content

What's hot? (20)

Power point for the techniques for constructing exam items
Less #6 Types Of Tests
Basic testing techniques2
Best Practices in Item Writing
Writing Essay Test Items
Matching type of evaluation
Objective test
Writing objective test items
Types of test item
Writing objective Test Items
Constructing Objective Supply Type of Items
Item Writing Guidelines
8 essay test
Tests and Measurements Essay Questions
Constructing Objective and Subjective Test
Item writing
ESSAY TYPE OF TEST
Matching type
Subjective test
Item Analysis, Design and Test Formats
 

Viewers also liked (6)

Atlantis Park Chl
Residencial Origami Chl 2
Internet dlm pembelajaran
The Sydney Issue - Weekend TODAY 19 Feb 2011
Máximo Recreio Condominio Resort João Fortes
Vitality Spa Condominio
 

Similar to Assessment grounding

Similar to Assessment grounding (20)

Chapter 6: Writing Objective Test Items
TEST_CONSTRUCTION.pptx
typesofeval.pptx
Type of Test
Types of Essay Items
Essay Question -assessment
Sound design (ch 4 7)
Materi mc
Test construction essay tests
Constructing of tests
Item Writting.pptx
null-1.pptx
guidelines paper setters for people llllll
Essaytypetest
Types of evaluation tool
APUS Assignment Rubric Undergraduate Level 300-400 .docx
Analysis of Traditional Assessment in the EFL Teaching
Essay type test
Selected Response Items.pdf
Assessment of Learning - Multiple Choice Test
 

More from Doug (9)

01158951
Ca eett modules 1-4 - april 21
Harrison Institute Session Two
Brochures
Mod 5 Presentation Group Solar System
Group Storyboard
Module 5 Power Point In The Classroom And Lab
Using Excel In Your Classroom And Beyond
Using Web 2.0 In The Classroom... or lab
 


Editor's Notes

  1. Assess by standards, frameworks, and pace. Review the assessment schedule and background information.
  2. A standard multiple-choice test item consists of two basic parts: a problem (stem) and a list of suggested solutions (alternatives). The stem may be in the form of either a question or an incomplete statement, and the list of alternatives contains one correct or best alternative (answer) and a number of incorrect or inferior alternatives (distractors). The purpose of the distractors is to appear as plausible solutions to the problem for those students who have not achieved the objective being measured by the test item. Conversely, the distractors must appear as implausible solutions for those students who have achieved the objective. Only the answer should appear plausible to these students.
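The parts named in this note map naturally onto a simple data structure, which can be handy when items are stored or submitted electronically. The sketch below is illustrative only; the class and field names are hypothetical, not taken from the presentation:

    from dataclasses import dataclass, field

    @dataclass
    class MultipleChoiceItem:
        stem: str              # the problem: a direct question or an incomplete statement
        answer: str            # the one correct or clearly best alternative
        distractors: list = field(default_factory=list)  # plausible but incorrect alternatives

        @property
        def alternatives(self):
            """All suggested solutions: the answer plus the distractors."""
            return [self.answer] + self.distractors

    item = MultipleChoiceItem(
        stem="Which part of a multiple-choice item presents the problem?",
        answer="The stem",
        distractors=["A distractor", "The key", "The response sheet"],
    )
    print(item.alternatives)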
  3. Although they are less susceptible to guessing than are true-false test items, multiple-choice items are still affected to a certain extent. This guessing factor reduces the reliability of multiple-choice item scores somewhat, but increasing the number of items on the test offsets this reduction in reliability. The following examples illustrate this principle. If your test includes a section with only two multiple-choice items of 4 alternatives each (a, b, c, d), you can expect 1 out of 16 of your students to correctly answer both items by guessing blindly. On the other hand, if a section has 15 multiple-choice items of 4 alternatives each, you can expect only 1 out of 8,670 of your students to score 70% or more on that section by guessing blindly.
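Both figures in this note follow from the binomial distribution with a one-in-four chance of guessing each item. The quick check below is a sketch added here for verification; the original illustrated the principle with a table that is not reproduced in this transcript:

    from math import comb

    def p_guess_at_least(k_min, n_items, p=0.25):
        """Chance of answering at least k_min of n_items correctly by blind
        guessing, assuming four alternatives per item (p = 1/4)."""
        return sum(comb(n_items, k) * p**k * (1 - p)**(n_items - k)
                   for k in range(k_min, n_items + 1))

    print(1 / p_guess_at_least(2, 2))    # 16.0  -> 1 student in 16 answers both correctly
    print(1 / p_guess_at_least(11, 15))  # ~8673 -> about 1 in 8,670 scores 70%+ (11 of 15)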
  4. The stem is the foundation of the item. After reading the stem, the student should know exactly what the problem is and what he or she is expected to do to solve it. If the student has to infer what the problem is, the item will likely measure the student’s ability to draw inferences from vague descriptions rather than his or her achievement of a course objective.
  5. Rather than repeating redundant words or phrases in each of the alternatives, place such material in the stem to decrease the reading burden and more clearly define the problem in the stem. Notice how the underlined words are repeated in each of the alternatives in the poor example above. This problem is fixed in the better example, where the stem has been reworded to include the words common to all of the alternatives.
  6. The stem of the poor example above is excessively long for the problem it is presenting. The stem of the better example has been reworded to exclude most of the irrelevant material, and is less than half as long. Research. Several studies have indicated that including irrelevant material in the item stem decreases both the reliability and the validity of the resulting test scores (Haladyna & Downing, 1989b).
  7. Negatively-worded items are those in which the student is instructed to identify the exception, the incorrect answer, or the least correct answer. Such items are frequently used, because they are relatively easy to construct. The teacher writing the item need only come up with one distractor, rather than the two to four required for a positively-worded item. Positive items, however, are more appropriate to use for measuring the attainment of most educational objectives. For most educational objectives, a student’s achievement is more effectively measured by having him or her identify a correct answer rather than an incorrect answer. Just because the student knows an incorrect answer does not necessarily imply that he or she knows the correct answer. For this reason, items of the negative variety are not recommended for general use.
  8. The alternatives in the poor example above are rather wordy, and may require more than one reading before the student understands them clearly. In the better example, the alternatives have been streamlined to increase clarity without losing accuracy.
  9. Alternatives that overlap create undesirable situations. Some of the overlapping alternatives may be easily identified as distractors. On the other hand, if the overlap includes the intended answer, there may be more than one alternative that can be successfully defended as being the answer. In the poor example above, alternatives a and d overlap, as do alternatives b and c. In the better example, the alternatives have been rewritten to be mutually exclusive.
  10. If the alternatives consist of a potpourri of statements related to the stem but unrelated to each other, the student’s task becomes unnecessarily confusing. Alternatives that are parallel in content help the item present a clear-cut problem more capable of measuring the attainment of a specific objective. The poor example contains alternatives testing knowledge of state agriculture, physical features, flags, and nicknames. If the student misses the item, it does not tell the teacher in which of the four areas the student is weak. In the better example, all of the alternatives refer to state agriculture, so if the student misses the item, it tells the teacher that the student has a weakness in that area.
  11. The word “an” in the stem of the poor example above serves as a clue to the correct answer, “ adjective,” because the other alternatives begin with consonants. The problem has been corrected in the better example by placing the appropriate article, “an” or “a,” in each alternative.
  12. In the poor example above, the answer fits better grammatically with the stem than do the distractors. This problem has been solved in the better example by rewording the alternatives. Research. Several studies have found that grammatical clues make items easier (Haladyna & Downing, 1989b).
  13. The answer in the poor example above stands out because it does not include the identical wording underlined in each of the distractors. The answer is less obvious in the better example because the distractors have been reworded to be more parallel with the answer.
  14. Notice how the answer stands out in the poor example above. Both the answer and one of the distractors have been reworded in the better example to make the alternative lengths more uniform. Research. Numerous studies have indicated that items are easier when the answer is noticeably longer than the distractors than when all of the alternatives are similar in length (Haladyna & Downing, 1989b).
  15. The answer in the poor example above is a familiar definition straight out of the textbook, and the distractors are in the teacher’s own words.
  16. In the poor example above, the underlined word in each of the distractors is a specific determiner. These words have been removed from the better example by rewording both the stem and the distractors.
  17. In the poor example above, the underlined word “journal” appears in both the stem and the answer. This clue has been removed from the better example by replacing the answer with another valid answer that does not include the keyword. Research. Several studies have reported that items are easier when a keyword in the stem is also included in the answer (Haladyna & Downing, 1989b).
  18. The implausible distractors in the poor example have been replaced by more plausible distractors in the better example. Plausible distractors may be created in several ways, a few of which are listed below:
  • Use common student misconceptions as distractors. The incorrect answers supplied by students to a short answer version of the same item are a good source of material to use in constructing distractors for a multiple-choice item.
  • Develop your own distractors, using words that “ring a bell” or that “sound official.” Your distractors should be plausible enough to keep the student who has not achieved the objective from detecting them, but not so subtle that they mislead the student who has achieved the objective.
  19. These two alternatives are frequently used when the teacher writing the item has trouble coming up with a sufficient number of distractors. Such teachers emphasize quantity of distractors over quality. Unfortunately, the use of either of these alternatives tends to reduce the effectiveness of the item. Research. While research on the use of “all of the above” is not conclusive, the use of “none of the above” has been found in several studies to decrease item discrimination and test score reliability (Haladyna & Downing, 1989b).
  20. Such ambiguity is particularly a problem with items of the best answer variety, where more than one alternative may be correct, but only one alternative should be clearly best. If competent authorities cannot agree on which alternative is clearly best, the item should either be revised or discarded. While the answer to the poor example above is a matter of debate, the underlined phrase added to the better example clarifies the problem considerably and rules out all of the alternatives except the answer.
  21. The easiest method of randomizing the answer position is to arrange the alternatives in some logical order, such as alphabetical, numerical, or chronological order. The best order to use for a particular item depends on the nature of the item’s alternatives.
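For instance, numeric alternatives can simply be sorted, letting the answer's position fall wherever the ordering puts it. A minimal sketch, assuming plain string alternatives (the example values are hypothetical):

    # Arranging alternatives in a logical (here: numerical) order removes any
    # unintended pattern in where the item writer places the answer.
    years = ["1917", "1939", "1914", "1941"]
    print(sorted(years))  # ['1914', '1917', '1939', '1941']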
  22. Research. Although very little research has been done on this guideline, one study has reported that simplifying the vocabulary makes the items about 10% easier (Cassels & Johnstone, 1984).