501 assignment 9 chapter 12 revision

  1. Chapter 11 REVISING INSTRUCTIONAL MATERIALS KARL WRIGHT
  2. • Summarizing and analyzing data obtained from formative evaluation • Revising materials • The changes that are made to the content of the materials • The changes that are related to the procedures employed in using the materials BACKGROUND
  3. • Describe various methods for summarizing data obtained from formative evaluation studies. • Summarize data obtained from formative evaluation studies. • Given summarized formative evaluation data, identify weaknesses in instructional materials and instructor-led instruction. • Given formative evaluation data for a set of instructional materials, identify problems in the materials and suggest revisions for the materials. OBJECTIVES
  4. • Learner characteristics • Entry behavior • Direct responses to the instruction • Learning time • Posttest performance • Responses to an attitude questionnaire • Comments made directly in the materials KINDS OF DATA TO ANALYZE
  5. The designer must look at the similarities and differences among the responses of the learners, and determine the best changes to make in the instruction. Three Sources Of Suggestions For Changes • Learner suggestions • Learner performance • Your own reactions to the instruction ANALYZING DATA FROM ONE-TO-ONE TRIALS
  6. The fundamental unit of analysis for all the assessments is the individual assessment item. Performance on each item must be scored as correct or incorrect. Methods For Summarizing Data • Item-by-objective performance • Graphing learners’ performance • Descriptive fashion ANALYZING DATA FROM SMALL-GROUP TRIALS
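The item-by-objective tally described above is straightforward to mechanize. The sketch below is illustrative only — the function name, sample data, and 80% mastery cutoff are assumptions, not from the chapter. It scores each item as correct or incorrect, computes percent correct per item, and flags objectives whose mastery rate falls below the cutoff as candidates for revision.

```python
# Illustrative item-by-objective summary for small-group trial data.
# Each row of `scores` is one learner's scored responses (1 = correct,
# 0 = incorrect); item_to_objective maps each item to its objective.

from collections import defaultdict

def item_by_objective_summary(scores, item_to_objective, mastery_cutoff=0.8):
    """Return per-item percent correct, per-objective mastery rate, and
    the list of objectives falling below the mastery cutoff."""
    n_learners = len(scores)
    n_items = len(item_to_objective)

    # Percent of learners answering each item correctly.
    item_pct = [sum(learner[i] for learner in scores) / n_learners
                for i in range(n_items)]

    # Group item indices under the objective they measure.
    objective_items = defaultdict(list)
    for i, obj in enumerate(item_to_objective):
        objective_items[obj].append(i)

    # A learner masters an objective by getting all of its items right.
    objective_mastery = {}
    for obj, items in objective_items.items():
        mastered = sum(all(learner[i] for i in items) for learner in scores)
        objective_mastery[obj] = mastered / n_learners

    # Objectives below the cutoff point to instruction needing revision.
    weak = [obj for obj, pct in objective_mastery.items() if pct < mastery_cutoff]
    return item_pct, objective_mastery, weak

# Example: 4 learners, 4 items; items 0-1 measure objective "A", 2-3 "B".
scores = [[1, 1, 1, 0],
          [1, 1, 0, 0],
          [1, 0, 1, 1],
          [1, 1, 1, 1]]
item_pct, mastery, weak = item_by_objective_summary(scores, ["A", "A", "B", "B"])
```

The same table can of course be built by hand on paper; the point is that the unit of analysis stays the individual item, rolled up by objective.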
  7. • Comments can be captured in one-on-one charts where you list out the comments made by each learner • Assessment scores can be shown in charts or hierarchies that represent your individual objectives ANALYZING DATA FROM FIELD TRIAL
  8. • Derive assessment instruments based on the objectives to: • Diagnose an individual’s possession of the necessary prerequisites for learning new skills • Check the results of student learning during the course of a lesson • Provide documentation of student progress for parents or administrators • This is useful in evaluating the instructional system itself (formative/summative evaluation) and for early determination of performance measures before the development of lesson plans and instructional materials LEARNERS’ PERFORMANCE ACROSS TESTS
  9. • The goal of continuous monitoring and charting of student performance is twofold. First, it provides you, the teacher, with information about student progress on discrete, short-term objectives. It enables you to adjust your instruction to review or re-teach concepts or skills immediately, rather than waiting until you’ve covered several topics to find out that one or more students didn’t learn a particular skill or concept. Second, it provides your students with a visual representation of their learning. Students can become more engaged in their learning by charting and graphing their own performance. GRAPHING LEARNERS’ PERFORMANCES
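A performance chart of the kind described above needs only a series of percent-correct scores per assessment occasion. The hypothetical sketch below (function names and data are invented for illustration) renders a plain-text bar chart, but the same series could just as well feed a spreadsheet or plotting library.

```python
# Hypothetical sketch: track a learner's percent correct per session so
# progress can be charted over time. Rendering is a plain-text bar chart.

def percent_correct(sessions):
    """sessions: list of (n_correct, n_items) per assessment occasion."""
    return [round(100 * c / n) for c, n in sessions]

def text_chart(percents, width=20):
    """One bar per session, scaled so 100% fills `width` characters."""
    return ["{:3d}% |{}".format(p, "#" * (p * width // 100)) for p in percents]

# Four weekly quizzes of 10 items each for one learner.
sessions = [(4, 10), (6, 10), (7, 10), (9, 10)]
series = percent_correct(sessions)
chart = text_chart(series)
```

A student maintaining such a chart sees the upward (or flat) trend at a glance, which is exactly the engagement effect the slide describes.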
  10. • Teachers can circulate the room during an assessment to monitor students’ progress. If students are working independently or in groups, teachers should intervene when students do not understand the material. Teachers can also take note of students’ comments and participation levels during class discussions to gauge their learning. • SELECTED RESPONSE ASSESSMENTS are any type of objective exam where there is only one correct answer for each question. Multiple choice, fill-in-the-blank, matching, and true/false questions are all types of selected response assessments. This type of assessment allows the teacher to score exams quickly and with a large degree of reliability in scoring from one exam to another. • CONSTRUCTED RESPONSE ASSESSMENTS require students to generate their own response rather than selecting a single response from several possible ones. These exams are much more subjective, as there is not a single correct answer. Instead, teachers must grade either with a rubric or holistically to maintain a fair degree of reliability. OTHER TYPES OF DATA
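The scoring-reliability contrast above is why selected-response items are the easy case to automate: with one answer key, every rater (human or machine) produces the same score. A minimal sketch, with an invented key and responses:

```python
# Hypothetical sketch of machine-scoring a selected-response assessment:
# one answer key, one learner's responses, compared item by item.
# Constructed responses would instead need a rubric applied by a rater.

def score_selected_response(key, responses):
    """Return (per-item 0/1 scores, total correct) for one learner."""
    item_scores = [1 if r == k else 0 for r, k in zip(responses, key)]
    return item_scores, sum(item_scores)

# A 5-item quiz mixing multiple choice and true/false.
key = ["b", "d", "a", "true", "c"]
items, total = score_selected_response(key, ["b", "d", "c", "true", "c"])
```

The per-item 0/1 vector is exactly the input the item-by-objective summary needs, so selected-response data flows directly into the formative-evaluation analysis.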
  11. • PERFORMANCE ASSESSMENTS require students to perform as a means of showing they understand class material. The types of performances can include actual performing, as in a class debate, or performance by creating, as in making a brochure or TV ad. These assessments evaluate complex cognitive processes as well as attitudes and social skills, and students often find them engaging. • PORTFOLIO ASSESSMENTS evaluate a student’s progress over the course of the semester. A portfolio is more than a one-time picture of what a learner has accomplished; it includes all of a student’s work in a particular area. For example, a student in an English class could have a portfolio for a research paper that includes note cards, outlines, rough drafts, revisions, and a final draft. The teacher would evaluate the portfolio as a whole, not just the final draft, to see how the student has grown. OTHER TYPES OF DATA
  12. • The information on the clarity of instruction, impact on learner, and feasibility of instruction needs to be summarized and focused. • Particular aspects of the instruction found to be weak can then be reconsidered in order to plan revisions likely to improve the instruction for similar learners. SEQUENCE FOR EXAMINING DATA
  13. • A step-by-step determination of what people are doing when they perform the goal and what entry behaviors are needed. • Involves identification of the context in which the skills will be learned and the context in which the skills will be used. ENTRY BEHAVIORS
  14. • After the students in the one-to-one trials have completed the instruction, they should review the posttest and attitude questionnaire in the same fashion. • After each item or step in the assessment, ask the learners why they made the particular responses that they did. • This will help you spot not only mistakes but also the reasons for the mistakes, which can be quite helpful during the revision process. PRETESTS & POSTTESTS
  15. • Instructional strategy is an overall plan of activities to achieve an instructional goal; it includes the sequence of intermediate objectives and the learning activities leading to the instructional goal. • Its purpose is to identify the strategy to achieve the terminal objective and to outline how instructional activities will relate to the accomplishment of the objectives. • Emphasis is given to the presentation of information, practice and feedback, and testing. • A well-designed lesson should demonstrate knowledge about the learners, the tasks reflected in the objectives, and the effectiveness of teaching strategies. INSTRUCTIONAL STRATEGY
  16. • One design interest during one-to-one evaluation is determining the amount of time required for learners to complete the instruction; this is a very rough estimate because of the interaction between the learner and the designer. • You can attempt to subtract a certain percentage of the time from the total time, but experience has indicated that such estimates can be quite inaccurate. LEARNING TIME
  17. • Instructional strategy is an overall plan of activities to achieve an instructional goal; it includes the sequence of intermediate objectives and the learning activities leading to the instructional goal. • Its purpose is to identify the strategy to achieve the terminal objective and to outline how instructional activities will relate to the accomplishment of the objectives. • Emphasis is given to the presentation of information, practice and feedback, and testing. • A well-designed lesson should demonstrate knowledge about the learners, the tasks reflected in the objectives, and the effectiveness of teaching strategies. INSTRUCTIONAL PROCEDURE
  18. • Use the data, your experience, and sound learning principles as the bases for your revision. • The aim is to revise the instruction so as to make it as effective as possible for a larger number of students. • Data from the formative evaluation are summarized and interpreted to identify difficulties experienced by learners in achieving the objectives and to relate these difficulties to specific deficiencies in the materials. REVISION PROCESS
  19. 1. Omit portions of the instruction. 2. Include other available materials. 3. Simply develop supplementary instruction. REVISING SELECTED MATERIALS
  20. The final step in the design and development process (and the first step in a repeat cycle) is revising the instruction. Data from the formative evaluation are summarized and interpreted to identify difficulties experienced by learners in achieving the objectives and to relate those difficulties to specific deficiencies in the instruction. The data are used to re-examine the validity of the instructional analysis and the assumptions about the entry behaviors and characteristics of learners. It may be necessary to re-examine statements of performance objectives and test items in light of the collected data. The instructional strategy is reviewed, and finally all of these considerations are incorporated into revisions of the instruction to make it a more effective instructional tool. SUMMARY