Quality
Assessment Design
Introduce yourself to your neighbor... and
what do you think?
Using Assessments: What do you
think?
The Most Powerful Use of My Assessment is For:
✓ Measuring Learning?
✓ Grades?
✓ Feedback for students?
✓ Measuring Growth?
✓ Another?
Welcome to the AGE OF ASSESSMENT
Lots of Choices. How do we make good
choices?
Consider this: Is the data you get from
administering that assessment more valuable
than the instructional time lost to administer the
assessment?
Every time we give an
assessment….
...we give up instructional time.
Measure the change in student understanding
over time...
Growth Tells Us:
A: Student’s baseline performance (beginning of the year)
B: Student’s subsequent performance (middle, end, or another year)
Growth is the difference between A and B.
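Because growth is simply the change between two comparable measures, the calculation itself is easy once the assessments are mirrored. A minimal sketch in Python, with hypothetical scores (my illustration, not from the deck):

```python
# Minimal sketch (illustration only): growth as the difference between
# a baseline measure (A) and a later measure (B) on mirrored assessments.

def growth(baseline: float, later: float) -> float:
    """Growth = later score minus baseline score on comparable tests."""
    return later - baseline

# Hypothetical student: 12 correct on the fall pre-test,
# 18 correct on the mirrored mid-year check.
print(growth(baseline=12, later=18))  # 6 more questions correct = growth
```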
How Much do we Assess?
Assessment Design
Common Question:
Why are we so focused on ASSESSMENT
...when INSTRUCTION is key to student learning?
● Assessment and instruction go hand in hand.
● Assessments tell us if instruction is working.
● Assessments tell us what to do next.
Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps.
SLO: Student Learning Objectives. The Illinois-recommended model framework for including growth on teacher evaluations.
Assessment Design
Pivot Points in your lessons are the way you MAKE SURE all students are growing!
MAKE SURE you and your students are reaching your goals!
You have to get the right kind of data to make a pivot!
Key Tool: GOOD formative assessments.
Formative Assessment: Assessments for learning: they occur during the instructional interval and provide information about student learning progress.
Summative Assessment: Assessments of learning: they occur at the end of an instructional interval and provide a final measurement of student mastery.
(Stiggins, Arter, Chappuis, 2004) (Reeves, 2009) (DuFour et al., 2010)
How an assessment is used really determines if it is FORMATIVE or SUMMATIVE! (Bailey & Jakicic, 2012)
Attainment: Meeting a target on one test. Measurement of knowledge at a single point in time. (ex: Final Exam, Chapter Test, Spelling Test)
Growth: Change between two tests. Change in understanding/knowledge over time.
Evaluation Connection: Measuring Student Growth is now in the assessment SPOTLIGHT!
Why Did We Start Teaching?
Traditionally... Each unit is an
unrelated silo...
You cannot measure longitudinal learning this way.
Assessment Implementation Model
Assess the threads that run throughout the instructional interval.
In a growth model: Baseline → Mid-point Check → Final Post-test
Is 3 a MAGIC number? No.
You want to have some formative assessment between your pre-test and post-test.
Data from Assessments
“Every time we gave an assessment and realized we were headed in the wrong direction, it gave us a deeper understanding of where we wanted to go.”
Discussion:
How do we use FORMATIVE Assessments?
● How do assessment results inform curriculum
and instructional choices?
● Do your tools provide information on where/how
to pivot for your students?
● Is the data easy to interpret?
Bring Growth to your Data Picture
Formative Assessments: Inform teaching
Multiple, mirrored formative assessments: Show growth
In the classroom:
PERA is changing teacher evaluations.
● Fall of 2016:* all school districts must incorporate student growth data into their evaluation instruments
*RTTT Districts, SIG Districts, and districts in status have earlier implementation requirements
PERA Law-ISBE Requirements
The Evaluation Connection: Student Growth
The SLO Evaluation Cycle
Source: Lachlan-Haché, L., Cushing, E., & Bivona, L. (2012). Student learning objectives as measures of educator effectiveness: The basics. Washington, DC: American Institutes for Research. Retrieved from http://educatortalent.org/inc/docs/SLOs_Measures_of_Educator_Effectiveness.pdf
The Evaluation Connection: Student Growth
Bring Growth to your Data Picture
On the Evaluation:
Student Growth (~30%)
Assessment Set Results-Growth Data
Professional Responsibility (~70%)
1b: Demonstrate knowledge of students' skills (student data)
1e: Designing coherent instruction (pivot points)
1f: Designing student assessments (formative assessments and use to plan instruction)
3d: Using assessment in instruction (assessments integrated; students monitor progress)
3e: Demonstrating flexibility and responsiveness (enhance learning on individual student
misunderstanding)
4a: Reflecting on Teaching (Offers alternate actions, thoughtful assessment of lesson effectiveness)
4b: Maintaining Accurate Records (Students maintain progress and learning)
The Evaluation Connection: Student Growth
There is a need for artifacts to PROVE:
● Students are learning and growing
● Teachers are designing instruction around
assessment data
The Instructional Interval
What is an Instructional Interval?
It depends on your district/class/course.
It may be a full year or a portion of a year.
The Elephant in the Room
Timing:
How will this work with
other evaluation concentrations?
How long will my “Instructional Interval” be?
Gathering Data for Growth Goals
The Evaluation Connection: Student Growth
Creating Valid and Reliable Assessments for Growth
How can teachers create or choose assessments that can quantify change in student understanding? How can we reliably put a number to student growth?
Key Features:
1. Mirrored Form
2. Mirrored Content
3. Mirrored Complexity
Measure Growth: All At The Same Level of Difficulty
Misconception: The test gets harder as the student learns more material…
In Reality: All assessments are at the same level; the test does not change difficulty:
More questions correct = Growth and Learning
We will discuss this more later!!
Quality
Assessments
The Elements &
Decisions
Assessments in a Nutshell
Consider Attainment vs Growth...
Assessments in a Nutshell
My students grow and learn every day!!
Student learning IS always happening in my
classroom.
So…
HOW can we gather information/data/artifacts to prove growth is happening?
Measure Change
Measure Student Learning
Measure Student Growth
Key: High Quality, Mirrored Assessment Sets
Important Term:
Mirrored Assessment Set: A series of comparable assessments that can measure learning over 2 or more points in time. They are designed with the same form, content, and level of complexity.
Growth in a Nutshell
Important Terms:
High Validity: Accurately measure what we intend to measure.
Ask: What are you trying to measure?
Bad Examples:
● A writing assignment with a difficult reading passage
● A reading passage that allows students of one cultural background to have an advantage over others
High Reliability: Repeatable results.
Ask: Is this repeatable?
Bad Examples:
● Changing the rubric at several points in the same school year
● Different directions to different classes
✓ Design a Blueprint
1. Consider course skills to measure
2. Consider format to best measure those skills
3. Consider complexity of execution of those skills
✓ Repeat the Blueprint Pattern
1. Repeated results will inform pivot points
✓ Consider Assessment Validity
✓ Consider Assessment Reliability
HOW CAN WE DO IT?
Build High Quality Assessment Sets
Blueprint SNEAK PREVIEW:
Blueprints are tools for:
● Organizing standard alignment
● Organizing spectrum of question complexity
● Creating a repeatable model to monitor student learning on skills over time
Let’s go step-by-step… and set the scaffolding to write high quality assessments!
Creating Quality Assessments
The 3 Elements
WHAT (Content/Skills): ● Skills ● Standards ● Power Standards
HOW (Form): ● Selected Response (ex: multiple choice) ● Constructed Response ● Performance Based
DIFFICULTY (Level of Complexity): ● Level of cognitive demand ● Question difficulty ● Difficulty of reading passage ● Difficulty of prompt
Aligning to Common Core
Essential Skills/Knowledge
Imagine a list of everything taught in an instructional interval.
Will you assess and re-assess EVERYTHING to monitor student growth? Not likely! Would you have time? No!
Start by identifying Essential Skills/Knowledge.
I teach so many different things…
How can I determine what is “essential”?
How do we prioritize?
Essential Skills/Knowledge
1. Endurance: Will this standard or indicator provide students with knowledge and skills that will be of value beyond a single test date? This is information a student will need to know far beyond the last test the teacher gives.
Examples:
#1: Name the parts of the flower. IF you ask students to do this for one test and never again all year long...it doesn’t have endurance.
#2: Clap out a rhythm of quarter notes and half notes. IF you will revisit quarter notes and half notes again and again, they have endurance.
2. Leverage: Will this provide knowledge and skills that will be of value in multiple disciplines? (For example: making inferences is a skill that can be used in many subjects.)
Example:
#1: Making inferences. Students can make inferences in science, reading, consumer economics and more. It applies to many disciplines.
3. Readiness for the next level of learning: Will this provide students with essential knowledge and skills that are necessary for success in the next grade or the next level of instruction?
Examples:
#1: Absolute value (math). If you believe not understanding absolute value in 6th grade will prevent success in 7th grade math...it is essential.
#2: Foods I to Foods II. If there are skills that students MUST master in Foods I for success in Foods II, those are your essential skills.
Ainsworth, L. (2003)
Typically, this is what you have an opportunity to teach again and again in an instructional interval.
We can look to PARCC for guidance...
Evidence Tables: Provide the assessment boundaries/limits for all content standards that will be assessed on the PBA and EOY assessments for Grades 3-11.
Where: The PARCC Assessment > Assessment System > Blueprints and Test Specs > scroll down to the PDFs and choose a specific level
http://www.parcconline.org/assessment-blueprints-test-specs

High Level Blueprints: List the number of items and points for each type of task that will appear on the PBA and EOY assessments for Grades 3-11.
Where: The PARCC Assessment > Assessment System > Blueprints and Test Specs > scroll down to the PDFs and choose a specific level
http://www.parcconline.org/assessment-blueprints-test-specs

Estimated Time on Task: Lists the amount of time students will have to complete the PBA and EOY assessments for Grades 3-11.
Where: The PARCC Assessment > Policies and Guidelines > Administering the Test > Unit Testing Times
http://parcconline.org/update-session-times

Performance Level Descriptors (PLDs): Describe the skills, knowledge and practices a student who has achieved a particular performance level should be able to demonstrate.
Where: The PARCC Assessment > Assessment System > Mathematics PLDs or Literacy PLDs > scroll down to the specific grade level
Math PLDs: http://www.parcconline.org/math-plds
ELA/Literacy PLDs: http://www.parcconline.org/ela-plds

Online Practice Tests: Access to sample items, test items, and a tutorial with TestNav8, developed so that all students can experience PARCC test items in the correct environment.
Where: The PARCC Assessment > PARCC practice tests (middle of 1st paragraph) > View Test Preparation > Practice Tests > choose Math or ELA > choose grade level
Math Practice Tests: http://parcc.pearson.com/practice-tests/math/
ELA/Literacy Practice Tests: http://parcc.pearson.com/practice-tests/english/
Assessment System
Class Skills: 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices). These run through the whole course and are measured for growth over the course of many units.
Content Skills: different in every unit (ex: Big Idea/Unit A covers Content Skills #1, #3, #4; Big Idea/Unit B covers Content Skills #2, #3).
The class skills are measured with a Baseline Pretest (all skills) at the start, Formative Assessments at pivot points throughout the units, and a Summative Posttest (all skills) at the end.
Notice some skills are not assessed until late in the year if they are not covered until late in the year...
Could there be some content memorization that is foundational and essential? Ex: memorizing notes for a music theory class...
The Balanced Assessment
No assessment should be 100% recall.
A balanced assessment has an appropriate number of questions at each level of cognitive demand. It is reflective of the range of cognitive demand required in the class it measures.
Discussion: Takeaways
● What skills should be measured? How often?
● How should teachers prioritize?
● Do we have longitudinal ways to measure
learning on comparable assessments?
● Is the data easy to interpret?
Creating Quality Assessments
The 3 Elements: WHAT (Content/Skills), HOW (Form), DIFFICULTY (Level of Complexity)
Be Intentional about Question Form
Comparable assessments in a set are of the same format.
Selected Response Assessments: Ask students to select the correct answer from a provided set of answers.
Constructed Response Assessments: Ask students to construct their own answer to a question.
Performance Assessments: Ask students to demonstrate understanding by performing or creating a product.
Teacher Time is at a Premium! Invest in question forms that will:
● Tell teachers about student thinking
● Tell teachers what to do next
Consider Your Time Investment: Will a constructed response version of this question tell me more about student thinking than a multiple choice version?
Reliability: Consistent, Repeatable Results
Inter-Rater Reliability:
● Same/repeatable results when 2+ people are using 1 rubric
● Practice as a team, share samples, keep on file
Intra-Rater Reliability:
● Repeatable results when 1 person is using 1 rubric
● Photograph/record samples, keep on file
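One simple way to quantify inter-rater reliability is percent agreement: the share of student artifacts on which two raters assign the same rubric level. The sketch below is my illustration with hypothetical scores, not a tool referenced in the deck; more formal statistics (such as Cohen's kappa) also exist.

```python
# Minimal sketch (illustration only): percent agreement between two
# raters who scored the same artifacts with the same rubric.

def percent_agreement(rater_a: list[int], rater_b: list[int]) -> float:
    """Share of artifacts on which both raters gave the same rubric level."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same set of artifacts.")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical: two teachers rate ten essays on a 1-4 rubric.
teacher_1 = [3, 2, 4, 1, 3, 3, 2, 4, 2, 3]
teacher_2 = [3, 2, 3, 1, 3, 4, 2, 4, 2, 3]
print(f"Agreement: {percent_agreement(teacher_1, teacher_2):.0%}")  # Agreement: 80%
```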
Considerations in Selecting Question Form
Selected Response Assessments:
● Take more time to write
● Take less time to grade
● Write good distractors: wrong answers should point out misconceptions
● May not be appropriate for all ages or subjects
● Assess what you intend to assess (validity)
● Be consistent in administration technique (directions, etc.)
Writing Wrong Answers
The Correct Answer: This is the clear and undisputed correct answer to the question.
Incorrect Choices:
#1 The Unsupported Statement: This statement could possibly be true but is not supported by the text, evidence, or data provided. Often this choice sounds “smart” and includes big words or appeals to a reader’s biases. It might appear to be universally true, but is not supported by the evidence provided.
#2 The Distortion of the Truth: This choice represents a conclusion that distorts what is provided in the text, evidence or data. It is not supported by the passage or the data provided. It might disagree with the meaning of the text, evidence or data, or include a conclusion beyond what the provided information supports. It might take words/phrases out of context.
#3 The Skip & Switch: This is the choice reached if a step in thinking is skipped or altered. This is a common variation of wrong answers in math problems. This choice might give a correct answer to another question, or be a choice reached by solving the problem with an incorrect method. Often these are the choices that teachers will catch if students scan and hunt for answers or skip steps in the problem.
#4 The Extremist: These choices contain exaggerations of the truth and use extreme words like “everyone” or “all the time” or “never” when not supported by the text, evidence or data. For more discerning students, these are typically choices that can be easily eliminated with a more careful lens.
See More: http://kidsatthecore.com/writing-wrong-answers/
Considerations in Selecting Question Form
Constructed Response Assessments:
● Take more time to grade
● Require a specific rubric
● Require inter-rater and intra-rater reliability
● Write descriptive rubrics: less than full credit should point out misconceptions, next steps
● May not be appropriate for all ages or subjects
● Be consistent in administration technique (directions, etc.)
We will discuss in detail shortly...
Considerations in Selecting Question Form
Performance Assessments:
● Take more time to grade and administer
● Require a specific rubric
● Require inter-rater and intra-rater reliability
● Write descriptive rubrics: less than full credit should point out misconceptions, next steps
● May not be appropriate for all ages or subjects
● Be consistent in administration technique (directions, etc.)
Let’s discuss this in more depth now...
Understanding Rubrics
ROWS: Each of the criteria you plan to measure.
COLUMNS: Potential levels of performance ranging from “ideal” to not evident.
Rating Boxes: Detailed descriptions of performance at that level for that skill.
This is NOT an example of a descriptive rubric: “Some of the time” / “Most of the time” / “Fair” / “Poor”.
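To make the rows/columns/rating-boxes anatomy concrete, here is a minimal sketch of a descriptive rubric encoded as data. The criteria, levels, and wording are hypothetical illustrations, not from the deck; note that each rating box describes observable work rather than a quality label like “Fair” or “Poor”.

```python
# Minimal sketch (hypothetical criteria and wording): a descriptive
# rubric as data. Rows = criteria, columns = performance levels (4..1),
# rating boxes = observable descriptions, not labels like "Fair".

RUBRIC = {
    "Thesis": {
        4: "States a precise, arguable claim and previews the reasoning.",
        3: "States an arguable claim; the reasoning is only implied.",
        2: "States a topic, but the claim is not arguable.",
        1: "No identifiable claim is present.",
    },
    "Evidence": {
        4: "Cites specific evidence and explains how each piece supports the claim.",
        3: "Cites specific evidence with partial explanation.",
        2: "Cites evidence without explanation, or the evidence is off-topic.",
        1: "No evidence from the text is cited.",
    },
}

def total_score(ratings: dict[str, int]) -> int:
    """Sum one student's level ratings across all criteria (rows)."""
    return sum(ratings[criterion] for criterion in RUBRIC)

print(total_score({"Thesis": 3, "Evidence": 4}))  # 7
```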
Understanding Rubrics
Determining Appropriate Criteria (the ROWS):
● Areas key to student learning (essential skills)
● DOD: Definable, Observable & Distinct
● Not generally characteristics of the task itself, but rather characteristics of the enduring learning outcomes the assessment is designed to measure. These measurement criteria could then apply to several prompts, tasks or performances.
Definable: Each criterion has a clear, agreed-upon meaning that teacher and student understand.
Observable: Each criterion describes a quality in the performance that can be perceived (seen, heard).
Distinct: Each criterion identifies a separate aspect of the learning outcomes.
Effective rubrics do not list all possible criteria; they list the right criteria for the assessment’s purpose.
Understanding Rubrics
COLUMNS: Potential levels of performance ranging from “ideal” to not evident.
Performance Level Descriptions:
Hint: Think about what performance looks like at high quality, and work backwards.
Descriptive: Descriptive language that depicts what one would observe, NOT quality conclusions (ex: good vs poor).
Clear: Students and teachers both understand the meanings of the descriptions. Teachers use them in a repeatable fashion.
Levels (Whole Range & Distinguishable): Performance is described from one extreme to the other, and descriptions differ enough from one level to another that work can be categorized. (Ex: the number of levels may be multiple or only 2.)
Parallel Descriptions: Performance descriptors at each level of the continuum differ in the same aspects of the work and focus on incremental improvements in a standard or skill.
Understanding Rubrics: For Growth Data
Spotlight on Growth for Evaluations
COMPARABLE DATA:
● End-of-year expectations
● Use the rubric consistently
● High reliability
Gradebook Problem: How can I grade at week 2 with end-of-year expectations?
● Use a curve (“A” is not only for the highest rating)
● Allow revisions
The Evaluation Connection: Student Growth
Performance Assessments: Consider when collecting baseline data:
● Exasperation cut points
● Safety issues
● Advertisement for class
Understanding Rubrics
Rubrics as Tools for the Teacher:
● Define “quality” for a process, product or behavior
● Powerful for teaching and learning
○ Descriptions help communicate what it takes to succeed (this shouldn’t be a secret kept by the teachers!)
● Increase reliability of results
○ Increase consistency in grading between students
○ Increase consistency in grading between measurement points
Rubrics as Tools for the Student:
● Define “quality” for a process, product or behavior
● Encourage more thoughtful judgement of their own (and others’) work
● Increase the ability to “self-assess”
● Increase student responsibility
Discussion: Takeaways
● What are considerations for selected response?
● What are considerations for performance or
constructed response?
● Is the data easy to interpret?
Creating Quality Assessments
The 3 Elements: WHAT (Content/Skills), HOW (Form), DIFFICULTY (Level of Complexity)
Determine the Spectrum of Complexity
Acknowledging the various levels of cognitive demand within each standard will help teachers write questions at a consistent cognitive level across the assessment sets, thus allowing the sets to mirror in complexity.
Be Intentional about Question Complexity
Mirrored questions: same range of cognitive demand.
Blooms Taxonomy
(Revised)
Marzano's
Taxonomy
Webb's Depth of
Knowledge
Remembering Level 1: Retrieval Recall and reproduction
(DOK1)
Understanding Level 2: Comprehension Skills and concepts (DOK2)
Applying Level 3: Analysis Strategic thinking/complex
reasoning (DOK3)
Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning
(DOK4)
Evaluating Level 5: Metacognition
Creating Level 6: Self-System Thinking
Be Intentional about Question
Complexity
Mirrored questions - same range of cognitive demand.
Blooms Taxonomy
(Revised)
Marzano's
Taxonomy
Webb's Depth of
Knowledge
Remembering Level 1: Retrieval Recall and reproduction
(DOK1)
Understanding Level 2: Comprehension Skills and concepts (DOK2)
Applying Level 3: Analysis Strategic thinking/complex
reasoning (DOK3)
Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning
(DOK4)
Evaluating Level 5: Metacognition
Creating Level 6: Self-System Thinking
These are the EASIEST
questions.
* Recall this fact.
* Answer a question about a
passage that’s “right there;” the
student could underline the
answer.
Be Intentional about Question
Complexity
Mirrored questions - same range of cognitive demand.
Blooms Taxonomy
(Revised)
Marzano's
Taxonomy
Webb's Depth of
Knowledge
Remembering Level 1: Retrieval Recall and reproduction
(DOK1)
Understanding Level 2: Comprehension Skills and concepts (DOK2)
Applying Level 3: Analysis Strategic thinking/complex
reasoning (DOK3)
Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning
(DOK4)
Evaluating Level 5: Metacognition
Creating Level 6: Self-System Thinking
These are a bit HARDER...
* Recall a fact and apply it to a
new situation.
* Answer a question about a
passage that requires
inferencing; the student must
“read between the lines”
Be Intentional about Question
Complexity
Mirrored questions - same range of cognitive demand.
Blooms Taxonomy
(Revised)
Marzano's
Taxonomy
Webb's Depth of
Knowledge
Remembering Level 1: Retrieval Recall and reproduction
(DOK1)
Understanding Level 2: Comprehension Skills and concepts (DOK2)
Applying Level 3: Analysis Strategic thinking/complex
reasoning (DOK3)
Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning
(DOK4)
Evaluating Level 5: Metacognition
Creating Level 6: Self-System Thinking
These are the HARDEST
questions.
* Recall facts and use them to
evaluate a new situation or
create a new product.
* Evaluate between multiple positions, opinions, or ideas in a passage.
Be Intentional about Question
Complexity
Standard Basic: Remember & Understand Standard: Apply & Analyze Expanded: Evaluate & Create
RL 4.1
&
RI 4.1
Identify explicit information:
What does the author mean by:
“quote”...?
What is the purpose of this...?
Analyze explicit information; making
inferences:
What inferences can you make
about...?
Which of these examples tells us
why...?
Evaluate explicit information and
inferences:
(Defend a position) Why do you believe...?
Is there a better solution to the
character’s problem...?
RL 4.2
&
RI 4.2
Identify Theme/Idea:
What is the theme of this story
(text)...?
What is the message of this text
(poem, story, etc.)...?
Analyze Theme/Idea:
How do the character's actions support the theme...?
What are the most important events in
the story...? (RL)
Which of these is a good summary
sentence...?
Evaluate Theme/Idea:
Which of these details does not
support the main idea
(message)...?
RL 4.3
&
RI 4.3
Identify elements
What is the setting of the story...?
Which describes character x...?
Which describes the setting (time,
place, social environment)...?
Which of these details (quotes)
describes character x...? (RL)
Analyze Elements
Why might ____ have happened...?
(RL)
What is the first (second) step in the
procedure...? (RI)
What was the effect of _____’s idea...?
(RI)
Evaluate Elements
Did the environment affect the outcome of the story...?
Create a scenario: How would you imagine the events from the text affecting you today...?
Teachers can struggle with targeted questions.
The best tool has been question stems:
KidsAtTheCore.com/question-stems/
Be Intentional about Question
Complexity
Keep the number of questions in each level of complexity
consistent!
Commit to a number of hard questions…
Commit to a number of easy questions….
And stick with that pattern on all assessments so they
are comparable in difficulty!
Key: This is how you get
COMPARABLE results!
Difficulty level remains CONSTANT
Be Intentional about Question
Complexity
Keep the number of questions in each level of complexity
consistent!
For Example, in Assessments A, B and C:
Key Standard/Objective | Basic (Remember & Understand) | Standard (Apply & Analyze) | Expanded (Evaluate & Create)
Key Ideas and Details | 2 questions | 5 questions | 2 questions
Craft and Structure | 2 questions | 4 questions | 3 questions
Integration of Ideas | 2 questions | 3 questions | 1 question
Assessment Total | 6/24 questions = 25% of test | 12/24 questions = 50% of test | 6/24 questions = 25% of test
Is 25%, 50%, 25% a magic formula?
NO!
Think about your class/course and what would make
an appropriate balanced assessment!
Example: Honors class? Remedial Class?
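One way to see the "stick with that pattern" idea concretely: a minimal sketch, assuming each assessment's blueprint is recorded as a simple mapping from complexity level to question count. The function name, counts, and the 25/50/25 split are illustrative, borrowed from the example above.
```python
# Minimal sketch: check that every assessment in a set follows the same
# blueprint pattern, so results stay comparable. Names/counts hypothetical.

def is_mirrored(blueprints):
    """True when every blueprint matches the first one's distribution."""
    first = blueprints[0]
    return all(bp == first for bp in blueprints[1:])

assessment_a = {"basic": 6, "standard": 12, "expanded": 6}
assessment_b = {"basic": 6, "standard": 12, "expanded": 6}
assessment_c = {"basic": 6, "standard": 11, "expanded": 7}  # drifted!

print(is_mirrored([assessment_a, assessment_b]))                # True
print(is_mirrored([assessment_a, assessment_b, assessment_c]))  # False
```
If the distribution drifts between assessments, the difficulty is no longer constant and score changes stop being comparable as growth.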
Misconception: The test gets harder as the student learns more material…
In Reality: All assessments stay at the same level - the test does not change difficulty:
More questions correct = Growth and Learning
Measure Growth:
All At The Same Level of Difficulty
The Evaluation Connection: Student Growth
OH NO!
That means students will do terribly on my first assessment!
It Is Important How the Teacher Frames It...
For Example, Say:
This is how I am going to find out what you know.
By the time we are done, you are going to “own” this!
That is the nature of a baseline assessment!
You are testing students on material you have
not taught yet!
Comparable assessments in a set are of the same format
Selected Response Assessments: Ask students to
select the correct answer from a provided set of answers.
Constructed Response Assessments: Ask students to
construct their own answer to a question.
Performance Assessments: Ask students to demonstrate
understanding by performing or creating a product.
Be Intentional about Question
Complexity & Form
Teacher Time is at a Premium!
Invest in question forms that will:
● Tell teachers about student thinking
● Tell teachers what to do next
Consider Your Time Investment: Will a constructed
response version of this question tell me more about
student thinking than a multiple choice version?
EXAMPLE:
Basic: Remember/Understand =
Multiple Choice
Expanded: Evaluate/Create =
Constructed Response
Aligning Questions to Assessment Form
Be Intentional about Question
Complexity & Form
Target | Selected Response | Constructed Response | Performance Task | Personal Communication/Discussion
Basic (Knowledge/Remember/Understand) | Good Match | Good Match | Not Good Match | Partial Match
Standard (Reasoning/Apply/Analyze) | Partial Match | Good Match | Good Match | Good Match
Expanded (Evaluate) | Partial Match | Good Match | Good Match | Partial Match
Create Product | Not Good Match | Partial Match | Good Match | Not Good Match
Adapted from Classroom Assessment for Student Learning (Stiggins, Arter, Chappius & Chappius, 2006)
Discussion: Takeaways
● What are considerations for ranges of
complexity?
● Should some grades/subjects have different
ranges of complexity?
● Are we gathering useful data?
● Is the data easy to interpret?
(what does it mean to get hard/easy
questions right/wrong?)
TASK:
Write 3 questions targeting a specific standard
and a specific level of cognitive demand
CHOICE:
1) Grade 4 ELA
2) Grade 11 Social Studies & ELA
3) Grade 3 Math
4) HS Mathematics/Science
Quick Activity about Question
Complexity & Form
Standard | Basic Question | Standard Question | Expanded Question
Spotlight on
Evaluation.
Quality
Assessments for
GROWTH DATA
The Evaluation Connection: Student Growth
Remember from earlier….
So…
HOW can we gather
information/data/artifacts
to prove growth is happening?
Measure Change
Measure Student Learning
Measure Student Growth
Important Term:
Mirrored Assessment Set:
A series of comparable assessments that can measure
learning over 2 or more points in time. They are designed
with the same form, content, and level of complexity.
Growth: Measurable change between two points in time
Attainment: Meeting an outcome or target. (ex: Mastery of a skill)
Mirrored Assessments: Assessments that can be compared for student growth. They
are designed with the same form, content, and level of complexity.
Assessment Set: A series of mirrored assessments designed to
measure student growth on a specific set of learning targets/content--COMPARABLE
Essential Skills: Key skills that are a requirement for success at the next level or for the
scaffolding of skills that are going to be taught.
Spectrum of Assessment Complexity: The range of cognitive levels within a skill or
question.
Pivot Points: Places in the teacher’s lesson where student growth data will determine the
next teaching steps
SLO: Student Learning Objectives. The Illinois recommended model framework for including
growth on teacher evaluations.
Key Vocabulary & Terms
Creating Assessments for Growth
Keep the 3 Elements Consistent
Mirroring Element Variables
to Consider
WHAT:
Content/Skills
● Skills
● Standards
● Power Standards
HOW:
Form
● Selected Response (ex: multiple choice)
● Constructed Response
● Performance Based
Difficulty:
Level of
Complexity
● Level of cognitive demand
● Question difficulty
● Difficulty of Reading Passage
● Difficulty of Prompt
The Blueprint.
Your key to
quality design
and alignment
Assessment System
Class Skills
1, 2, 3, 4, 5, 6, 7…
(ex: NGSS Scientific
Practices)
Baseline
Pretest:
All Skills
Summative
Posttest: All
Skills
Formative Assessments: Pivot Points
Example 1:
Assessment A, B and C should mirror each other in terms of: Difficulty, Standards assessed,
Question stems, etc.
Assessment A, Assessment B, and Assessment C (with interim assessments between them) each contain an Informational Passage and a Literature Passage, with:
3 Key Idea Questions per passage
3 Craft and Structure Questions per passage
3 Integration of Ideas Questions per passage
18 Questions Total
Let's Make a Blueprint:
Teachers can design their own, or pick from available options
Assessment Blueprint Development Protocol
Step One: Identify what essential skills & knowledge you will assess.
(Circle, Triangle, Square Analogy)
Step Two: Select the form(s) for your assessment.
Step Three: Determine the number of items at each level of cognitive demand
Essential Skill & Knowledge | Basic Level 1 (Remember & Understand) | Standard Level 2 (Apply & Analyze) | Expanded Level 3 (Evaluate & Create)
Analyze the development of a central theme | 1 MC | 2 MC | 1 MC
Try writing the question right
into the graphic organizer
EXAMPLE Blueprint Set:
EXCERPT Questions for Assessment #1
Essential Skill &
Knowledge
Basic Level 1
(Remember &
Understand)
Standard Level 2
(Apply & Analyze)
Expanded Level 3
(Evaluate &
Create)
Reading Literature-Forgetting the Words (Lexile 780)
http://www.readworks.org/passages/forgetting-words
CC.3.R.L.1-Ask and answer
questions to demonstrate
understanding of a text,
referring explicitly to the text as
the basis for the answers.
How is Andy involved in the
school play?
a. Andy is watching his friends
act in the play.
b. Andy is starring in the
school play.
c. Andy is writing the play.
d. Andy is directing the play.
The main idea of this story is
that
a. Andy is not able to perform
in the play because he is so
nervous
b. Andy is able to get over his
nerves and feel confident with
encouragement
c. Andy says the wrong thing,
his mother sees, and the whole
play is ruined
d. People sometimes do not
know what to do when on stage
and don’t say anything
In the passage, the author says that Andy is afraid to make a mistake in the play when he can't remember what to say. What evidence best shows Andy is no longer frightened by being on stage?
a. “Andy is worried about
letting her down”
b. “He has been looking
forward to this for weeks.”
c. “He still can’t remember his
line, but it doesn’t matter.”
d. “Andy loves pretending to be
a pirate.”
This is only a one-row excerpt...
EXAMPLE Blueprint Set:
EXCERPT Questions for Assessment #2
Essential Skill &
Knowledge
Basic Level 1
(Remember &
Understand)
Standard Level 2
(Apply & Analyze)
Expanded Level 3
(Evaluate &
Create)
Reading Literature-Lessons from Fishing (Lexile 780)
CC.3.R.L.1-Ask and answer
questions to demonstrate
understanding of a text,
referring explicitly to the text as
the basis for the answers.
Why does Martin jump into the
water?
a. Martin wants to touch a fish.
b. Martin wants to see how fast
a fish can swim.
c. The fish escaped with the
stringer.
d. Martin can’t swim.
What is the main theme of the
story?
a. Learning how to fish is a
good way to learn how to swim.
b. Fishing makes you strong if
you hold onto the pole.
c. Fishing is a good family
activity.
d. Fishing is like life, with
some days that are a success and
other days that are not.
In the passage, the author says
that Morgan “goes fishing all
the time” and that he “has gotten
even better at it than his father
and his grandfather.”
Based on this evidence, what
can be concluded about the sport
of fishing?
a. Fishing can be learned in
less than a week.
b. Being good at fishing takes a
lot of practice.
c. Only teenagers are good at
fishing.
d. Fishing is best taught by
family members.
EXAMPLE Blueprint FORMAT:
Writing Assessment Sets:
Discussion:
Start thinking about your blueprint
Use the 3 elements of Quality Assessment Design:
● Should some grades/courses/levels have very different blueprints? Why?
● Will this blueprint help us gather useful data?
● Is the data easy to interpret?
COMMON
ROADBLOCKS
Where the rubber
meets the road...
Road Block #1:
What Do I Want to Assess?
Considerations:
● Look For Sustained Growth: What do you have a chance to teach again and again? The threads that run through a class/course.
● Look for Endurance, Leverage & Readiness for the Next Level
Road Block #2:
OK. So how many skills IS THAT?
There is no "magic" number.
Considerations:
● Often teams choose a number between 5 and 20
● How long of a test is appropriate? (each essential skill should be assessed more than once)
● Look at PARCC framework/ISBE Livebinders
Road Block #3:
How Many Questions on a Test?
There is no "magic" number.
Considerations:
● Cognitive Demand: Represent more than one level of complexity for each standard/skill.
● Complexity of Standard: Some standards have more intricate layers than others--requiring more questions.
Questions Per Skill?
1. Use 3 questions per skill
to triangulate your data
2. Use 4 questions per skill
to triangulate and be able
to throw out one data point
3. Use Marzano’s 6 levels
within a skill, and
represent each...
Road Block #4:
How Should I Give Points?
Depends on your questions & students:
Be intentional.
Considerations:
● Encourage guessing?
● Participation points for effort? (not affecting test score)
● Do different questions of different levels of complexity have different point values?
Example:
1) Basic, multiple choice: 1 point
2) Standard (mid-level) multiple
choice: 1 point
3) Expanded (high level) free
response…?
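To see how intentional point values might play out, here is a minimal sketch; the per-level weights and the student results are hypothetical, and the expanded free-response weight is deliberately the open choice the example leaves to you.
```python
# Minimal sketch: total a test score with per-complexity point values.
# Weights and results below are hypothetical.

POINTS = {"basic_mc": 1, "standard_mc": 1, "expanded_fr": 3}

# (question level, credit earned: 1/0 for MC, a rubric fraction for FR)
results = [("basic_mc", 1), ("standard_mc", 1),
           ("standard_mc", 0), ("expanded_fr", 0.5)]

score = sum(POINTS[level] * credit for level, credit in results)
print(score)  # 1 + 1 + 0 + 1.5 = 3.5
```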
Road Block #5:
How Do I Get Started?
Ideas:
● Build from a Blueprint
● Borrow, Steal: Don't reinvent the wheel. Look for question starters, stems or tools you can use to get started.
● Just Dig In: Question writing is slow at first, but will pick up with practice.
Discussion:
What roadblocks will our staff face?
As leaders in your district…
● What roadblocks are ahead of us?
● What can we do to address these roadblocks?
● What are other districts doing when faced with similar roadblocks?
The Age of Assessment
REMEMBER: Is the data you get from
administering that assessment more valuable
than the instructional time lost to administer the
assessment?
USE ASSESSMENTS THAT GIVE YOU
GOOD INFORMATION ABOUT YOUR
STUDENTS!
The Age of Assessment
Is This A Good Question?
● Will it give me good information about student thinking?
● Will it help me better understand student learning needs?
● Will it help me help my students reach our goals?
Let's Look at
Some Examples...
Be Intentional about Assessments
Imagine you are advising this
teacher.
● What questions?
● What suggestions?
Be Intentional about Assessments
Quality Assessments
Example Assessment Sets
Assessment #1 Excerpt
Questions:
Word Meaning from Context:
1. What does Christine like to do?
a. Read
b. Play tennis
c. Listen to rock music
d. Visit friends
2. Who likes to listen to music?
a. Monika
b. Petra and Monika
c. Christine and Monika
d. Christine and Petra
3. Who likes to Play chess?
a. Monika
b. Petra
c. Christine
d. Christine and Petra
Drawing a Conclusion:
4. Who is the most athletic?
a. Christine
b. Monika
c. Petra
d. Petra and Christine
5. The reading mainly focuses on the girls'...
a. Ages
b. Names
c. Interests
d. Friends
Quality Assessments
Example Assessment Sets
Questions:
Word Meaning from Context:
1. Sara and her friends play cards...
a. on Mondays
b. after school
c. rarely
d. on the weekends
2. What instrument does Paul play?
a. No instrument
b. Guitar
c. Piano
d. Piano and guitar
Drawing a Conclusion:
3. The people in the conversation are most likely
a. teachers
b. students
c. relatives
d. workers
4. The conversation probably takes place…
a. in a cafe
b. in a school
c. in a home
d. in a store
5. The students go because...
a. they need to go home
b. they have to get to class
c. they have to get to soccer practice
d. they have to get to the restaurant
Assessment # 2 Excerpt
Quality Assessments
Could the 2 German Assessments be quality, mirrored tools?
Quality Assessments
Content: Essential Skills/Knowledge
Could the 2 Science Assessments be quality, mirrored tools?
Quality Assessments
Example Assessment Sets
Assessment
#1 & #2
Rubric
Assessment #1 Excerpt
Assessment #2 Excerpt
Could the 2 English 9 Assessments be quality, mirrored tools?
Quality Assessments
Content: Essential Skills/Knowledge
Assessment #1 Excerpt
Assessment # 2 Excerpt
Could the 2 Choral Assessments be quality, mirrored tools?
Quality Assessments
Example Assessment Sets
Could the Life Skills Assessment(s) be quality, mirrored tools?
What makes a good assessment?
● Aligned to standards
● Intentional range of cognitive demand
● Intentional design (blueprint)
● Rubric for all non-selected response questions
● Comparable for measuring growth
● Designed to produce specific data types
● Valid: Measures what you intend to measure
● Reliable: Produces repeatable results
● Historical data
Does the assessment allow high
and low-achieving students to
adequately demonstrate their
knowledge?
In other words, does the
assessment have something all
students can “Sink Their Teeth
Into?”
“...Validity is concerned with the confidence
with which we may draw inferences about
student learning from an assessment.
Furthermore, validity is not an either/or
proposition, instead it is a matter of degree,”
(Gareis & Grant, 2008 p.35)
Thus, increasing validity and reliability will
be an ongoing district process.
Example Checklist
(Source: Ohio Department of Education)
Example Checklist
(Source: New Jersey Department of Education)
Example Checklist
(Source: Kids At The Core)
Upcoming Event: June 2014
K@C Growth Assessment Online Conference
✓ 100% Online
● Zero travel cost, no hotels, no gas
● Attend from school or from home
✓ Learn from educators just like you
● Fast paced presentations full of real experiences and lessons learned in assessment writing
● Online Chatting and networking
✓ Flexible
● Enjoy it live or watch it later at your own pace using the K@C Special Access Pass
Discussion:
How do we examine assessment tools?
In your district…
● How are assessments examined for their quality?
● What could we do to improve our assessment quality?
● What are other districts doing when faced with similar roadblocks?
✓ Design a Blueprint
1. Consider course skills to measure
2. Consider format to best measure those skills
3. Consider complexity of execution of those skills
✓ Repeat the Blueprint Pattern
1. Repeated results will inform pivot points
✓ Consider Assessment Validity
✓ Consider Assessment Reliability
HOW CAN WE DO IT?
Build High Quality Assessment Sets
Remember
this?
The right blueprint design
gives data that can help us
pivot.
Activity:
Let's develop an assessment blueprint.
Step One: Identify what essential skills & knowledge you will assess.
(Circle, Triangle, Square Analogy)
Step Two: Select the form(s) for your assessment.
Step Three: Determine the number of items at each level of cognitive demand
Essential Skill & Knowledge | Basic (Remember & Understand) | Standard (Apply & Analyze) | Expanded (Evaluate & Create)
Analyze the development of a central theme | 1 MC | 2 MC | 1 MC
www.KidsAtTheCore/Downloads/AssessmentBlueprint.docx
Contact
Anne Weerda
info@KidsAtTheCore.com
www.KidsAtTheCore.com
What To Do with Data?
Stoplight Highlight:
One of the most common ways to translate numbers to a proficiency range
Red: Below Expectation - Students scoring below end-of-year expectations. (Ex: 20+ points below, though this depends on the assessment itself.)
Yellow: Close to Expectation - Students close to end-of-year expectations. (Ex: less than 20 points below, though this depends on the assessment itself.)
Green: At Expectation - Students at end-of-year expectations. (Ex: +/- 5 points, though this depends on the assessment itself.)
Blue: Above Expectation - Students above end-of-year expectations. (Ex: over 5 points above expectation, though this depends on the assessment itself.)
What To Do with Data?
Student | Score #1 | Score #2
A | 6% | 33%
B | 34% | 55%
C | 33% | 59%
D | 56% | 87%
E | 58% | 96%
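Reading growth out of a mirrored pre/post table like the one above is a simple difference; here is a minimal sketch using the percentages shown (whether each gain is "enough" is the historical-data question discussed next).
```python
# Minimal sketch: growth = post score minus pre score, per student,
# using the pre/post percentages from the table above.

scores = {"A": (6, 33), "B": (34, 55), "C": (33, 59),
          "D": (56, 87), "E": (58, 96)}

for student, (pre, post) in scores.items():
    print(f"Student {student}: {pre}% -> {post}% (growth: +{post - pre} points)")
```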
Interpreting Scores: Student Growth
Using Historical Data: You can determine expected
score based on previous “typical” growth.
This gives meaning to the numbers.
Are these the scores I want?
Better? Worse?
Interpreting Scores: Student Growth
After year 1: You have Pre and Post scores for multiple students...
Pre and Post Test: Multiple Students Graphed
Fast Forward to the Fall of your 2nd or 3rd year...
Interpreting Scores: Student Growth
Do all pivot points have to come from FORMAL
assessments?
NO!
Pivot Points: Informal Data
Collection
Riroe assessment workshop part #1
Riroe assessment workshop part #1

Weitere ähnliche Inhalte

Was ist angesagt?

Assessment for learning and development
Assessment for learning and developmentAssessment for learning and development
Assessment for learning and developmentMikeHayler
 
Writing Effective Learning Outcomes
Writing Effective Learning OutcomesWriting Effective Learning Outcomes
Writing Effective Learning OutcomesJoe McVeigh
 
Coaching as part of teacher evaluation process v2.0
Coaching as part of teacher evaluation process v2.0Coaching as part of teacher evaluation process v2.0
Coaching as part of teacher evaluation process v2.0Richard Voltz
 
Formative Assessment
Formative AssessmentFormative Assessment
Formative AssessmentIan Stone
 
Ways of getting young learners to assess themselves through ‘portfolio assess...
Ways of getting young learners to assess themselves through ‘portfolio assess...Ways of getting young learners to assess themselves through ‘portfolio assess...
Ways of getting young learners to assess themselves through ‘portfolio assess...Macmillan Education
 
Brightspace Rubrics: Everything you Always Wanted to Know - April 2019
Brightspace Rubrics: Everything you Always Wanted to Know - April 2019Brightspace Rubrics: Everything you Always Wanted to Know - April 2019
Brightspace Rubrics: Everything you Always Wanted to Know - April 2019D2L Barry
 
Progress Reports
Progress ReportsProgress Reports
Progress Reportsmsilberberg
 
Push back Sisyphus! Connecting feedback to learning
Push back Sisyphus! Connecting feedback to learningPush back Sisyphus! Connecting feedback to learning
Push back Sisyphus! Connecting feedback to learningTansy Jessop
 
Updated online assessment presentation
Updated online assessment presentationUpdated online assessment presentation
Updated online assessment presentationmargaretntrent
 
Online Assessment Presentation
Online Assessment PresentationOnline Assessment Presentation
Online Assessment Presentationmargaretntrent
 
An Overview of Assessment Design
An Overview of Assessment DesignAn Overview of Assessment Design
An Overview of Assessment DesignPeter Gow
 
Professor Graham Gibbs Presentation
Professor Graham Gibbs PresentationProfessor Graham Gibbs Presentation
Professor Graham Gibbs PresentationChris McEwan
 
Assessment instruments of and for learning
Assessment instruments of and for learningAssessment instruments of and for learning
Assessment instruments of and for learningMontse Irun _Chavarria
 

Was ist angesagt? (20)

Formative Assessment
Formative AssessmentFormative Assessment
Formative Assessment
 
Designs 2010 Session 3 Elementary
Designs 2010 Session 3 ElementaryDesigns 2010 Session 3 Elementary
Designs 2010 Session 3 Elementary
 
Assessment for learning and development
Assessment for learning and developmentAssessment for learning and development
Assessment for learning and development
 
Writing Effective Learning Outcomes
Writing Effective Learning OutcomesWriting Effective Learning Outcomes
Writing Effective Learning Outcomes
 
Planning for assessment
Planning for assessmentPlanning for assessment
Planning for assessment
 
Coaching as part of teacher evaluation process v2.0
Coaching as part of teacher evaluation process v2.0Coaching as part of teacher evaluation process v2.0
Coaching as part of teacher evaluation process v2.0
 
Why assessment?
Why assessment?Why assessment?
Why assessment?
 
Formative Assessment
Formative AssessmentFormative Assessment
Formative Assessment
 
What are SLO's
What are SLO'sWhat are SLO's
What are SLO's
 
Ways of getting young learners to assess themselves through ‘portfolio assess...
Ways of getting young learners to assess themselves through ‘portfolio assess...Ways of getting young learners to assess themselves through ‘portfolio assess...
Ways of getting young learners to assess themselves through ‘portfolio assess...
 
Brightspace Rubrics: Everything you Always Wanted to Know - April 2019
Brightspace Rubrics: Everything you Always Wanted to Know - April 2019Brightspace Rubrics: Everything you Always Wanted to Know - April 2019
Brightspace Rubrics: Everything you Always Wanted to Know - April 2019
 
Assessment 101
Assessment 101Assessment 101
Assessment 101
 
Progress Reports
Progress ReportsProgress Reports
Progress Reports
 
Push back Sisyphus! Connecting feedback to learning
Push back Sisyphus! Connecting feedback to learningPush back Sisyphus! Connecting feedback to learning
Push back Sisyphus! Connecting feedback to learning
 
Updated online assessment presentation
Updated online assessment presentationUpdated online assessment presentation
Updated online assessment presentation
 
Online Assessment Presentation
Online Assessment PresentationOnline Assessment Presentation
Online Assessment Presentation
 
An Overview of Assessment Design
An Overview of Assessment DesignAn Overview of Assessment Design
An Overview of Assessment Design
 
Professor Graham Gibbs Presentation
Professor Graham Gibbs PresentationProfessor Graham Gibbs Presentation
Professor Graham Gibbs Presentation
 
Assessment instruments of and for learning
Assessment instruments of and for learningAssessment instruments of and for learning
Assessment instruments of and for learning
 
Math Summit
Math SummitMath Summit
Math Summit
 

Andere mochten auch

Domestic engineer perfomance appraisal 2
Domestic engineer perfomance appraisal 2Domestic engineer perfomance appraisal 2
Domestic engineer perfomance appraisal 2tonychoper2804
 
Towards understanding genetic basis of chapatti (Indian flat bread) making qu...
Towards understanding genetic basis of chapatti (Indian flat bread) making qu...Towards understanding genetic basis of chapatti (Indian flat bread) making qu...
Towards understanding genetic basis of chapatti (Indian flat bread) making qu...CIMMYT
 
Proyecto medio ambiente 37067
Proyecto medio ambiente  37067Proyecto medio ambiente  37067
Proyecto medio ambiente 37067cpekathe
 
Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...
Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...
Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...Jiri Pavlik
 
Domotica per il risparmio energetico
Domotica per il risparmio energeticoDomotica per il risparmio energetico
Domotica per il risparmio energeticoGiuseppe Salinaro
 
Volantino bruno3 a
Volantino bruno3 aVolantino bruno3 a
Volantino bruno3 aAndreaStrati
 
#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICE
#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICE#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICE
#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICEJordi Munell
 
JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...
JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...
JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...chennaijp
 
How to integrate the results of the Country Reports in the Semantic Wiki
How to integrate the results of the Country Reports in the Semantic WikiHow to integrate the results of the Country Reports in the Semantic Wiki
How to integrate the results of the Country Reports in the Semantic WikiAtrium Forest
 
Business Angel - Just Do It 24 gennaio 2015
Business Angel - Just Do It 24 gennaio 2015Business Angel - Just Do It 24 gennaio 2015
Business Angel - Just Do It 24 gennaio 2015The Qube
 

Andere mochten auch (12)

Outbreak_Mapping_API
Outbreak_Mapping_APIOutbreak_Mapping_API
Outbreak_Mapping_API
 
Domestic engineer perfomance appraisal 2
Domestic engineer perfomance appraisal 2Domestic engineer perfomance appraisal 2
Domestic engineer perfomance appraisal 2
 
Towards understanding genetic basis of chapatti (Indian flat bread) making qu...
Towards understanding genetic basis of chapatti (Indian flat bread) making qu...Towards understanding genetic basis of chapatti (Indian flat bread) making qu...
Towards understanding genetic basis of chapatti (Indian flat bread) making qu...
 
Proyecto medio ambiente 37067
Proyecto medio ambiente  37067Proyecto medio ambiente  37067
Proyecto medio ambiente 37067
 
Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...
Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...
Přehled e-knih v čestině a ve slovenštině pro knihovny, inspirace k e-kn...
 
Domotica per il risparmio energetico
Domotica per il risparmio energeticoDomotica per il risparmio energetico
Domotica per il risparmio energetico
 
Volantino bruno3 a
Volantino bruno3 aVolantino bruno3 a
Volantino bruno3 a
 
#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICE
#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICE#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICE
#RIPOLL - MUSEU DE RIPOLL - CAMBRA ANOXIA - ANOXIC CHAMBER SERVICE
 
Quercus on gae公開版
Quercus on gae公開版Quercus on gae公開版
Quercus on gae公開版
 
JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...
JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...
JPJ1430 PROFILR: Toward Preserving Privacy and Functionality in Geosocial Net...
 
How to integrate the results of the Country Reports in the Semantic Wiki
How to integrate the results of the Country Reports in the Semantic WikiHow to integrate the results of the Country Reports in the Semantic Wiki
How to integrate the results of the Country Reports in the Semantic Wiki
 
Business Angel - Just Do It 24 gennaio 2015
Business Angel - Just Do It 24 gennaio 2015Business Angel - Just Do It 24 gennaio 2015
Business Angel - Just Do It 24 gennaio 2015
 

Ähnlich wie Riroe assessment workshop part #1

OFE-AssessmentPresentation.pptx
OFE-AssessmentPresentation.pptxOFE-AssessmentPresentation.pptx
OFE-AssessmentPresentation.pptxssuser7646ea
 
Assessment
AssessmentAssessment
Assessmentaboodcr
 
Developing Student Confidence
Developing Student ConfidenceDeveloping Student Confidence
Developing Student Confidencemmcdowell13
 
week 1-CLassroom Assessment Presentation (1).pptx
week 1-CLassroom Assessment Presentation (1).pptxweek 1-CLassroom Assessment Presentation (1).pptx
week 1-CLassroom Assessment Presentation (1).pptxjason322724
 
Workplace and apprenticeship workshop ppt
Workplace and apprenticeship workshop pptWorkplace and apprenticeship workshop ppt
Workplace and apprenticeship workshop pptCindy Smith
 
Aligning Assessments to Course Outcomes
Aligning Assessments to Course OutcomesAligning Assessments to Course Outcomes
Aligning Assessments to Course OutcomesChristine Salmon
 
Continuous assessment and test construction
Continuous assessment and test constructionContinuous assessment and test construction
Continuous assessment and test constructionMekuriaMekonnen
 
Online Assessment Presentation
Online Assessment PresentationOnline Assessment Presentation
Online Assessment Presentationmargaretntrent
 
K to 12 classroom assessment ppt
K to 12 classroom assessment pptK to 12 classroom assessment ppt
K to 12 classroom assessment pptCarlo Magno
 
Curriculum Leadership Course 2009 Overview
Curriculum Leadership Course 2009 OverviewCurriculum Leadership Course 2009 Overview
Curriculum Leadership Course 2009 Overviewdbrady3702
 
Innovation presentation
Innovation presentationInnovation presentation
Innovation presentationdionesioable
 
WL-InstituteDay-101113-formativeassessment.key
WL-InstituteDay-101113-formativeassessment.keyWL-InstituteDay-101113-formativeassessment.key
WL-InstituteDay-101113-formativeassessment.keyrmak
 
Our assessment journey teachers edition
Our assessment journey  teachers editionOur assessment journey  teachers edition
Our assessment journey teachers editionAndrea Hnatiuk
 
Our assessment journey teachers edition
Our assessment journey  teachers editionOur assessment journey  teachers edition
Our assessment journey teachers editionAndrea Hnatiuk
 
Group presentation
Group presentationGroup presentation
Group presentationTutso
 

Ähnlich wie Riroe assessment workshop part #1 (20)

Summative & formative assessment
Summative & formative assessmentSummative & formative assessment
Summative & formative assessment
 
OFE-AssessmentPresentation.pptx
OFE-AssessmentPresentation.pptxOFE-AssessmentPresentation.pptx
OFE-AssessmentPresentation.pptx
 
Assessment
AssessmentAssessment
Assessment
 
Assessments
AssessmentsAssessments
Assessments
 
Assessments
AssessmentsAssessments
Assessments
 
Developing Student Confidence
Developing Student ConfidenceDeveloping Student Confidence
Developing Student Confidence
 
week 1-CLassroom Assessment Presentation (1).pptx
week 1-CLassroom Assessment Presentation (1).pptxweek 1-CLassroom Assessment Presentation (1).pptx
week 1-CLassroom Assessment Presentation (1).pptx
 
Workplace and apprenticeship workshop ppt
Workplace and apprenticeship workshop pptWorkplace and apprenticeship workshop ppt
Workplace and apprenticeship workshop ppt
 
Aligning Assessments to Course Outcomes
Aligning Assessments to Course OutcomesAligning Assessments to Course Outcomes
Aligning Assessments to Course Outcomes
 
Continuous assessment and test construction
Continuous assessment and test constructionContinuous assessment and test construction
Continuous assessment and test construction
 
Online Assessment Presentation
Online Assessment PresentationOnline Assessment Presentation
Online Assessment Presentation
 
K to 12 classroom assessment ppt
K to 12 classroom assessment pptK to 12 classroom assessment ppt
K to 12 classroom assessment ppt
 
Curriculum Leadership Course 2009 Overview
Curriculum Leadership Course 2009 OverviewCurriculum Leadership Course 2009 Overview
Curriculum Leadership Course 2009 Overview
 
Innovation presentation
Innovation presentationInnovation presentation
Innovation presentation
 
WL-InstituteDay-101113-formativeassessment.key
WL-InstituteDay-101113-formativeassessment.keyWL-InstituteDay-101113-formativeassessment.key
WL-InstituteDay-101113-formativeassessment.key
 
Our assessment journey teachers edition
Our assessment journey  teachers editionOur assessment journey  teachers edition
Our assessment journey teachers edition
 
Our assessment journey teachers edition
Our assessment journey  teachers editionOur assessment journey  teachers edition
Our assessment journey teachers edition
 
Abcsassessment
AbcsassessmentAbcsassessment
Abcsassessment
 
Assessment Advice
Assessment AdviceAssessment Advice
Assessment Advice
 
Group presentation
Group presentationGroup presentation
Group presentation
 

Kürzlich hochgeladen

Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxVishalSingh1417
 
Salient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsSalient Features of India constitution especially power and functions
Salient Features of India constitution especially power and functionsKarakKing
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdfQucHHunhnh
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...Poonam Aher Patil
 
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfciinovamais
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxAreebaZafar22
 
How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17How to Give a Domain for a Field in Odoo 17
How to Give a Domain for a Field in Odoo 17Celine George
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the ClassroomPooky Knightsmith
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptxMaritesTamaniVerdade
 
ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701ComPTIA Overview | Comptia Security+ Book SY0-701
ComPTIA Overview | Comptia Security+ Book SY0-701bronxfugly43
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Jisc
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxVishalSingh1417
 
Food safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfFood safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfSherif Taha
 
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...
Jual Obat Aborsi Hongkong ( Asli No.1 ) 085657271886 Obat Penggugur Kandungan...ZurliaSoop
 
Single or Multiple melodic lines structure
Single or Multiple melodic lines structureSingle or Multiple melodic lines structure
Single or Multiple melodic lines structuredhanjurrannsibayan2
 
Graduate Outcomes Presentation Slides - English
Graduate Outcomes Presentation Slides - EnglishGraduate Outcomes Presentation Slides - English
Graduate Outcomes Presentation Slides - Englishneillewis46
 
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptxSKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptx
SKILL OF INTRODUCING THE LESSON MICRO SKILLS.pptxAmanpreet Kaur
 
Holdier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfHoldier Curriculum Vitae (April 2024).pdf
Holdier Curriculum Vitae (April 2024).pdfagholdier
 
Micro-Scholarship, What it is, How can it help me.pdf
Micro-Scholarship, What it is, How can it help me.pdfMicro-Scholarship, What it is, How can it help me.pdf
Riroe assessment workshop part #1

  • 2. Introduce yourself to your neighbor... and what do you think?
  • 3. Using Assessments: What do you think? The Most Powerful Use of My Assessment is For: ✓ Measuring Learning? ✓ Grades? ✓ Feedback for student? ✓ Measuring Growth? ✓ Another?
  • 4. Welcome to the AGE OF ASSESSMENT Lots of Choices. How do we make good choices? Consider this: Is the data you get from administering that assessment more valuable than the instructional time lost to administer the assessment?
  • 5. Welcome to the AGE OF ASSESSMENT Lots of Choices. How do we make good choices? Consider this: Is the data you get from administering that assessment more valuable than the instructional time lost to administer the assessment? Every time we give an assessment…. ...we give up instructional time.
  • 6. Measure the change in student understanding over time... Growth Tells Us-- A: Student's baseline performance (beginning of the year) B: Difference between the baseline and the student's subsequent performance (middle, end, or another year) How Much do we Assess?
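To put numbers on the A/B picture above, here is a minimal Python sketch of growth as the change from a baseline score to later scores on comparable assessments; the student names and scores are invented for illustration.

# Minimal sketch: growth = subsequent score - baseline score.
# Student names and scores below are hypothetical examples.
scores = {
    "Ava": {"baseline": 42, "midyear": 55, "endyear": 71},
    "Ben": {"baseline": 60, "midyear": 63, "endyear": 80},
}

for student, s in scores.items():
    mid_growth = s["midyear"] - s["baseline"]   # B at the mid-point check
    end_growth = s["endyear"] - s["baseline"]   # B at the end of the interval
    print(f"{student}: baseline {s['baseline']}, "
          f"mid-year growth {mid_growth:+}, end-of-year growth {end_growth:+}")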
  • 9. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps SLO: Student Learning Objectives. The Illinois recommended model framework for including growth on teacher evaluations. Assessment Design Common Question: Why are we so focused on ASSESSMENT
  • 10. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps SLO: Student Learning Objectives. The Illinois recommended model framework for including growth on teacher evaluations. Assessment Design Common Question: Why are we so focused on ASSESSMENT ….when INSTRUCTION is key to student learning?
  • 11. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps SLO: Student Learning Objectives. The Illinois recommended model framework for including growth on teacher evaluations. Assessment Design Common Question: Why are we so focused on ASSESSMENT ….when INSTRUCTION is key to student learning? ● Assessment and instruction go hand in hand.
  • 12. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps SLO: Student Learning Objectives. The Illinois recommended model framework for including growth on teacher evaluations. Assessment Design Common Question: Why are we so focused on ASSESSMENT ….when INSTRUCTION is key to student learning? ● Assessment and instruction go hand in hand. ● Assessments tell us if instruction is working.
  • 13. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps SLO: Student Learning Objectives. The Illinois recommended model framework for including growth on teacher evaluations. Assessment Design Common Question: Why are we so focused on ASSESSMENT ….when INSTRUCTION is key to student learning? ● Assessment and instruction go hand in hand. ● Assessments tell us if instruction is working. ● Assessments tell us what to do next.
  • 14. Pivot Points: Assessment Design Common Question: Why are we so focused on ASSESSMENT ….when INSTRUCTION is key to student learning? ● Assessment and instruction go hand in hand. ● Assessments tell us if instruction is working. ● Assessments tell us what to do next.
  • 15. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps Assessment Design
  • 16. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps Assessment Design Pivot Points in your lessons are the way you MAKE SURE all students are growing! MAKE SURE you and students are reaching your goals!
  • 17. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps Assessment Design Pivot Points in your lessons are the way you MAKE SURE all students are growing! MAKE SURE you and students are reaching your goals! You have to get the right kind of data to make a pivot! Key Tool: GOOD formative assessments.
  • 18. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps Formative Assessment: Assessments for learning: they occur during the instructional interval and provide information about student learning progress. Summative Assessment: Assessments of learning: they occur at the end of an instructional interval and provide a final measurement of student mastery. (Stiggins, Arter, & Chappuis, 2004) (Reeves, 2009) (DuFour et al., 2010) Assessment Design
  • 21. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps Formative Assessment: Assessments for learning: they occur during the instructional interval and provide information about student learning progress. Summative Assessment: Assessments of learning: they occur at the end of an instructional interval and provide a final measurement of student mastery. (Stiggins, Arter, & Chappuis, 2004) (Reeves, 2009) (DuFour et al., 2010) Assessment Design How an assessment is used really determines if it is FORMATIVE or SUMMATIVE! (Bailey & Jakicic, 2012)
  • 22. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps Formative Assessment: Assessments for learning: they occur during the instructional interval and provide information about student learning progress. Summative Assessment: Assessments of learning: they occur at the end of an instructional interval and provide a final measurement of student mastery. Attainment: Meeting a target on 1 test. Measurement of knowledge at a single point in time. (ex: Final Exam, Chapter Test, Spelling Test) Growth: Change between 2 tests. Change in understanding/knowledge over time. Assessment Design
  • 23. Assessment Design Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps Formative Assessment: Assessments for learning: they occur during the instructional interval and provide information about student learning progress. Summative Assessment: Assessments of learning: they occur at the end of an instructional interval and provide a final measurement of student mastery. Attainment: Meeting a target on 1 test. Measurement of knowledge at a single point in time. (ex: Final Exam, Chapter Test, Spelling Test) Growth: Change between 2 tests. Change in understanding/knowledge over time. Evaluation Connection: Measuring Student Growth is now the assessment SPOTLIGHT!
  • 24. Why Did We Start Teaching?
  • 25. Traditionally... Each unit is an unrelated silo...
  • 27. Traditionally... Each unit is an unrelated silo... You cannot measure longitudinal learning this way.
  • 29. Assessment Implementation Model Assess the threads that run throughout the instructional interval
  • 32. Assessment Implementation Model In a growth model: Baseline
  • 33. Assessment Implementation Model In a growth model: Baseline Mid-point Check
  • 34. Assessment Implementation Model In a growth model: Baseline Final Post-test Mid-point Check
  • 35. Assessment Implementation Model Is 3 a MAGIC number? No. You want to have some formative assessment between your pre-test and post-test.
  • 37. Data from Assessments “Every time we gave an assessment and realized we were headed in the wrong direction, it gave us a deeper understanding of where we wanted to go,”
  • 38. Discussion: How do we use FORMATIVE Assessments? ● How do assessment results inform curriculum and instructional choices? ● Do your tools provide information on where/how to pivot for your students? ● Is the data easy to interpret?
  • 39. Bring Growth to your Data Picture Formative Assessments: Inform teaching Multiple, mirrored formative assessments: Show growth In the classroom:
  • 40. PERA is changing teacher evaluations. ● Fall of 2016: * ■ All school districts must incorporate student growth data into evaluation instruments *RTTT Districts, SIG Districts, and districts in status have earlier implementation requirements PERA Law-ISBE Requirements The Evaluation Connection: Student Growth
  • 41. The SLO Evaluation Cycle Source: Lachlan-Haché, L., Cushing, E., & Bivona, L. (2012). Student learning objectives as measures of educator effectiveness: The basics. Washington, DC: American Institutes for Research. Retrieved from http://educatortalent.org/inc/docs/SLOs_Measures_of_Educator_Effectiveness.pdf The Evaluation Connection: Student Growth
  • 42. Bring Growth to your Data Picture On the Evaluation: Student Growth (~30%) Assessment Set Results-Growth Data Professional Responsibility (~70%) 1b: Demonstrate knowledge of students' skills (student data) 1e: Designing coherent instruction (pivot points) 1f: Designing student assessments (formative assessments and use to plan instruction) 3d: Using assessment in instruction (assessments integrated; students monitor progress) 3e: Demonstrating flexibility and responsiveness (enhance learning on individual student misunderstanding) 4a: Reflecting on Teaching (Offers alternate actions, thoughtful assessment of lesson effectiveness) 4b: Maintaining Accurate Records (Students maintain progress and learning) The Evaluation Connection: Student Growth
  • 44. Bring Growth to your Data Picture On the Evaluation: Student Growth (~30%) Assessment Set Results-Growth Data Professional Responsibility (~70%) 1b: Demonstrate knowledge of students' skills (student data) 1e: Designing coherent instruction (pivot points) 1f: Designing student assessments (formative assessments and use to plan instruction) 3d: Using assessment in instruction (assessments integrated; students monitor progress) 3e: Demonstrating flexibility and responsiveness (enhance learning on individual student misunderstanding) 4a: Reflecting on Teaching (Offers alternate actions, thoughtful assessment of lesson effectiveness) 4b: Maintaining Accurate Records (Students maintain progress and learning) There is a need for artifacts to PROVE: ● Students are learning and growing ● Teachers are designing instruction around assessment data
  • 45. The Instructional Interval What is an Instructional Interval? It depends on your district/class/course. It may be a full year or a portion of a year.
  • 47. The Elephant in the Room Timing: How will this work with other evaluation concentrations? How long will my “Instructional Interval” be?
  • 48. Gathering Data for Growth Goals The Evaluation Connection: Student Growth
  • 51. Creating Valid and Reliable Assessments for Growth How can Teachers Create or Choose assessments that can quantify change in student understanding?
  • 52. Creating Valid and Reliable Assessments for Growth How can Teachers Create or Choose assessments that can quantify change in student understanding? How can we reliably put a number to student growth?
  • 53. Creating Valid and Reliable Assessments for Growth How can Teachers Create or Choose assessments that can quantify change in student understanding? How can we reliably put a number to student growth? Key Features: 1. Mirrored Form 2. Mirrored Content 3. Mirrored Complexity
  • 54. Misconception: Test gets harder as student learns more material… In Reality: All assessments at the same level - Test does not change difficulty: More questions correct = Growth and Learning Measure Growth: All At The Same Level of Difficulty
  • 55. Misconception: Test gets harder as student learns more material… In Reality: All assessments at the same level - Test does not change difficulty: More questions correct = Growth and Learning Measure Growth: All At The Same Level of Difficulty We will discuss this more later!!
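A small sketch of that idea, assuming every form in a mirrored set has the same number of equally difficult questions (the form names and raw scores below are hypothetical): growth can then be read directly as the change in percent correct.

# Hypothetical mirrored set: three forms, same difficulty, 20 questions each.
TOTAL_QUESTIONS = 20

correct_by_form = {"pre": 8, "mid": 13, "post": 17}  # one student's raw scores

percent = {form: 100 * n / TOTAL_QUESTIONS for form, n in correct_by_form.items()}
growth = percent["post"] - percent["pre"]

print(f"pre {percent['pre']:.0f}% -> post {percent['post']:.0f}%: "
      f"growth of {growth:.0f} percentage points")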
  • 57. Assessments in a Nutshell Consider Attainment vs Growth...
  • 60. Assessments in a Nutshell My students grow and learn every day!!
  • 61. Assessments in a Nutshell My students grow and learn every day!! Student learning IS always happening in my classroom.
  • 62. Assessments in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening?
  • 63. Assessments in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change
  • 64. Assessments in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning
  • 65. Assessments in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning Measure Student Growth
  • 66. Assessments in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning Measure Student Growth Key: High Quality, Mirrored Assessment Sets
  • 67. Assessments in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning Measure Student Growth Important Term: Mirrored Assessment Set: A series of comparable assessments that can measure learning over 2 or more points in time. They are designed with the same form, content, and level of complexity.
  • 68. Growth in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning Measure Student Growth Key: High Quality, Assessment Sets Important Terms: High Validity: Accurately measure what we intend to measure High Reliability: Repeatable Results
  • 69. Growth in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning Measure Student Growth Key: High Quality, Assessment Sets High Validity: Accurately measure what we intend to measure High Reliability: Repeatable Results Ask: What are you trying to measure? Bad Examples: ● A writing assignment with a difficult reading passage ● A reading passage that allows students of a particular cultural background to have an advantage over others
  • 71. Growth in a Nutshell So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning Measure Student Growth Key: High Quality, Assessment Sets High Validity: Accurately measure what we intend to measure High Reliability: Repeatable Results Ask: Is this repeatable? Bad Examples: ● A rubric at several points in the same school year ● Different directions to different classes
  • 73. ✓ Design a Blueprint 1. Consider course skills to measure 2. Consider format to best measure those skills 3. Consider complexity of execution of those skills ✓ Repeat the Blueprint Pattern 1. Repeated results will inform pivot points ✓ Consider Assessment Validity ✓ Consider Assessment Reliability HOW CAN WE DO IT? Build High Quality Assessment Sets
  • 75. Blueprint SNEAK PREVIEW: Blueprints are tools for: ● Organizing standard alignment ● Organizing spectrum of question complexity ● Creating a repeatable model to monitor student learning on skills over time
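One way to picture a blueprint is as a small table that fixes, for each essential skill, the question form, the level of cognitive demand, and the number of questions; reusing the same table for each assessment in the set is what keeps the forms mirrored. The skills, levels, and counts in this Python sketch are hypothetical, not taken from the workshop materials.

# Hypothetical blueprint: each row fixes skill, question form, cognitive
# level, and question count. Reusing this pattern for the pre-, mid-, and
# post-test is what makes the set "mirrored."
blueprint = [
    {"skill": "making inferences", "form": "selected response",    "level": "DOK2", "questions": 4},
    {"skill": "citing evidence",   "form": "constructed response", "level": "DOK3", "questions": 2},
    {"skill": "summarizing",       "form": "selected response",    "level": "DOK2", "questions": 3},
]

def describe(assessment_name, rows):
    print(f"{assessment_name}:")
    for row in rows:
        print(f"  {row['questions']} x {row['level']} {row['form']} question(s) on {row['skill']}")

# The same blueprint generates every assessment in the set.
for name in ("Baseline pretest", "Mid-point check", "Final post-test"):
    describe(name, blueprint)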
  • 79. ✓ Design a Blueprint 1. Consider course skills to measure 2. Consider format to best measure those skills 3. Consider complexity of execution of those skills ✓ Repeat the Blueprint Pattern 1. Repeated results will inform pivot points ✓ Consider Assessment Validity ✓ Consider Assessment Reliability HOW CAN WE DO IT? Build High Quality Assessment Sets Let’s go step-by-step… and set the scaffolding to write high quality assessments!
  • 80. Creating Quality Assessments The 3 Elements Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 82. Aligning to Common Core Essential Skills/Knowledge Imagine a list of everything taught in an instructional interval.
  • 83. Aligning to Common Core Essential Skills/Knowledge Imagine a list of everything taught in an instructional interval. Will you assess and re-assess EVERYTHING to monitor student growth?
  • 84. Aligning to Common Core Essential Skills/Knowledge Imagine a list of everything taught in an instructional interval. Will you assess and re-assess EVERYTHING to monitor student growth? Not likely! Would you have time? No!
  • 85. Aligning to Common Core Essential Skills/Knowledge Imagine a list of everything taught in an instructional interval. Will you assess and re-assess EVERYTHING to monitor student growth? Not likely! Start by identifying Essential Skills/Knowledge
  • 86. I teach so many different things… How can I determine what is “essential?” How do we prioritize? Essential Skills/Knowledge
  • 87. 1. Endurance: Will this standard or indicator provide students with knowledge and skills that will be of value beyond a single test date? This is information a student will need to know far beyond the last test the teacher gives. 2. Leverage: Will this provide knowledge and skills that will be of value in multiple disciplines? (For example: making inferences is a skill that can be used in many subjects) 3. Readiness for the next level of learning: Will this provide students with essential knowledge and skills that are necessary for success in the next grade or the next level of instruction? Ainsworth, L. (2003) How do we prioritize? Essential Skills/Knowledge
  • 88. 1. Endurance: Will this standard or indicator provide students with knowledge and skills that will be of value beyond a single test date? This is information a student will need to know far beyond the last test the teacher gives. 2. Leverage: Will this provide knowledge and skills that will be of value in multiple disciplines? (For example: making inferences is a skill that can be used in many subjects) 3. Readiness for the next level of learning: Will this provide students with essential knowledge and skills that are necessary for success in the next grade or the next level of instruction? Ainsworth, L. (2003) How do we prioritize? Essential Skills/Knowledge Examples: #1: Name the parts of the flower. IF you ask students to do this for one test and never again all year long...it doesn’t have endurance. #2: Clap out a rhythm of quarter notes and half notes. IF you will revisit quarter notes and half notes again and again, they have endurance.
  • 90. 1. Endurance: Will this standard or indicator provide students with knowledge and skills that will be of value beyond a single test date? This is information a student will need to know far beyond the last test the teacher gives. 2. Leverage: Will this provide knowledge and skills that will be of value in multiple disciplines? (For example: making inferences is a skill that can be used in many subjects) 3. Readiness for the next level of learning: Will this provide students with essential knowledge and skills that are necessary for success in the next grade or the next level of instruction? Ainsworth, L. (2003) How do we prioritize? Essential Skills/Knowledge Example: #1: Making Inferences. Students can make inferences in science, reading, consumer economics and more. It applies to many disciplines.
  • 92. 1. Endurance: Will this standard or indicator provide students with knowledge and skills that will be of value beyond a single test date? This is information a student will need to know far beyond the last test the teacher gives. 2. Leverage: Will this provide knowledge and skills that will be of value in multiple disciplines? (For example: making inferences is a skill that can be used in many subjects) 3. Readiness for the next level of learning: Will this provide students with essential knowledge and skills that are necessary for success in the next grade or the next level of instruction? Ainsworth, L. (2003) How do we prioritize? Essential Skills/Knowledge Examples: #1: Absolute Value: Math If you believe not understanding absolute value in 6th grade will prevent success at 7th grade math...it is essential. #2: Foods I to Foods II If there are skills that you MUST master in Foods I for success in Foods II, those are your essential skills.
  • 94. I teach so many different things… How can I determine what is “essential?” How do we prioritize? Essential Skills/Knowledge Typically, this is what you have an opportunity to teach again and again in an instructional interval.
  • 95. I teach so many different things… How can I determine what is “essential?” How do we prioritize? Essential Skills/Knowledge Typically, this is what you have an opportunity to teach again and again in an instructional interval. We can look to PARCC for guidance...
  • 96. How do we prioritize? Essential Skills/Knowledge Evidence Tables Provides the assessment boundaries/limits for all content standards that will be assessed on the PBA and EOY assessments for Grades 3-11 ● The PARCC Assessment ● Assessment System ● Blueprints and Test Specs ● Scroll all the way down to the PDFs and choose specific level http://www.parcconline.org/assessment-blueprints-test-specs High Level Blueprints Lists the number of items and points for each type of task that will appear on the PBA and EOY assessments for Grades 3-11 ● The PARCC Assessment ● Assessment System ● Blueprints and Test Specs ● Scroll all the way down to the PDFs and choose specific level http://www.parcconline.org/assessment-blueprints-test-specs Estimated Time on Task Lists the amount of time students will have to complete PBA and EOY assessments for Grades 3-11 ● The PARCC Assessment ● Policies and Guidelines ● Administering the Test ● Unit Testing Times http://parcconline.org/update-session-times Performance Level Descriptors (PLDs) PLDs describe the skills, knowledge and practices a student who has achieved a particular performance level should be able to demonstrate ● The PARCC Assessment ● Assessment System ● Mathematics PLDs or Literacy PLDs ● Scroll down to specific grade level Math PLDs http://www.parcconline.org/math-plds ELA/Literacy PLDs http://www.parcconline.org/ela-plds
  • 97. How do we prioritize? Essential Skills/Knowledge Online Practice Tests Access to Sample items, test items, and a tutorial with TestNav8. This was developed so that all students can experience PARCC test items in the correct environment ● The PARCC Assessment ● PARCC practice tests (middle of 1st paragraph) ● View Test Preparation ● Practice Tests ● Choose Math or ELA ● Choose grade level Math Practice Tests http://parcc.pearson.com/practice-tests/math/ ELA/Literacy Practice Tests http://parcc.pearson.com/practice-tests/english/
  • 98. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices)
  • 99. Assessment System Big Idea/Unit A Content Skills #1 #3 #4 Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices)
  • 100. Assessment System Big Idea/Unit A Content Skills #1 #3 #4 Big Idea/Unit B Content Skills # 2 # 3 Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices)
  • 102. Assessment System Big Idea/Unit A Content Skills #1 #3 #4 Big Idea/Unit B Content Skills # 2 # 3 Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Different in every unit
  • 103. Assessment System Big Idea/Unit A Content Skills #1 #3 #4 Big Idea/Unit B Content Skills # 2 # 3 Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Measured for growth over the course of many units
  • 106. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills
  • 107. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills Summative Posttest: All Skills
  • 108. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills Summative Posttest: All Skills Big Idea/Unit A Content Skills #1 #3 #4 Big Idea/Unit B Content Skills #2 #3
  • 109. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills Summative Posttest: All Skills Big Idea/Unit A Content Skills #1 #3 #4
  • 110. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills Summative Posttest: All Skills Big Idea/Unit B Content Skills #2 #3
  • 113. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills Summative Posttest: All Skills Formative Assessments: Pivot Points
  • 114. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills Summative Posttest: All Skills Formative Assessments: Pivot Points Notice some skills are not assessed until late in the year if they are not covered until late in the year...
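As a sketch of the system above in Python (unit names, skill numbers, and checkpoints are hypothetical): every class skill appears on the pretest and post-test, while each formative check covers only the skills its unit has introduced.

# Hypothetical assessment calendar: skills are only checked formatively
# after the unit that introduces them, but all appear on pre- and post-test.
class_skills = [1, 2, 3, 4]

assessments = [
    {"name": "Baseline pretest", "skills": class_skills},
    {"name": "Unit A check",     "skills": [1, 3, 4]},   # pivot point
    {"name": "Unit B check",     "skills": [2, 3]},      # pivot point
    {"name": "Final post-test",  "skills": class_skills},
]

for skill in class_skills:
    touched = [a["name"] for a in assessments if skill in a["skills"]]
    print(f"Skill {skill} measured by: {', '.join(touched)}")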
  • 115. Assessment System Big Idea/Unit A Content Skills #1 #3 #4 Big Idea/Unit B Content Skills # 2 # 3 Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Could there be some content memorization that is foundational and essential? Ex: memorize notes for music theory class...
  • 116. No Assessment should be 100% Recall A balanced assessment has an appropriate number of questions at each level of cognitive demand. It is reflective of the range of cognitive demand required in the class it measures. The Balanced Assessment
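One rough way to check balance is to compare the share of questions at each level of cognitive demand against targets the team agrees on; the question tags and target percentages in this Python sketch are invented for illustration.

from collections import Counter

# Hypothetical: each question on a draft test tagged with its DOK level.
question_levels = ["DOK1", "DOK1", "DOK2", "DOK2", "DOK2", "DOK3", "DOK2", "DOK3"]

# Hypothetical team-agreed targets (shares of the whole test).
targets = {"DOK1": 0.25, "DOK2": 0.50, "DOK3": 0.25}

counts = Counter(question_levels)
total = len(question_levels)
for level, target in targets.items():
    actual = counts[level] / total
    print(f"{level}: {actual:.0%} of test (target {target:.0%})")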
  • 118. Discussion: Take Aways ● What skills should be measured? How often? ● How should teachers prioritize? ● Do we have longitudinal ways to measure learning on comparable assessments? ● Is the data easy to interpret?
  • 119. Creating Quality Assessments The 3 Elements Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 120. Comparable assessments in a set are of the same format Selected Response Assessments: Ask students to select the correct answer from a provided set of answers. Constructed Response Assessments: Ask students to construct their own answer to a question. Performance Assessments: Ask students to demonstrate understanding by performing or creating a product. Be Intentional about Question Form
  • 122. Comparable assessments in a set are of the same format Selected Response Assessments: Ask students to select the correct answer from a provided set of answers. Constructed Response Assessments: Ask students to construct their own answer to a question. Performance Assessments: Ask students to demonstrate understanding by performing or creating a product. Be Intentional about Question Form Teacher Time is at a Premium! Invest in question forms that will: ● Tell teachers about student thinking ● Tell teachers what to do next Consider Your Time Investment: Will a constructed response version of this question tell me more about student thinking than a multiple choice version?
  • 125. Inter-Rater Reliability: ● Same/Repeatable results when 2+ people are using 1 rubric ● Practice as a team, share samples, keep on file Intra-Rater Reliability: ● Repeatable results when 1 person is using 1 rubric ● Photograph/record samples, keep on file Reliability: Consistent, Repeatable Results
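One simple way to put a number on inter-rater reliability is percent agreement between two teachers scoring the same student work with one rubric (statistics such as Cohen's kappa are the usual next step); the ratings in this Python sketch are hypothetical.

# Hypothetical: two raters score the same ten papers on a 4-level rubric.
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]

matches = sum(a == b for a, b in zip(rater_a, rater_b))
agreement = matches / len(rater_a)
print(f"Exact agreement: {agreement:.0%}")  # 80% with these sample ratings

# Teams often also track "adjacent agreement" (within one rubric level).
adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b))
print(f"Within-one-level agreement: {adjacent / len(rater_a):.0%}")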
  • 126. Selected Response Assessments: ● Take more time to write ● Take less time to grade ● Write good distractors: wrong answers should point out misconceptions ● May not be appropriate for all ages or subjects ● Assess what you intend to assess (validity) ● Be consistent in administration technique (directions, etc.) Considerations in Selecting Question Form
  • 128. Writing Wrong Answers The Correct Answer: This is the clear and undisputed correct answer to the question. Incorrect Choices: #1 The Unsupported Statement: #2 The Distortion of the Truth: #3 The Skip & Switch: #4 The Extremist:
  • 129. Writing Wrong Answers The Correct Answer: This is the clear and undisputed correct answer to the question. Incorrect Choices: #1 The Unsupported Statement: #2 The Distortion of the Truth: #3 The Skip & Switch: #4 The Extremist: See More: http://kidsatthecore.com/writing-wrong-answers/
  • 130. Writing Wrong Answers The Correct Answer: This is the clear and undisputed correct answer to the question. Incorrect Choices: #1 The Unsupported Statement: #2 The Distortion of the Truth: #3 The Skip & Switch: #4 The Extremist: The Unsupported Statement: This statement could possibly be true but is not supported by the text, evidence, or data provided. Often this choice sounds “smart” and includes big words or appeals to a reader’s biases. It might appear to be universally true, but is not supported by the evidence provided.
  • 131. Writing Wrong Answers The Correct Answer: This is the clear and undisputed correct answer to the question. Incorrect Choices: #1 The Unsupported Statement: #2 The Distortion of the Truth: #3 The Skip & Switch: #4 The Extremist: The Distortion of the Truth: This choice represents a conclusion that distorts what is provided in the text, evidence or data. It is not supported by the passage or the data provided. It might disagree with the meaning of the text, evidence, or data, or draw a conclusion beyond what the provided information supports. It might take words/phrases out of context.
  • 132. Writing Wrong Answers The Correct Answer: This is the clear and undisputed correct answer to the question. Incorrect Choices: #1 The Unsupported Statement: #2 The Distortion of the Truth: #3 The Skip & Switch: #4 The Extremist: The Skip & Switch: This is the choice a student reaches if a step in thinking is skipped or altered. This is a common variation of wrong answers in math problems. This choice might give a correct answer to another question. It might be a choice reached by solving the problem with the incorrect method. Often these are the choices that teachers will catch if students scan and hunt for answers or skip steps in the problem.
  • 133. Writing Wrong Answers The Correct Answer: This is the clear and undisputed correct answer to the question. Incorrect Choices: #1 The Unsupported Statement: #2 The Distortion of the Truth: #3 The Skip & Switch: #4 The Extremist: The Extremist: These choices contain exaggerations of the truth and use extreme words like “everyone” or “all the time” or “never” when not supported by the text, evidence or data. For the more discerning students, these are typically choices that can be easily eliminated with a more careful lens.
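Because each wrong answer is written to expose a specific misconception, tagging the distractors turns item results into a reteaching list. The question, choices, tags, and responses in this Python sketch are hypothetical examples of the distractor types above.

# Hypothetical item: each distractor tagged with the misconception it exposes.
item = {
    "question": "What conclusion does the passage support?",
    "choices": {
        "A": ("correct answer", None),
        "B": ("unsupported statement", "plausible-sounding but not in the text"),
        "C": ("distortion of the truth", "quote taken out of context"),
        "D": ("extremist", "overgeneralizes with 'always'"),
    },
}

# Hypothetical class responses.
responses = {"Ava": "A", "Ben": "C", "Cam": "C", "Dia": "D"}

for student, choice in responses.items():
    label, misconception = item["choices"][choice]
    if misconception is None:
        print(f"{student}: correct")
    else:
        print(f"{student}: chose the {label} -> reteach: {misconception}")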
  • 134. Constructed Response Assessments: ● Take more time to grade ● Require a specific rubric ● Require inter-rater and intra-rater reliability ● Write descriptive rubrics: less than full credit should point out misconceptions, next steps ● May not be appropriate for all ages or subjects ● Be consistent in administration technique (directions, etc.) Considerations in Selecting Question Form
  • 135. Constructed Response Assessments: ● Take more time to grade ● Require a specific rubric ● Require inter-rater and intra-rater reliability ● Write descriptive rubrics: less than full credit should point out misconceptions, next steps ● May not be appropriate for all ages or subjects ● Be consistent in administration technique (directions, etc.) Considerations in Selecting Question Form We will discuss in detail shortly...
  • 136. Performance Assessments: ● Take more time to grade and administer ● Require a specific rubric ● Require inter-rater and intra-rater reliability ● Write descriptive rubrics: less than full credit should point out misconceptions, next steps ● May not be appropriate for all ages or subjects ● Be consistent in administration technique (directions, etc.) Considerations in Selecting Question Form
  • 137. Performance Assessments: ● Take more time to grade and administer ● Require a specific rubric ● Require inter-rater and intra-rater reliability ● Write descriptive rubrics: less than full credit should point out misconceptions, next steps ● May not be appropriate for all ages or subjects ● Be consistent in administration technique (directions, etc.) Considerations in Selecting Question Form Let’s discuss this more deeply now...
  • 139. Understanding Rubrics ROWS: Each of the criteria you plan to measure.
  • 140. Understanding Rubrics ROWS: Each of the criteria you plan to measure. COLUMNS: Potential levels of performance ranging from “ideal” to not evident.
  • 141. Understanding Rubrics ROWS: Each of the criteria you plan to measure. COLUMNS: Potential levels of performance ranging from “ideal” to not evident. Rating Boxes: Detailed descriptions of performance at that level for that skill.
  • 147. Understanding Rubrics Rating Boxes: Detailed descriptions of performance at that level for that skill. COLUMNS: Potential levels of performance ranging from “ideal” to not evident. ROWS: Each of the criteria you plan to measure. This is NOT an example of a descriptive rubric
  • 148. Understanding Rubrics Rating Boxes: Detailed descriptions of performance at that level for that skill. COLUMNS: Potential levels of performance ranging from “ideal” to not evident. ROWS: Each of the criteria you plan to measure. This is NOT an example of a descriptive rubric Some of the time Most of the time Fair Poor
  • 150. Understanding Rubrics ROWS: Each of the criteria you plan to measure. Determining Appropriate Criteria: ● Areas key to student learning (essential skills) ● Definable & Observable ● Not generally characteristic of the task itself but rather characteristics of the learning outcomes the assessment is designed to measure Effective Rubrics do not list all possible criteria; they list the right criteria for the assessment’s purpose
  • 152. Understanding Rubrics ROWS: Each of the criteria you plan to measure. Determining Appropriate Criteria: ● Areas key to student learning (essential skills) ● DOD: Definable, Observable & Distinct ● Not generally characteristic of the task itself but rather characteristics of the learning outcomes the assessment is designed to measure Effective Rubrics do not list all possible criteria; they list the right criteria for the assessment’s purpose
  • 153. Understanding Rubrics ROWS: Each of the criteria you plan to measure. Determining Appropriate Criteria: ● Areas key to student learning (essential skills) ● DOD: Definable, Observable & Distinct ● Not generally characteristic of the task itself but rather characteristics of the learning outcomes the assessment is designed to measure Effective Rubrics do not list all possible criteria; they list the right criteria for the assessment’s purpose Definable: Each criterion has a clear, agreed upon meaning that teacher and student understand. Observable: Each criterion describes a quality in the performance that can be perceived (seen, heard). Distinct: Each criterion identifies a separate aspect of the learning outcomes.
  • 156. Understanding Rubrics ROWS: Each of the criteria you plan to measure. Determining Appropriate Criteria: ● Areas key to student learning (essential skills) ● DOD: Definable, Observable & Distinct ● Not generally characteristic of the task itself but rather characteristics of the enduring learning outcomes the assessment is designed to measure. These measurement criteria could then apply to several prompts, tasks, or performances. Effective Rubrics do not list all possible criteria; they list the right criteria for the assessment’s purpose.
  • 159. Understanding Rubrics COLUMNS: Potential levels of performance ranging from “ideal” to not evident. ROWS: Each of the criteria you plan to measure. Performance Level Descriptions: Hint: Think--what does performance look like at high quality? Work backwards. Descriptive Descriptive language that depicts what one would observe, NOT quality conclusions. (ex: good vs poor) Clear Students and teachers both understand meanings of descriptions. Teachers use in a repeatable fashion. Whole Range & Distinguishable Levels Performance levels are described from one extreme to the other. Descriptions differ enough from one level to another that work can be categorized. (Ex: number of levels may be multiple or only 2) Parallel Descriptions Performance descriptors at each level of the continuum differ in the same aspects of the work and focus on incremental improvements in a standard or skill.
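Pulling the rows, columns, and rating boxes together: a rubric can be sketched as a grid keyed by criterion and performance level, with each cell holding an observable descriptor, which is also what makes scores comparable across measurement points. The criteria and descriptors in this Python sketch are hypothetical.

# Hypothetical rubric: rows are criteria, columns are performance levels,
# each cell holds the observable descriptor for that level of that skill.
LEVELS = ["Not evident", "Developing", "Proficient", "Ideal"]

rubric = {
    "cites evidence": [
        "No textual evidence offered",
        "Evidence offered but not tied to the claim",
        "Relevant evidence tied to the claim",
        "Multiple pieces of well-chosen evidence, each explained",
    ],
    "organization": [
        "No discernible structure",
        "Some structure; transitions missing",
        "Clear structure with transitions",
        "Structure strengthens the argument throughout",
    ],
}

def score(ratings):
    # ratings: criterion -> level index (0..3); returns level name + descriptor
    return {c: (LEVELS[i], rubric[c][i]) for c, i in ratings.items()}

# Example: scoring one student's essay.
for criterion, (level, descriptor) in score(
        {"cites evidence": 2, "organization": 1}).items():
    print(f"{criterion}: {level} -- {descriptor}")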
  • 163. Understanding Rubrics: For Growth Data COLUMNS: Potential levels of performance ranging from “ideal” to not evident. ROWS: Each of the criteria you plan to measure. Spotlight on Growth for Evaluations COMPARABLE DATA: ● End of year Expectations ● Use Rubric Consistently ● High Reliability Gradebook Problem: How can I grade at week 2 with end of year expectations? ● Use a curve (“A” is not only for highest rating) ● Allow revisions The Evaluation Connection: Student Growth
  • 166. Performance Assessments: Consider when Collecting Baseline Data: ● Exasperation cut points ● Safety Issues ● Advertisement for class Understanding Rubrics The Evaluation Connection: Student Growth
  • 167. Rubrics as Tools for Teacher: ● Defines “quality” for a process, product or behavior ● Powerful for teaching and learning ○ Descriptions help communicate what it takes to succeed (This shouldn’t be a secret kept by the teachers!) ● Increase reliability of results ○ Increase consistency in grading between students ○ Increase consistency in grading between measurement points Understanding Rubrics
  • 169. Rubrics as Tools for Student: ● Defines “quality” for a process, product or behavior ● Encourage more thoughtful judgement of their own (and others’) work. ● Increases ability to “self-assess” ● Increases student responsibility Understanding Rubrics
  • 170. Inter-Rater Reliability: ● Same/Repeatable results when 2+ people are using 1 rubric ● Practice as a team, share samples, keep on file Intra-Rater Reliability: ● Repeatable results when 1 person is using 1 rubric ● Photograph/record samples, keep on file Understanding Rubrics
  • 171. Discussion: Take Aways ● What are considerations for selected response? ● What are considerations for performance or constructed response? ● Is the data easy to interpret?
  • 172. Creating Quality Assessments The 3 Elements Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 173. Determine the Spectrum of Complexity Acknowledging the various levels of cognitive demand within each standard will help teachers write questions at a consistent cognitive level across the assessment sets, thus allowing the sets to mirror in complexity.
  • 174. Be Intentional about Question Complexity Mirrored questions - same range of cognitive demand. Bloom's Taxonomy (Revised) Marzano's Taxonomy Webb's Depth of Knowledge Remembering Level 1: Retrieval Recall and reproduction (DOK1) Understanding Level 2: Comprehension Skills and concepts (DOK2) Applying Level 3: Analysis Strategic thinking/complex reasoning (DOK3) Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning (DOK4) Evaluating Level 5: Metacognition Creating Level 6: Self-System Thinking
  • 175. Be Intentional about Question Complexity Mirrored questions - same range of cognitive demand. Blooms Taxonomy (Revised) Marzano's Taxonomy Webb's Depth of Knowledge Remembering Level 1: Retrieval Recall and reproduction (DOK1) Understanding Level 2: Comprehension Skills and concepts (DOK2) Applying Level 3: Analysis Strategic thinking/complex reasoning (DOK3) Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning (DOK4) Evaluating Level 5: Metacognition Creating Level 6: Self-System Thinking These are the EASIEST questions. * Recall this fact. * Answer a question about a passage that’s “right there;” the student could underline the answer.
  • 176. Be Intentional about Question Complexity Mirrored questions - same range of cognitive demand. Blooms Taxonomy (Revised) Marzano's Taxonomy Webb's Depth of Knowledge Remembering Level 1: Retrieval Recall and reproduction (DOK1) Understanding Level 2: Comprehension Skills and concepts (DOK2) Applying Level 3: Analysis Strategic thinking/complex reasoning (DOK3) Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning (DOK4) Evaluating Level 5: Metacognition Creating Level 6: Self-System Thinking These are a bit HARDER... * Recall a fact and apply it to a new situation. * Answer a question about a passage that requires inferencing; the student must “read between the lines”
  • 177. Be Intentional about Question Complexity Mirrored questions - same range of cognitive demand. Blooms Taxonomy (Revised) Marzano's Taxonomy Webb's Depth of Knowledge Remembering Level 1: Retrieval Recall and reproduction (DOK1) Understanding Level 2: Comprehension Skills and concepts (DOK2) Applying Level 3: Analysis Strategic thinking/complex reasoning (DOK3) Analyzing Level 4: Knowledge Utilization Extended thinking/reasoning (DOK4) Evaluating Level 5: Metacognition Creating Level 6: Self-System Thinking These are the HARDEST questions. * Recall facts and use them to evaluate a new situation or create a new product. * Evaluate between multiple positions or opinions or ideas in a passage
  • 178. Be Intentional about Question Complexity
  Standard | Basic: Remember & Understand | Standard: Apply & Analyze | Expanded: Evaluate & Create
  RL 4.1 & RI 4.1 | Identify explicit information: What does the author mean by “quote”...? What is the purpose of this...? | Analyze explicit information, making inferences: What inferences can you make about...? Which of these examples tells us why...? | Evaluate explicit information and inferences (defend a position): Why do you believe...? Is there a better solution to the character’s problem...?
  RL 4.2 & RI 4.2 | Identify theme/idea: What is the theme of this story (text)...? What is the message of this text (poem, story, etc.)...? | Analyze theme/idea: How do the character’s actions support the theme...? What are the most important events in the story...? (RL) Which of these is a good summary sentence...? | Evaluate theme/idea: Which of these details does not support the main idea (message)...?
  RL 4.3 & RI 4.3 | Identify elements: What is the setting of the story...? Which describes character X...? Which describes the setting (time, place, social environment)...? Which of these details (quotes) describes character X...? (RL) | Analyze elements: Why might ____ have happened...? (RL) What is the first (second) step in the procedure...? (RI) What was the effect of _____’s idea...? (RI) | Evaluate elements: Did the environment affect the outcome of the story...? Create a scenario: How would you imagine the events from the text affecting you today...?
  • 179. Teachers can struggle with writing targeted questions. The best tool has been question stems: KidsAtTheCore.com/question-stems/
  • 181. Be Intentional about Question Complexity Keep the number of questions in each level of complexity consistent! Commit to a number of hard questions… Commit to a number of easy questions…. And stick with that pattern on all assessments so they are comparable in difficulty! Key: This is how you get COMPARABLE results! Difficulty level remains CONSTANT
  • 183. Be Intentional about Question Complexity Keep the number of questions in each level of complexity consistent! Key Standard/ Objective Basic: (Remember & Understand) Standard: (Apply & Analyze) Expanded: (Evaluate & Create) Key Ideas and Details 2 questions 5 questions 2 questions Craft and Structure 2 questions 4 questions 3 questions Integration of Ideas 2 questions 3 questions 1 question Assessment Total: 6/24 questions =25% of test 12/24 questions =50% of test 6/24 questions =25% of test For Example, in Assessments A, B and C: Is 25%, 50%, 25% a magic formula? NO! Think about your class/course and what would make an appropriate balanced assessment! Example: Honors class? Remedial Class?
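As a sanity check, a team could verify each draft assessment against the committed pattern. Here is a minimal sketch, assuming the 25%/50%/25% example above (the function and sample data are hypothetical, not from the deck):

```python
# Hypothetical sketch: check that a draft assessment matches the committed
# blueprint so all assessments in a mirrored set stay comparable in difficulty.
from collections import Counter

BLUEPRINT = {"basic": 6, "standard": 12, "expanded": 6}  # counts per complexity level

def matches_blueprint(question_levels, blueprint=BLUEPRINT):
    """True only if the draft has exactly the committed count at each level."""
    return Counter(question_levels) == Counter(blueprint)

# A draft of Assessment B with one easy question too few and one hard one too many:
draft_b = ["basic"] * 5 + ["standard"] * 12 + ["expanded"] * 7
print(matches_blueprint(draft_b))  # False: B would not be comparable to A and C
```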
  • 185. The Evaluation Connection: Student Growth Misconception: Test gets harder as student learns more material… In Reality: All assessments at the same level - Test does not change difficulty: More questions correct = Growth and Learning Measure Growth: All At The Same Level of Difficulty
  • 187. The Evaluation Connection: Student Growth. Measure Growth: All at the Same Level of Difficulty. “OH NO! That means students will do terribly on my first assessment!” That is the nature of a baseline assessment! You are testing students on material you have not taught yet! It is important how the teacher frames it... For example, say: “This is how I am going to find out what you know. By the time we are done, you are going to ‘own’ this!”
  • 190. Comparable assessments in a set are of the same format Selected Response Assessments: Ask students to select the correct answer from a provided set of answers. Constructed Response Assessments: Ask students to construct their own answer to a question. Performance Assessments: Ask students to demonstrate understanding by performing or creating a product. Be Intentional about Question Complexity & Form Teacher Time is at a Premium! Invest in question forms that will: ● Tell teachers about student thinking ● Tell teachers what to do next Consider Your Time Investment: Will a constructed response version of this question tell me more about student thinking than a multiple choice version? EXAMPLE: Basic: Remember/Understand = Multiple Choice Expanded: Evaluate/Create = Constructed Response
  • 191. Be Intentional about Question Complexity & Form: Aligning Questions to Assessment Form
  | Selected Response | Constructed Response | Performance Task | Personal Communication/Discussion
  Basic: Knowledge/Remember/Understand | Good Match | Good Match | Not Good Match | Partial Match
  Standard: Reasoning/Apply/Analyze | Partial Match | Good Match | Good Match | Good Match
  Expanded: Evaluate | Partial Match | Good Match | Good Match | Partial Match
  Create Product | Not Good Match | Partial Match | Good Match | Not Good Match
  Adapted from Classroom Assessment for Student Learning (Stiggins, Arter, Chappuis & Chappuis, 2006)
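The table above can double as a planning tool. Here is a hypothetical lookup version of it (the helper function and labels are ours, not from the deck or the book) that a team could use to flag weak question-form pairings while drafting a blueprint:

```python
# Lookup built from the alignment table above (adapted from Stiggins, Arter,
# Chappuis & Chappuis, 2006). Keys and the check_pairing helper are hypothetical.
ALIGNMENT = {
    "basic":    {"selected": "good",    "constructed": "good",
                 "performance": "poor", "discussion": "partial"},
    "standard": {"selected": "partial", "constructed": "good",
                 "performance": "good", "discussion": "good"},
    "expanded": {"selected": "partial", "constructed": "good",
                 "performance": "good", "discussion": "partial"},
    "create":   {"selected": "poor",    "constructed": "partial",
                 "performance": "good", "discussion": "poor"},
}

def check_pairing(complexity, form):
    """Warn when a question form is a weak match for the intended complexity."""
    match = ALIGNMENT[complexity][form]
    if match != "good":
        print(f"Warning: {form} is only a {match} match for {complexity}-level items")

check_pairing("create", "selected")  # Warning: selected is only a poor match ...
```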
  • 192. Discussion: Take Aways ● What are considerations for ranges of complexity? ● Should some grades/subjects have different ranges of complexity? ● Are we gathering useful data? ● Is the data easy to interpret? (what does it mean to get hard/easy questions right/wrong?)
  • 193. Quick Activity about Question Complexity & Form. TASK: Write 3 questions, each targeting a specific standard and a specific level of cognitive demand. CHOICE: 1) Grade 4 ELA 2) Grade 11 Social Studies & ELA 3) Grade 3 Math 4) HS Mathematics/Science. Standard | Basic Question | Standard Question | Expanded Question
  • 194. Spotlight on Evaluation. Quality Assessments for GROWTH DATA The Evaluation Connection: Student Growth
  • 195. Remember from earlier…. So… HOW can we gather information/data/artifacts to prove growth is happening? Measure Change Measure Student Learning Measure Student Growth Important Term: Mirrored Assessment Set: A series of comparable assessments that can measure learning over 2 or more points in time. They are designed with the same form, content, and level of complexity.
  • 196. Growth: Measurable change between two points in time Attainment: Meeting an outcome or target. (ex: Mastery of a skill) Mirrored Assessments: Assessments that can be compared for student growth. They are designed with the same form, content, and level of complexity. Assessment Set: A series of mirrored assessments designed to measure student growth on a specific set of learning targets/content--COMPARABLE Essential Skills: Key skills that are a requirement for success at the next level or for the scaffolding of skills that are going to be taught. Spectrum of Assessment Complexity: The range of cognitive levels within a skill or question. Pivot Points: Places in the teacher’s lesson where student growth data will determine the next teaching steps SLO: Student Learning Objectives. The Illinois recommended model framework for including growth on teacher evaluations. Key Vocabulary & Terms
  • 197. Creating Assessments for Growth Keep the 3 Elements Consistent Mirroring Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 198. The Blueprint. Your key to quality design and alignment
  • 201. Assessment System Class Skills 1, 2, 3, 4, 5, 6, 7… (ex: NGSS Scientific Practices) Baseline Pretest: All Skills Summative Posttest: All Skills Formative Assessments: Pivot Points
  • 202. Example 1: Assessments A, B, and C should mirror each other in terms of difficulty, standards assessed, question stems, etc. Each interim assessment contains an Informational Passage and a Literature Passage, with 3 Key Idea questions, 3 Craft and Structure questions, and 3 Integration of Ideas questions per passage: 18 questions total per assessment.
  • 206. Let’s Make a Blueprint: Teachers can design their own, or pick from available options. Assessment Blueprint Development Protocol. Step One: Identify what essential skills & knowledge you will assess. (Circle, Triangle, Square analogy) Step Two: Select the form(s) for your assessment. Step Three: Determine the number of items at each level of cognitive demand.
  Essential Skill & Knowledge | Basic Level 1 (Remember & Understand) | Standard Level 2 (Apply & Analyze) | Expanded Level 3 (Evaluate & Create)
  Analyze the development of a central theme | 1 MC | 2 MC | 1 MC
  Try writing the question right into the graphic organizer.
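For teams that keep blueprints electronically, here is one possible way to encode a blueprint row like the example above (the structure and field names are ours, not the deck's):

```python
# Hypothetical data structure for one blueprint row: an essential skill with
# item counts and question forms at each level of cognitive demand.
blueprint_row = {
    "skill": "Analyze the development of a central theme",
    "basic":    {"items": 1, "form": "MC"},  # Remember & Understand
    "standard": {"items": 2, "form": "MC"},  # Apply & Analyze
    "expanded": {"items": 1, "form": "MC"},  # Evaluate & Create
}

total_items = sum(v["items"] for k, v in blueprint_row.items() if k != "skill")
print(total_items)  # 4 items for this skill, repeated on every assessment in the set
```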
  • 208. EXAMPLE Blueprint Set: EXCERPT Questions for Assessment #1. Essential Skill & Knowledge: Reading Literature, “Forgetting the Words” (Lexile 780) http://www.readworks.org/passages/forgetting-words CC.3.R.L.1: Ask and answer questions to demonstrate understanding of a text, referring explicitly to the text as the basis for the answers.
  Basic Level 1 (Remember & Understand): How is Andy involved in the school play? a. Andy is watching his friends act in the play. b. Andy is starring in the school play. c. Andy is writing the play. d. Andy is directing the play.
  Standard Level 2 (Apply & Analyze): The main idea of this story is that a. Andy is not able to perform in the play because he is so nervous b. Andy is able to get over his nerves and feel confident with encouragement c. Andy says the wrong thing, his mother sees, and the whole play is ruined d. People sometimes do not know what to do when on stage and don’t say anything
  Expanded Level 3 (Evaluate & Create): In the passage, the author says that Andy is afraid to make a mistake in the play when he can’t remember what to say. What evidence best shows Andy is no longer frightened by being on stage? a. “Andy is worried about letting her down” b. “He has been looking forward to this for weeks.” c. “He still can’t remember his line, but it doesn’t matter.” d. “Andy loves pretending to be a pirate.”
  This is only a one-row excerpt...
  • 209. EXAMPLE Blueprint Set: EXCERPT Questions for Assessment #2. Essential Skill & Knowledge: Reading Literature, “Lessons from Fishing” (Lexile 780) CC.3.R.L.1: Ask and answer questions to demonstrate understanding of a text, referring explicitly to the text as the basis for the answers.
  Basic Level 1 (Remember & Understand): Why does Martin jump into the water? a. Martin wants to touch a fish. b. Martin wants to see how fast a fish can swim. c. The fish escaped with the stringer. d. Martin can’t swim.
  Standard Level 2 (Apply & Analyze): What is the main theme of the story? a. Learning how to fish is a good way to learn how to swim. b. Fishing makes you strong if you hold onto the pole. c. Fishing is a good family activity. d. Fishing is like life, with some days that are a success and other days that are not.
  Expanded Level 3 (Evaluate & Create): In the passage, the author says that Morgan “goes fishing all the time” and that he “has gotten even better at it than his father and his grandfather.” Based on this evidence, what can be concluded about the sport of fishing? a. Fishing can be learned in less than a week. b. Being good at fishing takes a lot of practice. c. Only teenagers are good at fishing. d. Fishing is best taught by family members.
  • 212. Discussion: Start thinking about your blueprint. Use the 3 elements of Quality Assessment Design: ● Should some grades/courses/levels have very different blueprints? Why? ● Will this blueprint help us gather useful data? ● Will the data be easy to interpret?
  • 215. Road Block #1: What do I want to assess? Considerations: ● Look for Sustained Growth: What do you have a chance to teach again and again? The threads that run through a class/course. ● Look for Endurance, Leverage & Readiness for the Next Level
  • 218. Road Block #2: OK, so how many skills IS that? There is no “magic” number. Considerations: ● Often teams choose a number between 5 and 20 ● What test length is appropriate? (each essential skill should be assessed more than once) ● Look at the PARCC framework/ISBE Livebinders
  • 222. Road Block #3: How many questions on a test? There is no “magic” number. Considerations: ● Cognitive Demand: Represent more than one level of complexity for each standard/skill. ● Complexity of Standard: Some standards have more intricate layers than others, requiring more questions. Questions per skill? 1. Use 3 questions per skill to triangulate your data 2. Use 4 questions per skill to triangulate and be able to throw out one data point 3. Use Marzano’s 6 levels within a skill, and represent each...
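One possible reading of the triangulation idea, sketched in Python (this majority-rule interpretation is ours, not the deck's):

```python
# Hypothetical sketch: count a skill as demonstrated only when the student
# answers a majority of the items aligned to that skill correctly.
def skill_demonstrated(item_results, threshold=2):
    """item_results: list of booleans, one per question aligned to the skill."""
    return sum(item_results) >= threshold

print(skill_demonstrated([True, False, True]))  # True: 2 of 3 aligned items correct
```

With 4 items per skill, the same check still works after throwing out one suspect data point.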
  • 226. Road Block #4: How should I give points? Depends on your questions & students: Be intentional. Considerations: ● Encourage guessing? ● Participation points for effort? (not affecting test score) ● Do questions at different levels of complexity have different point values? Example: 1) Basic, multiple choice: 1 point 2) Standard (mid-level) multiple choice: 1 point 3) Expanded (high level) free response...?
  • 228. Road Block #5: How do I get started? Ideas: ● Build from a Blueprint ● Borrow, Steal: Don’t reinvent the wheel. Look for question starters, stems or tools you can use to get started. ● Just Dig In: Question writing is slow at first, but will pick up with practice.
  • 229. Discussion: What roadblocks will our staff face? As leaders in your district… ● What roadblocks are ahead of us? ● What can we do to address these roadblocks? ● What are other districts doing when faced with similar roadblocks?
  • 230. The Age of Assessment REMEMBER: Is the data you get from administering that assessment more valuable than the instructional time lost to administer it? USE ASSESSMENTS THAT GIVE YOU GOOD INFORMATION ABOUT YOUR STUDENTS!
  • 231. The Age of Assessment Is This A Good Question? ● Will it give me good information about student thinking? ● Will it help me better understand student learning needs? ● Will it help me help my students reach our goals?
  • 232. Let’s Look at Some Examples... Be Intentional about Assessments
  • 233. Imagine you are advising this teacher. ● What questions? ● What suggestions? Be Intentional about Assessments
  • 234. Quality Assessments Example Assessment Sets Assessment #1 Excerpt Questions: Word Meaning from Context: 1. What does Christine like to do? a. Read b. Play tennis c. Listen to rock music d. Visit friends 2. Who likes to listen to music? a. Monika b. Petra and Monika c. Christine and Monika d. Christine and Petra 3. Who likes to play chess? a. Monika b. Petra c. Christine d. Christine and Petra Drawing a Conclusion: 4. Who is the most athletic? a. Christine b. Monika c. Petra d. Petra and Christine 5. The reading mainly focuses on the girls’... a. Ages b. Names c. Interests d. Friends
  • 235. Quality Assessments Example Assessment Sets Questions: Word Meaning from Context: 1. Sara and her friends play cards... a. on Mondays b. after school c. rarely d. on the weekends 2. What instrument does Paul play? a. No instrument b. Guitar c. Piano d. Piano and guitar Drawing a Conclusion: 3. The people in the conversation are most likely a. teachers b. students c. relatives d. workers 4. The conversation probably takes place… a. in a cafe b. in a school c. in a home d. in a store 5. The students go because... a. they need to go home b. they have to get to class c. they have to get to soccer practice d. they have to get to the restaurant Assessment #2 Excerpt
  • 236. Quality Assessments Could the 2 German Assessments be quality, mirrored tools? Mirroring Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 239. Quality Assessments Content: Essential Skills/Knowledge Could the 2 Science Assessments be quality, mirrored tools? Mirroring Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 240. Quality Assessments Example Assessment Sets Assessment #1 & #2 Rubric
  • 241. Quality Assessments Example Assessment Sets Assessment #1 Excerpt Assessment #2 Excerpt
  • 242. Quality Assessments Example Assessment Sets Could the 2 English 9 Assessments be quality, mirrored tools? Mirroring Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 243. Quality Assessments Content: Essential Skills/Knowledge Assessment #1 Excerpt
  • 244. Quality Assessments Content: Essential Skills/Knowledge Assessment # 2 Excerpt Assessment #1 Excerpt
  • 245. Quality Assessments Content: Essential Skills/Knowledge Could the 2 Choral Assessments be quality, mirrored tools? Mirroring Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 247. Quality Assessments Example Assessment Sets Could the Life Skills Assessment(s) be quality, mirrored tools? Mirroring Element Variables to Consider WHAT: Content/Skills ● Skills ● Standards ● Power Standards HOW: Form ● Selected Response (ex:multiple choice) ● Constructed Response ● Performance Based Difficulty: Level of Complexity ● Level of cognitive demand ● Question difficulty ● Difficulty of Reading Passage ● Difficulty of Prompt
  • 248. What makes a good assessment? ● Aligned to standards ● Intentional range of cognitive demand ● Intentional design (blueprint) ● Rubric for all non-selected response questions ● Comparable for measuring growth ● Designed to produce specific data types ● Valid: Measures what you intend to measure ● Reliable: Produces repeatable results ● Historical data
  • 251. What makes a good assessment? Does the assessment allow high- and low-achieving students to adequately demonstrate their knowledge? In other words, does the assessment have something all students can “Sink Their Teeth Into?”
  • 259. What makes a good assessment? “...Validity is concerned with the confidence with which we may draw inferences about student learning from an assessment. Furthermore, validity is not an either/or proposition, instead it is a matter of degree,” (Gareis & Grant, 2008, p. 35). Thus, increasing validity and reliability will be an ongoing district process.
  • 261. Example Checklist (Source: Ohio Department of Education)
  • 262. Example Checklist (Source: New Jersey Department of Education)
  • 264. ✓ 100% Online ● Zero travel cost, no hotels, no gas ● Attend from school or from home ✓ Learn from educators just like you ● Fast-paced presentations full of real experiences and lessons learned in assessment writing ● Online chatting and networking ✓ Flexible ● Enjoy it live or watch it later at your own pace using the K@C Special Access Pass Upcoming Event: June 2014 K@C Growth Assessment Online Conference
  • 265. In your district… ● How are assessments examined for their quality? ● What could we do to improve our assessment quality? ● What are other districts doing when faced with similar roadblocks? Discussion: How do we examine assessment tools?
  • 267. ✓ Design a Blueprint 1. Consider course skills to measure 2. Consider format to best measure those skills 3. Consider complexity of execution of those skills ✓ Repeat the Blueprint Pattern 1. Repeated results will inform pivot points ✓ Consider Assessment Validity ✓ Consider Assessment Reliability HOW CAN WE DO IT? Build High Quality Assessment Sets The right blueprint design gives data that can help us pivot.
  • 268. Activity: Let’s develop an assessment blueprint. Step One: Identify what essential skills & knowledge you will assess. (Circle, Triangle, Square analogy) Step Two: Select the form(s) for your assessment. Step Three: Determine the number of items at each level of cognitive demand.
  Essential Skill & Knowledge | Basic (Remember & Understand) | Standard (Apply & Analyze) | Expanded (Evaluate & Create)
  Analyze the development of a central theme | 1 MC | 2 MC | 1 MC
  www.KidsAtTheCore/Downloads/AssessmentBlueprint.docx
  • 270. What To Do with Data? Stoplight Highlight: one of the most common ways to translate numbers into a proficiency range. Red: Below Expectation. Students scoring below end-of-year expectations. (Ex: 20+ points below, though this depends on the assessment itself.) Yellow: Close to Expectation. Students close to end-of-year expectations. (Ex: less than 20 points below, though this depends on the assessment itself.) Green: At Expectation. Students at end-of-year expectations. (Ex: +/- 5 points, though this depends on the assessment itself.) Blue: Above Expectation. Students above end-of-year expectations. (Ex: over 5 points above expectation, though this depends on the assessment itself.)
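A minimal sketch of how those bands might be encoded, assuming the example cut points above; real cut points depend on the assessment itself:

```python
# Hypothetical "stoplight highlight": map a score's gap from the end-of-year
# expectation to a band. Cut points (20, 5) come from the example and would
# vary from assessment to assessment.
def stoplight(score, expectation, close_band=20, at_band=5):
    gap = score - expectation
    if gap > at_band:
        return "blue"    # above expectation
    if gap >= -at_band:
        return "green"   # at expectation (within +/- 5 points in the example)
    if gap >= -close_band:
        return "yellow"  # close to expectation (less than 20 points below)
    return "red"         # below expectation (20+ points below)

print(stoplight(score=62, expectation=75))  # yellow
```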
  • 271. What To Do with Data?
  Student | Score #1 | Score #2
  A | 6% | 33%
  B | 34% | 55%
  C | 33% | 59%
  D | 56% | 87%
  E | 58% | 96%
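For example, a short sketch that turns the table above into growth figures (the loop and labels are ours; the scores are the table's):

```python
# Compute growth in percentage points between the two scores for each student.
scores = {"A": (6, 33), "B": (34, 55), "C": (33, 59), "D": (56, 87), "E": (58, 96)}

for student, (pre, post) in scores.items():
    print(f"Student {student}: +{post - pre} points of growth")
# Student A: +27 ... Student E: +38
```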
  • 273. Interpreting Scores: Student Growth Using Historical Data: You can determine expected score based on previous “typical” growth. This gives meaning to the numbers. Are these the scores I want? Better? Worse?
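One way a team might encode that idea, assuming a made-up historical average gain of 25 points for this assessment set (the function and figure are hypothetical):

```python
# Hypothetical sketch: flag students against an expected score derived from
# historical "typical" growth on this assessment set.
TYPICAL_GAIN = 25  # assumed historical average gain; yours comes from your data

def meets_expected(pre, post, typical_gain=TYPICAL_GAIN):
    """True when the post score reaches the pre score plus typical growth."""
    return post >= pre + typical_gain

print(meets_expected(pre=34, post=55))  # False: grew 21 points, expected ~25
```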
  • 274. Interpreting Scores: Student Growth After year 1: You have Pre and Post scores for multiple students...
  • 278. Interpreting Scores: Student Growth Pre and Post Test: Multiple Students Graphed
  • 280. Interpreting Scores: Student Growth Fast Forward to the Fall of your 2nd or 3rd year...
  • 281. Interpreting Scores: Student Growth
  Student | Score #1 | Score #2
  A | 6% | 33%
  B | 34% | 55%
  C | 33% | 59%
  D | 56% | 87%
  E | 58% | 96%
  • 286. Interpreting Scores: Student Growth Do all pivot points have to come from FORMAL assessments? NO!
  • 287. Pivot Points: Informal Data Collection