1. Improve Your District’s
Student Growth Measures Plan
Ohio’s Spring Education Symposium
March 28, 2014
Presented By:
Mindy Schulz, Allen County ESC
Director of Curriculum
eTPES, OTES, OPES State Trainer
2. Intended Outcomes
Participants will:
• Review the OTES & OPES student growth
measures (SGM) framework.
• Analyze their district’s current SGM
assessments for quality and relevant use.
• Evaluate and refine their district’s
current SGM plan for potential revisions
and considerations.
3. Session Format
• I do
• Presenter will provide information, demonstrate, and model refining an SGM plan
• We do
• Participants will practice applying SGM plan
revisions
• You do
• Participants will refine their district SGM plan
using own district, school information
4. Definition of Student Growth
For the purpose of use in Ohio’s
evaluation systems, student
growth is defined as the change in
student achievement for an
individual student between two or
more points in time.
7. What is a Student Growth
Measures Plan?
“Teacher evaluation as required by
O.R.C. 3319.111 relies on two key evaluation
components: a rating of teacher
performance and a rating of student
academic growth, each weighted at 50
percent of each evaluation. The following
guidance speaks to the student growth
measures component, specifically
addressing determinations to be made for
using student growth measures within
teacher evaluation.” – K. Harper
8. Let’s Review: ODE Steps for
Designing a Local SGM Plan
1) Conduct an inventory of needs and
resources.
2) Determine and create (if necessary)
student growth measures to be
used.
3) Communicate expectations and
refine the entire process.
10. Who is required to be evaluated
by the new evaluation systems?
Any person who is employed under a teaching license or under a professional or permanent teacher’s certificate and who spends at least 50 percent of his/her time employed providing student instruction. This does not apply to a teacher employed as a substitute. (O.R.C. 3319.111)
This usually excludes:
• Speech pathologists, occupational therapists
• Teachers on assignment
• Nurses, psychologists, guidance counselors
11. Who is required to be evaluated
by the new evaluation systems?
O.R.C. 3319.02 D(1)
The procedures for the evaluation of
principals and assistant principals shall be
based on principles comparable to the
teacher evaluation policy adopted by the
board under section 3319.111 of the Revised
Code, but shall be tailored to the duties and
responsibilities of principals and assistant
principals and the environment in which
they work.
12. Categorize Educators
• Who has teacher-level EVAAS Value-Added
data, grades 4-8 reading and/or math? Category
A1 & A2
• Which principals are assigned to buildings with
Value-Added data? Category A
• Who does not have teacher-level Value-Added
data (or building-level for principals), but has data
from assessments on the ODE approved vendor
assessment list? Category B
• Who has no Value-Added or approved vendor
data? Category C
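The slide’s top-down, one-category rule (start at A1 and work down; the category is set by the type of data available) can be sketched as a small decision function. This is an illustrative sketch only, not an ODE tool, and the boolean parameter names are invented:

```python
def teacher_category(has_teacher_level_va: bool,
                     va_exclusively: bool,
                     has_approved_vendor_data: bool) -> str:
    """Assign a teacher to exactly one SGM category, checking
    the categories top-down as the slide describes."""
    if has_teacher_level_va:
        # A1 = instructs value-added courses exclusively;
        # A2 = has value-added data plus other measures
        return "A1" if va_exclusively else "A2"
    if has_approved_vendor_data:
        return "B"  # data from the ODE approved vendor assessment list
    return "C"      # no value-added or approved vendor data

# Hypothetical examples: value-added data outranks vendor data.
assert teacher_category(True, False, True) == "A2"
assert teacher_category(False, False, True) == "B"
```

The same top-down shape applies to principals: Category A if the building has Value-Added data, otherwise B or C.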
13. Value Added Data Timeline
Timeline points: Spring, Fall, 2nd Semester, May.
• Spring: Grades 4-8 Reading & Math OAAs administered.
• Fall: Teacher-level Value-Added reports released (data from the spring of the prior school year: grades 4-8 Reading & Math OAAs).
• 2nd Semester: Teacher- and building-level Value-Added reports (from the prior school year) uploaded into eTPES by ODE.
Note: Prior school year teacher-level value-added reports (which are actually received in the fall of the current school year) for Category A1 & A2 teachers will be used in calculating the SGM % based on the LEA’s default percentages. Building-level value-added results for Category A principals will be used in calculating the SGM % based on the LEA’s default percentages.
14. Who is a Value-Added
Teacher for 2013-2014?
If a teacher received a value-added report in fall 2013 from course(s) instructed in 2012-2013, that teacher is considered a value-added teacher for the spring 2014 evaluation. Of those, determine:
• Who instructed all value added courses for
2012-2013 (exclusively)?
• Who instructed some value added course(s)
for 2012-2013, but not exclusively? What
percent of time was spent instructing in value
added course(s)?
15. Who instructed all value added courses for 2012-2013?
Category Determination
Previous School Year (2012-2013) | Current School Year (2013-2014) | Assigned Category in eTPES (2013-2014) | SGM Percentage
All VA | All VA | A1 | 26-50% (2013-2014 only)
All VA | Some VA | A2 | 26-50% (because the previous year was full VA)
All VA | No VA | A2 | 26-50% (because the previous year was full VA)
Note: Category A1 teachers must use their teacher-level Value-Added report as the full 50% student growth measure beginning July 2014.
16. Who instructed some value added course(s) for 2012-2013?
Category Determination
Previous School Year (2012-2013) | Current School Year (2013-2014) | Assigned Category in eTPES (2013-2014) | SGM Percentage
Some VA | All VA | A2 | Proportionate to schedule; 10-50%
Some VA | Some VA | A2 | Proportionate to schedule; 10-50%
Some VA | No VA | A2 | Proportionate to schedule; 10-50%
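For the “proportionate to schedule; 10-50%” rows above, one plausible arithmetic (an assumption for illustration; the actual proportioning and rounding policy is a local decision) is to scale the 50-point student growth component by the share of the schedule spent in value-added courses, then clamp to the 10-50% band:

```python
def a2_value_added_percent(va_fraction: float) -> int:
    """Hypothetical proportional Value-Added weight for a Category A2
    teacher. va_fraction is the share of the teaching schedule spent
    in value-added courses (0.0-1.0); the result is the Value-Added
    slice of the 50% growth component, clamped to 10-50%."""
    raw = round(50 * va_fraction)
    return max(10, min(50, raw))

# A teacher with 3 of 5 periods in value-added courses:
assert a2_value_added_percent(3 / 5) == 30
# A very small value-added share still gets the 10% floor:
assert a2_value_added_percent(0.1) == 10
```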
17. Principal in Building with Value-Added Data
# of Years in Same Building | SGM Decisions
4 or more | Category A
3 | Category A
• Enter the verified principal composite provided by ODE when available*
• If the principal composite is not verified or is incorrect, use the 2-year building average available in the EVAAS report
• May have local measures
Currently, building-level Value-Added data is based on a 3-year composite.
*If a principal’s 3-year composite is incorrect, SAS and ODE are working to create principal composite reports, which will be available in the spring.
A special thanks to Dr. Kathy Harper, Greene Co. ESC, for contributing to the contents of this slide.
18. Principal in Building with Value-Added Data, Cont.
# of Years in Same Building | SGM Decisions
2 | Category A
• Enter the verified principal composite provided by ODE when available*
• If the principal composite is not verified or is incorrect, use the 1-year data available in the EVAAS report
• May have local measures
1 – YES, previous principal experience
• Category A if the previous assignment was in a building with Value-Added data. **Use the decision process for principals with 2, 3, or 4+ years in a Value-Added building.
• Category B or C if the previous assignment was in a building without Value-Added data. Use data from the current assignment to determine the category.
Currently, building-level Value-Added data is based on a 3-year composite.
*If a principal’s 3-year composite is incorrect, SAS and ODE are working to create principal composite reports, which will be available in the spring.
A special thanks to Dr. Kathy Harper, Greene Co. ESC, for contributing to the contents of this slide.
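The years-in-building hierarchy on this and the previous slide reduces to a small lookup. The sketch below simplifies the slides’ decision rules for illustration; the returned strings are paraphrases, not official wording:

```python
def principal_va_data_source(years_in_building: int) -> str:
    """Which EVAAS data a principal in a Value-Added building draws on,
    following the slides' hierarchy (simplified sketch)."""
    if years_in_building >= 3:
        # 3 or 4+ years: verified 3-year composite, else 2-year average
        return "verified composite, else 2-year building average"
    if years_in_building == 2:
        return "verified composite, else 1-year building data"
    # First year: depends on prior assignment (Value-Added building ->
    # Category A via that building's rule; otherwise Category B or C).
    return "depends on previous assignment"

# 3 years and 4+ years follow the same rule:
assert principal_va_data_source(4) == principal_va_data_source(3)
assert principal_va_data_source(1) == "depends on previous assignment"
```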
19. Principal in Building with Value-Added Data, Cont.
# of Years in Same Building | SGM Decisions
2 | Category A
• Override the default tied to the IRN; enter the 1-year building score from the EVAAS report
• May have local measures
1 – YES, previous principal experience
• Category A if the previous assignment was in a building with Value-Added data. **Use the decision process for principals with 2, 3, or 4+ years in a Value-Added building.
• Category B or C if the previous assignment was in a building without Value-Added data. Use data from the current assignment to determine the category.
• May have local measures from the current assignment
Currently, building-level Value-Added data is based on a 3-year composite.
*SAS and ODE are working to create principal composite reports, which will be available in the spring.
20. Special Considerations: Principals
Are any principals in a building with no Value-Added data, but who were previously assigned to a building with Value-Added data?
• Category A
• Override the pre-loaded default data and enter data from the previous school; **use the decision process for principals with 2, 3, or 4+ years in a Value-Added building.
• May have local measures from the current assignment
21. Who has data from assessments on the
ODE approved vendor assessment list?
1) What ODE-approved vendor assessments did we use for 2013-2014? Will our LEA use any of the newly added ODE-approved vendor assessments for 2014-2015?
2) Will our LEA continue using the 2013-2014 vendor assessments?
3) LEA Considerations:
• Does the manner in which our LEA is using the ODE vendor assessment meet the definition of student growth?
• Have we secured the vendor assessment growth reports?
• Which assessments are not on the ODE-approved vendor assessment list, but could be used in SLOs?
• Are any Category A2 teachers using an ODE-approved vendor assessment? If an A2 teacher uses an approved ODE vendor assessment, that assessment becomes a local measure.
22. Who has No Value-Added or
ODE-Approved Vendor Data?
Inventory educators with
no value-added or
ODE-approved vendor
assessment data.
(Category C)
23. Special Considerations:
Teachers
Who is new to a Value-Added assignment for the current year?
• Inventory teachers that did not receive a value-added report from the previous year, but have been assigned to a value-added course for the current year.
This may include:
• New teachers, e.g. Year One Resident Educators, new hires
• Any teacher that changed assignments from the prior year to the current year, e.g. a teacher who instructed 3rd grade the previous year and currently instructs 6th grade math
24. Special Considerations:
Teachers
For teachers new to a value-added assignment and not receiving a teacher-level value-added report in the fall:
• Determine the current year SGM category, dependent upon available data.
• Are there ODE-approved vendor assessments available? (Category B)
• If there are no ODE-approved vendor assessments available, LEA measures will be used. (Category C)
25. Categorizing Teachers New to Value-Added
Category Determination
Previous School Year | Current School Year | Assigned Category in eTPES
No VA | All VA | B or C
No VA | Some VA | B or C
No VA | No VA | B or C
Category B if an ODE-approved vendor assessment is used; all others are Category C.
26. Special Considerations: Principals
Are any 1st-year principals assigned to a building with Value-Added data?
# of Years in Same Building | SGM Decisions
1 – NO previous principal experience
• Category B if the building has ODE-approved vendor data (may also have local measures)
• Category C if no ODE-approved vendor data
27. Step Two:
Designing a Local SGM Plan
Determine and create
(if necessary) student growth
measures to be used.
28. Determine LEA Default Percentages
What percentages will your LEA attribute to:
• Value-Added data (Category A1 and A2)?
• Assessments from the ODE approved vendors
(Category B)?
• Local Measures within each category?
(Local Measures may also apply to Category A1
(2013-2014 only), Category A2 teachers, Category A
principals, and Category B educators)
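However an LEA splits the weights, the components must total the 50% student growth half of the evaluation. A trivial sanity check, with invented component names, might look like:

```python
def sgm_weights_valid(weights: dict) -> bool:
    """True if the SGM component percentages total exactly 50
    (the student growth half of the evaluation)."""
    return sum(weights.values()) == 50

# A hypothetical Category A2 default split:
a2_defaults = {"teacher_value_added": 30, "vendor_assessment": 10,
               "slos": 5, "shared_attribution": 5}
assert sgm_weights_valid(a2_defaults)
assert not sgm_weights_valid({"slos": 30, "shared_attribution": 10})
```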
29. SB229 Legislative Update
(a) One factor shall be student academic growth, which shall account for fifty thirty-five per cent of each evaluation (SB229 strikes “fifty” and substitutes “thirty-five”).
A school district may attribute an additional percentage to the academic growth factor, not to exceed fifteen per cent of each evaluation. However, a school district may instead attribute that additional percentage to any of the factors set forth in division (A)(1)(b) of this section.
(b) The remainder of each evaluation may include a combination
of the following factors:
(i) Formal observations and reviews as required by division (A)(3)
of this section;
(ii) Student surveys;
(iii) Any other factors the board determines necessary and
appropriate.
SB229 Proposed Changes to Current O.R.C. 3319.111
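The arithmetic of the proposal is straightforward: the growth factor drops to a 35% base, and the district may add up to 15 more points to growth or assign them to the other factors instead. A sketch of that constraint, based only on the bill text quoted above:

```python
def sb229_growth_share(extra_to_growth: int) -> int:
    """Proposed SB229 weighting: 35% base for student growth plus a
    district-chosen add-on of 0-15 points (points not added here go
    to the other evaluation factors)."""
    if not 0 <= extra_to_growth <= 15:
        raise ValueError("add-on must be between 0 and 15 points")
    return 35 + extra_to_growth

# A district that keeps the current 50/50 split:
assert sb229_growth_share(15) == 50
# A district that moves the full add-on to other factors:
assert sb229_growth_share(0) == 35
```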
30. SB229 Legislative Update Cont.
2013-2014
Any proposed changes to O.R.C. 3319.111 will not alter the 2013-2014 OTES or OPES framework and requirements. The proposed effective date for SB229 is July 1, 2014.
Disclaimer
Information presented in this session is what is available
today. Anything can change this afternoon or
tomorrow. New and/or revised legislative mandates
can be proposed and passed, even after a
requirement has been implemented.
To check the status of SB229:
http://www.legislature.state.oh.us/bills.cfm?ID=130_SB_229
31. How much will our LEA attribute to Teacher-Level Value-Added Data? (2013-2014)
O.R.C. 3319.111, O.R.C. 3319.112
A1. Teacher Instructs Value-Added Subjects Exclusively:
• Teacher Value-Added: 26-50%
• LEA Measures: 0-24%
A2. Teacher Instructs Value-Added Subjects, but Not Exclusively:
• Teacher-Level Value-Added: proportional to teaching schedule, 10-50%
• LEA Measures: proportional to teaching schedule, 0-40%
32. How much will our LEA attribute to Teacher-Level Value-Added Data? (2014-2015)
O.R.C. 3319.111, O.R.C. 3319.112
A1. Teacher Instructs Value-Added Subjects Exclusively:
• Teacher Value-Added: 50%
A2. Teacher Instructs Value-Added Subjects, but Not Exclusively:
• Teacher-Level Value-Added: proportional to teaching schedule, 10-50%
• LEA Measures: proportional to teaching schedule, 0-40%
33. How much will our LEA attribute to
Building-Level Value-Added Data
for Principals?
34. How much will our LEA attribute to the assessments from the ODE Approved Vendor List?
B: Approved Vendor Assessment data available
• Vendor Assessment: 10-50%
• LEA Measures: 0-40%
C: No teacher-level Value-Added or Approved Vendor Assessment data available
• LEA Measures
35. Category B:
Special Considerations
• How many years has the
assessment(s) been administered?
• Is there trend data to analyze?
• Are there variations in the number of
vendor assessments available by
course and/or grade level?
36. What LEA measures will be used for teachers?
1. Student Learning Objective (SLO) process for measures that are specific to relevant subject matter. Measures must be district-approved and may include:
• Other vendor assessments not on the ODE Approved List
• Career Technical Education assessments
• Locally developed assessments
• Performance-based assessments
• Portfolios
2. Category A2 teachers (with Value-Added) may also use vendor assessments as an LEA-determined measure, proportionate to the teacher’s schedule, for non-Value-Added courses/subjects.
3. Shared attribution
37. What LEA measures will be used for principals?
1. An average of all teachers’ student growth ratings in the building
2. Building-based Student Learning Objective (SLO) process using measures that are specific to relevant building goals and priorities and aligned with the Ohio Improvement Process. Measures for SLOs must be district-approved and may include both direct and indirect measures, such as:
• Student achievement trends
• Locally developed assessments
• Progress on school improvement plans
• Student course-taking patterns, e.g. more students taking advanced courses, PSEO, etc.
3. Shared attribution
38. What is Shared Attribution for
Teachers?
• Shared attribution is a collective measure.
• The LEA determines which measure of
shared attribution it would like to use.
• Shared attribution could be:
• A building or district value-added score
• Recommended if available
• Building team composite value-added score (e.g. the 5th grade VAM score or the middle school ELA team’s combined VAM score)
• Building-level or district-level SLOs
39. What is Shared Attribution for
Principals?
• Shared attribution is a collective measure.
• The LEA determines which measure of
shared attribution it would like to use.
• Shared attribution could be:
• District Value-Added is recommended if
available
• Groups of schools (such as grade level
buildings or regional areas within a district) may
utilize an average Value-Added score
• District-based SLOs
40. What Default Percentages will
your LEA Set for 2013-14?
*For Category A, teachers with Value-Added may also include
ODE-Approved Vendor Assessment data in the LEA Measures.
41. What Default Percentages will your LEA Set for 2014-15?
*This information may appear differently in eTPES 2014-2015.
Educator Category | Value-Added | Vendor Assessment | LEA Measures (SLOs/Other*, Shared Attribution) | Total
A1 (Value-Added, exclusive) | 50% | n/a | n/a | 50%
A2 (Value-Added, non-exclusive) | 10% or greater | n/a | remaining % may be split among SLOs and Shared Attribution | 50%
B (Approved Vendor Assessment) | n/a | 10% or greater | remaining % may be split among SLOs and Shared Attribution | 50%
C (LEA Measures) | n/a | n/a | remaining % may be split among SLOs and Shared Attribution | 50%
*For Category A2, teachers with Value-Added may also include ODE-Approved Vendor Assessment data in the LEA Measures.
42. Special Considerations
If the district decides to allow variation from the default percentages, it must make manual adjustments within eTPES.
• Districts should try to be as consistent as
possible when setting percentages.
• Percentages should not be determined by
individual teachers or determined based
on individual past results.
43. What Default Percentages
Will Your LEA Set for Principals?
*For Category A principals, this could also include the ODE-Approved Vendor Assessment data or the average of all teachers’ growth ratings.
44. Determine how the LEA will implement
the local measures process.
• Will shared attribution measures be used?
• Who is required to create SLOs?
• Within the guidelines of 2-4 SLOs, how many
SLOs are required for each teacher?
• Who will be approving the SLOs?
• How will SLOs be tracked through revisions and to final approval?
• What guidance, training, and support will be
provided to teachers and evaluators?
45. Will Shared Attribution
Measures be Used?
• What shared attribution measures are
we using?
• Have we secured the proper reports?
• Will the same shared attribution
measures be used for all educators
within each SGM category?
Note: Only one shared attribution
measure may be used per educator.
46. Will SLOs be Used?
• Who is required to create SLOs?
Which categories of teachers will have LEA measures?
Did we select SLOs as an LEA measure?
Which SGM categories will this include?
• Within the guidelines of 2-4 SLOs, how many SLOs
are required for each teacher?
• What assessments will be used?
Refer to the LEA’s “Available Assessments Inventory”
If assessments do not exist for certain grade level(s)
and/or courses, have the “SLO Guidelines for Selecting
Assessments” been followed?
Will the LEA create a district-approved list of SLO
assessments?
47. SLO Approval
• Who is approving SLOs in our LEA?
LEAs are responsible for SLO approval.
ODE recommends this process be completed by a committee (or committees).
• Has SLO calibration been completed?
SLO calibration is the process of
ensuring a thorough and fair review of
all SLOs by systematically requiring high
quality and rigor across SLOs.
48. SLO Procedures
• How will SLOs be tracked?
Submission
Revisions
Final Approval
• What guidance, training, and support will be provided to teachers and evaluators?
49. SLO Tracking Form
Columns: Teacher Name | SLO Event | Date Completed
SLO Events:
• Original SLO Submission
• Committee Feedback Provided to Teacher
• SLO Approval
• Midpoint Check-In (recommended, not required)
• SLO End-of-Interval Scoring and Conference
• Final SLO Score Entered in eTPES
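A district keeping this form electronically might model each row as a per-teacher record of dated events. The sketch below is purely illustrative; the class and field names are invented:

```python
from dataclasses import dataclass, field

# The tracked events, in order, from the form above.
SLO_EVENTS = [
    "Original SLO Submission",
    "Committee Feedback Provided to Teacher",
    "SLO Approval",
    "Midpoint Check-In (recommended, not required)",
    "SLO End-of-Interval Scoring and Conference",
    "Final SLO Score Entered in eTPES",
]

@dataclass
class SLOTracker:
    teacher_name: str
    dates: dict = field(default_factory=dict)  # event -> date completed

    def complete(self, event: str, date: str) -> None:
        if event not in SLO_EVENTS:
            raise ValueError(f"unknown SLO event: {event}")
        self.dates[event] = date

    def outstanding(self) -> list:
        """Events not yet dated, in form order."""
        return [e for e in SLO_EVENTS if e not in self.dates]

tracker = SLOTracker("Example Teacher")
tracker.complete("Original SLO Submission", "2013-10-01")
assert tracker.outstanding()[0] == "Committee Feedback Provided to Teacher"
```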
50. SLO Professional Development Form (Example)
A grid of grade levels (K-7) by subject: ELA, Math, Science, Soc. St., P.E., Art, Music, Other (__________), Other (__________).
51. Step Three:
Designing a Local SGM Plan
Communicate expectations and
refine the entire process.
Design communication plans, training, and
professional development opportunities
around requirements and implementation
for teachers and their evaluators.
52. SGM Professional Development Form (Example)
Columns: Date | Agenda Items | Target Audience (identify which teachers will attend the training) | Follow-Up Training Date (if applicable) | Follow-Up Training Agenda Items | Target Audience (identify which teachers will attend the training)
53. Additional Session Resources
To access session resources, including
the step-by-step workbook and templates
on how to design and improve your own
LEA SGM Plan, go to:
http://bit.ly/SGMPlan
54. Works Cited
• Ohio Department of Education. (2013, May 12). Steps for Designing a Local Student Growth Measures Plan. Retrieved from Ohio Department of Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-System/Student-Groth-Measures/Additional-Information/Steps-for-Designing-a-Local-Student-Growth-Measure
• LaWriter Ohio Laws and Rules. (2013, March 22). 3319.111 Applicability of section; evaluating teachers on limited
contracts. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.111
• LaWriter Ohio Laws and Rules. (2013, September 29). 3319.112 [Effective 9/29/2013] Standards-based state framework
for the evaluation of teachers. Retrieved from LaWriter Ohio Laws and Rules: http://codes.ohio.gov/orc/3319.112v2
• Ohio Department of Education. (2013, July 26). Approved Vendor Assessments. Retrieved from Ohio Department of
Education: http://education.ohio.gov/Topics/Teaching/Educator-Evaluation-System/Ohio-s-Teacher-Evaluation-
System/Student-Growth-Measures/Approved-List-of-Assessments#approved
• Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education:
http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-
Learning-Objective-Examples/041113-Guidance_on_Selecting_Assessments_for_SLOs.pdf.aspx
• Ohio Department of Education. (2013, September 13). Local Measures. Retrieved from Ohio Department of Education:
http://education.ohio.gov/getattachment/Topics/Academic-Content-Standards/New-Learning-Standards/Student-
Learning-Objective-Examples/112912-SLO-Requirements-and-Recommendations.pdf.aspx
• Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio
Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-
s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Business-rules-for-SGM-FINAL-040913-3.pdf.aspx
• Ohio Department of Education. (2013, September 25). Student Growth Measures for Teachers. Retrieved from Ohio
Department of Education: http://education.ohio.gov/getattachment/Topics/Teaching/Educator-Evaluation-System/Ohio-
s-Teacher-Evaluation-System/Student-Growth-Measures/091913_Combining-the-SGM-scores-one-pager.pdf.aspx
55. Questions?
Improve Your District’s SGM Plan
Steps for Designing a Local SGM Plan
Mindy Schulz
Director of Curriculum, Allen County ESC
mindy.schulz@allencountyesc.org
Website info: www.allencountyesc.org
ODE Contact Information:
SGM@education.ohio.gov
Presenter’s Notes
Introduce self. Ask how many are implementing OTES this year. Ask how many are administrators, teachers, other. Ask how many have an SGM plan already. Share how we will go through the steps for the LOCAL DECISIONS that need to be made.
Review session outcomes. Explain time and assistance will be provided to work on their own SGM plan.
“Our session format is the same format we use in OTES, OPES, and eTPES training. Our session and time together will be using the format I do, We do, You do.”
“Please remember that the 50% of the evaluation tied to student growth measures must focus on student academic achievement and change between two or more points in time.”
Ask by a show of hands how many people this graphic looks familiar to. Review: a teacher can only be in ONE category, and you always start at the top (A1) and work your way down to identify which category the teacher belongs in. Categories are determined by the TYPE of data available for each teacher. Give a BROAD overview of each category: A1 = exclusively instructs value-added courses; A2 = teachers with value-added data and other student growth measures; B = teachers in non-tested grades & subjects with vendor assessment growth measures; C = teachers in non-tested grades and subjects w/o comparable vendor assessments.
Same rules for principals: a principal can only be in one category; start at the top and work your way down. Give a BROAD overview: A = principals w/ value-added data & other student growth measures; B = principals w/ only non-tested grades, but who have vendor assessment growth measures; C = principals in bldgs. w/ non-tested grades & subjects w/o comparable vendor assessments.
Acknowledge & thank Kathy for sharing definition, editing & providing feedback on PPT.
Reference this info is on ODE’s website. This session will elaborate on each step.
This is the 1st broad step in designing a local SGM plan.
Principals and assistant principals are also required to be evaluated using the state framework, which is comparable to the teacher evaluation framework. ODE business rules do provide guidance regarding those administrators who do not have the title of principal or assistant principal, but function as principals or assistant principals. Review the business rules:
Are directors/supervisors of Education Service Centers / Joint Vocational Schools and assistant principals evaluated under the Ohio Principal Evaluation System (OPES)? Administrative titles vary greatly within the state. Thus, to answer the above question, focus not on the administrative title, but rather on the role and alignment (if any) to the Principal Performance Rating Rubric, which is based on the Ohio Standards for Principals. When making determinations regarding an administrator’s requirement to be evaluated under the principal evaluation system, answer these important questions:
• Is the administrator serving as an instructional leader?
• Do the duties of the administrator fall into at least two of the five principal standards?
• Does the administrator evaluate multiple staff members?
Only those administrators meeting all three criteria above would be evaluated under the principal evaluation system, including student growth measures. Administrators who would not be required to be evaluated under the principal evaluation system include those who do not fit all of the considerations above. These administrators may have limited contact with teachers and/or students; have narrowly defined roles and administrative responsibilities that do not directly relate to the principal standards; do not provide instructional leadership; nor do they evaluate multiple teachers.
The next step, after inventorying teachers required to be evaluated under the new system, is to categorize them. Review the criteria for each.
Explain a teacher has to have the value-added report “in hand” in order to be a Category A teacher. Value-added is determined by the schedule the teacher instructed the prior year. Review the timeline: OAAs administered in spring; teacher-level VA reports released in fall; uploaded into eTPES 2nd semester; the report from the previous year follows the teacher to the current year’s evaluation.
Review slide contents.
This chart is useful in determining the category of teacher. It is important to note again that any teacher who received a value-added report this fall is either a Category A1 or A2. What makes the difference? As you can see from the chart, only those teachers who taught value-added courses exclusively last school year and who are teaching value-added courses exclusively this year are considered A1 teachers. ALL + ALL = A1, and these teachers will have a minimum of 26% of the evaluation derived from the individual value-added rating. All other teachers with individual value-added reports are Category A2. If the value-added report the teacher received this fall was based on a schedule of exclusive value-added teaching last year, then that teacher needs to have a minimum of 26% of the evaluation derived from the individual value-added rating this year.
All other A2 teachers need to have the student growth measure be proportionate to their schedules. This is determined at the local level. The minimum percentage of value-added is 10% for these teachers.
Principals in the VA building for the 4th yr. will use the Ohio Report Card 3-year composite. The score is loaded automatically for each principal. Currently, building-level Value-Added data is based on a 3-yr. composite and is automatically loaded into eTPES. Since the 3-year composite may not be accurate for each principal due to not serving in a Value-Added building for the past three years, there is a hierarchy of data that we will discuss.
Here, we continue with the possibilities of the # of yrs. principals could serve in buildings in which the 3-yr. avg. might need to be re-calculated. On this chart, notice it could be the principal’s 1st yr. in the building, but they could have previous principal experience in another district, in a building with VA data. Let’s review how the local district might approach this situation. REVIEW SLIDE CONTENTS.
Just like teachers that instructed value-added courses in the previous year but may not be teaching any value-added courses in the current year, principals may also have been in buildings with value-added data in the previous year and be assigned to buildings with no value-added data for the current year. The info on the slide explains the decision process that will need to occur for principals in this scenario.
O.R.C. 3319.111 requires ODE to develop a list of student assessments that may measure mastery of the course content for which the VA measure doesn’t apply. The list will be maintained & updated by ODE. Ask how many have seen the ODE-approved vendor assessment list. The list is on the ODE website. Ask how many are familiar with the checklist for selecting assessments. This is also on the ODE website. Emphasize it is the LEA’s responsibility to contact the vendor to ensure the way in which they are using the assessment meets the criteria for student growth. Also, it is their responsibility to secure the growth reports. ODE has posted a 2013-2014 vendor contact information list on their website.
All other teachers and principals will be Category C: those teachers in non-tested grades and subjects without comparable vendor assessments or value-added data, and the principals of those buildings.
Once you have completed an inventory of the 3 groups (those with value-added data; those without value-added data but with ODE-approved vendor assessments; and those with neither value-added nor vendor assessment data), you will need to determine if any teachers are new to a value-added assignment for the current year. Review the examples on the screen. Remind participants that even though the teacher is in a value-added course(s), they are not considered a value-added teacher until they have the report in their hand. For situations where a value-added report does not exist, LEAs will need to determine if the teacher is Cat. B or Cat. C for the current school year.
Review slide contents.
This chart is useful in determining the category of teacher. The chart explains Category B and Category C teachers.
Just like teachers new to value added, principals can also be new with no prior experience as a principal. This chart is useful in determining the category of principal. The chart explains Category B and Category C principals.
This is the 2nd broad step in designing a local SGM plan.
Once step one, “conducting an inventory of needs and resources,” is completed, it is time to determine the LEA’s default percentages. The following are questions to consider (REVIEW SLIDE CONTENTS). Remind participants to stay within the SGM framework, which will be discussed on the following slides.
As many of you might have heard, there is PROPOSED legislation that could POTENTIALLY have a future impact on the educator evaluation model. While this is NOT the focus of today’s session, it is being shared in an effort to keep you informed in the event this is passed and you deem revisions to your SGM plan are needed.
REVIEW SLIDE CONTENTS. Emphasize: ANYTHING CAN CHANGE AT ANY TIME. Stay tuned!
“Starting in 2014-15, the value-added report must count as 50 percent of the evaluation of an A1 teacher. Again, so if I receive a value-added report for all of the courses I teach, I am considered an A1 teacher and, starting in 2014-15, 50 percent of my evaluation must be based upon those value-added results.” “The allocation of measures and weights will not change for Category A2 teachers between 2013-14 and 2014-15. In 2014-15 and beyond, the weighting of student growth measures must be proportional to the teaching schedule.”
“For Category A Principals, there would be no A1 principals, since there would be teachers in the building in non-tested grades and subjects. This graphic depicts the decisions to be made regarding student growth measures for principal evaluation.”
“Category B teachers and principals are those with data from assessments on the ODE-approved vendor assessment list. You may be using wonderful vendor assessments, but only those approved are considered Category B. If you have Category B assessments given in the manner that the vendor states will provide a student growth measure, then you must use that data as part of the evaluation. (A point of note for the facilitator: If the assessment is not on the approved list, it can be used within an SLO as long as it is valid and reliable for the SLO.)” “The local board of education will need to determine the percentage of the vendor assessment to be used within the evaluation system. The local board of education will make a decision on this for all Category B teachers. This default percentage for the district will be consistent for all Category B teachers. There may be circumstances where this percentage varies; if it does, it should be for valid reasons.” “Here’s a quick question to make sure we’re on the same page: If a teacher has both value-added and approved vendor assessments, is he/she a Category B teacher? (No, the teacher has a value-added report, so the teacher is Category A. The vendor assessment data can be used as part of the local measures, but does not have to be used in this case.)” “Teachers who fall into Category B typically do not have value-added scores, but they do have an approved vendor assessment associated with their classes that can be used to measure student growth.” Note: URM data from SOAR districts will fall into Category B.
Share how Bluffton has used STAR Early Literacy for 13 years (beginning in 2000 with STAR Reading and upgrading as STAR added enhancements). Compare them with Spencerville, which just purchased STAR Early Literacy last year. When determining how much weight to place on a vendor assessment, it is important to note how long the assessment has been in place and the amount of trend data available; more trend data establishes a greater level of confidence for determining the percentage to apply to vendor assessments. Bluffton placed a higher percentage on its vendor assessments because it has 13 years of trend data to analyze.
What exactly are local measures? Local measures for teachers differ slightly from local measures for principals. REVIEW SLIDE CONTENTS.
REVIEW SLIDE CONTENTS.
Shared attribution is a local measure that is attributed to a GROUP OF TEACHERS. “We have named the three types of LEA measures and discussed possible weights of these measures. However, let’s take a minute now to further define shared attribution, which is a local measure. Shared attribution is a collective measure of student growth. If the LEA elects to use shared attribution, the LEA must choose which measure of shared attribution it would like to use. Shared attribution can be one of three things:
• A building or district value-added score, which is the recommended shared attribution measure
• A building team composite value-added score (i.e., the 5th grade VAM score or the middle school ELA team’s combined VAM score)
• Building-level or district-level SLOs
ODE recommends that, if a district opts to include shared attribution, it use a building or district value-added score. One reason to consider a building or district value-added score is that it expands the collective pool: the smaller the pool of teachers who contribute to the value-added score, the larger the effect one teacher can have on the composite score. For example, think about a new teacher who might not be as effective as a veteran during that first year of teaching but is still doing the best he or she can. In addition, using building or district value-added scores rather than team value-added scores can create a team atmosphere as opposed to a competitive atmosphere. Like individual student growth measures, measures of shared attribution produce a growth score of 1 to 5 that is later included in the calculation of the teacher’s summative score.”
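To make the 1-to-5 scoring concrete, here is a minimal sketch in Python of combining component growth scores into a single rating by weighted average. The weights and component names are invented for illustration; the authoritative calculation is defined in ODE’s “Combining the Student Growth Measures” guidance.

```python
# Illustration only: the weights below are hypothetical, and the actual
# combination method is defined by ODE guidance, not by this sketch.

def combine_growth_scores(scores_and_weights):
    """Weighted average of component growth scores (each on a 1-5 scale).

    scores_and_weights: list of (score, weight) pairs whose weights sum to 1.
    Returns the combined score rounded to the nearest whole rating.
    """
    total = sum(score * weight for score, weight in scores_and_weights)
    return round(total)

# e.g. a value-added score of 4 weighted at 70%, plus a shared-attribution
# SLO score of 3 weighted at 30%:
components = [(4, 0.7), (3, 0.3)]
print(combine_growth_scores(components))  # 4  (from 4*0.7 + 3*0.3 = 3.7)
```

The point of the sketch is simply that each measure, individual or shared, feeds a 1-5 score into one weighted composite.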
Like shared attribution for teachers, shared attribution for principals is also a collective measure of student growth. If the LEA elects to use shared attribution, the LEA must choose which measure of shared attribution it would like to use. However, shared attribution for principals can be one of three things:
• A district value-added rating, which is the recommended shared attribution measure
• Groups of schools (such as grade-level buildings or regional areas within a district) may utilize a composite value-added rating
• District-based SLOs
Here is a screen shot of the teacher default student growth measures portion of eTPES. This is provided so you can make decisions.
This is an example for 2014-2015. Keep in mind that for 2014-15, Category A1 requires the full 50% SGM portion to be based on teacher-level value-added data.
Once the LEA sets its default percentages and categories are assigned, it might be necessary to allow for variations. If there are reasons for varying the percentages within a category, the superintendent may do that for the principal, and the principal may do that for the teacher. For example, for Category A2 teachers there is one default percentage for the entire district, but schedules vary per teacher, and the principal may enter the appropriate percentages so that the weighting is proportionate to the teacher’s schedule. Refer participants to “Combining the Student Growth Measures in the Educator Evaluation Systems” on the ODE website.
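The proportional-weighting idea can be sketched with made-up numbers (the actual percentages are local decisions entered in eTPES, and the 50% SGM total follows the framework described earlier):

```python
# Hypothetical illustration only: real weights are set by the LEA in eTPES.
# For an A2 teacher, the value-added weight is proportional to the share of
# the schedule covered by value-added courses; the remainder of the 50% SGM
# portion comes from other measures (vendor assessments, LEA measures).

def sgm_weights(va_schedule_share, sgm_total=50):
    """Split the SGM portion of the evaluation proportionally.

    va_schedule_share: fraction (0-1) of the schedule in value-added courses.
    Returns (value-added weight, other-measures weight) in percentage points.
    """
    va_weight = round(sgm_total * va_schedule_share)
    return va_weight, sgm_total - va_weight

# A teacher whose value-added courses make up 60% of the schedule:
va, other = sgm_weights(0.60)
print(va, other)  # 30 20  -> 30% value-added, 20% other measures
```

However the split falls, the two pieces always total the 50% student growth portion of the evaluation.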
Here is a screen shot of the principal default student growth measures portion of eTPES. This is provided so you can make decisions.
Once the default percentages have been set, it is time to determine how the LEA will implement the local measures process. REVIEW SLIDE CONTENTS.
Remind participants of the shared attribution measures covered previously (building or district value-added; building team composite; building-level or district-level SLOs). REVIEW SLIDE CONTENTS.
It is important to let teachers know if their category is required to write SLOs. Teachers will need to know which subject areas/courses they are writing SLOs for. SLO development begins in the fall. Several ODE guidance documents are available on the ODE website. SLO FAQs: ODE requires a minimum of 2 SLOs and recommends no more than 4, representative of the teacher’s schedule and student population. This guideline applies to both Category B and Category A2 teachers if the LEA has determined SLOs will be used as local measures.
AIR Module 6 focuses on calibration and is available on the ODE website. Share that calibration is threaded throughout the entire OTES and OPES systems: credentialing training requires calibration on the performance rating rubrics, and emphasize the need to calibrate on any type of rubric, including rubrics used for SLO assessments.
SLO approval is not a “once and done” event. It will require ongoing feedback between the SLO approval committee and teachers.
Here is an example of how a district might track SLOs throughout the approval and scoring process.
Another example of how a district might plan for SLO PD.
This is the 3rd broad step in designing a local SGM plan. Keep in mind, the educator evaluation systems will require ONGOING SUPPORT and PD.
Again, sample form for identifying professional development and support.
Explain that downloadable, electronic templates of all of the inventory steps are available here, as well as a “How to Design a Local SGM Plan” workbook and today’s PPT.
These are the resources used, mostly ODE docs that are available on ODE’s website.
Review contact info. Ask if any participants have questions. Thank them for attending.