OERC RESEARCH RELATED
TO STUDENT GROWTH
MEASURES AND EDUCATOR
EFFECTIVENESS
Jill Lindsey, Ph.D.
Wright State University
Marsha Lewis, Ph.D.
Ohio University
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
 OERC can examine statewide policy and
practice questions
 Follow and document early implementation
in order to inform policy and practice
 Research implementation for multiple years
as start-up issues are resolved and
implementation takes hold—look for what is
working and what can be improved.
PURPOSE OF RESEARCH
 The time span of the findings offers insight into the changing landscape around teacher evaluation and student growth measures
 2012: Teachers and principals philosophically supportive of
new evaluation system and need to measure student growth
 2013: HB 555
 2014: Teachers and principals far less supportive and
troubled by concerns related to use of different types of
SGMs for evaluation
 Common Themes
3
FOUR STUDIES SPANNING
THREE YEARS
4
METHODOLOGY
 Structured interviews with superintendents
and administration team members
 Focus groups with teachers
 Surveys of teachers in each pilot LEA
 eTPES data analysis
 OTES/OPES Implementation Study (37 LEAs)
 Extended Testing for Value-Added Reporting (23
LEAs)
 Initial Use of Student Learning Objectives (30 LEAs)
 Student Growth Measures Policy & Practice (13
LEAs)
5
FUNDED PROJECTS RELATED TO
STUDENT GROWTH MEASURES
 Sequencing, planning, feedback, and student
growth measures for teachers and principals
 Preparation for evaluation
 Experiences of teachers and principals evaluated
using student growth measures
 Processes and measures of student growth that
districts adopted for use in teacher and principal
evaluation systems
6
OTES/OPES IMPLEMENTATION
STUDY
7
EARLIEST FINDINGS
 Generally positive about the new evaluation systems
 Supported use of student growth measures in evaluation
 Lack of trust & misunderstandings about value-added, vendor
assessments, and local measures of student growth
 Unfairness of using different kinds of measures and differing
time cycles for different measures of student growth
 Conversations around new evaluation system focus on
instruction
 Time required to complete evaluation took time away from
working with students
 Appreciative at being asked about their experiences and views
 Grant funds provided vendor testing for grades 1,2,3, and
high school subjects
 Provided teacher-level value-added scores from vendor test
results
 Processes and challenges related to extended testing
implementation
 Role of roster verification
 Use of SGMs in educator evaluation
 Best practices/lessons learned
8
SGM EXTENDED TESTING MINI-
GRANT
Findings
 Want reliable student
growth measures
 Lack assessment literacy
 Unclear how vendors will
provide data
 Uncertain of roster
verification timing and
impact on VAM
 LEAs opting to use lowest
percentages in weights
 Grateful for being asked
about their experiences
Reasons from the Nine LEAs That Dropped Out
 Requirement to use
extended testing results
too soon, unfair, not
part of grant
 Cost of extended testing
was too high
 Too many changes, and
too much on teachers’
plates
SGM MINI–GRANTS FOR
EXTENDED TESTING
10
INITIAL USE OF STUDENT
LEARNING OBJECTIVES
Study examined fidelity of SLO use for:
 improving student performance
 measuring academic growth
 evaluating teachers
 Training was not uniform across the state
 Assessments varied widely across grade
levels, buildings, and districts
 Processes excessively time-consuming
 More challenging for semester or quarter
courses; limited time to complete the pre-
test, teach, post-test cycle
 Implementation hampered by too many concurrent changes: Common Core, piloting new state tests and PARCC, and implementing OTES
 Many emotional moments and gratitude for
being invited to talk about experiences
Interviews, Surveys, Documents, eTPES Data
11
EARLY SLO THEMES
ALL DATA NOT YET ANALYZED
12
SGM POLICY AND PRACTICE
STUDY
OERC study of early adopter districts of
Student Growth Measures
 Designed to provide timely data to inform state
policy and district practice.
 “What does this look like when implemented?”
 Teachers’ perceptions of SGM components
 Do SGMs correlate with Performance on Standards? If
not, why not?
 The distribution of teacher and principal ratings
Focus group themes:
 Fairness questions (e.g. Category A teachers do
not know OAA items in advance while Category C
teachers develop their own assessments)
 Principals’ time consumed with teacher observation
activities
 Teachers have questions/misconceptions about
value-added methodology
13
SGM POLICY AND PRACTICE
STUDY
14
SGM POLICY AND PRACTICE
SURVEY
 Deployed late February through mid-April, 2014
 22% response rate (603 teacher respondents of 2,709 full-time teachers); n = 469 classroom teachers, 97 intervention specialists
 Survey responses were similar to focus group findings
 Of the four SGMs (Value-Added, SLOs, Vendor-Approved
Assessments, Shared Attribution), more surveyed
teachers think Student Learning Objectives “most
accurately assess a teacher’s instructional impact.”
 Early stage of implementation
 Uncontrollable factors
 Unequal measures/accuracy of the measures
 SLOs teacher-developed, validity/reliability questions
 Others see the SLOs as most fair because focused on
the content taught and results during evaluation year
 Approved vendor assessments may not match content
standards
 Value-added model was not formulated to measure
individual teacher effectiveness
15
FAIRNESS CONCERNS WITH
SGMS & EVALUATION
Teachers who see value in SGMs:
 Feel it is important to measure student growth
 Recognize the need for accountability
 SGMs useful source of feedback for planning and
adjustment to outcomes
16
SGM POLICY AND PRACTICE
STUDY
“I do think it is important to make sure a child makes
adequate growth. However, there are factors that are
out of my control (attendance, home support, etc.)
that affect a child's learning and are not considered
when calculating the yearly academic growth of a
student.”
“It shows the effectiveness of a teacher and useful
data to adjust your teaching.”
17
SGM POLICY AND PRACTICE
STUDY
Teacher–Student Data Link/Roster Verification is necessary to
ensure SGM data quality.
Research Questions:
 Are teachers actively participating in the verification of their
own rosters and percentage of instructional time with
students as specified by Ohio’s roster verification process
guidelines?
 Do principals and teachers have access to adequate
training and technical assistance?
 Do principals and/or teachers perceive any issues with
roster verification?
 What do Ohio educators view as the hallmarks of a good
system?
18
TEACHER ROSTER
VERIFICATION RESEARCH
19
TEACHER ROSTER
VERIFICATION SURVEY
Sent online survey to all teachers and principals who
completed the link/roster verification process in spring
2013 and spring 2014
2013 survey: 5,984 teacher responses from 695 LEAs
2014 survey to-date: 6,778 teacher responses. Survey
still in field.
Teachers – Do you think the linkage process accurately captured what was happening in your classroom (i.e., students you taught last year, their length of enrollment, and your percentage of instructional time with them)?

              2011   2013   2014* (prelim.)
Yes            46%    57%    59%
No             23%    25%    25%
Don't know     31%    17%    16%
20
TEACHER ROSTER VERIFICATION SURVEY
21
TEACHER ROSTER
VERIFICATION SURVEY
For teachers who answered “No”:
Teachers – Explain why you think the student–teacher
linkage process did not accurately capture what was
happening in your classroom. (open-ended)
Themes:
 Difficulty dividing time in various co-teaching
situations.
 Unable to account for student absences
 Teachers want to be able to report finer
increments of shared instructional responsibility
 Students’ schedules changed too
often/environment too dynamic to accurately
estimate time
Teachers – Given your experience with the linkage process, how confident are you that the linkage process improves the accuracy of the teacher-level value-added data?

                        2011   2013   2014* (prelim.)
Not at all confident     39%    32%    35%
Somewhat confident       55%    61%    58%
Very confident            6%     8%     7%
22
TEACHER ROSTER VERIFICATION SURVEY
Concerns
 Early in
implementation—lack of
trust and
misunderstandings
 Perceived unfairness of
different kinds of
measures
 Time required to
complete evaluation
 Too many changes at
the same time
Kudos
 Support measuring
student growth
 Training for assessment
literacy desired
 Appreciate being
consulted and heard
 Roster verification
process is improving
23
COMMON THEMES ACROSS TIME
 Build trust by continuing to include teachers and
administrators in conversations and policies that impact
them
 Acknowledge concerns as legitimate
 Provide professional development opportunities to
correct misunderstandings and knowledge deficits
 Streamline paperwork where possible; use adaptations
from the field
 Modify policy and roll-out timelines when possible
24
RECOMMENDATIONS
QUESTIONS?
JILL.LINDSEY@WRIGHT.EDU
LEWISM5@OHIO.EDU
connect@oerc.osu.edu | oerc.osu.edu
PLANNING FOR THE FUTURE,
LEARNING FROM THE PAST: WHAT
CAN SCHOOLS LEARN FROM COLLEGE
AND CAREER PROFILES OF
GRADUATES?
Joshua D. Hawley
Director, OERC and Associate Professor
John Glenn School of Public Affairs
The Ohio State University
Making Research Work for Education
Ohio's Constitution: a "thorough" and "efficient" education.
OHIO’S STANDARDS OF PUBLIC
EDUCATION
College for all: High School → College → Work

Education and Career:
High School (CTE, STEM) → College (AP/Dual Enrollment, School + Work) → Workforce Training (Apprenticeship, Military)
29
LINKING SCHOOL TO
WORK
30
CHANGING VIEWS OF WORK
REQUIRE NEW INFORMATION
 As educators, we want to know about a variety of
educational outcomes, not just college-going rates
for students. At its broadest, we might consider the
following domains:
 College (traditional two and four-year sectors)
 Credentialed and non-credentialed workforce training
 Apprenticeships
 Military
31
QUALITY OF OUTCOMES MATTERS
 In this day and age, the quality of the outcomes matters a great deal and, therefore, we are concerned with how well students are prepared to perform over time.
 Concerns we typically have in this case:
 Student remediation
 Does student knowledge match what is required in college
classes?
 Are students prepared to pick a career? (I distinguish this
from a job)
 What happens to kids who go directly from high school to work?
 What happens to kids who drop out (both in terms of further education and work)?
32
PILOT HIGH SCHOOL REPORTS
 Using data from the Ohio Longitudinal Data Archive
(OLDA), the OERC has been able to answer many of
these questions, beginning for high schools, and
present them in a format that schools can use.
 The report has four key question areas:
 What are the employment outcomes of high school graduates?
 What are the postsecondary education outcomes of high school graduates?
 What is the quality of the postsecondary education that high school graduates are pursuing?
 What happens to individuals who do not graduate from high school (dropouts)?
33
PILOT REPORTS: COLLEGE AND
CAREER
34
WHAT KIND OF EMPLOYMENT
EXPERIENCES DO STUDENTS HAVE
AFTER SCHOOL?
35
WHAT HIGHER EDUCATION
OUTCOMES DO STUDENTS
HAVE?
36
HOW WELL DO SCHOOLS
COMPARE WITH EACH OTHER IN
REMEDIAL EDUCATION?
37
VISUAL LOOK
College and Career Report
High School Name | County | State

How do X year graduates from this school compare to others in Ohio?
(Note: Under the legal agreement covering data use by the OERC, individual cells with fewer than 10 people have been redacted. They are indicated by a *.)

Education (reported for School, District, and State)
Number of Students that Started High School in X year
Number of Graduates
Number of Dropouts
Average High School GPA
Percent of students in this class eligible for free or reduced lunch
Average junior year ACT scores for this class by subject (English, Math, Reading, Science, Composite)
Percent of graduates ready for college*
Average state scholarship awards for graduates
Overall post secondary rate for graduates from this class (two year college, four year college, other vocational/workforce training)
Percent of graduates from this class who attended an in-state college or university
Percent of graduates from this class who attended an out of state college or university
Trend in college going rates for this school vs. state and district

Career Outcomes
Number of Students working in Ohio in the year following graduation
Average annual earnings for individuals with high school diploma
Average annual earnings for individuals without a high school diploma
Industry of employment for graduates of high school not in college (sample chart: Retail 20, Construction 20, Financial Services 60)
Industry of employment for high school dropouts (sample chart: Retail 40, Construction 40, Financial Services 10, Other 10)
Working and school

*Readiness scores are calculated based on the individual ACT score above or below the "Remediation Free Standard"
Footer: Describe data origin. Refer to website that summarizes origin.
38
COLLEGE OUTCOMES
How do X year graduates from this school compare to others in Ohio?
Education
(Note: Under the legal agreement covering data use by the OERC, individual cells with fewer than 10 people have been redacted. They are indicated by a *.)
School District State
Number of Students that Started High School in X year
Number of Graduates
Number of Dropouts
Average High School GPA
Percent of students in this class eligible for free or reduced lunch
Average junior year ACT scores for this class by subject
English
Math
Reading
Science
Composite
39
COLLEGE OUTCOMES (CONT.)
Percent of graduates ready for college
Average state scholarship awards for graduates
Overall post secondary rate for graduates from this class
Two year college
Four year college
Other vocational/workforce training
Percent of graduates from this class who attended an in‐state college or university
Percent of graduates from this class who attended an out of state college or university
Trend in college going rates for this school vs. state and district
40
CAREER OUTCOMES
Number of Students working in Ohio in the year following graduation
Average annual earnings for individuals with high school diploma
Average annual earnings for individuals without a high school diploma

Industry of employment for graduates of high school not in college (sample chart: Retail 20, Construction 20, Financial Services 60)

Industry of employment for high school dropouts (sample chart: Retail 40, Construction 40, Financial Services 10, Other 10)
 Develop a formal high school college and career report for
select districts (next up, Columbus City Schools; Battelle
For Kids)
 Complete Workforce Success Measures Project with the
Office of Workforce Transformation (see OWT website for
introduction: http://workforce.ohio.gov/ ).
 Work with Ohio Department of Education and Board of
Regents to answer questions about employment outcomes
for K-12 and higher education.
41
FUTURE PLANS
THANKS FOR YOUR
ATTENTION!
connect@oerc.osu.edu | oerc.osu.edu
THIRD GRADE READING
GUARANTEE:
A CASE STUDY
Suzanne Franco, Professor, Wright State University
Jarrod Brumbaugh, Principal, Milton-Union Schools
Making Research Work for Education
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
 Ohio Third Grade Reading Guarantee (TGRG)
2012
 In 2012–13, 81% of Ohio’s 3rd graders were
proficient or above
 ODE offered a competitive grant for developing 2013–2014 TGRG implementation plans
 OERC funded a case study of a funded TGRG
three-LEA consortium for 2013–2014
BACKGROUND
45
 Co-located in Midwestern Ohio but had
not collaborated on previous initiatives
 Orton-Gillingham Multi-Sensory training
and instructional strategies
 Professional Learning Community (PLC)
 Parent Informational Opportunities
46
CONSORTIUM TGRG PLAN
47
CONSORTIUM DEMOGRAPHICS
LEA | Typology | Report Card Rating | % Passed 3rd Grade Reading 2010–2011 | % Passed 3rd Grade Reading 2011–2012 | Mobility 2012–2013 | % White/Non-Hispanic | % Econ. Disadvan.
1 | 2: Rural w/ avg. pvrty & small ADM | Exc. | 87.3 | 85.9 | 7.0 | 97 | 40
2 | 4: Small Town w/ high pvrty & avg ADM | Exc. | 90.3 | 80.2 | 11.4 | 87 | 53
3 | 2: Rural w/ avg. pvrty & small ADM | Exc. w/ Distinction | 84.7 | 84 | 6.1 | 98 | 19
 What were the feedback on and buy-in for the training, implementation, and PLC?
 What progress and monitoring tools were used?
 Did reading skills improve for On-Target students? For Not-on-Target students?
 What percentage of K–3 students were Not on Target in 2012–14?
49
RESEARCH QUESTIONS
For each LEA:
 Document analysis of historical data and end-of-year
2014 RIMPS
 Interviews and focus groups with Administrators (6),
Teachers (12)
 Observations of O/G training and classroom instruction
50
METHODOLOGY
51
O/G TRAINING
Training Details
 Two 5-day sessions the week after the school year ended
 One 5-day session in November
 Refresher course available in summer 2014
Training Feedback
 Teachers found the training engaging but too long, or covering grade levels outside their interest. They would like to repeat the training after one year of implementation.
 Administrators from one LEA attended the training. They felt that the common language helped with classroom observations.
52
IMPLEMENTATION
Implementation Details
 LEA 1 – O/G use not required because grant funds were not received until mid-September 2013. Used in RTI, Title 1, and other interventions.
 KRAL is the identifier for K; State assessment tool for
grades 1–3
 DIBELS is the progress monitor along with STAR and
Study Island
 LEA 2 – O/G not required (see above). Used in RTI,
intervention, and Title 1
 NWEA (2012) and DIBELS (2013) for K–3
 DIBELS is the progress monitor
 LEA 3 decided not to participate
Implementation Feedback Details
 Not all teachers had supplies at the beginning of the year due to the delay in receiving grant funds
 Not all training was completed by the beginning of the year (new teachers)
 Use of O/G not required; inconsistency a challenge for
teams
 Merging O/G with LEA-approved reading curriculum difficult
 Parent Nights were not well attended; PLC not formed
53
IMPLEMENTATION FEEDBACK
54
PROGRESS AND MONITORING
TOOLS
 LEA 1: DIBELS
 LEA 2: NWEA (2012); DIBELS (2013)
Feedback
 O/G assessment tools are not Ohio-approved; therefore, the LEAs use DIBELS and NWEA to assess student progress
 RIMPs not standardized among LEAs (an issue for students who move and for determining LEA or statewide impact)
 For highly mobile student populations, the 30-day requirement for the RIMP is very difficult to meet.
 Too much testing for young students; test anxiety rising
Successes
 After School
Program
 Students respond
well to Multi-Sensory
 Teachers want more
training
Challenges
 Use of O/G not
consistent
 Costs to sustain
 O/G assessments not
state approved
 RIMP forms could be
improved; data should be
collected for analyses
 No information about
other LEA TGRG plans
55
CONSORTIUM SUMMARY
 LEA1
 Grade 3 results to date
 Changes in implementations for 2014–2015
 LEA 2 Details
 Grade 3 results to date
 Changes in implementations for 2014–2015
56
2013–2014 RESULTS
2014–2015 PLANS
 Assessment tools aligned with state-funded TGRG programs need state approval.
 Primary students exhibit high anxiety regarding the TGRG, which hurts performance and increases fear of school.
 Required testing takes away from instruction
time. Embrace testing that collects needed
data for all accountability purposes, not just
one initiative.
57
TESTING RECOMMENDATIONS
 Continue funding for TGRG development.
 Continue monitoring LEA implementation of
funded and non-funded TGRG
implementation plans, and share “lessons
learned.”
 Revise RIMP format and collect RIMP data
for longitudinal analyses of common
deficiencies across the state.
58
TGRG POLICY
RECOMMENDATIONS
QUESTIONS
suzanne.franco@wright.edu
BrumbaughJ@milton-union.k12.oh.us
thompsond@piqua.org
Making Research Work for Education
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
READY OR NOT?
Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
Ginny Rammel, Ph.D.
Superintendent
Milton-Union Exempted Village Schools
 Research and study
 Use of multiple forms of data
 Create a culture of “calculated risk-takers”
 Embed professional development
“EVERY STUDENT, EVERY DAY”
 Role model to all at all times
 Establish high expectations
 Collaborate, share
 Trust, be truthful and supportive
 Know your staff
63
CULTURE TAKES TIME TO
CHANGE!
64
EXPLORE OPPORTUNITIES
 Through grants, pilot studies, action
research
 Connect with:
 Personnel at colleges and universities, OERC
 Educators from other districts
 Members of professional organizations
 Policymakers, legislators
 Milton-Union was involved in a number of
grants:
 RttT Mini-grant Value-Added
 Student Growth Measures
 Early Literacy and Reading Readiness
 OERC case study
The more you and your staff research, study,
and share data, the better decisions you make.
Collaborating and working together help to
create a culture of “Every Student, Every Day.”
65
EMBEDDED PD:
BE AN ACTIVE PARTICIPANT
Our culture, and the research and data from our grant involvement, led to the development of our OTES instrument and a successful year of implementation.
This trust and openness flowed throughout
recent negotiations.
66
RESULTS
All initiatives impact one another and YOU:
 OTES
 OPES
 Graduation requirements
 Third Grade Reading Guarantee
Keep the main thing the main thing – is
what I’m doing going to help students
learn? Are we preparing students for
“down-the-road” careers?
67
CHANGE WILL OCCUR WITH OR
WITHOUT YOU!
 Do the research upfront
 Study the data
 Reflect, revise if necessary
 Building project
 Food service program
68
CALCULATED RISK-TAKERS
 All day, every day kindergarten
 On-site Head-Start programs
 Grouping students by quintiles
 H.S. ACT EOC exams
 Recognized as a U.S. Department of Education
Green Ribbon School
 Food service program ended the year in the black!
69
DATA SUPPORTS INITIATIVES
 How do we better prepare students for their
futures – colleges, universities, employers?
 How can we convey to young parents the
importance of their role as a teacher?
 How can we differentiate education so all
students are better served?
 How can we better communicate the results
of research and the sharing of data?
70
NEXT STEPS…
QUESTIONS?
RAMMELV@MILTON-UNION.K12.OH.US
connect@oerc.osu.edu | oerc.osu.edu
More Related Content

What's hot

Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15Peter Hofman
 
Maximizing student assessment systems cronin
Maximizing student assessment systems   croninMaximizing student assessment systems   cronin
Maximizing student assessment systems croninNWEA
 
NWEA Growth and Teacher evaluation VA 9-13
NWEA Growth and Teacher evaluation VA 9-13NWEA Growth and Teacher evaluation VA 9-13
NWEA Growth and Teacher evaluation VA 9-13NWEA
 
Teacher Observations: The Case for Arts for All Public Charter School Policy ...
Teacher Observations: The Case for Arts for All Public Charter School Policy ...Teacher Observations: The Case for Arts for All Public Charter School Policy ...
Teacher Observations: The Case for Arts for All Public Charter School Policy ...Tiffany Brooks
 
Teachers’ satisfaction of Assessment Process of Competency Based Curriculum...
Teachers’ satisfaction   of Assessment Process of Competency Based Curriculum...Teachers’ satisfaction   of Assessment Process of Competency Based Curriculum...
Teachers’ satisfaction of Assessment Process of Competency Based Curriculum...The Open University of Sri Lanka
 
Study on students dropouts in advanced certificate in pre school education pr...
Study on students dropouts in advanced certificate in pre school education pr...Study on students dropouts in advanced certificate in pre school education pr...
Study on students dropouts in advanced certificate in pre school education pr...The Open University of Sri Lanka
 
Teacher evaluations using technology
Teacher evaluations using technologyTeacher evaluations using technology
Teacher evaluations using technologyRichard Voltz
 
Work Sampling System in Early Childhood Education
Work Sampling System in Early Childhood EducationWork Sampling System in Early Childhood Education
Work Sampling System in Early Childhood EducationCheryl Ramos-Roldan
 
Value-Added Data and Teacher Effectiveness
Value-Added Data and Teacher EffectivenessValue-Added Data and Teacher Effectiveness
Value-Added Data and Teacher Effectivenessjobepe
 
Karim value added
Karim value addedKarim value added
Karim value addedAnilKarim
 
Reviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal EvaluationReviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal EvaluationRichard Voltz
 
Measuring student academic growth through Value Added methodology: An introd...
Measuring student academic growth through Value Added methodology:  An introd...Measuring student academic growth through Value Added methodology:  An introd...
Measuring student academic growth through Value Added methodology: An introd...F Jenkins
 
11212014_Skills_for_Success_Tooley_Bornfreund
11212014_Skills_for_Success_Tooley_Bornfreund11212014_Skills_for_Success_Tooley_Bornfreund
11212014_Skills_for_Success_Tooley_BornfreundIsabel Huston
 
IASB Student Growth Presentation
IASB Student Growth PresentationIASB Student Growth Presentation
IASB Student Growth PresentationRichard Voltz
 
Standardized Deception: Examining the Debilitating Effects of Standardized Te...
Standardized Deception: Examining the Debilitating Effects of Standardized Te...Standardized Deception: Examining the Debilitating Effects of Standardized Te...
Standardized Deception: Examining the Debilitating Effects of Standardized Te...petermanr22
 

What's hot (20)

Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
Equity-Efficiency-Effectiveness through Assessment Levers - 1-23-15
 
Maximizing student assessment systems cronin
Maximizing student assessment systems   croninMaximizing student assessment systems   cronin
Maximizing student assessment systems cronin
 
NWEA Growth and Teacher evaluation VA 9-13
NWEA Growth and Teacher evaluation VA 9-13NWEA Growth and Teacher evaluation VA 9-13
NWEA Growth and Teacher evaluation VA 9-13
 
Teacher Observations: The Case for Arts for All Public Charter School Policy ...
Teacher Observations: The Case for Arts for All Public Charter School Policy ...Teacher Observations: The Case for Arts for All Public Charter School Policy ...
Teacher Observations: The Case for Arts for All Public Charter School Policy ...
 
Using Assessment data
Using Assessment dataUsing Assessment data
Using Assessment data
 
Teachers’ satisfaction of Assessment Process of Competency Based Curriculum...
Teachers’ satisfaction   of Assessment Process of Competency Based Curriculum...Teachers’ satisfaction   of Assessment Process of Competency Based Curriculum...
Teachers’ satisfaction of Assessment Process of Competency Based Curriculum...
 
Study on students dropouts in advanced certificate in pre school education pr...
Study on students dropouts in advanced certificate in pre school education pr...Study on students dropouts in advanced certificate in pre school education pr...
Study on students dropouts in advanced certificate in pre school education pr...
 
Teacher evaluations using technology
Teacher evaluations using technologyTeacher evaluations using technology
Teacher evaluations using technology
 
Supporting school improvement through inspection
Supporting school improvement through inspectionSupporting school improvement through inspection
Supporting school improvement through inspection
 
Work Sampling System in Early Childhood Education
Work Sampling System in Early Childhood EducationWork Sampling System in Early Childhood Education
Work Sampling System in Early Childhood Education
 
Value-Added Data and Teacher Effectiveness
Value-Added Data and Teacher EffectivenessValue-Added Data and Teacher Effectiveness
Value-Added Data and Teacher Effectiveness
 
Karim value added
Karim value addedKarim value added
Karim value added
 
EVAAS
EVAAS EVAAS
EVAAS
 
Reviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal EvaluationReviewing the Research and PEAC Recommendations around Principal Evaluation
Reviewing the Research and PEAC Recommendations around Principal Evaluation
 
Measuring student academic growth through Value Added methodology: An introd...
Measuring student academic growth through Value Added methodology:  An introd...Measuring student academic growth through Value Added methodology:  An introd...
Measuring student academic growth through Value Added methodology: An introd...
 
Assessment in Pre-School and Primary School
Assessment in Pre-School and Primary SchoolAssessment in Pre-School and Primary School
Assessment in Pre-School and Primary School
 
11212014_Skills_for_Success_Tooley_Bornfreund
11212014_Skills_for_Success_Tooley_Bornfreund11212014_Skills_for_Success_Tooley_Bornfreund
11212014_Skills_for_Success_Tooley_Bornfreund
 
IASB Student Growth Presentation
IASB Student Growth PresentationIASB Student Growth Presentation
IASB Student Growth Presentation
 
Standardized Deception: Examining the Debilitating Effects of Standardized Te...
Standardized Deception: Examining the Debilitating Effects of Standardized Te...Standardized Deception: Examining the Debilitating Effects of Standardized Te...
Standardized Deception: Examining the Debilitating Effects of Standardized Te...
 
NASPA AnP 2014
NASPA AnP 2014NASPA AnP 2014
NASPA AnP 2014
 

Similar to Oerc june 2014 final ppt combined

Session 1 Tom Abbott Biddulph High School
Session 1    Tom  Abbott    Biddulph  High  SchoolSession 1    Tom  Abbott    Biddulph  High  School
Session 1 Tom Abbott Biddulph High SchoolMike Blamires
 
NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014rachelmcbroom
 
Ceppe march 2011 final (Susanna Loeb)
Ceppe march 2011 final (Susanna Loeb)Ceppe march 2011 final (Susanna Loeb)
Ceppe march 2011 final (Susanna Loeb)Ceppe Chile
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growthJohn Cronin
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growthJohn Cronin
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growthJohn Cronin
 
Carnegie Foundation Summit on Improvement in Education: Driver Diagrams
Carnegie Foundation Summit on Improvement in Education: Driver DiagramsCarnegie Foundation Summit on Improvement in Education: Driver Diagrams
Carnegie Foundation Summit on Improvement in Education: Driver DiagramsNext Generation Learning Challenges
 
Continuous Assessment System (CAS In Nepal)
Continuous Assessment System (CAS In Nepal)Continuous Assessment System (CAS In Nepal)
Continuous Assessment System (CAS In Nepal)Ravi Maharjan
 
Preparing Teachers: Building Evidence for Sound Policy
Preparing Teachers: Building Evidence for Sound PolicyPreparing Teachers: Building Evidence for Sound Policy
Preparing Teachers: Building Evidence for Sound PolicyPioneer One
 
Seven purposes presentation
Seven purposes presentationSeven purposes presentation
Seven purposes presentationJohn Cronin
 
Guide Assessment4learning
Guide Assessment4learningGuide Assessment4learning
Guide Assessment4learningDai Barnes
 
[Re]evaluating practice: measuring what is important, to inform improvements ...
[Re]evaluating practice: measuring what is important, to inform improvements ...[Re]evaluating practice: measuring what is important, to inform improvements ...
[Re]evaluating practice: measuring what is important, to inform improvements ...LearningandTeaching
 
Engaging Teachers in Student Surveys - White Paper for September 2015
Engaging Teachers in Student Surveys - White Paper for September 2015Engaging Teachers in Student Surveys - White Paper for September 2015
Engaging Teachers in Student Surveys - White Paper for September 2015Ryan Balch
 
Rt i training module.ppt nelson
Rt i training module.ppt nelsonRt i training module.ppt nelson
Rt i training module.ppt nelsonJackie Stone
 
RtIMTSS SPE 501-Spr
  RtIMTSS                                       SPE 501-Spr  RtIMTSS                                       SPE 501-Spr
RtIMTSS SPE 501-SprVannaJoy20
 
Teaching To The Test
Teaching To The TestTeaching To The Test
Teaching To The Testnoblex1
 
DLP Governor Workshop 1
DLP Governor Workshop 1DLP Governor Workshop 1
DLP Governor Workshop 1David Carr
 

Similar to Oerc june 2014 final ppt combined (20)

Session 1 Tom Abbott Biddulph High School
Session 1    Tom  Abbott    Biddulph  High  SchoolSession 1    Tom  Abbott    Biddulph  High  School
Session 1 Tom Abbott Biddulph High School
 
NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014NCDPI NCTED Meeting April 2014
NCDPI NCTED Meeting April 2014
 
Ceppe march 2011 final (Susanna Loeb)
Ceppe march 2011 final (Susanna Loeb)Ceppe march 2011 final (Susanna Loeb)
Ceppe march 2011 final (Susanna Loeb)
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 
Connecticut mesuring and modeling growth
Connecticut   mesuring and modeling growthConnecticut   mesuring and modeling growth
Connecticut mesuring and modeling growth
 
Carnegie Foundation Summit on Improvement in Education: Driver Diagrams
Carnegie Foundation Summit on Improvement in Education: Driver DiagramsCarnegie Foundation Summit on Improvement in Education: Driver Diagrams
Carnegie Foundation Summit on Improvement in Education: Driver Diagrams
 
Continuous Assessment System (CAS In Nepal)
Continuous Assessment System (CAS In Nepal)Continuous Assessment System (CAS In Nepal)
Continuous Assessment System (CAS In Nepal)
 
Preparing Teachers: Building Evidence for Sound Policy
Preparing Teachers: Building Evidence for Sound PolicyPreparing Teachers: Building Evidence for Sound Policy
Preparing Teachers: Building Evidence for Sound Policy
 
Seven purposes presentation
Seven purposes presentationSeven purposes presentation
Seven purposes presentation
 
Awsp Seattle Hilton May 2013
Awsp Seattle Hilton May 2013Awsp Seattle Hilton May 2013
Awsp Seattle Hilton May 2013
 
Guide Assessment4learning
Guide Assessment4learningGuide Assessment4learning
Guide Assessment4learning
 
[Re]evaluating practice: measuring what is important, to inform improvements ...
[Re]evaluating practice: measuring what is important, to inform improvements ...[Re]evaluating practice: measuring what is important, to inform improvements ...
[Re]evaluating practice: measuring what is important, to inform improvements ...
 
Engaging Teachers in Student Surveys - White Paper for September 2015
Engaging Teachers in Student Surveys - White Paper for September 2015Engaging Teachers in Student Surveys - White Paper for September 2015
Engaging Teachers in Student Surveys - White Paper for September 2015
 
Fs
FsFs
Fs
 
Rt i training module.ppt nelson
Rt i training module.ppt nelsonRt i training module.ppt nelson
Rt i training module.ppt nelson
 
RtIMTSS SPE 501-Spr
  RtIMTSS                                       SPE 501-Spr  RtIMTSS                                       SPE 501-Spr
RtIMTSS SPE 501-Spr
 
Data Summer
Data SummerData Summer
Data Summer
 
Teaching To The Test
Teaching To The TestTeaching To The Test
Teaching To The Test
 
DLP Governor Workshop 1
DLP Governor Workshop 1DLP Governor Workshop 1
DLP Governor Workshop 1
 

More from Ohio Education Research Center

Improving Student Outcomes Through Early Warning Systems
Improving Student Outcomes Through Early Warning SystemsImproving Student Outcomes Through Early Warning Systems
Improving Student Outcomes Through Early Warning SystemsOhio Education Research Center
 
How Arkansas is Effectively Using Data to Guide Instruction
How Arkansas is Effectively Using Data to Guide InstructionHow Arkansas is Effectively Using Data to Guide Instruction
How Arkansas is Effectively Using Data to Guide InstructionOhio Education Research Center
 
The Kentucky Longitudinal Data System – Connecting Education & Outcomes
The Kentucky Longitudinal Data System – Connecting Education & OutcomesThe Kentucky Longitudinal Data System – Connecting Education & Outcomes
The Kentucky Longitudinal Data System – Connecting Education & OutcomesOhio Education Research Center
 
Impact of Career Pathways on Participant and Employer Outcomes
Impact of Career Pathways on Participant and Employer OutcomesImpact of Career Pathways on Participant and Employer Outcomes
Impact of Career Pathways on Participant and Employer OutcomesOhio Education Research Center
 
WORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIO
WORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIOWORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIO
WORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIOOhio Education Research Center
 
Return on Investment and Performance Measurement of Workforce Development Pro...
Return on Investment and Performance Measurement of Workforce Development Pro...Return on Investment and Performance Measurement of Workforce Development Pro...
Return on Investment and Performance Measurement of Workforce Development Pro...Ohio Education Research Center
 
Investing in Career Pathways for Regional Workforce Success
Investing in Career Pathways for Regional Workforce Success Investing in Career Pathways for Regional Workforce Success
Investing in Career Pathways for Regional Workforce Success Ohio Education Research Center
 
An Introduction To and Hands-On Tutorial of OLDA EMIS DATA
An Introduction To and Hands-On Tutorial of OLDA EMIS DATAAn Introduction To and Hands-On Tutorial of OLDA EMIS DATA
An Introduction To and Hands-On Tutorial of OLDA EMIS DATAOhio Education Research Center
 
Enhancing Decision Making Using Workforce Outcomes in Ohio
Enhancing Decision Making Using Workforce Outcomes in OhioEnhancing Decision Making Using Workforce Outcomes in Ohio
Enhancing Decision Making Using Workforce Outcomes in OhioOhio Education Research Center
 

More from Ohio Education Research Center (20)

Building regional data tools
Building regional data toolsBuilding regional data tools
Building regional data tools
 
Streamlining data usage in michigan using ed fi
Streamlining data usage in michigan using ed fiStreamlining data usage in michigan using ed fi
Streamlining data usage in michigan using ed fi
 
Improving Student Outcomes Through Early Warning Systems
Improving Student Outcomes Through Early Warning SystemsImproving Student Outcomes Through Early Warning Systems
Improving Student Outcomes Through Early Warning Systems
 
Improving Student Outcomes Through Data Dashboards
Improving Student Outcomes Through Data DashboardsImproving Student Outcomes Through Data Dashboards
Improving Student Outcomes Through Data Dashboards
 
How Arkansas is Effectively Using Data to Guide Instruction
How Arkansas is Effectively Using Data to Guide InstructionHow Arkansas is Effectively Using Data to Guide Instruction
How Arkansas is Effectively Using Data to Guide Instruction
 
Ohio's Student Success Dashboard
Ohio's Student Success DashboardOhio's Student Success Dashboard
Ohio's Student Success Dashboard
 
OCTEO Keynote Presentation
OCTEO Keynote Presentation OCTEO Keynote Presentation
OCTEO Keynote Presentation
 
Getting Ahead of WIOA Standards
Getting Ahead of WIOA StandardsGetting Ahead of WIOA Standards
Getting Ahead of WIOA Standards
 
ODJFS Conference Slides
ODJFS Conference SlidesODJFS Conference Slides
ODJFS Conference Slides
 
The Kentucky Longitudinal Data System – Connecting Education & Outcomes
The Kentucky Longitudinal Data System – Connecting Education & OutcomesThe Kentucky Longitudinal Data System – Connecting Education & Outcomes
The Kentucky Longitudinal Data System – Connecting Education & Outcomes
 
Impact of Career Pathways on Participant and Employer Outcomes
Impact of Career Pathways on Participant and Employer OutcomesImpact of Career Pathways on Participant and Employer Outcomes
Impact of Career Pathways on Participant and Employer Outcomes
 
WORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIO
WORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIOWORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIO
WORKFORCE OUTCOMES OF WIA-FUNDED ON-THE-JOB TRAINING IN OHIO
 
Return on Investment and Performance Measurement of Workforce Development Pro...
Return on Investment and Performance Measurement of Workforce Development Pro...Return on Investment and Performance Measurement of Workforce Development Pro...
Return on Investment and Performance Measurement of Workforce Development Pro...
 
Investing in Career Pathways for Regional Workforce Success
Investing in Career Pathways for Regional Workforce Success Investing in Career Pathways for Regional Workforce Success
Investing in Career Pathways for Regional Workforce Success
 
Measuring Postsecondary Employment Outcomes
Measuring Postsecondary Employment OutcomesMeasuring Postsecondary Employment Outcomes
Measuring Postsecondary Employment Outcomes
 
4th Federal Reserve District Economic Conditions
4th Federal Reserve District Economic Conditions4th Federal Reserve District Economic Conditions
4th Federal Reserve District Economic Conditions
 
An Introduction To and Hands-On Tutorial of OLDA EMIS DATA
An Introduction To and Hands-On Tutorial of OLDA EMIS DATAAn Introduction To and Hands-On Tutorial of OLDA EMIS DATA
An Introduction To and Hands-On Tutorial of OLDA EMIS DATA
 
Bridging the Mathematics Gap in Southern Ohio
Bridging the Mathematics Gap in Southern OhioBridging the Mathematics Gap in Southern Ohio
Bridging the Mathematics Gap in Southern Ohio
 
Using Visualization to Turn Data into Knowledge
Using Visualization to Turn Data into KnowledgeUsing Visualization to Turn Data into Knowledge
Using Visualization to Turn Data into Knowledge
 
Enhancing Decision Making Using Workforce Outcomes in Ohio
Enhancing Decision Making Using Workforce Outcomes in OhioEnhancing Decision Making Using Workforce Outcomes in Ohio
Enhancing Decision Making Using Workforce Outcomes in Ohio
 

Recently uploaded

AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfphamnguyenenglishnb
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxHumphrey A Beña
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designMIPLM
 
FILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinoFILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinojohnmickonozaleda
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfSpandanaRallapalli
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPCeline George
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Celine George
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxAnupkumar Sharma
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxCarlos105
 
Science 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptxScience 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptxMaryGraceBautista27
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptxmary850239
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 

Recently uploaded (20)

AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptxYOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
YOUVE GOT EMAIL_FINALS_EL_DORADO_2024.pptx
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
 
Keynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-designKeynote by Prof. Wurzer at Nordex about IP-design
Keynote by Prof. Wurzer at Nordex about IP-design
 
FILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipinoFILIPINO PSYCHology sikolohiyang pilipino
FILIPINO PSYCHology sikolohiyang pilipino
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdf
 
How to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERPHow to do quick user assign in kanban in Odoo 17 ERP
How to do quick user assign in kanban in Odoo 17 ERP
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17Field Attribute Index Feature in Odoo 17
Field Attribute Index Feature in Odoo 17
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
 
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptxMULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
MULTIDISCIPLINRY NATURE OF THE ENVIRONMENTAL STUDIES.pptx
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
 
Science 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptxScience 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptx
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 

Oerc june 2014 final ppt combined

  • 1. OERC RESEARCH RELATED TO STUDENT GROWTH MEASURES AND EDUCATOR EFFECTIVENESS Jill Lindsey, Ph.D. Wright State University Marsha Lewis, Ph.D. Ohio University E x t e n d i n g Y o u r K n o wl e d g e T h r o u g h R e s e a r c h T h a t W o r k s ! I C o l u m b u s , O H I J u n e 1 8 , 2 0 1 4
  • 2.  OERC can examine statewide policy and practice questions  Follow and document early implementation in order to inform policy and practice  Research implementation for multiple years as start-up issues are resolved and implementation takes hold—look for what is working and what can be improved. PURPOSE OF RESEARCH
  • 3.  Time span of findings offer insight into changing landscape around teacher evaluation and student growth measures  2012: Teachers and principals philosophically supportive of new evaluation system and need to measure student growth  2013: HB 555  2014: Teachers and principals far less supportive and troubled by concerns related to use of different types of SGMs for evaluation  Common Themes 3 FOUR STUDIES SPANNING THREE YEARS
  • 4. 4 METHODOLOGY  Structured interviews with superintendents and administration team members  Focus groups with teachers  Surveys of teachers in each pilot LEA  eTPES data analysis
  • 5.  OTES/OPES Implementation Study (37 LEAs)  Extended Testing for Value-Added Reporting (23 LEAs)  Initial Use of Student Learning Objectives (30 LEAs)  Student Growth Measures Policy & Practice (13 LEAs) 5 FUNDED PROJECTS RELATED TO STUDENT GROWTH MEASURES
  • 6.  Sequencing, planning, feedback, and student growth measures for teachers and principals  Preparation for evaluation  Experiences of teachers and principals evaluated using student growth measures  Processes and measures of student growth that districts adopted for use in teacher and principal evaluation systems 6 OTES/OPES IMPLEMENTATION STUDY
  • 7. 7 EARLIEST FINDINGS  Generally positive about the new evaluation systems  Supported use of student growth measures in evaluation  Lack of trust & misunderstandings about value-added, vendor assessments, and local measures of student growth  Unfairness of using different kinds of measures and differing time cycles for different measures of student growth  Conversations around new evaluation system focus on instruction  Time required to complete evaluation took time away from working with students  Appreciative at being asked about their experiences and views
  • 8.  Grant funds provided vendor testing for grades 1,2,3, and high school subjects  Provided teacher-level value-added scores from vendor test results  Processes and challenges related to extended testing implementation  Role of roster verification  Use of SGMs in educator evaluation  Best practices/lessons learned 8 SGM EXTENDED TESTING MINI- GRANT
  • 9. Findings  Want reliable student growth measures  Lack assessment literacy  Unclear how vendors will provide data  Uncertain of roster verification timing and impact on VAM  LEAs opting to use lowest percentages in weights  Grateful for being asked about their experiences Nine Drop-outs’ Reasons  Requirement to use extended testing results too soon, unfair, not part of grant  Cost of extended testing was too high  Too many changes, and too much on teachers’ plates SGM MINI–GRANTS FOR EXTENDED TESTING
  • 10. 10 INITIAL USE OF STUDENT LEARNING OBJECTIVES Study examined fidelity of SLO use for:  improving student performance  measuring academic growth  evaluating teachers
  • 11.  Training was not uniform across the state  Assessments varied widely across grade levels, buildings, and districts  Processes excessively time-consuming  More challenging for semester or quarter courses; limited time to complete the pre- test, teach, post-test cycle  Implementation hampered by too many changes to common core, piloting new state tests and PARCC, and implementing OTES  Many emotional moments and gratitude for being invited to talk about experiences Interviews Surveys Documents eTPES Data 11 EARLY SLO THEMES ALL DATA NOT YET ANALYZED
  • 12. 12 SGM POLICY AND PRACTICE STUDY OERC study of early adopter districts of Student Growth Measures  Designed to provide timely data to inform state policy and district practice.  “What does this look like when implemented?”  Teachers’ perceptions of SGM components  Do SGMs correlate with Performance on Standards? If not, why not?  The distribution of teacher and principal ratings
  • 13. Focus group themes:  Fairness questions (e.g. Category A teachers do not know OAA items in advance while Category C teachers develop their own assessments)  Principals’ time consumed with teacher observation activities  Teachers have questions/misconceptions about value-added methodology 13 SGM POLICY AND PRACTICE STUDY
  • 14. 14 SGM POLICY AND PRACTICE SURVEY  Deployed late February through mid-April, 2014  22% response rate (603 teacher respondents/2,709 full- time teachers) N = 469–classroom teachers, 97 intervention specialists  Survey responses were similar to focus group findings  Of the four SGMs (Value-Added, SLOs, Vendor-Approved Assessments, Shared Attribution), more surveyed teachers think Student Learning Objectives “most accurately assess a teacher’s instructional impact.”
  • 15.  Early stage of implementation  Uncontrollable factors  Unequal measures and questions about the accuracy of the measures  SLOs are teacher-developed, raising validity/reliability questions  Others see SLOs as most fair because they focus on the content taught and on results during the evaluation year  Approved vendor assessments may not match content standards  The value-added model was not formulated to measure individual teacher effectiveness 15 FAIRNESS CONCERNS WITH SGMS & EVALUATION
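To make the “unequal measures” concern concrete: a teacher’s growth rating combines whichever SGM components apply to that teacher, each with a district-chosen weight (the earlier finding that LEAs opt for the lowest allowed percentages refers to these weights). The sketch below is illustrative only, with hypothetical component names, weights, and scores rather than ODE’s actual percentages or scoring rules:

    # Illustrative only: hypothetical weights and scores, not ODE's actual rules.
    def growth_composite(scores, weights):
        """Weighted average of the SGM components that apply to a teacher."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
        return sum(scores[k] * weights[k] for k in weights)

    # A teacher with value-added data (hypothetical 40/60 split with an SLO)
    print(growth_composite({"value_added": 3.0, "slo": 4.0},
                           {"value_added": 0.4, "slo": 0.6}))          # 3.6

    # A teacher with only local measures (SLO plus shared attribution)
    print(growth_composite({"slo": 4.0, "shared_attribution": 3.0},
                           {"slo": 0.8, "shared_attribution": 0.2}))   # 3.8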
  • 16. Teachers who see value in SGMs:  Feel it is important to measure student growth  Recognize the need for accountability  View SGMs as a useful source of feedback for planning and for adjusting instruction to outcomes 16 SGM POLICY AND PRACTICE STUDY
  • 17. “I do think it is important to make sure a child makes adequate growth. However, there are factors that are out of my control (attendance, home support, etc.) that affect a child's learning and are not considered when calculating the yearly academic growth of a student.” “It shows the effectiveness of a teacher and useful data to adjust your teaching.” 17 SGM POLICY AND PRACTICE STUDY
  • 18. Teacher–Student Data Link/Roster Verification is necessary to ensure SGM data quality. Research Questions:  Are teachers actively participating in the verification of their own rosters and percentage of instructional time with students as specified by Ohio’s roster verification process guidelines?  Do principals and teachers have access to adequate training and technical assistance?  Do principals and/or teachers perceive any issues with roster verification?  What do Ohio educators view as the hallmarks of a good system? 18 TEACHER ROSTER VERIFICATION RESEARCH
  • 19. 19 TEACHER ROSTER VERIFICATION SURVEY Sent online survey to all teachers and principals who completed the link/roster verification process in spring 2013 and spring 2014. 2013 survey: 5,984 teacher responses from 695 LEAs. 2014 survey to date: 6,778 teacher responses; survey still in the field.
  • 20. 20 TEACHER ROSTER VERIFICATION SURVEY
  Teachers – Do you think the linkage process accurately captured what was happening in your classroom (i.e., students you taught last year, their length of enrollment, and your percentage of instructional time with them)?
                     2011    2013    2014* (prelim.)
    Yes               46%     57%     59%
    No                23%     25%     25%
    Don't know        31%     17%     16%
  • 21. 21 TEACHER ROSTER VERIFICATION SURVEY For teachers who answered “No”: Teachers – Explain why you think the student–teacher linkage process did not accurately capture what was happening in your classroom. (open-ended) Themes:  Difficulty dividing time in various co-teaching situations  Unable to account for student absences  Teachers want to be able to report finer increments of shared instructional responsibility  Students’ schedules changed too often, or the environment was too dynamic, to accurately estimate time
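One common downstream use of these verified percentages, sketched here purely as an illustration (an assumption for exposition, not Ohio’s actual linkage or EVAAS calculation), is to weight each student’s contribution to a teacher’s growth measure by the verified shares of instructional responsibility and enrollment:

    # Illustrative only: hypothetical weighting of student growth by the
    # verified shares of instruction and enrollment; not the actual Ohio formula.
    students = [
        {"growth": 2.1,  "pct_instruction": 1.00, "pct_enrolled": 1.00},
        {"growth": -0.5, "pct_instruction": 0.50, "pct_enrolled": 0.80},  # co-taught
        {"growth": 1.3,  "pct_instruction": 0.25, "pct_enrolled": 0.60},  # moved mid-year
    ]
    weights = [s["pct_instruction"] * s["pct_enrolled"] for s in students]
    weighted = sum(s["growth"] * w for s, w in zip(students, weights)) / sum(weights)
    print(round(weighted, 2))   # 1.35

This is why teachers wanted finer increments for shared responsibility: small changes in those percentages shift how much each student counts.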
  • 22. 22 TEACHER ROSTER VERIFICATION SURVEY
  Teachers – Given your experience with the linkage process, how confident are you that the linkage process improves the accuracy of the teacher-level value-added data?
                              2011    2013    2014* (prelim.)
    Not at all confident       39%     32%     35%
    Somewhat confident         55%     61%     58%
    Very confident              6%      8%      7%
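One way to read the table above is to total the share reporting at least some confidence (“somewhat” plus “very”); a quick tally using only the percentages shown:

    # Share of teachers at least somewhat confident, from the table above
    somewhat = {"2011": 55, "2013": 61, "2014": 58}
    very     = {"2011": 6,  "2013": 8,  "2014": 7}
    for year in somewhat:
        print(year, somewhat[year] + very[year], "% at least somewhat confident")
    # 2011: 61%, 2013: 69%, 2014: 65%

The combined share is higher in 2013 and 2014 than in 2011, broadly consistent with the “roster verification process is improving” theme on the next slide.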
  • 23. Concerns  Early in implementation—lack of trust and misunderstandings  Perceived unfairness of different kinds of measures  Time required to complete evaluation  Too many changes at the same time Kudos  Support measuring student growth  Training for assessment literacy desired  Appreciate being consulted and heard  Roster verification process is improving 23 COMMON THEMES ACROSS TIME
  • 24.  Build trust by continuing to include teachers and administrators in conversations and policies that impact them  Acknowledge concerns as legitimate  Provide professional development opportunities to correct misunderstandings and knowledge deficits  Streamline paperwork where possible; use adaptations from the field  Modify policy and roll-out timelines when possible 24 RECOMMENDATIONS
  • 26.
  • 27. PLANNING FOR THE FUTURE, LEARNING FROM THE PAST: WHAT CAN SCHOOLS LEARN FROM COLLEGE AND CAREER PROFILES OF GRADUATES? Joshua D. Hawley Director, OERC and Associate Professor John Glenn School of Public Affairs The Ohio State University Making Research Work for Education
  • 28. Ohio’s Constitution A “thorough” and “efficient” education. OHIO’S STANDARDS OF PUBLIC EDUCATION
  • 29. 29 LINKING SCHOOL TO WORK (diagram)
  “College for all” pathway: High School → College → Work
  “Education and Career” pathways:
    High School: CTE, STEM
    College: AP/Dual Enrollment, School + Work
    Workforce Training: Apprenticeship, Military
  • 30. 30 CHANGING VIEWS OF WORK REQUIRE NEW INFORMATION  As educators, we want to know about a variety of educational outcomes, not just college-going rates for students. At its broadest, we might consider the following domains:  College (traditional two- and four-year sectors)  Credentialed and non-credentialed workforce training  Apprenticeships  Military
  • 31. 31 QUALITY OF OUTCOMES MATTERS  In this day and age, the quality of the outcomes matters a great deal, and, therefore, we are concerned with how well students are prepared to perform over time.  Concerns we typically have in this case:  Student remediation  Does student knowledge match what is required in college classes?  Are students prepared to pick a career? (I distinguish this from a job)  What happens to students who go directly from high school to work?  What happens to students who drop out (in terms of both further education and work)?
  • 32. 32 PILOT HIGH SCHOOL REPORTS  Using data from the Ohio Longitudinal Data Archive (OLDA), the OERC has been able to answer many of these questions, beginning with high schools, and present the answers in a format that schools can use.  The report has four key question areas:  What are the employment outcomes of high school graduates?  What are the postsecondary education outcomes of high school graduates?  What is the quality of the postsecondary education that high school graduates are pursuing?  What happens to individuals who do not graduate from high school (dropouts)?
  • 34. 34 WHAT KIND OF EMPLOYMENT EXPERIENCES DO STUDENTS HAVE AFTER SCHOOL?
  • 36. 36 HOW WELL DO SCHOOLS COMPARE WITH EACH OTHER IN REMEDIAL EDUCATION?
  • 37. 37 VISUAL LOOK – mock-up of the pilot College and Career Report (high school name, county, state; school vs. district vs. state comparisons).
  Education fields: number of students who started high school in year X; number of graduates; number of dropouts; average high school GPA; percent of the class eligible for free or reduced lunch; average junior-year ACT scores by subject (English, Math, Reading, Science, Composite); percent of graduates ready for college; average state scholarship awards; overall postsecondary rate (two-year college, four-year college, other vocational/workforce training); percent attending in-state vs. out-of-state colleges; trend in college-going rates for this school vs. state and district; working and school.
  Career fields: number of students working in Ohio in the year following graduation; average annual earnings for individuals with and without a high school diploma; industry of employment for graduates not in college and for high school dropouts (e.g., retail, construction, financial services, other).
  Notes: per the legal agreement covering OERC data use, individual cells with fewer than 10 people are redacted and marked with *; readiness scores are based on individual ACT scores relative to the “Remediation Free Standard”; a footer describes the data origin and points to a summary website.
  • 38. 38 COLLEGE OUTCOMES – education section of the mock-up report, with school, district, and state columns: number of students who started high school in year X; number of graduates; number of dropouts; average high school GPA; percent of the class eligible for free or reduced lunch; average junior-year ACT scores by subject (English, Math, Reading, Science, Composite). Individual cells with fewer than 10 people are redacted and marked with *.
  • 40. 40 CAREER OUTCOMES – career section of the mock-up report: number of students working in Ohio in the year following graduation; average annual earnings for individuals with and without a high school diploma; industry of employment for graduates not in college and for high school dropouts (placeholder values shown for retail, construction, financial services, and other).
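The small-cell rule noted on these mock-ups (cells with fewer than 10 people are replaced by *) is simple to enforce when the reports are generated; a minimal sketch, assuming category counts held in a plain dictionary:

    # Suppress cells with fewer than 10 people before reporting,
    # per the data-use agreement described on the mock-up slides.
    REDACTION_THRESHOLD = 10

    def redact(counts):
        """Replace any count below the threshold with '*'."""
        return {k: (v if v >= REDACTION_THRESHOLD else "*") for k, v in counts.items()}

    print(redact({"Retail": 20, "Construction": 20, "Financial Services": 60, "Other": 4}))
    # {'Retail': 20, 'Construction': 20, 'Financial Services': 60, 'Other': '*'}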
  • 41.  Develop a formal high school college and career report for select districts (next up, Columbus City Schools; Battelle For Kids)  Complete Workforce Success Measures Project with the Office of Workforce Transformation (see OWT website for introduction: http://workforce.ohio.gov/ ).  Work with Ohio Department of Education and Board of Regents to answer questions about employment outcomes for K-12 and higher education. 41 FUTURE PLANS
  • 43. www.oerc.osu.edu | connect@oerc.osu.edu
  • 44.
  • 45. THIRD GRADE READING GUARANTEE: A CASE STUDY Suzanne Franco, Professor, Wright State University Jarrod Brumbaugh, Principal, Milton-Union Schools Making Research Work for Education Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
  • 46.  Ohio Third Grade Reading Guarantee (TGRG) enacted in 2012  In 2012–13, 81% of Ohio’s 3rd graders were proficient or above in reading  ODE offered a competitive grant to fund development of TGRG implementation plans for 2013–2014  OERC funded a case study of one grant-funded, three-LEA TGRG consortium for 2013–2014 BACKGROUND 45
  • 47.  Co-located in Midwestern Ohio but had not collaborated on previous initiatives  Orton-Gillingham Multi-Sensory training and instructional strategies  Professional Learning Community (PLC)  Parent Informational Opportunities 46 CONSORTIUM TGRG PLAN
  • 48. 47 CONSORTIUM DEMOGRAPHICS
  LEA | Typology | Report Card Rating | % passed 3rd-grade reading 2010–11 | % passed 3rd-grade reading 2011–12 | Mobility % (2012–13) | % White/Non-Hispanic | % Econ. Disadvantaged
  1 | 2: Rural w/ avg. poverty & small ADM | Excellent | 87.3 | 85.9 | 7.0 | 97 | 40
  2 | 4: Small town w/ high poverty & avg. ADM | Excellent | 90.3 | 80.2 | 11.4 | 87 | 53
  3 | 2: Rural w/ avg. poverty & small ADM | Excellent with Distinction | 84.7 | 84.0 | 6.1 | 98 | 19
  • 49. Research questions, for each LEA:  What were teachers’ feedback on, and buy-in for, the training, implementation, and PLC?  What progress-monitoring tools were used?  Did reading skills improve for On-Target students? For Not-on-Target students?  What percentage of K–3 students were Not on Target in 2012–14? 49 RESEARCH QUESTIONS
  • 50. For each LEA:  Document analysis of historical data and end-of-year 2014 RIMPs  Interviews and focus groups with administrators (6) and teachers (12)  Observations of O/G training and classroom instruction 50 METHODOLOGY
  • 51. 51 O/G TRAINING Training Details  Two 5-day sessions the week after the school year ended  One 5-day session in November  Refresher course available summer 2014 Training Feedback  Teachers felt the training was engaging but too long, or that it covered grade levels not relevant to them; they would like to repeat it after one year of implementation  Administrators from one LEA attended the training and felt that the common language helped with classroom observations
  • 52. 52 IMPLEMENTATION Implementation Details  LEA 1 – O/G use not required because grant funds were not received until mid-September 2013; used in RTI, Title I, and other interventions  KRA-L is the identifier for K; the state assessment tool is used for grades 1–3  DIBELS is the progress monitor, along with STAR and Study Island  LEA 2 – O/G use not required (see above); used in RTI, intervention, and Title I  NWEA (2012) and DIBELS (2013) for K–3  DIBELS is the progress monitor  LEA 3 decided not to participate
  • 53. Implementation Feedback  Not all teachers had supplies at the beginning of the year because of the delay in receiving grant funds  Not all training was completed by the beginning of the year (new teachers)  Use of O/G was not required; the resulting inconsistency was a challenge for teams  Merging O/G with the LEA-approved reading curriculum was difficult  Parent Nights were not well attended; the PLC was not formed 53 IMPLEMENTATION FEEDBACK
  • 54. 54 PROGRESS AND MONITORING TOOLS  LEA 1: DIBELS  LEA 2: NWEA (2012); DIBELS (2013) Feedback  O/G assessment tools are not Ohio-approved, so the LEAs use DIBELS and NWEA to assess student progress  RIMPs are not standardized among LEAs (an issue when students move and for determining LEA or statewide impact)  For highly mobile student populations, the 30-day requirement for completing a RIMP is very difficult to meet  Too much testing for young students; test anxiety is rising
  • 55. Successes  After-school program  Students respond well to multi-sensory instruction  Teachers want more training Challenges  Use of O/G is not consistent  Costs to sustain the program  O/G assessments are not state-approved  RIMP forms could be improved; RIMP data should be collected for analyses  No information available about other LEAs’ TGRG plans 55 CONSORTIUM SUMMARY
  • 56.  LEA 1  Grade 3 results to date  Changes in implementation for 2014–2015  LEA 2  Grade 3 results to date  Changes in implementation for 2014–2015 56 2013–2014 RESULTS / 2014–2015 PLANS
  • 57.  Assessment tools aligned with state-funded TGRG programs need state approval  Primary students exhibit high anxiety about the TGRG, which hurts performance and increases fear of school  Required testing takes away from instructional time; embrace testing that collects the data needed for all accountability purposes, not just one initiative 57 TESTING RECOMMENDATIONS
  • 58.  Continue funding for TGRG development.  Continue monitoring LEA implementation of funded and non-funded TGRG implementation plans, and share “lessons learned.”  Revise RIMP format and collect RIMP data for longitudinal analyses of common deficiencies across the state. 58 TGRG POLICY RECOMMENDATIONS
  • 59. QUESTIONS suzanne.franco@wright.edu BrumbaughJ@milton-union.k12.oh.us thompsond@piqua.org Making Research Work for Education Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014
  • 60.
  • 61. READY OR NOT? Extending Your Knowledge Through Research That Works! | Columbus, OH | June 18, 2014 Ginny Rammel, Ph.D. Superintendent Milton-Union Exempted Village Schools
  • 62.  Research and study  Use of multiple forms of data  Create a culture of “calculated risk-takers”  Embed professional development “EVERY STUDENT, EVERY DAY”
  • 63.  Role model to all at all times  Establish high expectations  Collaborate, share  Trust, be truthful and supportive  Know your staff 63 CULTURE TAKES TIME TO CHANGE!
  • 64. 64 EXPLORE OPPORTUNITIES  Through grants, pilot studies, action research  Connect with:  Personnel at colleges and universities, OERC  Educators from other districts  Members of professional organizations  Policymakers, legislators
  • 65.  Milton-Union was involved in a number of grants:  RttT Mini-grant Value-Added  Student Growth Measures  Early Literacy and Reading Readiness  OERC case study The more you and your staff research, study, and share data, the better decisions you make. Collaborating and working together help to create a culture of “Every Student, Every Day.” 65 EMBEDDED PD: BE AN ACTIVE PARTICIPANT
  • 66. Our culture, and the research and data from our grant involvement, led to the development of our OTES instrument and a successful year of implementation. This trust and openness flowed throughout recent negotiations. 66 RESULTS
  • 67. All initiatives impact one another and YOU:  OTES  OPES  Graduation requirements  Third Grade Reading Guarantee Keep the main thing the main thing – is what I’m doing going to help students learn? Are we preparing students for “down-the-road” careers? 67 CHANGE WILL OCCUR WITH OR WITHOUT YOU!
  • 68.  Do the research upfront  Study the data  Reflect, revise if necessary  Building project  Food service program 68 CALCULATED RISK-TAKERS
  • 69.  All day, every day kindergarten  On-site Head-Start programs  Grouping students by quintiles  H.S. ACT EOC exams  Recognized as a U.S. Department of Education Green Ribbon School  Food service program ended the year in the black! 69 DATA SUPPORTS INITIATIVES
  • 70.  How do we better prepare students for their futures – colleges, universities, employers?  How can we convey to young parents the importance of their role as a teacher?  How can we differentiate education so all students are better served?  How can we better communicate the results of research and the sharing of data? 70 NEXT STEPS…