Tools And Resources For Continuous Improvement Of Technology In Schools
1. Tools and resources for continuous improvement of technology in schools
Kevin Oliver, LaTricia Townsend, Rodolfo Argueta,
Daniel Stanhope
Institute on Leading Innovation: Implementing Effective 1:1 Learning
Technology Programs
July 8-10, 2009. The Friday Institute, NCSU, Raleigh, NC
2. Description
This interactive session will highlight resources school leaders can utilize to facilitate data-driven decisions to improve the effectiveness of a technology project. Participants will learn about a variety of data collection tools freely available to them, practice using these tools, and discuss ways in which the results may shape practice in their schools.
3. Defining Evaluation
• two common evaluation modes--formative and summative
• formative evaluation is concerned with collecting data to help revise an innovation of interest (e.g., one-to-one computing)
• summative evaluation is concerned with collecting data to help judge the worth of an innovation and whether to adopt/reject
• 1:1 leaders may be involved in both modes
4. Evaluation vs. Assessment
• evaluation is concerned with improving or judging the worth of some innovation
• assessment is concerned with how much a student knows
• evaluation is the more inclusive term, often making use of assessment data as one data source, in addition to surveys, observations, interviews, and more
6. What is Commonly Evaluated?
• instructional materials (e.g., a problem-based learning lesson and resources)
• projects (e.g., school professional development or technology plans)
• programs (e.g., one-to-one computing in a school district or state)
• in project or program evaluation, you may study instructional materials purchased or created, but you will also look at leadership, professional development, collaboration, sustainability, etc.
7. Where to Begin?
• evaluation questions are difficult to write for project or program evaluation with so many factors to consider
• evaluation models can help you develop evaluation questions to study the most important elements
• for example, Stufflebeam's CIPP Model encourages a close look at Context, Inputs, Processes, and Products
8. CIPP Model
• context questions seek to identify the needs of the target population, opportunities to address them, and how well goals address those needs
• input questions seek to define capabilities, project strategies and designs, types of support useful in reaching goals, and what was required to achieve goals
• process questions seek to define deficiencies in the process or implementation, how resources were allocated, and barriers to success
• product questions seek to define outcomes and describe lessons learned
9. Flashlight Model
• the Flashlight program is another evaluation model, developed by AAHE for technology-based evaluations
• Flashlight is based on the concept of "triads," consisting of the relationship between outcomes (goals/objectives), strategies, and technologies, tools, or supporting materials
11. Logic Models
• logic models can also help you identify questions for your evaluation; they contain four elements:
1. inputs, or the resources that go into the project
2. activities that take place as part of the project
3. short-term objectives (e.g., "students will use...")
4. long-term goals (e.g., "increase student scores...")
13. After Generating Evaluation Questions
• helpful to generate a table that matches evaluation questions with indicators and benchmarks of success (expectations), and data sources that you will use to check on that success (today's session topic!)
Questions | Indicators | Benchmarks | Data Sources, Instruments | Results, Changes, Outcomes
14. Indicators
• indicators are continuous factors used to describe a construct of interest (e.g., unemployment percentage, lots per acre, violent crime per 1000 citizens, etc.)
• have some historical/past value, present value, and future value
• value is in constant flux and can be changed by the projects we are evaluating
15. Benchmarks
• if an indicator states: number of students who use the Internet for research
• a benchmark would specify the time interval by which you would expect to see changes
• if we know 40% used the Internet last year before our project was implemented, we might expect 60% to use it this year, and 80% next year
16. Examples
Indicators | Benchmarks
number of wireless access points | 1 in 2006, 2 in 2007, and 3 in 2008
mobile labs in the district | 5 in 2006, 10 in 2007, 15 in 2008
teacher access to e-mail | 80% in 2006, 90% in 2007, 100% in 2008
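As a rough illustration, indicator/benchmark pairs like those above can be tracked with a short script that compares observed values against the yearly targets. This is only a sketch: the benchmark targets come from the example table, while the observed values are hypothetical.

# Illustrative only: benchmark targets from the example table above.
benchmarks = {
    "wireless access points": {2006: 1, 2007: 2, 2008: 3},
    "mobile labs in the district": {2006: 5, 2007: 10, 2008: 15},
    "teacher access to e-mail (%)": {2006: 80, 2007: 90, 2008: 100},
}

# Hypothetical observed values collected during the evaluation.
observed = {
    "wireless access points": {2006: 1, 2007: 2, 2008: 2},
    "mobile labs in the district": {2006: 5, 2007: 8, 2008: 15},
    "teacher access to e-mail (%)": {2006: 85, 2007: 92, 2008: 100},
}

# Report whether each indicator met its benchmark in each year.
for indicator, targets in benchmarks.items():
    for year, target in sorted(targets.items()):
        actual = observed.get(indicator, {}).get(year)
        status = "met" if actual is not None and actual >= target else "not met"
        print(f"{indicator}, {year}: target {target}, observed {actual} -> {status}")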
17. A More Detailed Planning Grid
• Irving, TX long-range technology plan
• different look, similar planning items: goals-questions, objectives-benchmarks, evaluation-data sources
18. Remainder of Session
• focus on tools and resources you can use to collect data in evaluating a 1:1 computing program
Questions | Indicators | Benchmarks | Data Sources, Instruments | Results, Changes, Outcomes
21. School Technology Readiness Measures (continued)
ISTE/CEO Forum: STaR Chart (CEO Forum's Interactive School Technology and Readiness (STaR) Chart)
Online access:
http://www.iste.org/inhouse/starchart/index.cfm?Section=ST
22. School Technology Readiness Measures (continued)
Texas Teacher STaR Chart: http://starchart.esc12.net/docs/TxTSC.pdf
Texas Campus STaR Chart: http://starchart.esc12.net/docs/TxCSC.pdf
Michigan 2003-04 Freedom to Learn School Readiness Rubric: http://school.discoveryeducation.com/schrockguide/pdf/schooltechrubric.pdf
State of Washington Technology Essential Conditions Rubric: http://www.k12.wa.us/edtech/TechEssCondRubric.aspx or http://www.k12.wa.us/edtech/pubdocs/TECR-WA.doc
23. School Technology Readiness Measures (continued)
SEIR*TEC School Technology Needs Assessment (STNA)
Sample of the online survey:
http://www.keysurvey.com/survey/134127/1516/
Download paper-pencil versions:
http://www.serve.org/evaluation/capacity/EvalFramework/resou
24. STNA 4.0 - School Technology Needs Assessment
• What is STNA?
• What data does STNA collect?
• What does this data tell us?
• How can findings be used?
• What is the STNA process?
• What data is reported?
• How is STNA data interpreted?
25. What is STNA?
• STNA is a valid and reliable instrument for assessing educational technology needs in order to better design and evaluate projects and initiatives
• STNA provides a free, user-friendly online tool that allows for planning and formative evaluation of technology projects in educational settings
• Helps planners collect and analyze needs data related to the implementation of technological innovations in teaching and learning
• Guides school- and district-level decisions about professional development for educators
26. What data does STNA collect?
• STNA reports at the school level
• Documents respondents' perceptions and attitudes
• Broad areas of school technology:
  – Supportive Environment for Technology Use
    • Vision, Planning and Budget, Communication, Infrastructure and Staff Support
  – Professional Development
    • Professional Development Needs, Professional Development Quality
  – Teaching and Learning
    • Teacher Technology Use, Student Technology Use
  – Impact of Technology
    • Teacher Impact, Student Impact
27. What does this data tell us?
• STNA reports descriptive data
  – Item and response set
  – Frequencies
  – Percentages
• Scales
  – 1 (Strongly Agree) to 5 (Strongly Disagree) and 6 (Do Not Know)
  – 1 (Daily) to 5 (Never) and 6 (Do Not Know)
• Look at the profile:
  – Generally positive or negative?
  – Number of highs and lows?
  – Height of highs and lows?
  – Very different than other items?
• You might get more questions instead of answers
28. How is STNA data interpreted?
• Each construct examined is theoretically beneficial to successful implementation of technology in teaching and learning settings
  – Strongly agreeing with an item or indicating daily use is inherently "positive"
  – "Do not know" is neither positive nor negative
• All respondents SA or A: "Needs are being met"
• Mostly SA or A: "Not as emphatically positive; perhaps there is room for improvement"
• Mostly neutral, D, or SD: "Area for concern"
• Large number of DNK: "Lack of awareness of this item"
• Mixed responses: "Lack of strong feelings about this item… why?"
• Split between D/SD and A/SA: "Much disagreement between staff, and an area of concern"
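Purely as an illustration, the interpretation heuristics above could be expressed as a rough classifier for one item's response profile. The cutoff values are assumptions for the sketch and are not part of STNA.

def interpret(counts):
    # counts: responses to one item, keyed by "SA", "A", "N", "D", "SD", "DNK".
    total = sum(counts.values())
    agree = counts.get("SA", 0) + counts.get("A", 0)
    disagree = counts.get("D", 0) + counts.get("SD", 0)
    dnk = counts.get("DNK", 0)
    if agree == total:
        return "needs are being met"
    if dnk / total >= 0.25:                               # assumed cutoff for a "large number" of DNK
        return "lack of awareness of this item"
    if agree / total >= 0.4 and disagree / total >= 0.4:  # assumed cutoff for a split profile
        return "much disagreement between staff; an area of concern"
    if agree / total >= 0.6:                              # assumed cutoff for "mostly SA or A"
        return "not as emphatically positive; room for improvement"
    if (counts.get("N", 0) + disagree) / total >= 0.6:    # assumed cutoff for "mostly neutral, D, or SD"
        return "area for concern"
    return "mixed responses; lack of strong feelings about this item"

print(interpret({"SA": 10, "A": 8, "N": 2, "D": 1, "SD": 0, "DNK": 1}))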
29. How can findings be used?
• Plan your technology program implementation
  – Incorporate into your school improvement plan or technology plan
  – Define your priorities
  – Plan professional development
  – Allocate resources – funding, staffing, infrastructure
• Clarify your technology program implementation steps
  – Provide rationale for your goals and objectives
  – Connect to your strategies
  – Make your case with needs data
• Repeated uses track changes in the school STNA "profile" over time
• Ultimately, uses are driven by the questions you are looking to answer
30. What is the STNA process?
• Currently free, thanks to Friday Institute support
• Convene your team, decide if you need STNA data and how it will be used
• Communicate with staff for buy-in
• Email Danny at daniel.s.stanhope@gmail.com with:
  – Names of each school participating
  – Opening and closing dates
  – An accurate count of expected respondents
• Have all staff working with students complete STNA – goal is 100% response rate
• Coordinate STNA completion and track who has completed the survey
• Reports are distributed shortly after STNA closes
• Reconvene team to make decisions based on STNA results
31. Technology Integration Measures
ISTE Classroom Observation Tool (ICOT®)
http://icot.craftyspace.com/
- Register (create an account and confirm it).
- Install tool (Adobe AIR runtime needed for installation).
- Record observations (online or pencil-and-paper).
- Export and analyze data.
- Help (PDF file)
Online tutorial:
http://iste.acrobat.com/p92443979/
33. Technology Integration Measures (continued)
North Central Regional Technology in Education Consortium Scoring Guide for Student Products
– create scoring guides to evaluate a variety of student products
– export scoring guides to PDF
Description and directions:
http://www.ncrtec.org/tl/sgsp/teachers.htm
Tool page: http://goal.learningpt.org/spsg/GetProd.asp
35. Why use the LoFTI Instrument?
• Determine how technology is being used school-wide.
• Record instances of particular uses of technology, not "how well" it is used.
• This is not a teacher evaluation tool.
36. Who Can Use the LoFTI Instrument?
• The observer can be any member of the school staff.
• Observers should be trained before conducting observations using the LoFTI instrument.
37. Using the Protocol
• The observer's presence will undoubtedly influence what they observe.
• Be mindful of the observer's placement and interactions with students and teachers.
• Observers should record only what is observed during the actual visit.
38. Using the Protocol
• Visits are not scheduled.
• Keep a separate record of visits.
• Avoid typical non-instructional times – beginnings, endings, or transitions.
• Visit a given room at different times and days of the week.
41. Additional Data Sources
• Teacher Lesson Plans
• Teacher Reflection Logs
• Technology Use Logs
• Rubrics
42. Using the Data to Effectively Plan Professional Development
As a whole group, identify professional development opportunities that seem to be a good match for the needs identified in the STNA results.
43. Effective Professional Development
• Fosters a deepening of subject-matter knowledge, understanding of learning, and appreciation of students' needs
• Centers around the critical activities of teaching and learning
• Engages educators in professional learning communities
• Is sustained, intensive, and woven into the everyday fabric of the teaching profession