Consider the Evidence: TPU TICs
1. Consider the Evidence: data analysis and evidence-driven decision making for TICs of Teen Parent Units. 27 August 2009. Naomi Kinnaird, Centre for Educational Development.
6. Evidence-driven decision making
Just having data offers very little. For a skilled leader, however, data can be a compelling force for improving schools. The value associated with data comes from being able to discern the quality of the data and to organise it, think about what it might mean and use it to make decisions. This is a human activity that requires capturing and organising ideas and turning the information into meaningful actions (Senge, 1990).
15. Data are one form of evidence: how classes are compiled, how classes are allocated to teachers, test results, teachers’ observations, attendance data, portfolios of work, student opinions …
18. Evidence-driven decision making
We have more evidence about what students know and can do than ever before: their achievements, their behaviours, and the environmental factors that influence learning. We should draw on all our knowledge about the learning environment to:
- improve student achievement
- explore what lies behind patterns of achievement
- decide what changes will make a difference
24. Student achievement
What evidence do we have now about student achievement? What other evidence could we collect?
- National assessment results
- Standardised assessment results administered internally
- Other in-school assessments
- Student work
25. Perceptions
What evidence do we have now about what students, staff and others think about the school? Are there other potential sources?
- Self appraisal
- Formal and informal observations made by teachers
- Structured interactions
- Externally generated reports
- Student voice
- Other informal sources
26. School processes
What evidence do we have about how our school is organised and operates?
- Timetable
- Classes
- Resources
- Finance
- Staffing
27. Other practice
How can we find out about what has worked in other schools?
- Documented research
- Experiences of other schools
28. The evidence-driven decision making cycle
- Trigger: clues found in data, hunches
- Explore: is there really an issue?
- Question: what do you want to know?
- Assemble: get all useful evidence together
- Analyse: process data and other evidence
- Interpret: what information do you have?
- Intervene: design and carry out action
- Evaluate: what was the impact?
- Reflect: what will we change?
29. The evidence-driven decision making cycle
- SPECULATE: a teacher has a hunch about a problem or a possible action
- TRIGGER: data indicate a possible issue that could impact on student achievement
- EXPLORE: check data and evidence to explore the issue
- QUESTION: clarify the issue and ask a question
- ASSEMBLE: decide what data and evidence might be useful
- ANALYSE: process the data and evidence
- INTERPRET: draw out insights that answer your question
- INTERVENE: plan action to improve student achievement
- ACT: carry out the intervention
- EVALUATE: the impact of the intervention
- REFLECT: on what has been learned and how practice will change
30. The evidence-driven decision making cycle: a worked example
- Trigger: significant numbers not achieving well in reading.
- Speculate: a teacher has a hunch that poor readers might spend little time on homework.
- Explore data: a survey of students shows that this is only partially true.
- Question: what are the characteristics of students who are poor at reading?
- Assemble more data and other evidence: Probe reading, homework, extracurricular activities, attendance, etc.
- Analyse: asTTle reading (standardised) results; analyse non-standardised data and evidence.
- Interpret the information: poor readers are likely to play sport, read less and do little homework.
- Intervene: create multiple opportunities for reading; include topics that can use sport as a context; connect reading with curriculum areas; PD for staff.
- Evaluate: has reading improved?
- Reflect: how will we teach reading in the future?
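For TICs who keep student records in a spreadsheet, the analyse step in a scenario like this can be as simple as splitting students into groups and comparing averages. The sketch below is purely illustrative and is not from the original presentation: the records, the field layout and the score threshold of 50 are all invented, chosen only to show the shape of the comparison between poor and good readers.

```python
from statistics import mean

# Hypothetical student records (illustrative values only):
# (reading_score, homework_minutes_per_week, plays_sport)
students = [
    (35, 20, True), (42, 30, True), (38, 15, True), (40, 25, False),
    (72, 90, False), (68, 60, True), (75, 120, False), (80, 100, False),
]

THRESHOLD = 50  # assumed cut-off separating "poor" from "good" readers

poor = [s for s in students if s[0] < THRESHOLD]
good = [s for s in students if s[0] >= THRESHOLD]

# Compare the two groups on homework time and sport participation.
print("poor readers: avg homework =", mean(s[1] for s in poor),
      "mins; play sport:", sum(s[2] for s in poor), "of", len(poor))
print("good readers: avg homework =", mean(s[1] for s in good),
      "mins; play sport:", sum(s[2] for s in good), "of", len(good))
```

In practice you would load the school’s own exported data (for example a CSV of asTTle results joined to homework survey responses) rather than typing records in by hand.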
31. The length of the cycle will vary for different situations. We might wait a year to evaluate the effects of our actions - but sometimes we’ll be able to (and ought to) work to shorter (or maybe longer) cycles. It’s important that we reflect, evaluate and make professional judgements at each stage of this cycle. Now we will invent scenarios that might apply in our school (or department). Do this in groups. Draw up your scenario as a cycle as in this slide. The next slide provides a blank template for this exercise; you might like to photocopy it for groups to fill in. A sample scenario is given in the following slide.
[Cycle diagram: SPECULATE, TRIGGER, EXPLORE, QUESTION, ASSEMBLE, ANALYSE, INTERPRET, INTERVENE, ACT, EVALUATE, REFLECT]
32. Evaluate and reflect
Summative evaluation: assess how successful the intervention was; decide how our practice will change; report to the board.
Formative evaluation: at every stage in the cycle we reflect and evaluate. Are we on the right track? Do we need to fine-tune? Do we actually need to complete this?
36. Trigger questions
- How good/poor is …?
- What aspects of … are good/poor?
- Is … actually changing? How is … changing?
- Is … better than last year?
- How can … be improved?
- Why is … good/poor?
- What targets are reasonable for …?
- What factors influence the situation for …?
- What would happen if we …?
Formative or summative?
37. Questions from hunches
- I suspect this poor performance is being caused by … Is this true?
- We reckon results will improve if we put more effort into … Is this likely?
- I think we’d get better results from this module if we added … Is there any evidence to support this idea?
38. Questions with purpose
What do we know about attendance for TPU students? This may be better asked as:
- Who has been absent? When? Why? Where have they been? How long?
- What are students telling us?
- What do pastoral care data tell us?
- Were some interventions more effective with some students or groups of students than others?
39. Professional decision making
We have evidence-based information that we see as reliable and valid. What do we do about it? If the information indicates a need for action, we use our collective experience to make a professional decision.
40. Professionals making decisions
You asked what factors are related to poor student performance in formal writing. The analysis suggested that poor homework habits have a significant impact on student writing. You make some professional judgements and decide:
- Students who do little homework don’t write enough.
- You could take action to improve homework habits - but you’ve tried that before and the success rate is low.
- You have more control over other factors, like how much time you give students to write in class.
So you conclude: the real need is to get students to write more often.
41. Deciding on an action
Information will often suggest a number of options for action. How do we decide which action to choose? We need to consider:
- what control we have over the action
- the likely impact of the action
- the resources needed
42. Planning for evaluation
- What evidence do we need to collect before we start?
- Do we need to collect evidence along the way, or just at the end?
- How can we be sure that any assessment at the end of the process will be comparable with assessment at the outset?
- How will we monitor any unintended effects?
Don’t forget evidence such as timetables, student opinions, teacher observations …
43. Evaluate the impact of our action
- Did the intervention improve the situation that triggered the process?
- Was any change in student achievement significant?
- What else happened that we didn’t expect?
- How do our results compare with other similar studies we can find?
- Does the result give us the confidence to make the change permanent?
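To help answer whether any change in achievement was significant, matched pre- and post-intervention scores for the same students can be compared. The sketch below is not from the original presentation; the six paired scores are invented for illustration, and a real evaluation would use the school’s own assessment data and, ideally, a formal significance test.

```python
from statistics import mean, stdev

# Hypothetical matched pre/post reading scores for the same six students
# (illustrative values only, not real data).
pre  = [38, 42, 35, 47, 40, 44]
post = [45, 48, 37, 55, 46, 49]

# Per-student gains, since the same students sat both assessments
gains = [b - a for a, b in zip(pre, post)]

avg_gain = mean(gains)                  # average improvement in points
effect_size = avg_gain / stdev(gains)   # Cohen's d computed on the paired gains

print(f"average gain: {avg_gain:.1f} points")
print(f"effect size (d): {effect_size:.2f}")
```

Using per-student gains (rather than comparing the two group means directly) keeps the comparison fair when the same students sit both assessments, which is the comparability concern raised in the planning questions above.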
44. Future practice
- What aspects of the intervention will we build into future practice?
- What aspects of the intervention will have the greatest impact?
- What aspects of the intervention can we maintain over time?
- What changes can we build into the way we do things in our school?
- Would there be any side-effects?
45. What now? How can we apply this model in our TPU? Develop a specific task to review / implement (in regard to evidence / data) for learning in your TPU. What evidence already exists? How is this collected / recorded / analysed and used in the interests of improving student achievement?
Editor’s notes
This session will help us to think about how we can make decisions based on evidence in a structured and informed way. In a moment we’ll discuss what is meant by ‘data and other evidence’. We can apply today’s material to student achievement at all secondary levels - not just senior – and to all curriculum learning areas and school processes.
First, let’s look at the questions you were asked to consider in preparation for today. Table mat sharing activity. Feedback to whole group.
So what is evidence-driven decision making?
Optional introductory activity. This scenario is just to get us started. There’s nothing mysterious about evidence-driven decision making. We all make decisions every day based on an analysis of a number of factors. In this scenario you’d analyse the factors and make a decision in seconds (or you’d go hungry). What other factors might you consider before buying lunch? For example: Who are you eating with? How much do you want to spend? What did you have for breakfast? How hungry are you? Are you on a special diet? What else do you need to do this lunchtime? Who do you want to avoid this lunchtime?
The aim here is to demonstrate (and acknowledge) that teachers look for and use a variety of evidence as a normal part of effective and reflective teaching. The conclusions the teacher reaches are not as important here as the investigative approach he uses. Teachers continually consider what they know about students. This story told by Ana’s history teacher is typical. It’s not a story about a formal investigation; it’s just the sort of thing good teachers do all the time. It might have taken place over just a week or two and taken very little of the teacher’s time. This teacher had a ‘hunch’ based on his general professional observations. He informally compared a range of evidence to see if his hunch was correct. It was. He wanted to find a way to improve this one aspect of Ana’s achievement. He considered other evidence and analysed it. This enabled him to pinpoint the problem and plan a course of action designed to improve Ana’s achievement. This teacher was thinking about the data and other evidence he had right there in front of him - and then he acted on his conclusions. The teacher used evidence-driven decision making, using data and other evidence to inform his actions. In this session we want to see how to expand (and systematise) that sort of thinking to drive improvement in your TPU.
What is meant by ‘data and other evidence’?
‘Evidence’ is used here in the same way that it’s used in courts of law and in standards-based assessment. Like all schools, we have access to a lot of ‘data’ about student achievement and student behaviour – test results, attendance patterns, etc. But we have access to a lot more information than what is normally thought of as ‘data’. In this session we want to be aware of all the ‘evidence’ we have access to. Some of this evidence is ‘data’ - but some (like student opinions and teachers’ observations) can’t be easily processed in the way we process ‘data’ – so it’s best called ‘evidence’. If there are concerns about the use of jargon, here’s a way to discuss the issue: whenever people come to grips with new ideas, they might have to learn new terms or give special meaning to existing words. This happened with curriculum and assessment developments – but most teachers (and parents) are now familiar with terms and concepts like strands, levels and credits. The language of computing is another good example.
This resource treats the word ‘data’ as a plural noun – hence ‘data are …’. There’s nothing new here - but for some of you, this will be quite a narrow definition of data. We could have a discussion about what constitutes data – for example, do all data have to be coded in some way (e.g. as a number)? But I suggest we accept this distinction for the purposes of this session. The main point is: if we want to improve student achievement, we can look at a lot more than what we traditionally think of as data.
We all know that we can make significant improvements to teaching and learning by analysing data and applying what we learn from it. But an issue for many schools is that they have too much data in some areas – too much evidence – and not enough time and resources to use it all effectively. In other areas, we have too little evidence. Any change in teaching practice is a risk - you can never be entirely sure of the consequences. The approach we are looking at today shows us how to decide what we should change to improve student achievement. Evidence-driven decision making can help by reducing or assessing risk, and maybe by pointing out changes that have lesser risk. No School Left Behind – jigsaw activity.
Introductory slide – the next five slides deal with each category in turn. All schools have data about student achievement. To make the most of these data, we need to be aware of many other factors - evidence that describes our students’ wider learning environment. We have so much evidence that it’s useful to categorise it in some way. The approach we are taking today separates all data and other evidence into these five categories.
Demographics – also known as profile data – objective data that describe our school and its students, staff and community – decile, gender, suspensions, etc. You should make the point that data and other evidence should be generated only if it’s for a purpose. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
Student achievement data and other evidence - much of this is readily available – from national assessments, standardised testing we carry out in the school, portfolios of student work, etc. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
In many schools there will be little of this sort of evidence, so you might spend more time on this. Perceptions - evidence of what staff, students and others think about the school - probably the most subjective evidence, but much of it will be factual and collected in formal ways - student self appraisal, formal and informal observations made by teachers, etc. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
Some teachers may not think of school processes as evidence that can be used in decision making – you might need to skip forward to later slides that provide examples. School processes - how our school is organised and operates – the timetable, resources, etc. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point.
Other practice – we should look at the experiences of others - documented academic research, the experiences of other schools, etc. We can access a lot of this material from Te Kete Ipurangi (the TKI website) and from Educational Leaders. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point. Let’s take a look at some data and how we might analyse it. Use these categories to discuss what sort of evidence you used in the exercise – and what other evidence could be used to extend the example.
A copy of the text below is included with the handouts. It’s useful to think of the cycle as having sequential stages:
Trigger: Data, ideas, hunches, etc. set a process in action. The trigger is whatever makes you think there could be an opportunity to improve student achievement. You can routinely scan available data looking for inconsistencies, etc. It can be useful to speculate about possible causes or effects, and then explore data and other evidence to see if there are any grounds for the speculation.
Explore: Initial data, ideas or hunches usually need some preliminary exploration to pinpoint the issue and suggest good questions to ask.
Question: This is the key point: what question/s do you want answered? Questions can raise an issue and/or propose a possible solution.
Assemble: Get together all the data and evidence you might need; some will already exist and some will have to be generated for the occasion.
Analyse: Process sets of data and relate them to other evidence. You are looking for trends and results that will answer your questions (but watch out for unexpected results that might suggest a new question).
Interpret: Think about the results of the analysis and clarify the knowledge and insights you think you have gained. Interrogate the information; it’s important to look at it critically. Were the data valid and reliable enough to lead you to firm conclusions? Do the results really mean what they seem to mean? How sure are you about the outcome? What aspects of the information lead to possible action?
Intervene: Design and implement a plan of action designed to change the situation you started with. Be sure that your actions are manageable and look at the resourcing needed. Consider how you’ll know what has been achieved.
Evaluate: Using measures you decided in advance, assess how successful the intervention has been. Has the situation that triggered the process been improved? What else happened that you maybe didn’t expect?
Reflect: Think about what has been learned and discovered. What did we do that worked? Did this process suggest anything that we need to investigate further? What aspects of the intervention can be maintained? What changes will we make to our practices? What support will we need?
The length of the cycle will vary for different situations. We might wait a year to evaluate the effects of our actions - but sometimes we’ll be able to (and ought to) work to shorter (or maybe longer) cycles.It’s important that we reflect, evaluate and make professional judgements at each stage of this cycle.
A sample scenario.
Handout on A3 sheet.
A reminder that could be used at many points / useful to flick back to the previous slide / when discussing reflection at various stages of the process. We should pause here and think about evaluation and reflection. The final stage in this cycle is summative evaluation - we assess how successful the whole process was and reflect on whether we will change our future practice. But at every stage in this cycle we should be reflecting, evaluating in a formative way and making professional judgements about where to go next. We need to be sure that we are on the right track. Should we fine-tune the process as we go? Many schools are consciously developing a ‘culture of inquiry’ – an open and supportive environment in which staff and the school community regularly reflect on the way the school operates, one in which calculated risk-taking is seen as an essential ingredient of innovation. A cyclical improvement process is iterative – incremental changes are incorporated into the knowledge base and into professional practice and feed into the next cycle. There is a compounding effect - change becomes the trigger for more questions. This resource can be used as a contribution to that approach.
This slide simply lists the common types of analysis used in schools. The Consider the Evidence web pages and the full presentation provide examples for each bulleted point. Teachers will be familiar with these three types of data analysis – we introduce them through discussing questions because asking the right questions is a major theme of this resource. The way we analyse data depends on the question we are trying to answer … let’s look at some examples.
You may use this slide without exploring the summative / formative issue. Questions that trigger the process can relate to student achievement or behaviour, teaching approaches and school processes - like the ones on this slide. Questions can be described as summative or formative. Let’s think about the questions on this slide: which of them are summative and which are formative? You could have a discussion about the purpose of summative and formative questions – when is each type of question useful? Summative questions give us end-of-process results, often suitable for reporting and accountability. Formative questions are intended to provide more immediate feedback to improve teaching and learning, so they are probably more specific.
The colloquial term ‘hunch’ is used here to recognise how intuitive teachers can be. The aim is not to belittle hunches - they are extremely useful. In fact most hunches are based on sound professional experience and observation, but until they have been tested against evidence, they remain hunches. In terms of improving particular aspects of teaching and learning, many of the most pertinent questions come from our ‘hunches’. Some of our hunches will be based on a hypothesis or a speculation about a possible change. Teachers, like detectives, base hunches on professional observations – it’s just that you haven’t yet tested them against some evidence.
Once we get into the process, before we start assembling the evidence we plan to analyse, we need to be sure that we are asking good questions. The initial question on this slide can be answered quite easily, but what use will the answer be? More purposeful questions are likely to lead to information we can act on.
The aim here is to point out that the analysis will not always point to an obvious intervention. Teachers still need to make professional decisions. Two examples are provided in later slides. Let’s assume that we have carried out the analysis and we have some information we might act on. What we decide to do as a result of the information we get from the analysis is guided by our professional experience.
Example: Teachers decided that the poor performance in writing was caused not by homework habits but by the total amount of writing students do. Professional judgements lead to the conclusion that the action required is to ensure that students do more writing.
Return to the slide that shows the full improvement cycle to remind the group which stage we are at. We have analysed some evidence and decided what sort of intervention might improve the initial situation. Remember, what we are going to do is to try out a change, then evaluate it to see if it worked. But first we need to plan the intervention well. These are the factors we need to consider when we are planning an intervention:
Control - What aspects of the situation do we have most control over? Do we run a limited pilot rather than a full-scale intervention?
Impact - What can we do that is most likely to have the desired impact? Do we play it safe and intervene only where we know we can make a major difference?
Resources (time, money, people) - What will we need? What do we have? What other activities could be affected if we divert resources to this project?
At the end of the intervention, we will need to evaluate its effect. Did it work? Some questions to consider as we think about the data we will need to evaluate the impact of our intervention.
When we evaluate the impact of our intervention, we need to ask the same sort of questions that we asked earlier in the process. That is, we need to interrogate our evaluation. We must not leap to conclusions. The final question in this slide is the crucial one – has the intervention been effective enough to justify embedding the change in normal practice?
Even if things didn’t go exactly as we planned them – even if student achievement wasn’t greatly improved - there are probably some things we have learnt that we should incorporate into future practice. We need to be realistic about what we can achieve – we need to be sure we could maintain the intervention. We should also think about any side-effects – if we put time and effort into this change, will anything else suffer?
Let’s consider how to apply this model of evidence-driven decision making in your TPU. What evidence already exists in the TPU? How is it collected and recorded, and how well equipped are you to analyse and use it in the interests of improving student achievement? Where are you on the cycle? Develop a specific task / goal to review / implement in regard to evidence / data for learning in your TPU.