The role of evaluation in
mental health and
greenspace
Dr Ruth Jepson
Co-Director, Centre for Population and Public
Health Research, University of Stirling
Lead for Physical Activity, Diet and Health
Research Programme
Outline of talk
•The work at the Centre for Public Health and Population Health Research
•Why do evaluation?
•Points to consider in evaluation
•Some types of evaluation techniques and three examples
•Good sources of support/toolkits/training available to projects wishing to evaluate what they do
Centre for Public Health & Population Health Research

Programme on physical activity and diet
Focus on:
•Promoting physical activity and a healthy diet as part of everyday
behaviour
•Promoting physical activity through health professional
referrals
•Understanding the barriers to physical activity and healthy
eating in different population groups (including ethnic minority
groups)
•Encouraging people to use the outdoors to increase their
feelings of health and wellbeing (including walking and
gardening)

Three relevant evaluation projects:
•Walking groups via a GP practice
•Health effects of community gardening
•Indoor versus outdoor activities via Exercise Referral Schemes
Why do evaluation?
Evaluation can be powerful and exciting! (Hmm..)
It can help you:
•Improve services
•Understand what works and what doesn’t
•Demonstrate the difference that a project makes
•Make decisions about the best use of funding
•Have evidence for policy and decision-making
                                             (Evaluation Support Scotland)
    Can be your most important and influential tool for getting new
    or sustained funding - funders want evidence of what works
Who is evaluation for?

Evaluation stakeholders – what sort of evaluation is valued?
•Policy makers – effectiveness; what works?
•Funders – accountability/value for money
•Planning & performance – performance monitoring/targets
•Managers – process evaluations
•Researchers – knowledge building; research quality and utility
•Service users – service quality: access, experience, relevance to needs
What makes a ‘good’ evaluation?
[Word-art diagram] A good evaluation: contributes to the evidence base; influences decision making; is objective and well executed; shows the value of what we’re doing.
Before you start an evaluation
• Clarify the aims of your project
• Think about who you are targeting
• Think about how your project will effect change
• Identify what you want to achieve in the short, intermediate and long-
  term
• Decide how you will measure what you want to achieve [if possible use
  well-validated measures – don’t attempt to make up your own]
• Think about what information you will need to collect [NEVER collect
  information you don’t intend to use! – see the sketch below]
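As a concrete illustration of the last two points, the sketch below (not part of the original talk) shows one way to fix the data items for each participant in advance so that nothing unplanned gets collected. All field names and scale ranges are illustrative assumptions.

```python
# Minimal sketch, assuming hypothetical field names: decide up front exactly
# what will be recorded for each participant, and record nothing else.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ParticipantRecord:
    participant_id: str
    age: int
    gender: str
    postcode: str                                  # kept only for reach analysis
    baseline_wellbeing: Optional[float] = None     # score on a validated scale (assumed range)
    baseline_activity_days: Optional[int] = None   # days/week with >=20 min of activity

# Example record for one (invented) participant
record = ParticipantRecord("P001", age=54, gender="F", postcode="EH21",
                           baseline_wellbeing=41.0, baseline_activity_days=1)
print(record)
```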
Types of evaluation
There are 4 broad types of evaluation:
•process (which deals broadly with the processes
involved in service delivery)
•outcome (which determines whether aims have
been met & how effective the service is)
Often carried out at the same time or slightly
staggered

•impact (determining the wider implications of
the service, often comparing with an area where
no service is provided)
•economic (determining whether the service is
cost effective)
Process evaluation

•Takes place during setting up and/or delivery of the project

•Provides guidance to those who are responsible for ensuring
and improving the project’s quality

•Focuses on identifying barriers and facilitators to successful
implementation/delivery, as well as assessing whether the key
objectives have been met

•Can be used to refine/modify the delivery of the project

•Can be used to determine the effects of the project (intended and unintended)
Good process evaluation:
1.    Is carried out by people not involved in the project (makes it more
      objective and people may answer more honestly)

2.    Doesn’t make assumptions about how the project works and what it
      achieves – sometimes there are unintended consequences

3.    Uses different types of data to assess the processes
     Qualitative (talking to people) data can help identify whether the project is running as
     intended; whether it is meeting the needs of participants (staff and users); the experiences
     (positive and negative) of participants; the changes experienced (intended and
     unintended); and whether the project works as it should.

     Quantitative (numbers) data should be collected to demonstrate that you are
     attracting the ‘right’ participants (called ‘reach’). Minimum data should include all the
     variables you think are relevant. For example: age, gender, postcode, possibly health
     condition and physical activity level for all participants, plus performance data (e.g. how
     many referrals were made, activities carried out etc.) – a minimal sketch follows.
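To make the idea of ‘reach’ concrete, here is a minimal, hypothetical sketch of summarising minimum monitoring data; the field names and the definition of the target group are assumptions for illustration only.

```python
# Illustrative sketch only: summarising 'reach' from minimum monitoring data.
# The records and the rule defining the 'target group' are invented.
participants = [
    {"age": 67, "gender": "F", "activity_level": "sedentary"},
    {"age": 45, "gender": "M", "activity_level": "active"},
    {"age": 72, "gender": "F", "activity_level": "sedentary"},
]

# Reach: what proportion of those recruited are in the group the project is aimed at?
target = [p for p in participants if p["activity_level"] == "sedentary"]
reach = len(target) / len(participants)
print(f"{len(target)} of {len(participants)} participants ({reach:.0%}) were in the target group")
```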
Outcome Evaluation
Aims to answer questions such as:

•Does the project work?

•Does it provide the benefits to participants that you wanted it to?

•How do you measure success? [Where possible VERY IMPORTANT to use
questionnaires etc. that have been validated and used in the same type
of setting/population]

•If possible, you should try to measure change when people start in the
project, after a few months and after about one year.

•Gold standard would be the randomised controlled trial where you have a
control group – you can be more certain that any benefits that are seen
are due to the project and not due to other reasons.
To do a good outcome evaluation
•   Be realistic and clear about what your project is likely to achieve – make sure you are
    measuring the right outcomes (e.g. improving mental health, not curing depression)
    – ask why you think your project would impact on these outcomes
•   Collect core information on everyone who comes to the project (baseline)
•   Consistently ask same questions to all participants
•   Use proper validated tools to measure outcomes
•   Don’t collect unnecessary data (just because you think it ‘might be interesting’)
•   Follow up participants at medium and longer term (e.g. 6 months & 12 months) to
    see if change/benefit has occurred (see the sketch after this list)
•   Demonstrate effectiveness by using experimental methods (not necessary for all
    projects) – aspirational but not impossible!
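As an illustration of the follow-up point, the sketch below uses made-up scores on a hypothetical validated wellbeing scale to show how baseline and 6-month measurements on the same participants might be compared with a paired t-test. It is not an analysis from any of the projects described.

```python
# Minimal sketch with invented data: paired comparison of baseline vs 6-month
# scores on a (hypothetical) validated wellbeing measure for the same participants.
from scipy import stats

baseline  = [38, 42, 45, 40, 36, 44, 41, 39]   # invented scale scores at project start
six_month = [44, 41, 50, 47, 42, 46, 45, 43]   # same participants at 6-month follow-up

t_stat, p_value = stats.ttest_rel(six_month, baseline)
mean_change = sum(b - a for a, b in zip(baseline, six_month)) / len(baseline)
print(f"Mean change = {mean_change:.1f} points, paired t-test p = {p_value:.3f}")
```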
Musselburgh Health Walks for sedentary people
and/or people with mental health problems
An evaluation by Roma Robertson, PhD student




Process evaluation

• Is it possible to run a programme of health walks 3 times/week?

• Do health practitioners (HPs) refer to the service?

• Do sedentary patients/people with depression utilise the service?

• Can the model be improved? What positive and negative feedback is there?
Process evaluation
1. Is it possible to run a programme of health walks 3 times/week?
Yes: The programme of walks ran well for the planned 24 weeks. It was
organised by CHANGES Community Health Project and supported by 8
volunteer walk leaders, many of whom had attended the well
established wellbeing walks run by CHANGES over the years. However,
without the volunteer walk leaders it would have been difficult to
provide such frequent walks and at low cost.


2. Do health practitioners refer to the service?
The initial plan was to recruit walk participants via GP consultations.
Low recruitment numbers led to alternative recruitment strategies being
introduced (so, no!).
Process evaluation

3. Do sedentary patients/people utilise the service?

Of the 19 who participated, 13 (68%) stayed until the end of the
12 weeks, and 9 people continued to walk with the group
beyond the 12 weeks they had agreed to. This suggests that a
large proportion of participants valued the service.


4. Can the walks be improved?

There was a lot of very positive feedback.

When pressed for how the walks could be improved, walkers
indicated they would like a greater variety of walks, and a
quarter felt the walks were too short. One person found it
embarrassing to meet at the health centre and one was
worried in case someone they knew would see them on the
walks. Seven people liked going for coffee afterwards; 5
disagreed or were undecided.
Outcomes evaluation
Is there any evidence of benefits to participants?

•Collected data at baseline, 6 weeks and 6 months

•Used validated measures of health outcomes (IMPORTANT!)

•Collected data on mental wellbeing, physical activity, social networks and general health


Physical activity

Immediately after the intervention and 6 months later, more people reported that they
took at least 20 minutes of exercise on 3 days each week than before the study.


Mental wellbeing

Scores for mental health improved after 12 weeks of walks, but the
improvement had reduced and was no longer statistically significant
6 months later.
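For a sense of how a paired before/after comparison like the physical activity result above could be tested formally, here is a hedged sketch with invented numbers (the slide reports only the direction of change). McNemar’s test on the discordant pairs is one option for a yes/no outcome measured twice in the same people.

```python
# Hedged sketch, invented counts: did each participant meet the '20 minutes on
# 3 days a week' threshold before and after the walks? McNemar's test asks
# whether the paired yes/no status changed more often in one direction.
from statsmodels.stats.contingency_tables import mcnemar

# Rows: met threshold at baseline (yes / no); columns: met threshold at follow-up (yes / no)
table = [[3, 1],   # yes -> yes, yes -> no
         [7, 8]]   # no  -> yes, no  -> no
result = mcnemar(table, exact=True)
print(f"p-value for the change in the proportion meeting the threshold: {result.pvalue:.3f}")
```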
Effects of community gardening on
health outcomes
Project by Di Blackmore, PhD student


Aim: to investigate the effect of community gardens on health and
related outcomes. The objectives of this research are to:

•explore a range of health effects for the individuals

•explore mechanisms by which the community gardening project
affects health

•determine how/if the outcomes vary between the different community
gardens and other variables, such as the amount of time spent in the
garden
Methods
Intend to recruit participants near the start of their gardening
experience, and take baseline measures of stress level and physical
health: blood pressure, body mass index, activity level and salivary
cortisol.

In addition, participants will be asked to complete validated
questionnaires that examine aspects of mental wellbeing, physical
activity, quality adjusted life years, loneliness, community cohesion
and social capital.

This data will be collected at baseline, some measures again at 6
weeks and all measures at 12 weeks.

Also collecting qualitative data to explore participants’ experiences
of being involved in the projects, and how they felt they benefited
from them.
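As a purely illustrative sketch of how the repeated measures (baseline, 6 weeks, 12 weeks) might be organised for analysis, not the project’s actual data pipeline, the example below keeps the data in ‘long’ format so change over time can be summarised per measure. All column names and values are invented.

```python
# Minimal sketch, invented data: repeated measures held in long format,
# one row per participant per time point, summarised by time point.
import pandas as pd

data = pd.DataFrame({
    "participant_id": ["G01", "G01", "G01", "G02", "G02", "G02"],
    "timepoint":      ["baseline", "6wk", "12wk"] * 2,
    "systolic_bp":    [142, 138, 135, 128, 130, 126],
    "wellbeing":      [39, 42, 45, 47, 48, 50],
})

# Mean of each measure at each time point (a first look at change over time)
print(data.groupby("timepoint")[["systolic_bp", "wellbeing"]].mean())
```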
A feasibility study of Exercise Referral Scheme:
indoor versus outdoor activities
Led by Dr Larry Doi

Aim

To test the feasibility, acceptability and effectiveness of randomising patients to ERS
in either indoor or outdoor activities. YES, a randomised controlled trial!!


Research questions

1. What are the initial estimates of effectiveness of indoor versus outdoor activities
on a range of health outcomes? [OE]

2. Are there particular aspects of the outdoor exercise or indoor exercise,
delivered via an exercise referral scheme, that confer specific health benefits? [OE]

3. Do the patients have strong preferences for the setting of physical activity? [PE]

4. What are the main barriers and facilitators to implementing the outdoor
intervention successfully? [PE]

5. What are the underlying mechanisms of action or change (why and how the
outdoor and indoor activities have an effect on health outcomes)? [PE]
A feasibility study of Exercise Referral Scheme:
indoor versus outdoor activities
Setting: Bathgate, West Lothian

Interventions
1) Indoor ERS (normal activities in leisure centre)
2) Outdoor ERS
The outdoor intervention is still to be developed, but will roughly match the indoor
intervention in duration and exercise intensity. Components may include:
1.Green gym
2.Led walks
3.Other outdoor activities
Duration of the intervention period would be 12 weeks. Participants will be asked to only do
activities in the arm of the trial to which they have been allocated. After the 12 weeks all
participants will be able to continue with the physical activities of their choice.

All the outdoor activities (e.g. green gym, led walks) will remain in place for a minimum of
48 weeks (making the duration of both interventions a year in total). All participants will be
followed up at one year.
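The following sketch illustrates, with invented patient IDs, one common way of allocating referred patients 1:1 to the indoor or outdoor arm (block randomisation keeps the arms balanced). It is not the trial’s actual allocation procedure.

```python
# Illustrative sketch only: block randomisation of referred patients to the
# indoor or outdoor ERS arm, balanced within each block of four.
import random

random.seed(2024)  # fixed seed so the allocation list is reproducible/auditable

def allocate(patients, block_size=4):
    arms = []
    while len(arms) < len(patients):
        block = ["indoor ERS", "outdoor ERS"] * (block_size // 2)
        random.shuffle(block)      # random order within the block, equal numbers per arm
        arms.extend(block)
    return dict(zip(patients, arms))

for patient, arm in allocate(["P001", "P002", "P003", "P004", "P005", "P006"]).items():
    print(patient, "->", arm)
```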
Data collected routinely by the exercise referral
team
Weight loss – BMI/abdominal girth

Physical activity – General Practice Physical Activity Questionnaire

Physical fitness – peak flow

Blood pressure – systolic BP/diastolic BP

Mental health – Hospital Anxiety and Depression Questionnaire

General health – The General Health Questionnaire (GHQ12)

Patient satisfaction – satisfaction survey

Adherence – electronic records on attendance at feedback sessions


This data is recorded electronically and is collected at several time points.
Good Evaluation Resources
Evaluation Support Scotland
http://www.evaluationsupportscotland.org.uk/

BHF Exercise Referral Toolkit
http://www.bhfactive.org.uk/sites/Exercise-Referral-Toolkit/

Standard Evaluation Framework for physical activity interventions
http://www.noo.org.uk/uploads/doc/vid_16722_SEF_PA.pdf

Learning, Evaluation and Planning (LEAP)
http://www.scdc.org.uk/what/LEAP/
Email: Ruth.jepson@stir.ac.uk


Editor’s notes

  1. Stakeholder analysis – evaluation framework (Wimbush & Watson): Policy-makers (SE) – effectiveness; outcome-oriented evaluations; what works? Strategic planners (DCE) – performance management and monitoring. Programme/project managers – objectives-based evaluations; developmental/formative evaluations. R&E specialists – knowledge-building; research quality; research utility. Service users – quality of service provision, experience/how treated, relevance to needs. But this is usually ‘messy’ – not everyone with an interest in a programme will necessarily have the same purpose or questions. Stakeholder management is a key skill for evaluators.
  2. Questions and discussion: What do you think makes a good evaluation? (data producer/user person)
  3. If taking account of and managing different stakeholders’ views/interests & questions in an evaluation, you can’t address all questions at once, SO YOU NEED TO PRIORITISE. This is where the above (pragmatic) principles come into play.