Getting the Most out of Evaluations: A Guide to Successful Evaluation Utilization
Prepared for: The Department of Health and Human Services
Assistant Secretary for Planning and Evaluation
TORP #07EASPE000098
How Are Findings from HHS Program Evaluation Used?
Submitted by: The Lewin Group
Date: June 30, 2009

Table of Contents

Purpose of this Guide
    What is Evaluation Utilization?
    Where Does This Information Come From?
    How to Increase Use of Evaluation Findings?
Approach 1—Involve Stakeholders in the Evaluation
    1. Solicit Input on Evaluation Design from a Wide Range of Stakeholders
    2. Circulate Early Evaluation Findings
Approach 2—Prioritize Evaluation Quality
    1. Explore the Full Range of Options for Addressing Evaluation Questions
    2. Place a Premium on the Evaluator’s Knowledge
Approach 3—Plan and Execute Several Dissemination Methods
    1. Review Draft Reports for Clarity
    2. Target Dissemination Method for Each Audience
    3. Specify Multiple Deliverables Staggered Throughout the Evaluation Contract
    4. Budget Dissemination Activities
Approach 4—Encourage and Support an Evaluation Culture
    1. Encourage/Institutionalize Cross-Agency Evaluation Forums
    2. Submit All Agency Evaluations to the Policy Information Center (PIC)
    3. Ask for Support from Agency/Department Staff
    4. Encourage Robust Local Evaluations

Purpose of this Guide
This guide provides practical action steps that can be taken by project officers and other agency
staff to maximize the extent to which evaluation findings are utilized.

What is Evaluation Utilization?
There are several ways in which findings from an evaluation can be used. Findings may be
used to make discrete decisions about a program, help educate decision-makers, or contribute
to common knowledge about a program, which may in turn influence later decisions about
similar types of programs. Broadly, evaluation utilization can be divided into two primary
categories: direct use and conceptual or indirect use. Both are valuable and should be sought
as the end result of any evaluation.
Direct use of evaluations involves decisions about program funding, the nature or operation of
the program, and/or program management. Direct use is relatively rare, and individual
evaluations are not typically the sole reason for discrete decisions about a program/policy.
Rather, decision-makers use information from a variety of sources, one of which is evaluation.
More frequently, evaluation findings contribute to common knowledge about a program,
strategy, or program structure; provide legitimacy for a policy or program; or influence other
institutions or events beyond the program being evaluated (conceptual/indirect use). Although
this type of use may not be visible immediately, it may become apparent when decision-makers
use their new understanding to influence future policy. Long-term influence can occur when
an evaluation, or an accumulation of evaluations, adds to the continual process of dialogue
and learning about a program or strategy over time.

Where Does This Information Come From?
Information presented in this guide draws largely from The Utilization of DHHS Program
Evaluation: A Preliminary Examination. This study, commissioned by ASPE, involved a
comprehensive review of previous literature, a survey of HHS project officers and program
managers, and focus groups with senior staff at several HHS agencies. Following the
recommendations outlined in this guide will help increase the likelihood that findings from
evaluations will be utilized and help evaluation meet its core goal of program improvement.

How to Increase Use of Evaluation Findings?
We recommend maximizing evaluation utilization using four primary approaches:
1. Involve Stakeholders in the Evaluation,
2. Prioritize Evaluation Quality,
3. Plan and Execute Several Dissemination Methods, and
4. Encourage and Support an Evaluation Culture.

Each approach provides multiple practical steps that can be taken to ensure that evaluation
findings are used. Each step is supported by the literature on evaluation utilization and findings
from the study of HHS program evaluation utilization.

Approach 1—Involve Stakeholders in the Evaluation
An evaluation stakeholder is anyone with a vested interest in the outcomes of an evaluation. Stakeholders
can include, for example, Federal Department or agency staff, Congress, State legislators and
State governors, community organizations, State and local program managers and staff, the
research community, consumers and users of the services being studied, and the general public.
A wide body of research has found that the involvement of stakeholders in the evaluation
process increases evaluation utilization. There are several reasons for this; when stakeholders
are involved in the evaluation:
• Stakeholders’ knowledge about the evaluation is increased,
• The study is tailored to the information needs of the stakeholders, and
• The credibility and validity of the evaluation are increased from the perspective of decision-makers.
Findings from the study of HHS evaluation utilization confirmed that stakeholder involvement
increased evaluation use. Specifically, use was higher when stakeholders were involved in the
evaluation design or previewed findings, and utilization was higher when project officers were
involved in the evaluation design. Most evaluations included in the study identified at
least one stakeholder. While the contracting process presents some barriers to
involving outside stakeholders, there are a number of steps that may be taken to increase
stakeholder involvement at different stages.

1. Solicit Input on Evaluation Design from a Wide Range of Stakeholders
Greater efforts to engage key stakeholders in the evaluation design process may lead to greater
utilization by identifying key questions of interest to various stakeholders and building demand
for the evaluation results. Recommended strategies include:
• Use Requests for Information (RFIs). Issue RFIs in the procurement process to solicit
input from the stakeholder community. RFIs can take several forms. An RFI can be a
request for comments on strengths and weaknesses of a
program concept. Alternatively, agencies may request a capability statement from
contractors that includes both their approach and the perceived structure for the proposed
concept. While the latter is primarily directed to the contractor community, the former can
be shared with a broader range of stakeholders. The use of RFIs prior to the development
of a Request for Proposal (RFP) can expand the level of input from a broad range of
stakeholders about the evaluation design.
• Use Technical Work Groups (TWGs) to Incorporate Expert Advice. The use of TWGs is
another effective way to include stakeholders in the evaluation. Using TWGs, especially in
the early stages of an evaluation, can help ensure that stakeholders have input when the
evaluation research questions and key design features are defined. Including a program
champion within the TWG also helps keep critical issues at the forefront of the evaluation.
• Publicize the Agency’s Evaluation Plan. By publishing the agency’s evaluation plan for
the year early, the agency can obtain input on the overall evaluation agenda from a wide
range of stakeholders. For example, an agency in the Labor Department conducts a full-day
meeting to present and discuss its research and evaluation plans with key stakeholders.

2. Circulate Early Evaluation Findings
Interim reports and early findings can take the form of short briefs, design reports, and annual
reports that reflect progress toward short-, intermediate-, and long-term goals. More can be
done to include stakeholders both internal and external to HHS in reviewing these products
throughout the evaluation process:
• Involve agency staff in the review of early findings. By reviewing findings early,
program staff can ensure that the appropriate and relevant questions are being addressed.
• Require interim products and short briefs as project deliverables. Producing interim
briefs helps to remind the stakeholder community that the evaluation is in progress, creates
interest, and fills the “information void” often created when findings are delayed by
lengthy evaluations and clearance processes.
• Explore and take advantage of forums for sharing early findings with key stakeholders.
Agency staff should seek (and be encouraged) to participate in conferences, debriefings,
and cross-site meetings to encourage sharing across sites and agencies.

Approach 2—Prioritize Evaluation Quality
In the existing literature on evaluation utilization, evaluation quality is one of the most
frequently cited factors for ensuring utilization. Examples of evaluation characteristics
considered indicative of high quality include a lack of technical or methodological flaws,
findings published in peer-reviewed journal articles, experienced evaluators, and
evaluators who are knowledgeable about the program area. In the survey of project officers, all of
the evaluations included in the study were rated as moderate, high, or very high in terms of
quality. However, even within this narrow rating band, findings from evaluations with higher
ratings were more likely to be used.
In addition to rating the overall quality of an evaluation, project officers were asked to rate (1)
the evaluation experience of the evaluator and (2) the evaluator’s knowledge of the program
area. As with the evaluations, project officers tended to rate evaluation and program area
expertise high. There was an increased likelihood of third-party citations for those evaluations
where the evaluator was considered very knowledgeable versus only moderately
knowledgeable in these two areas. Senior staff at several HHS agencies also reported that
findings were most likely to be used when (1) the evaluation method was rigorous and (2)
replicable promising practices were identified.

1. Explore the Full Range of Options for Addressing Evaluation Questions
There is a time and place for all types of evaluation; consider what the purpose of the
evaluation is and what it will be used for when deciding on an evaluation design.
• Fund Design Studies. Design studies for large evaluation efforts help explore the full
range of options for addressing the evaluation questions. Such studies allow for
investigation into the evaluation approach, data collection, data availability, sample size,
statistical power, and evaluation challenges (a power calculation of this kind is sketched
after this list). They also permit more extensive input from stakeholders (TWG members,
consultants, and agency staff) informed in both method and subject area.
• Use the most rigorous evaluation method possible consistent with the study objectives.
More rigorous designs, particularly experimental methods, are more likely to be able to
detect modest impacts and are generally viewed as more reliable in the evaluation
community. Strong implementation studies should also be prioritized to describe program
and participant characteristics and to document promising practices. Implementation
research serves to help interpret results for impact studies. However, experimental
evaluations are not appropriate for all research questions. Give particular attention to the
type of evaluation that is appropriate for the questions to be answered. Regardless of the
evaluation method used, devote time up front to ensure the fidelity of the evaluation
design.
• Encourage agency staff to engage in evaluation training opportunities. Project officers
and other agency staff are often responsible for designing the specifications of evaluations.
Therefore, it is important that they are aware of other evaluations being conducted in the
field and evaluation methods being used. The more knowledgeable these staff are about
evaluation, the more likely the evaluation design will be high in quality.
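
To make the design-study point concrete, the following is a purely illustrative sketch (not part of the original guide) of the kind of sample-size and statistical-power arithmetic a design study might work through for a two-arm experimental evaluation. The effect size, significance level, and power target are assumed values chosen for the example, and the statsmodels library is just one of several tools that could perform the calculation.

```python
# Illustrative sketch only: estimating how many participants a two-arm
# experimental evaluation would need in order to detect a modest impact.
# All inputs below are assumptions for the example, not values from the guide.
from statsmodels.stats.power import TTestIndPower

power_analysis = TTestIndPower()

# Assumptions: a modest standardized effect (Cohen's d = 0.2), a two-sided
# 5% significance level, and the conventional 80% power target.
n_per_arm = power_analysis.solve_power(effect_size=0.2, alpha=0.05, power=0.8)

print(f"Participants needed per study arm: {n_per_arm:.0f}")  # roughly 394
```

Because the required sample grows roughly with the inverse square of the detectable effect, halving the effect of interest roughly quadruples the sample; this is why design studies weigh detectable effect size against data availability before an evaluation is procured.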

2. Place a Premium on the Evaluator’s Knowledge
When selecting an evaluator, place a premium not only on the evaluation knowledge held by the
proposed evaluation team but also on their program knowledge. In-depth evaluator
knowledge of program issues and context may be critical to framing research questions and
interpreting and presenting findings. Contractor staff new to the agency may lack institutional
knowledge and familiarity with agency jargon and administrative data. One focus group
reported that it often received poor evaluations from new contractors. It also commented that
the first 6 to 12 months of an evaluation carried out by a new contractor are often devoted to the
contractor’s orientation to the agency.
Encouraging these contractors new to the agency to interview key agency personnel and
resource center staff early in a new project may mitigate these problems. Providing links
to or copies of key reports and evaluations, particularly those considered to be of high quality,
may also speed this learning process. Copies of or links to related research can also be provided
during the proposal process.

Approach 3—Plan and Execute Several Dissemination Methods
Effective dissemination methods have been found to be associated with increased utilization.
Research suggests that some effective dissemination methods include: disseminating findings to
multiple audiences, tailoring the message for each audience, circulating findings in a timely
manner, communicating findings periodically rather than only once, and presenting findings
in a format that is accessible to the intended user.
Reports, papers, and journal articles were reported as the most common dissemination methods
in the study of HHS evaluation utilization. Other common methods included conference
presentations and posts on agency websites. However, four barriers to effective dissemination
were also noted in the study:
1. Translating findings for a non-technical audience,
2. Tailoring dissemination activities to a wide range of audiences,
3. Disseminating findings in a timely manner, and
4. Assuring sufficient funding for dissemination activities.
Most agencies that were interviewed in the study said that multiple deliverables are usually
required during the course of the evaluation rather than simply a final report. However, these
agencies also said that more could be done to tailor each dissemination method to the
appropriate audience.

1. Review Draft Reports for Clarity
Agency staff can play an important role in reviewing draft reports for clear presentation,
particularly for complex or experimental studies that include statistical data. Agency staff should
also ensure that both the program design and the evaluation methodology are described
sufficiently to provide context for the findings.

2. Target Dissemination Method for Each Audience
Focus on both internal and external audiences (e.g., program/policy staff within the
Department/agency as well as the research community, evaluation practitioners, and the
general public). It is easy to overlook internal dissemination, but it is a critical step: the
internal agency audience can be one of the strongest advocates for research findings.

3. Specify Multiple Deliverables Staggered Throughout the Evaluation Contract
Addressing timing issues can improve dissemination and utilization of evaluation findings.
Several timing issues turned up in the evaluation utilization study. These included the length
of time required to complete an evaluation, the time to clear findings within the Department,
and the “policy window,” or the time when decisions are being made about a program or
approach being used in the field. Coordinating these timing issues is difficult. Therefore,
requiring multiple deliverables staggered throughout the evaluation can ensure that preliminary
findings are released early and shared with stakeholders, as well as increase the visibility of
findings. These early reports can indicate whether programs are on track and whether early
goals are being met. Additionally, different types of products, such as a technical report for the
program office and a research brief for policymakers and other audiences, can be designated
as deliverables.

4. Budget Dissemination Activities
A number of dissemination activities can be built into the initial budget of the grant or contract.
For both interim and final findings, such activities could include meetings with technical work
groups and HHS stakeholders and presentations at key conferences. Such presentations could be
tied to key deliverables discussed above.

Approach 4—Encourage and Support an Evaluation Culture
When agencies place a premium on conducting quality evaluations, findings are used more
frequently. Actions agencies can take to build evaluation capacity include:
• Prioritize evaluation, or reinforce the process of self-examination and improvement,
• Build collaborative partnerships to obtain needed data and expertise,
• Obtain analytic expertise and provide technical assistance to partners, and
• Assure data quality by improving administrative systems and carrying out special data
collections.
In the study of HHS program evaluations, evaluation culture played a role in the extent to
which findings were used directly (as described previously). When evaluation was seen as a
priority within the agency, findings were more likely to be used directly than when it was not
viewed as important to the agency. There are several steps that
can be taken to increase the importance of evaluation and foster an evaluation culture within
the agency/Department.

1. Encourage/Institutionalize Cross-Agency Evaluation Forums
HHS should consider (re)institutionalizing an internal evaluation group within the Department
chaired by ASPE. Meeting regularly to frame research questions and study methods and to
discuss findings along the way, members of the group can learn from each other and gain
perspective on each other’s agencies. Setting up a standing Department-wide group to share
agency evaluation agendas, methods, and dissemination efforts would spur cross-pollination
across the Department.

2. Submit All Agency Evaluations to the Policy Information Center (PIC)
Routinely submit evaluations conducted in the agency to the PIC. Some agencies have limited
their submissions to those funded with evaluation set-aside funds. Expanding submissions
to all evaluations would help to increase cross-agency evaluation awareness and utilization as
well as help identify current evaluation practices within the Department.

3. Ask for Support from Agency/Department Staff
Help members of the Agency/Department understand why the steps suggested in this guide
are important. Recommendations from this guide will be most effective with institutional
support and a shared dedication to evaluation.

4. Encourage Robust Local Evaluations
Emphasize the evaluation culture to grantees as well as within HHS. When HHS models the
involvement of stakeholders in the national evaluation, grantees gain insight into what is
expected at the local level.
