Evaluation of health programs
1. Evaluation of Health Programmes Dr Izharul Hasan, PG Scholar, Dept of TST, NIUM, Bangalore
2. Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object.
3. The Goals of Evaluation The generic goal of most evaluations is to provide "useful feedback" to a variety of audiences including sponsors, donors, client-groups, administrators, staff, and other relevant constituencies. Most often, feedback is perceived as "useful" if it aids in decision-making. But the relationship between an evaluation and its impact is not a simple one -- studies that seem critical sometimes fail to influence short-term decisions, and studies that initially seem to have no influence can have a delayed impact when more congenial conditions arise. Despite this, there is broad consensus that the major goal of evaluation should be to influence decision-making or policy formulation through the provision of empirically-driven feedback.
4. Types of Evaluation Formative evaluation includes several evaluation types:
- Needs assessment determines who needs the program, how great the need is, and what might work to meet the need.
- Evaluability assessment determines whether an evaluation is feasible and how stakeholders can help shape its usefulness.
- Structured conceptualization helps stakeholders define the program or technology, the target population, and the possible outcomes.
- Implementation evaluation monitors the fidelity of the program or technology delivery.
- Process evaluation investigates the process of delivering the program or technology, including alternative delivery procedures.
5. Summative evaluation can also be subdivided:
- Outcome evaluation investigates whether the program or technology caused demonstrable effects on specifically defined target outcomes.
- Impact evaluation is broader and assesses the overall or net effects, intended or unintended, of the program or technology as a whole.
- Cost-effectiveness and cost-benefit analysis address questions of efficiency by standardizing outcomes in terms of their dollar costs and values.
- Secondary analysis reexamines existing data to address new questions or use methods not previously employed.
- Meta-analysis integrates the outcome estimates from multiple studies to arrive at an overall or summary judgement on an evaluation question.
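Two of the summative techniques above can be sketched numerically. The snippet below computes an incremental cost-effectiveness ratio and a fixed-effect (inverse-variance) meta-analytic pooled estimate; all numbers are hypothetical illustration values, not data from any real program.

```python
# Hedged sketch of two summative calculations; every figure here is a
# made-up illustration value, not data from any real program.

def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def pooled_estimate(estimates, variances):
    """Fixed-effect (inverse-variance) meta-analytic pooling of study estimates."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, estimates)) / sum(weights)

# Hypothetical: a new programme costs $50,000 and averts 25 cases, versus an
# existing one costing $20,000 that averts 10 cases.
print(icer(50_000, 25, 20_000, 10))   # cost per additional case averted

# Hypothetical effect estimates (e.g. log risk ratios) from three studies,
# with their sampling variances; precise studies get more weight.
print(pooled_estimate([0.4, 0.6, 0.5], [0.04, 0.09, 0.02]))
```

Inverse-variance weighting is the standard fixed-effect pooling rule; a real meta-analysis would also examine heterogeneity before choosing a fixed- or random-effects model.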
6. Public health actions have expanded beyond infectious diseases to include chronic diseases, violence, emerging pathogens, threats of bioterrorism, and the social factors influencing health disparities.
7. Evaluation is the only way to separate programs that promote health and prevent injury, disease, or disability from those that do not.
8. Evaluation is a driving force for 1. planning effective public health strategies, 2. improving existing programs, and 3. demonstrating the results of resource investments.
9. Program evaluation is an essential organizational practice in public health; however, it is not practiced consistently across programs, nor is it well integrated into the day-to-day management of most programs.
10. Today, the need for program improvement and accountability continues to grow in the government, private, and nonprofit sectors. Investment in evaluation improves program quality and effectiveness.
11. Evaluation Questions and Methods Evaluators ask many different kinds of questions and use a variety of methods to address them. These are considered within the framework of formative and summative evaluation.
12. In formative research the major questions and methodologies are:
- What is the definition and scope of the problem or issue, or what's the question?
- Where is the problem and how big or serious is it?
- How should the program or technology be delivered to address the problem?
- How well is the program or technology delivered?
13. The questions and methods addressed under summative evaluation include:
- What type of evaluation is feasible?
- What was the effectiveness of the program or technology?
- What is the net impact of the program?
14. VALUE of the PROGRAM ACTIVITIES Questions regarding values involve three interrelated issues: merit (i.e., quality), worth (i.e., cost-effectiveness), and significance (i.e., importance)
15. Based on evidence, value can be assigned by answering the following questions:
- What will be evaluated? (What is the program, and in what context does it exist?)
- What aspects of the program will be considered?
- What standards must be reached for the program to be considered successful?
- What evidence will be used?
- How will the lessons learned be used to improve public health effectiveness?
16. FRAMEWORK FOR PROGRAM EVALUATION IN PUBLIC HEALTH (CDC) The framework is composed of six steps that must be taken in any evaluation: (1) engaging stakeholders, (2) describing the program, (3) focusing the evaluation design, (4) gathering credible evidence, (5) justifying conclusions, and (6) ensuring use and sharing lessons learned.
24. 3. Focusing the Evaluation Design A plan of action and strategy for the evaluation is developed. The plan should anticipate the intended uses of the evaluation data and should produce data that are useful, feasible, ethical, and accurate.
25. Among the items to consider when focusing an evaluation are purpose, users, uses, questions, methods, and agreements.
26. 4. Gathering Credible Evidence Having credible evidence strengthens evaluation judgments and the recommendations made. Aspects of evidence gathering that affect perceptions of credibility include indicators, sources, quality, quantity, and logistics.
27. Indicators Indicators reflect aspects of the program that are meaningful for monitoring. Examples of indicators that can be defined:
a) measures of program activities, e.g. participation rate, levels of client satisfaction, the efficiency of resource use
b) measures of program effects, e.g. changes in participant behavior, community norms, attitudes, health status, or quality of life
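The activity indicators named above are simple ratios and averages once monitoring data are in hand. The sketch below computes two of them from hypothetical data; the variable names and figures are assumed for illustration and do not come from any real program.

```python
# Hypothetical monitoring data; names and figures are assumed for
# illustration only, not taken from any real program.
enrolled = 480                      # people eligible / invited to the program
attended = 312                      # people who actually participated
satisfaction = [4, 5, 3, 4, 5, 4]   # client ratings on a 1-5 scale

# Activity indicators from the slide: participation rate and client satisfaction.
participation_rate = attended / enrolled
mean_satisfaction = sum(satisfaction) / len(satisfaction)

print(f"participation rate: {participation_rate:.1%}")
print(f"mean satisfaction:  {mean_satisfaction:.2f} / 5")
```

Effect indicators (changes in behavior, norms, or health status) would be computed the same way, but as differences between baseline and follow-up measurements rather than single-point ratios.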
28. 5. Justifying Conclusions Justifying conclusions on the basis of evidence, using standards, analysis, interpretation, judgment, and recommendations.
29. 6. Ensuring Use and Sharing Lessons Learned Lessons learned in the course of an evaluation do not automatically translate into informed decision-making and appropriate action. Elements critical for ensuring use of an evaluation include design, preparation, feedback, follow-up, and dissemination.
30. Standards for Effective Evaluation The program evaluation standards are guiding principles that make fair evaluations practical. They provide practical guidance when deciding among evaluation options, and they help avoid creating an imbalanced evaluation (e.g., one that is accurate and feasible but not useful, or one that would be useful and accurate but is infeasible).
31. Standards are grouped into four categories: utility, feasibility, propriety, and accuracy. Each category has an associated list of guidelines.
32. Utility standards Utility standards ensure that the information needs of evaluation users are satisfied. Seven utility standards address these items:
33. 1. Stakeholder identification 2. Evaluator credibility 3. Information scope and selection 4. Values identification 5. Report clarity 6. Report timeliness and dissemination 7. Evaluation impact
36. Propriety standards Propriety standards ensure that an evaluation will be conducted legally, ethically, and with regard for the welfare of those involved in the evaluation as well as those affected by its results:
37. 1. Service orientation 2. Formal agreements 3. Rights of human subjects 4. Complete and fair assessment 5. Conflict of interest
38. Accuracy standards Accuracy standards ensure that an evaluation will convey technically adequate information regarding the determining features of merit of the program:
39. 1. Program documentation 2. Context analysis 3. Valid information 4. Reliable information 5. Systematic information 6. Analysis of quantitative information 7. Analysis of qualitative information 8. Justified conclusions