This document discusses evaluation approaches for complex adaptive systems. It begins with an overview of complexity and the characteristics of complex systems, then presents eight questions evaluators should ask when examining projects through a complexity lens. The questions address issues such as understanding history and current priorities, accommodating diversity, influencing behavioural dynamics, and monitoring and adapting to change. The document provides examples and explanations for each question, and concludes that these questions can help evaluators contribute to the evidence base on influencing behaviour in complex systems.
Evaluation amidst complexity: 8 questions evaluators should ask
1. Evaluation amidst complexity
Eight questions evaluators should ask
Originally presented at Australasian
Evaluation Society Annual Meeting,
Melbourne, September 2015.
Revised November 2015.
Ann Larson, PhD
www.socialdimensions.com.au
2. Complexity is often invoked as a
reason for a project’s failures. My key
message is that evaluators can use a
complexity lens to understand and
facilitate success – if we know what to
look for.
3. Overview of presentation
• Framing the problem
• Explaining characteristics of complex
adaptive systems
• Eight questions to ask in an evaluation
• Closing remarks
5. Characteristics of projects introducing an evidence-informed
intervention which are affected by complexity
• Signs of resistance or lack of support among some stakeholders or intended beneficiaries
• Slow progress in starting implementation
• Little or no adoption of new practices despite different strategies to change behaviour
• No clear way to sustain gains when the project finishes
6. Context for evaluators
• Involved only at the end of the project, not at the beginning or middle.
• The task is to construct a narrative explaining success or failure that is convincing to implementers and donors.
• We must also make recommendations that are grounded in evidence.
• There is rarely a logic model or an M&E system that is particularly helpful for the evaluation.
16. An eclectic list of strategies for
projects to harness rather than control
complexity
• Flexible, long-term funding
• New behaviour grounded in relevant history and salience
• Build coalitions around a vision for change
• Understand motivations for behaviour change:
introduce accountability and incentives
• Start small, be flexible and experiment
• Balance local initiative with quality standards
• Monitor, review and act in a timely manner
18. Evaluation designs for coping with CAS
Suggestion from evaluators: Developmental evaluations
Push-back from clients: Evaluation is only budgeted at the end; takes too long; too expensive; too much like internal evaluation
Suggestion from evaluators: Intensive case studies to elucidate challenges and successes
Push-back from clients: Lacks generalisability; doesn’t address original project objectives and logic models
20. Summary of complexity-sensitive
evaluation questions
Is there evidence that project designers and implementers ….
1. Were sensitive to history and current priorities?
2. Accommodated diversity in design and implementation by
employing different approaches depending on capacity and
circumstance?
3. Understood dynamics of relevant behaviour?
4. Effectively influenced those dynamics?
5. Monitored, reviewed and took action based on regular
information?
6. Recognised and embraced emergent behaviours that supported
the intervention?
7. Responded to external change?
8. Were focused on what happens after the project?
21. 1) Has the project aligned
itself with history and
current priorities? Has that
had an effect on its
acceptability?
[Image: the shape of the Busselton sheds, built with planks from the old jetty]
22. 2) Did the project take into account the differences in
regions, local context, workforce or beneficiaries?
How could it have been more effective if it had?
Scale-up over time:
States: 2 → 19 → 19 → 19 → 19
Districts: 5 → 35 → 40 → 206 → 236
Facilities: 5 → 53 → 81 → 371 → 458
Different strategies employed in different states and facilities
24. 3) Did project designers and implementers
understand relationships and dynamics in
the workplace, families or agencies?
This can be done by being part of the culture, involving the group whose behaviour you want to change, or running small-scale pilots.
25. Projects within complex adaptive systems need to identify and work ‘with’ or
work ‘around’ all of the important components, such as supply chains.
26. 4) What did they put into place to change
those dynamics to change behaviour?
Training alone is rarely effective in changing behaviour because there are so many other influences reinforcing the old behaviour.
27. Among strategies to change behaviour affected by local norms and reinforced by
feedback loops are coaching, measures to increase accountability, rewarding
performance and removing obstacles.
29. 5) Did the project routinely collect, review and
respond to information about activities and
behaviour change?
30. 6) Did the project recognise emergent
behaviours and, if so, how did it respond?
Example: local co-option of men’s health strategies
31. 7) How did the project respond and adapt in the
face of external shocks and changes? Why was it
able to do this?
Projects need good relationships and the capacity to be flexible in responding to changes in policies, funding or security, through advocacy or problem solving at the appropriate level.
32. 8) How is the project preparing for what will
happen when it ends?
• What aspects will government or local NGOs retain?
• Will the principal goal continue to inspire action?
• How is the implementing agency altering its approach?
33. Summary of complexity-sensitive
evaluation questions
Is there evidence that project designers and implementers ….
1. Were sensitive to history and current priorities?
2. Accommodated diversity in design and implementation by
employing different approaches depending on capacity and
circumstance?
3. Understood dynamics of relevant behaviour?
4. Effectively influenced those dynamics?
5. Monitored, reviewed and took action based on regular
information?
6. Recognised and embraced emergent behaviours that supported
the intervention?
7. Responded to external change?
8. Were focused on what happens after the project?
34. Closing remarks
These questions can be used in an interview guide
with stakeholders or as a framework for analysis.
We need more evidence on how to influence
behaviour in complex adaptive systems. Evaluators
can contribute to the evidence base.
35. A personal reading list
Paina, L. and D. H. Peters (2012). "Understanding pathways for scaling up health
services through the lens of complex adaptive systems." Health Policy and Planning
27(5): 365-373.
Preskill, H. et al. (2014). Evaluating Complexity: Propositions for Improving Practice. FSG working paper.
Everything by Lant Pritchett, but especially Pritchett, L. and F. de Weijer (2010).
Fragile States: Stuck in a Capability Trap? World Development Report 2011
Background Paper. Washington DC, World Bank.
Everything by Trish Greenhalgh but especially Greenhalgh, T., J. Russell, R. E.
Ashcroft and W. Parsons (2011). "Why National eHealth Programs Need Dead
Philosophers: Wittgensteinian Reflections on Policymakers’ Reluctance to Learn
from History." Milbank Quarterly 89(4): 533-563.
Axelrod, R. and M. D. Cohen (1999). Harnessing Complexity: Organizational Implications of a Scientific Frontier.
Chandy, L., A. Hosono, H. Kharas and J. Linn, Eds. (2013). Getting to Scale: How to Bring Development Solutions to Millions of Poor People. Washington DC, Brookings Institution Press.
Sutton, R. I. and H. Rao (2014). Scaling Up Excellence: Getting to More without
Settling for Less. New York, Random House Business Books.
Editor's Notes
This presentation had its origins in work I did on scaling up national MNCH interventions for a USAID flagship program. I first did the analysis to see ‘what works’ but quickly learned that this didn’t help to explain why there was success or failure in individual projects. This was in part because there were so many different definitions of what constituted success. I started looking seriously at complex adaptive systems (CAS) and found in those writings a better way of learning what distinguished programs that had achieved positive outcomes. Now I see CAS everywhere, including my past evaluation projects, and I will draw on many examples.
Evaluators often come in at the end of a project, facing ambiguity and unresolved issues. A big part of the job is to give implementers and funders a narrative that explains obstacles and achievements and uses them to frame next steps.
My examples are pulled from donor-funded maternal and child health programs in developing countries and small to medium NGO projects in rural and remote Australia.
As I will show, you cannot plan your way out of complexity, but projects can be designed and implemented in ways that work with the advantages of CAS and minimise the ever-present risk of failure. The next slides illustrate the key characteristics that many other researchers have also found relevant for understanding how projects can change behaviour to enable better outcomes.
History matters, and decisions taken at one stage will have a profound effect on what is possible later. As a metaphor: two boulders, starting in the same place and given a push, will likely come to rest in different places. By extension, similar interventions, such as community health workers, will be implemented differently in different locations depending on regulations, availability of workers and funding, educational levels and so forth.
Diversity of actors, tasks and units exists within CAS, even if the landscape initially appears homogenous. A classic situation is in health clinics and some school systems, where tasks are handled differently in different cases: different management skills, different environments, different motivational levels. CASs need diversity to survive external shocks. If everything is the same, or the variability is relatively predictable (e.g. all health centres operate in one way and all hospitals in another), then the system is not complex.
CAS are semi-open systems which are subject to unforeseen events outside of the control of the project. Over time, external shocks that change the equilibrium are inevitable.
Projects cannot be future-proofed, but they can be sensitive to the future.
The relationships between the diverse units and layers are what hold a CAS together. They work through tacit and explicit norms of practice, strong leadership or rules, and feedback loops that dampen, or sometimes facilitate, adoption.
Outside of the control of the project, surprising things happen. People gather together to protest or act in new ways that may support or threaten the interventions. Project implementers are often nervous about self-organising or emergent behaviour, but this is what accelerates behaviour change.
Changing complex adaptive systems is not a smooth process. The rate of change is not steady and the effect is disproportionate to effort. There may be thresholds or tipping points, after which change becomes more rapid. Interventions take time to become accepted.
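One minimal way to formalise this uneven pace is a logistic adoption curve, a standard diffusion-of-innovations model; the sketch below is illustrative rather than part of the original slides, and the symbols ($A$, $K$, $r$, $t_0$) are my own labels.

\[
\frac{dA}{dt} = r\,A\left(1 - \frac{A}{K}\right),
\qquad
A(t) = \frac{K}{1 + e^{-r(t - t_0)}}
\]

Here $A(t)$ is the level of adoption, $K$ the saturation level, $r$ the intrinsic growth rate and $t_0$ the inflection (tipping) point. Growth is slowest near $A \approx 0$ and $A \approx K$ and fastest around $t_0$, which is one way of seeing why early effort can look disproportionate to effect until a threshold is crossed.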
Complexity is not the enemy, but nor can we plan our way out of it. We are sure that top-down, replicable strategies that use external resources to design, implement and propel change are not sustainable and in many cases may not even be effective. However, it is also true that they will change the system in some way, and that change might lead in the right direction; ultimately the only way to know for sure is to watch and see. Complexity-aware project design and implementation is the way forward. We do not have a set of best practices, although something like that is evolving.
These are taken from a large body of literature. Some references available at the end of this presentation.
Within the evaluation field, leaders are encouraging practitioners to adopt developmental approaches (see work by Patton) and case studies (see Greenhalgh et al in refs). While both are powerful designs, they are likely to be resisted by the typical client who just needs an end-of-project evaluation or mid-term review.
These questions are sensitive to complex adaptive systems: they look for how the project harnesses the characteristics of CAS, and part of the trick is to be alert to evidence that complex adaptive systems are at work. So I will give examples from some projects I have evaluated. The examples indicate how certain implementation strategies were successful in harnessing complexity. The absence of these behaviours may point, as it has in many other projects I have evaluated, to a vulnerability to the vicissitudes of CASs.
The eight questions give a framework for praising aspects of program delivery that might not otherwise be recognised as important, while explaining why the program has not yet had the impact one might expect. The use of these questions fits very well within realist evaluation approaches.
This question is related to path dependency. Projects should explicitly integrate, or be responsive to, current policies and topical issues. Project implementers should also take note of local criticisms and objections; these help to explain what is and is not acceptable.
In large projects, diversity is almost a given. A one-size-fits-all approach may not do anyone any favours.
Information in real time is essential for making the small adaptations necessary to keep a project effective. While many projects collect basic information, for example on numbers of participants or clients, it is surprising how few actually use it to make decisions. Annual reviews are rarely sufficient. Information needs to be reviewed at least quarterly, preferably monthly, by people with the authority and commitment to take corrective actions.
Some projects are protective of their brands and their protocols, claiming that fidelity is vital. However, local change, adaptation and co-option signify that the change has become accepted and institutionalised and is likely to be maintained.
Discussion about what happens after the project should be explicit from the design through implementation. The answers will evolve as more parties become engaged, but unless the discussion is happening, a project is unlikely to have any legacy at all and may be ignored during implementation because ‘it is just a project.’
These have a strong focus on health programs and scaling up because that was how I encountered the issue of complexity. To a large degree the issues surrounding sustaining an intervention and scaling up are the same.