Exploring how evaluation practice changes
over time
Postgraduate Conference, 25th April 2015
Joana Zozimo, Educational Research, Lancaster University
Why research evaluation?
• Experience of evaluation practice in four countries: Portugal, Spain, Mozambique and the UK
• Short-term experience in Guinea-Bissau, Cape Verde, Angola and Senegal
• Main expertise within the PCM (project cycle management) cycle: the evaluation stage
• Encountered resistance to, misunderstanding of, and struggles with evaluation
• A bedtime story to explain the researcher/evaluator role
Research Problem
• The performance measurement framework places certainty and attribution (Kusek and Rist, 2004) above reflective practice, uncertainty and contribution (Mayne, 2008; Patton, 2010; Saunders et al., 2005).
• The development education (Dev Ed) sector knows very little about how evaluation is practised, by whom and where, and to what effect (Henry and Mark, 2003).
• Dev Ed’s ultimate goals are attitudinal change, social change and/or behavioural change among its beneficiaries.
• Scholars have emphasised that Dev Ed changes are more likely to be captured through practice-led evaluation than through performance-led evaluation (Bourn, 2014).
What is Development education?
• an active learning process,
• founded on values of solidarity, equality, inclusion and co-operation,
• that enables people to move from basic awareness of international development priorities and sustainable human development,
• through understanding of the causes and effects of global issues,
• to personal involvement and informed action.
• It fosters the full participation of all citizens in world-wide poverty eradication and the fight against exclusion,
• and seeks to influence more just and sustainable economic, social, environmental and human-rights-based national and international policies (DARE Forum, 2004).
Dev Ed research context: funding-dependent; volatile human resources; based on short-term activities.
Research Problem (cont.)
My approach
• Overall research aim: to explore how evaluation practice changes over time within the development education sector.
• Research context: the evaluation practice timeline of the Youth Project, a three-year project run by NDEC (three groups of participants; a funder-recipient context).
• Research question: How do development education organisations evaluate their interventions?
- What do the Youth Project stakeholders do when evaluating it?
- What are the internal and external influences that shape NDEC’s evaluation practice, particularly in the Youth Project?
• What is Evaluation? What is a social practice?
What is Evaluation?
• sets or clusters of behaviours
• conceptualised as social practices
• forming ways of ‘thinking and doing’
• associated with undertaking evaluative activity
• attributing value, complexity and appropriateness to social
interventions (Saunders et al., 2011)
• I will use the term “Evaluative Practices”
• In other words, to evaluate is to attribute value …
How Evaluation is broadly perceived
What is a social practice?
• routinized type of behaviour,
• forms of bodily activities,
• forms of mental activities,
• ‘things’ and their use,
• a background knowledge in the form of understanding,
• know-how,
• states of emotion and motivational knowledge (Reckwitz, 2002)
In everyday language, a practice is:
• a way of cooking, of working,
• of investigating, of taking care of oneself or of others,
• of evaluating
What is a social practice? (cont.)
Elements of Practice : Materials; Competence; Meaning
Dynamics of social practice: co-occurrence of practices; how they are carried by
practitioners (Shove et al., 2012)
• Overlap
• Interact
• Compete
• Collaborate
• Integrate
• Dominate
• Or what else?
Social practice theory from Reckwitz (2002) and Shove et al. (2012), applied to evaluation by Saunders et al. (2011), is here brought to the development education sector.
Methodology / Research design
 Social constructivist perspective;
 Case study research as an approach, not only a method (Simons, 2009); a single in-depth case study;
 Qualitative inductive approach, with purposeful sampling to allow a rich case to be explored;
 The case: the evaluation practice of the Youth Project (a three-year project run by NDEC).
 The case design: units of analysis / embedded cases (Yin, 2003):
1. Coordinators
2. Staff
3. Funders
Research decision: an ethical decision to design three embedded cases, preserving individuals’ anonymity and confidentiality.
Data collection overview
Phase 1 – Planning the Evaluation (Feb–May 2012)
• Pilot study with 5 informal conversations; 3 interviews
• Fieldwork in practice: 7 hr of participant observation
• Observation notes; informal conversations (n=3); documents (n=10)
Phase 2 – Evaluation off the ground (Sept–Nov 2012)
• 15 in-depth semi-structured interviews
• 95 hr of participant observation spread over 3 months (weekly basis)
• 24 hr of non-participant observation spread over 2 months (event basis)
• Interview notes (n=15); observation notes
Phase 3 – Evaluation reporting (March–April 2013)
• 3 in-depth interviews; 3 informal conversations
• 4 documents analysed
• Interview notes (n=3); informal conversations (n=3)
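As a quick consistency check, the observation hours reported across the three phases can be tallied; a minimal Python sketch, where the dictionary labels are mine rather than from the project records:

```python
# Observation hours by phase, as listed on the slide (labels are illustrative).
observation_hr = {
    "Phase 1 (Feb-May 2012)": 7,          # participant observation during pilot fieldwork
    "Phase 2 (Sept-Nov 2012)": 95 + 24,   # participant + non-participant observation
    "Phase 3 (March-April 2013)": 0,      # interviews and documents only
}

total = sum(observation_hr.values())
print(total)  # 126
```

The 7 + 95 + 24 hours tally with the 126 hr of observational field notes reported in the data set.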
My Data set / Approach to analysis
 6 informal conversation notes / research diary
 18 interview transcripts (fully transcribed)
 14 documents
 126 hr of observational field notes / research diary
Case “story” account (Simons, 2009); transforming data (Wolcott, 1994);
“thick” description (Geertz, 1973); longitudinal analysis (Saldana, 2002)
 ATLAS.ti software; coding and categorisation; seeking patterns and connections;
 Three cycles of coding: content analysis, values coding, longitudinal analysis;
 A social practice theoretical lens applied to focus on what the different groups – coordinators, staff and funders – do on a daily basis;
 Informed by literature on:
moments of evaluation (McCluskey et al., 2008),
provisional stabilities (Saunders et al., 2005),
dynamics of practice (Shove et al., 2012).
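To picture the longitudinal coding pass, coded segments can be tallied per stage so that themes rising and falling over time become visible; a minimal Python sketch, where the code labels are borrowed from the Main Themes slide but the segments and counts are invented for illustration:

```python
from collections import Counter

# Invented (stage, code) pairs standing in for coded segments exported from
# qualitative analysis software; only the code labels come from the slides.
segments = [
    ("Stage 1", "forced/imposed practice"), ("Stage 1", "evaluation language"),
    ("Stage 2", "resistance"), ("Stage 2", "resistance"), ("Stage 2", "managerialism"),
    ("Stage 3", "competence"), ("Stage 3", "fragmented practice"),
]

# Longitudinal pass: tally codes per stage to see how themes change over time.
by_stage = {}
for stage, code in segments:
    by_stage.setdefault(stage, Counter())[code] += 1

for stage in sorted(by_stage):
    print(stage, dict(by_stage[stage]))
```

In the real analysis the segments would come from the coded corpus rather than a hand-written list; the point is only that per-stage tallies make the rise and fall of themes explicit.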
The Case Evaluation Practice
Timeline (WIP)
[Timeline diagram.] Nodes, in order: Youth Project approved; DfID’s official letter and guidelines; Stage 1 – Planning Evaluation; GMA’s evaluation mandate; practice change episodes (PCE); DfID’s evaluation reviewed; my presence in the fieldwork; Stage 2 – Getting Evaluation “off the ground”; Stage 3 – Reporting Evaluation; Dev Ed political context change; PCE – funders’ contrasting evaluation approaches. Time axis: 2009/2010, 2011, 3/12, 9/12, 1/13.
The Case Evaluation Practice
Timeline (WIP)
[Timeline diagram.] Finding/Designing the Project; Stage 1 – Planning Evaluation; practice change episodes; researcher presence in the fieldwork; Stage 2 – Getting Evaluation “off the ground”; Stage 3 – Reporting Evaluation. Time axis: 2009/2010, 2011, 3/12, 9/12, 1/13.
Main Themes
Finding/Designing the Project
• Absence of an evaluation culture (dishonesty, fear, scepticism)
• Retrospective reflection
Stage 1 – Planning Evaluation
• Evaluation language/discourse
• Forced/imposed practice
Stage 2 – Evaluation “off the ground”
• Resistance
• Power (the funders’ relationship)
• Managerialism
• Co-occurrence of practices
Stage 3 – Reporting Evaluation
• Expectations
• Fragmented evaluation practice
• Competence
My account/ research story
• The Grant Management Agency only administered the grant,
they weren’t development education specialists. It’s like asking
your bank what are the best sausages, you know...they don’t
know [laughs]. They might eat it, but they don’t know; you
know what I mean...they are not education people; they are
consultants who are administering the grants. (Mary, project
worker, 7:77, emphasis added)
My account/ research story
Reflection enables practitioners to notice how evaluation practice has changed over time, and to learn from it!
My account/ research story
When a new actor [the GMA] appears within the funder/recipient context, evaluation practice changes: the way the practice is carried by practitioners is altered, because new actors have occupied that space in the field.
My account /research story
A fragmented practice changes the way evaluation is practiced over time.
My account /research story
Research participants are expected to have competence in evaluation in order to respond to funders’ mandatory requirements.
My contribution is to the evaluation
practice debate
 Theoretical contribution – extending evaluation practice theory by applying an advanced theoretical framework, social practice theory (SPT), to the new domain of Dev Ed.
 Implications for practitioners – adding fresh insights on evaluative practices within the Dev Ed domain; the dynamics of social practice place reflective practice at the centre of evaluation practice.
 Research impact – gaining momentum from 2015, the International Year of Evaluation:
http://mymande.org/evalyear/Declaring_2015_as_the_International_Year_of_Evaluation
My reflective space / Research
journey
 Writing as a sense-making tool – despite using ATLAS.ti, it was only when I started writing on a daily basis that I deeply engaged with my materials.
 Writing as interpretation (Richardson, 1994).
 Writing retreats (Rowena Murray) – learning the dynamics of academic writing; creating habits; practising writing in a peer group; a supportive and contained environment; create your own writing group.
 More resources:
• Rowena Murray work http://www.rowenamurray.org/
• FASS writing retreat http://www.lancaster.ac.uk/fass/gradschool/training/
• Thesis whisperer http://thesiswhisperer.com/
References
Bourn, D. (2014) The Theory and Practice of Global Learning. Development Education Research Centre.
DARE Forum (2004) What is DARE. DEEEP – Developing Europeans’ Engagement for the Eradication of Global Poverty. Retrieved from http://www.deeep.org/what-is-dare-.html
Geertz, C. (1973) The Interpretation of Cultures. New York: Basic Books.
Henry, G.T., Mark, M.M. (2003) Toward an agenda for research on evaluation. In C. A. Christie, ed. The Practice-Theory Relationship in Evaluation. Jossey-Bass.
Kusek, J.Z., Rist, R.C. (2004) Ten Steps to a Results-Based Monitoring and Evaluation System: A Handbook for Development Practitioners. World Bank Publications.
Mayne, J. (2008) Contribution Analysis: An Approach to Exploring Cause and Effect.
McCluskey, A. (2011) Evaluation as deep learning: a holistic perspective on evaluation in the PALETTE project. In Saunders, M., Trowler, P., Bamber, V., eds. Reconceptualising Evaluative Practices in Higher Education. Open University Press.
Patton, M.Q. (2010) Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Guilford Press.
Reckwitz, A. (2002) Toward a theory of social practices: a development in culturalist theorizing. European Journal of Social Theory, 5(2), 243–263.
Saldana, J. (2002) Analyzing change in longitudinal qualitative data. Youth Theatre Journal, 16(1), 1–17.
Saunders, M., Charlier, B., Bonamy, J. (2005) Using evaluation to create ‘provisional stabilities’. Evaluation, 11(1), 37–54.
Saunders, M., Trowler, P., Bamber, V. (2011) Reconceptualising Evaluative Practices in Higher Education. 1st ed. Open University Press.
Shove, E., Pantzar, M., Watson, M. (2012) The Dynamics of Social Practice: Everyday Life and How It Changes. Los Angeles: SAGE.
Simons, H. (2009) Case Study Research in Practice. Los Angeles; London: SAGE.
Wolcott, H.F. (1994) Transforming Qualitative Data: Description, Analysis, and Interpretation. Thousand Oaks, CA: Sage Publications.
Yin, R.K. (2003) Case Study Research: Design and Methods. 3rd ed. Thousand Oaks, CA: Sage Publications.
Thank you! What resonates with you?
Welcome your questions.
j.zozimo@lancaster.ac.uk

Weitere ähnliche Inhalte

Was ist angesagt?

Action Research for Development Communication
Action Research for Development CommunicationAction Research for Development Communication
Action Research for Development CommunicationAnkuran Dutta
 
Action Research in Education
Action Research in EducationAction Research in Education
Action Research in EducationWSSU CETL
 
Collaborative Action Research
Collaborative Action ResearchCollaborative Action Research
Collaborative Action ResearchColleen Graves
 
Action research in education 2011
Action research in education 2011Action research in education 2011
Action research in education 2011Learning 3.0
 
Collaborative Action Research 2007
Collaborative Action Research 2007Collaborative Action Research 2007
Collaborative Action Research 2007Johan Koren
 
Collaborative action research
Collaborative action researchCollaborative action research
Collaborative action researchJohan Koren
 
Information Systems Action research methods
Information Systems  Action research methodsInformation Systems  Action research methods
Information Systems Action research methodsRaimo Halinen
 
Naturalistic evaluation2
Naturalistic evaluation2Naturalistic evaluation2
Naturalistic evaluation2Shika Hershel
 
NGSS Curriculum Development: Lessons Learned from the Mi-STAR Program
NGSS Curriculum Development: Lessons Learned from the Mi-STAR ProgramNGSS Curriculum Development: Lessons Learned from the Mi-STAR Program
NGSS Curriculum Development: Lessons Learned from the Mi-STAR ProgramSERC at Carleton College
 
Data collection in qualitative research focus groups october 2015
Data collection in qualitative research focus groups october 2015Data collection in qualitative research focus groups october 2015
Data collection in qualitative research focus groups october 2015Tünde Varga-Atkins
 
Classroom research ELT
Classroom research ELTClassroom research ELT
Classroom research ELTBüşra Durbin
 
Collaborative action research 2003
Collaborative action research 2003Collaborative action research 2003
Collaborative action research 2003Johan Koren
 
Action research
Action researchAction research
Action researchvjkamal
 
Classroom research
Classroom researchClassroom research
Classroom researchAmina1089
 
Modern Doctorate Literature Review
Modern Doctorate Literature ReviewModern Doctorate Literature Review
Modern Doctorate Literature Reviewkariwhaley
 

Was ist angesagt? (20)

Action Research for Development Communication
Action Research for Development CommunicationAction Research for Development Communication
Action Research for Development Communication
 
Introducing action research
Introducing action researchIntroducing action research
Introducing action research
 
Action Research in Education
Action Research in EducationAction Research in Education
Action Research in Education
 
Collaborative Action Research
Collaborative Action ResearchCollaborative Action Research
Collaborative Action Research
 
Action Research
Action ResearchAction Research
Action Research
 
Action Research
Action ResearchAction Research
Action Research
 
Action research in education 2011
Action research in education 2011Action research in education 2011
Action research in education 2011
 
Collaborative Action Research 2007
Collaborative Action Research 2007Collaborative Action Research 2007
Collaborative Action Research 2007
 
Collaborative action research
Collaborative action researchCollaborative action research
Collaborative action research
 
Information Systems Action research methods
Information Systems  Action research methodsInformation Systems  Action research methods
Information Systems Action research methods
 
Naturalistic evaluation2
Naturalistic evaluation2Naturalistic evaluation2
Naturalistic evaluation2
 
NGSS Curriculum Development: Lessons Learned from the Mi-STAR Program
NGSS Curriculum Development: Lessons Learned from the Mi-STAR ProgramNGSS Curriculum Development: Lessons Learned from the Mi-STAR Program
NGSS Curriculum Development: Lessons Learned from the Mi-STAR Program
 
Data collection in qualitative research focus groups october 2015
Data collection in qualitative research focus groups october 2015Data collection in qualitative research focus groups october 2015
Data collection in qualitative research focus groups october 2015
 
Classroom research ELT
Classroom research ELTClassroom research ELT
Classroom research ELT
 
Collaborative action research 2003
Collaborative action research 2003Collaborative action research 2003
Collaborative action research 2003
 
Action research
Action researchAction research
Action research
 
Classroom research
Classroom researchClassroom research
Classroom research
 
Action research 2013 (2)
Action research 2013 (2)Action research 2013 (2)
Action research 2013 (2)
 
Modern Doctorate Literature Review
Modern Doctorate Literature ReviewModern Doctorate Literature Review
Modern Doctorate Literature Review
 
TESTA Masterclass
TESTA MasterclassTESTA Masterclass
TESTA Masterclass
 

Ähnlich wie Joana Zozimo presentation_25042015

Sensemaking LS and DL
Sensemaking LS and DLSensemaking LS and DL
Sensemaking LS and DLPhilwood
 
Qualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to ForethoughtQualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to ForethoughtMEASURE Evaluation
 
Class 6 research quality in qualitative methods 3 2-17
Class 6 research quality in qualitative methods 3 2-17Class 6 research quality in qualitative methods 3 2-17
Class 6 research quality in qualitative methods 3 2-17tjcarter
 
Qualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to ForethoughtQualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to ForethoughtMEASURE Evaluation
 
Jace Hargis Designing Online Teaching
Jace Hargis Designing Online TeachingJace Hargis Designing Online Teaching
Jace Hargis Designing Online TeachingJace Hargis
 
Using realist evaluation with vulnerable young people and the services that s...
Using realist evaluation with vulnerable young people and the services that s...Using realist evaluation with vulnerable young people and the services that s...
Using realist evaluation with vulnerable young people and the services that s...BASPCAN
 
Proposing a model for the incremental development of peer assessment and feed...
Proposing a model for the incremental development of peer assessment and feed...Proposing a model for the incremental development of peer assessment and feed...
Proposing a model for the incremental development of peer assessment and feed...Laura Costelloe
 
Emergent evaluation some initial thoughts
Emergent evaluation some initial thoughtsEmergent evaluation some initial thoughts
Emergent evaluation some initial thoughtsPhilwood
 
Outcomes and Assessment for Bonner and Campus Centers
Outcomes and Assessment for Bonner and Campus CentersOutcomes and Assessment for Bonner and Campus Centers
Outcomes and Assessment for Bonner and Campus CentersBonner Foundation
 
Extra-curricular events and self-efficacy: measuring self-concepts
Extra-curricular events and self-efficacy: measuring self-conceptsExtra-curricular events and self-efficacy: measuring self-concepts
Extra-curricular events and self-efficacy: measuring self-conceptsJill Dickinson
 
05 approaches to_researching_educational_innovation_palitha_edirisingha
05 approaches to_researching_educational_innovation_palitha_edirisingha05 approaches to_researching_educational_innovation_palitha_edirisingha
05 approaches to_researching_educational_innovation_palitha_edirisinghaPalitha Edirisingha
 
Vicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact EvaluationVicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact EvaluationJosh Chandler
 
Feedback as dialogue and learning technologies: can e-assessment be formative?
Feedback as dialogue and learning technologies: can e-assessment be formative?Feedback as dialogue and learning technologies: can e-assessment be formative?
Feedback as dialogue and learning technologies: can e-assessment be formative?Centre for Distance Education
 
Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...
Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...
Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...Diogo Casanova
 
Lin Norton - Ulster developing a robust pedagogical action research study
Lin Norton - Ulster developing a robust pedagogical action research studyLin Norton - Ulster developing a robust pedagogical action research study
Lin Norton - Ulster developing a robust pedagogical action research studycampone
 
CHAPTER 6 curriculum Evaluation 1.2.pptx
CHAPTER 6 curriculum Evaluation 1.2.pptxCHAPTER 6 curriculum Evaluation 1.2.pptx
CHAPTER 6 curriculum Evaluation 1.2.pptxAngelouRivera
 
The Curriculum and the Study of the Curriculum
The Curriculum and the Study of the CurriculumThe Curriculum and the Study of the Curriculum
The Curriculum and the Study of the CurriculumJellyfab February
 
Rsi 26 5-solooooooo
Rsi 26 5-soloooooooRsi 26 5-solooooooo
Rsi 26 5-soloooooooDilshad Shah
 
qualitative research DR. MADHUR VERMA PGIMS ROHTAK
 qualitative research DR. MADHUR VERMA PGIMS ROHTAK qualitative research DR. MADHUR VERMA PGIMS ROHTAK
qualitative research DR. MADHUR VERMA PGIMS ROHTAKMADHUR VERMA
 

Ähnlich wie Joana Zozimo presentation_25042015 (20)

Action research
Action researchAction research
Action research
 
Sensemaking LS and DL
Sensemaking LS and DLSensemaking LS and DL
Sensemaking LS and DL
 
Qualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to ForethoughtQualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to Forethought
 
Class 6 research quality in qualitative methods 3 2-17
Class 6 research quality in qualitative methods 3 2-17Class 6 research quality in qualitative methods 3 2-17
Class 6 research quality in qualitative methods 3 2-17
 
Qualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to ForethoughtQualitative Methods Course: Moving from Afterthought to Forethought
Qualitative Methods Course: Moving from Afterthought to Forethought
 
Jace Hargis Designing Online Teaching
Jace Hargis Designing Online TeachingJace Hargis Designing Online Teaching
Jace Hargis Designing Online Teaching
 
Using realist evaluation with vulnerable young people and the services that s...
Using realist evaluation with vulnerable young people and the services that s...Using realist evaluation with vulnerable young people and the services that s...
Using realist evaluation with vulnerable young people and the services that s...
 
Proposing a model for the incremental development of peer assessment and feed...
Proposing a model for the incremental development of peer assessment and feed...Proposing a model for the incremental development of peer assessment and feed...
Proposing a model for the incremental development of peer assessment and feed...
 
Emergent evaluation some initial thoughts
Emergent evaluation some initial thoughtsEmergent evaluation some initial thoughts
Emergent evaluation some initial thoughts
 
Outcomes and Assessment for Bonner and Campus Centers
Outcomes and Assessment for Bonner and Campus CentersOutcomes and Assessment for Bonner and Campus Centers
Outcomes and Assessment for Bonner and Campus Centers
 
Extra-curricular events and self-efficacy: measuring self-concepts
Extra-curricular events and self-efficacy: measuring self-conceptsExtra-curricular events and self-efficacy: measuring self-concepts
Extra-curricular events and self-efficacy: measuring self-concepts
 
05 approaches to_researching_educational_innovation_palitha_edirisingha
05 approaches to_researching_educational_innovation_palitha_edirisingha05 approaches to_researching_educational_innovation_palitha_edirisingha
05 approaches to_researching_educational_innovation_palitha_edirisingha
 
Vicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact EvaluationVicky Pelka's Training Session On Impact Evaluation
Vicky Pelka's Training Session On Impact Evaluation
 
Feedback as dialogue and learning technologies: can e-assessment be formative?
Feedback as dialogue and learning technologies: can e-assessment be formative?Feedback as dialogue and learning technologies: can e-assessment be formative?
Feedback as dialogue and learning technologies: can e-assessment be formative?
 
Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...
Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...
Feedback, Agency and Analytics in Virtual Learning Environments – Creating a ...
 
Lin Norton - Ulster developing a robust pedagogical action research study
Lin Norton - Ulster developing a robust pedagogical action research studyLin Norton - Ulster developing a robust pedagogical action research study
Lin Norton - Ulster developing a robust pedagogical action research study
 
CHAPTER 6 curriculum Evaluation 1.2.pptx
CHAPTER 6 curriculum Evaluation 1.2.pptxCHAPTER 6 curriculum Evaluation 1.2.pptx
CHAPTER 6 curriculum Evaluation 1.2.pptx
 
The Curriculum and the Study of the Curriculum
The Curriculum and the Study of the CurriculumThe Curriculum and the Study of the Curriculum
The Curriculum and the Study of the Curriculum
 
Rsi 26 5-solooooooo
Rsi 26 5-soloooooooRsi 26 5-solooooooo
Rsi 26 5-solooooooo
 
qualitative research DR. MADHUR VERMA PGIMS ROHTAK
 qualitative research DR. MADHUR VERMA PGIMS ROHTAK qualitative research DR. MADHUR VERMA PGIMS ROHTAK
qualitative research DR. MADHUR VERMA PGIMS ROHTAK
 

Joana Zozimo presentation_25042015

  • 1. Exploring how evaluation practice changes over time Postgraduate Conference, 25th April 2015 Joana Zozimo, Educational Research, Lancaster University
  • 2. Why researching evaluation? • Experience evaluation practice in four countries: Portugal; Spain; Mozambique and UK • Short term experiences: Guinea Bissau; Cape Verde; Angola and Senegal • PCM cycle main expertise: evaluation stage • Evaluation resistance, misunderstanding, struggling • Bed time story explaining researcher/evaluator
  • 3. Research Problem • Performance measurement framework places certainty and attribution (Kusek and Rist, 2004) over reflective practice, uncertainty and contribution (Mayne, 2008; Patton, 2010; Saunders et al., 2005) • Development education (Dev Ed) sector knows very little about how evaluation is being practiced, by whom and where it is being practiced and to what effect (Henry and Mark, 2003). • Dev Ed ultimate goals are attitudinal change; social change and/or change behaviour of its beneficiaries. • Scholars emphasised that Dev Ed changes are more likely to be captured through a practice led-evaluation rather than a performance led-evaluation (Bourn, 2014).
  • 4. What is Development education? • active learning process, • founded on values of solidarity, equality, inclusion and co-operation. • enables people to move from basic awareness of international development priorities and sustainable human development, • through understanding of the causes and effects of global issues • to personal involvement and informed actions. • fosters the full participation of all citizens in world-wide poverty eradication, and the fight against exclusion. • seeks to influence more just and sustainable economic, social, environmental, human rights based national and international policies (DARE Forum, 2004). Dev Ed Research context : funding dependent; volatile human resources; short term activities based.
  • 6. My approach • Overall research aim: explore how evaluation practice changes over time within developmen education sector. • Research context: Youth Project evaluation practice timeline; three-year project; run by NDEC (3 groups of participants; funder-recipient context based) • Research questions: How do development education organisations evaluate their interventions? - What do the Youth Project stakeholders do when evaluate it? - What are the internal and external influences that shape NDEC’s evaluation practice, particularly in the Youth Project? • What is Evaluation? What is a social practice?
  • 7. What is Evaluation? • sets or clusters of behaviours • conceptualised as social practices • forming ways of ‘thinking and doing’ • associated with undertaking evaluative activity • attributing value, complexity and appropriateness to social interventions (Saunders et al., 2011) • I will use the term “Evaluative Practices” • In other words, to evaluate is to attribute value …
  • 8. How Evaluation is broadly perceived
  • 9. What is a social practice? • routinized type of behaviour, • forms of bodily activities, • forms of mental activities, • ‘things’ and their use, • a background knowledge in the form of understanding, • know-how, • states of emotion and motivational knowledge (Reckwitz, 2002) In everyday’ s language, a practice is: • a way of cooking, of working, • of investigating, of taking care of oneself or of others, • of evaluating
  • 10. What is a social practice? (cont.) Elements of Practice : Materials; Competence; Meaning Dynamics of social practice: co-occurrence of practices; how they are carried by practitioners (Shove et al., 2012) • Overlap • Interact • Compete • Collaborate • Integrate • Dominate • Or what else?.. Social Practice Theory by Reckwitz (2002) and Shove et al (2012) applied to Evaluation by Saunders (2011) in a Development Education sector.
  • 11. Methodology / Research design  Social constructivist perspective;  Case study research as approach , not only method! (Simons, 2009); Single in-depth case study;  Qualitative inductive approach, purposeful sampling to allow rich case to be explored;  The case: Evaluation practice of the Youth project (three-year project run by NDEC).  The case design: Units of analysis/ Embedded cases (Yin, 2003): 1. Coordinators 2. Staff 3. Funders Research decision: ethical decision to design three embedded cases to preserve indviduals’ anonynity and confidentiality.
  • 12. Data collection overview • Pilot study with 5 informal conversations; 3 interviews • Fieldwork in practice: 7 hr of participant observation • Observation notes; Informal conversations (n=3); documents (n=10); Phase 1 Planning the Evaluation Feb till May 2012 • 15 in-depth semi structured interviews, • 95 hr of participant observation spread over 3 months (weekly basis) • 24 hr of non participant observation spread over 2 months (event basis) • Interviews notes (n=15); observation notes; Phase 2 Evaluation off the ground Sept-Nov 2012 • 3 in-depth interviews; 3 informal conversations • 4 documents analysed • Interview notes (n=3); Informal conversations (n=3) Phase 3 Evaluation reporting March-April 2013
  • 13. My Data set/ Approach to analysis  6 Informal conversations notes/research diary  18 interviews transcripts (fully transcribed)  14 documents  126 hr observational field notes /research diary Case “story” account (Simons, 2009); Transforming data (Wolcott, 1994) “Thick” description (Geertz, 1973); Longitudinal analysis (Saldana, 2002)  Atlas TI software; coding and categorisation; seek patterns and connections;  Three cycles of coding - Content analysis, value coding analysis, longitudinal analysis;  Social practice theoretical lenses applied to focus on what different groups do on a daily basis – coordinators; staff and funders;  Informed by literature on: Moments of evaluation (McCluskey et al, 2008) Provisional stabilities (Saunders, 2005) Dynamics of practice (Shove et al., 2012)
  • 14. The Case Evaluation Practice Timeline (WIP) Example image Youth Project approved DfID’s official letter and guidelibes Stage 1 Planning Evaluation GMA’s evaluation mandate Practice change episode s (PCE) DfID’s evaluati on reviewe d My presence in the fieldwork Stag e 2 Gett ing Eval uati on “off the grou nd” Stage 3 Reporti ng Evaluat ion Dev Ed Politi cal conte xt chan ge PCE - Funders contrasting evaluation approaches 2009/2010 2011 3/12 9/12 1/13
  • 15. The Case Evaluation Practice Timeline (WIP) [timeline image] Finding: Designing the Project; Stage 1 Planning Evaluation; practice change episodes; researcher presence in the fieldwork; Stage 2 Getting Evaluation "off the ground"; Stage 3 Reporting Evaluation. 2009/2010 | 2011 | 3/12 | 9/12 | 1/13
  • 16. Main Themes • Designing the Project: absence of evaluation culture (dishonesty, fear, scepticism); retrospective reflection. • Stage 1 Planning Evaluation: evaluation language/discourse; forced/imposed practice. • Stage 2 Evaluation "off the ground": resistance; power (funders' relationship); managerialism; co-occurrence of practices. • Stage 3 Reporting Evaluation: expectations; fragmented evaluation practice; competence.
  • 17. My account/ research story • The Grant Management Agency only administered the grant, they weren’t development education specialists. It’s like asking your bank what are the best sausages, you know...they don’t know [laughs]. They might eat it, but they don’t know; you know what I mean...they are not education people; they are consultants who are administering the grants. (Mary, project worker, 7:77, emphasis added)
  • 18. My account/ research story Reflection enables practitioners to notice how evaluation practice has changed over time, and to learn from it!
  • 19. My account/ research story When a new actor [GMA] appears within the funder/recipient context, evaluation practice changes (the way a practice is carried out by practitioners is altered, because new actors have occupied that space in the field).
  • 20. My account /research story A fragmented practice changes the way evaluation is practiced over time.
  • 21. My account /research story Research participants are expected to have competence in evaluation in order to respond to funders' mandatory requirements.
  • 22. My contribution is to the evaluation practice debate  Theoretical contribution: extending evaluation practice theory by applying an advanced theoretical framework of social practice theory (SPT) to the new domain of Dev Ed.  Implications for practitioners: adds fresh insights on evaluative practices within the Dev Ed domain; the dynamics of social practice place reflective practice at the centre of evaluation practice.  Research impact: gaining momentum from the 2015 International Year of Evaluation http://mymande.org/evalyear/Declaring_2015_as_the_International_Year_of_Evaluation
  • 23. My reflective space / Research journey  Writing as a sense-making tool: despite the use of ATLAS.ti, it was only when I started writing on a daily basis that I deeply engaged with my materials.  Writing as interpretation (Richardson, 1994).  Writing retreats (Rowena Murray): learning the dynamics of academic writing; creating habits; practising writing in a peer group; a supportive and contained environment; create your own writing group.  More resources: • Rowena Murray's work http://www.rowenamurray.org/ • FASS writing retreat http://www.lancaster.ac.uk/fass/gradschool/training/ • Thesis Whisperer http://thesiswhisperer.com/
  • 24. References Bourn, D. (2014) The Theory and Practice of Global Learning. Development Education Research Centre. DARE Forum (2004) What is DARE. DEEEP - Developing Europeans' Engagement for the Eradication of Global Poverty. Retrieved from http://www.deeep.org/what-is-dare-.html Geertz, C. (1973) The Interpretation of Cultures. New York: Basic Books. Henry, J., Mark, M.M. (2003) Toward an agenda for research on evaluation. In C. A. Christie, ed. The Practice-Theory Relationship in Evaluation. Jossey-Bass. Kusek, J.Z., Rist, R.C. (2004) Ten Steps to a Results-Based Monitoring and Evaluation System: A Handbook for Development Practitioners. World Bank Publications. Mayne, J. (2008) Contribution analysis: An approach to exploring cause and effect. McCluskey, A. (2011) Evaluation as deep learning: a holistic perspective on evaluation in the PALETTE project. In Saunders, M., Trowler, P., & Bamber, V., eds. Reconceptualising Evaluative Practices in HE. Open University Press. Reckwitz, A. (2002) Toward a Theory of Social Practices: A Development in Culturalist Theorizing. European Journal of Social Theory, 5(2), 243–263. Patton, M.Q. (2010) Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. Guilford Press. Saldana, J. (2002) Analyzing Change in Longitudinal Qualitative Data. Youth Theatre Journal, 16(1), 1–17.
  • 25. References Saunders, M., Charlier, B., Bonamy, J. (2005) Using Evaluation to Create 'Provisional Stabilities'. Evaluation, 11(1), 37–54. Saunders, M., Trowler, P., & Bamber, V. (2011) Reconceptualising Evaluative Practices in HE. 1st ed. Open University Press. Simons, H. (2009) Case Study Research in Practice. Los Angeles; London: SAGE. Shove, E., Pantzar, M., Watson, M. (2012) The Dynamics of Social Practice: Everyday Life and How It Changes. Los Angeles: SAGE. Wolcott, H.F. (1994) Transforming Qualitative Data: Description, Analysis, and Interpretation. Thousand Oaks, Calif.: Sage Publications. Yin, R.K. (2003) Case Study Research: Design and Methods. 3rd ed. Thousand Oaks, Calif.: Sage Publications.
  • 26. Thank you! What resonates with you? Welcome your questions. j.zozimo@lancaster.ac.uk

Editor's notes

  1. Research over time, do every day at the same time
  2. Research looks at the case's evaluative practice through this social practice analytical framework to explore what the elements of that practice are and how it interacts with other practices within a Dev Ed domain
  3. Ethnographic approach to data collection; longitudinal dimension; triangulation of different data sources; 3 units of analysis; 2 rounds of interviews in a retrospective approach; rich data set primarily focused on evaluation practices; conducted by a single researcher
  4. Hard to pick up part of data
  5. So timeline helped to do that through practice change episodes
  6. So timeline helped to do that through practice change episodes
  7. Findings were situated according to the project's evaluation timeline, suggesting that in each stage evaluative practice was informed by…
  8. Some evidence to support potential claims/ conclusions: …so, the knock-on impact for us was really quite threatening, and if you get it wrong they [GMA] can just take the money off from you, and that was quite hard for me (Mary, 27/9/12, 7:31). Now the relationship is with GMA... and honestly I can't stand GMA, because they are so rigorous and the attention to detail drives me crazy, because I don't have time to give them every little piece of information they want all the time, so I can't stand it… I hate the relationship with them, it is a total power relation where they say what they want and we do what they want, because if we don't do what they want then we won't get the money. So we are subservient to GMA and we are very nice to them when they come (Rachel, 17/9/12, 1:163)
  9. Other concurrent practices affect how evaluation practice changes over time.
  10. Some evidence to support potential claims/ conclusions
  11. Some evidence to support potential claims/ conclusions
  12. Some evidence to support potential claims/ conclusions
  13. Some evidence to support potential claims/ conclusions