4. The problem
HRM: 3,600 articles in 2015 (ABI/INFORM). For an HR
manager, keeping up would mean reading roughly 10 articles
every day (for a ‘general’ manager, more than 100!)
BTW: most of this research is seriously
flawed or irrelevant for practice
(signal-to-noise ratio: roughly 5 to 95)
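A quick back-of-the-envelope check of the reading load above (the 3,600-articles figure is from the slide; the per-day rate is simple division):

```python
# Back-of-the-envelope check of the reading load claimed above.
articles_per_year = 3_600  # HRM articles indexed in ABI/INFORM in 2015 (from the slide)
per_day = articles_per_year / 365
print(round(per_day, 1))  # roughly 10 articles every day, as the slide says
```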
5. The solution: evidence summaries (reviews)
We need evidence summaries of the most valid and
relevant scientific research on a certain topic (eliminating
the need to read individual studies), so busy managers can:
• Consult summaries when a new problem crops up.
• Check summaries for new findings when an important
issue recurs.
• Update their professional knowledge as new demands
appear.
6. What sort of reviews are available?
White papers by consulting firms
Reviews in popular management literature
Reviews in current textbooks
Traditional (academic) literature reviews
7. Questions for you
• What are white papers,
insight reports, etc.?
• How are they produced?
• What is their purpose?
9. Claims made by most of these reviews
“Previous research has shown that team building
improves performance”
“It has been demonstrated that management
development is effective”
“Many studies have shown that employee engagement
increases performance”
“There is much evidence that job stress causes ill health”
10. Type of claims made by most of these reviews
BUT !!!
Did all previous research show this?
What proportion of previous research?
How many studies?
How strongly or clearly or consistently was this
shown?
Were the study designs such that the conclusions
reached could be justified?
What did the authors do to avoid the biases of pre-
existing beliefs?
11. These are therefore meaningless statements or
vague opinions!!
Previous research has shown that team building
improves performance
It has been demonstrated that management
development is effective
Many studies have shown that employee engagement
increases performance
There is much evidence that job stress causes ill health
12. Given all these limitations, systematic
reviews are the gold standard
13. What are systematic reviews?
The intention behind a systematic review is to identify
as fully as possible all the scientific studies of relevance
to a particular subject and to assess the validity and
authority of the evidence of each study separately,
based on explicit criteria such as research design,
population, or outcome measures. A well-specified
approach is applied to selecting studies and their
methodological quality is assessed according to explicit
criteria by independent raters. A systematic review is
therefore transparent, verifiable, and reproducible.
14. Core principles of SRs
Systematic/organized: Systematic reviews are conducted according to
a system or method which is designed in relation to and specifically to
address the question the review is setting out to answer.
Transparent/explicit/replicable: The method used in the review is
explicitly stated so that other researchers can repeat the review or
update it.
Critically appraised: The methodological quality (trustworthiness) of
each study included is critically evaluated according to explicit criteria
by independent raters
Synthesize/summarize: Systematic reviews pull together the results
of the review in a structured and organized way in order to
summarize the evidence relating to the review question.
15. SRs in other areas
• Worldwide communities devoted to promoting access
to evidence-based practice
• Members collaborate to summarize state of the art
knowledge on specific practices identified as
important and under/over/mis-used
• On-line access to information, designed for ease and
speed of use
19. SRs answer these types of questions
What do we know: what works, for whom, under
what circumstances?
What do we not know?
What are we sure about?
What are we not sure about?
What is the evidence base for our claims? (e.g., How
much evidence? What quality?)
20. SRs, REAs and CATs
Systematic Review
(What academics do)
Rapid Evidence Assessment
(What organizations should do)
Critically Appraised Topic
(What you could do)
21. They all follow the same steps
1. Identify and clearly define the question the review will address.
2. Determine the types of studies and data that will answer the question.
3. Search the literature to locate relevant studies.
4. Sift through all the retrieved studies in order to identify those that meet
the inclusion criteria (and need to be examined further) and those that
do not and should be excluded.
5. Extract the relevant data or information from the studies.
6. Critically appraise the studies by assessing their methodological quality
in relation to the review question.
7. Synthesize the findings from the studies.
8. Consider the potential effects of publication or other biases.
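The sifting step (step 4) can be sketched as a simple filter over the retrieved studies. This is an illustrative sketch only: the criteria used below (study design and minimum sample size) are hypothetical examples, not criteria from the slides.

```python
# Illustrative sketch of step 4: sifting retrieved studies against
# inclusion criteria. The criteria here are hypothetical examples.
studies = [
    {"id": 1, "design": "RCT",             "n": 120},
    {"id": 2, "design": "cross-sectional", "n": 45},
    {"id": 3, "design": "longitudinal",    "n": 300},
]

INCLUDED_DESIGNS = {"RCT", "longitudinal"}  # assumed criterion
MIN_SAMPLE = 100                            # assumed criterion

def meets_criteria(study):
    """Return True if the study satisfies all inclusion criteria."""
    return study["design"] in INCLUDED_DESIGNS and study["n"] >= MIN_SAMPLE

included = [s for s in studies if meets_criteria(s)]
excluded = [s for s in studies if not meets_criteria(s)]
print([s["id"] for s in included])  # [1, 3]
```

The point of making the criteria explicit in one place is exactly the transparency principle above: anyone can re-run the sift and get the same included/excluded split.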
22. They all have the same outline
1. Background
2. Search strategy
3. Selection
4. Critical appraisal
5. Main findings
- Causal mechanism (logic model)
- Effect (main findings)
- Moderators / mediators
6. Conclusion
24. SRs, REAs and CATs: focus
CATs: include only meta-analyses and systematic reviews
REAs: also include controlled and/or longitudinal studies
SRs: include all studies, including unpublished papers
26. TODAY
• CAT > what is it?
• Teaching CATs: 10 insights
• Examples
• Your questions, examples, experiences
28. It is hard!
It takes considerable time to learn!
(students will hate you)
29. Explain in advance
The way we teach you to do a CAT is NOT the way
you use it in practice
We teach you how to acquire & apply evidence from
research in a systematic, transparent, reproducible
way > because that is what EBP is about
The CAT report is only for a critical smarty-pants
(often a colleague with an MSc or PhD)
Insight 1
30. Explain in advance:
1. In most cases you will use Google Scholar and just
do a quick and dirty search for meta-analyses
(example: McKinsey Case - diversity)
2. Only when the decision/issue is important do you
do a CAT
(But if you have experience with 2, you will be better at 1)
Insight 2
31. Skills to do a CAT
1. Formulate a focused (answerable) question
2. Know how to search (module/chapter 5, librarian)
3. Know how to critically appraise (module/chapter 6&7)
4. Know how to identify the most relevant findings (effect sizes!)
Insight 3
But you may go easy on 2
32. 1. There is an area of tension between learning and
performing!
2. Learning how to do a CAT is not the same as
conducting a (perfect) CAT
3. Students often want to do a CAT on a topic that is
relevant to them or their organization (capstone!)
4. But we want them to do a CAT on a topic that
guarantees learning > suggest topics!
Insight 4
33. Take a stepwise approach
1. Demonstrate a quick and dirty search on Google Scholar:
MAs and ESs (McKinsey paper > Diversity)
2. Let them do a quick and dirty search on Google Scholar:
provide topics (mini-CAT)
3. Then let them do a search in ABI, BSE or PsycINFO:
moderators/mediators? Interesting primary research?
4. Then let them do a ‘real’ CAT
5. Finally: let them do a CAT exam
Insight 5
34. 1. Generational differences > work related outcomes, attitudes
2. Performance feedback > task performance
3. Goal setting > task performance
4. Open office designs > task performance
5. Net Promoter Score > reliable predictor for …?
6. Information sharing > team performance
7. Social cohesion > team performance
8. Psych safety > team performance
9. Etc.
Topics for mini-CATs
Tip: make a nice case!
35. When you let them do a ‘full’ CAT: build in security checks:
1. Teacher has to approve the topic
then
2. Teacher has to approve the search strategy & outcome
then
3. Teacher has to approve the critical appraisal
then
4. Teacher has to approve the main findings
Only then can they submit their CAT
Insight 6
36. Use the group as a learning tool
Let students / teams present their draft at each step:
1. Question
2. Search
3. Appraisal
4. Findings
Celebrate ‘educational’ mistakes!!
Insight 7
37. There will be students/teams that crash and burn!
That is OK > we are here to learn (so this should not affect
their grade!)
Insight 8
38. Keep it simple > your aim is to make them (a bit) more
evidence-based, NOT to turn them into little academics
Focus on
1. methodological appropriateness & body of evidence
2. effect size (is there a relevant correlation? If not: stop)
3. moderators, mediators, contextual factors
4. so what? (practical implications / recommendations)
Insight 9
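Insight 9's "is there a relevant correlation?" question can be made concrete with effect-size rules of thumb. The sketch below computes a Pearson correlation and labels it using Cohen's conventional benchmarks for correlations (0.1 small, 0.3 medium, 0.5 large); the engagement/performance scores are made-up data for illustration only.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def cohen_label(r):
    """Rule-of-thumb label per Cohen's benchmarks for correlations."""
    a = abs(r)
    if a < 0.1:
        return "negligible"
    if a < 0.3:
        return "small"
    if a < 0.5:
        return "medium"
    return "large"

# Made-up scores: engagement vs. performance (illustration only).
engagement  = [2, 4, 4, 5, 7, 8]
performance = [1, 3, 5, 4, 8, 9]
r = pearson_r(engagement, performance)
print(round(r, 2), cohen_label(r))
```

If the labeled effect is negligible, point 2 of Insight 9 applies: stop there rather than hunt for moderators.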