3. Course objectives
• Introduce the process of EBLIP
• Demonstrate tools and strategies for applying evidence in practice in the real world
• Participants will:
  o Learn strategies to incorporate different types of evidence into their decision making
  o Have opportunities to work through their own practice questions and determine an approach to take back to their workplace
4. What we will cover today
9:00  Introductions and overview
9:15  The EBLIP process
9:30  Formulating questions
10:15 Break
10:30 Sources of evidence
11:00 Critical appraisal
11:40 Applying evidence in practice
11:50 Wrap up
7. What is EBLIP?
“an approach to information science that promotes the collection, interpretation and integration of valid, important and applicable user-reported, librarian observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgements” (Booth, 2000)
8. Why should you care?
Wisdom means acting with knowledge while doubting what you know.
– Jeffrey Pfeffer and Robert I. Sutton
9. A brief history
1997 – Hypothesis article by Jon Eldredge
2000 – MLA Research Section created an Evidence-Based Librarianship Implementation Committee
2000 – Eldredge publishes papers that provide the framework for EBL
2001 – First Evidence Based Librarianship conference held in Sheffield, UK
2004 – Booth and Brice book on EBIP
2006 – EBLIP journal launches
10. The 5 A's of EBLIP
1) Formulate a focused question (Ask)
2) Find the best evidence to help answer that question (Acquire)
3) Critically appraise what you have found to ensure the quality of the evidence (Appraise)
4) Apply what you have learned to your practice (Apply)
5) Evaluate your performance (Assess)
11. The 5 A's process
(Hayward, 2007, http://www.cche.net/info.asp)
12. Is the EBLIP model used?
• The ideal vs. reality
• Criticisms of EBLIP
• Barriers to practicing in an evidence-based manner
13. Barriers to evidence use
• Organizational dynamics
• Lack of time / competing demands on time
• Personal outlook / lack of confidence
• Education and training gaps
• Information needs not being met
• Financial limits
16. "Questions drive the entire EBL process. […] The wording and content of the questions will determine what kinds of research designs are needed to secure answers." (J. Eldredge, 2000)
18. SPICE question structure
Setting – the context (e.g., university library, academic health center, K-12 school)
Perspective – the stakeholder(s) (e.g., graduate students, managers, reference librarians, parents, teachers)
Intervention – the service being offered (e.g., chat reference, RefWorks workshops, discovery layer)
Comparison – the service to which it is being compared (optional)
Evaluation – the measure used to determine change/success/impact (e.g., usage statistics, course grade)
19. SPICE example
Setting – Research university
Perspective – Librarians; professors
Intervention – Survey questionnaire to determine attitudes, perceptions, experiences
Comparison – Not applicable
Evaluation – Ratings of information literacy competencies; inclusion of IL in courses; disciplinary differences
20. Librarianship domains
Reference/Enquiries – providing service and access to information that meets the needs of library users.
Education – incorporating teaching methods and strategies to educate users about library resources and how to improve research skills. (LIS Education subset – specifically pertaining to the professional education of librarians.)
Collections – building a high-quality collection of print and electronic materials that is useful, cost-effective and meets the users' needs.
Management – managing people and resources within an organization. This includes marketing and promotion as well as human resources.
Information access and retrieval – creating better systems and methods for information retrieval and access.
Professional issues – exploring issues that affect librarians as a profession.
(Koufogiannakis, Crumley, and Slater, 2004)
24. Definition
“the available body of facts or information indicating whether a belief or proposition is true or valid” (Oxford English Dictionary, 2011).
25. Activity 4
What are some possible evidence sources we use to make decisions in libraries?
26. Evidence Sources
Hard evidence: published literature; statistics; local research and evaluation; other documents; facts
Soft evidence: input from colleagues; tacit knowledge; feedback from users; anecdotal evidence
28. Locating published research
• Databases
• Books, bibliographies
• Mail lists, blogs, word of mouth
• Conferences
• Systematic reviews, evidence summaries
29. Creating local evidence
• Usage data
• Transaction data
• Evaluation results
• Survey, interview, focus group findings
• Inputs, outputs, outcomes, impact
34. Locating published evidence
Evidence summaries: Evidence Based Library and Information Practice journal (2006-), >250 evidence summaries
http://ejournals.library.ualberta.ca/index.php/EBLIP
35. Creating evidence
Data and findings:
• Usage data
• Transaction data
• Evaluation results
• Survey, interview, focus group findings
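As a rough illustration of turning transaction data into local evidence, here is a minimal Python sketch. The file name (chat_transactions.csv) and the column names ("date", "question_type") are hypothetical placeholders; substitute whatever your own system actually exports.

    import csv
    from collections import Counter

    # Tally transactions by month and by question type from a hypothetical
    # CSV export with "date" (YYYY-MM-DD) and "question_type" columns.
    by_month = Counter()
    by_type = Counter()

    with open("chat_transactions.csv", newline="") as f:
        for row in csv.DictReader(f):
            by_month[row["date"][:7]] += 1      # "YYYY-MM" prefix of the date
            by_type[row["question_type"]] += 1

    for month, count in sorted(by_month.items()):
        print(month, count)
    print("Most common question types:", by_type.most_common(3))

Simple counts like these, tracked over time, are the kind of usage and transaction evidence listed above.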
36. Creating evidence
Sources for local evidence already available:
• Library assessment department
• University planning and institutional analysis
• Annual reports
• Internal reports
• "Stats"
37. Creating evidence
Dudden, R. F. (2007). Using benchmarking, needs assessment, quality improvement, outcome measurement, and library standards. New York: Neal-Schuman.
38. Evidence for example
Locating evidence:
• Databases: LISA
• Systematic Review Wiki
• Journals: Communications in IL, J of IL, J of Academic Librarianship
• Conferences: LILAC, LOEX, WILU
• EBLIP Evidence Summary
Creating evidence:
• Survey questionnaire
39. Activity 6
1. Identify 2-3 sources for locating evidence to answer your question
2. Consider 1 potential source of local evidence to look into
41. Critical appraisal
Weigh up the evidence:
• Reliable
• Valid
• Applicable
Checklists help with the critical appraisal process.
The language is different for interpretive (qualitative) research.
42. Reliability
1. Results clearly explained
2. Response rate
3. Useful analysis
4. Appropriate analysis
5. Results address research question(s)
6. Limitations
7. Conclusions based on actual results
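As a rough worked example of the response-rate check: if 150 of 400 invited participants complete a survey, the response rate is 150/400 = 37.5%, low enough to raise concerns that the respondents may not represent the population.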
43. Validity
1. Focused issue/question
2. Conflict of interest
3. Appropriate and replicable method
4. Population and representative sample
5. Validated instrument
45. CRiSTAL Checklist
For appraising research on user studies. Focuses on:
• Study design
• Results
• Relevance
Developed by Andrew Booth and Anne Brice. Available from: http://nettingtheevidence.pbworks.com/w/page/11403006/Critical%20Appraisal%20Checklists
50. Widening the model
A revised process:
1. Articulate – come to an understanding of the problem and articulate it.
2. Assemble – assemble evidence from multiple sources that are most appropriate to the problem at hand.
3. Assess – place the evidence against all components of the wider overarching problem. Assess the evidence for its quantity and quality.
4. Agree – determine the best way forward and, if working with a group, try to achieve consensus based on the evidence and organisational goals.
5. Adapt – revisit goals and needs. Reflect on the success of the implementation.
54. Ways to apply evidence
1) The evidence is directly applicable
2) The evidence needs to be locally validated
3) The evidence improves understanding
Reflection