How can we use metrics to better monitor, develop and discuss our services? Building on past experience in NHS Libraries and the wider sector, a set of principles for good metrics is described. These are illustrated using data generated by running. The quality metric template is introduced as a tool to help define and share metrics.
4. Why Metrics?
• How are we doing?
• How do we compare?
• Have changes made a difference?
• Have better conversations
5. Defining terms
• "A metric is criteria against which something is
measured" (Ben Showers (2015) Library Analytics
and Metrics)
• "a criterion or set of criteria stated in quantifiable
terms" (OED)”
6. What about KPIs?
1. Result Indicators (RIs)
RIs tell you what you have done
2. Performance Indicators (PIs)
PIs tell you what to do
3. Key Result Indicators (KRIs)
KRIs tell you how you have done in a perspective or critical success factor
4. Key Performance Indicators (KPIs)
KPIs tell you what to do to increase performance dramatically
According to Parmenter, cited in Appleton, L. (2017)
7. What was the KfH plan?
• Take a look around
• Identify appropriate methodologies and mechanisms
• Help people get better with metrics
• Support Knowledge for Healthcare
12. NHS explorations
Library Quality Assurance Framework (LQAF)
• Replaced HeLICon (2010 onwards)
• 48 criteria across 5 domains
– Strategic Management
– Finance and Service Level Agreements
– Human Resources and staff management
– Infrastructure and facilities
– Library/ Knowledge Services Delivery and Development
• Annual submission
13. NHS explorations
LQAF
Pro
• Rigorous
• Regular
• Linked to stakeholders
• Growing pool of data
Con
• Inconsistent compliance regimes
• Self-assessment subjective
• Burden of evidence collection
14. NHS explorations
SHALL National KPIs
• 2011 consultation on 6 national KPIs
• Revised to 4 (not all from original list – a worked sketch of these ratios follows below)
– % of the organisation’s workforce (headcount) who are registered library members.
– % of the organisation’s workforce (headcount) who have registered as a library member in the last year.
– % of the organisation’s workforce (headcount) who have used ATHENS in the last year.
– % increase in compliance with the Library Quality Assurance Framework (LQAF) compared with the previous year.
• Not implemented
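A quick illustration of how these headcount KPIs reduce to simple ratios: a minimal Python sketch using invented figures (the organisation numbers below are assumptions for illustration, not consultation data).

def pct(part: float, whole: float) -> float:
    # Percentage of `whole` represented by `part`.
    return 100 * part / whole

# Hypothetical organisation figures, illustrative only.
workforce_headcount = 10_000
registered_members = 2_500       # current registered library members
new_members_last_year = 600      # registered as members in the last year
athens_users_last_year = 1_800   # used ATHENS in the last year
lqaf_last_year, lqaf_this_year = 80.0, 86.0  # % LQAF compliance scores

print(f"KPI1 membership: {pct(registered_members, workforce_headcount):.1f}%")        # 25.0%
print(f"KPI2 new members: {pct(new_members_last_year, workforce_headcount):.1f}%")    # 6.0%
print(f"KPI3 ATHENS use: {pct(athens_users_last_year, workforce_headcount):.1f}%")    # 18.0%
print(f"KPI4 LQAF increase: {pct(lqaf_this_year - lqaf_last_year, lqaf_last_year):.1f}%")  # 7.5%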
16. Practice in the NHS (at the time)
• Brief KfH survey on metrics in use
• 150 responses, but only 47 respondents offered any metrics
• 117 metrics suggested in total
17. Areas of focus and approaches
[Bar chart, scale 0–30: count of suggested metrics by area of focus – Access, Book/physical, Current awareness, Document Supply/ILLs, Enquiries, E-Resource Use, Literature Searches, Outreach, Quality assurance, Training, Unclear, User registration, Website – with each bar broken down by approach: Impact, LQAF, Satisfaction, Timely Response, Usage statistics, Value, Not stated]
18. Serendipity
• Areas for focus (Van Loo in Haines-Taylor & Wilson, 1990):
– time consuming
– space intensive
– high cost
– affect most users
– directly linked to library objectives
– well defined and easy to describe
– relatively easy to collect
– are in areas where library staff have some control to make changes
19. Wider world - libraries
International standard (ISO 11620:2014)
• Generic approach to performance indicators
• Well defined terms
– Resources
– Use (activity)
– Efficiency (cost)
– Potentials and Development (value added work)
• 52 indicators offered
20. Wider world - libraries
International standard - criteria
Informative content (provides information for decision making)
Reliability (produces same result when repeated)
Validity (measures what it is intended to measure – though indirect measures can be valid)
Appropriateness (units and methods of measurement appropriate to purpose)
Practicality (does not require unreasonable staff or user time)
Comparability (the extent to which a score will mean the same for different services – standard is clear you should only compare similar services)
21. Wider world - libraries
RLUK – service standards
• Pilot of 8 initial standards
• “We will achieve X% in Y”
• Shift to benchmarking approach
• Potential kitemark
22. Wider world
The Metric Tide - dimensions
“Robustness: basing metrics on the best possible data in terms of accuracy and scope
Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment
Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results
Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system
Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.”
23. Wider world
HSCIC – Quality Assurance Indicators Tool
Relevance (Does it meet user need? Is it actionable?)
Accurate and reliable (Quality of data? Is it a good estimate of reality?)
Timeliness and Punctuality (How long after the event is data available / collected?)
Accessibility and clarity (How easy is it to access the data? How easy is it to interpret?)
Coherence and comparability (Are data from different sources on the same topic similar? Can it be compared over time?)
Trade-offs (Would improving this metric have a negative impact on another?)
Assessment of user needs and perceptions (What do stakeholders think?)
Performance, cost and respondent burden (How much work is involved in collection?)
Confidentiality and transparency
25. Principles for good metrics
Meaningful
• Relates to goals of organisation
• Relates to needs of stakeholders
• Re-examined over time to ensure still valid
26. Principles for good metrics
Actionable
• Measures what matters
• Measures something you can influence
• Drives changes to behaviour / services
• Investigate, don't assume
27. Principles for good metrics
Reproducible
• Clearly defined in advance
• Transparent
• Can be replicated
• Best available data
• Non-burdensome (to allow repetition)
28. Principles for good metrics
Comparable
• Valid over time for internal use
• Valid externally for benchmarking
• Respect diversity of services
29. A quick run through
• What are your running metrics?
• Scribble a couple down and think how they work out as we go along
30. Meaningful running
• Citius, Altius, Amplius
• What are my goals?
• Are they valid this year?
• Not just about times
31. Actionable running
• What matters?
• Can I influence it?
• Changes behaviour?
• Danger of assumptions
36. NHS Athens usage levels (M/A/R/C: Meaningful, Actionable, Reproducible, Comparable)
M – Very precise goal!
A – Influencing how?
R – What is usage in this context? Summer versus Autumn?
C – This year versus last?
37. Lit search turnaround time
M – How set? Negotiated deadlines? Does it matter?
A – Is it ever broken?
R – Is this days or searches? Is there a combined measure that would be clearer?
C –
38. ToC generated requests
M – What do the colours mean? What do the requests mean? Importance?
A – Ability to influence this?
R – 5 or less what?
C – Longer run of data?
39. ToC generated requests
M – What do I get from this as a stakeholder? Is this what I want?
A – How would we change?
R – What is included here? Timed how? Is this accurate?
C –
45. Metric Definition: Achieve Bronze for devotion (regularity of running) at 4 week, 12 week, 24 week and 1 year time periods using the SmashRun Rank system
Why is it important?
Demonstrates consistent activity over an extended period
Regular running is the way to improve
Having a target motivates
Looking to all the time periods encourages regularity while smoothing patterns through the year
Process for compiling the Metric:
Open http://smashrun.com/alanfricker/ranks and select the Devotion option
Ensure Demographic is set correctly for comparison (M / 40-49)
Check for medal status
What does it mean?
Records number of sessions versus days in each time period and ranks me against others in the same demographic
Bronze equals top 50% (a sketch of this calculation follows below)
Limitations:
Takes no account of distance, so I could track very brief runs to up the averages
Seasonal variation in others' running means the bronze standard on shorter time periods varies through the year (fewer sessions in the winter overall)
Desired outcomes:
Increase average number of runs over a year from 1.8 per week in 2017 to 2.1
Improvement plans:
Find someone to run with / drag the family to parkrun
Take running shoes on holiday
Organise #libraruns meetups at races / conferences
Reporting:
Review in year but consider full impact at year end
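SmashRun's ranking method is not published, so only the personal half of this metric can be reproduced locally. A minimal sketch of the sessions-versus-days calculation, assuming run dates have been exported to a list (the training log below is invented; the Bronze medal itself depends on SmashRun's demographic comparison and cannot be computed from personal data alone):

from datetime import date, timedelta

def devotion(run_dates, window_days, today):
    # Fraction of days in the trailing window with at least one run.
    start = today - timedelta(days=window_days)
    days_run = {d for d in run_dates if start < d <= today}
    return len(days_run) / window_days

# Hypothetical training log: a run every third day from 1 January 2018.
runs = [date(2018, 1, 1) + timedelta(days=3 * i) for i in range(40)]
today = date(2018, 4, 30)

for label, days in [("4 week", 28), ("12 week", 84), ("24 week", 168), ("1 year", 365)]:
    print(f"{label}: ran on {devotion(runs, days, today):.0%} of days")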
47. Great tools
• Running has Strava, Endomondo, Smashrun, parkrun data, RunBritainRankings, etc.
• What do we have? (a scripting sketch follows this list)
– OpenAthens
– COUNTER
– LMS
– Other systems (KnowledgeShare, MailChimp etc)
– Tools for feedback?
– Your favourites…
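Most of these systems can export usage as a spreadsheet or CSV, and a few lines of scripting make the compilation step reproducible rather than manual. A minimal sketch, assuming a hypothetical export file and column names (neither reflects any real system's format):

import csv
from collections import Counter

# "usage_export.csv" and its columns are assumptions for illustration;
# real exports (COUNTER reports, OpenAthens statistics) have their own layouts.
monthly = Counter()
with open("usage_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        monthly[row["month"]] += int(row["downloads"])

for month in sorted(monthly):
    print(f"{month}: {monthly[month]} downloads")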
48. Get stuck in
• You should have brought some ideas!
• Your tables have
– Copies of the template (a structured sketch of its fields follows this list)
– Copies with space to write
– Copies of LYP dashboard
– Copies of LTH Quality
– Copies of metrics from the bank
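As flagged in the list above, a structured sketch of the template's fields. The real template is a simple Word document (see the notes below), so this is only an illustrative restatement of its headings, filled in with the running example from slide 45:

from dataclasses import dataclass

@dataclass
class QualityMetric:
    # Headings from the quality metric template, recast as a record.
    definition: str
    why_important: list
    process: list            # steps for compiling the metric
    what_it_means: str
    limitations: list
    desired_outcomes: list
    improvement_plans: list
    reporting: str

example = QualityMetric(
    definition="Achieve Bronze for devotion at 4/12/24 week and 1 year periods",
    why_important=["Demonstrates consistent activity over an extended period"],
    process=["Open SmashRun ranks page", "Set demographic (M / 40-49)", "Check medal status"],
    what_it_means="Sessions versus days per period, ranked against demographic",
    limitations=["Takes no account of distance"],
    desired_outcomes=["Increase average runs per week from 1.8 to 2.1"],
    improvement_plans=["Run with others", "Take shoes on holiday"],
    reporting="Review in year; consider full impact at year end",
)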
49. Have a go
• Draft one yourself
• Discuss with your table
• Ask for help
• Be ready to share!
50. Make a deposit in the bank
Promoting sharing and supporting use
• Open submission and publication
• Quality check – mostly around reproducibility
• One from each of you please!
51. Thanks
Alan Fricker - Head of NHS Partnership & Liaison, King’s College London
Alan.Fricker@kcl.ac.uk
Images authors own or CC0 from pixabay.com
@NHS_HealthEdEng #heelks
Editor's Notes
You could say: something to argue with.
We started by considering where metrics (and quality assurance / KPI) had been discussed in the NHS previously – we stuck with major initiatives and did not seek out an exhaustive picture of local work.
HeLICon has roots in the original LINC health panel accreditation checklist and toolkit (1996-1998)
Example of use of these figures in the NLH finance report
How have we addressed the cons?
Previous attempt to address this issue. Feel very culpable here as one of the people who shot holes in things. Basically – I could game almost every single one – the question was – did they matter?
First six
KPI1. Percentage of the organisation’s workforce (headcount) which are “active” library users. (Indicates penetration of library service.)
KPI2. Percentage of the organisation’s workforce (headcount) which are registered ATHENS users. (Indicates use of e-resources.) (E.g., 1,000 Athens users in an organisation of 10,000 staff = 10%)
KPI3. Recurrent expenditure commitment on library services based on the organisation's workforce (WTE). (Indicates Trust commitment to library services.) (E.g., £100,000 spent on library services in a Trust of 10,000 staff = £10 spent on library services per WTE)
KPI4. Number of information consultancy enquiries per member of staff based on the organisation's workforce (WTE). (Indicates penetration level of library enquiries on the organisation.) (E.g., 400 enquiries in an organisation with 1,000 staff = a penetration level of 0.4)
KPI5. Percentage of the organisation's workforce (headcount) that subscribe to current awareness services. (Indicates penetration level of current awareness services on the organisation.)
KPI6. Percentage of the organisation's workforce (headcount) which have received information skills training in one year. (Indicates penetration of information skills / information literacy training on the organisation.)
Remember that it can be easy to pick holes in things. But the metric might be the best one you can have. Our ship might get torpedoed but it is useful in the meantime and no one should die (with a bit of luck)
Why so few metrics? Issue with tool? Survey overload? Discomfort with metrics?
Discovered on the discard pile – they described what we were seeing in the survey data perfectly
Bingo! Powerful way to think about what we are interested in
Research Libraries UK. Targets set across the piece do not make sense.
Debate in HE around use of metrics – post REF 2014 and in an increasingly numbers-driven approach to career futures.
Now known as NHS Digital. National library of quality assurance indicators – a task under the 2012 Health and Social Care Act – aimed at healthcare delivery and performance but works for our quality purposes too
People care about this metric
Often combines more than one facet
Aligned to organisational goals
This metric makes a difference and is one that you can change
No numbers without stories
A piece of research
You could repeat my metric and the results would be consistent
See change over time
Internal more reliable than external as fewer variables
Take care with comparisons!
Who runs?
Faster, Higher, Further – Olympics is Citius, Altius, Fortius, which is Latin for "Faster, Higher, Stronger"
Lots of data – distance, time, splits, frequency, cadence, heart rate – but what is interesting changes according to my meaningful criteria
What I do makes a difference and the metrics definitely make a difference to my behaviour (data changed my view of running)
Watch for head winds – muddy days
Up to me to define goals. Data is open for scrutiny. I can run the same routes and see what happens
The same park run!
Comparison to myself (and other matched people!)
Age grading as an option! And Age generally!
Doing this is not easy! The template is there to help
Main template – spells it out and offers gaps. Simple Word based
Checklist – good enough for Gawande and the WHO – good enough for me
Booklet with modified forms (previous version)
Uses 9 lightly modified templates to define and present quality standards
Used to help present a wide picture of the service
Revisiting standards at the moment to ensure they remain meaningful – the document supply turnaround target, for example, is always met. Is it challenging? Meaningful?
Good feedback from colleagues within the service and those she works with – wider use in the division, who were thinking about similar issues
Three years was the plan but it was more organic in the end
(accessing GMC data, considering options for KnowledgeShare)