We can improve the way we commission and create services if we change our model. Using Design Thinking we can create services that meet user needs and are also cheaper to run.
2. Using data from individuals to describe communities and demonstrate outcomes
1. Using Design Thinking we can create better services that meet people's needs
2. Data has a social value beyond proving contracts are working
3. Using data we can describe the communities we live in and co-produce services
4. Demonstrating outcomes in terms of national policy allows data to be aggregated for social benefit
3. We all love the idea of measuring outcomes rather than activity
4. Challenges with Measuring Outcomes
• Attribution – Did you cause that outcome?
• Complexity – Especially in preventative services
• Why are you doing it?
• Contract-specific outcomes can change a service
• Place vs service
7. Contributors to Overall Health Outcomes
• Socio-economic factors – 40%
• Healthy behaviours – 30%
• Clinical care – 20%
• Built environment – 10%
Source: Robert Wood Johnson Foundation and University of Wisconsin Population Health Institute
8. There are two types of outcome
1. Systemic outcomes – e.g. A&E attendance, GP appointments
2. Individual outcomes – measurable improvement in health and social situation
9. A focus on outcomes should fundamentally change the way we commission and design services
10. By focussing on measuring the system we miss the opportunity of human-centric design
11. Challenges With Commissioning
• The difference between commissioning and procurement
• Co-designing services is more expensive
• What if the public don’t agree?
• Where do you get your data from?
12. What do we commission against?
• QOF Data
• Open Exeter Data
• National Data Sets (Atlas - life expectancy, infant mortality)
• Local data sets?
• Mosaic?
• Community data on service design?
13. Traditional Commissioning Model
Build service model → Test model with patient/user groups → Decide on some metrics → Go out to contract → Measure metrics
This is often the point where we start getting involved.
14. What is the purpose of collecting data? Is it purely to measure transactions?
16. Alternative Service Design Model
Set out problem to be solved → Go out to contract → Define user needs → Identify data to meet user needs → Design service → Adapt service → Test service against user needs → Prove the problem is being solved
17. Tracey is 24.
Tracey grew up in Chelmsley Wood until she was 21. She lived at home until she fell out with her family and came to stay with friends in Birmingham. After a couple of months it was obvious there wasn't enough room in the one-bedroom flat she was staying in, and without anywhere else to go she ended up in a hostel in Birmingham. It's not a great part of Birmingham and Tracey doesn't feel safe walking around there.
Tracey can't stay in the hostel during the day so spends a lot of her days in friends' flats. During the day she and her friends watch TV and usually drink quite a lot of vodka. She has completed over 2,000 levels of Candy Crush.
18. Tracey:
● In debt
● Socially isolated
● Lives in a hostel
● Been to see her GP 7 times in 3 months
● Stressed and anxious
● Attended A&E on two occasions with alcohol-related issues
● Smokes
● Misuses alcohol
● Poor diet
● No exercise
Socio-economic factors • Clinical factors • Healthy behaviours
19. The danger of the microscope
Defining people by the services they use, and the data we choose to collect on those services, leads to data reductionism.
20. Not understanding the person can lead to Solutionism: the providing of a solution or solutions to a customer or client (sometimes before a problem has been identified).
24. Integration around the individual
GP • A&E • Smoking Cessation • Alcohol Services • Job Centre • Money Advice • Gym • Housing • Health Trainer → Collective Outcomes
25. Data Collection is Structured to Match the Life Course
• Starting Well Data Dictionary
• Developing Well Data Dictionary
• Working Well Data Dictionary
• Living Well Data Dictionary
• Ageing Well Data Dictionary
But also service specific:
• Diabetes Data Dictionary
• Mental Health Data Dictionary
• Supported Housing Data Dictionary
• End of Life Data Dictionary
• Domestic Abuse Data Dictionary
We have identified 93 common risks and issues. Each has been defined and is monitored for any change in policy. We are adding to this list all of the time.
26. Living Well Data Dictionary
Personal Circumstances:
• Domestic Abuse
• Homeless
• Temporary Accommodation
• Unsuitable Accommodation
• Vulnerable Adult
• Financial Hardship
• Social Isolation – Loneliness
• Environment – Noise
• Environment – Outdoor Spaces
Behaviour:
• Very Low Fruit & Vegetable Intake
• Low Fruit and Vegetable Intake
• Significant Fried and Processed Food Intake
• Excessive Sugar
• Nutrition – Iron
• Physical Activity – Moderately
• Physical Activity – Inactive
• Alcohol Misuse
• Smoking
• Substance Misuse
Status:
• Weight – Overweight
• Weight – Obese
• Mental Health – Low Reported Wellbeing
• Mental Health – Stress and Anxiety
• Sexual Health – Unwanted Pregnancy
• Sexual Health – Sexually Transmitted Infections
• Pre-Diabetes – Non-Diabetic
• Screening – Increased Blood Pressure
• Screening – High Blood Pressure
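A data dictionary like this can be represented as structured records grouped by category. This is a minimal illustrative sketch, not the actual dictionary implementation; the `RiskFactor` type, the field names, and the handful of example entries are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class RiskFactor:
    """One defined risk or issue in a data dictionary."""
    name: str
    category: str  # e.g. "Personal Circumstances", "Behaviour", "Status"


# A few illustrative entries drawn from the Living Well Data Dictionary
LIVING_WELL = [
    RiskFactor("Homeless", "Personal Circumstances"),
    RiskFactor("Social Isolation - Loneliness", "Personal Circumstances"),
    RiskFactor("Smoking", "Behaviour"),
    RiskFactor("Alcohol Misuse", "Behaviour"),
    RiskFactor("Weight - Obese", "Status"),
]


def by_category(factors, category):
    """Return the names of all factors in one category, in dictionary order."""
    return [f.name for f in factors if f.category == category]


print(by_category(LIVING_WELL, "Behaviour"))  # ['Smoking', 'Alcohol Misuse']
```

Defining each risk once, centrally, is what makes it possible to monitor all 93 definitions for policy changes and to aggregate data consistently across services.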
28. Financial Hardship: mapping to national outcomes frameworks
Frameworks: Social Justice Outcomes Framework; Healthy Child Programme; Improving Outcomes and Supporting Transparency (Department of Health; Department for Work and Pensions)
Example indicators:
• Household income is >60% of UK average
• The family can afford food and clothing items
• After required fuel costs the family remains above the poverty line
• Reduce households where neither parent is in work
• Reduce the proportion of those on work-related benefits
• The number of working-age adults engaged in work-related activity
32. Our tips to designing services
Start with the person
Think about data that is relevant to the person
Be broad in the data you collect
Make data collection as simple as possible
Iterate regularly
If the data says the service isn't working then change the service
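The last two tips describe a feedback loop: measure regularly, and change the service when the data says a user need is unmet. This toy sketch shows the shape of that loop; the function names, the set-based "service", and the one-gap-per-cycle redesign rule are all illustrative assumptions, not a real commissioning system.

```python
def iterate_service(service, user_needs, measure, redesign, max_cycles=12):
    """Each cycle: measure the service against user needs; if any need is
    unmet, change the service rather than just doing more of the same."""
    for cycle in range(1, max_cycles + 1):
        unmet = [n for n in user_needs if not measure(service, n)]
        if not unmet:
            return service, cycle  # the data shows every need is met
        service = redesign(service, unmet)
    return service, max_cycles


# Toy example: a "service" is modelled as the set of needs it currently meets.
needs = {"integration", "easy navigation", "secure data"}
measure = lambda svc, need: need in svc
redesign = lambda svc, unmet: svc | {unmet[0]}  # address one gap per cycle

final, cycles = iterate_service({"integration"}, needs, measure, redesign)
```

The point of the sketch is the control flow: success is defined by user needs, not by activity, and a failing measurement triggers a redesign rather than more of the same.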
Four orders of design: graphic artefacts, products, processes, services (and systems).
2012 saw a systemic shift from activity to outcomes, largely supported by people. But what's the breakdown of outcomes vs activity? Typically we find that it breaks down to 95% activity and 5% outcomes. Frequently, things labelled as outcomes are actually activity.
Do you routinely use these frameworks to assess progress? In the main, frameworks are the domain of commissioners, but aligning with them makes you a more attractive provider.
The challenge of moving to a focus on outcomes is that the metric of success is no longer how busy you are but how effective you and your services are. The move from a quantitative to a qualitative approach means you frequently have to make more fundamental changes when the data tells you a service isn't working. If an activity-based contract is failing, then just do more; if an outcome-based contract is failing, then doing more of the same is a really bad idea.
The main focus of investment is in clinical care. It costs more money, requires significantly more long-term investment, and is less effective. Traditionally, services in the bottom two tiers have struggled to demonstrate their worth to the people with money.
There is a disconnection in priorities.
Providers (and, not surprisingly, individuals) prioritise individual outcomes. Commissioners tend to focus on system outcomes. They are related. Systemic outcomes are relatively easy to achieve; cynically, you could achieve them by degrading the quality of a service. That wouldn't mean an improvement in overall health and wellbeing.
Such a fundamental change in emphasis needs to be supported by a change to the way we carry out service design. Doing what we used to do and just changing the reports that people complete won't work.
Putting the people that use a system at the heart of the design process means you can create a service that can be empirically tested as effective. By putting continual challenge at the heart of your service you create something that is agile and can respond to changes in need.
We don't incorporate failure into our commissioning model. If a service fails, what do we do? Do we allow the service to redesign in line with patient expectations, even if this creates a different model to that specified in a contract? Or do we just wait out the contract length and then not recommission? This approach creates dead money in the system. Agile service design can create services that are not only effective but also efficient.
I’ve never come across anyone systematically using community data in service design or commissioning.
Data shouldn't be an afterthought. It should run through your service design process from the very beginning.
We need to think about why we collect data. Does it facilitate supporting the individual? Does it prove your contract is working? Does it support how your service works? Have you considered how the data can be used, how it can be analysed, and what information it creates? Have you thought about the technology implications of data collection and data protection?
Knowledge is the part where you get insight. Wisdom is where it becomes actionable. Can you validate all of the data you collect against this model? Do you collect data to create value as a matter of course?
Iterative processes remove, or at least mitigate, the need for the dreaded evaluation. When you test a service against user needs, remember to test both the data and the emotional response.
This is a persona built from user interviews. Notice that none of this relates to services. This is the person.
Personas need to be at the heart of all service design.
If we see someone accessing a smoking cessation service as purely a smoker we miss the complexity that might be contributing to why someone smokes.
Solutionism can lead us to decide on a service design that we like and then make us start to look for the data that will support it.
The system is complex but Tracey doesn't need to know this. Tracey has three needs: 1) to get services to integrate around her; 2) to be able to navigate the system without friction; 3) to know her personal data is being used productively and securely. Every element of this system has a data need from Tracey. They rarely co-ordinate that need.
Attribution is a particular problem for services that are removed from direct client interventions. For example, if a social prescribing project refers someone to an exercise class and that person begins exercising, who owns that outcome?
If Tracey goes to a money advice service and a smoking cessation service, these could impact each other. Giving up smoking could free up a lot of weekly money for Tracey, but does that mean the money advice agency has achieved an outcome?
At the end of the day, does it really matter how it happened? Alliance contracting provides a model to properly measure collective impact.
This should be fundamental to how you manage services.
Every month can you identify a tangible service improvement and implement it?
Aggregating data across services allows us to truly gain insight into neighbourhoods and means data provides social value.
A holistic approach supports integration. The adversarial form of commissioning is ending. Simplicity in data collection minimises data protection issues and increases compliance and accuracy.