TRANSCRIPT
Mental Health & Wellbeing Commissioning Pack Programme: Cost-Consequence Tool (Proof of Concept) – Final Deliverable December 2012
Cost-Consequence Tool Development Team
Avon and Wiltshire Mental Health Partnership NHS Trust Julie Hankin
Brunel University Dr Julie Eatock
Prof. Tony Elliman
Dr Joanne Lord
Peter Taylor
Prof. Terry Young
Independent Adviser Letsie Tilley
NHS Hampshire Joanne Kiff
Southern Health NHS Foundation Trust Jessamy Baird
Chris Woodfine
The Whole Systems Partnership Carol Cochrane
Peter Lacey
Lucy O’Leary
University of Southampton Prof. David Kingdon
Table of Contents

Cost-Consequence Tool Development Team
Introduction
Tool’s Purpose and its place within the Mental Health Commissioning Pack
“Systems Thinking” basis of tool
Reinforcing and Balancing loops
Example of influences in a mental health system
Systems map of mental health services
Running the model
Baseline outputs
Modelling change
Scenario 1
Scenario 2
Combining scenario 1 & 2
Current limitations
Commissioner and Provider Perspectives
Local customisation
Negotiation
Consultation Undertaken
Further Development
Alignment with Other MH Commissioning Tools
Appendix 1 Influence Diagrams
Appendix 2 User Guide To Model
Introduction

This document accompanies and contextualises the last of three contracted outputs (a proof-of-concept cost-consequence tool) from a consortium led by the Mental Health Foundation and tasked by the NHS Confederation, acting on behalf of the SHA Mental Health Leads’ Group and the Mental Health Joint Commissioning Panel. The cost-consequence tool complements the consortium’s two earlier, related deliverables (Cases for Change and Service Specifications & Contract Inserts).
Consortium organisations involved in the cost-consequence tool’s development are the Avon and Wiltshire Mental Health Partnership NHS Trust, Brunel University, NHS Hampshire, the Southern Health NHS Foundation Trust, the University of Southampton and The Whole Systems Partnership. Individual members of the development team from each consortium organisation are named in the list at the start of this document.
The document describes the cost-consequence tool’s scope and purpose and provides guidance about its use. The document also introduces a set of charts (influence diagrams), designed to help commissioners see ‘at a glance’ the possible impact of commissioned changes in service provision. These, together with supporting commentary about underpinning assumptions and envisaged risks, offer information that can be used in conjunction with or independently of the tool.
Lastly, the document sets out the scope, extent and results of consultation undertaken as part of the tool’s development. It also covers some of the main issues and ideas identified around the tool’s future dissemination, deployment, onward development and support, as well as its potential for generalisability. Also included are emerging results of dialogue with developers of other tools commissioned by the SHA Mental Health Leads’ Group and the Mental Health Joint Commissioning Panel (i.e. the London Needs Assessment Service & Financial Profiling Tool and the West Midlands PbR Tool) about the cost-consequence tool’s alignment with other tools designed to support the commissioning of mental health services.
Tool’s Purpose and its place within the Mental Health Commissioning Pack
We defined the purpose of our assignment to build a prototype tool as
“To provide a strategic tool to inform commissioning discussions between CCGs and the Specialist Provider that is built on MH Clusters and will link to both local and national tariffs but which also reflects service pathways and redesign options.”
The Tool is intended to be used jointly by commissioners and providers, to encourage discussion and facilitate decisions about the type and level of services being commissioned to meet service users’ mental health needs.
The Tool is a prototype: it demonstrates how the cost-consequences arising from variations in the type and level of services commissioned to meet service users’ mental health needs can be projected and compared, aligned to the ‘Cases for Change’ scenarios, which have also been developed as part of the Mental Health Commissioning Pack.
Relevant questions from the commissioners’ or providers’ perspectives may include:
- What is the impact of changes in numbers of service users (increased referrals, better early diagnosis, demography etc.)?
- How might the rates of flow between clusters change over time?
- What happens if the capacity in inpatient care is reduced?
- Do we need to invest in community services?
- How can we best reduce total costs?
“Systems Thinking” basis of tool
Health care is a complex system with many inter-related actions and consequences. This means that when we consider making a change to any part of the system we need to take account of the system in its entirety if we want to assess the full impact of implementing the change. Systems thinking allows us to identify and understand how changes over time affect all parts of the system, through the use of feedback loops. This is best illustrated by some simple examples.
Reinforcing and Balancing loops

There are two main types of loops: reinforcing loops and balancing loops. Consider the case of savings: the more savings you have, the more interest you earn; and assuming that this interest is added to your savings, you then have more savings and therefore earn more interest, and so on. This means that over time the amount of savings will continually rise.
Figure 1 Reinforcing Loops. Reproduced courtesy of The Whole Systems Partnership
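The reinforcing savings loop can be sketched in a few lines of Python (a minimal sketch: the function name, starting balance and interest rate are illustrative, not part of the tool):

```python
# Reinforcing loop: savings earn interest, and the interest is added back to
# savings, so each period's gain is larger than the last.
def project_savings(initial_balance, annual_rate, years):
    balance = initial_balance
    history = [balance]
    for _ in range(years):
        balance += balance * annual_rate  # interest feeds back into the stock
        history.append(balance)
    return history

# £1,000 at 5% per year: the balance rises, and rises faster each year.
history = project_savings(1000.0, 0.05, 10)
```

The steeper-and-steeper curve this produces is exactly the "behaviour over time" graph that characterises a reinforcing loop.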
Whereas reinforcing loops have the effect of continually increasing (or decreasing) the output over time, balancing loops reach a steady state. Suppose you have a desired inventory level (e.g. the stock of drugs on a ward) and you are currently below it. You make inventory adjustments, so the actual level of inventory increases and the discrepancy between the actual and desired levels reduces. If the actual level is still below the desired level, you make further adjustments, further increasing the inventory and further reducing the discrepancy. This loop continues until there is no discrepancy between the desired and actual levels of inventory, which has the effect of balancing the inventory at the desired level.
Figure 2. Balancing Loops. Reproduced courtesy of The Whole Systems Partnership
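The inventory example can likewise be sketched as a balancing loop, assuming a fixed fraction of the discrepancy is corrected each period (the adjustment fraction is our illustrative assumption):

```python
# Balancing loop: adjustments shrink the gap between actual and desired
# inventory, so the stock converges on the target rather than growing forever.
def run_inventory(desired, actual, adjustment_fraction, periods):
    levels = [actual]
    for _ in range(periods):
        discrepancy = desired - actual
        actual += adjustment_fraction * discrepancy  # partial correction per period
        levels.append(actual)
    return levels

# Starting 80 units below target, the level climbs and settles at the target.
levels = run_inventory(desired=100.0, actual=20.0, adjustment_fraction=0.3, periods=30)
```

The output rises quickly at first and then flattens as the discrepancy shrinks: the goal-seeking "behaviour over time" of a balancing loop.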
However, most complex systems – and health care is no exception – consist of combinations of these types of loops that interact with each other, so the impact over time depends on the strength of the relationships both within and between the loops. Consider the following case: if the morale of workers is high, productivity will increase, so sales growth will increase, so opportunities for promotion will increase, further increasing morale. This is a reinforcing loop and would imply that the cycle can continue indefinitely. However, we know from experience that this is not the case in reality. A second loop exists: as sales growth increases, the level of market saturation increases, and as market saturation increases, sales growth decreases. Combining these two loops means that sales grow over time until a level of saturation is reached and sales growth stagnates. At this point promotion opportunities will reduce, staff morale will drop, productivity will drop, and consequently sales growth will drop rather than level out.
Figure 3. Combining Reinforcing loops and Balancing Loops. Reproduced courtesy of The Whole Systems Partnership
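The rise-then-stagnate behaviour can be reproduced with one reinforcing and one balancing loop; this is a minimal sketch in which the market size and base growth rate are illustrative assumptions (the morale-driven decline described above would need an additional lagged loop):

```python
# Reinforcing loop (sales compound) coupled to a balancing loop
# (market saturation damps growth): sales rise, then level off.
def limits_to_growth(periods=60, market_size=1000.0, base_growth=0.2):
    sales, cumulative = 10.0, 0.0
    history = []
    for _ in range(periods):
        cumulative += sales
        saturation = min(cumulative / market_size, 1.0)
        growth = base_growth * (1.0 - saturation)  # damped as the market fills
        sales *= 1.0 + growth
        history.append(sales)
    return history

history = limits_to_growth()  # rises at first, then plateaus once saturated
```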
The speed at which the growth rates rise and fall will be dependent on the strength of the relationship between the various parts of the system. So in the case of interest earned on savings, the higher the interest rate the steeper the gradient of the graph.
Example of influences in a mental health system
The diagram below shows how the various parts of the mental health system are linked. Some aspects (e.g. social care), although important, were considered beyond the scope of this project at this stage and are therefore not included within this model.
Figure 4 Influences within a mental health system
The colour of the arrows indicates whether values will move in the same direction (green) or the opposite direction (red). For instance, as the number of emergency admissions increases, the number of beds available will decrease, so the arrow is red. However, as the number of service users discharged earlier increases, the number of service users being treated in the community will also increase, and therefore the arrow is green. Each arrow is numbered, and the reasoning behind the link and the colouring of the arrow is given in Table 1. These numbers are then used to explain the reinforcing and balancing loops that exist within the system. Figure 5 and Figure 6 depict the individual loops within the system and identify them as either balancing or reinforcing, with explanations taken from the table.
Table 1 Influence descriptions
Key Description of influence Assumptions
1 As the number of beds available decreases the number of service users admitted will also decrease
Beds utilisation remains the same, so empty beds will be filled as the threshold for admission will change
2 As the number of service users admitted decreases the acuity of the admitted patients will increase
This assumes that the most severe cases will still need to be admitted, and the less severe will be treated in the community
3 As the acuity of the admitted patient’s condition increases, the average length of stay as an inpatient will increase
Assumed that those with the more severe problems take longer to treat and stabilise and therefore have longer LOS
4 As the average length of stay increases the number of available beds will decrease
5 As the number of service users admitted decreases, the number of service users that receive treatment in the community will increase
This assumes that as the service users are in the system they need to be treated somewhere
6 As the number of service users receiving treatment in the community increases the staff to service user ratio decreases
This assumes that the number of staff and skill mix doesn’t change
7 As the staff to service user ratio decreases the number of emergency admissions will increase
8 As the number of emergency admissions increases the number of available beds will decrease
9 As the number of available beds decreases the number of service users discharged earlier will increase
This assumes that service users will be discharged to ensure enough capacity for unanticipated service users rather than sending to inpatient facility out of area
10 As the number of service users discharged earlier increases the average length of stay will decrease
11 As the number of service users discharged earlier increases then the number of service users being treated in the community will also increase
This assumes that as the service users are in the system they need to be treated somewhere
12 As the number of service users being treated in the community increases the demand for supported accommodation services will increase
Accommodation service is outside the scope of this model
13 As the staff to service user ratio decreases the number of GP visits will increase As appointments are more difficult to make, service users will start to rely on GP services (GP visits are outside the scope of this model)
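The signed influences in Table 1 lend themselves to a mechanical check: multiply the signs around a loop, and a product of +1 means reinforcing while −1 means balancing. A sketch using influences 1–4 and 9–10 (the dictionary encoding is ours, not part of the tool):

```python
# Each influence maps (source, target) to +1 (same direction, green)
# or -1 (opposite direction, red), following Table 1.
influences = {
    ("beds available", "admissions"): +1,        # influence 1
    ("admissions", "acuity"): -1,                # influence 2
    ("acuity", "length of stay"): +1,            # influence 3
    ("length of stay", "beds available"): -1,    # influence 4
    ("beds available", "early discharges"): -1,  # influence 9
    ("early discharges", "length of stay"): -1,  # influence 10
}

def classify_loop(edges):
    sign = 1
    for edge in edges:
        sign *= influences[edge]
    return "reinforcing" if sign > 0 else "balancing"

# Beds -> admissions -> acuity -> LOS -> beds: reinforcing (cf. Figure 5).
acuity_loop = [("beds available", "admissions"), ("admissions", "acuity"),
               ("acuity", "length of stay"), ("length of stay", "beds available")]

# Beds -> early discharges -> LOS -> beds: balancing (cf. Figure 6).
discharge_loop = [("beds available", "early discharges"),
                  ("early discharges", "length of stay"),
                  ("length of stay", "beds available")]
```

Classifying a loop this way only tells you its direction of feedback; the dynamic model is still needed to see how strongly each loop dominates over time.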
Figure 5 Reinforcing Loops associated with changing the number of available beds
Figure 6 Balancing loop and other impacts associated with changing the number of beds available
From these influence diagrams we are unable to see what the impact on the overall system is likely to be, and this is where we need a dynamic model that can show us the effects over time. Using the model allows us to articulate the strength of the relationships shown in the scenario diagrams, and, given those relationships, determine how the various parts of the model interact with each other – to give an overall impact on the numbers of service users in the various parts of the system, and the overall costs.
Systems map of mental health services
Below is a systems map of the various services within the mental health system. Service users are allocated to a cluster, and access various parts of the service (service elements) at different points in time. For instance there may be regular community team visits, but the frequency of these will vary depending on size of team, number of service users, cluster allocation and severity. Similarly admissions to in-patient care will vary and the cost will be dependent on the length of stay which is likely to be correlated to the severity.
Figure 7 Systems map of a mental health system
So in order to run the model dynamically we need to have some information about the current situation. There are two levels at which information is required to set up the initial system, one at a population level and the other at a cluster level.
Table 2 Typical information required for the development of a system-level model

Population level:
- Rates of new service users entering the system per time period
- %s of new service users entering the system by entry route (range of referrers, criminal justice, etc.)
- %s of new service users by initial cluster
- Average duration in a cluster before moving on (or leaving the system)
- Of those leaving the cluster, the %s of people who go to other clusters or leave the system
- Any costs/prices associated with presence in a cluster per time period (e.g. commissioner local tariff payment, or basic provider costs of delivering CMHT)

Cluster level:
- The number of people entering a service element per time period
- The average LOS in each service element
- The % of people leaving a service element who go to other service elements or to no service elements
- Costs/prices (identified separately for commissioner and provider):
  - Costs/prices associated with entry to a service (i.e. not dependent on time spent in the service – e.g. provider cost of assessment for a new IP on admission)
  - Unit cost for each service per time period (may be made up of average levels of input per time period x unit cost of input – e.g. average number of visits per week for a Hospital at Home service user x unit provider cost of a visit)

Wherever possible, Service user Level Information Costing Systems (PLICS) should be used to improve accuracy.
So for the purposes of this example we have interrogated the data held on the electronic clinical information system for one NHS Trust and produced the following information for clusters 10-15 in order to provide initial information for our model. This information is not intended as a definitive data set, but has been used so that it is possible to see/infer what the results from running the model mean.
The tool is capable of incorporating both costs and tariffs (i.e. cost + surplus = price (tariff)) which will enable both commissioner and provider perspectives to be modelled. For the purposes of this example we have not differentiated between costs and tariffs.
Table 3 Initial inputs for testing and running the model

                                                      C10     C11     C12     C13     C14     C15
Service users being treated in community              273     1189    769     468     102     62
Rate of entry to inpatient per 1000 community
service users per month                               29.3    15.1    32.5    47      197     48.2
Inpatients                                            9       18      39      45      32      4
Average length of stay in inpatient care (days)       34.5    30.3    47      62.4    49      42.4
Average cost per IP bed day (£)                       300     300     400     500     600     600
Average cost per month for community service (£)      820     580     760     900     1400    1200
Average LOS in a cluster (months)                     36      323     200     142     2.4     3.2

Average number of visits per service user per month, by service:
AOT(1)                                                0       0       0.01    0.01    0.02    0
Assessment team                                       0.07    0.04    0.04    0.02    0.4     0.31
CMHT(2)                                               2.72    2.08    2.74    3.26    2.7     2.08
CRHT(3)                                               0.02    0.02    0.02    0.02    0.34    0.42
Early Intervention                                    0.13    2       0.04    0.02    0.05    0.03
Hospital @ home                                       0.12    0.07    0.09    0.13    0.81    0.76
Inpatient bed days per service user per month         0.84    0.62    1.64    2.67    7.11    4.86

Cost per community intervention (£):
AOT(1)                                                125     125     125     125     125     125
Assessment team                                       200     200     200     200     200     200
CMHT(2)                                               125     125     125     125     125     125
CRHT(3)                                               300     300     300     300     300     300
Early Intervention                                    200     200     200     200     200     200
Hospital @ home                                       200     200     200     200     200     200

Monthly local tariff (£)                              2450    2150    2000    2050    2250    2300

Transitions percentage per month:
From cluster 10 to                                    0       1.35    1       0.5     0.4     0.1
From cluster 11 to                                    0.7     0       0.1     0.1     0       0.4
From cluster 12 to                                    0       0       0       0.1     1       0.2

(1) AOT: Assertive Outreach Team. (2) CMHT: Community Mental Health Team. (3) CRHT: Crisis Resolution Home Treatment.
Running the model
The computer model is shown below. The rectangles or ‘stocks’ (i.e. Inpatients by cluster and Community patients by cluster) represent how many service users are in each service area. The ‘valves’ depict the speed at which service users join or leave these stocks. The red arrows between the various parts of the model show the interactions, i.e. how the number of inpatients affects the speed at which they are discharged. As the model runs, and the numbers of service users in the stocks changes over time, the speed at which they are discharged or admitted will also change over time.
Figure 8 iThink™ model of relationship between inpatient and community service users
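The stock-and-flow structure in Figure 8 can be approximated outside iThink with simple monthly stepping. This is a deliberately reduced sketch: two aggregate stocks, one admission flow and one discharge flow, with illustrative rates rather than the tool's calibrated, per-cluster values:

```python
# Two stocks (community, inpatients) linked by admission and discharge flows.
# Each month the flows are recalculated from the current stock levels, so the
# "valve" speeds change over time as the model runs.
def run_stocks(community, inpatients, admission_rate, avg_los_months, months):
    history = []
    for _ in range(months):
        admissions = admission_rate * community   # flow: community -> inpatient
        discharges = inpatients / avg_los_months  # flow: inpatient -> community
        community += discharges - admissions
        inpatients += admissions - discharges
        history.append((community, inpatients))
    return history

# Baseline-like totals (2,863 community, 147 inpatients); with no entries to or
# exits from the system, the total is conserved while the split equilibrates.
history = run_stocks(2863.0, 147.0, admission_rate=0.03, avg_los_months=1.5, months=48)
```

The full model adds entry and exit flows, cluster-to-cluster transitions and cost accounting on top of this basic structure.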
One question that must be answered is how long the model should run for. This depends on the system being modelled as well as the questions the model is trying to answer. If a system is measured in fractions of a second (e.g. bodily responses), there is no point running the model for years. Thus, in situations where people spend months or even years within a cluster, the model needs to run over a longer period to capture the impact of changes. At the same time, the model is designed to look at costs, some of which may require more frequent reassessment (e.g. as part of an annual budgeting process), and this argues for a shorter modelling period.
In order to balance these perspectives, in this example the model was run over a four-year period. This was done to demonstrate three distinct periods of time: Year 1 = baseline, steady-state/historic; Year 2 = scenario implementation and initial impact; Years 3&4 = medium term implications.
Baseline outputs
In this baseline case the model is run with the initial set-up to show the impact of doing nothing. This provides us with a baseline situation against which we can compare all the other scenarios.
Figure 9 shows the output when the model is run over 4 years (shown in months on the horizontal axis). The vertical axis shows two scales, the blue line shows the total number of community service users (2863), while the red line shows the total number of inpatients (147). From the graph it can be seen that the model is in ‘steady-state’ which means that numbers remain static over time.
Figure 9 Baseline model output: Numbers of service users
Figure 10 shows the costs associated with the current state. The monthly cost (in £000s) for community services (blue) is £2,136k, for inpatient care (red) £2,062k, and the total monthly bill (pink) £4,198k. These also stay static over the 4-year time period (inflation is not included in the model), which is to be expected as the number of service users using each of the services remains constant over that same period.
[Chart: “Number of patients supported in different settings”, months 0–48; series: total community patients, total IPs, total intensive service patients]
Figure 10 Baseline model output: Costs per month
Modelling change

The model uses knowledge (or beliefs) about the relationships between variables and the quantified impact of change elsewhere in the system.
This is best illustrated by means of a simple example where we can test two scenarios. The costs in the examples are indicative and organisations can replace these with their own cost and other variables:
1. Invest in community services to prevent some inpatient admissions.
2. Close some inpatient beds and provide alternative care in the community.
What is the impact on costs of each of these scenarios independently, and what is the impact if they are applied concurrently?
Scenario 1
Levels of community input to each service user are increased. This will prevent some admissions to inpatient care. Starting assumptions (all can be varied through the model interface):
- a 25% increase in the unit cost of community care (= additional inputs in the community) will prevent 33% of inpatient admissions each month;
- the admissions prevented would have had a shorter inpatient length of stay than average (10 days on average);
- this applies uniformly across all clusters;
- the new input levels start in month 12 and take 6 months to be fully implemented.
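The phased implementation can be expressed as a simple ramp applied to the two changed parameters. This sketch assumes a linear ramp (the tool's interface may phase changes differently), and the function names are ours:

```python
# Fraction of the change in force in a given month: 0 before month 12,
# ramping linearly to 1 by month 18, then held at 1.
def ramp(month, start=12, duration=6):
    if month < start:
        return 0.0
    return min((month - start) / duration, 1.0)

# Scenario 1: up to +25% community unit cost, up to -33% inpatient admissions.
def scenario1_params(month, base_unit_cost, base_admissions):
    f = ramp(month)
    return base_unit_cost * (1 + 0.25 * f), base_admissions * (1 - 0.33 * f)
```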
[Chart: “Monthly provider costs broken down into community and inpatient services”, months 0–48; series: community costs in £000s pcm, inpatient costs in £000s pcm, provider costs in £000s pcm]
Output from scenario 1
Figure 11 Scenario 1 output: Numbers of patients
Figure 11 shows how the numbers of community service users (blue) and inpatients (red) change once the new strategy is applied in month 12. The first 12 months show the current situation; in month 12 there is a small shift in service user numbers after implementation, but by month 30 things are back to the starting position.
Figure 12 Scenario 1 output: Costs per month
If we now look at Figure 12, showing costs, we see that the monthly costs for inpatient care actually rise a little and then return to the baseline, while community costs have increased by 25% and total costs rise by £531k per month.
This may not have been the result we were anticipating. This shows we need to more closely consider what is happening within the system. So what’s going on?
Some inpatient admissions are prevented; however those admissions that do take place have longer average length of stay than before (due to the severity threshold having risen in response to the reduction in beds). The small percentage fall in admissions is outweighed by the large percentage change
in costs for the vast majority of service users. It is not possible to predict this simply by looking at the influence diagrams; the model, by showing these effects dynamically over time, helps us to understand the influences within the system.
Scenario 2
- Reduce inpatient bed capacity to 100 (from 150, with 147 being occupied in the baseline).
- Service users who would previously have been referred to inpatient care, but for whom there is no bed, now go to a new intensive community service.
- The new service delivers 2 x the standard level of community input for that cluster, and lasts for 6 weeks on average.
- After leaving the intensive service, service users return to the standard community service.
- The change begins in month 12 and takes 12 months to full implementation.
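The referral routing in this scenario amounts to a capacity check: fill free beds first, divert the overflow to the intensive service. A sketch under those assumptions (function and parameter names are ours, not the tool's):

```python
# With a hard bed cap, referrals that cannot be admitted are diverted to the
# new intensive community service instead of inpatient care.
def route_referrals(referrals, occupied_beds, bed_capacity=100):
    free_beds = max(bed_capacity - occupied_beds, 0)
    admitted = min(referrals, free_beds)   # fill the available beds first
    to_intensive = referrals - admitted    # overflow goes to the new service
    return admitted, to_intensive

# 30 referrals against 10 free beds: 10 admitted, 20 diverted.
admitted, diverted = route_referrals(referrals=30, occupied_beds=90)
```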
Output from scenario 2
Figure 13 Scenario 2 output: Numbers of patients
From Figure 13 it can be seen that the number of inpatients declines as capacity reduces; new intensive service users appear, and there is a small rise in standard community service users.
Figure 14 Scenario 2 output: Costs per month
Figure 14 shows that implementing this strategy reduces overall costs by £513k per month.
Combining scenario 1 & 2
We now want to consider the impact of implementing scenarios 1 and 2 simultaneously.
Figure 15 Scenario 1&2 combined output: Costs per month
Figure 15 shows that if we reduce inpatient bed capacity AND increase standard community inputs the costs rise initially but then return to within a few £k of the baseline by about month 30.
Current limitations
1. Many of these scenarios have not been implemented, and therefore we are unsure of the strength of the relationships.
We have used clinicians’ ‘best guesses’ to quantify relationships. We can also do ‘what if’ type analyses to identify how sensitive the system is to these guesses. This will enable us to ask questions such as “What will be the impact on overall cost if the admissions are reduced by 0.1%, 0.5%, 1%, 2%, 5%?”
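Such a sensitivity sweep is straightforward to script; in this sketch the admission volume, length of stay and bed-day cost are purely illustrative:

```python
# Vary the assumed admission reduction and report the monthly saving.
def monthly_inpatient_cost(admissions, avg_los_days, cost_per_bed_day):
    return admissions * avg_los_days * cost_per_bed_day

baseline = monthly_inpatient_cost(150, 45, 450)
for reduction in (0.001, 0.005, 0.01, 0.02, 0.05):
    scenario = monthly_inpatient_cost(150 * (1 - reduction), 45, 450)
    saving = baseline - scenario
    print(f"admissions down {reduction:.1%}: saving £{saving:,.0f} per month")
```

Re-running the model across a grid of guessed values in this way shows which relationships the overall cost is most sensitive to, and therefore where better data would matter most.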
2. The data may be unreliable due to a number of issues, including the length of time for which it has been collected and recording issues.
Many Trusts have only a few months of cluster data, and service users can be expected to stay in clusters for years, so at the moment there is a natural scarcity of data. However, as time progresses, cluster data will become more available and reliable, and will generally improve. The data and relationships within the model can be updated regularly as new data become available. Given the current average time spent in each cluster, the impact of some potential changes is likely to be seen only over a longer time frame (and not within a 4-year model run).
3. Only clusters 10-15 are currently included.
The model is a proof of concept, and clusters 10 to 15 were chosen as they are associated with higher levels of inpatient admissions – these are the more expensive service users, and so have a higher impact on costs. The design of the model can be expanded to the other clusters in time to create a more complete model of the system.
4. The data is not readily available within the electronic clinical information system and needs a substantial amount of local cleaning and analysis to produce it in the correct format.
Currently the data required to feed the model cannot be taken directly from the electronic clinical information system. However, as the exact data needed to feed the model are identified, it may be possible to create queries within the system that interrogate the database and automatically produce the data required to run the model. Other Trusts using alternative information systems may not experience the same issues.
5. We do not incorporate the costs and restrictions that fall outside the NHS Trust’s control, though they may have impacts (some of which may be considerable) on the system, e.g. the availability of suitable social housing or changes in the provision of social care.
A model has to be a representation of a system and therefore cannot include every aspect of reality. However, some impacts can be incorporated into the model by proxy: for instance, if a lack of housing meant that service users could not leave hospital when they otherwise would, we could calculate the number of service users this affects and increase length of stay accordingly.
6. The model does not take inflation into account.
This does not matter within the model, as inflation would apply equally to both the baseline model and the scenarios modelled; the results can therefore be read as an absolute increase or decrease within the system.
7. The model is based on average costs and is not designed to reflect the impact of marginal costing (i.e. it does not indicate how long it might take to release fixed costs when reducing beds and/or closing a ward).
Commissioner and Provider Perspectives
Local customisation

The Tool is intended to be used jointly by commissioners and providers, to encourage shared priming and tailoring of the data and modelling capability to reflect local factors, such as cluster prevalence, time spent in cluster, service models and thresholds, costs per cluster and assessment costs. In these respects, we have found data accuracy to be especially critical, as the data are affected by local population needs, the capacity of other related services (third sector, primary care, local authority, etc.) and service specifications.
Once commissioners’ costs are based on a stable system of national tariffs per cluster, local changes to the service model that lower the cost of provision will not necessarily deliver cash savings to commissioners. Until then, while commissioners’ costs derive from a local tariff per cluster, commissioners may benefit directly from savings resulting from service model changes.
The model is capable of simulating changes over time in the flows into the system and the flows between clusters, which may change the proportions of the total population within each cluster. As tariffs are calculated per service user per cluster, these changes will have a direct impact on provider costs and ultimately on commissioner tariffs.
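Since tariffs are applied per service user per cluster, the arithmetic behind this sensitivity can be sketched as follows. The cluster numbers, populations and tariff values below are invented for illustration only; they are not the tariffs used by the tool.

```python
# Total tariff cost = sum over clusters of (service users in cluster x tariff).
# Cluster numbers and tariff values here are purely illustrative.

def total_tariff_cost(users_per_cluster, tariff_per_cluster):
    return sum(users_per_cluster[c] * tariff_per_cluster[c]
               for c in users_per_cluster)

tariffs = {10: 4_000, 11: 6_500, 12: 9_000}   # annual tariff per user (pounds)

before = {10: 120, 11: 80, 12: 40}            # baseline cluster populations
after  = {10: 130, 11: 85, 12: 25}            # after flows between clusters shift

print(total_tariff_cost(before, tariffs))  # 1,360,000
print(total_tariff_cost(after, tariffs))   # 1,297,500
```

Even with the same total number of service users or less, a shift of population between clusters changes the total cost, which is why the flow dynamics matter to both provider and commissioner.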
Specified data items are needed locally to prime the tool, and some commissioners may require help to develop local algorithms for extracting these. As the model uses local cost and usage data, discrepancies arising because some providers have clients bunched towards the severe or complex end of clusters while others have a more benign case mix will not be an issue: the cost per unit of service will reflect the local case mix. What this will facilitate is discussion between commissioner and provider about the most appropriate service configuration for their particular needs, rather than simply adopting a one-size-fits-all strategy.
Negotiation It is envisaged that, over time, the tool will improve the quality of analysis available for effective commissioner/provider negotiations. Initially, depending on local circumstances and the level of prevailing expertise, some commissioners may have access to providers’ costs as well as prices per cluster.
It is currently uncertain whether tariffs will be renegotiable locally to reflect changes in providers’ costs and, if so, how often and on what basis renegotiation might occur.
These are important considerations, which fall outside the scope of the prototype tool.
Consultation Undertaken
Although consultation beyond consortium members also falls outside the scope of the costing tool consortium’s contract, as an act of good faith the following were consulted during the tool’s development, through a combination of face-to-face meetings, workshops, correspondence (both direct and through intermediaries), and telephone and WebEx conferences. The costing tool consortium gratefully acknowledges these individuals’ interest and contributions.
Camden & Islington NHS Foundation Trust – Wendy Wallace (Chief Executive and Chair of LCDB).
Central and North West London NHS Foundation Trust – Jenny Greenshield (Deputy Director of Finance), Sarah Khan (PbR Project Director).
London MH PbR Programme – Fionuala Bonnar (Project Manager, Mental Health PbR and Outcomes Measurement).
Mazars – Marie Edwards (Adviser, Mental Health) and Peter Finn (Lead Adviser, Mental Health).
NHS Midlands and East MH PbR Team – Ian McCarley (Finance Lead, Mental Health PbR) and Nick Adams (Programme Specialist, Mental Health).
NHS North East London and the City – Tony Martin (London Health Programmes Costing Team).
NHS South of England Clinical Commissioning Group – co-ordinated through Julie Kell (NHS North Somerset Programme Manager, Mental Health).
North East London Foundation Trust – Sarah Haspel (Assistant Operations Director and Older Adults MHS and PbR Lead).
Oxleas NHS Foundation Trust – Sophie Donnellan (Associate Director, Strategic Business Development).
South London and Maudsley NHS Foundation Trust – Graham Burgess (Chair, Costing Sub-group), Julia Gannon (Head of Contracting & Business Systems Design), Kevin Smith (Clinical Projects Manager, Trust Clinical Outcomes Team).
Surepoint – Rowan Purdy (Director).
Further Development
This document has already set out some of the issues around onward dissemination, deployment, support and generalisability that the consortium identified during the course of its development and consultative work.
By far the most significant factor that the team encountered is the paucity of consistent and reliable long-term data; a factor also experienced by most of those consulted who are involved in similar systems development work. This makes it highly unlikely, at least in the short-term, that there will be a single, universal algorithm capable of priming data at all localities. It also suggests that resourcing will be needed to ‘hand-hold’ some commissioning teams during implementation.
As a proof of concept, the prototype tool requires extension of its functionality to remaining clusters (i.e. beyond clusters 10 to 15) and this scalability exists within the current design. The tool’s adaptability to other Trusts’ settings also needs further testing.
Alignment with Other MH Commissioning Tools
The prototype cost-consequence tool, along with the Influence Diagrams, has been developed closely in conjunction with the Cases for Change, to ensure complementarity.
Additionally, an initial piece of collaborative work between Mazars, The Whole Systems Partnership and Brunel University has begun examining the feasibility of aligning the tool with the London Needs Assessment Service & Financial Profiling Tool and the West Midlands PbR tool. This work will be reported separately to the JCP and the MHWCP Board.
Appendix 1 Influence Diagrams
This appendix presents a series of influence diagrams that relate to some possible scenarios that commissioners and providers may want to consider. The aim of these influence diagrams is not to provide an answer as to whether a particular scenario or combination of scenarios should be implemented, but rather to instigate consideration and discussion of the wider aspects of implementing a change. By depicting the influences in this way, we are able to see a system-wide view, which allows consideration of the wider-reaching effects of changes. The effect is analogous to squeezing a balloon in one place and being able to predict where the balloon will bulge as a consequence.
The scenarios included in this appendix are not meant to be exhaustive, and there may be other influences that have been missed. Many of the influences associated with some of these scenarios, such as service users becoming more integrated and productive members of society, and the associated impact on the benefits system, are outside the scope of these models, but they are no less important than those included. The models were designed for commissioners and providers to consider costs under their control, and as such they provide a starting point for discussion between interested parties.
These scenarios can be used as a discussion aid without the model, or in conjunction with the model to see the impact in monetary or numerical terms. It must be stated, though, that the ‘strength’ of these links within the models is often unknown, so the model itself is most useful for considering questions such as “how big an impact does this intervention need to make to result in cost savings?”
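One way to frame that break-even question numerically is sketched below. The intervention cost and the cost per avoided admission are invented figures, not taken from the tool; the point is only the shape of the calculation.

```python
# Break-even sketch: how many inpatient admissions must a community
# intervention avoid to cover its own cost? Figures are illustrative.

import math

def admissions_to_break_even(intervention_cost, cost_per_admission):
    """Smallest whole number of avoided admissions that covers the cost."""
    return math.ceil(intervention_cost / cost_per_admission)

# E.g. a 150,000-pound community service; an avoided admission saves 9,000.
print(admissions_to_break_even(150_000, 9_000))  # -> 17
```

Running such a calculation alongside the model helps to judge whether the required ‘strength’ of an influence link is plausible for a given locality.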
1. Influence: As the number of beds available decreases, the number of service users admitted also decreases.
   Assumption: Bed utilisation remains the same, so empty beds will be filled, as the threshold for admission will change.
2. Influence: As the number of service users admitted decreases, the acuity of the admitted patients increases.
   Assumption: The most severe cases will still need to be admitted; the less severe will be treated in the community.
3. Influence: As the acuity of an admitted patient’s condition increases, the average length of stay as an inpatient increases.
   Assumption: Those with more severe problems take longer to treat and stabilise, and therefore have a longer length of stay.
4. Influence: As the average length of stay increases, the number of available beds decreases.
5. Influence: As the number of service users admitted decreases, the number of service users receiving treatment in the community increases.
   Assumption: Service users remain in the system, so they need to be treated somewhere.
6. Influence: As the number of service users receiving treatment in the community increases, the staff-to-service-user ratio decreases.
   Assumption: The number of staff and the skill mix do not change.
7. Influence: As the staff-to-service-user ratio decreases, the number of emergency admissions increases.
8. Influence: As the number of emergency admissions increases, the number of available beds decreases.
9. Influence: As the number of available beds decreases, the number of service users discharged early increases.
   Assumption: Service users will be discharged to preserve capacity for unanticipated service users, rather than sending service users to an inpatient facility out of area.
10. Influence: As the number of service users discharged early increases, the average length of stay decreases.
11. Influence: As the number of service users discharged early increases, the number of service users being treated in the community also increases.
   Assumption: Service users remain in the system, so they need to be treated somewhere.
12. Influence: As the number of service users being treated in the community increases, the demand for supported accommodation services increases.
   Assumption: Accommodation services are outside the scope of this model.
13. Influence: As the staff-to-service-user ratio decreases, the number of GP visits increases.
   Assumption: As appointments become more difficult to make, service users will start to rely on GP services. GP visits are outside the scope of this model.
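As a rough illustration of how some of these influences combine (influences 1, 9 and 10, for instance, form a balancing loop between bed availability and length of stay), the sketch below simulates bed occupancy month by month. Every parameter is invented; this is a toy demonstration of the loop structure, not the prototype model itself.

```python
# Toy illustration of the balancing loop in influences 1, 9 and 10:
# fewer free beds -> more early discharges -> shorter average stay ->
# beds free up again. All parameters are invented.

def simulate(months, total_beds, referrals_per_month, base_los_months):
    occupied = 0
    history = []
    for _ in range(months):
        # Influence 1: admissions are capped by the beds available.
        admitted = min(referrals_per_month, total_beds - occupied)
        occupied += admitted
        # Influences 9/10: pressure on beds shortens the average stay.
        pressure = occupied / total_beds
        los = base_los_months * (1 - 0.5 * pressure)
        # Discharges this month, given the pressure-adjusted stay.
        discharged = min(occupied, round(occupied / max(los, 0.5)))
        occupied -= discharged
        history.append(occupied)
    return history

trace = simulate(months=24, total_beds=40, referrals_per_month=15,
                 base_los_months=3.0)
print(trace[-1])  # occupancy settles well below the 40-bed capacity
```

The point of such a sketch is the balloon-squeezing effect described above: changing one parameter (beds, referrals, baseline stay) shifts where the system settles, not just the quantity that was changed.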
Appendix 2 User Guide To Model
Introduction This document provides brief notes on using the prototype simulation tool built by Whole Systems Partnership as part of the Mental Health Commissioning Pack.
The tool is intended to be used jointly by commissioners and providers, to encourage discussion and facilitate decisions about the type and level of services being commissioned to meet service users’ mental health needs. It runs in an online environment and illustrates the impact of two scenarios for future change on a system of care for people with psychosis.
Details of the assumptions underlying the prototype model, and the scenarios for change, are outlined in the project report (Mental Health & Wellbeing Commissioning Pack Programme: Cost-Consequence Tool (Proof of Concept) – Final Deliverable, December 2012).
Accessing the model The model is available on the Web. Simply click on the link (this will be made available with the publication of the report) and wait for the model to appear.
The home page You will see a home page like this one:
Run the model
To run the model in its default, steady state, click the ‘Run the model’ button at top right. The model will tell you it’s running and (after a small delay) will display three straight lines on the graph pad, representing the numbers of patients being supported in the community (blue), as inpatients (red) and as intensive service patients (green) in the model’s baseline state.
Exploring the home page The model runs over 4 years, shown in months along the X-axis of the graph. The graph contains several different pages, each showing a different set of outputs. Click the small
triangle in the bottom left hand corner of the graph to move through the pages in sequence (note that page 1 is blank in the December 2012 prototype version).
There is a small printer icon in the bottom left-hand corner of the graph. Click it to print the graph currently showing on the page.
The grey table below the graph pad shows the annual tariff costs for the patients in the model. Note that when you run the model more than once the tables show the comparative results for each run.
The ‘Scenarios’ button on the left hand side allows you to navigate to the page where the two change scenarios can be applied to the model.
The ‘Reset’ button will clear the graphs and tables on the home page, and restore all the controls on other pages to their default state.
The Scenarios page
Move to the Scenarios page Click the ‘Scenarios’ button on the home page to move to the page containing controls for the
scenarios you want to explore. You will see a page like this one:
Explore the Scenarios page Controls relating to each scenario have been colour-coded: blue controls relate to Scenario 1 (increasing community input), and red controls relate to Scenario 2 (reducing inpatient capacity and developing intensive community support).
Click the grey buttons for information about the assumptions built into the model for each scenario. Each scenario includes a small switch at the top of its set of controls. Clicking a switch will turn it on, and the centre of the switch will go green. If the switch is turned on, any changes to the sliders below that switch will be applied to the model. If you don’t turn the switch on, changes to the sliders beneath it will have no effect (be careful about this – people often move the sliders and wonder why they’re having no impact, but it’s because the relevant switch isn’t turned on).
Use the sliders in each section to change your assumptions about the size of the changes that will be made in the scenario, and the timescale over which the changes will be made. To change the slider value, grab the slider knob and drag it to the new value you want.
You can switch the scenarios on separately or together. Switches will stay on until you switch them off or until you click the ‘Reset’ button on the home page.
You can change the settings for more than one slider at a time. Your changes will stay in place until you change them again, or until you press the ‘Reset’ button on the home page.
Once you have made your changes, return to the home page by clicking the ‘Home’ button and press the ‘Run the model’ button to see the impact of your assumptions on the behaviour of the system.
An example – working with scenarios
From the home page:
Run the model in its default state.
Do not press ‘Reset’!

Now apply Scenario 1
On the Scenarios page, turn ON the switch for Scenario 1; leave the sliders at their default values.
Return to the home page and run the model again.
The home page should look like this, with graph page 4 showing the outputs from Scenario 1:
Do not press ‘Reset’!
Now apply Scenario 2
On the Scenarios page, turn OFF the switch for Scenario 1, and turn ON the switch for Scenario 2.
Return to the home page and run the model again.
The home page should look like this, with graph page 4 showing the outputs from Scenario 2:
Do not press ‘Reset’!
Now apply both scenarios together
On the Scenarios page, turn ON the switch for Scenario 1 (both scenarios should now be ON).
Return to the home page and run the model again.
The home page should look like this, with graph page 4 showing the outputs from Scenarios 1 and 2 combined:
Do not press ‘Reset’!
Now make some more changes
Return to the Scenarios page.
Keep both scenarios switched on, and change the future bed capacity slider to 125 (instead of its default value of 100).
Return to the home page and run the model again.
The home page should look like this, with both scenarios on and your new value for future inpatient beds:
Do not press ‘Reset’!
Now compare projected costs for each run
On the home page, navigate to page 2 of the graph pad by clicking the triangle in the bottom left-hand corner.
You should see this:
Page 2 of the graph pad shows total projected provider costs for the system, for each model run. It will carry on adding new runs to the comparative chart until you press the ‘Reset’ button. Note also that the grey table below the graph pad shows comparative information, but that the output from each run is the same (the prototype model assumes that tariff is not linked to provider costs).
Now try your own ideas ...
Experimenting with changing some sliders but not others will help you understand how the model works and give you confidence in interpreting the outputs.
Good practice and helpful hints Note down the assumptions you make for each run (especially if you change several things each
time). It’s frustrating to run the model several times and then not be able to remember what changes applied to which run.
You can always return the model to its default state by pressing ‘Reset’ but don’t forget that that will reset everything (if you just want to reset one slider, click the little U attached to it).
You can’t break the model! The changes you make to the sliders and switches affect how the model runs, but don’t change its basic structure.
If you get an unexpected result, think about what changes you have made and how they might affect other parts of the system. You will probably find that the result isn’t really all that unexpected.
More help and information
If you:
- need more help in using the model,
- would like to explore the potential of the model further (e.g. by calibrating the model to use your own data or your own local change scenarios, exporting data to and from Excel, etc.), or
- think that you might want to explore how this approach could help your work in a different service area,
please contact Carol Cochrane or Lucy O’Leary at Whole Systems Partnership:
December 2012
© Whole Systems Partnership