Innovation Under Uncertainty: Maintaining Progress

Follow us on Twitter: @QIOProgram
Tweet with our conference hashtag: #CMSQualCon15
Innovation Under Uncertainty: Maintaining Progress
David Blumenthal, MD, MPP, President, The Commonwealth Fund

Uploaded by: the-commonwealth-fund

Posted on: 22-Jan-2018


TRANSCRIPT

Follow us on Twitter: @QIOProgram

Tweet with our conference hashtag: #CMSQualCon15

Innovation Under Uncertainty: Maintaining Progress
David Blumenthal, MD, MPP

President, The Commonwealth Fund

U.S. Health Spending is Larger Than the GDP of Most Nations

Notes: Data are estimates for 2014; current US dollars (not adjusted for cost of living).

Sources: International Monetary Fund, Altarum Institute, National Health Expenditure Accounts.

The Affordable Care Act


Designing Evaluations in Complex Programs

• Where possible, build in an experimental design.

o Phased implementation.

o Concurrent controls.

– ACOs.

– Disease management programs.

– Conditional payment for new therapies.
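The "concurrent controls" idea above amounts to a difference-in-differences comparison: the change at early-adopter sites minus the change, over the same period, at sites not yet enrolled. A minimal sketch, with a hypothetical function name and purely illustrative rates (not program data):

```python
# Illustrative sketch of concurrent controls in a phased rollout:
# compare the change at early-adopter sites with the change at
# not-yet-enrolled sites over the same period. All rates are invented.

def difference_in_differences(treated_before, treated_after,
                              control_before, control_after):
    """Intervention effect = change in treated sites minus change
    in concurrent control sites (difference-in-differences)."""
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical readmission rates (percent):
# wave-1 sites adopt first; wave-2 sites serve as concurrent controls.
effect = difference_in_differences(
    treated_before=19.0, treated_after=17.5,   # wave-1 (enrolled)
    control_before=19.2, control_after=18.9,   # wave-2 (not yet enrolled)
)
print(f"Estimated effect: {effect:+.1f} percentage points")
```

Subtracting the controls' change nets out secular trends that would bias a simple before-and-after comparison, which is what makes phased implementation valuable for evaluation.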

The Evaluation Challenge Persists


Second-Best Approaches

• Qualitative evaluations.

– Look for findings that affirm or undermine theories of causative relationships.

• Collect sound before-and-after data, and link changes as carefully as possible to the timing of interventions.
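The before-and-after approach can be made concrete: split a measure's time series at the intervention date and compare the period means. A toy sketch, in which the dates, rates, and intervention date are all invented:

```python
# Illustrative before-and-after comparison tied to the timing of an
# intervention. All dates and rates below are invented for the sketch.
from datetime import date
from statistics import mean

INTERVENTION_DATE = date(2013, 1, 1)  # assumed program start

# (month, readmission rate %) observations spanning the intervention
observations = [
    (date(2012, 10, 1), 19.1), (date(2012, 11, 1), 19.0),
    (date(2012, 12, 1), 18.9), (date(2013, 2, 1), 18.2),
    (date(2013, 3, 1), 18.0), (date(2013, 4, 1), 17.9),
]

before = [rate for month, rate in observations if month < INTERVENTION_DATE]
after = [rate for month, rate in observations if month >= INTERVENTION_DATE]
change = mean(after) - mean(before)
print(f"Before: {mean(before):.1f}%  After: {mean(after):.1f}%  "
      f"Change: {change:+.1f} points")
```

On its own, this cannot rule out secular trends or concurrent events, which is exactly why the slide labels it a second-best approach.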

Medicare Hospital Readmissions

[Chart: Medicare hospital readmission rate, percent, 12-month moving average.]

Source: Furman J, “The Economic Benefits of the Affordable Care Act,” presented at the Center for American Progress, April 2, 2015.

Physician and Hospital Adoption of EHRs

[Chart: physician adoption of EHRs rose from 29% to 83%; hospital adoption rose from 9% to 76%.]

Notes: Hospital data are for hospitals with at least a basic EHR system (ONCHIT, 2015); physician data are for practices with any EHR system (National Center for Health Statistics, 2014).

Are Positive Trends Sufficient to Press Ahead?

Ideal vs. Practical in Evidence-Based Policy-Making

• Like clinical care, policy-making is both science and art.

• We rarely have the luxury of unassailable evidence of efficacy, or lack thereof, when it comes to a critical decision.

• When is evidence good enough?

o Depends on context.

o Depends on the intervention.

o Depends on the anticipated follow-on results and evolution of the initiative.

Some Basic Rules

• Build in opportunities for continued learning and refinement.

o Stages of meaningful use.

o Keep asking: what have I learned, and how can I learn more?

• There is an inherent asymmetry between discontinuing a successful program and discontinuing an unsuccessful one.

A Word About DIY Data

• Comparative data from external sources are very helpful for motivating quality improvement.

o But not always helpful for QI itself.

• Quality and safety problems originate in local process issues, and improvement requires collecting data specific to those processes.

o Identification of flawed processes.

o Interventions.

o Rapid-cycle evaluation.

o Revision of interventions.

PDSA Cycle

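The rapid-cycle loop described above (identify a flawed process, intervene, evaluate quickly, revise) is the PDSA cycle in miniature. A toy sketch in which the starting defect rate, the aim, and the per-cycle effect are all invented:

```python
# Illustrative PDSA (Plan-Do-Study-Act) loop over DIY process data.
# The starting defect rate, aim, and per-cycle effect are invented.

AIM = 10.5  # target defect rate, percent (hypothetical aim)

def run_pdsa(start_rate, effect_per_cycle=0.5, max_cycles=4):
    """Repeat small tests of change until the aim is met or cycles run out."""
    rate = start_rate
    for cycle in range(1, max_cycles + 1):
        change = f"process revision {cycle}"  # Plan: choose the next change
        rate -= effect_per_cycle              # Do: small-scale test (simulated)
        met = rate <= AIM                     # Study: compare result to the aim
        print(f"{change}: rate={rate:.1f}%, aim met={met}")  # Act: keep or revise
        if met:
            return cycle
    return None  # aim not reached; rethink the intervention

print(f"Aim reached after {run_pdsa(start_rate=12.0)} cycles")
```

The point of the loop is speed: each pass uses locally collected process data, so revisions happen in weeks rather than waiting for an external evaluation.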

Don’t Wait for the Feds

• Get started.

• Build your own data collection and improvement capacities.

Contact Information

• David Blumenthal

[email protected]

• Social media:

– @DavidBlumenthal
