TRANSCRIPT
Defining what to measure to manage risk and
improve clinical research quality and
performance outcomes
Linda Sullivan Co-Founder and Executive Director
Metrics Champion Consortium
Clinical Trial Risk and Performance Management Summit
Princeton, NJ – November 14, 2018
©2018 Metrics Champion Consortium, All rights reserved
Today’s Session
• What are metrics?
• Is your organization measuring the most important things?
o Do your metrics answer the most important questions?
o Do your metrics reward the behaviors you are seeking?
o Do you measure time, cost and quality?
• What are organizations measuring today?
• How does ICH E6(R2) change the information that
organizations need to design and manage studies?
WHAT ARE METRICS?
What are metrics?
Standards of measurement by which
efficiency, performance, progress, or quality
of a plan, process, or product can be
assessed.
Source: http://www.businessdictionary.com/definition/metrics.html
Adapted from Dave Zuckerman, Customized Improvement Strategies
Metric Types
TIMELINESS: Measures whether a milestone was achieved on time
CYCLE TIME: Measures how long it takes to complete a task
QUALITY: Measures the accuracy of completing a task or how closely aligned performance is to a set of requirements
EFFICIENCY/COST: Measures the resources required to complete a task
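The four metric types can be made concrete with a small sketch. The task record, field names, and numbers below are illustrative assumptions, not MCC definitions:

```python
from datetime import date

# Hypothetical task record (all names and values are illustrative).
task = {
    "planned_finish": date(2018, 3, 1),
    "started": date(2018, 1, 15),
    "finished": date(2018, 2, 20),
    "fields_entered": 400,     # data fields completed for the task
    "fields_with_errors": 8,   # fields later found to need correction
    "cost_usd": 12_500,
}

# TIMELINESS: was the milestone achieved on time?
on_time = task["finished"] <= task["planned_finish"]

# CYCLE TIME: how long did the task take?
cycle_time_days = (task["finished"] - task["started"]).days

# QUALITY: how closely did performance align with requirements?
accuracy = 1 - task["fields_with_errors"] / task["fields_entered"]

# EFFICIENCY/COST: resources required per unit of work
cost_per_field = task["cost_usd"] / task["fields_entered"]
```

Each metric answers a different question about the same task, which is why measuring only one type gives an incomplete picture.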
Need A Combination of Metric Types
Every clinical trial has components and expectations of Time, Quality and Cost/Efficiency.
What happens when you apply pressure to one side of a balloon?
When time is the only component being measured, everyone will focus on performing faster. You may complete a task “on time” but end up with costly rework later in the study due to quality problems.
“How can you perform faster, cheaper, and with higher quality all at the same time?”
When all components are measured, the study performance balloon can actually shrink by removing:
• Rework
• Inefficient processes
• Duplicate work
• Waste
SMART Metrics
Specific: The metric is clearly defined and reported. The metric will be interpreted consistently. The results will align with a specific task or process.
Measurable: What is being measured can be quantified and analyzed. The formula is logical, validated, and uses common units of measure.
Actionable: The results enable informed decisions to be made or actions to be taken. It should answer the audience’s critical success factor questions.
Reliable: The metric and data sources have been validated. The data is obtained close to the source at a reasonable cost. The metric results are consistently credible.
Timely: The results should be calculated and communicated as close to “real time” as possible for the metrics to have impact. The metrics are consistently reported at the same time in each reporting cycle.
HOW DO YOU DETERMINE
WHAT TO MEASURE?
Measure What Matters Most –
Metrics That Answer Important Questions
MCC Metric Development Framework
1. Develop Process Map
2. Define Critical Success Factors (CSFs)
3. Define Key Performance Question (KPQ)
4. Define & Measure Metric to Answer KPQ
5. Generate Metric Display that Answers KPQ (format display to highlight answer(s))
(Steps 3–5 repeat for each KPQ.)
Site Contracting Process Map
CSF: Site contracts fully executed in the shortest time with minimal non-value-added effort
Key Performance Questions and Corresponding Metrics Vary By Role/Responsibilities
• R&D Management / Executive level: those with oversight for all R&D activities; they see a broad summary.
• Operations Process Owner and Therapeutic Area / Program Management: owners of broad operational processes and those overseeing study programs; they see specific summaries.
• Functional Area Leads and Study Teams: those overseeing a function and/or managing studies; they see responsibility details.
The higher the level, the fewer metric details that are typically reported. Summaries of lower-level metric details should be “rolled up” to the higher levels.
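The roll-up idea can be sketched as simple aggregation from site detail to study and program summaries. The program, study, and site names and the use of a mean are illustrative assumptions:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical site-level cycle times in days, keyed by (program, study, site).
site_days = {
    ("Oncology", "Study A", "Site 101"): 42,
    ("Oncology", "Study A", "Site 102"): 55,
    ("Oncology", "Study B", "Site 201"): 38,
}

# Study teams see responsibility details; roll site detail up to study level.
by_study = defaultdict(list)
for (program, study, _site), days in site_days.items():
    by_study[(program, study)].append(days)
study_summary = {key: mean(values) for key, values in by_study.items()}

# Executives see only the broad summary: roll study means up to the program.
by_program = defaultdict(list)
for (program, _study), avg in study_summary.items():
    by_program[program].append(avg)
program_summary = {key: mean(values) for key, values in by_program.items()}
```

Each level sees fewer details: three site values become two study means, which become one program figure.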
Metric Definition Template
Template fields:
• Critical Success Factor
• Metric Description
• Key Performance Question to be answered with metric
• Why is the Key Performance Question important?
• What actions might be taken based upon results?
• What the metric does not tell you
• Metric Type
• Reporting Level (Portfolio / Study / Country / Site)
• Basic/Advanced
• Formula / Example
• Performance Target (suggested)
• Additional Analysis for Missed Target
• Companion Metrics
• Reporting Frequency
• Sub-process
• Glossary Terms (See Glossary Tab)
• Data Elements
Existing Metric Toolkits: MCC Library of 200 Consensus-based Metrics
• Business Operations Metrics
• Clinical CAPA Metrics
• Clinical Operations & Site Management Metrics
• Compliance/QA Metrics
• Data Mgmt & Biostats Metrics
• Lab Performance Metrics
• ECG, ABPM and Spirometry Performance Metrics
• eCOA Performance Metrics
• Imaging Performance Metrics
• Risk-based Monitoring
o Site Key Risk Indicator Set
o Pilot Success Factors
• Safety/PV Metrics
• Site Contracting Performance Metrics
• Trial Master File Metrics
• Vendor Oversight Milestone and Relationship Assessment Metrics
Metric Toolkit Components and Purpose
• White Paper: High-level overview (publicly available)
• Recorded Webinar: Overview of Metrics Workbook and Tools
• Metrics Workbook: Contains all metric definitions, process map and glossary
NEW! Metric Implementation Support Tools:
• Implementation Guide: To aid implementation
• Tool 1: Metrics to Data Elements: To determine data elements needed to calculate specific metrics
• Tool 2: Data Elements to Metrics: To determine metrics available with a given set of data elements
• Tool 3: Issues to Metrics: To determine recommended metrics and data elements for particular issues
HOW DOES ICH E6(R2) IMPACT
WHAT YOU MEASURE?
[Timeline slide: 2009, 2011, 2014, 2015, 2016]
Do your metrics measure the right things?
As organizations update their processes to accommodate early
identification and oversight of critical processes and critical
data, many question whether they are measuring the right
metrics. For example:
• Do their metrics support the new approach – measure and reward the
right behavior?
• Do the metrics provide appropriate oversight of the revised
processes?
• Do they align with reviewing critical data or do they still count all
queries and missing data?
• Can they use past performance metrics to generate risk scores?
Audience Poll #1
Has your organization considered how ICH E6(R2) impacts what you measure?
1. Yes
2. No
3. Don’t know
ICH E6(R2) 5.0: “The Quality Management System should use a risk-based approach…”
• MCC Study Quality Trailblazers (Risk Assessment & Mitigation):
o 5.0.1 Critical Process & Data Identification
o 5.0.2 Risk Identification
o 5.0.3 Risk Evaluation
o 5.0.4 Risk Control
o 5.0.5 Risk Communication
o 5.0.6 Risk Review
o 5.0.7 Risk Reporting
• MCC Centralized Monitoring WG (KRI & Data Monitoring):
o 5.18.3 Extent & Nature of Monitoring
Metrics are a Key Component of Risk Evaluation
Likelihood: Has it occurred in the past? How often?
Impact: What was the impact when it occurred?
Detectability: Can you measure something that can provide a signal that the error may occur / did occur?
Risk Score = Likelihood x Impact x Detectability
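A minimal sketch of the risk-score calculation, assuming a 1-5 ordinal scale for each dimension (the scale and the scoring direction are assumptions, not stated on the slide):

```python
def risk_score(likelihood: int, impact: int, detectability: int) -> int:
    """Risk Score = Likelihood x Impact x Detectability.

    Assumes each dimension is scored 1-5; a higher detectability
    score here means the error is HARDER to detect (an assumption).
    """
    for value in (likelihood, impact, detectability):
        if not 1 <= value <= 5:
            raise ValueError("each dimension must be scored 1-5")
    return likelihood * impact * detectability

# A risk that occurred often (4), with moderate impact (3),
# and that is fairly easy to detect (2):
score = risk_score(likelihood=4, impact=3, detectability=2)  # 4 * 3 * 2 = 24
```

Multiplying the three dimensions means a risk must score high on several of them before it rises to the top of the priority list.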
Metrics are a Key Component of Risk Control
Key Risk Indicators: measurements that indicate when changes in the risk profile are occurring (e.g., a risk may be becoming, or has become, an issue).
Examples of site-level performance metrics used as KRIs*:
• Protocol deviation rate
• SAE reporting rate
• Query rate
• Query response time
• Proportion of data changed after initial entry
*Source: MCC Centralized Monitoring Practices Report (in press)
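One common way to turn a site-level metric such as query rate into a KRI signal is to compare each site against the study distribution. The site names, rates, and the one-standard-deviation threshold below are all illustrative assumptions:

```python
from statistics import mean, stdev

# Hypothetical query rates (queries per 100 data points entered) by site.
query_rates = {
    "Site 101": 2.1,
    "Site 102": 2.4,
    "Site 103": 6.8,
    "Site 104": 1.9,
}

study_mean = mean(query_rates.values())
study_sd = stdev(query_rates.values())

# Flag sites more than one standard deviation above the study mean,
# signalling that the risk profile at those sites may be changing.
flagged = [site for site, rate in query_rates.items()
           if rate > study_mean + study_sd]
```

The flag only indicates where to look; as the Summary slide notes, the follow-up and any fix still come from people.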
The MCC Study Quality Trailblazer Group updated the Risk Assessment & Mitigation Management Tool (RAMMT) to improve alignment with ICH E6(R2) 5.0 (5.0.1-5.0.7) quality management.
• RAMMT v2.0 released 9/2018
• MCC Risk Management Training Course available 12/2018
5.18.3 - Extent and Nature of Monitoring
• Centralized Monitoring Practices Industry Survey (available for purchase 12/2018)
• Centralized Monitoring Guidance Document (available to members)
• Quality Tolerance Limits vs Statistical Process Control Trigger Levels
• Staff training: “critical thinking” skills development
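The contrast between a Quality Tolerance Limit and a Statistical Process Control trigger level can be sketched as follows. The monthly rates, the fixed QTL value, and the 3-sigma rule are illustrative assumptions:

```python
from statistics import mean, stdev

# Hypothetical monthly protocol deviation rates per 100 subjects.
rates = [1.2, 0.9, 1.5, 1.1, 1.3, 2.9]

# Quality Tolerance Limit: a fixed, predefined study-level threshold.
qtl = 3.0

# SPC trigger level: derived from the observed variation of the process
# (here, a 3-sigma upper control limit over the earlier months).
baseline = rates[:-1]
upper_control_limit = mean(baseline) + 3 * stdev(baseline)

latest = rates[-1]
qtl_breached = latest > qtl                    # still within tolerance
spc_triggered = latest > upper_control_limit   # but out of statistical control
```

In this sketch the latest month stays inside the fixed tolerance but breaks the statistically derived limit, which is exactly why the two trigger types can disagree and are worth distinguishing.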
Summary
• Implementation of risk-based quality management and centralized monitoring programs is invigorating efforts to update metric programs
• What you measure sends a message about what is important
• KRIs and metrics don’t fix problems – people do!
• ICH E6(R2) section 5.0: Risk Assessment & Mitigation Management Tool (v2)
• Study, Site and Patient Burden Scores: Protocol Operational Complexity Scoring Tool
• Financial impact of quality issues: Cost of Poor Quality Estimator Tool