

Business Case for Quality: Best Practices Surrounding

Information Technology

Stephen S. Raab, M.D.

Department of Pathology, University of Pittsburgh, Pittsburgh, PA

April 29, 2005

Background

Medical error is the failure of a planned action to be completed as intended or the use of a wrong plan to achieve an aim

Medical errors permeate all levels of patient care; the IOM report has led to greater recognition of error and an increased focus on reporting and reducing errors

Background

– Scant data on anatomic pathology error frequencies
– Institutional error frequencies may be low, limiting hypothesis testing
– Limited data on how pathology errors affect patient care
– Root cause analysis of anatomic pathology errors is rarely performed, and the results of these analyses are not disseminated

Error prevention

– Minimize psychological precursors; design buffers; build in redundancy
– Design tasks: simplification; constraints; standardization
– Systems designed to absorb error
– Root cause analysis
– Shift from training focus to performance focus
– Data collection to define the problem

Initiatives to decrease errors

– Creation of audit systems
– Use of benchmarking and error tracking systems
– Implementation of immediate error reduction systems (e.g., Toyota Production System, Six Sigma)

Information technology and patient safety

– Most systems are focused on diagnostic reporting
– Difficulty in performing quality assurance
– Few checks to prevent errors
– Systems themselves are often a source of error
– Few databases to study quality

Quality assurance

– Cytologic-histologic (CH) correlation is performed in all American labs (CLIA '88)
– No national standards for performing CH correlation
– Information systems are not structured to perform CH correlation
  – Case pairs may be visualized prior to or after sign-out
  – Most CH correlation is performed by manual review

Cytologic-histologic correlation

– What is the best way to perform correlation, and how does one use the data?
– A letter was sent to 162 American labs requesting information on how they performed correlation (response frequency: 32.1%)
– Material was separated into forms, logs, and tally sheets

CAP Checklist

– Cytology case number
– Sign-out cytology diagnosis
– Sign-out cytologist
– Original cytotechnologist diagnosis
– Sign-out cytotechnologist
– Review cytology diagnosis
– Review cytologist
– Surgical pathology case number
– Sign-out surgical pathology diagnosis
– Sign-out surgical pathologist
– Review surgical pathology diagnosis
– Review surgical pathologist
– Significance of discrepancy
– Action taken
– Reason for correlation

Minimum expected and additional variables listed on forms only. Bold line represents minimum expected variables that should be present (n = 15).

[Bar chart: Received Form Material (n = 31); y-axis: Number of Items (0–35); series: Minimum Expected vs. Additional Variables]

UPMC technical quality assurance

Record laboratory error at each step:
– Accessioning
– Gross room
– Histology
– Cytology
– Transcription

Recorded errors by category, site (PUH, Shy), and month:

                 ------ Jan ------   ------ Feb ------   ----- March -----
Category         PUH   Shy   Total   PUH   Shy   Total   PUH   Shy   Total
ACC               39    19     58     52     6     58     38     6     44
APLIS              1     1      2      0     0      0      0     0      0
BILL               2     0      2      1     0      1      1     0      1
CQI                0     0      0      0     0      0      0     0      0
DICT              41    37     78     46    13     59     61    39    100
FREE TEXT          1     1      2      2     1      3      1     2      3
HISTO              8     2     10      9     1     10      7     3     10
IHC                1     0      1      1     0      1      1     0      1
PATH              20     0     20     15     1     16      2     3      5
PROC              14     0     14      3     0      3      2     1      3
QUAL              12     5     17     12    11     23     14     7     21
REGIST            35     1     36     28     0     28     55     1     56
REQ               33    30     63     29    50     79    165   155    320
TRAN               3     3      6      3     1      4      3     1      4
TUBE               0     0      0      0     0      0      0     0      0

Total specimens          6946                 6275                 6924
Total parts              9935                 9038                10074
Total blocks            13890                14146                15078
Total stains            22231                20913                22217

National benchmarking

– CAP Q-Probes studies: error rates, turnaround time, amended report rates
– CAP Q-Tracks studies:
  – Cytologic-histologic correlation
  – Frozen-permanent section review
  – Small surgical specimen turnaround time

Second viewing of cases

– CAP Q-Probes study, 2004: 74 labs, 6,186 specimens, 415 discrepancies (6.7% rate)
– Breakdown of errors:
  – 48% change within the same category of diagnosis
  – 21% change in category of diagnosis
  – 18% typographical errors
  – 9% patient or specimen information
  – 4% change in margin status

Cytologic-histologic correlation

– Year-end report for participating labs (56)
– In 2003, a total of 19,478 Paps were correlated, with 11,336 true-positive correlations and 2,433 false-positive correlations
– Predictive value of positive cytology: 82.3%
– Percent positive ASC diagnoses: 64.3% (range 34.1%–89.9%); percent positive AGC diagnoses: 31.4% (range 0%–66.7%)
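The predictive value above follows directly from the reported correlation counts; a quick check (values taken from the slide, variable names are mine):

```python
# PPV of a positive Pap cytology result, from the 2003 correlation counts:
# 11,336 true-positive and 2,433 false-positive correlations.
true_pos = 11_336
false_pos = 2_433

ppv = true_pos / (true_pos + false_pos)
print(f"PPV = {ppv:.1%}")  # → PPV = 82.3%
```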

Cytologic-histologic correlation

– Best performers were identified and clustering was performed
– Best performer practices:
  – Quarterly reports; track and trend
  – Summary reports to clinicians
  – Document correlation in the report (5/9 labs)
  – Keep a written log of findings (2/9 labs)
  – Review biopsy if discrepancy (3/9 labs)

AHRQ national errors database

– Five-year project to monitor pathology errors through the creation of a multi-institutional database
– Participating institutions: University of Pittsburgh, University of Iowa, Henry Ford Health System, Western Pennsylvania Hospital
– Goal: devise plans to reduce cytology and surgical pathology errors

Specific aims

1. Create a voluntary, Web-based database and collect errors detected by correlation, secondary review, amended reports, and frozen section review

2. Quantitatively analyze error data and generate performance improvement reports

3. Perform root cause analysis; plan and implement interventions to reduce errors

4. Assess the success of interventions by quantitative measures; disseminate successful error reduction plans

Database challenges

– Current absence of standardized and detailed laboratory workload and quality assurance data sets in widely used laboratory information systems

– Current lack of efficient and comprehensive electronic de-identification of unlinked institutional laboratory information system and clinical data

Database construction

The database is Oracle 9.2.0.4 Enterprise Edition implemented on a Sun Ultra E450 server running Solaris 2.9. The mid-tier is implemented with Oracle Application Server (v9.0.3) on a Compaq DL360 server running Windows 2000. The application uses the Oracle HTTP Server and mod_plsql extension to generate dynamic web pages from the database, delivered to users via Microsoft Internet Explorer version 6.0 or higher.

Logical system design

– Schema layer: contains the actual data and data relations, stored entirely as numbers and keys

– Meta-data layer: defines the data in terms of data elements and "data objects"

– Procedure layer: contains a set of dynamic procedures/functions (in PL/SQL) that externalize the data elements

– Presentation layer: a series of forms (for data entry, display, query, etc.) populated by data elements from the meta-data layer
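The four-layer design can be sketched in miniature. The following is a hypothetical illustration using Python and SQLite in place of Oracle and PL/SQL; all table, column, and value names are assumptions, since the source does not show the actual schema:

```python
import sqlite3

# Schema layer: actual data stored entirely as numbers and keys.
# Meta-data layer: a table that defines what each key means.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE data_element (id INTEGER PRIMARY KEY, name TEXT);           -- meta-data layer
CREATE TABLE fact (case_id INTEGER, element_id INTEGER, value INTEGER);  -- schema layer

INSERT INTO data_element VALUES (1, 'error_reason'), (2, 'outcome_grade');
INSERT INTO fact VALUES (1001, 1, 3), (1001, 2, 2);
""")

# Procedure layer: in the real system, PL/SQL procedures externalize the
# data elements; here a plain join stands in for them.
rows = con.execute("""
    SELECT f.case_id, d.name, f.value
    FROM fact f JOIN data_element d ON d.id = f.element_id
    ORDER BY d.id
""").fetchall()

# The presentation layer would render these rows into forms for entry/display.
print(rows)  # → [(1001, 'error_reason', 3), (1001, 'outcome_grade', 2)]
```

Because the schema layer holds only keys, new data elements can be added by inserting meta-data rows rather than altering tables — one plausible reading of why the design separates these layers.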

Future database plans

– Database interface to laboratory information systems
– Data de-identification
– Revision of database architecture
– Incorporation of histopathologic image data

Variable frequency of errors across sites

Institution   Number of errors   Total number of cases   Error percent
A                   196                 28,396                0.69
B                   279                 66,115                0.42
C                   796                143,835                0.55
D                   174                 82,787                0.21
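The error percentages are simply errors divided by total cases; a quick check (site counts copied from the table):

```python
# Error frequency per institution: errors / total cases, as a percentage.
sites = {"A": (196, 28_396), "B": (279, 66_115),
         "C": (796, 143_835), "D": (174, 82_787)}
for site, (errors, cases) in sites.items():
    print(f"{site}: {errors / cases:.2%}")
# → A: 0.69%   B: 0.42%   C: 0.55%   D: 0.21%
```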

Correlating case error frequency

Site   Gyn correlating error %   Non-gyn correlating error %
A              9.49                      11.03
B              1.65                       5.86
C              4.72                      11.72
D              3.33                       6.14

Variability in assessing sampling versus interpretive error

                       Institution
Reason            A      B      C      D     Total
Interpretation   21%    11%    65%    4%     33%
Screening         2%     8%     4%    2%      5%
Sampling         61%    73%    38%   92%     60%
Unknown          18%     9%     0%    3%      6%

Agreement between original and review reason for discrepancy (interpretation or sampling)

                          Review reason (by site)
Original reason (site)    A       B       C       D
A                         0.615   0.412   0.024   0.286
B                         0.737   0.211   0.118   0.737
C                         0.400   0.545   0.615   0.400
D                         Und     Und     Und     Und

Percentage of total error by specimen type

                  Institution
Organ        A      B      C      D     Total
Lung        52%    20%    35%    17%    34%
Thyroid      0%     5%     8%     2%     6%
Bladder     22%    14%    16%    33%    18%
Breast       2%     2%     5%    17%     5%

Error outcome taxonomy

– Significant event: an error that affects patient outcome; may be classified by severity (mild, moderate, and severe)

– No-harm event: an error that does not affect patient outcome

– Near-miss event: intervention occurred before harm could take place

Differences in grading of discrepancy assignments

                        Institution
Assignment           A      B      C      D     Total
Significant event    0%     0%    28%    14%    22%
No harm             86%    98%    32%    50%    44%
Near miss           14%     0%    40%    14%    32%
Unknown              0%     2%     0%    21%     2%
