Medical Errors: Causes and Prevention
Roger L. Bertholf, Ph.D., Associate Professor of Pathology, University of Florida Health Science Center/Jacksonville

TRANSCRIPT

  • Medical Errors: Causes and Prevention
    Roger L. Bertholf, Ph.D., Associate Professor of Pathology, University of Florida Health Science Center/Jacksonville

  • IOM: To Err Is Human: Building a Safer Health System (2000)
    Frequency, Cost, Outcomes, Types, Causes, Recommendations

  • Adverse Event vs. Error
    An adverse event is an injury caused by medical management rather than the underlying condition of the patient. An adverse event attributable to error is a "preventable adverse event." Negligent adverse events represent a subset of preventable adverse events that satisfy legal criteria used in determining negligence (i.e., whether the care provided failed to meet the standard of care reasonably expected of an average physician qualified to take care of the patient in question).*
    An error is defined as the failure of a planned action to be completed as intended (i.e., error of execution) or the use of a wrong plan to achieve an aim (i.e., error of planning).

    *About half of preventable AEs are considered negligent

  • Examples of Medical Errors
    Diagnostic error (inappropriate therapy)
    Equipment failure
    Infection (nosocomial, post-operative)
    Transfusion-related injury
    Misinterpretation of medical orders
    System failures that compromise diagnostic or treatment processes

  • Frequency of Medical Errors
    For comparison, annual deaths: MVA = 43,000; breast CA = 42,000; AIDS = 16,000. Medical error ranks as the 8th most frequent cause of death overall.

    Study           AEs     Errors   Fatal    Est. Deaths
    NY (1984)       3.7%    58%      13.6%    98,000
    CO/UT (1992)    2.9%    53%      6.6%     44,000

  • How reliable is this estimate?
    Includes only AEs producing a specified level of harm
    Two reviewers had to agree on whether an AE was preventable or negligent
    Included only AEs documented in the patient record*

    *Some studies, using other sources of information about adverse events, produced higher estimates.

  • Cost
    Adverse events: $37.6–50 billion*
    Preventable adverse events: $17–29 billion
    Half of cost is for health care
    Represent 4% (AEs) and 2% (errors) of all health care costs

    *lost income, lost household production, disability, health care costs

    Exceeds total cost of treating HIV and AIDS

  • Causes**
    **Leape et al. (1991), The nature of adverse events in hospitalized patients (1,133 AEs studied in 30,195 admissions)

    Overall frequency (inpatients) is 3 per 1,000 medication orders; 2 per 1,000 considered significant errors

    Medication error           19%
    Wound infection            14%
    Technical complications    13%

  • AHA List of Medication Errors
    Incomplete patient information
    Unavailable drug information (warnings)
    Miscommunication of medication order
    Confusion between drugs with similar names
    Lack of appropriate drug labeling
    Environmental conditions that distract health care providers

  • Most Common Medication Errors

    Failure to adjust dosage in response to a change in hepatic/renal function    13.9%
    History of allergy to the same or related medication                          12.1%
    Wrong drug name, dosage form, or abbreviation on order                        11.4%
    Incorrect dosage calculation                                                  11.1%
    Atypical or unusual critical dosage consideration                             10.8%

  • A Comparison of Risks

    Risk (per flight) of dying in a commercial airline accident:    1 in 8 million*
    Risk (per hospital admission) of dying from a medical error:    >1 in 1,000

    *1 in 2 million from 1967-1976
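    To put the two figures on a common scale, here is a minimal back-of-the-envelope sketch (purely illustrative; a flight and a hospital admission are, of course, very different exposures):

```python
# Per-event risks quoted on the slide above
risk_per_flight = 1 / 8_000_000     # dying in a commercial airline accident, per flight
risk_per_admission = 1 / 1_000      # dying from a medical error, per admission (lower bound, ">1 in 1,000")

# Ratio of the two per-event risks: roughly an 8,000-fold difference
print(risk_per_admission / risk_per_flight)   # 8000.0
```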

  • Six Sigma Quality Control
    Quality Management program designed by Mikel Harry and Richard Schroeder in 2000
    Strives to make QM a quantitative science
    Sets performance standards and goals for a production process

  • Six Sigma Paradigm: DMAIC (Define, Measure, Analyze, Improve, Control)

  • Six Sigma Process Performance
    (Figure: normal distribution of process output, showing the target, the + and - tolerance limits, and standard deviations from -6 to +6 on the axis)

  • Six Sigma Performance
    Goal is to achieve < 1 DPM (defect per million)
    Not all processes can achieve the 6σ level of performance
    Deming's principle is that fewer defects lead to increased productivity, efficiency, and lower cost

  • Healthcare's Six Sigma Performance

    Process                       % Errors    Sigma
    Preventable adverse events    3.0         2.5
    Lab order accuracy            1.8         3.6
    Reporting errors              0.048       4.8
    False negative PAP            2.4         3.45
    Unacceptable specimen         0.3         4.25
    Duplicate test orders         1.52        3.65
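    One common way to relate the "% Errors" and "Sigma" columns is the conventional Six Sigma calculation: take the normal quantile of the process yield and add the customary 1.5-sigma long-term shift. The slides do not show a formula, so this convention is an assumption, but it reproduces most of the sigma values in the table; a minimal sketch:

```python
from statistics import NormalDist

def sigma_level(error_rate: float, shift: float = 1.5) -> float:
    """Convert a defect (error) rate to a Six Sigma 'sigma level'.

    Assumes the conventional 1.5-sigma long-term shift; the slides do not
    state which convention was used.
    """
    process_yield = 1.0 - error_rate
    return NormalDist().inv_cdf(process_yield) + shift

# Error rates from the table above, expressed as fractions rather than percent
print(round(sigma_level(0.018), 2))     # Lab order accuracy    -> ~3.6
print(round(sigma_level(0.00048), 2))   # Reporting errors      -> ~4.8
print(round(sigma_level(0.003), 2))     # Unacceptable specimen -> ~4.25
```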

  • What Causes Accidents?

  • Sidney Dekker

    "What is striking about many accidents is that people were doing exactly the sorts of things they would usually be doing, the things that usually lead to success and safety. . . . Accidents are seldom preceded by bizarre behavior."

    From The Field Guide to Human Error Investigations (2002)

  • A Primer on Accident Investigation
    Human error as a cause
    Human error as a symptom

  • Human Error
    Bad Apple Theory
      Complex systems are inherently safe
      Human intervention subverts the inherent safety of complex systems
    Reaction to failure
      Bad outcome = bad decision
      Retrospective, proximal, counterfactual, and judgmental

  • The Bad Apple Theory
    The illusion of success
      Bad procedures often produce good results
      Success breeds confidence
      Failure is an aberration
      The system must be safe
    The economical answer
      It is easier to change human behavior than it is to change systems

  • Assigning Blame
    Retrospective

  • Retrospective Analysis

  • Assigning Blame
    Retrospective
    Proximal

  • Proximity
    It is intuitive to focus on the location where the failure occurred
    Sharp end vs. blunt end
      The sharp end is the point at which the failure occurs
      The blunt end is the set of systems and organizational structure that supports the activities at the sharp end

  • Retrospective Analysis
    (Diagram: sharp end vs. blunt end, with the blunt end comprising the institution, systems, procedures, and organization)

  • Assigning Blame
    Retrospective
    Proximal
    Counterfactual

  • What Might Have Been. . .
    In retrospect, it is always easy to see where different actions would have averted a bad outcome
    In retrospect, the outcome of any potential action is already known
    Counterfactuals pose alternate scenarios, which are rarely useful in determining the true cause

  • Assigning Blame
    Retrospective
    Proximal
    Counterfactual
    Judgmental

  • The Omniscient Perspective
    As an investigator, you always know more than the participants did
    It is very difficult, if not impossible, to judge fairly the reactions of those who had less information than you
    Investigators define failure based on outcome

  • Lessons for Investigators
    There is no primary cause
      Every action affects another
    There is no single cause
      Errors in complex systems are nearly always multi-focal
    A definition of human error is elusive
      Definition of error
      Humans operate within complex systems

  • Failure Mode and Effects Analysis (FMEA)
    Everything will eventually fail
    Humans frequently make errors
    The cause of a failure is often beyond the control of an operator

  • 10 Steps for FMEA
    1. Review the process
    2. Brainstorm potential failure modes
    3. List potential effects of each failure mode
    4. Assign a severity rating
    5. Assign an occurrence rating
    6. Assign a detection rating
    7. Calculate the risk priority number (RPN) for each effect
    8. Prioritize the failure modes based on the RPN and severity
    9. Take action to reduce or eliminate the high-risk failure modes
    10. Recalculate the RPN

  • Ranking the Failure Modes
    Calculate the RPN
      Rate Severity, Occurrence, and Detection on a scale of 1-10
      RPN = S x O x D (maximum 1,000)
    Prioritize failure modes
      Not strictly based on RPN
      Severity of 9 or 10 should get priority
    Goal is to reduce the RPN (a small worked sketch follows below)
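    Purely as an illustration of the RPN arithmetic and prioritization rule described above; the failure modes, names, and ratings below are hypothetical examples, not data from the presentation:

```python
# Hypothetical FMEA worksheet: each failure mode gets Severity, Occurrence,
# and Detection ratings on a 1-10 scale; RPN = S x O x D (maximum 1,000).
failure_modes = [
    # (description,                               S, O, D)
    ("Critical result not relayed to physician",  9, 4, 6),
    ("Verbal order transcribed incorrectly",      8, 5, 5),
    ("QC error code dropped at LIS interface",    7, 3, 8),
]

ranked = []
for description, s, o, d in failure_modes:
    rpn = s * o * d
    ranked.append((description, s, rpn))

# Prioritize: severity 9-10 gets priority first, then sort by RPN
# (i.e., prioritization is not strictly based on the RPN alone)
ranked.sort(key=lambda item: (item[1] >= 9, item[2]), reverse=True)

for description, severity, rpn in ranked:
    print(f"S={severity}  RPN={rpn:4d}  {description}")
```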

  • Case Exercise #1
    A 91-year-old female was transferred to a hospital-based skilled nursing unit from the acute care hospital for continued wound care and intravenous (IV) antibiotics for methicillin-resistant Staphylococcus aureus (MRSA) osteomyelitis of the heel. She was on IV vancomycin and began to have frequent, large stools.

  • Case Exercise #1
    The attending physician ordered a test for Clostridium difficile on Friday, and was then off for the weekend. That night, the test result came back positive. The lab called infection control, who in turn notified the float nurse caring for the patient. The nurse did not notify the physician on call or the regular nursing staff. Isolation signs were posted on the patient's door and chart, and the result was noted in the patient's nursing record. Each nurse who subsequently cared for this patient assumed that the physician had been notified, in large part because the patient was receiving vancomycin. However, it was IV vancomycin (for the MRSA osteomyelitis), not the oral vancomycin that is required to treat C. difficile.

  • Case Exercise #1
    On Monday, the physician who originally ordered the C. difficile test returned to assess the patient and found the isolation signs on her door. He asked why he was never notified and why the patient was not being treated. The nurse on duty at that time told him that the patient was on IV vancomycin. The float nurse, who had received the original notification from infection control, stated that she had assumed the physician would check the results of the test he had ordered. Due to the lack of follow-up, the patient went three days without treatment for C. difficile, and continued to have more than 10 loose stools daily. Given her advanced age, this degree of gastrointestinal loss undoubtedly played a role in her decline in functional status and extended hospital stay.

  • Case Exercise #1
    What are the systems/processes involved in this incident?
    What were the failure points?

  • Analysis
    MD failed to check the result of an ordered test
    Float RN wrongly assumed that the MD had been notified of the result
    RN incorrectly assumed that IV vancomycin was adequate therapy

  • Failure Points
    Laboratory system for reporting critical results
      Is a positive C. difficile culture considered a panic result?
      To whom are panic values reported?
    RN/MD communication
      Does the institution foster an environment where RNs can comfortably question MD orders?

  • Lisa Belkin
    ". . . it is virtually impossible for one mistake to kill a patient in the highly mechanized and backstopped world of a modern hospital. A cascade of unthinkable things must happen, meaning catastrophic errors are rarely a failure of a single person, and almost always a failure of a system."

    From How Can We Save the Next Victim? (NY Times Magazine, June 15, 1997)

  • Case Exercise #2
    An 81-year-old female maintained on warfarin for a history of chronic atrial fibrillation and mitral valve replacement developed asymptomatic runs of ventricular tachycardia while hospitalized. The unit nurse contacted the physician, who was engaged in a sterile procedure in the cardiac catheterization laboratory (cath lab) and gave a verbal order, which was relayed to the unit nurse via the procedure area nurse. Someone in the verbal order process said "40 of K." The unit nurse (whose past clinical experience was in neonatal intensive care) wrote the order as "Give 40 mg Vit K IV now."

  • Case Exercise #2
    The hospital pharmacist contacted the physician concerning the high dose and the route and discovered that the intended order was "40 mEq of KCl po." The pharmacist wrote the clarification order. However, the unit nurse had already obtained vitamin K on override from the Pyxis MedStation (an automated medication dispensing system) and administered the dose intravenously (IV). The nurse attempted to contact the physician but was told he was busy with procedures. A routine order to increase warfarin from 2.5 mg to 5 mg (based on an earlier INR) was written later in the day and interpreted by the evening shift nurse as the physician's response to the medication event. The physician was not actually informed that the vitamin K had been administered until the next day. Heparin was initiated and warfarin was re-titrated to a therapeutic level. The patient's INR was sub-therapeutic for 3 days, but no untoward clinical consequences occurred.

  • Case Exercise #2
    What are the systems/processes involved in this incident?
    What were the failure points?

  • Analysis
    Verbal orders
    Third-party messengers
    Use of abbreviations
    Failure to question unusual orders
    Lack of control over medication availability

  • Failure Points
    Hospital policy for medication orders
      "Read back" requirement
    Ability to circumvent pharmacist review

  • J.C.R. Licklider (1915-1990)

    It seems likely that the contributions of human operators and [computers] will blend together so completely in many operations that it will be difficult to separate them neatly in analysis.

    From Man-Computer Symbiosis (1960)

  • Anatomy of a Laboratory Error

  • Phase I: A Failed Calibration
    Recalibration of the acetaminophen assay was prompted by a QC failure
    Recalibration was followed by acceptable QC results

  • Phase II: QC Failures
    Subsequent QC measurements produced an error code indicating the result was above the linear limit of the method
    QC failures went unnoticed, since the LIS did not display the error code
    Several patient specimens were reported incorrectly, resulting in inappropriate treatment

  • Phase III: Discovery
    The ED staff contacted the laboratory to question the high acetaminophen result on a patient who denied recent ingestion of the drug
    Investigation revealed the QC failures, and the assay was successfully recalibrated

  • Phase IV: Investigation
    Principal questions:
      Why was an acceptable QC result obtained immediately after a failed calibration?
      Why didn't the technologists notice subsequent QC failures?
      Should the clinicians have been more suspicious of unusually high results?

  • The Process

  • Failure Points in The Process

  • Unrecognized Calibration Failure
    Roche Modular
    Throughput/timing algorithm

  • Unnoticed QC Failures
    Interface through Digital Innovations box
    Error codes are rare in QC results
    Supervisory review does not occur regularly on weekends
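    The interface issue above (an error flag that never reaches the LIS) can be illustrated with a small, purely hypothetical sketch; the field names and logic are invented for illustration and do not describe the actual Roche, Digital Innovations, or LIS interfaces:

```python
# Hypothetical instrument result record: a numeric value plus an error/flag field.
raw_result = {"test": "ACETAMINOPHEN", "value": 250.0, "flag": ">LIN"}  # above linear limit

def forward_to_lis_lossy(result):
    """Interface that forwards only the numeric value, silently dropping the flag."""
    return {"test": result["test"], "value": result["value"]}

def forward_to_lis_safe(result):
    """Interface that preserves the flag so downstream review can see the failure."""
    forwarded = dict(result)
    if forwarded.get("flag"):
        forwarded["comment"] = f"Instrument flag {forwarded['flag']}: review before reporting"
    return forwarded

print(forward_to_lis_lossy(raw_result))  # flag lost: the QC failure is invisible to the operator
print(forward_to_lis_safe(raw_result))   # flag retained: the failure is visible for review
```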

  • Lack of Clinical Suspicion
    History is often unreliable in overdose cases
    An antidote for acetaminophen exists
    Symptoms of acetaminophen toxicity may not appear until after the window of therapeutic opportunity has passed

  • Conclusions
    An unexpected error occurred in the calibration algorithm encoded in the instrument software
    The failure of information to cross the instrument/LIS interface masked the erroneous control results
    Suspect results were not immediately apparent to clinicians

  • Lessons
    Complex technologies always have unexpected failure modes
    Interfaces between systems and operators are opportunities for distortion or loss of important information
    The fallacy of the un-rocked boat

  • Richard I. Cook
    "Recognizing hazard and successfully manipulating system operations to remain inside the tolerable performance boundaries requires intimate contact with failure."

    From How Complex Systems Fail (2002)

  • How Complex Systems Fail
    Complex systems are intrinsically hazardous systems
    Complex systems are heavily and successfully defended against failure
    Catastrophe requires multiple failures; single-point failures are not enough
    Complex systems contain changing mixtures of failures latent within them

  • How Complex Systems Fail
    Catastrophe is always just around the corner
    Post-accident attribution to a root cause is fundamentally wrong
    Human operators have dual roles: as producers and as defenders against failure
    Human practitioners are the adaptable element of complex systems

  • How Complex Systems Fail
    Change introduces new forms of failure
    Safety is a characteristic of systems and not of their components
    Failure-free operations require experience with failure

  • IOM Recommendations
    Establish a national focus
    Identify and learn from medical errors through mandatory reporting
    Raise standards and expectations
    Implement safe practices

  • AHRQ Safety Recommendations for Patients
    Ask questions if you have doubts or concerns
    Keep and bring a list of ALL the medicines you take
    Get the results of any test or procedure
    Talk to your doctor about which hospital is best for your health needs
    Make sure you understand what will happen if you need surgery