TRANSCRIPT
9/11/2017
1
BEST PRACTICES IN QUALITY IMPROVEMENT STRATEGIES FOR BIRTH DEFECTS SURVEILLANCE PROGRAMS
Recommendations for both active and passive case ascertainment methodologies
Jason L. Salemi, PhD, MPH
Assistant Professor, Baylor College of Medicine
NBDPN President-Elect, 2017; Chair, Surveillance Guidelines and Standards Committee (SGSC), 2016-current
PRESENTATION OUTLINE
• Intro to data quality management
  • Criteria/metrics used to assess the quality of birth defects data
  • Difference between quality control (QC), assurance (QA), and improvement (QI)
  • Overarching differences according to case ascertainment strategy
• QI strategies for programs with active case finding
  • Example program: Texas Birth Defects Registry
• QI strategies for programs with passive case finding
  • Example program: Florida Birth Defects Registry
• Closing messages and additional resources
  • NBDPN Surveillance Guidelines and Standards
  • SGSC and Data Committee participation
INTRO TO DATA QUALITY MANAGEMENT
CRITERIA/METRICS USED TO ASSESS QUALITY OF BIRTH DEFECTS DATA
QUALITY CONTROL (QC) vs. QUALITY ASSURANCE (QA)
QUALITY CONTROL
• Retrospective and reactive approach
• Discovery and detection
• Re‐casefinding, re‐abstracting
• Capture‐recapture for completeness
• Case verification for accuracy (validity audits)
• Timeliness monitoring
• Data source evaluations
QUALITY ASSURANCE
• Proactive
• Prevention, avoidance of inaccurate data
• Documentation of casefinding, abstraction, coding
• Database maintenance (real‐time validity checks)
• Standardization of variables collected
• Medical record reviews prior to data use
• Expert clinical reviewers
• Both refer to a set of methods, activities, and procedures to achieve high-quality data
QC "detects the deficiencies"; QA "redesigns the process to prevent recurrence"
DIFFERENCES IN QI ACCORDING TO CASE ASCERTAINMENT STRATEGY
ACTIVE CASE ASCERTAINMENT
• Field staff engage in the process of case identification, gathering data, and confirming diagnoses
• Quality improvement focuses mostly on improving the way staff ascertain cases
• Re‐casefinding
• Re‐abstracting
• Validity audits and medical record reviews
• Clinical review
• Reliability and interrater agreement checks
• Timeliness measurements
• Data source evaluation
• Computer technology
PASSIVE CASE ASCERTAINMENT
• Surveillance system receives case reports from data sources and/or uses data linkage to identify cases
• Diagnosis reported on a case report or present in existing data sources may not be verified (some passive programs do include diagnosis confirmation)
• Quality improvement focuses mostly on improving the results of the data collection/linkage processes
• Data source evaluation (reporting from hospitals)
• Capture‐recapture projects (completeness)
• Case‐confirmation projects (accuracy)
• Timeliness measurements
• Relative contribution of data sources
• May require active surveillance initiatives
QUALITY IMPROVEMENT STRATEGIES: ACTIVE CASE FINDING
CASE STUDY: TEXAS BIRTH DEFECTS REGISTRY
• Program highlights
  • Texas Birth Defects Epidemiology and Surveillance Branch (Texas Dept of State Health Services)
  • Started in 1994, available data from 1996, statewide since 1999
  • Covers all major structural birth defects (through age 1) and FAS (through age 6)
  • Data sources used in casefinding capture live births, spontaneous and elective terminations
  • Surveillance covers over 380,000 resident live births each year
• Casefinding
  • Active, population-based
  • Discharge summaries, L&D logs, postmortem/pathology logs, surgery logs, specialty outpatient clinics, genetics and radiology logs, NICU logs, midwifery facilities
  • In-person abstraction at facility, and remote access
  • Paper abstraction forms and direct entry into database management system (Maven)
CASE STUDY: TEXAS BIRTH DEFECTS REGISTRY
• Staff organization and terminology
  • Registry Operations Manager (1)
  • Regional Supervisor (6)
  • QA Specialist (10)
  • Surveillance Specialist (41)
  • Clinical Reviewer (3)
• Overview of selected QA/QC/QI activities
  • Re-casefinding
  • Re-review of medical records
  • Re-abstraction
  • Field review
  • Data entry review
  • Clinical review (regular clinical review and diagnosis code review)
[Workflow diagram: Review case lists → Request medical records → Review records → Abstract info on cases → Data entry (Maven): confirm, document, code (e.g., Down syndrome: ICD-9 758.0; BPA-TX 758.000, 758.040, etc.) → Clinical review and classification]
RE‐CASEFINDING
RE‐CASEFINDING
• Purpose
  • To evaluate the accuracy and completeness of the CASEFINDING process
• Simply stated
  • Was the decision to deem an entry a "potential case" or "not a potential case" correct?
• Process
  • Surveillance specialist (SS) reviews casefinding sources (hospital logs, ICD-based discharge lists)
  • Many times these sources have NOT been "customized" to only include potential cases
  • SS assigns status as POTENTIAL CASE or NOT A POTENTIAL CASE
  • QA specialist performs the same task for a sample of entries
RE‐CASEFINDING
• Quality measures/indicators
  • Percentage of potential cases missed
    • DENOMINATOR: potential cases identified by QA specialist (gold standard)
    • NUMERATOR: potential cases identified by QA specialist – potential cases identified by both SS and QAS ("potential cases missed")
  • Percentage of entries incorrectly identified as "not potential cases"
    • DENOMINATOR: number of entries reviewed
    • NUMERATOR: potential cases identified by QA specialist – potential cases identified by both SS and QAS ("potential cases missed")
• Frequency
  • 50 entries per surveillance specialist per quarter (200 per year)
RE‐CASEFINDING (EXAMPLE)
• Cara “re‐casefinds” 200 entries that Jason originally reviewed
• Cara decided that there were 100 entries that were “potential cases”
• Of those 100, Jason identified 95 as “potential cases” and 5 as “not potential cases”
• Percentage of potential cases missed
  • DENOMINATOR: 100
  • NUMERATOR: 100 – 95 = 5
  • INDICATOR: 5%
• Percentage of entries incorrectly identified as "not potential cases"
  • DENOMINATOR: 200
  • NUMERATOR: 100 – 95 = 5
  • INDICATOR: 2.5%
• What if Jason also identified 2 as “potential cases” that Cara identified as “not potential cases”?
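The two indicators above are simple ratios; a minimal Python sketch (function and variable names are illustrative, not from any registry codebase) makes the shared numerator and differing denominators explicit:

```python
def recasefinding_indicators(n_reviewed, qas_potential, agreed_potential):
    """Compute the two re-casefinding quality indicators.

    n_reviewed       -- total entries re-reviewed by the QA specialist
    qas_potential    -- entries the QA specialist deemed "potential cases" (gold standard)
    agreed_potential -- entries BOTH the SS and the QA specialist deemed "potential cases"
    """
    missed = qas_potential - agreed_potential
    pct_potential_missed = 100 * missed / qas_potential    # denominator: gold standard
    pct_entries_missed = 100 * missed / n_reviewed         # denominator: all entries reviewed
    return pct_potential_missed, pct_entries_missed

# Slide example: Cara re-casefinds 200 entries, flags 100 as potential cases;
# Jason agreed on 95 of those 100.
print(recasefinding_indicators(200, 100, 95))  # -> (5.0, 2.5)
```

The same two-indicator shape applies to the re-review of medical records later in the deck; only the unit changes (medical records instead of casefinding entries).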
RE‐REVIEW OF MEDICAL RECORDS
RE‐REVIEW OF MEDICAL RECORDS
• Purpose
  • To evaluate the accuracy and completeness of the MEDICAL RECORD REVIEW process
• Simply stated
  • Was the decision to "abstract" or "not abstract" correct?
• Process
  • Once identified as a potential case, medical records are requested or accessed remotely
  • Surveillance specialist (SS) reviews all relevant medical records for the potential case
  • SS decides whether to ABSTRACT (is a case) or NOT ABSTRACT (not a case)
  • QA specialist performs the same task for a sample of medical records
  • IMPORTANT: re-review should be done on the same day, since information used to make the final decision may change over time
• Include mix of on‐site review and remote access
RE‐REVIEW OF MEDICAL RECORDS
• Quality measures/indicators
  • Percentage of cases missed
    • DENOMINATOR: cases identified by QA specialist (gold standard), i.e., decision was to abstract
    • NUMERATOR: cases identified by QA specialist – cases identified by both SS and QAS ("cases missed")
  • Percentage of medical records incorrectly identified as "non-cases"
    • DENOMINATOR: number of medical records reviewed
    • NUMERATOR: cases identified by QA specialist – cases identified by both SS and QAS ("cases missed")
• Frequency
  • 12 per surveillance specialist per quarter (48 per year)
RE‐REVIEW OF MEDICAL RECORDS (EXAMPLE)
• Cara “re‐reviews” 48 medical records that Jason originally reviewed in a given year
• Cara decided that there were 30 entries that were “cases (abstract)”
• Of those 30, Jason identified 29 as “cases (abstract)” and 1 as “non‐cases (don’t abstract)”
• Percentage of cases missed
  • DENOMINATOR: 30
  • NUMERATOR: 30 – 29 = 1
  • INDICATOR: 3.3%
• Percentage of medical records incorrectly identified as "non-cases"
  • DENOMINATOR: 48
  • NUMERATOR: 30 – 29 = 1
  • INDICATOR: 2.1%
• What if Jason also identified 2 as “cases” that Cara identified as “non‐cases”?
RE‐ABSTRACTION
RE‐ABSTRACTION
• Purpose
  • To evaluate the accuracy and completeness of the INFORMATION abstracted
• Simply stated
  • Did pertinent information get entered (low missingness), and was it correct (low errors)?
• Process
  • Once identified as a case, the surveillance specialist (SS) abstracts relevant information from medical records
  • If the data exist in the medical record, the SS is responsible for finding it and documenting it correctly (either on a paper form or directly into the database management system)
  • QA specialist performs the same task for a sample of abstracted records
  • IMPORTANT: re-abstraction should be done on the same day, since information to be abstracted may change over time
• Include mix of abstracted records from on‐site review and from records accessed remotely
RE‐ABSTRACTION
• Quality measures/indicators
  • Total number of missing fields across all abstracted records
  • Mean number of missing fields per abstracted record
  • Percentage of abstracted records with any missing information
  • Total number of errors across all abstracted records
  • Mean number of errors per abstracted record
  • Percentage of abstracted records with ≥1 error
• What fields should be selected for the QA calculations (some are more important than others)?
• Frequency
  • 3 per surveillance specialist per quarter (12 per year)
*Missing information refers to information that the QA specialist found in the medical record but that the surveillance specialist left missing.
*Errors refer to information recorded inaccurately relative to what was reported in the medical record (e.g., wrong date of birth). Maximum of one error counted per abstraction field.
RE‐ABSTRACTION (EXAMPLE)
• Cara "re-abstracts" 3 medical records for cases that Jason originally abstracted in a given quarter
  • Record 1:
    • No instances in which Jason failed to document something from the medical record
    • All fields were entered accurately when compared to the medical record
  • Record 2:
    • Jason failed to capture information from the medical record in 4 fields (DOB, address, birth defect diagnosis, and gestational age)
    • One DOB was entered incorrectly (12/31/16 instead of 12/13/16)
  • Record 3:
    • No instances in which Jason failed to document something from the medical record
    • The clinical estimate of gestation was entered incorrectly (24 instead of 42)
• Total number of missing fields across all abstracted records: 4
• Mean number of missing fields per abstracted record: 1.3
• Percentage of abstracted records with any missing information: 33%
• Total number of errors across all abstracted records: 2
• Mean number of errors per abstracted record: 0.67
• Percentage of abstracted records with ≥1 error: 67%
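The six summary metrics above can be sketched from per-record counts of missing fields and errors. This is a hypothetical structure for illustration, not the registry's actual data model:

```python
def reabstraction_metrics(records):
    """Summarize re-abstraction QA results.

    records: list of (n_missing, n_errors) tuples, one per abstracted record,
    as tallied by the QA specialist against the medical record.
    """
    n = len(records)
    total_missing = sum(m for m, _ in records)
    total_errors = sum(e for _, e in records)
    return {
        "total_missing": total_missing,
        "mean_missing": round(total_missing / n, 2),
        "pct_any_missing": round(100 * sum(1 for m, _ in records if m > 0) / n),
        "total_errors": total_errors,
        "mean_errors": round(total_errors / n, 2),
        "pct_any_error": round(100 * sum(1 for _, e in records if e > 0) / n),
    }

# Records 1-3 from the example: (missing, errors) = (0, 0), (4, 1), (0, 1)
print(reabstraction_metrics([(0, 0), (4, 1), (0, 1)]))
```

Running this on the slide's three records reproduces the figures above: 4 missing fields (mean 1.33, 33% of records affected) and 2 errors (mean 0.67, 67% of records affected).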
FIELD REVIEW
FIELD REVIEW
• Purpose
  • To evaluate the accuracy of the most important information being entered into Maven
• Simply stated
  • Are we capturing defect diagnoses, descriptions, laterality, BPA coding, and possible/probable status correctly?
• Process
  • Once identified as a case, the surveillance specialist (SS) abstracts relevant information from medical records
  • If the data exist in the medical record, the SS is responsible for finding it and documenting it correctly (either on a paper form or directly into the database management system)
  • If there is no remote abstraction and entry directly into Maven, the SS also transfers info from paper forms into Maven and assigns records to "FIELD REVIEW PENDING" within 30 days of abstraction
  • QA specialist is automatically assigned records pending "field review" and performs field review within 50 days of assignment
FIELD REVIEW
• Quality measures/indicators
  • Total number of errors across all abstracted records
  • Mean number of errors per abstracted record
  • Percentage of abstracted records with ≥1 error
• Frequency
  • ALL ABSTRACTED RECORDS (CASES) RECEIVE FIELD REVIEW
*Errors refer to information recorded inaccurately relative to what was reported in the medical record (e.g., wrong BPA code). Maximum of one error counted per field.
FIELD REVIEW (EXAMPLE)
• Cara performs "field review" for 20 cases that Jason originally abstracted in a given quarter
  • Records 1 through 18:
    • All birth defect diagnoses, coding, laterality, etc. were entered accurately into Maven
  • Record 19:
    • Error in the BPA code that was assigned to one of the birth defects
  • Record 20:
    • Laterality was not documented for one defect
    • "Small chin" and "micrognathia" were listed as separate defects
    • Location of the spina bifida, which is important for the assigned code, was not included in the defect description
• Total number of errors across all abstracted records: 4
• Mean number of errors per abstracted record: 0.2 (4 errors, 20 records)
• Percentage of abstracted records with ≥1 error: 10% (2/20 records with ≥1 error)
Also very important to take thorough notes on errors made to identify trends, compare to previous errors, and assess the level of re‐training that may be required.
DATA ENTRY REVIEW
DATA ENTRY REVIEW
• Purpose
  • To evaluate the accuracy and completeness of the DATA ENTRY process
• Simply stated
  • As information was transferred from paper-based forms into a database management system, was all information entered (low missingness), and was it correct (low errors)?
• Process
  • When remote access is available for a facility/record, the surveillance specialist (SS) enters abstracted information directly into the database (Maven) – no paper-based forms are used
  • If abstraction occurs on-site at a facility, the SS abstracts onto paper-based forms, and then enters the information into the database (Maven) within 30 days of abstraction
  • QA specialist performs the evaluation by comparing the paper forms to what was entered into Maven
• Data entry review is NOT necessary if ALL abstraction is entered directly into the electronic database
  • RE-ABSTRACTION will capture the accuracy and completeness of the INFORMATION abstracted
DATA ENTRY REVIEW
• Quality measures/indicators• Total number of missing fields across all abstracted records
• Mean number of missing fields per abstracted record
• Percentage of abstraction records with any missing information
• Total number of errors across all abstracted records
• Mean number of errors per abstracted record
• Percentage of abstraction records with ≥1 error
• Frequency• Frequency: 12 per surveillance specialist per quarter (48 per year)
*Missing information refers to information that the QA specialist found on the paper form but that the surveillance specialist left missing
*Errors refer to information recorded inaccurately relative to what was reported on the paper form (e.g., wrong date of birth). Maximum of one error to be counted per abstraction field.
CLINICAL REVIEW, PART I: REGULAR CLINICAL REVIEW
REGULAR CLINICAL REVIEW
• Purpose• To improve the accuracy, completeness, and conciseness of defect coding
• To exclude cases without reportable birth defects
• Process• Following abstraction by surveillance specialist and field review by QA specialist
• ALL of the following are evaluated by a Clinical Reviewer:• >1 regular reportable defect
• Therapeutic abortions, fetal deaths
• Chromosomal diagnosis
• Syndrome diagnosis
• NTDs
• Microcephaly and other defects potentially related to Zika virus
• Cases documented with questions or concerns
• Constitutes approximately 50% of all abstracted cases• Others are liveborn cases with only one regular reportable non‐syndromic defect (non‐NTD, chrom, Zika‐related)
REGULAR CLINICAL REVIEW
• Based on information in the electronic abstraction record
• Review diagnoses for plausibility
• Provide guidance in recording defects accurately, completely, and concisely
• Ensure proper 6‐digit BPA code is assigned to each defect
• Exclude cases without any reportable defect
CLINICAL REVIEW, PART II: DIAGNOSIS CODE REVIEW
DIAGNOSIS CODE REVIEW
• Purpose
  • To evaluate a sample of records not sent for regular review (simple cases)
  • To evaluate the consistency of evaluation across Clinical Reviewers
• Process
  • Remember, 50% of abstracted cases are more complex or have questions
    • Sent for REGULAR CLINICAL REVIEW
  • Remember, 50% of abstracted cases are liveborn cases with only one regular reportable non-syndromic defect that is also non-NTD, non-chromosomal, and non-Zika-related
    • NOT sent for REGULAR CLINICAL REVIEW
  • 10% of ALL records are sent for DIAGNOSIS CODE REVIEW
[Flow diagram] The QA specialist does the initial coding and marks the record complete. Records split 50/50 between NO REGULAR REVIEW and REGULAR REVIEW. Of the records with no regular review, 10% are sent for DX CODE REVIEW, to evaluate the quality of the sample not sent for regular review. Of the records with regular review, 10% are sent for DX CODE REVIEW by a different reviewer, to assess consistency between clinical reviewers if the results of the reviews differ.
DIAGNOSIS CODE REVIEW
• Same process as regular review
• After review, returned to QA specialist
• QA specialist will make necessary corrections
• Inconsistencies between Clinical Reviewers are resolved by meeting and agreeing on a procedure for handling the situation, both now and in the future
QUALITY IMPROVEMENT STRATEGIES: PASSIVE (OR PASSIVE-AGGRESSIVE) CASE FINDING
CASE STUDY: FLORIDA BIRTH DEFECTS REGISTRY
• Program highlights
  • Partnership between the Department of Health and university(ies) – currently USF
  • Started in 1999, statewide since 1998
  • Passive surveillance covers all major structural birth defects (through age 1) among live births
  • Various enhanced/active surveillance projects since 2003
  • Passive surveillance covers over 215,000 resident live births each year
• Casefinding
  • Passive surveillance relies on linkage of administrative databases and ICD codes to capture cases
    • Vital records (birth certificates, infant death certificates)
    • Hospital discharge data (inpatient, outpatient, emergency department)
    • Service-related datasets from Children's Medical Services
  • Enhanced surveillance is similar in protocol to the Texas Registry and includes all pregnancy outcomes
  • Since 2007, has also verified all passively identified cases for selected defects
    • These "enhanced" activities have facilitated evaluations and quality improvement strategies
CASE STUDY: FLORIDA BIRTH DEFECTS REGISTRY
• Staff organization and terminology
  • Senior abstraction coordinator (1) – similar to QA specialist
  • Medical records abstractor (7) – similar to surveillance specialist
• Overview of selected QA/QC/QI activities
  • Evaluation of the data linkage process, if data are being linked to identify birth defects
  • Evaluation of completeness of ascertainment
    • May be related to hospitals that report to the registry (e.g., New York)
    • May use targeted active surveillance projects to assess completeness of administrative datasets used to identify cases
  • Evaluation of accuracy
  • Evaluation of timeliness (vs. completeness)
    • May be related to timeliness of reporting by hospitals
    • May be related to the timeliness vs. completeness balance if relying on local/state datasets to be compiled for data linkage
  • Evaluation of the relative contribution of existing data sources, new data sources
EVALUATION OF DATA LINKAGE
EVALUATION OF DATA LINKAGE
• Purpose
  • To improve the ability to capture cases while minimizing false positives
  • To understand those disproportionately missed due to data linkage limitations
• Process
  • Generate a missingness report on key linkage variables within each database
  • Manually inspect records from lower-confidence linking steps
    • To exclude false-positive links as linking criteria are relaxed; perhaps exclude whole stages
  • Compare linked and unlinked records on the distribution of key variables
  • Compare prevalence estimates generated from linked datasets to rates reported in the literature
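The first step above, a missingness report on key linkage variables, can be sketched as follows. The field names and records are hypothetical; real registries link on many more identifiers:

```python
def missingness_report(records, key_fields):
    """Percent of records missing each linkage field (None or empty string counts as missing)."""
    n = len(records)
    return {f: round(100 * sum(1 for r in records if not r.get(f)) / n, 1)
            for f in key_fields}

# Toy birth records with three hypothetical linkage fields
births = [
    {"ssn": "123", "dob": "2016-12-13", "mother_name": "Smith"},
    {"ssn": None,  "dob": "2016-05-01", "mother_name": "Jones"},
    {"ssn": "456", "dob": None,         "mother_name": ""},
    {"ssn": "789", "dob": "2016-07-21", "mother_name": "Lee"},
]
print(missingness_report(births, ["ssn", "dob", "mother_name"]))
# -> {'ssn': 25.0, 'dob': 25.0, 'mother_name': 25.0}
```

Fields with high missingness in either database cannot anchor a deterministic linkage step, and the subgroups in which they are missing hint at who will be disproportionately missed.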
EVALUATION OF COMPLETENESS
EVALUATION OF COMPLETENESS
• Purpose
  • To understand the extent to which data are all-inclusive and comprehensive
  • To assess whether all of the cases of birth defects that occur within the target population, within a specified time period, are identified by the surveillance system
• Process
  • Capture-recapture analytic methodology
    • Relies on two distinct surveillance systems (e.g., passive vs. active)
    • Link both systems to a common source (e.g., vital record)
  • What % of actual cases of birth defects does the FBDR capture?
  • May only be able to do this for a subset
    • Restricted number of defects
    • Restricted time frame
    • Restricted geographic area, scope of facilities
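The basic two-source capture-recapture calculation is the Lincoln-Petersen estimator: if one system finds n1 cases, the other finds n2, and m cases appear in both, the estimated true case count is n1·n2/m. A minimal sketch (all counts below are invented for illustration):

```python
def lincoln_petersen(n1, n2, m):
    """Estimate the true number of cases from two overlapping surveillance systems.

    n1, n2 -- cases found by each system
    m      -- cases found by BOTH systems (identified by linking to a common source)
    """
    return n1 * n2 / m

# Hypothetical: passive system finds 900 cases, active finds 700, 650 in both
n_passive, n_active, n_both = 900, 700, 650
n_hat = lincoln_petersen(n_passive, n_active, n_both)
completeness_passive = 100 * n_passive / n_hat  # estimated % of true cases captured
print(round(n_hat), round(completeness_passive, 1))  # -> 969 92.9
```

The estimator assumes the two systems capture cases independently; for small samples, the Chapman correction, (n1+1)(n2+1)/(m+1) − 1, is often preferred because it is less biased.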
EVALUATION OF ACCURACY
EVALUATION OF ACCURACY
• Purpose
  • To understand the extent to which data are exact, correct, and valid
  • To assess whether the program is able to provide reliable disease rates and to maintain data comparable to those from other programs
• Process
  • Relies on two distinct surveillance systems (e.g., passive vs. active)
  • One system's diagnoses serve as the gold standard; the accuracy of the other system is assessed
  • Of all of the cases identified by the FBDR with a birth defect (suspected), what % actually have it (truth)?
  • Due to the need for a gold standard (may be resource intensive), may only be able to do this for a subset
    • Restricted number of defects
    • Restricted time frame
    • Restricted geographic area, scope of facilities
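The question posed above is the positive predictive value (PPV) of the passive system. A one-function sketch, with hypothetical counts:

```python
def positive_predictive_value(confirmed_true, flagged_total):
    """PPV as a percent: of all cases the registry flags with a defect,
    the share confirmed true by the gold-standard system."""
    return 100 * confirmed_true / flagged_total

# Hypothetical: 812 of 1,000 FBDR-flagged cases confirmed on gold-standard review
print(round(positive_predictive_value(812, 1000), 1))  # -> 81.2
```

In practice the PPV is computed separately by defect, since ICD-coded administrative data verify far better for some defects than for others.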
EVALUATION OF TIMELINESS: BALANCING WITH COMPLETENESS
BALANCING TIMELINESS WITH COMPLETENESS
• Purpose
  • To understand the extent to which data are rapid, prompt, and responsive
  • To assess the degree to which the program is able to provide timely prevention and intervention services, respond quickly to investigations, and monitor trends
• Process
  • Estimate the current time lag (e.g., 18 months from birth)
  • Assess methods to improve timeliness (e.g., preliminary data release)
  • Would improving timeliness affect completeness, accuracy?
  • Is there an ideal "balance point" between quality metrics?
ASSESSING RELATIVE CONTRIBUTION OF ASCERTAINMENT DATA SOURCES
ASSESSING RELATIVE CONTRIBUTION
• Purpose
  • To understand which data sources are most important to casefinding
  • To assess the potential loss of data sources, or the impact of new data sources
• Process
  • Ask the WHAT IF questions!
  • What if the birth defects registry were constructed WITHOUT…
    • the hospital discharge inpatient data set?
    • the hospital discharge outpatient data set?
    • the CMS Early Steps data set?
    • the CMS RPICC data set?
    • the CMS Minimum Data Set?
    • etc...
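The WHAT-IF analysis above amounts to set arithmetic on case IDs per source: a source's unique contribution is the number of cases found only there. A sketch with toy sources and invented case IDs (the real registry uses the datasets listed above):

```python
# Hypothetical case-ID sets per data source
sources = {
    "inpatient":  {1, 2, 3, 4, 5, 6},
    "outpatient": {4, 5, 6, 7},
    "vital":      {1, 7, 8},
}

all_cases = set().union(*sources.values())
for name in sources:
    # Rebuild the registry WITHOUT this source; the difference is its unique contribution
    without = set().union(*(ids for other, ids in sources.items() if other != name))
    lost = len(all_cases) - len(without)
    print(f"without {name}: lose {lost} case(s) found only there")
```

Here dropping the inpatient file would lose 2 cases, dropping the vital records file 1, and dropping the outpatient file none, because every outpatient case also appears elsewhere; this is the kind of evidence that justifies keeping (or dropping) a source.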
CLOSING SENTIMENTS
TAKE HOME MESSAGES
• Must strive to maximize the quality of surveillance data
• Quality improvement can be achieved regardless of case finding approach
• This was just a cursory review of some strategies to employ
• The NBDPN “Guidelines for Conducting Birth Defects Surveillance” is an excellent resource
• The Surveillance Guidelines and Standards Committee (SGSC) has generated useful publications and annual quality assessment reports
ACKNOWLEDGMENTS (in the highest order!)
• Dan Driggers
• Mark Canfield
• Jean Paul Tanner
• Diana Sampat
• Russ Kirby
• Heather Lake‐Burger
• Marlene Anderka
• Jennifer Isenburg
…and many more!