AHIMA data quality management model


Upload: selinasimpson2301

Post on 16-Jul-2015



In this file, you can find useful information about the AHIMA data quality management model, such as forms, tools, and strategies. If you need more assistance, please leave your comment at the end of the file.

Other useful material for the AHIMA data quality management model:

• qualitymanagement123.com/23-free-ebooks-for-quality-management

• qualitymanagement123.com/185-free-quality-management-forms

• qualitymanagement123.com/free-98-ISO-9001-templates-and-forms

• qualitymanagement123.com/top-84-quality-management-KPIs

• qualitymanagement123.com/top-18-quality-management-job-descriptions

• qualitymanagement123.com/86-quality-management-interview-questions-and-answers

I. Contents of the AHIMA data quality management model

==================

Healthcare leaders face many challenges, from the ICD-10-CM/PCS transition to achieving

meaningful use, launching accountable care organizations (ACOs) and value-based purchasing programs, assuring the sustainability of health information exchanges (HIEs), and maintaining

compliance with multiple health data reporting and clinical documentation requirements—among others. These initiatives impacting the health information management (HIM) and healthcare industries have a common theme: data.

As electronic health record (EHR) systems have become more widely implemented in all healthcare settings, these systems have developed and employed various methods of supporting

documentation for electronic records. Complaints and concerns are often voiced—whether via blogs, online newsletters, or listservs—regarding the integrity, reliability, and compliance capabilities of automated documentation processes. Several articles in the Journal of AHIMA

establish guidelines for preventing fraud in EHR systems. Documentation practices are within the domain of the HIM profession, for both paper and electronic records.1

As a result, the need for more rigorous data quality governance, stewardship, management, and measurement is greater than ever. This practice brief will use the following definitions:

Data Quality Management: The business processes that ensure the integrity of an organization's data during collection, application (including aggregation), warehousing, and analysis.2 While the healthcare industry still has quite a journey ahead in order to reach the

robust goal of national healthcare data standards, the following initiatives are a step in the right direction for data exchange and interoperability:

Continuity of Care Document (CCD)
Clinical Document Architecture (CDA)
Data Elements for Emergency Department Systems (DEEDS)


Uniform Hospital Discharge Data Set (UHDDS)
Minimum Data Set (MDS) for long-term care
ICD-10-CM/PCS
Systematized Nomenclature of Medicine—Clinical Terms (SNOMED CT)
Logical Observation Identifiers Names and Codes (LOINC)
RxNorm

Data Quality Measurement: A quality measure is a mechanism to assign a quantity to quality

of care by comparison to a criterion. Quality measurements typically focus on structures or processes of care that have a demonstrated relationship to positive health outcomes and are under the control of the healthcare system.3 This is evidenced by the many initiatives to capture

quality/performance measurement data, including:

The Joint Commission Core Measures
Outcomes and Assessment Information Set (OASIS) for home health care

National Committee for Quality Assurance's (NCQA) Health Plan Employer Data and Information Set (HEDIS)

Meaningful Use-defined core and menu sets

Key Terms

Understanding the following definitions is key to ensuring clarity in this article.

Data Quality Management: The business processes that ensure the integrity of an

organization's data during collection, application (including aggregation), warehousing, and analysis.

Data Quality Measurement: A quality measure is a mechanism to assign a quantity to quality of care by comparison to a criterion. Quality measurements typically focus on structures or processes of care that have a demonstrated relationship to positive health

outcomes and are under the control of the healthcare system.

These data sets will be used within organizations for continuous quality improvement efforts and

to improve outcomes. They draw on data as raw material for research and comparing providers and institutions with one another. Payment reform and quality measure reporting initiatives increase a healthcare organization's

data needs for determining achievement of program goals, as well as identifying areas in need of improvement. The introduction of new classification and terminology systems—with their

increased specificity and granularity—reinforces the importance of consistency, completeness, and accuracy as key characteristics of data quality.4 The implementation of ICD-10-CM/PCS will impact anyone using diagnosis or inpatient procedure codes, which are pervasive throughout

reimbursement systems, quality reporting, healthcare research and epidemiology, and public health reporting. SNOMED CT, RxNorm, and LOINC terminologies have detailed levels for a

variety of healthcare needs, ranging from laboratory to pharmacy, and require awareness of the underlying quality from the data elements. Healthcare data serves many purposes across many settings, primarily directed towards patient

care. The industry is also moving towards an increased focus on ensuring that collected data is available for many other purposes. The use of new technologies such as telemedicine, remote

monitoring, and mobile devices is also changing the nature of access to care and the manner in which patients and their families interact with caregivers. The rates of EHR adoption and development of HIEs continue to rise, which brings attention to assuring the integrity of the data


regardless of the practice setting, collection method, or system used to capture, store, and transmit data across the continuum of care.

The main outcome of data quality management (DQM) is knowledge regarding the quality of healthcare data and its fitness for applicable use in the intended purposes. DQM functions

involve continuous quality improvement for data quality throughout the enterprise (all healthcare settings) and include data application, collection, analysis, and warehousing. DQM skills and roles are not new to HIM professionals. As use of EHRs becomes widespread, however, data are

shared and repurposed in new and innovative ways, thus making data quality more important than ever.

Data quality begins when EHR applications are planned. For example, data dictionaries for applications should utilize standards for definitions and acceptable values whenever possible. For additional information on this topic, please refer to the practice brief "Managing a Data

Dictionary."5 The quality of collected data can be affected by both the software—in the form of value labels or

other constraints around data entry—and the data entry mechanism, whether automated or manual. Automated data entry originates from various sources, such as clinical lab machines and vital sign tools like blood pressure cuffs. All automated tools must be checked regularly to

ensure appropriate operation. Likewise, any staff entering data manually should be trained to enter the data correctly and monitored for quality assurance.

For example, are measurements recorded in English or metric intervals? Does the organization use military time? What is the process if the system cannot accept what the person believes is the correct information?
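The kinds of entry checks raised by these questions can be sketched in code. The following Python example is a minimal illustration only; the field names, units, and plausible ranges are assumptions, not AHIMA specifications:

```python
# A hypothetical sketch of data entry validation; field names, units,
# and plausible ranges are assumptions for illustration, not AHIMA rules.

def validate_vitals(record):
    """Return a list of data quality problems found in one record."""
    problems = []

    # Temperature: assume the metric convention (Celsius); a value far outside
    # the plausible Celsius range often signals a Fahrenheit entry.
    temp = record.get("temp_c")
    if temp is not None and not (30.0 <= temp <= 45.0):
        problems.append(f"temp_c={temp} outside plausible Celsius range")

    # Observation time: assume military (24-hour) time recorded as "HH:MM".
    obs_time = record.get("obs_time")
    if obs_time is not None:
        try:
            hh, mm = obs_time.split(":")
            if not (0 <= int(hh) <= 23 and 0 <= int(mm) <= 59):
                problems.append(f"obs_time={obs_time!r} is not valid 24-hour time")
        except ValueError:
            problems.append(f"obs_time={obs_time!r} is not in HH:MM format")

    return problems

print(validate_vitals({"temp_c": 98.6, "obs_time": "14:30"}))  # flags the Fahrenheit-style temperature
```

Checks like these catch unit and format confusion at the point of entry, before bad values propagate into analysis and warehousing.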

Meaningful data analysis must be built upon high-quality data. Provided that underlying data is correct, the analysis must use data in the correct context. For example, many organizations do

not collect external cause data if it is not required. Gunshot wounds would require external cause data, whereas slipping on a rug would not. Developing an analysis around external causes and representing it as complete would be misleading in many facilities. Additionally, the copy

capabilities available as a result of electronic health data are likely to proliferate as EHR utilization expands. Readers can refer to AHIMA's Copy Functionality Toolkit for more

information on this topic.6 Finally, with many terabytes of data generated by EHRs, the quality of the data in warehouses will be paramount. The following are just some of the determinations that need to be addressed to ensure a high-quality data warehouse:

Static data (date of birth, once entered correctly, should not change)
Dynamic data (patient temperature should fluctuate throughout the day)
When and how data updates (maintenance scheduling)

Versioning (DRGs and EHR systems change across the years—it is important to know which grouper or EHR version was used)
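The static and dynamic determinations above can be sketched as automated warehouse checks. In this minimal Python illustration, the field classifications are assumptions chosen for the example:

```python
# A hypothetical sketch of automated warehouse checks; the field
# classifications below are assumptions for illustration.
STATIC_FIELDS = {"date_of_birth"}   # once entered correctly, should never change
DYNAMIC_FIELDS = {"temperature"}    # should fluctuate across observations

def audit_history(history):
    """history: successive snapshots (dicts) of one patient's record."""
    findings = []
    for field in STATIC_FIELDS:
        values = {snap[field] for snap in history if field in snap}
        if len(values) > 1:
            findings.append(f"static field {field!r} changed: {sorted(values)}")
    for field in DYNAMIC_FIELDS:
        values = [snap[field] for snap in history if field in snap]
        if len(values) > 1 and len(set(values)) == 1:
            findings.append(f"dynamic field {field!r} never varies (stuck source?)")
    return findings

history = [
    {"date_of_birth": "1980-01-01", "temperature": 37.1},
    {"date_of_birth": "1980-01-10", "temperature": 37.8},
]
print(audit_history(history))  # flags the changed date of birth
```

A change in a static field signals a data integrity problem; a dynamic field that never varies can signal a stuck sensor or a copy-forward error.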

Consequently, the healthcare industry needs data governance programs to help manage the growing amount of electronic data.

Data Governance

Data governance is the high-level, corporate, or enterprise policies and strategies that define the

purpose for collecting data, the ownership of data, and the intended use of data. Accountability and responsibility flow from data governance, and the data governance plan is the framework for

overall organizational approach to data governance.7


Information Governance and Stewardship

Information governance provides a foundation for the other

data-driven functions in AHIMA's HIM Core Model by providing parameters based on organizational and compliance policies, processes, decision rights, and responsibilities. Governance functions and stewardship ensure the use and management of health information is

compliant with jurisdictional law, regulation, standards, and organizational policies. As stewards of health information, HIM professionals strive to protect and ensure the ethical use of health

information.8 The DQM model was developed to illustrate the different data quality challenges. The table "Data Quality Management Model" includes a graphic of the DQM domains as they relate to the

characteristics of data integrity, and "Appendix A" includes examples of each characteristic within each domain. The model is generic and adaptable to any care setting and for any

application. It is a tool or a model for HIM professionals to transition into enterprise-wide DQM roles.

Assessing Data Quality Management Efforts

Traditionally, healthcare data quality practices were coordinated by HIM professionals using paper records and department-based systems. These practices have evolved and now utilize data

elements, electronic searches, comparative and shared databases, data repositories, and continuous quality improvement. As custodians of health records, HIM professionals have historically performed warehousing functions such as purging, indexing, and editing data on all

types of media: paper, images, optical disk, computer disk, microfilm, and CD-ROM. In addition, HIM professionals are experts in collecting and classifying data to support a variety of

needs. Some examples include severity of illness, meaningful use, pay for performance, data mapping, and registries. Further, HIM professionals have encouraged and fostered the use of data by ensuring its timely availability, coordinating its collection, and analyzing and reporting

collected data. To support these efforts, "Checklist to Assess Data Quality Management Efforts" on page 67 outlines basic tenets in data quality management for healthcare professionals to

follow. With AHIMA members fulfilling a wide variety of roles within the healthcare industry, HIM professionals are expanding their responsibilities in data governance and stewardship.

Leadership, management skills, and IT knowledge are all required for effective expansion into these areas.

Roles such as clinical data manager, terminology asset manager, and health data analyst positions will continue to evolve into opportunities for those HIM professionals ready to upgrade their expertise to keep pace with changing practice.

==================

II. Quality management tools

1. Check sheet


The check sheet is a form (document) used to collect data in real time at the location where the data is generated.

The data it captures can be quantitative or qualitative. When the information is quantitative, the check sheet is

sometimes called a tally sheet. The defining characteristic of a check sheet is that data

are recorded by making marks ("checks") on it. A typical check sheet is divided into regions, and marks made in

different regions have different significance. Data are read by observing the location and number of marks on the sheet.

Check sheets typically employ a heading that answers the

Five Ws:

Who filled out the check sheet

What was collected (what each check represents, an identifying batch or lot number)

Where the collection took place (facility, room, apparatus)

When the collection took place (hour, shift, day of the week)

Why the data were collected
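As a minimal sketch, a check sheet can be modeled as a heading answering the Five Ws plus a tally of marks. The defect categories and heading values below are hypothetical examples:

```python
from collections import Counter

# A hypothetical sketch of a check sheet as a data structure: a heading that
# answers the Five Ws, plus a tally of marks; the categories are invented.
check_sheet = {
    "who":   "J. Smith, HIM analyst",
    "what":  "chart deficiency type (one check per occurrence)",
    "where": "HIM department",
    "when":  "Monday, day shift",
    "why":   "weekly deficiency analysis",
    "tally": Counter(),
}

def mark(sheet, category):
    """Record one 'check' in the region for the given category."""
    sheet["tally"][category] += 1

for defect in ["missing signature", "missing signature", "wrong date"]:
    mark(check_sheet, defect)

print(check_sheet["tally"].most_common())  # [('missing signature', 2), ('wrong date', 1)]
```

Reading the tally by category mirrors reading a paper check sheet by the location and number of marks.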

2. Control chart

Control charts, also known as Shewhart charts

(after Walter A. Shewhart) or process-behavior charts, in statistical process control are tools used to determine if a manufacturing or business

process is in a state of statistical control.

If analysis of the control chart indicates that the process is currently under control (i.e., is stable, with variation only coming from sources common

to the process), then no corrections or changes to process control parameters are needed or desired.

In addition, data from the process can be used to predict the future performance of the process. If the chart indicates that the monitored process is

not in control, analysis of the chart can help determine the sources of variation, as this will


result in degraded process performance.[1] A process that is stable but operating outside of

desired (specification) limits (e.g., scrap rates may be in statistical control but above desired

limits) needs to be improved through a deliberate effort to understand the causes of current performance and fundamentally improve the

process.

The control chart is one of the seven basic tools of quality control.[3] Typically, control charts are used for time-series data, though they can also be used for data that have logical comparability (e.g., samples that were all taken at the same time, or the performance of different individuals); however, the type of chart used for this requires consideration.
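The control-limit logic can be sketched as follows. This is a deliberately simplified example that sets the limits at the mean plus or minus three standard deviations; real Shewhart charts usually estimate sigma from subgroups or moving ranges:

```python
import statistics

# A simplified sketch: control limits as the mean +/- three standard deviations.
# (Real Shewhart charts usually estimate sigma from subgroups or moving ranges.)
def control_limits(samples):
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples, lcl, ucl):
    """Return (index, value) pairs falling outside the control limits."""
    return [(i, x) for i, x in enumerate(samples) if not (lcl <= x <= ucl)]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
lcl, ucl = control_limits(baseline)
print(out_of_control(baseline + [12.5], lcl, ucl))  # [(8, 12.5)]
```

A point outside the limits signals special-cause variation worth investigating; points inside the limits reflect the process's common-cause variation.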

3. Pareto chart

A Pareto chart, named after Vilfredo Pareto, is a type

of chart that contains both bars and a line graph, where individual values are represented in descending order by bars, and the cumulative total is represented by the

line.

The left vertical axis is the frequency of occurrence, but it can alternatively represent cost or another important unit of measure. The right vertical axis is

the cumulative percentage of the total number of occurrences, total cost, or total of the particular unit of

measure. Because the reasons are in decreasing order, the cumulative function is a concave function. For example, if the three most frequent reasons accounted for 78% of late arrivals, it would be sufficient to solve those first three issues in order to lower late arrivals by 78%.

The purpose of the Pareto chart is to highlight the most important among a (typically large) set of

factors. In quality control, it often represents the most common sources of defects, the highest occurring type

of defect, or the most frequent reasons for customer complaints, and so on. Wilkinson (2006) devised an


algorithm for producing statistically based acceptance limits (similar to confidence intervals) for each bar in

the Pareto chart.
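The underlying Pareto computation can be sketched in a few lines. The complaint categories and counts below are hypothetical:

```python
# A sketch of the underlying Pareto computation; the complaint categories
# and counts are hypothetical.
def pareto_table(counts):
    """Return (category, count, cumulative %) rows in descending order."""
    ordered = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(counts.values())
    rows, running = [], 0
    for category, n in ordered:
        running += n
        rows.append((category, n, round(100 * running / total, 1)))
    return rows

complaints = {"billing error": 40, "wait time": 25, "lost record": 13,
              "parking": 12, "other": 10}
for row in pareto_table(complaints):
    print(row)
# Here the first three categories cover 78% of all complaints.
```

The descending-count ordering is what makes the cumulative column concave and lets the chart highlight the "vital few" factors.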

4. Scatter plot method

A scatter plot, scatterplot, or scattergraph is a type of

mathematical diagram using Cartesian coordinates to display values for two variables for a set of data.

The data is displayed as a collection of points, each having the value of one variable determining the position

on the horizontal axis and the value of the other variable determining the position on the vertical axis.[2] This kind of plot is also called a scatter chart, scattergram, scatter

diagram,[3] or scatter graph.

A scatter plot is used when a variable exists that is under the control of the experimenter. If a parameter exists that is systematically incremented and/or decremented by the experimenter, it is called the control parameter or independent variable and is customarily plotted along the horizontal

axis. The measured or dependent variable is customarily plotted along the vertical axis. If no dependent variable exists, either type of variable can be plotted on either axis

and a scatter plot will illustrate only the degree of correlation (not causation) between two variables.

A scatter plot can suggest various kinds of correlation between variables with a certain confidence interval. For example, with weight and height, weight would be on the x-axis and height on the y-axis. Correlations may be

positive (rising), negative (falling), or null (uncorrelated). If the pattern of dots slopes from lower left to upper right, it suggests a positive correlation between the variables

being studied. If the pattern of dots slopes from upper left to lower right, it suggests a negative correlation. A line of

best fit (alternatively called 'trendline') can be drawn in order to study the correlation between the variables. An equation for the correlation between the variables can be

determined by established best-fit procedures. For a linear correlation, the best-fit procedure is known as linear


regression and is guaranteed to generate a correct solution in a finite time. No universal best-fit procedure is

guaranteed to generate a correct solution for arbitrary relationships. A scatter plot is also very useful when we

wish to see how two comparable data sets agree with each other. In this case, an identity line, i.e., a y = x line or 1:1 line, is often drawn as a reference. The more the two

data sets agree, the more the scatters tend to concentrate in the vicinity of the identity line; if the two data sets are

numerically identical, the scatters fall on the identity line exactly.
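For a linear correlation, the least-squares best-fit procedure mentioned above can be computed directly from its closed-form formulas, as in this small sketch:

```python
import statistics

# A sketch of the least-squares best-fit line (trendline) for paired data.
def linear_fit(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)                    # spread of x
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # co-variation of x and y
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

# Points lying exactly on y = 2x + 1 recover slope 2 and intercept 1.
print(linear_fit([1, 2, 3, 4], [3, 5, 7, 9]))  # (2.0, 1.0)
```

The sign of the slope corresponds to the rising or falling pattern of dots described above.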

5. Ishikawa diagram

Ishikawa diagrams (also called fishbone diagrams, herringbone diagrams, cause-and-effect diagrams, or

Fishikawa) are causal diagrams created by Kaoru Ishikawa (1968) that show the causes of a specific

event.[1][2] Common uses of the Ishikawa diagram are product design and quality defect prevention, to identify potential factors causing an overall effect. Each cause or

reason for imperfection is a source of variation. Causes are usually grouped into major categories to identify these

sources of variation. The categories typically include

People: Anyone involved with the process

Methods: How the process is performed and the specific requirements for doing it, such as policies,

procedures, rules, regulations and laws

Machines: Any equipment, computers, tools, etc. required to accomplish the job

Materials: Raw materials, parts, pens, paper, etc. used to produce the final product

Measurements: Data generated from the process that are used to evaluate its quality

Environment: The conditions, such as location, time, temperature, and culture in which the process

operates
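As a sketch, an Ishikawa diagram can be represented as causes grouped under these standard categories. The effect and causes below are hypothetical examples, not findings:

```python
# A sketch of an Ishikawa diagram as a data structure: candidate causes grouped
# under the standard major categories. The effect and causes are hypothetical.
effect = "incomplete discharge documentation"
causes = {
    "People":       ["new staff not yet trained on documentation policy"],
    "Methods":      ["no standard template for discharge notes"],
    "Machines":     ["EHR downtime during shift change"],
    "Materials":    ["outdated paper forms still in circulation"],
    "Measurements": ["deficiency counts not tracked by unit"],
    "Environment":  ["high patient volume at end of quarter"],
}

print(f"Effect: {effect}")
for category, items in causes.items():
    for cause in items:
        print(f"  {category} -> {cause}")
```

Grouping candidate causes this way makes the sources of variation explicit before any are investigated or ruled out.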

6. Histogram method


A histogram is a graphical representation of the distribution of data. It is an estimate of the probability

distribution of a continuous variable (quantitative variable) and was first introduced by Karl Pearson.[1] To

construct a histogram, the first step is to "bin" the range of values (that is, divide the entire range of values into a series of small intervals) and then count how many

values fall into each interval. A rectangle is drawn with height proportional to the count and width equal to the bin

size, so that rectangles abut each other. A histogram may also be normalized to display relative frequencies. It then shows the proportion of cases that fall into each of several

categories, with the sum of the heights equaling 1. The bins are usually specified as consecutive, non-overlapping

intervals of a variable. The bins (intervals) must be adjacent and are usually of equal size.[2] The rectangles of a histogram are drawn so that they touch each other to

indicate that the original variable is continuous.[3]
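The binning procedure can be sketched as follows, assuming equal-width adjacent bins over the data range; the sample data are invented:

```python
# A sketch of the binning procedure: equal-width adjacent bins over the data range.
def histogram(values, bins):
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins
    counts = [0] * bins
    for v in values:
        i = min(int((v - lo) / width), bins - 1)  # clamp the maximum into the last bin
        counts[i] += 1
    return counts

data = [1, 2, 2, 3, 5, 6, 7, 7, 8, 9]
counts = histogram(data, 4)
relative = [c / len(data) for c in counts]   # normalized: proportions sum to 1
print(counts)  # [3, 1, 2, 4]
```

Dividing each count by the total yields the normalized form, in which the heights sum to 1.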

III. Other topics related to AHIMA data quality management model (pdf download)

quality management systems

quality management courses

quality management tools

iso 9001 quality management system

quality management process

quality management system example

quality system management

quality management techniques

quality management standards

quality management policy

quality management strategy

quality management books