Hospital Quality Data
HHS Should Specify Steps and Time Frame for Using Information Technology to Collect and Submit Data
GAO ID: GAO-07-320, April 25, 2007
This is the accessible text file for GAO report number GAO-07-320
entitled 'Hospital Quality Data: HHS Should Specify Steps and Time
Frame for Using Information Technology to Collect and Submit Data'
which was released on May 7, 2007.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to the Committee on Finance, U.S. Senate:
United States Government Accountability Office:
GAO:
April 2007:
Hospital Quality Data:
HHS Should Specify Steps and Time Frame for Using Information
Technology to Collect and Submit Data:
GAO-07-320:
GAO Highlights:
Highlights of GAO-07-320, a report to the Committee on Finance, U.S.
Senate
Why GAO Did This Study:
Hospitals submit data in electronic form on a series of quality
measures to the Centers for Medicare & Medicaid Services (CMS) and
receive scores on their performance. Increasingly, the clinical
information from which hospitals derive the quality data for CMS is
stored in information technology (IT) systems.
GAO was asked to examine
(1) hospital processes to collect and submit quality data, (2) the
extent to which IT facilitates hospitals' collection and submission of
quality data, and (3) whether CMS has taken steps to promote the use of
IT systems to facilitate the collection and submission of hospital
quality data. GAO addressed these issues by conducting case studies of
eight hospitals with varying levels of IT development and interviewing
relevant officials at CMS and the Department of Health and Human
Services (HHS).
What GAO Found:
The eight case study hospitals used six steps to collect and submit
quality data: (1) identify the patients, (2) locate information in
their medical records, (3) determine appropriate values for the data
elements, (4) transmit the quality data to CMS, (5) ensure that the
quality data have been accepted by CMS, and (6) supply copies of
selected medical records to CMS to validate the data. Several factors
account for the complexity of abstracting all relevant information in a
patient's medical record, including the content and organization of the
medical record, the scope of information and the clinical judgment
required for the data elements, and frequent changes by CMS in its data
specifications. Due in part to these complexities, most of the case
study hospitals relied on clinical staff to abstract the quality data.
Increases in the number of quality measures required by CMS led to
increased demands on clinical staff resources. Offsetting the demands
placed on clinical staff were the benefits that case study hospitals
reported finding in the quality data, such as providing feedback to
clinicians and reports to hospital administrators.
GAO's case studies showed that existing IT systems can help hospitals
gather some quality data but are far from enabling hospitals to
automate the abstraction process. IT systems helped hospital staff to
abstract information from patients' medical records, in particular by
improving accessibility to and legibility of the medical record. The
limitations reported by officials in the case study hospitals included
having a mix of paper and electronic records, which required staff to
check multiple places to get the needed information; the prevalence of
data recorded as unstructured narrative or text, which made locating
the information time-consuming because it was not in a prescribed place
in the record; and the inability of some IT systems to access related
data stored in another IT system in the same hospital, which required
staff to access each IT system separately to obtain related pieces of
information. Hospital officials expected the scope and functionality of
their IT systems to increase over time, but this process will occur
over a period of years.
CMS has sponsored studies and joined HHS initiatives to examine and
promote the current and potential use of hospital IT systems to
facilitate the collection and submission of quality data, but HHS lacks
detailed plans, including milestones and a time frame against which to
track its progress. CMS has joined efforts by HHS to promote the use of
IT in health care, including a Quality Workgroup charged with
specifying how IT could capture, aggregate, and report inpatient and
outpatient quality data. HHS plans to expand the use of health IT for
quality data collection and submission through contracts with
nongovernmental entities that currently address the use of health IT
for a range of other purposes. However, HHS has identified no detailed
plans, milestones, or time frames for either its broad effort to
encourage IT in health care nationwide or its specific objective to
promote the use of health IT for quality data collection.
What GAO Recommends:
GAO recommends that the Secretary of HHS identify the specific steps
the department plans to take to promote the use of health IT for the
collection and submission of data for CMS's hospital quality measures
and inform interested parties about those steps, the expected time
frame, and associated milestones. In commenting on a draft of this
report on behalf of HHS, CMS concurred with these recommendations.
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-320].
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Cynthia A. Bascetta,
(202) 512-7101 or BascettaC@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Hospitals Use Six Basic Steps to Collect and Submit Quality Data, Two
of Which Involve Complex Abstraction by Hospital Staff:
Existing IT Systems Can Help Hospitals Gather Some Quality Data but Are
Far from Enabling Automated Abstraction:
CMS Sponsored Studies and Joined Broader HHS Initiatives to Promote Use
of IT for Quality Data Collection and Submission, but HHS Lacks
Detailed Plans, Milestones, and Time Frame:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Medicare Quality Measures Required for Full Annual Payment
Update:
Appendix II: Data Elements Used to Calculate Hospital Performance on a
Heart Attack Quality Measure:
Appendix III: Tables on Eight Case Study Hospitals:
Appendix IV: Scope and Methodology:
Appendix V: Comments from the Centers for Medicare & Medicaid Services:
Appendix VI: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Case Study Hospital Characteristics:
Table 2: How Case Study Hospital Officials Described the Steps Taken to
Complete Quality Data Collection and Submission:
Table 3: Resources Used for Abstraction and Data Submission at Eight
Case Study Hospitals:
Table 4: Electronic and Paper Records at Eight Case Study Hospitals:
Figures:
Figure 1: Six Basic Steps for Hospitals Collecting and Submitting
Quality Data:
Figure 2: Example of the Process for Locating and Assessing Clinical
Information to Determine the Appropriate Value for One Data Element:
Figure 3: Data Elements Used to Calculate Hospital Performance on the
Heart Attack Quality Measure That Asks Whether a Beta Blocker Was Given
When the Patient Arrived at the Hospital:
Abbreviations:
ACEI: angiotensin-converting enzyme inhibitor:
AHIC: American Health Information Community:
AHIMA: American Health Information Management Association:
AHRQ: Agency for Healthcare Research and Quality:
Alliance: National Alliance for Health Information Technology:
AMI: acute myocardial infarction:
APU: Annual Payment Update:
ARB: angiotensin receptor blocker:
CART: CMS Abstraction & Reporting Tool:
CCHIT: Certification Commission for Health Information Technology:
CHI: Consolidated Healthcare Informatics:
CMS: Centers for Medicare & Medicaid Services:
CPOE: computerized physician order entry:
DICOM: Digital Imaging and Communications in Medicine:
DRA: Deficit Reduction Act of 2005:
FTE: full-time equivalent:
H&P: history and physical:
HCAHPS: Hospital Consumer Assessment of Healthcare Providers and
Systems:
HHS: Department of Health and Human Services:
HIMSS: Healthcare Information and Management Systems Society:
HITSP: Healthcare Information Technology Standards Panel:
ICD-9: International Classification of Diseases, Ninth Revision:
IFMC: Iowa Foundation for Medical Care:
IT: information technology:
JCAHO: Joint Commission on Accreditation of Healthcare Organizations:
LOINC: Logical Observation Identifiers Names and Codes:
LPN: licensed practical nurse:
LVSD: left ventricular systolic dysfunction:
MAR: medication administration record:
MMA: Medicare Prescription Drug, Improvement, and Modernization Act:
MSA: metropolitan statistical area:
NCPDP: National Council for Prescription Drug Programs:
ONC: Office of the National Coordinator for Health Information
Technology:
POS: provider of services:
QIO: quality improvement organization:
RN: registered nurse:
SNOMED-CT: Systematized Nomenclature of Medicine Clinical Terms:
United States Government Accountability Office:
Washington, DC 20548:
April 25, 2007:
The Honorable Max Baucus:
Chairman:
The Honorable Charles E. Grassley:
Ranking Minority Member:
Committee on Finance:
United States Senate:
The Medicare Prescription Drug, Improvement, and Modernization Act of
2003 (MMA) created a financial incentive for hospitals to submit to the
Centers for Medicare & Medicaid Services (CMS) data that are used to
calculate hospital performance on measures of the quality of care
provided.[Footnote 1] CMS established the Annual Payment Update (APU)
program[Footnote 2] to implement that incentive. The APU program
requires that participating hospitals submit these quality
data[Footnote 3] on a quarterly basis in order to avoid a reduction in
their full Medicare payment update each fiscal year.[Footnote 4]
Although the APU program was originally set to expire in 2007, the
Deficit Reduction Act of 2005[Footnote 5] (DRA) made the APU program
permanent. The act also raised the reduction[Footnote 6] and required
the Secretary of Health and Human Services (HHS) to increase the number
of measures for which hospitals participating in the APU program would
have to provide data in order to receive their full Medicare payment
update.[Footnote 7] CMS plans to continue expanding the number of
required measures in future years.[Footnote 8] Furthermore, DRA
directed the Secretary to develop a plan to implement a value-based
purchasing program for Medicare that beginning in fiscal year 2009
would adjust payments to hospitals based on factors related to the
quality of care they provide. Such pay-for-performance programs are
intended to strengthen the financial incentives for hospitals to invest
in quality improvement efforts.
Each quality measure consists of a set of standardized data elements,
which define the specific data that hospitals need to submit to CMS.
Hospitals determine a value for each data element of a measure for
patients--Medicare and non-Medicare--who have a medical condition
covered by the APU program, that is, heart attack, heart failure,
pneumonia, or surgery. The values for the data elements consist of
numerical data and other administrative and clinical information that
are obtained from the medical records of the patients.[Footnote 9] For
example, there are 8 required quality measures for the heart attack
condition, one of which is whether a beta blocker was given to the
patient upon arrival at the hospital.[Footnote 10] This single measure,
in turn, consists of 11 data elements, including administrative data
elements, such as the patient's date of arrival at the hospital, and
clinical data elements, such as whether the patient received a beta
blocker within 24 hours after hospital arrival (see app. II). The
values entered for data elements are used to calculate hospital
performance on the 21 quality measures that are in effect as of fiscal
year 2007. For a hospital submitting data on all 21 measures, CMS
receives values for a total of 73 unique data elements. For heart
attack measures alone, the 8 measures utilize 35 of the 73 data
elements. (Some data elements are used in more than 1 measure. See app.
I for the number of data elements required for each measure.) Hospitals
submit their quality data electronically, over the Internet, to a
clinical data warehouse operated by a CMS contractor.
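The relationship between measures, data elements, and submitted values can be sketched in simplified form. The following Python fragment is illustrative only; the measure name, element names, and values are hypothetical stand-ins rather than CMS's actual specifications:

    # Illustrative sketch: a quality measure is defined by a set of
    # standardized data elements whose values are abstracted from the
    # medical record. Names and values below are hypothetical.
    beta_blocker_on_arrival = {
        "condition": "heart attack",
        "measure": "beta blocker given when the patient arrived at the hospital",
        "data_elements": [
            "arrival date",                  # administrative element
            "beta blocker within 24 hours",  # clinical element
            # ... 11 data elements in total for this measure
        ],
    }

    # For each eligible patient, the hospital submits one value per data
    # element; CMS calculates performance from the submitted values.
    patient_values = {
        "arrival date": "2007-01-30",
        "beta blocker within 24 hours": "yes",
    }

    # Because measures share data elements, the 21 measures in effect for
    # fiscal year 2007 draw on 73 unique data elements in total.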
Increasingly, the information in patients' medical records that
provides the basis for hospital quality data submissions may be stored
and accessed in electronic form in information technology (IT) systems.
Currently, many hospitals record and store such clinical information on
patients in a combination of paper and electronic systems. Over time,
hospitals have added new health IT systems to expand the amount of
information that is stored electronically. In 2005, the Secretary of
HHS established the American Health Information Community (AHIC) to
advance the adoption of electronic health records, after the President
called in 2004 for the widespread adoption of interoperable electronic
health records within 10 years and appointed a National Coordinator for
Health Information Technology to promote that goal. On August 12, 2005,
CMS issued a regulation for the APU program that stated a goal of
facilitating the use of health IT by hospitals to make it easier for
them to collect the quality data from the medical record and submit
them to CMS.[Footnote 11] In the preamble to the regulation, CMS said
that it intended to begin working toward modifying its requirements and
mechanisms for accepting quality data to allow hospitals to transfer
their data directly from hospital IT systems without having to first
transfer the data into specially formatted files as is currently
required.
Because the vast majority of acute care hospitals treating Medicare
patients choose to submit quality data each quarter to CMS, rather than
accept a reduced annual payment update, you asked us to examine (1) how
hospitals collect and submit quality data for the Medicare hospital
quality measures, (2) the extent to which IT facilitates hospitals'
collection and submission of quality data for the Medicare hospital
quality measures, and (3) whether CMS has taken steps to promote the
development and use of IT systems that could facilitate the collection
and submission of hospital quality data.
To assess how hospitals collect and submit quality data, we conducted
case studies of eight individual acute care hospitals to obtain
information about the processes they used to collect and submit the
data.[Footnote 12] The hospitals varied on a number of standard
hospital characteristics, including size, urban/rural location, and
teaching status (see app. III, table 1). We visited each case study
hospital, and we interviewed the individuals responsible for collecting
and submitting the quality data to CMS, managers of the hospital's
quality department, and hospital administrators. To assess the extent
to which IT facilitates hospitals' collection and submission of quality
data, we selected the case study hospitals to include both hospitals
with relatively well-developed IT systems that supported electronic
patient records and hospitals with less-developed levels of IT, based
on screening interviews done at the time we selected the case study
hospitals.[Footnote 13] During our site visits, we also interviewed IT
staff involved in the process of collecting and submitting the quality
data. To assess whether CMS has taken steps to promote the development
and use of IT systems that could facilitate the collection and
submission of hospital quality data, we reviewed relevant federal
regulations, reports, and related documents and interviewed CMS
officials and CMS contractors, as well as officials in HHS's Office of
the National Coordinator for Health Information Technology (ONC).
Because our evidence is limited to the eight case studies, it does not
offer a basis for relating any differences we observed among these
particular hospitals to their differences on specific dimensions, such
as size or teaching status. Nor can we generalize from the group of
eight as a whole to acute care hospitals across the country. Where
appropriate, we obtained relevant information about these hospitals
from CMS documents and databases; however, most of our information for
these case studies was reported by hospital officials.[Footnote 14]
Furthermore, although we examined the processes hospitals used to
collect and submit quality data and the role that IT plays in that
process, we did not examine general IT adoption in the hospital
industry. We conducted our work from February 2006 to April 2007 in
accordance with generally accepted government auditing standards. For a
complete description of our methodology, see appendix IV.
Results in Brief:
The case study hospitals we visited used six steps to collect and
submit quality data, two of which (steps 2 and 3) involved complex
abstraction--the process of reviewing and assessing all relevant pieces
of information in a patient's medical record to determine the
appropriate value for each data element. Whether that patient
information was recorded electronically, on paper, or as a mix of both,
the six steps were (1) identify the patients, (2) locate information in
their medical records, (3) determine appropriate values for the data
elements, (4) transmit the quality data to CMS, (5) ensure that the
quality data have been accepted by CMS, and (6) supply copies of
selected medical records to CMS to validate the data. Several factors
account for the complexity of the abstraction process (steps 2 and 3),
including the content and organization of the medical record, the scope
of information and clinical judgment required for the data elements,
and frequent changes by CMS in its data specifications. Due in part to
these complexities, most of our case study hospitals relied on clinical
staff to abstract the quality data. Increases in the number of quality
measures required by CMS led to increased demands on clinical staff
resources. Offsetting the demands placed on clinical staff were the
benefits that case study hospitals reported finding in the quality
data. For example, all the hospitals reported having a process in place
to track changes in their performance over time and provide feedback to
clinicians and reports to hospital administrators and trustees.
Our case studies showed that existing IT systems can help hospitals
gather some quality data but are far from enabling hospitals to
automate the abstraction process. IT systems helped hospital staff
abstract information from patients' medical records, in particular by
improving accessibility to and legibility of the medical record and by
enabling hospitals to incorporate CMS's required data elements into the
medical record. The limitations reported by officials in the case study
hospitals included having a mix of paper and electronic records, which
required staff to check multiple places to get the needed information;
the prevalence of data recorded as unstructured narrative or text,
which made locating the information time-consuming because it was not
in a prescribed place in the record; and the inability of some IT
systems to access related data stored in another IT system in the same
hospital, which required hospital staff to access each IT system
separately to obtain related pieces of information. While hospital
officials expected the scope and functionality of their IT systems to
increase over time, they projected that this process would occur
incrementally over a period of years.
CMS has sponsored studies and joined HHS initiatives to examine and
promote the current and potential use of hospital IT systems to
facilitate the collection and submission of quality data, but HHS lacks
detailed plans, including milestones and a time frame against which to
track its progress. CMS sponsored two studies that examined the use of
hospital IT systems for quality data collection and submission.
Promoting the use of health IT for quality data collection is also 1 of
14 objectives that HHS has identified in its broader effort to
encourage the development and nationwide implementation of
interoperable IT in health care. CMS has joined this broader effort by
HHS, as well as the Quality Workgroup that AHIC created in August 2006
to specify how IT could capture, aggregate, and report inpatient and
outpatient quality data. Through its representation in AHIC and the
Quality Workgroup, CMS has participated in decisions about the specific
focus areas to be examined through contracts with nongovernmental
entities. These contracts currently address the use of health IT for a
range of purposes, which may also include quality data collection and
submission in the near future. However, HHS has identified no detailed
plans, milestones, or time frames for either its broad effort to
encourage IT in health care nationwide or its specific objective to
promote the use of health IT for quality data collection.
To support the expansion of quality measures for the APU program, we
recommend that the Secretary of HHS identify the specific steps that
the department plans to take to promote the use of health IT for the
collection and submission of data for CMS's hospital quality measures
and inform interested parties on those steps and the expected time
frame, including milestones for completing them. In commenting on a
draft of this report on behalf of HHS, CMS expressed its appreciation
of our thorough analysis of the processes that hospitals use to report
quality data and the role that IT systems can play in that reporting,
and it concurred with our two recommendations.
Background:
The quality data submitted by hospitals are collected from the medical
records of patients admitted to the hospital. Hospital patient medical
records contain many different types of information, which are
organized into different sections. Frequently found examples of these
sections include:
* the face sheet, which summarizes basic demographic and billing data,
including diagnostic codes;
* history and physicals (H&P), which record both patient medical
history and physician assessments;
* physician orders, which show what medications, tests, and procedures
were ordered by a physician;
* medication administration records (MAR), which show that a specific
medication was given to a patient, when it was given, and the dosage;
* laboratory reports, radiology reports, and test results, such as an
echocardiogram reading;
* progress notes, in which physicians, nurses, and other clinicians
record information chronologically on patient status and response to
treatments during the patient's hospital stay;
* operative reports for surgery patients;
* physician and nursing notes for patients treated in the emergency
department; and:
* discharge summaries, in which a physician summarizes the patient's
hospital stay and records prescriptions and instructions to be given to
the patient at discharge.
Hospitals have discretion to determine the structure of their patient
medical records, as well as to set general policies stating what,
where, and how specific information should be recorded by clinicians.
To guide the hospital staff in the abstraction process--that is, in
finding and properly assessing the information in the patient's medical
record needed to fill in the values for the data elements--CMS and the
Joint Commission[Footnote 15] have jointly issued a Specifications
Manual.[Footnote 16] It contains detailed specifications that define
the data elements for which the hospital staff need to collect
information and determine values and the correct interpretation of
those data elements. The Joint Commission also requires hospitals to
submit the same data that they submit to CMS for the APU program (and
some additional data) to receive Joint Commission accreditation.
In many hospitals, information in a patient's medical record is
recorded and stored in a combination of paper and electronic systems.
Patient medical records that clinicians record on paper may be stored
in a folder in the hospital's medical record department and contain all
the different forms, reports, and notes prepared by different
individuals or by different departments during the patient's stay.
Depending on the length of the patient's hospital stay and the
complexity of the care, an individual patient medical record can amount
to hundreds of pages.[Footnote 17] For information stored
electronically, clinicians may enter information directly into the
electronic record themselves, as they do for paper records, or they may
dictate their notes to be transcribed and added to the electronic
record later. Information may also be recorded on paper and then
scanned into the patient's electronic record. For example, if a patient
is transferred from another hospital, the paper documents from the
transferring hospital may be scanned into the patient's electronic
record.
The patient medical information that hospitals store electronically,
rather than on paper, typically resides in multiple health IT systems.
One set of IT systems usually handles administrative tasks such as
patient registration and billing. Hospitals acquire other IT systems to
record laboratory test results, to store digital radiological images,
to process physician orders for medications, and to record notes
written by physicians and nurses. Hospitals frequently build their
health IT capabilities incrementally by adding new health IT systems
over time.[Footnote 18] If the systems that hospitals purchase come
from different companies, they are likely to be based on varying
standards for how the information is stored and exchanged
electronically. As a result, even in a single hospital, it can be
difficult to access from one IT system clinical data stored in a
different health IT system.
One of the main objectives of ONC is to overcome the problem of
multiple health IT systems, within and across health care providers,
that store and exchange information according to varying standards. The
mission of ONC is to promote the development and nationwide
implementation of interoperable health IT in both the public and the
private sectors in order to reduce medical errors, improve quality of
care, and enhance the efficiency of health care.[Footnote 19] Health IT
is interoperable when systems are able to exchange data accurately,
effectively, securely, and consistently with different IT systems,
software applications, and networks in such a way that the clinical or
operational purposes and meaning of the data are preserved and
unaltered.
Hospitals Use Six Basic Steps to Collect and Submit Quality Data, Two
of Which Involve Complex Abstraction by Hospital Staff:
The case study hospitals we visited used six steps to collect and
submit quality data, two of which involved complex abstraction--the
process of reviewing and assessing all relevant pieces of information
in a patient's medical record to determine the appropriate value for
each data element. Factors accounting for the complexity of the
abstraction process included the content and organization of the
medical record, the scope of information required for the data
elements, and frequent changes by CMS in its data specifications. Due
in part to these complexities, most of our case study hospitals relied
on clinical staff to abstract the quality data. Increases in the number
of required quality measures led to increased demands on clinical staff
resources. However, all case study hospitals reported finding benefits
in the quality data that helped to offset the demands placed on
clinical staff.
Hospitals Collect and Submit Quality Data by Completing Six Basic
Steps:
We found that whether patient information was recorded electronically,
on paper, or as a mix of both, all the case study hospitals collected
and submitted their quality data by carrying out six sequential steps
(see fig. 1). These steps started with identifying the patients for
whom the hospitals needed to provide quality data to CMS and continued
through the process of examining each patient's medical record, one
after the other, to find the information needed to determine the
appropriate values for each of the required data elements for that
patient. Then, for each patient, those values were entered by computer
into an electronic form or template listing each of the data elements
for that condition. These forms were provided by the data vendor with
which the hospital had contracted to transmit its quality data to CMS.
The vendors also assisted the hospitals in checking that the data were
successfully received by CMS. Finally, the hospitals sent copies of the
medical records of a selected sample of patients to a CMS contractor
that used those records to validate the accuracy of the quality data
submitted by the hospital.
Figure 1: Six Basic Steps for Hospitals Collecting and Submitting
Quality Data:
[See PDF for image]
Source: GAO.
Note: Patient information may be obtained from either electronic or
paper records.
[End of figure]
Specifically, the six steps, which are summarized for each case study
hospital in appendix III, table 2, were as follows:
Step 1: Identify patients--The first step was to identify the patients
for whom the hospitals needed to submit quality data to CMS. Staff at
three case study hospitals identified these patients using information
on the patient's principal diagnosis, or principal procedure in the
case of surgery patients, obtained from the hospital's billing
data.[Footnote 20] Five case study hospitals had their data vendor use
the hospital's billing data to identify the eligible patients for them.
Every month, all eight hospitals that we visited identified patients
discharged in the prior month for whom quality data should be
collected. The hospitals identified all patients retrospectively for
quality data collection because hospitals have to wait until a patient
is discharged to determine the principal diagnosis.[Footnote 21]
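In rough terms, this retrospective selection operates as a monthly filter over the prior month's discharges, keyed on the principal diagnosis (or principal procedure for surgery patients). The Python sketch below is illustrative only; the field names and the ICD-9 code list are hypothetical placeholders, not CMS's actual selection rules:

    # Illustrative sketch of step 1: flag last month's discharges whose
    # principal diagnosis falls in a covered condition. The field names
    # and example ICD-9 codes are hypothetical.
    HEART_ATTACK_CODES = {"410.01", "410.11", "410.71"}  # example codes only

    def select_patients(billing_records, month):
        """Return patient IDs discharged in the given month who need quality data."""
        selected = []
        for record in billing_records:
            if record["discharge_month"] != month:
                continue
            if record["principal_diagnosis"] in HEART_ATTACK_CODES:
                selected.append(record["patient_id"])
        return selected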
CMS permits hospitals to reduce their data collection effort by
providing quality data for a representative sample of patients when the
total number of patients treated for a particular condition exceeds a
certain threshold.[Footnote 22] Five case study hospitals drew samples
for at least one condition. The data vendor performed this task for
four of those case study hospitals, and assisted the hospital in
performing this task for the fifth hospital.
Only one of the case study hospitals reported using nonbilling data
sources to check the accuracy of the lists of patients selected for
quality data collection that the hospitals drew from their billing data
(see app. III, table 3). Several stated that they occasionally noted
discrepancies, such as patients selected for heart attack measures who,
upon review of their medical record, should not have had that as their
principal diagnosis. However, the hospital officials we interviewed
told us that discrepancies of this sort were likely to be minor.
Officials at three hospitals noted that hospitals generally have
routine periodic audits of the coding practices of their
medical records departments, which would include the accuracy of the
principal diagnoses and procedures.
Step 2: Locate information in the medical record--Steps 2 and 3 were in
practice closely linked in our case study hospitals.
Abstractors[Footnote 23] at the eight case study hospitals examined
each selected patient's medical record, looking for all of the discrete
pieces of information that, taken together, would determine what they
would decide--in step 3--was the correct value for each of the data
elements. For some data elements, there was a one-to-one correspondence
between the piece of information in the medical record and the value to
be entered. Typical examples included a patient's date of birth and the
name of a medication administered to the patient. For other data
elements, the abstractors had to check for the presence or absence of
multiple pieces of information in different parts of the medical record
to determine the correct value for that data element. For example, to
determine if the patient did, or did not, have a contraindication for
aspirin, abstractors looked in different parts of the medical record
for potential contraindications, such as the presence of internal
bleeding, allergies, or prescriptions for certain other medications
such as Coumadin.[Footnote 24]
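As a rough illustration of this kind of multi-part check, the following sketch scans several sections of a hypothetical electronic record for any documented reason not to give aspirin. The section names and contraindication terms are placeholders; the governing rules are those in the Specifications Manual:

    # Illustrative sketch: look across several sections of the record for
    # a potential aspirin contraindication. Section names and terms are
    # hypothetical simplifications of the Specifications Manual rules.
    SECTIONS_TO_CHECK = ["history and physical", "progress notes",
                         "medication administration record", "discharge summary"]
    CONTRAINDICATION_TERMS = ["internal bleeding", "aspirin allergy", "coumadin"]

    def aspirin_contraindicated(medical_record):
        """medical_record: dict mapping section name to that section's text."""
        for section in SECTIONS_TO_CHECK:
            text = medical_record.get(section, "").lower()
            if any(term in text for term in CONTRAINDICATION_TERMS):
                return True
        return False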
In order for abstractors to find information in the patient's medical
record, it had to be recorded properly by the clinicians providing the
patient's care. Officials at all eight case study hospitals described
efforts designed to educate physicians and nurses about the specific
data elements for which they needed to provide information in each
patient's medical record. The hospital officials were particularly
concerned that the clinicians not undermine the hospital's performance
on the quality measures by inadequately documenting what they had done
and the reasons why. For example, one heart failure measure tracks
whether a patient received each of six specific instructions at the
time of discharge, but unless information was explicitly recorded in a
heart failure patient's medical record for each of the six data
elements, that patient was counted by CMS as one who had not received
all pertinent discharge instructions and therefore did not meet that
quality measure.[Footnote 25] This particular measure was cited by
officials at several hospitals as one that required a higher level of
documentation than had previously been the norm at their hospital.
Step 3: Determine appropriate data element values--Once abstractors had
located all the relevant pieces of information pertaining to a given
data element, they had to put those pieces together to arrive at the
appropriate value for the data element. The relevance of that
information was defined by the detailed instructions provided by the
hospitals' vendors, as well as the Specifications Manual jointly issued
by CMS and the Joint Commission that serves as the basis for the vendor
instructions. The Specifications Manual sets out the decision rules for
choosing among the allowable values for each data element. It also
identifies which parts of the patient's medical record may or may not
provide the required information, and often lists specific terms or
descriptions that, if recorded in the patient's medical record, would
indicate the appropriate value for a given data element. In addition,
the Specifications Manual provides abstractors with guidance on how to
interpret conflicting information in the medical record, such as a note
from one clinician that the patient is not a smoker and a note
elsewhere in the record from another clinician that the patient does
smoke. To help keep track of multiple pieces of information, many
abstractors reported that they first filled in the data element values
on a paper copy of the abstraction form provided by the data vendor. In
this way, they could write notes in the margin to document how they
came to their conclusions.
Step 4: Transmit data to CMS--In order for the quality data to be
accepted by the clinical data warehouse, they must pass a battery of
edit checks that look for missing, invalid, or improperly formatted
data element entries.[Footnote 26] All the case study hospitals
contracted with data vendors to submit their quality data to CMS. They
did so, in part, because all of the hospitals submitted the same data
to the Joint Commission, and it requires hospitals to submit their
quality data through data vendors that meet the Joint Commission's
requirements. The additional cost to the hospitals to have the data
vendors also submit their quality data to CMS was generally minimal
(see app. III, table 3).
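The kind of screening applied by the clinical data warehouse can be thought of as per-element edit checks. The sketch below is illustrative only; the required elements, allowed values, and date format are hypothetical stand-ins for CMS's actual edits:

    # Illustrative sketch of edit checks: flag missing, invalid, or
    # improperly formatted data element entries. The rules shown are
    # hypothetical examples, not CMS's actual specifications.
    import re

    ALLOWED_VALUES = {"beta blocker within 24 hours": {"yes", "no"}}
    DATE_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # example format only

    def edit_check(submission):
        """Return a list of problems found in one patient's submitted values."""
        problems = []
        for element, allowed in ALLOWED_VALUES.items():
            value = submission.get(element)
            if value is None:
                problems.append("missing value for " + element)
            elif value not in allowed:
                problems.append("invalid value for " + element)
        if not DATE_PATTERN.match(submission.get("arrival date", "")):
            problems.append("improperly formatted arrival date")
        return problems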
All of the case study hospitals submitted their data to the data vendor
by filling in values for the required data elements on an electronic
version of the vendor's abstraction form.[Footnote 27] Many abstractors
did this for a batch of patient records at a time, working from paper
copies of the form that they had filled in previously. Some abstractors
entered the data online at the same time that they reviewed the
patient's medical records. In other cases, someone other than the
abstractor who filled in the paper form used the completed form to
enter the data on a computer.
Step 5: Ensure data have been accepted by CMS--The case study hospitals
varied in the extent to which they actively monitored the acceptance of
their quality data into CMS's clinical data warehouse. After the data
vendors submitted the quality data electronically, they and the
hospitals could download reports from the clinical data warehouse
indicating whether the submitted data had passed the screening edits
for proper formatting and valid entries. The hospitals could use these
reports to detect data entry errors and make corrections prior to CMS's
data submission deadline. Three case study hospitals shared this task
with their data vendors, three hospitals left it for their data vendors
to handle, and two hospitals received and responded to reports on data
edit checks produced by their data vendors, rather than reviewing the
CMS reports. Approximately 2 months after hospitals submitted their
quality data, CMS released reports to the hospitals showing their
performance scores on the quality measures before posting the results
on its public Web site.
Step 6: Supply copies of selected medical records--CMS has put in place
a data validation process to ensure the accuracy of hospital quality
data submissions. It requires hospitals to supply a CMS contractor with
paper copies of the complete medical record for five patients selected
by CMS each quarter.[Footnote 28] Officials at five hospitals noted
that they check to make sure that all parts of the medical records that
they used to abstract the data originally are included in the package
shipped to the CMS contractor. Most of the case study hospitals relied
on CMS's data validation to ensure the accuracy of their abstractions.
However, two hospitals reported that they also routinely draw their own
sample of cases, which are abstracted a second time by a different
abstractor in the hospital, followed by a comparison of the two sets of
results (see app. III, table 3).
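Such an internal double-abstraction check amounts to comparing two independently abstracted sets of values for the same patient records. The minimal sketch below uses hypothetical data structures:

    # Illustrative sketch: compare a second abstractor's values against
    # the first abstractor's values, element by element.
    def agreement_rate(first, second):
        """first, second: dicts mapping data element name to abstracted value."""
        elements = set(first) | set(second)
        if not elements:
            return 1.0
        matches = sum(1 for e in elements if first.get(e) == second.get(e))
        return matches / len(elements)

    # Example: if two abstractions of a heart attack record agree on 10 of
    # 11 elements, the agreement rate is about 0.91.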
Two Most Complex Steps Were Locating Relevant Clinical Information and
Determining Appropriate Values for Data Elements:
The description by hospital officials of the processes they used to
collect and submit quality data indicated that locating the relevant
clinical information and determining appropriate values for the data
elements (steps 2 and 3) were the most complex steps of the six
identified, due to several factors. These included the content and
organization of the medical record, the scope of the information
encompassed by the data elements, and frequent changes in data
specifications.
The first complicating factor related to the medical record was that
the information abstractors needed to determine the correct data
element values for a given patient was generally located in many
different sections of the patient's medical record. These included
documents completed for admission to the hospital, emergency department
documents, laboratory and test results, operating room notes,
medication administration records, nursing notes, and physician-
generated documents such as history and physicals, progress notes and
consults, orders for medications and tests, and discharge summaries. In
addition, the abstractors may have had to look at documents that came
from other providers if the patient was transferred to the hospital.
Much of the clinical information needed was found in the sections of
the medical record prepared by clinicians. Often the information in
question, such as contraindications for aspirin or beta blockers, could
be found in any of a number of places in the medical record where
clinicians made entries. As a result, abstractors frequently had to
read through multiple parts of the record to find the information
needed to determine the correct value for just one data element. At two
case study hospitals, abstractors said that they routinely read each
patient's entire medical record.
Experienced abstractors often knew where they were most likely to find
particular pieces of information. They nevertheless also had to check
for potentially contradictory information in different parts of the
medical record. For example, as noted, patients may have provided
varying responses about their smoking history to different clinicians.
If any of these responses indicated that the patient had smoked
cigarettes in the last 12 months, the patient was considered to be a
smoker according to CMS's data specifications. Another example concerns
the possibility that a heart attack or heart failure patient may have
had multiple echocardiogram results recorded in different parts of the
medical record. Abstractors needed to find all such results in order to
apply the rules stated in the Specifications Manual for identifying
which result to use in deciding whether the patient had left
ventricular systolic dysfunction (LVSD). This data element is used for
the quality measure assessing whether an angiotensin-converting enzyme
inhibitor (ACEI) or angiotensin receptor blocker (ARB) was prescribed
for LVSD at discharge.[Footnote 29]
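The decision rule for conflicting smoking-history entries can be stated compactly in code. The sketch below is illustrative and reduces the specification to the single rule described above (any documented smoking within the last 12 months counts the patient as a smoker):

    # Illustrative sketch: resolve conflicting smoking-history notes. If
    # any clinician's entry indicates smoking within the last 12 months,
    # the patient is counted as a smoker.
    def is_smoker(smoked_in_last_12_months_entries):
        """Input: list of booleans, one per relevant entry found in the record."""
        return any(smoked_in_last_12_months_entries)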
The second factor was related to the scope of the information required
for certain data elements. Some of the data elements that the
abstractors had to fill in represented a composite of related data and
clinical judgment applied by the abstractor, not just a single discrete
piece of information. Such composite data elements typically were
governed by complicated rules for determining the clinical
appropriateness of a specific treatment for a given patient. For
example, the data element for contraindications for both ACEIs and ARBs
at discharge requires abstractors to check for the presence and assess
the severity of any of a range of clinical conditions that would make
the use of either ACEIs or ARBs inappropriate for that
patient.[Footnote 30] (See fig. 2.) These conditions may appear at any
time during the patient's hospital stay and so could appear at any of
several places in the medical record. Abstractors must also look for
evidence in the record from a physician[Footnote 31] linking a decision
not to prescribe these drugs to one or more of those conditions.
Figure 2: Example of the Process for Locating and Assessing Clinical
Information to Determine the Appropriate Value for One Data Element:
[See PDF for image]
Source: GAO, CMS.
Note: In this illustrative case, adapted from CMS training materials,
an abstractor would find that the patient was given an ACEI, Zestril,
in the emergency department (see MAR,1/30), but because of its apparent
effect on the patient's pulse and blood pressure (see Progress Notes,
01/31), it was not continued during the hospital stay (see Progress
Notes, 02/03) and no ACEI was prescribed at discharge (see Discharge
Summary). However, there is no mention in the patient's record of ARBs
or aortic stenosis. The arrows point to some of the key pieces of
information an abstractor would take note of in determining that the
appropriate value for this data element was "N" for "no."
[End of figure]
The third factor was the need for abstractors at the case study
hospitals to adjust to frequent changes in the data specifications set
by CMS. Since CMS first released its detailed data specifications
jointly with the Joint Commission in September 2004, it has issued
seven new versions of the Specifications Manual.[Footnote 32]
Therefore, from fall 2004 through summer 2006, roughly every 3 months
hospital abstractors had to stop, take note of what had changed in the
data specifications, and revamp their quality data collection
procedures accordingly. Some of these changes reflected
modifications in the quality measures themselves, such as the addition
of ARBs for treatment of LVSD. Other changes revised or expanded the
guidance provided to abstractors, often in response to questions
submitted by hospitals to CMS. CMS recently changed its schedule for
issuing revisions to its data specifications from every 3 months to
every 6 months, but that change had not yet affected the interval
between new revisions issued to hospitals at the time of our case study
site visits.
Clinical Staff Abstract Quality Data at Most Hospitals:
Case study hospitals typically used registered nurses (RN), often
exclusively, to abstract quality data for the CMS quality measures (see
app. III, table 3). One hospital relied on a highly experienced
licensed practical nurse, and two case study hospitals used a mix of
RNs and nonclinical staff. Officials at one hospital noted that RNs
were familiar with both the nomenclature and the structure of the
hospital's medical records and could more readily discuss
documentation issues with the physicians and nurses providing the
care. Even when using RNs, all but three of the case study hospitals
had each abstractor focus on one or two medical conditions in which
they had expertise.
Four hospitals had tried using nonclinical staff, most often trained as
medical record coders, to abstract the quality data. Officials at one
of these hospitals reported that this approach posed challenges. They
said that it was difficult for nonclinical staff to learn all that they
needed to know to abstract quality data effectively, especially with
the constant changes being made to the data specifications. At the
second hospital, officials reported that using nonclinical staff for
abstraction did not work at all and they switched to using clinically
trained staff. At the third hospital, the chief clinician leading the
quality team stated that the hospital's nonclinical abstractors worked
well enough when clinically trained colleagues were available to answer
their questions. Officials at the fourth hospital cited no concerns
about using staff who were not RNs to abstract quality data, but they
subsequently hired an RN to abstract patient records for two of the
four conditions.
Case study hospitals drew on a mix of existing and new staff resources
to handle the collection and submission of quality data to CMS. In two
hospitals, new staff had been hired specifically to collect quality
data for the Joint Commission and CMS. In other hospitals, quality data
collection was assigned to staff already employed in the hospital's
quality management department or performing other functions.
Adding Quality Measures Required a Proportionate Increase in Staff
Resources:
All the case study hospitals found that, over time, they had to
increase the amount of staff resources devoted to abstracting quality
data for the CMS quality measures, most notably as the number of
measures on which they were submitting data expanded. Officials at the
case study hospitals generally reported that the amount of staff time
required for abstraction increased proportionately with the number of
conditions for which they reported quality data. Most recently, the
hospitals had all begun to report on the surgical quality measures. They
found that the staff hours needed for this new set of quality measures
were directly related to the number of patient records to be abstracted
and the number of data elements collected. In other words, they found
no "economies of scale" as they expanded the scope of quality data
abstraction. At the time of our site visits, four hospitals continued
to draw on existing staff resources, while others had hired additional
staff. Hospital officials estimated that the amount of staff resources
devoted to abstracting data for the CMS quality measures ranged from
0.7 to 2.5 full-time equivalents (FTE) (app. III, table 3).[Footnote
33]
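Because the hospitals reported no economies of scale, abstraction effort can be approximated as a simple product of patient volume and the number of data elements per record. The figures in this sketch are hypothetical and serve only to illustrate the proportional relationship the hospital officials described:

    # Illustrative back-of-the-envelope estimate: abstraction hours scale
    # with records abstracted and data elements per record. All numbers
    # are hypothetical, not drawn from the case study hospitals.
    def estimated_hours(records_per_quarter, elements_per_record, minutes_per_element=1.0):
        return records_per_quarter * elements_per_record * minutes_per_element / 60.0

    # Doubling the number of reported conditions (and thus records and
    # elements) roughly doubles the estimated hours.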
Hospitals Value and Use Quality Data:
Hospital officials reported that the demands that quality data
collection and submission placed on their clinical staff resources were
offset by the benefits that they derived from the resulting information
on their clinical performance. Each one had a process for tracking
changes in their performance over time. Based on those results, they
provided feedback to individual clinicians and reports to hospital
administrators and trustees. Because they perceived feedback to
clinicians to be much more effective when provided as soon as possible,
several of the case study hospitals found ways to calculate their
performance on the quality measures themselves, often on a monthly
basis, rather than wait for CMS to report their results for the
quarter.
Officials at all eight case study hospitals pointed to specific changes
they had made in their internal procedures designed to improve their
performance on one or more quality measures. Most of the case study
hospitals developed "standing order sets" for particular diagnoses.
Such order sets provide a mechanism for standardizing both the care
provided and the documentation of that care, in such areas as
prescribing beta blockers and aspirin on arrival and at discharge for
heart attack patients. Another common example involved prompting
physicians to administer pneumococcal vaccinations to pneumonia
patients. However, at most of the case study hospitals, use of many
standing order sets was optional for physicians, and hospital officials
reported widely varying rates of physician use, from close to 100
percent of physicians at one hospital using its order set for heart
attack patients to just a few physicians using any order sets in
another hospital.
Case study hospitals also responded to the information generated from
their quality data by adjusting their treatment protocols, especially
for patients treated in their emergency departments. For example, five
hospitals developed or elaborated on procedural checklists for
emergency department nurses treating pneumonia patients. The objective
of these changes was to more quickly identify pneumonia patients when
they arrived at the emergency department and then expeditiously perform
required blood tests so that the patients would score positively for
the quality measure on receiving antibiotics within 4 hours of arrival
at the hospital. Three hospitals strengthened their procedures to
identify smokers and make sure that they received appropriate
counseling.
Hospital officials noted that they provided quality of care data to
entities other than CMS and the Joint Commission, such as state
governments and private insurers, but for the most part they reported
that the CMS quality measures had two advantages. First, the CMS
quality measures enabled hospitals to benchmark their performance
against the performances of virtually every other hospital in the
country. Second, officials at two hospitals noted that the CMS measures
were based on clinical information obtained from patient medical
records and therefore had greater validity as measures of quality of
care than measures based solely on administrative data.[Footnote 34]
Many hospital officials said that they wished that state governments
and other entities collecting quality data would accept the CMS quality
measures instead of requiring related quality data based on different
definitions and patient populations. Hospital officials in two states
reported some movement in that direction.
Existing IT Systems Can Help Hospitals Gather Some Quality Data but Are
Far from Enabling Automated Abstraction:
In the case studies, existing IT systems helped hospital abstractors to
complete their work more quickly, but the limitations of those IT
systems meant that trained staff still had to examine the entire
patient medical record and manually abstract the quality data submitted
to CMS. IT systems helped abstractors obtain information from patients'
medical records, in particular by improving their accessibility and
legibility, and by enabling hospitals to incorporate CMS's required
data elements into those medical records. The challenges reported by
hospital officials included having a mix of paper and electronic
records, which required abstractors to check multiple places to get the
needed information; the prevalence of unstructured data, which made
locating the information time-consuming because it was not in a
prescribed place in the record; and the presence of multiple IT systems
that did not share data, which required abstractors to separately
access each IT system for related pieces of information that were in
different parts of the medical record. While hospital officials
expected the scope and functionality of their IT systems to increase
over time, they projected that this would occur incrementally over a
period of years.[Footnote 35]
Existing IT Systems Help Abstractors Obtain Information from Medical
Records but Have Notable Limitations:
Hospitals found that their existing IT systems could facilitate the
collection of quality data, but that there were limits on the
advantages that the systems could provide. IT systems, and the
electronic records they support, offered hospitals two key benefits:
(1) improving accessibility to and legibility of the medical record,
and (2) facilitating the incorporation of CMS's required data elements
into the medical record.
Many hospital abstractors noted that existing electronic records helped
quality data collection by improving accessibility and legibility of
patient records. In general, paper records were less accessible than
electronic records because it took time to find them or to have them
transported if hospitals had stored them in a remote location after the
patients were discharged. Also, paper records were more likely to be
missing or in use by someone else. However, in one case study hospital,
an abstractor noted difficulties in gaining access to a computer
terminal to view electronic medical records. Many abstractors noted
improvements in legibility as a fundamental benefit of electronic
records. This advantage applied in particular to the many sections of
the medical record that consisted of handwritten text, including
history and physicals, progress notes, medication administration
records, and discharge summaries.
Some hospitals have used their existing IT systems to facilitate the
abstraction of information by designing a number of discrete data
fields that match CMS's data elements. For example, two hospitals
incorporated prompts for pneumococcal vaccination in their electronic
medication ordering system. These prompts not only reminded physicians
to order the vaccination (if the patient was not already vaccinated)
but also helped to ensure documentation of the patient's vaccination
status. One hospital developed a special electronic discharge program
for heart attack and heart failure patients that had data elements for
the quality measures built into it. Another hospital built a prompt
into its electronically generated discharge instructions to instruct
patients to measure their weight daily. This enabled the hospital to
document more consistently one of the specific instructions that heart
failure patients are supposed to receive on discharge but that
physicians and nurses tended to overlook in their documentation.
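The prompts described above amount to simple rules attached to a discrete data field. The sketch below illustrates the general idea for the pneumococcal vaccination example; the field names, statuses, and prompt text are hypothetical and are not drawn from any hospital's or vendor's system.

    # Minimal sketch of the kind of prompt logic described above: when a pneumonia
    # patient's orders are opened, check a discrete vaccination-status field and,
    # if vaccination is not already documented, remind the clinician and record
    # the fact that the prompt fired. Field names and statuses are hypothetical.

    def vaccination_prompt(record):
        status = record.get("pneumococcal_vaccination_status")  # discrete data field
        if status == "vaccinated":
            return "No prompt needed; status already documented."
        # Prompt fires: clinician either orders the vaccine or documents a reason not to.
        record["pneumococcal_vaccination_status"] = "prompt issued - awaiting response"
        return "Prompt: order pneumococcal vaccine or document contraindication/refusal."

    record = {"principal_diagnosis": "pneumonia"}
    print(vaccination_prompt(record))
    print(record["pneumococcal_vaccination_status"])

Because the response is written back into a discrete field, the abstractor later finds the vaccination status in a prescribed place rather than in narrative text.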
The limitations that hospital officials reported in using existing IT
systems to collect quality data stemmed from having a mix of paper and
electronic systems; the prevalence of data recorded in IT systems as
unstructured paragraphs of narrative or text, as opposed to discrete
data fields reserved for specific pieces of information; and the
inability of some IT systems to access related data stored on another
IT system in the same hospital. Because all but one of the case study
hospitals stored clinical records in a mix of paper and electronic
systems, abstractors generally had to consult both paper and electronic
records to obtain all needed information. What was recorded on paper
and what was recorded electronically varied from hospital to hospital
(see app. III, table 4). However, admissions and billing data were
electronic at all the case study hospitals. Billing data include
principal diagnosis and birth date, which are among the CMS-required
data elements. With regard to clinical data, all case study hospitals
had test results, such as echocardiogram readings, in an electronic
form. In contrast, nurse progress notes were least likely to be in
electronic form at the case study hospitals. Moreover, it was not
uncommon for a hospital to have the same type of clinical documentation
stored partly in electronic form and partly on paper. For example, five
of the eight case study hospitals had a mix of paper and electronic
physician notes, reflecting the differing personal preferences of the
physicians. Discharge summaries and medication administration records,
on the other hand, tended to be either paper or electronic at a given
hospital.
Many of the data in existing IT systems were recorded in unstructured
formats--that is, as paragraphs of narrative or other text, rather than
in data fields designated to contain specific pieces of information--
which created problems in locating the needed information. For example,
physician notes and discharge summaries were often dictated and
transcribed. Abstractors typically read through the entire electronic
document to make sure that they had found all potentially relevant
references, such as for possible contraindications for a beta blocker
or an ACEI. By contrast, some of the data in existing IT systems were
in structured data fields so that specific information could be found
in a prescribed place in the record. One common example was a list of
medication allergies, which abstractors used to quickly check for
certain drug contraindications. However, officials at several hospitals
said that developing and implementing structured data fields was labor
intensive, requiring both programming and education of clinical staff in
their use. As a result, many of the data stored in electronic records at
the case study hospitals remained in unstructured formats.
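The contrast between structured and unstructured data can be illustrated in code. In the sketch below, a structured allergy list supports a direct check, while unstructured narrative can at best be flagged for an abstractor to read; the keyword list and record layout are hypothetical.

    # Minimal sketch contrasting the two situations described above. With a structured
    # allergy list the check is a direct lookup; with unstructured narrative the best a
    # program can do is flag passages for a human abstractor to review.

    CONTRAINDICATION_TERMS = ("beta blocker allergy", "bradycardia", "heart block")

    def check_structured(allergy_list):
        """Direct lookup against a structured allergy field."""
        return any("beta blocker" in entry.lower() for entry in allergy_list)

    def flag_unstructured(note_text):
        """Return sentences an abstractor should review, not a final determination."""
        return [s.strip() for s in note_text.split(".")
                if any(term in s.lower() for term in CONTRAINDICATION_TERMS)]

    print(check_structured(["Penicillin", "Beta blocker allergy"]))               # True
    print(flag_unstructured("Patient with bradycardia on arrival. Ambulating well."))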
Another limitation with existing IT systems was the inability of some
systems to access related data stored on another IT system in the same
hospital. This situation affected six of the eight case study hospitals
to some degree. For example, one hospital had an IT system in the
emergency department and an IT system on the inpatient floors, but the
two systems were independent and the information in one was not linked
to the information in the other. Abstractors had to access each IT
system separately to obtain related pieces of information, which made
abstraction more complicated and time-consuming.
Existing IT systems helped hospital abstractors to complete their work
more quickly, but the limitations of those IT systems meant that, for
the most part, the nature of their work remained the same. Existing IT
systems enabled abstractors at several hospitals to more quickly locate
the clinical information needed to determine the appropriate values for
at least some of the data elements that the hospitals submitted to CMS.
Where hospitals designed a discrete data field in their IT systems to
match a specific CMS data element, abstractors could simply transcribe
that value into the data vendor's abstraction form. However, in all the
case study hospitals there remained a large number of data elements for
which there was no discrete data field in a patient's electronic record
that could provide the required value for that data element. As a
result, trained staff still had to examine the medical record as a
whole and manually abstract the quality data submitted to CMS, whether
the information in the medical record was recorded electronically or on
paper.[Footnote 36]
Full Automation of Quality Data Collection Is Not Imminent:
All the case study hospitals were working to expand the scope and
functionality of their IT systems, but this expansion was generally
projected to occur incrementally over a period of years. Hospital
officials noted that with wider use of IT systems, the advantages of
these systems--including accessibility, legibility, and the use of
discrete data fields--would apply to a larger proportion of the
clinical records that abstractors have to search. As the case study
hospitals continue to bring more of their clinical documentation into
IT systems and to link separate systems within each hospital so that
data in one system can be accessed from another, the time required to
collect quality data should decline.
However, most officials at the case study hospitals viewed full-scale
automation of quality data collection and submission through
implementation of IT systems as, at best, a long-term prospect. They
pointed to a number of challenges that hospitals would have to overcome
before they could use IT systems to achieve full-scale automation of
quality data collection and submission. Primary among these were
overcoming physician reluctance to use IT systems to record clinical
information and the intrinsic complexity of the quality data required
by CMS. One hospital with unusually extensive IT systems had initiated
a pilot project to see how close it could get to fully automating
quality data collection for patients with heart failure. Drawing to the
maximum extent on the data that were amenable to programming, which
excluded unstructured physician notes, the hospital found that it could
complete data collection for approximately 10 percent of cases without
additional manual abstraction. Reflecting on this effort, the hospital
official leading this project noted that at least some of the data
elements required for heart failure patients represented "clinical
judgment calls." An official at another hospital observed that someone
had to apply CMS's complex decision rules to determine the appropriate
value for the data elements. If a hospital wanted to eliminate the need
for an abstractor, who currently makes those decisions retrospectively
after weighing multiple pieces of information in the patient's medical
record, the same complex decisions would have to be made by the
patient's physician at the time of treatment. The official suggested
that it was preferable not to ask physicians to take on that additional
task when they should be focused on making appropriate treatment
decisions.
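The pilot project described above suggests a partial-automation pattern: fill each required data element from a structured field where one exists and route the rest to a clinical abstractor. The sketch below illustrates that pattern under hypothetical element names and field mappings; it is not the hospital's actual system.

    # Minimal sketch of partial automation: resolve each required data element from a
    # structured field where a mapping exists, and flag unresolved elements for manual
    # abstraction. Element names and the field mapping are hypothetical.

    REQUIRED_ELEMENTS = ["discharge_date", "lvf_assessment", "ace_inhibitor_at_discharge"]
    STRUCTURED_FIELDS = {"discharge_date": "billing.discharge_date",
                         "lvf_assessment": "echo.lvf_result"}   # no field for the third

    def collect(case):
        values, unresolved = {}, []
        for element in REQUIRED_ELEMENTS:
            field = STRUCTURED_FIELDS.get(element)
            if field and field in case:
                values[element] = case[field]
            else:
                unresolved.append(element)      # requires manual abstraction
        return values, unresolved

    case = {"billing.discharge_date": "2006-09-14", "echo.lvf_result": "EF 35%"}
    values, unresolved = collect(case)
    print(values)       # elements resolved automatically
    print(unresolved)   # ['ace_inhibitor_at_discharge'] -> abstractor still needed

In the pilot, only about 10 percent of heart failure cases could be completed without the manual step, largely because elements requiring clinical judgment had no structured source.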
Another barrier to automated quality data collection mentioned by
several hospital officials was the frequency of change in the data
specifications. As noted above, hospitals had to invest considerable
staff resources for programming and staff education to develop
structured data fields for the clinical information required for the
data elements. Officials at one hospital stated that it would be
difficult to justify that investment without knowing how long the data
specifications underlying that structured data field would remain
valid.
CMS Sponsored Studies and Joined Broader HHS Initiatives to Promote Use
of IT for Quality Data Collection and Submission, but HHS Lacks
Detailed Plans, Milestones, and Time Frame:
CMS has sponsored studies and joined HHS initiatives to examine and
promote the current and potential use of hospital IT systems to
facilitate the collection and submission of quality data, but HHS lacks
detailed plans, including milestones and a time frame against which to
track its progress. CMS sponsored two studies that examined the use of
hospital IT systems for quality data collection and submission.
Promoting the use of health IT for quality data collection is also 1 of
14 objectives that HHS has identified in its broader effort to
encourage the development and nationwide implementation of
interoperable IT in health care. CMS has joined this broader effort by
HHS, as well as the Quality Workgroup that AHIC created in August 2006
to specify how IT could capture, aggregate, and report inpatient and
outpatient quality data. Through its representation in AHIC and the
Quality Workgroup, CMS has participated in decisions about the specific
focus areas to be examined through contracts with nongovernmental
entities. These contracts currently address the use of health IT for a
range of purposes, which may also include quality data collection and
submission in the near future. However, HHS has identified no detailed
plans, milestones, or time frames for either its broad effort to
encourage IT in health care nationwide or its specific objective to
promote the use of health IT for quality data collection.
CMS Sponsored Studies Examining Use of IT Systems for Collection and
Submission of Quality Data:
Over the past several years, CMS sponsored two studies to examine the
current and potential capacity of hospital IT systems to facilitate
quality data collection and submission. These studies identified
challenges to using existing hospital IT systems for quality data
collection and submission, including gaps and inconsistencies in
applicable data standards, as well as in the content of clinical
information recorded in existing IT systems. Data standards create a
uniform vocabulary for electronically recorded information by providing
common definitions and coding conventions for a specified set of
medical terms. Currently, an array of different standards applies to
different aspects of patient care, including drug ordering, digital
imaging, clinical laboratory results, and overall clinical terminology
relating to anatomy, problems, and procedures.[Footnote 37] The studies
also found that existing IT systems did not record much of the specific
clinical information needed to determine the appropriate data element
values that hospitals submit to CMS. To achieve CMS's goal of enabling
hospitals to transmit quality data directly from their own IT systems
to CMS's nationwide clinical database, the data in the two systems
would need to conform to a common set of data standards and capture
all the data necessary for the quality measures.[Footnote 38] A key element
in the effort to create this congruence is the further development and
implementation of data standards.
In the first study, completed in March 2005, CMS contracted with the
Colorado Foundation for Medical Care to test the potential for directly
downloading values for data elements for CMS's hospital quality
measures using patient data from electronic medical records in three
hospitals and one hospital system.[Footnote 39] The study found that
numerous factors impeded this process under current conditions,
including the lack of certain key types of information in the
hospitals' IT systems, such as emergency department data, prearrival
data, transfer information, and information on medication
contraindications. The study also noted that hospitals differed in how
they coded their data, and that even when they had implemented data
standards, the hospitals had used different versions of the standards
or applied them in different ways.[Footnote 40] For example, the study
found wide variation in the way that the hospitals recorded drug names
and laboratory results in their IT systems, as none of the hospitals
had implemented the existing data standards in those areas.
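The drug-name finding illustrates what a data standard would provide: one code for the same drug regardless of how a hospital records it. The sketch below shows the kind of local-to-standard mapping involved; the spellings and the code are hypothetical rather than an actual standard vocabulary.

    # Minimal sketch of the normalization a shared data standard would make routine:
    # mapping different local spellings of the same drug to one standard code so the
    # data can be compared across hospitals. Spellings and codes are hypothetical.

    LOCAL_TO_STANDARD = {
        "metoprolol 25mg tab": "STD-0421",
        "metoprolol tartrate": "STD-0421",
        "lopressor": "STD-0421",
    }

    def normalize(local_name):
        code = LOCAL_TO_STANDARD.get(local_name.strip().lower())
        if code is None:
            raise ValueError(f"No standard mapping for '{local_name}'")
        return code

    print(normalize("Lopressor"))             # STD-0421
    print(normalize("Metoprolol Tartrate"))   # STD-0421, same drug despite different entry

When hospitals use different versions of a standard, or none at all, this mapping must be built and maintained locally, which is part of what the study found impeded direct downloading of quality data.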
In the second study, which was conducted by the Iowa Foundation for
Medical Care and completed in February 2006, CMS examined the potential
to expand its current data specifications for heart attack, heart
failure, pneumonia, and surgical measures to incorporate the standards
adopted by the federal Consolidated Health Informatics (CHI)
initiative.[Footnote 41] Unlike the first study, which focused on
actual patient data in existing IT systems, this study focused on the
relationship of current data standards to the data specifications for
CMS's quality data. It found that there were inconsistencies in the way
that corresponding data elements were defined in the CMS/Joint
Commission Specifications Manual and in the CHI standards that
precluded applying those standards to all of CMS's data elements.
Moreover, it found that some of the data elements were not addressed in
the CHI standards. These results suggested to CMS officials that the
data standards needed to undergo further development before they could
support greater use of health IT to facilitate quality data collection
and submission.
CMS Has Joined HHS's Efforts to Promote Greater Use of Health IT for
Quality Data Collection and Submission, but HHS Lacks Detailed Plans,
Milestones, and a Time Frame to Track Progress:
CMS has joined efforts by HHS to promote greater use of health IT in
general and, more recently, in facilitating the use of health IT for
quality data collection and submission. The overall goal of HHS's
efforts in this area, working through AHIC and ONC, is to encourage the
development and nationwide implementation of interoperable health IT in
both the public and the private sectors. To guide those efforts, ONC
has developed a strategic framework that outlines its goals,
objectives, and high-level strategies. One of the 14 objectives
involves the collection of quality information.[Footnote 42]
CMS, through its participation in AHIC, has taken part in the selection
of specific focus areas for ONC to pursue in its initial activities to
promote health IT. Those activities have largely taken place through a
series of contracts with a number of nongovernmental entities. ONC has
sought through these contracts to address issues affecting wider use of
health IT, including standards harmonization, the certification of IT
systems, and the development of a Nationwide Health Information
Network. For example, the initial work on standards harmonization,
conducted under contract to ONC by the Healthcare Information
Technology Standards Panel (HITSP), focused on three targeted areas:
biosurveillance,[Footnote 43] sharing laboratory results across
institutions, and patient registration and medication history.
Meanwhile, the Certification Commission for Health Information
Technology (CCHIT) has worked under a separate contract with ONC to
develop and apply certification criteria for electronic health record
products used in physician offices, with some initial work on
certification of electronic health record products for inpatient care
as well.[Footnote 44]
CMS is also represented on the Quality Workgroup that AHIC created in
August 2006 as a first step in promoting the use of health IT for
quality data collection and submission. One of seven workgroups
appointed by AHIC, the Quality Workgroup received a specific charge to
specify how health IT should capture, aggregate, and report inpatient
as well as outpatient quality data. It plans to address this charge by
adding activities related to using IT for quality data collection to
the work performed by HITSP and CCHIT addressing other objectives under
their ongoing ONC contracts. Members of the Quality Workgroup, along
with AHIC itself, have recently begun to consider the specific focus
areas to include in the directions given to HITSP and CCHIT for their
activities during the coming year.[Footnote 45] Early discussions among
AHIC members indicated that they would try to select focus areas that
built on the work already completed by ONC's contractors and that
targeted specific improvements in quality data collection that could
also support other priorities for IT development that AHIC had
identified.[Footnote 46] The focus areas that AHIC selects will, over
time, influence the decisions that HHS makes regarding the resources it
will allocate and the specific steps it will take to overcome the
limitations of existing IT systems for quality data collection and
submission.
In a previous report and subsequent testimony, we noted that ONC's
overall approach lacked detailed plans and milestones to ensure that
the goals articulated in its strategic framework were met. We pointed
out that without setting milestones and tracking progress toward
completing them, HHS cannot tell if the necessary steps are in place to
provide the building blocks for achieving its overall
objectives.[Footnote 47] HHS concurred with our recommendation that it
establish detailed plans and milestones for each phase of its health IT
strategic framework, but it has not yet released any such plans,
milestones, or a time frame for completion. Moreover, HHS has not
announced any detailed plans or milestones or a time frame relating to
the efforts of the Quality Workgroup to promote the use of health IT to
capture, aggregate, and report inpatient and outpatient quality data.
Without such plans, it will be difficult to assess how much the focus
areas that AHIC selects in the near term for its contracted activities
will contribute to enabling the Quality Workgroup to fulfill its charge
in a timely way.
Conclusions:
There is widespread agreement on the importance of hospital quality
data. The Congress made the APU program permanent to provide a
financial incentive for hospitals to submit quality data to CMS and
directed the Secretary of HHS to increase the number of measures for
which hospitals would have to provide data. In addition, the hospitals
we visited reported finding value in the quality data they collected
and submitted to CMS to improve care.
Collecting quality data is a complex and labor-intensive process.
Hospital officials told us that as the number of quality measures
required by CMS increased, the number of clinically trained staff
required to collect and submit quality data increased proportionately.
They also told us that increased use of IT facilitates the collection
and submission of quality data and thereby lessens the demand for
greater staff resources. The degree to which existing IT systems can
facilitate data collection is, however, constrained by limitations such
as the prevalence of data recorded as unstructured narrative or text.
Overcoming these limitations would enhance the potential of IT systems
to ease the demand on hospital resources.
Promoting the use of health IT for quality data collection is 1 of 14
objectives that HHS has identified in its broader effort to encourage
the development and nationwide implementation of interoperable IT in
health care. The extent to which HHS can overcome the limitations of
existing IT systems and make progress on this objective will depend in
part on where this objective falls on the list of priorities for the
broader effort. To date, HHS has identified no detailed plans,
milestones, or time frames for either the broad effort or the specific
objective on promoting the use of health IT for collecting quality
data. Without such plans, HHS cannot track its progress in promoting
the use of health IT for collecting quality data, making it less likely
that HHS will achieve that objective in a timely way. Our analysis
indicates that unless activities to facilitate greater use of IT for
quality data collection and submission proceed promptly, hospitals may
have difficulty collecting and submitting quality data required for an
expanded APU program.
Recommendations for Executive Action:
To support the expansion of quality measures for the APU program, we
recommend that the Secretary of HHS take the following actions:
* identify the specific steps that the department plans to take to
promote the use of health IT for the collection and submission of data
for CMS's hospital quality measures; and:
* inform interested parties about those steps and the expected time
frame, including milestones for completing them.
Agency Comments and Our Evaluation:
In commenting on a draft of this report on behalf of HHS, CMS expressed
its appreciation of our thorough analysis of the processes that
hospitals use to report quality data and the role that IT systems can
play in that reporting, and it concurred with our two recommendations.
(CMS's comments appear in app. V.) With respect to the recommendations,
CMS stated that it will continue to participate in relevant HHS studies
and workgroups, and, as appropriate, it will inform interested parties
regarding progress in the implementation of health IT for the
collection and submission of hospital quality data as specific steps,
including time frames and milestones, are identified. In addition, as
health IT is implemented, CMS anticipates that a formal plan will be
developed that includes training for providers in the use of health IT
for reporting quality data. CMS also provided technical comments that
we incorporated where appropriate.
CMS made two additional comments relating to the information provided
on our case study hospitals and our discussion of patients excluded
from the hospital performance assessments. CMS suggested that we
describe the level of health IT adoption in the case study hospitals in
table 1 of appendix III; this information was already provided in table
4 of appendix III. CMS suggested that we highlight the application of
patient exclusions in adapting health IT for quality data collection
and submission. We chose not to do so because our analysis showed that the
degree of challenge depended on the nature of the information required
for a given data element. Exclusions based on billing data, such as
discharge status, pose much less difficulty than other exclusions, such
as checking for contraindications to ACEIs and ARBs for LVSD, which
require a wide range of clinical information.
CMS noted that the AHIC Quality Workgroup had presented its initial set
of recommendations at AHIC's most recent meeting on March 13, 2007, and
provided a copy of those recommendations as an appendix to its
comments. The agency characterized these recommendations as first
steps, with initial timelines, to address the complex issues that
affect implementation of health IT for quality data collection and
submission. Specifically with reference to collecting quality data from
hospitals as well as physicians, the Quality Workgroup recommended the
appointment of an expert panel that would designate a set of quality
measures to have priority for standardization of their data elements,
which, in turn, would enable automation of their collection and
submission using electronic health records and health information
exchange. The first recommendations from the expert panel are due June
5, 2007. The work of the expert panel is intended to guide subsequent
efforts by HITSP to fill identified gaps in related data standards and
by CCHIT to develop criteria for certifying electronic health record
products. In addition, the Quality Workgroup recommended that CMS and
the Agency for Healthcare Research and Quality (AHRQ) both work to
bring together the developers of health quality measures and health IT
vendors, so that development of future health IT systems would take
greater account of the data requirements of emerging quality measures.
AHIC approved these recommendations from the Quality Workgroup at its
March 13 meeting.
We also sent to each of the eight case study hospitals sections from
the appendixes pertaining to that hospital. We asked each hospital to
check that the section accurately described its processes for
collecting and submitting quality data as well as related information
on its characteristics and resources. Officials from four of the eight
hospitals responded and provided technical comments that we
incorporated where appropriate.
As arranged with your offices, unless you publicly announce its
contents earlier, we plan no further distribution of this report until
30 days after its issue date. At that time, we will send copies of this
report to the Secretary of HHS, the Administrator of CMS, and other
interested parties. We will also make copies available to others on
request. In addition, the report will be available at no charge on
GAO's Web site at http://www.gao.gov.
If you or your staffs have any questions about this report, please
contact me at (202) 512-7101 or BascettaC@gao.gov. Contact points for
our Offices of Congressional Relations and Public Affairs may be found
on the last page of this report. GAO staff who made major contributions
to this report are listed in appendix VI.
Signed by:
Cynthia A. Bascetta:
Director, Health Care:
[End of section]
Appendix I: Medicare Quality Measures Required for Full Annual Payment
Update:
Condition: Heart attack;
Quality measure: Aspirin at hospital arrival[A];
Number of required data elements: 11.
Condition: Heart attack;
Quality measure: Aspirin prescribed at discharge[A];
Number of required data elements: 7.
Condition: Heart attack;
Quality measure: Angiotensin-converting enzyme inhibitor or angiotensin
receptor blocker for left ventricular systolic dysfunction[A];
Number of required data elements: 9.
Condition: Heart attack;
Quality measure: Beta blocker at hospital arrival[A];
Number of required data elements: 11.
Condition: Heart attack;
Quality measure: Beta blocker prescribed at discharge[A];
Number of required data elements: 7.
Condition: Heart attack;
Quality measure: Thrombolytic agent received within 30 minutes of
hospital arrival;
Number of required data elements: 13.
Condition: Heart attack;
Quality measure: Percutaneous coronary intervention received within 120
minutes of hospital arrival;
Number of required data elements: 16.
Condition: Heart attack;
Quality measure: Adult smoking cessation advice/counseling;
Number of required data elements: 7.
Condition: Heart failure;
Quality measure: Left ventricular function assessment[A];
Number of required data elements: 7.
Condition: Heart failure;
Quality measure: Angiotensin-converting enzyme inhibitor or angiotensin
receptor blocker for left ventricular systolic dysfunction[A];
Number of required data elements: 10.
Condition: Heart failure;
Quality measure: Discharge instructions;
Number of required data elements: 12.
Condition: Heart failure;
Quality measure: Adult smoking cessation advice/counseling;
Number of required data elements: 8.
Condition: Pneumonia;
Quality measure: Initial antibiotic received within 4 hours of hospital
arrival[A];
Number of required data elements: 16.
Condition: Pneumonia;
Quality measure: Oxygenation assessment[A];
Number of required data elements: 11.
Condition: Pneumonia;
Quality measure: Pneumococcal vaccination status[A];
Number of required data elements: 8.
Condition: Pneumonia;
Quality measure: Blood culture performed before first antibiotic
received in hospital;
Number of required data elements: 19.
Condition: Pneumonia;
Quality measure: Adult smoking cessation advice/counseling;
Number of required data elements: 9.
Condition: Pneumonia;
Quality measure: Appropriate initial antibiotic selection;
Number of required data elements: 24.
Condition: Pneumonia;
Quality measure: Influenza vaccination status;
Number of required data elements: 9.
Condition: Surgery;
Quality measure: Prophylactic antibiotic received within 1 hour prior
to surgical incision;
Number of required data elements: 14.
Condition: Surgery;
Quality measure: Prophylactic antibiotics discontinued within 24 hours
after surgery end time;
Number of required data elements: 17.
Sources: Federal Register, CMS, GAO (analysis).
Notes: The 21 measures are listed in 71 Fed. Reg. 47870, 48033-48034,
48045 (Aug. 18, 2006), and we analyzed the Specifications Manual for
National Hospital Quality Measures, version 2.1a, to calculate the
number of required data elements for each. This set of quality measures
is effective for discharges from July 2006 on. The Centers for Medicare
& Medicaid Services (CMS) uses 73 different data elements to calculate
hospital performance on the 21 measures required for the APU program.
The total number of unique data elements is less than the sum of the
data elements used to calculate each measure because some data elements
are included in the calculation of more than one quality measure. In
addition, CMS obtains from hospitals approximately 20 other data
elements on each patient, including demographic and billing data.
[A] One of the 10 original quality measures.
[End of table]
[End of section]
Appendix II: Data Elements Used to Calculate Hospital Performance on a
Heart Attack Quality Measure:
Figure 3: Data Elements Used to Calculate Hospital Performance on the
Heart Attack Quality Measure That Asks Whether a Beta Blocker Was Given
When the Patient Arrived at the Hospital:
[See PDF for image]
Source: GAO.
Notes: The boxes represent data elements and the circles and rounded
rectangles represent values for those elements. In addition to the
seven data elements shown in the figure (including arrival date and
discharge date that appear in the same box), an eighth data element,
comfort measures only, is first applied for this quality measure, as
well as all the other heart attack, heart failure, and pneumonia
quality measures, to screen out terminal patients receiving palliative
care. Three other data elements (principal diagnosis, admission date,
and birthdate) are used to initially identify the patients to whom the
heart attack quality measures apply in a given quarter.
[A] Included codes consist of eight different values for admission
source that represent patients who were admitted from any source other
than those listed in footnote b, including physician referral, skilled
nursing facility, and the hospital's emergency room.
[B] Excluded codes consist of three different values for admission
source that represent patients who were transferred to this hospital
from another acute care hospital, from a critical access hospital, or
within the same hospital with a separate claim.
[C] Patients may be excluded from the population used to calculate a
hospital's performance for a variety of reasons, including
inappropriateness of beta blockers for their treatment--for example, if
they have a contraindication for their use--or prior treatment in
another acute care facility.
[D] Included codes consist of 13 different values for discharge status
that represent patients who were discharged to any setting other than
those listed in footnote e, including home care, skilled nursing
facility, and hospice.
[E] Excluded codes consist of five different values for discharge
status that represent patients who were discharged to another acute
care hospital or federal health care facility, left against medical
advice, or died.
[End of figure]
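The figure's notes describe, in effect, a sequence of inclusion and exclusion tests. The sketch below illustrates that decision flow for the beta blocker measure; the specific code values are hypothetical placeholders for the included and excluded codes summarized in the footnotes, not CMS's actual specifications.

    # Minimal sketch of the decision flow the figure describes for the "beta blocker
    # at arrival" measure: screen out comfort-measures-only patients, apply the
    # admission source and discharge status exclusions, and exclude patients with a
    # documented contraindication. Code lists are hypothetical placeholders.

    EXCLUDED_ADMISSION_SOURCES = {"transfer_acute_care", "transfer_critical_access",
                                  "intra_hospital_transfer"}
    EXCLUDED_DISCHARGE_STATUSES = {"transfer_acute_care", "transfer_federal_facility",
                                   "left_ama", "expired"}

    def in_measure_population(case):
        """Return True if the case counts toward the hospital's beta blocker rate."""
        if case["comfort_measures_only"]:
            return False
        if case["admission_source"] in EXCLUDED_ADMISSION_SOURCES:
            return False
        if case["discharge_status"] in EXCLUDED_DISCHARGE_STATUSES:
            return False
        if case["beta_blocker_contraindication"]:
            return False
        return True

    case = {"comfort_measures_only": False, "admission_source": "emergency_room",
            "discharge_status": "home", "beta_blocker_contraindication": False}
    print(in_measure_population(case))   # True -> counted in the denominator

Exclusions based on billing data, such as discharge status, are straightforward to evaluate; the contraindication test is the step that requires an abstractor to weigh clinical information from the record.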
[End of section]
Appendix III: Tables on Eight Case Study Hospitals:
Table 1: Case Study Hospital Characteristics:
Number of beds;
Case study hospital: A: 300-349;
Case study hospital: B: 500+;
Case study hospital: C: 50-99;
Case study hospital: D: 500+;
Case study hospital: E: 100-149;
Case study hospital: F: 500+;
Case study hospital: G: 150-199;
Case study hospital: H: 500+.
Urban/rural;
Case study hospital: A: Urban;
Case study hospital: B: Urban;
Case study hospital: C: Rural;
Case study hospital: D: Urban;
Case study hospital: E: Suburban;
Case study hospital: F: Urban;
Case study hospital: G: Suburban;
Case study hospital: H: Urban.
Major teaching;
Case study hospital: A: Yes;
Case study hospital: B: Yes;
Case study hospital: C: No;
Case study hospital: D: Yes;
Case study hospital: E: No;
Case study hospital: F: Yes;
Case study hospital: G: No;
Case study hospital: H: Yes.
Member of multihospital system;
Case study hospital: A: Yes;
Case study hospital: B: Yes;
Case study hospital: C: Yes;
Case study hospital: D: No;
Case study hospital: E: No;
Case study hospital: F: No;
Case study hospital: G: No;
Case study hospital: H: No.
Joint Commission accredited;
Case study hospital: A: Yes;
Case study hospital: B: Yes;
Case study hospital: C: Yes;
Case study hospital: D: Yes;
Case study hospital: E: Yes;
Case study hospital: F: Yes;
Case study hospital: G: Yes;
Case study hospital: H: Yes.
Vendor submits quality data;
Case study hospital: A: Yes;
Case study hospital: B: Yes;
Case study hospital: C: Yes;
Case study hospital: D: Yes;
Case study hospital: E: Yes;
Case study hospital: F: Yes;
Case study hospital: G: Yes;
Case study hospital: H: Yes.
Patients identified for data collection how often;
Case study hospital: A: Monthly;
Case study hospital: B: Monthly;
Case study hospital: C: Weekly;
Case study hospital: D: Monthly;
Case study hospital: E: Monthly;
Case study hospital: F: Monthly;
Case study hospital: G: Monthly;
Case study hospital: H: Monthly.
Abstraction tool used;
Case study hospital: A: Vendor's;
Case study hospital: B: Vendor's;
Case study hospital: C: Vendor's;
Case study hospital: D: CART[A];
Case study hospital: E: Vendor's;
Case study hospital: F: Vendor's;
Case study hospital: G: Vendor's;
Case study hospital: H: Vendor's.
Conditions reported on;
Case study hospital: A: Heart attack, heart failure, pneumonia,
surgery;
Case study hospital: B: Heart attack, heart failure, pneumonia,
surgery;
Case study hospital: C: Heart attack, heart failure, pneumonia,
surgery;
Case study hospital: D: Heart attack, heart failure, pneumonia,
surgery;
Case study hospital: E: Heart attack, heart failure, pneumonia,
surgery;
Case study hospital: F: Heart attack, heart failure, pneumonia,
surgery;
Case study hospital: G: Heart attack, heart failure, pneumonia,
surgery;
Case study hospital: H: Heart attack, heart failure, pneumonia,
surgery.
Entities that receive Annual Payment Update (APU) program data;
Case study hospital: A: CMS, Joint Commission;
Case study hospital: B: CMS, Joint Commission, vendor database, private
insurers;
Case study hospital: C: CMS, Joint Commission;
Case study hospital: D: CMS, Joint Commission;
Case study hospital: E: CMS, Joint Commission, vendor database;
Case study hospital: F: CMS, Joint Commission;
Case study hospital: G: CMS, Joint Commission;
Case study hospital: H: CMS, Joint Commission.
Entities that receive different quality data;
Case study hospital: A: Leapfrog[B];
Case study hospital: B: Leapfrog, state health department, private
insurers;
Case study hospital: C: Private insurer;
Case study hospital: D: Leapfrog, private insurer;
Case study hospital: E: Private insurer;
Case study hospital: F: Leapfrog, private insurer;
Case study hospital: G: State health department, private insurers;
Case study hospital: H: Private insurer.
Amount of projected reduction in fiscal year 2006 Medicare payments if
quality data not submitted[C];
Case study hospital: A: $139,000;
Case study hospital: B: $608,000;
Case study hospital: C: $33,000;
Case study hospital: D: $449,000;
Case study hospital: E: $57,000;
Case study hospital: F: $430,000;
Case study hospital: G: $93,000;
Case study hospital: H: $123,000.
Amount of projected reduction in fiscal year 2007 Medicare payments if
quality data not submitted[C];
Case study hospital: A: $801,000;
Case study hospital: B: $3,250,000;
Case study hospital: C: $161,000;
Case study hospital: D: $2,298,000;
Case study hospital: E: $283,000;
Case study hospital: F: $2,451,000;
Case study hospital: G: $503,000;
Case study hospital: H: $608,000.
Sources: American Hospital Association, GAO, Centers for Medicare &
Medicaid Services (CMS).
[A] CART, which stands for the CMS Abstraction and Reporting Tool, was
developed by CMS and made available to hospitals at no charge for
collecting and submitting quality data.
[B] The Leapfrog Group is a consortium of large private and public
health care purchasers that publicly recognizes hospitals that have
implemented certain specific quality and safety practices, such as
computerized physician order entry.
[C] The projected reduction in fiscal year 2006 and fiscal year 2007
Medicare payments (rounded to the nearest $1,000) represents the amount
that the hospital's revenue from Medicare would have decreased for that
fiscal year had the hospital not submitted quality data under the
Annual Payment Update program. These estimates are based on information
on the number and case mix of Medicare patients served by these
hospitals during the previous period. This is the information that was
available to hospital administrators from CMS at the beginning of the
fiscal year. The actual reduction would ultimately depend on the number
and case mix of the Medicare patients that the hospital actually
treated during the course of that fiscal year. The projected reduction
for fiscal year 2007 was substantially larger because that was the
first year in which the higher rate of reduction mandated by the
Deficit Reduction Act of 2005--from 0.4 percentage points to 2.0
percentage points--took effect.
[End of table]
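A rough illustration of why the fiscal year 2007 projections in footnote C are about five times the fiscal year 2006 projections follows. The sketch simply applies the percentage-point cut to a hypothetical projected Medicare payment amount; the actual CMS calculation adjusts the annual payment update factor and depends on the hospital's case mix.

    # Rough back-of-the-envelope sketch: the percentage-point cut to the annual
    # payment update rose from 0.4 (fiscal year 2006) to 2.0 (fiscal year 2007).
    # Applying the cut directly to projected Medicare payments is a simplification;
    # the payment figure below is hypothetical.

    def projected_reduction(projected_medicare_payments, cut_percentage_points):
        return projected_medicare_payments * cut_percentage_points / 100.0

    payments = 30_000_000          # hypothetical projected Medicare inpatient payments
    print(round(projected_reduction(payments, 0.4)))   # ~120,000 under the 0.4-point cut
    print(round(projected_reduction(payments, 2.0)))   # ~600,000 under the 2.0-point cut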
Table 2: How Case Study Hospital Officials Described the Steps Taken to
Complete Quality Data Collection and Submission:
1. Identify patients[A];
Case study hospital: A: Vendor prepares list of patients to abstract,
sampling heart failure, pneumonia, and surgery;
Case study hospital: B: Vendor prepares list of patients based on
diagnosis codes, and draws samples for heart failure, pneumonia, and
surgery;
Case study hospital: C: Vendor prepares list of patients to abstract
based on billing data, no sampling;
Case study hospital: D: Hospital IT department identifies patients
based on billing data, no sampling;
Case study hospital: E: Hospital prepares list of patients from billing
data, no sampling;
Case study hospital: F: Hospital provides billing data to vendor;
vendor draws samples and generates list of patients to abstract;
Case study hospital: G: Hospital creates list from billing data; vendor
provides instructions to draw sample of pneumonia cases;
Case study hospital: H: Hospital submits billing data to vendor, which
identifies eligible patients and draws samples.
2. Locate information in the medical record;
Case study hospital: A: Abstractor searches through emergency room and
inpatient electronic and paper records, checking multiple forms and
screens where relevant information could be found;
Case study hospital: B: Abstractor starts search with electronic
discharge summary, then other electronic records and paper documents;
Case study hospital: C: Abstractor searches through different
components of paper record, including printouts from electronic
records;
Case study hospital: D: Abstractor clicks through various electronic
screens representing different types of records, plus some scanned
documents, for example, from other providers;
Case study hospital: E: Abstractor works through paper records, such as
face sheet, emergency room treatment forms, progress notes, and
discharge summary;
Case study hospital: F: Abstractor starts with electronic records (for
heart attack and heart failure)--first structured records (discharge)
and then free text--and then examines paper records if needed; paper
records searched for pneumonia and surgery;
Case study hospital: G: Abstractor starts searching through paper
records, then looks for additional information in electronic records
(e.g., for echocardiogram results);
Case study hospital: H: Abstractor searches through both electronic and
paper records.
3. Determine appropriate data element values;
Case study hospital: A: Some demographic data prepopulated; abstractor
notes ambiguous or conflicting information on paper abstraction form;
Case study hospital: B: Some demographic data prepopulated; other data
elements written on paper abstraction form;
Case study hospital: C: Some demographic data prepopulated; other data
elements entered directly into vendor's online abstraction tool;
Case study hospital: D: Some demographic data prepopulated; most
abstractors fill in data elements on paper abstraction form;
Case study hospital: E: Data elements entered into computerized
abstraction form;
Case study hospital: F: Some demographic data prepopulated; abstractors
fill out abstraction form, some on paper and some online;
Case study hospital: G: Some demographic data prepopulated; other data
elements written on paper abstraction form;
Case study hospital: H: Some demographic data prepopulated; other data
elements written on paper abstraction form.
4. Transmit data to CMS;
Case study hospital: A: Data elements copied from paper abstraction
form to vendor's online form;
Case study hospital: B: Data elements copied from paper abstraction
form to vendor's online form;
Case study hospital: C: Data elements entered directly into vendor's
online abstraction tool;
Case study hospital: D: Data elements copied from paper abstraction
form to vendor's electronic form; data manager checks data and uploads
file to vendor;
Case study hospital: E: Completed abstraction forms sent on disk to
vendor; will change soon to completion of forms online;
Case study hospital: F: For pneumonia and surgery, abstractor enters
data online; for heart attack and heart failure, hospital scans paper
abstraction forms and sends electronic file to vendor, which submits
data to CMS;
Case study hospital: G: Data elements copied from paper abstraction
form to vendor's online form;
Case study hospital: H: Data elements copied from paper abstraction
form to vendor's online form.
5. Ensure data have been accepted by CMS;
Case study hospital: A: Performed by vendor;
Case study hospital: B: Hospital staff reviews error reports from
clinical data warehouse and corrects errors;
Case study hospital: C: Performed by vendor;
Case study hospital: D: Hospital staff reviews error reports from
vendor;
Case study hospital: E: Hospital reviews error reports from vendor and
clinical warehouse;
Case study hospital: F: Performed by vendor;
Case study hospital: G: Hospital receives error report from vendor and
clinical data warehouse and makes corrections;
Case study hospital: H: Hospital reviews error reports from vendor and
makes corrections; vendor deals with clinical data warehouse.
6. Supply copies of selected medical records;
Case study hospital: A: Hospital copies and ships requested patient
records;
Case study hospital: B: Hospital copies, checks completeness of, and
ships requested patient records;
Case study hospital: C: Hospital copies, checks completeness of, and
ships requested patient records;
Case study hospital: D: Hospital copies and ships requested patient
records;
Case study hospital: E: Hospital copies and ships requested patient
records;
Case study hospital: F: Hospital copies and ships requested patient
records; before shipping hospital flags relevant information;
Case study hospital: G: Hospital copies, checks completeness of, and
ships requested patient records;
Case study hospital: H: Hospital copies, checks completeness of, and
ships requested patient records.
Source: GAO.
Note: Information summarized from hospital case study interviews.
[A] The identifying patients step included both determining all the
patients who met the CMS criteria for inclusion and applying the CMS
sampling procedures, if applicable. CMS permitted hospitals to sample
patients for a given condition in a given quarter only if the number of
eligible patients met a certain threshold. Otherwise, the hospital was
required to abstract quality data for all patients who met the
inclusion criteria for any one of the four conditions. Hospitals could
also choose not to sample, even when sampling was permitted under the
CMS sampling procedures.
[End of table]
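Step 1 and note A of the table describe identifying eligible patients from billing data and applying the sampling rules. The sketch below illustrates that logic; the diagnosis codes, threshold, and sample size are hypothetical, since CMS's actual sampling rules set those values.

    # Minimal sketch of step 1 as the table describes it: select eligible patients from
    # billing data by principal diagnosis, then sample only if the quarter's count meets
    # a threshold (otherwise every case must be abstracted). Codes, threshold, and
    # sample size are hypothetical placeholders.

    import random

    PNEUMONIA_CODES = {"486", "481"}        # hypothetical principal diagnosis codes

    def patients_to_abstract(billing_records, threshold=75, sample_size=60, seed=0):
        eligible = [r["patient_id"] for r in billing_records
                    if r["principal_diagnosis"] in PNEUMONIA_CODES]
        if len(eligible) < threshold:       # below threshold: abstract all cases
            return eligible
        random.Random(seed).shuffle(eligible)
        return eligible[:sample_size]       # at or above threshold: a sample may be drawn

    records = [{"patient_id": i, "principal_diagnosis": "486"} for i in range(40)]
    print(len(patients_to_abstract(records)))   # 40 -> below threshold, no sampling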
Table 3: Resources Used for Abstraction and Data Submission at Eight
Case Study Hospitals:
Qualifications of abstractors;
Case study hospital: A: Medical record coders and a Master of Public
Health;
Case study hospital: B: Registered nurse (RN) and nonclinical;
Case study hospital: C: All RN;
Case study hospital: D: All RN;
Case study hospital: E: Licensed practical nurse (LPN);
Case study hospital: F: Medical records coder and RN with physician
support;
Case study hospital: G: RN and LPN[A];
Case study hospital: H: All RN.
Number of abstractors;
Case study hospital: A: 3;
Case study hospital: B: 3;
Case study hospital: C: 3;
Case study hospital: D: 9;
Case study hospital: E: 2;
Case study hospital: F: 3;
Case study hospital: G: 3;
Case study hospital: H: 4.
Estimated full-time equivalents for abstraction of data elements;
Case study hospital: A: 0.7;
Case study hospital: B: