Nursing Homes
Addressing the Factors Underlying Understatement of Serious Care Problems Requires Sustained CMS and State Commitment
GAO ID: GAO-10-70, November 24, 2009
This is the accessible text file for GAO report number GAO-10-70
entitled 'Nursing Homes: Addressing the Factors Underlying
Understatement of Serious Care Problems Requires Sustained CMS and
State Commitment' which was released on December 28, 2009.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
United States Government Accountability Office:
GAO:
November 2009:
Nursing Homes:
Addressing the Factors Underlying Understatement of Serious Care
Problems Requires Sustained CMS and State Commitment:
GAO-10-70:
GAO Highlights:
Highlights of GAO-10-70, a report to congressional requesters.
Why GAO Did This Study:
Under contract with the Centers for Medicare & Medicaid Services (CMS),
states conduct surveys at nursing homes to help ensure compliance with
federal quality standards. Over the past decade, GAO has reported on
inconsistencies in states' assessment of nursing homes' quality of
care, including understatement--that is, when state surveys fail to
cite serious deficiencies or cite them at too low a level. In 2008, GAO
reported that 9 states had high and 10 had low understatement based on
CMS data for fiscal years 2002 through 2007. This report examines the
effect on nursing home deficiency understatement of CMS's survey
process, workforce shortages and
training, supervisory reviews of surveys, and state agency practices.
GAO primarily collected data through two Web-based questionnaires sent
to all eligible nursing home surveyors and state agency directors,
achieving 61 and 98 percent response rates, respectively.
What GAO Found:
A substantial percentage of both state surveyors and directors
identified general weaknesses in the nursing home survey process, that
is, the survey methodology and guidance on identifying deficiencies. On
the questionnaires, 46 percent of surveyors and 36 percent of directors
reported that weaknesses in the traditional survey methodology, such as
too many survey tasks, contributed to understatement. Limited
experience with a new data-driven survey methodology indicated possible
improvements in consistency; however, an independent evaluation led CMS
to conclude that other tools, such as survey guidance clarification and
surveyor training and supervision, would help improve survey accuracy.
According to questionnaire responses, workforce shortages and greater
use of surveyors with less than 2 years' experience sometimes
contributed to understatement. Nearly three-quarters of directors
reported that they always or frequently experienced a workforce
shortage, while nearly two-thirds reported that surveyor inexperience
always, frequently, or sometimes led to understatement. Substantial
percentages of directors and surveyors indicated that inadequate
training may compromise survey accuracy and lead to understatement.
About 29 percent of surveyors in 9 high understatement states,
compared to 16 percent of surveyors in 10 low understatement states,
reported that initial surveyor training was not sufficient to cite
appropriate scope and severity--a skill critical in preventing
understatement. Furthermore, over half of directors identified the need
for ongoing training for experienced surveyors on both this skill and
on documenting deficiencies, a critical skill to substantiate
citations.
CMS provides little guidance to states on supervisory review processes.
In general, directors reported on our questionnaire that supervisory
reviews occurred more often on surveys with higher-level deficiencies than on
those with lower-level deficiencies, which were the most frequently
understated. Surveyors who reported that survey teams had too many new
surveyors also reported frequent changes to or removal of deficiencies,
indicating heavier reliance on supervisory reviews by states with
inexperienced surveyors.
Surveyors and directors in a few states informed us that, in isolated
cases, state agency practices or external pressure from stakeholders,
such as the nursing home industry, may have led to understatement.
Forty percent of surveyors in five states and four directors reported
that their state had at least one practice not to cite certain
deficiencies. Additionally, over 40 percent of surveyors in four states
reported that their states' informal dispute resolution processes
favored concerns of nursing home operators over resident welfare.
Furthermore, directors from seven states reported that pressure from
the industry or legislators may have compromised the nursing home
survey process, and two directors reported that CMS's support is needed
to deal with such pressure. If surveyors perceive that certain
deficiencies may not be consistently upheld or enforced, they may
choose not to cite them.
What GAO Recommends:
GAO is making seven recommendations to the CMS Administrator to address
state and surveyor issues about CMS's survey methodology and guidance,
workforce shortages and insufficient training, inconsistencies in the
focus and frequency of the supervisory review of deficiencies, and
external pressure from the nursing home industry. CMS concurred with
five of GAO's seven recommendations and indicated it would explore
alternate solutions to the remaining two recommendations.
View [hyperlink, http://www.gao.gov/products/GAO-10-70] or key
components. For more information, contact John E. Dicken at (202) 512-
7114 or dickenj@gao.gov. Also see [hyperlink,
http://www.gao.gov/products/GAO-10-74SP] for summary data from the
questionnaires.
[End of section]
Contents:
Letter:
Background:
Weaknesses in CMS Survey Process Contributed to Understatement, but
Long-Term Effect of New Survey Methodology Is Not Yet Known:
Workforce Shortages and Training Inadequacies May Contribute to
Understatement:
State Supervisory Reviews Often Are Not Designed to Identify
Understatement:
State Agency Practices and External Pressure May Compromise Survey
Accuracy and Lead to Understatement in a Few States:
Conclusions:
Recommendations for Executive Action:
Agency and AHFSA Comments and Our Evaluation:
Appendix I: Scope and Methodology:
Appendix II: Comments from the Department of Health & Human Services:
Appendix III: GAO Contact and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Scope and Severity of Deficiencies Identified during Nursing
Home Surveys:
Table 2: Surveyors' and State Agency Directors' Responses to Questions
on CMS's Survey Process:
Table 3: Percentage of Surveyors Reporting That Guidance for Certain
Federal Quality Standards Was Not Sufficient to Identify Deficiencies:
Table 4: State Agency Directors' Responses to Questions about Surveyor
Workforce Issues:
Table 5: State Survey Agency Vacancy Rates and Percentage of State
Surveyors with Less Than 2 Years' Experience:
Table 6: Surveyors' and State Agency Directors' Responses to Questions
on Workforce Issues:
Table 7: Responses from Surveyors and State Agency Directors to Key
Questions on Training:
Table 8: Percentage of Surveyors Reporting Changes in Deficiency
Citations during Supervisory Review:
Table 9: Response Rates to GAO's Questionnaire of Nursing Home
Surveyors, 2008:
Figures:
Figure 1: Zero-, Low-, and High-Understatement States, Fiscal Years
2002-2007:
Figure 2: Eight State Agency Director Responses on Five Questions
Related to the QIS:
Figure 3: Number of State Supervisory Reviews at the Potential for More
than Minimal Harm (D-F) and Immediate Jeopardy Levels (J-L):
Figure 4: Percentage of Surveyors in Each State Reporting at Least One
Noncitation Practice:
Figure 5: Percentage of Surveyors in Each State Reporting the IDR
Process Favored Concerns of Nursing Home Operators over Resident
Welfare:
Abbreviations:
AHFSA: Association of Health Facility Survey Agencies:
CMS: Centers for Medicare & Medicaid Services:
HHS: Department of Health & Human Services:
IDR: Informal Dispute Resolution:
OSCAR: On-Line Survey, Certification, and Reporting system:
RN: registered nurse:
SMQT: Surveyor Minimum Qualifications Test:
SOM: State Operations Manual:
QIS: Quality Indicator Survey:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
November 24, 2009:
The Honorable Herb Kohl:
Chairman:
Special Committee on Aging:
United States Senate:
The Honorable Charles E. Grassley:
Ranking Member:
Committee on Finance:
United States Senate:
Federal and state governments share responsibility for ensuring that
nursing homes provide quality care in a safe environment for the
nation's 1.5 million residents dependent on such care. The federal
government is responsible for setting quality requirements that nursing
homes must meet to participate in the Medicare and Medicaid programs.
[Footnote 1] The Centers for Medicare & Medicaid Services (CMS), within
the Department of Health & Human Services (HHS), contracts with state
survey agencies to conduct periodic inspections, known as surveys, and
complaint investigations, both of which assess whether homes meet
federal standards.[Footnote 2] State survey agencies are required to
follow federal regulations for surveying facilities; however, several
survey activities and policies are left largely to the discretion of
state survey agencies, including hiring and retaining a surveyor
workforce, training surveyors, reviewing deficiency citations, and
managing regulatory interactions with the industry and public.
In response to congressional requests over the last decade, we have
reported significant weaknesses in federal and state activities
designed to detect and correct quality and safety problems in nursing
homes and the persistence of serious deficiencies, which are those
deficiencies that harm residents or place them at risk of death or
serious injury.[Footnote 3] In the course of our work, we regularly
found significant variation across states in their citations of serious
deficiencies--indicating inconsistencies in states' assessment of
quality of care. We also found evidence of substantial understatement--
that is, state inspections that failed to cite serious deficiencies or
that cited deficiencies at too low a level.
In this report, we complete our response to your request to examine the
understatement of serious deficiencies in nursing homes by state
surveyors nationwide and the factors that contribute to understatement.
Our first report, issued in May 2008, identified the extent of nursing
home understatement nationwide.[Footnote 4] It found that 15 percent of
federal nursing home surveys nationwide, and 25 percent of these
surveys in nine states, identified serious deficiencies that state
surveys had failed to cite. This report examines how the following
factors affect the
understatement of nursing home deficiencies: (1) the CMS survey
process, (2) workforce shortages and training, (3) supervisory reviews,
and (4) state agency practices.
To do this work, we analyzed data collected from two GAO-administered
Web-based questionnaires, one to nursing home surveyors and the other
to state agency directors; analyzed federal and state nursing home
survey results; interviewed CMS officials from the Survey and
Certification Group and selected Regional Offices; reviewed federal
regulations and guidance, and our prior work; and conducted follow-up
interviews with state agency directors, as needed, to clarify and
better understand their unique state circumstances.[Footnote 5]
Our prior work documented the prevalence of understatement nationwide
and described several factors that may contribute to survey
inconsistency and the understatement of deficiencies by state survey
teams: (1) weaknesses in CMS's survey methodology, including poor
documentation of deficiencies,[Footnote 6] (2) confusion among
surveyors about the definition of actual harm,[Footnote 7] (3)
predictability of surveys, which allows homes to conceal problems if
they so desire,[Footnote 8] (4) inadequate quality assurance processes
at the state level to help detect understatement in the scope and
severity of deficiencies,[Footnote 9] and (5) inexperienced state
surveyors as a result of retention problems.[Footnote 10] We relied on
this information and feedback from pretests with six surveyors from a
local state and five current or former state agency directors to
develop our questionnaires on the nursing home survey process and
factors that contribute to the understatement of deficiencies.
Our Web-based questionnaires of nursing home surveyors and state agency
directors achieved response rates of 61 percent and 98 percent,
respectively. The first questionnaire collected responses from 2,340 of
the total 3,819 eligible nursing home surveyors in 49 states and the
District of Columbia.[Footnote 11] The resulting sample of surveyors
who responded to our questionnaire between May and July 2008 was
representative of surveyors nationally, with the exception of
Pennsylvania.[Footnote 12] Fifty state agency directors responded to
the second questionnaire from September to November 2008.[Footnote 13]
Many questions on our questionnaires asked respondents to identify how
frequently an event occurred using the following scale--always,
frequently, sometimes, infrequently, or never; however, for reporting
purposes, we grouped responses into three categories--always/
frequently, sometimes, and infrequently/never. In addition, our
questionnaire to state agency directors asked them to rank the degree
to which several factors, derived from our previous work, contributed
to understatement.[Footnote 14] Summary results from the GAO
questionnaires are available as an e-supplement to this report. See
Nursing Homes: Responses from Two Web-Based Questionnaires to Nursing
Home Surveyors and State Agency Directors (GAO-10-74SP), an e-
supplement to GAO-10-70.
We analyzed the data collected from these questionnaires as stand-alone
datasets and in relationship to state performance on federal
comparative and observational surveys as captured in the federal
monitoring survey database, which we reported on in 2008.[Footnote 15]
In addition, to inform our understanding of the extent to which each
factor contributed to understatement, we examined relationships among
the responses to both questionnaires and the results of the federal
comparative and observational surveys for fiscal years 2002 through
2007. We used the results of the federal comparative surveys for these
years to identify states with high and low percentages of serious
missed deficiencies. We report results for tests of association and
differences between group averages. We also interviewed directors and
other state agency officials in eight states to better understand
unusual or interesting circumstances related to surveyor workforce and
training, supervisory review, or state policies and practices. We
selected these eight state agencies based on our analysis of
questionnaire responses from the state agency directors and nursing
home surveyors.
To compare average facility citations on state survey records with the
average citations on federal survey records, we collected information
from the On-Line Survey, Certification, and Reporting (OSCAR) system
for those facilities where federal teams assessed state surveyor
performance for fiscal years 2002 through 2007.[Footnote 16] Except
where otherwise noted, we used data from fiscal year 2007 because they
were the most recently available data at the time of our analysis (see
appendix I for more on our scope and methodology).
We conducted this performance audit from April 2008 through November
2009 in accordance with generally accepted government auditing
standards. Those standards require that we plan and perform the audit
to obtain sufficient, appropriate evidence to provide a reasonable
basis for our findings and conclusions based on our audit objectives.
We believe that the evidence obtained provides a reasonable basis for
our findings and conclusions based on our audit objectives.
Background:
Oversight of nursing homes is a shared federal-state responsibility. As
part of this responsibility, CMS (1) sets federal quality standards,
(2) establishes state responsibilities for ensuring federal quality
standards are met, (3) issues guidance on determining compliance with
these standards, and (4) performs oversight of state survey activities.
It communicates these federal standards and state responsibilities in
the State Operations Manual (SOM) and through special communications
such as program memorandums and survey and certification letters. CMS
provides less guidance on how states should manage the administration
of their survey programs. CMS uses staff in its 10 regional offices to
oversee states' performance on surveys that ensure that facilities
participating in Medicare and Medicaid provide high-quality care in a
safe environment. Yet, the persistent understatement of serious nursing
home deficiencies that we have reported and survey quality weaknesses
that we and the HHS Office of Inspector General identified serve as
indicators of weaknesses in the federal, state, or shared components of
oversight.
Survey Process:
Every nursing home receiving Medicare or Medicaid payment must undergo
a standard state survey not less than once every 15 months, and the
statewide average interval for these surveys must not exceed 12 months.
During a standard survey, teams of state surveyors--generally
consisting of registered nurses, social workers, dieticians, or other
specialists--evaluate compliance with federal quality standards. The
survey team determines whether the care and services provided meet the
assessed needs of the residents and measures resident outcomes, such as
the incidence of preventable pressure sores, weight loss, and
accidents. In contrast to a standard survey, a complaint investigation
generally focuses on a specific allegation regarding a resident's care
or safety and provides an opportunity for state surveyors to intervene
promptly if problems arise between standard surveys.
Surveyors assess facilities using federal nursing home quality
standards that focus on the delivery of care, resident outcomes, and
facility conditions. These standards total approximately 200 and are
grouped into 15 categories, such as Quality of Life, Resident
Assessment, Quality of Care, and Administration.[Footnote 17] For
example, there are 23 standards (known as F-tags) within the Quality of
Care category ranging from the prevention of pressure sore development
(F-314) to keeping the resident environment as free of accident hazards
as possible (F-323).
Surveyors categorize deficient practices identified on standard surveys
and complaint investigations--facilities' failures to meet federal
standards--according to scope (i.e., the number of residents
potentially or actually affected) and severity (i.e., the degree of
relative harm involved), using a scope and severity grid (see table 1).
Homes with deficiencies at the A through C levels are considered to be
in substantial compliance, while those with deficiencies at the D
through L levels are considered out of compliance. Throughout this
report, we refer to deficiencies at the actual harm and immediate
jeopardy levels--G through L--as serious deficiencies. CMS guidance
requires state survey teams to revisit a home to verify that serious
deficiencies have actually been corrected.[Footnote 18]
Table 1: Scope and Severity of Deficiencies Identified during Nursing
Home Surveys:
Severity: Immediate jeopardy[A];
Scope: Isolated: J;
Scope: Pattern: K;
Scope: Widespread: L.
Severity: Actual harm;
Scope: Isolated: G;
Scope: Pattern: H;
Scope: Widespread: I.
Severity: Potential for more than minimal harm;
Scope: Isolated: D;
Scope: Pattern: E;
Scope: Widespread: F.
Severity: Potential for minimal harm[B];
Scope: Isolated: A;
Scope: Pattern: B;
Scope: Widespread: C.
Source: CMS.
[A] Actual or potential for death/serious injury.
[B] Nursing home is considered to be in "substantial compliance."
[End of table]
In addition, when serious deficiencies are identified, sanctions can be
imposed to encourage facilities to correct the deficiencies and enforce
federal quality standards. Sanctions include fines known as civil money
penalties, denial of payment for new Medicare or Medicaid admissions,
or termination from the Medicare and Medicaid programs. For example,
facilities that receive at least one G through L level deficiency on
successive standard surveys or complaint investigations must be
referred for immediate sanctions. Facilities may appeal cited
deficiencies, and if the appeal is successful, the severity of the
sanction could be reduced or the sanction could be rescinded.
Facilities have several avenues of appeal, including informal dispute
resolution (IDR) at the state survey agency level.[Footnote 19] The IDR
gives providers one opportunity to informally refute cited deficiencies
after any survey. While CMS requires that states have an IDR policy in
place, it does not specify how IDR processes should be structured.
Survey Methodology:
To conduct nursing home surveys, CMS has traditionally used a
methodology that requires surveyors to select a sample of residents and
(1) review data derived from the residents' assessments and medical
records; (2) interview nursing home staff, residents, and family
members; and (3) observe care provided to residents during the course
of the survey. When conducting a survey, surveyors have discretion in:
selecting a sample of residents to evaluate; allocating survey time and
emphasis within a framework prescribed by CMS; investigating
potentially deficient practices observed during the survey; and
determining what evidence is needed to identify a deficient practice.
CMS has developed detailed investigative protocols to assist state
survey agencies in determining whether nursing homes are in compliance
with federal quality standards. These protocols are intended to ensure
the thoroughness and consistency of state surveys and complaint
investigations.
In 1998, CMS awarded a contract to revise the survey methodology. The
new Quality Indicator Survey (QIS) was developed to improve the
consistency and efficiency of state surveys and provide a more reliable
assessment of quality. The QIS uses an expanded sample of residents and
structured interviews with residents and family members in a two-stage
process. Surveyors are guided through the QIS process using customized
software on tablet personal computers. In stage 1, a large resident
sample is drawn and relevant data from on- and off-site sources are
analyzed to develop a set of quality-of-care indicators, which are
compared to national benchmarks.[Footnote 20] Stage 2 systematically
investigates potential quality-of-care concerns identified in stage 1.
Because of delays in implementing the QIS, we recommended in 2003 that
CMS finalize the development, testing, and implementation of a more
rigorous survey methodology, including investigative protocols that
provide guidance to surveyors in documenting deficiencies at the
appropriate scope and severity level.[Footnote 21] CMS concluded a
five-state demonstration of the QIS in 2007 and is currently
expanding its implementation. As of 2008, only Connecticut
had implemented the QIS statewide, and CMS projected that the QIS would
not be fully implemented in every state until 2014.
State Administration:
States are largely responsible for the administration of the survey
program. State survey agencies administer and have discretion over many
survey activities and policies, including hiring and retaining a
surveyor workforce, training surveyors, conducting supervisory reviews
of surveys, and other activities.
* Hiring and Retaining a Surveyor Workforce: State survey agencies hire
the staff to conduct surveys of nursing homes and determine the
salaries of these personnel according to the workforce practices and
restrictions of the state. Salaries, particularly surveyor salaries,
are the most significant cost component of state survey activities,
which are supported through a combination of Medicare, Medicaid, and
non-Medicaid state funds.[Footnote 22] CMS has some requirements for
the make-up of nursing home survey teams, including the involvement of
at least one registered nurse (RN) in each nursing home survey. In
February 2009, we reported that officials from the Association of
Health Facility Survey Agencies (AHFSA) and other state officials told
us they have had difficulty recruiting and retaining the survey
workforce for several years. In our report, we recommended that CMS
undertake a broad-based reexamination to ensure, among other aspects,
an adequate survey workforce with sufficient compensation to attract
and retain qualified staff.[Footnote 23]
* Training: States are responsible for training new surveyors, in part
by having them participate in actual surveys under direct supervision. Within their
first year of employment, surveyors must complete two CMS online
training courses--the Basic Health Facility Surveyor Course and
Principles of Documentation--and a week-long CMS-led Basic Long-Term
Care Health Facility Surveyor Training Course; at the conclusion of the
course surveyors must pass the Surveyor Minimum Qualifications Test
(SMQT) to survey independently. In addition, state survey agencies are
required to have their own programs for staff development that respond
to the need for continuing development and education of both new and
experienced employees. Such staff development programs must include
training for surveyors on all regulatory requirements and the skills
necessary to conduct surveys. To assist in continuing education, CMS
develops a limited number of courses for ongoing training and provides
other training materials.
* Supervisory Reviews: States may design a supervisory review process
for deficiencies cited during surveys, although CMS does not require
them to do so. In July 2003, we recommended that CMS require states to
have a minimum quality-assurance process that includes a review of a
sample of survey reports below the level of actual harm to assess the
appropriateness of scope and severity levels cited and help reduce
instances of understated quality-of-care problems.[Footnote 24] CMS did
not implement this recommendation.[Footnote 25]
* State Agency Practices and Policies: Practices on citing deficiencies
and addressing pressure from the industry or others are largely left to
the discretion of state survey agencies. In the past, we reported that in
one state, CMS officials had found surveyors were not citing all
deficiencies.[Footnote 26] If a state agency fails to cite all
deficiencies associated with noncompliance, nursing home deficiencies
are understated on the survey record. CMS can identify or monitor
states for systematic noncitation practices through reviews of citation
patterns, informal feedback from state surveyors, state performance
reviews, and federal monitoring surveys (discussed below).[Footnote 27]
CMS also gives states latitude in defining their IDR process.
Federal Monitoring Surveys and Evidence of Understatement:
Federal law requires federal surveyors to conduct federal monitoring
surveys in at least 5 percent of state-surveyed Medicare and Medicaid
nursing homes in each state each year. CMS indicates it meets the
statutory requirement by conducting a mix of on-site reviews:
comparative and observational surveys.[Footnote 28]
Comparative surveys. A federal survey team conducts an independent
survey of a home recently surveyed by a state survey agency in order to
compare and contrast its findings with those of the state survey team.
This comparison takes place after completion of the federal survey.
When federal surveyors identify a deficiency not cited by state
surveyors, they assess whether the deficiency existed at the time of
the state survey and should have been cited.[Footnote 29] This
assessment is critical in determining whether understatement occurred,
because some deficiencies cited by federal surveyors may not have
existed at the time of the state survey.
Our May 2008 report stated that comparative surveys found problems at
the most serious levels of noncompliance--the actual harm and immediate
jeopardy levels (G through L).[Footnote 30] About 15 percent of federal
comparative surveys nationwide identified at least one deficiency at
the G through L level that state surveyors failed to cite. While this
proportion is small, CMS maintains that any missed serious deficiencies
are unacceptable. Further, state surveys with understated deficiencies
may allow the surveyed facilities to escape sanctions intended to
discourage repeated noncompliance.
In our May 2008 report we found that for nine states federal surveyors
identified missed serious deficiencies in 25 percent or more of
comparative surveys for fiscal years 2002 through 2007; we defined
these states as high-understatement states (see figure 1). Zero-
understatement states had no federal comparative surveys identifying
missed deficiencies at the actual harm or immediate jeopardy levels;
low-understatement states were the 10 states with the lowest percentage
of missed serious deficiencies (less than 6 percent), including all 7
zero-understatement states.
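The thresholds above can be sketched as a simple classification rule.
This is an illustrative sketch only, not GAO's analysis code; the
function name is an assumption, and treating the 6 percent cutoff as a
strict threshold is a simplification (GAO defined "low" as the 10
states with the lowest percentages, all of which fell below 6 percent).

```python
def classify_understatement(missed_serious_pct: float) -> str:
    """Classify a state by the share of its federal comparative surveys
    (FY 2002-2007) that found at least one missed serious deficiency
    (actual harm or immediate jeopardy, levels G through L).

    Thresholds follow the report: 25 percent or more = high; 0 percent
    = zero (zero states are also counted among the low states); below
    6 percent = low; everything else = mid-range.
    """
    if missed_serious_pct >= 25.0:
        return "high"
    if missed_serious_pct == 0.0:
        return "zero (also low)"
    if missed_serious_pct < 6.0:
        return "low"
    return "mid-range"

# Example: a state where 30 percent of comparative surveys found a
# missed serious deficiency is a high-understatement state.
print(classify_understatement(30.0))   # high
print(classify_understatement(0.0))    # zero (also low)
print(classify_understatement(12.5))   # mid-range
```

As figure 1 shows, applying these cutoffs to the fiscal year 2002-2007
data yields 9 high-understatement, 10 low-understatement (including 7
zero-understatement), and 32 mid-range jurisdictions.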
Figure 1: Zero-, Low-, and High-Understatement States, Fiscal Years
2002-2007:
[Refer to PDF for image: map of the United States]
Zero: Alaska, Idaho, Maine, North Dakota, Oregon, Vermont, West
Virginia.
Low: Arkansas, Nebraska, Ohio.
Mid-range: California, Colorado, Connecticut, Delaware, District of
Columbia, Florida, Georgia, Hawaii, Illinois, Indiana, Iowa, Kansas,
Kentucky, Louisiana, Maryland, Massachusetts, Michigan, Minnesota,
Mississippi, Montana, Nevada, New Hampshire, New Jersey, New York,
North Carolina, Pennsylvania, Rhode Island, Texas, Utah, Virginia,
Washington, Wisconsin.
High: Alabama, Arizona, Missouri, New Mexico, Oklahoma, South Carolina,
South Dakota, Tennessee, Wyoming.
Source: GAO analysis of CMS data. Map: Copyright © Corel Corp. All
rights reserved.
Note: Zero-understatement states were those that had no missed serious
deficiencies on federal comparative surveys. Low-understatement states
were the 10 states with the lowest percentage of missed serious
deficiencies on federal comparative surveys (less than 6 percent),
including all zero-understatement states. High-understatement states
were the 9 states with the highest percentage of serious missed
deficiencies (25 percent or more) on federal comparative surveys.
[End of figure]
Our May 2008 report also found that missed deficiencies at the
potential for more than minimal harm level (D through F) were
considerably more widespread than those at the G through L level on
comparative surveys, with approximately 70 percent of comparative
surveys nationwide identifying at least one missed deficiency at this
level. Undetected care problems at this level are of concern because
they could become more serious over time if nursing homes are not
required to take corrective actions.[Footnote 31]
Observational surveys. Federal surveyors accompany a state survey team
to evaluate the team's performance and ability to document survey
deficiencies. State teams are evaluated in six areas, including two--
General Investigation and Deficiency Determination--that affect the
appropriate identification and citation of deficiencies. The General
Investigation segment assesses the effectiveness of state survey team
actions such as collection of information, discussion of survey
observations, interviews with nursing home residents, and
implementation of CMS investigative protocols. The Deficiency
Determination segment evaluates the skill with which the state survey
teams (1) analyze and integrate all information collected, (2) use the
guidance for surveyors, and (3) assess compliance with regulatory
requirements. Federal observational surveys are not independent
evaluations of the state survey because state surveyors may perform
their survey tasks more attentively than they would if federal
surveyors were not present; however, they provide more immediate
feedback to state surveyors and may help identify state surveyor
training needs.
We previously reported that state survey teams' poor performance on
federal observational surveys in the areas of General Investigation and
Deficiency Determination may contribute to the understatement of
deficiencies.[Footnote 32] Further, poor state performance in these two
areas supported the finding of understatement as identified through the
federal comparative surveys. We found that about 8 percent of state
survey teams observed by federal surveyors nationwide received below-
satisfactory ratings on General Investigation and Deficiency
Determination from fiscal years 2002 through 2007. However, surveyors
in high-understatement states performed worse in these two areas of the
federal observational surveys than surveyors in the low-understatement
states. For example, an average of 12 and 17 percent of state survey
teams observed by federal surveyors in high-understatement states
received below satisfactory ratings for these two areas, respectively.
In contrast, an average of 4 percent of survey teams in low-
understatement states received the same below-satisfactory scores for
both deficiency determination and investigative skills.
Nationwide, one-third of nursing homes had a greater average number of
serious deficiencies on federal observational surveys than on state
standard surveys during fiscal years 2002 through 2007; in eight
states, the figure exceeded half of homes. For this one-third of homes
nationwide, state standard surveys cited 83 percent fewer serious
deficiencies than federal surveys over the same period.
Weaknesses in CMS Survey Process Contributed to Understatement, but
Long-Term Effect of New Survey Methodology Is Not Yet Known:
Over a third of both surveyors and state agency directors responding to
our questionnaire identified weaknesses in the federal government's
nursing home survey process that contributed to the understatement of
deficiencies.[Footnote 33] The weaknesses included problems with the
current survey methodology; written guidance that is too long or
complex; and to a lesser extent, survey predictability or other advance
notice of inspections, which may allow nursing homes to conceal
deficiencies. At the time our questionnaires were fielded, eight states
had started implementing CMS's new survey methodology. The limited
experience among these states suggests that the new methodology may
improve consistency of surveys, but information is limited, and the
long-term ability of the new methodology to reduce understatement is
not yet known.
Weaknesses in CMS's Survey Process Contributed to Understatement:
Both surveyors and state agency directors reported weaknesses in the
survey process, and on our questionnaire linked these weaknesses to
understatement of deficiencies. Nationally, 46 percent of nursing home
surveyors responded that weaknesses in the current survey methodology
resulted in missed or incorrectly identified deficiencies, with this
number ranging by state from 0 to 74 percent (see table 2).[Footnote
34] Thirty-six percent of state agency directors responded that
weaknesses in the current survey methodology at least sometimes
contributed to understatement of deficiencies in their states. One such
weakness identified by both surveyors and directors was the number of
survey tasks that need to be completed.
Table 2: Surveyors' and State Agency Directors' Responses to Questions
on CMS's Survey Process:
Weaknesses in the current survey methodology at least sometimes result
in missed or incorrectly identified deficiencies at the facility:
Surveyors: 46%; Directors: 36%.
Additional training is needed to apply CMS guidance:
Surveyors: 40%; Directors: 58%.
Source: GAO.
[End of table]
[Sidebar: Surveyor Quotation about CMS Written Guidance:
"Appreciate the guidances and protocols. However, making Appendix PP
[guidance for investigating federal quality standards] into a tome is
not helping us out in the field. They are too cumbersome and
voluminous. Please find a way to be more concise in these guidances."
End of sidebar]
According to surveyors and agency directors responding to our
questionnaire, another weakness with the federal survey process
involved CMS's written guidance to help state agencies follow federal
regulations for surveying long-term care facilities.[Footnote 35] Both
surveyors and state agency directors mentioned concerns about the
length, complexity, and subjectivity of the written guidance. One state
agency director we interviewed told us that the size of the SOM made it
difficult for surveyors to carry the guidance and consult it during
surveys. Although the SOM is available in an electronic format,
surveyors in this state did not use laptops. In addition, a small
percentage of surveyors commented on our questionnaire that CMS
guidance was inconsistently applied in the field. A common complaint
from these surveyors was that different supervisors required different
levels of evidence in order to cite a deficiency at the actual harm or
immediate jeopardy level. Forty percent of surveyors and 58 percent of
state agency directors reported that additional training on how to
apply CMS guidance was needed.
A specific concern raised about the current survey guidance was
determining the severity level for an observed deficiency. Forty-four
percent of state agency directors reported on our questionnaire that
confusion about CMS's definition of the actual-harm level severity
requirements at least sometimes contributed to understatement in their
states. CMS's guidance for determining actual harm states, "this does
not include a deficient practice that only could or has caused limited
consequence to the resident."[Footnote 36] State agency directors from
several states found this language confusing, including one director
who said it is unclear whether conditions like dehydration that are
reversed in the hospital should be cited as actual harm. As we reported
in 2003, CMS officials acknowledged that the language linking actual
harm to practices that have "limited consequences" for a resident has
created confusion; however, the agency has not changed or revised this
language.[Footnote 37]
State agency directors and surveyors indicated that CMS's written
guidance for certain federal nursing home quality standards could be
improved and that revised investigative protocols were helpful.
[Footnote 38] Specifically, 11 state agency directors reported that CMS
guidance on quality standards related to abuse could be improved. State
agency directors commented that the guidance for certain quality
standards was too long, with the guidance for two standards being over
50 pages long. One state agency director also noted that overly complex
guidance will lead to an unmanageable survey process. Surveyors'
concerns about the sufficiency of CMS's guidance varied for different
quality standards (see table 3). For instance, 21 percent of surveyors
nationwide reported that CMS guidance on pain management was not
sufficient to identify deficiencies, whereas only 5 percent reported
that guidance on pressure ulcers was not sufficient. Our analysis found
that fewer surveyors had concerns with the guidance on quality
standards revised through CMS's guidance update initiative.[Footnote
39] For example, the guidance on pressure ulcers was revised in 2004
and the guidance on accidents was revised in 2007; these topics ranked
last among the areas of concern.[Footnote 40] Furthermore, state agency
directors from several states commented on the usefulness of CMS's
revised investigative protocols for federal quality standards.
Table 3: Percentage of Surveyors Reporting That Guidance for Certain
Federal Quality Standards Was Not Sufficient to Identify Deficiencies:
Federal quality standard (number): Percentage reporting guidance was
not sufficient:
Pain Management (multiple F-tags)[A,B]: 21.
Quality of Care/Provide Necessary Care and Services for Highest
Practicable Well-Being (F-309)[C]: 20.
Range of Motion Mobility Treatment (F-318): 14.
Accuracy of Resident Assessment (F-278): 13.
Comprehensive Care Plans (F-279): 12.
Sanitary Conditions for Food (F-371)[B]: 12.
Abuse (F-223 through F-226)[D]: 11.
Maintains Body Weight (F-325)[B]: 11.
Physical Restraints (F-221): 11.
Unnecessary Drugs (F-329)[E]: 11.
Resident Participation in Planning Care and Treatment (F-280): 10.
Accidents (F-323)[E]: 8.
Pressure Ulcers (F-314)[E]: 5.
Source: GAO.
[A] CMS consolidated guidance on pain management into F-309 on March
31, 2009.
[B] CMS revised guidance after our questionnaire of nursing home
surveyors was administered in May 2008.
[C] CMS added guidance to F-309 for residents receiving hospice or
dialysis services on April 10, 2009.
[D] CMS plans to begin revising guidance in Fall 2009.
[E] CMS revised guidance before our questionnaire of nursing home
surveyors was administered in May 2008.
[End of table]
Another weakness associated with the federal survey process was the
potential for surveys to be predictable based solely on their timing.
[Footnote 41] Eighteen percent of state agency directors reported that
survey predictability or other advance notice of inspections at least
sometimes contributed to understatement in their states. We analyzed
state agencies' most-recent nursing home surveys and found that 29
percent of these surveys could be considered predictable due to their
timing. We previously reported that survey predictability could
contribute to understatement because it gives nursing homes the
opportunity to conceal deficiencies if they choose to do so.[Footnote
42] CMS officials previously stated that reducing survey predictability
could require increased funding because more surveys would need to be
conducted within 9 months of the previous survey.[Footnote 43] However,
CMS noted that state agencies are not funded to conduct any surveys
within 9 months of the last standard survey.
New Survey Methodology's Effect on Understatement Inconclusive:
There was no consensus among the eight state agency directors who had
started implementing the QIS as of November 2008 about how the new
survey methodology would affect understatement.[Footnote 44] Three
directors reported that the QIS was likely to reduce understatement;
three directors reported that it was not likely to reduce
understatement; and two directors were unsure or had no opinion (see
figure 2). However, all eight directors reported that the new QIS
methodology was likely to improve survey consistency both within and
across states. In addition, five of these directors reported that the
new QIS methodology was likely to improve survey quality. Five of the
eight directors also indicated that the QIS required more time than the
traditional survey methodology.
Figure 2: Eight State Agency Director Responses on Five Questions
Related to the QIS:
[Refer to PDF for image: stacked vertical bar graph]
Improve consistency within this state: Yes: 8; No: 0; Not sure/no
opinion: 0.
Improve consistency across states: Yes: 8; No: 0; Not sure/no
opinion: 0.
Improve quality of nursing home surveys: Yes: 5; No: 1; Not sure/no
opinion: 2.
Require more time to complete: Yes: 5; No: 3; Not sure/no opinion: 0.
Reduce understatement: Yes: 3; No: 3; Not sure/no opinion: 2.
Source: GAO.
CMS funded an independent evaluation of the QIS, which was completed by
a contractor in December 2007.[Footnote 45] The evaluation assessed the
effectiveness of the new methodology by studying (1) its effect on
accuracy of surveys, (2) documentation of deficiencies, (3) time
required to complete survey activities, (4) number of deficiencies
cited, and (5) surveyor efficiency. The evaluation did not draw a firm
conclusion about the overall effectiveness of the QIS as measured
through these five areas. For instance, the QIS methodology was
associated with an increase in the total number of deficiencies cited,
including an increase in the number of G-level deficiencies and the
number of quality standard areas cited. However, the evaluation did not
find that the QIS methodology increased survey accuracy, noting that
QIS and traditional survey samples were comparable in overall quality
and in the frequency of standards cited for deficiencies with either a
pattern or widespread scope.[Footnote 46] The results suggested that
more deficiencies with higher scope could have been cited for both the
QIS and traditional surveys. Similarly, there was no evidence that the
QIS resulted in higher-quality documentation or improved surveyor
efficiency. Although five state agency directors reported that the QIS
required more time to complete than the traditional methodology, the
evaluation found some evidence of a learning curve, suggesting that
surveyors were able to complete surveys faster as they became familiar
with the new process. The evaluation generated a number of
recommendations for improving the QIS that are consistent with reducing
understatement, such as improving the specificity and usability of
investigative protocols and evaluating how accurately the new
methodology identifies areas with potential quality problems. Since the
evaluation did not find improved accuracy, CMS
concluded that non-QIS factors, including survey guidance clarification
and surveyor training and supervision, would help improve survey
accuracy. Additionally, CMS concluded that future QIS development
efforts should concentrate on improving survey consistency and giving
supervisors more tools to assess the performance of surveyor teams.
Ten state agency directors that had not yet started implementing the
QIS responded to our questionnaire with concerns about the cost
associated with implementing the new methodology, including the
resources required to train staff and obtain new equipment.[Footnote
47] Of these 10 directors, 3 also expressed concerns that allotting
staff time for QIS implementation would prevent the agency from
completing mandatory survey activities.
Workforce Shortages and Training Inadequacies May Contribute to
Understatement:
Workforce shortages and training inadequacies affected states' ability
to complete thorough surveys, contributing to understatement of nursing
home deficiencies. Responses to our questionnaires indicated that
states experienced workforce shortages or were attempting to accomplish
their workload with a high percentage of inexperienced surveyors. In
states with fewer staff to do the work, time frames were compressed.
The increased workload burden may have had an effect on the
thoroughness of surveys in those states and surveyors' ability to
attend training. The frequent hiring of new surveyors to address
workforce shortages also burdened states' surveyor training programs.
Surveyors, state agency directors, and state performance on federal
observational surveys indicated that inadequacies in initial and
ongoing training may have compromised survey accuracy in high-
understatement states.
Workforce Shortages Sometimes Contributed to Understatement:
Although a small percentage of state agency directors reported that
workforce shortages always or frequently contributed to the
understatement of nursing home deficiencies in their states, 36 percent
indicated that workforce shortages sometimes contributed to
understatement (see table 4). In many states, workforce shortages
resulted in a greater reliance on inexperienced surveyors. According to
state agency directors and surveyors, this collateral effect--
inexperienced surveyors--also may have contributed to understatement.
States also expressed concern about completing their workload, which
appeared to be, in part, an outgrowth of workforce shortages and use of
inexperienced surveyors.
Table 4: State Agency Directors' Responses to Questions about Surveyor
Workforce Issues:
How frequently do the following issues contribute to understatement in
this state survey agency? (Percentage of state agency directors'
responses)
Inadequate number of staff to complete thorough surveys:
Always/frequently: 6; Sometimes: 36; Infrequently/never: 58.
Inadequate time to complete thorough surveys:
Always/frequently: 8; Sometimes: 38; Infrequently/never: 54.
Reluctance to cite serious deficiencies because of workload burden:
Always/frequently: 8; Sometimes: 10; Infrequently/never: 86.
Inexperienced surveyors not yet comfortable with job responsibilities:
Always/frequently: 16; Sometimes: 48; Infrequently/never: 34.
Source: GAO.
[End of table]
Workforce Shortages. Since 2003, we have reported that states have
experienced pervasive workforce shortages, and responses to our
questionnaires indicate that shortages continue to affect states.
[Footnote 48] Seventy-two percent of state agency directors reported
that they always or frequently had a surveyor workforce shortage, and
another 16 percent said it occurred sometimes. The average vacancy rate
for
surveyors was 14 percent, and one-fourth of states had a vacancy rate
of higher than 19 percent (see table 5).[Footnote 49] Among the 49
reporting states, the vacancy rate ranged from a maximum of 72 percent
in Alabama to 0 percent in Nevada, Rhode Island, Vermont, and Utah. The
workforce shortages have stemmed mostly from the preference to employ
RNs as surveyors in state survey agencies, with half of reporting
states employing RNs as more than 75 percent of their surveyor
workforce.[Footnote 50] In the past, states have claimed that they had
difficulty matching RN salaries offered by the private sector, and this
hampered the hiring and retention of RNs. The Virginia state agency
director commented during an interview that the nursing home industry
values individuals who have passed CMS's SMQT and hires away the
state's surveyors after they are trained and certified by CMS. Virginia
and others also
identified the stress of the job--regular travel, time pressures to
complete the workload, and the regulatory environment--as a challenge
to retaining staff. Previously, we reported that workforce instability
arising from noncompetitive RN surveyor salaries and hiring freezes
affected states' abilities to complete their survey workload or
resulted in the hiring of less-qualified staff.[Footnote 51] Most
recently, the poor economy has further constrained state budgets for
surveyors. For example, to address its budget shortfall in 2009,
California will furlough its state employees, including surveyors, for 2
days every month from February 2009 through June 2010.[Footnote 52] An
additional 11 states also reported furloughs for 2009, and 13 are
considering furloughs, salary reductions, or layoffs or will employ
such measures in the future.
Table 5: State Survey Agency Vacancy Rates and Percentage of State
Surveyors with Less Than 2 Years' Experience:
Average percentage:
Vacancy rate[A]: All states: 14; Low-understatement states: 12;
High-understatement states: 24.
Surveyors with less than 2 years' experience[B]: All states: 30;
Low-understatement states: 25; High-understatement states: 38.
Source: GAO.
[A] Virginia did not provide the information needed to compute a
vacancy rate; it was not a high- or low-understatement state.
[B] Seven states did not report the number of surveyors with less than
2 years of experience. Among the high- and low-understatement states,
only West Virginia, a low-understatement state, did not report this
information.
[End of table]
[Sidebar: Surveyor Quotation about Inexperienced Staff:
"I have been in this department for just over 3 years, and I still do
not feel comfortable with the process. I could personally use a mentor
to ensure a thorough understanding of the process. I don't feel as if I
can accurately identify deficiencies with the short amount of time
given the survey teams to conduct surveys. I feel as if I overlook
things due to trying to meet survey length time frames." End of
sidebar]
Inexperienced Surveyors. Many states are attempting to accomplish their
workload with a larger share of inexperienced surveyors, and state
agency directors sometimes linked this reliance on inexperienced staff
to the understatement of nursing home deficiencies. On average, 30
percent of surveyors had less than 2 years' experience (see table 5);
however, the percentage of inexperienced surveyors ranged from 10 to 82
percent across states that reported this information.[Footnote 53] Among
state agency directors, 16 percent indicated that inexperienced
surveyors always or frequently contributed to understatement, while
another 48 percent indicated that surveyor inexperience sometimes
contributed to understatement in their states. In response to our
questionnaires, 26 percent of surveyors indicated that survey teams
always or frequently had too many inexperienced surveyors and another
33 percent indicated that sometimes survey teams had too many
inexperienced surveyors (see table 6). Half or more of all surveyors in
six states--Alabama, Alaska, Arizona, Idaho, New Mexico, and Utah--
reported that there were always or frequently too many new surveyors
who were not yet comfortable with their job responsibilities. For
example, 79 percent of surveyors in Arizona reported that too many new
surveyors were not comfortable with their job responsibilities, and the
state agency director was among the 34 percent who reported that survey
teams sometimes had an insufficient number of experienced surveyors.
Overall, 26 percent of state agency directors indicated that the skill
level of surveyors has decreased in the last 5 years.
Table 6: Surveyors' and State Agency Directors' Responses to Questions
on Workforce Issues:
(Percentage of responses)
Surveyors: How frequently have you observed the following problems on
the nursing home surveys that you have worked on?
Too many new surveyors not yet comfortable with job responsibilities:
Always/frequently: 26; Sometimes: 33; Infrequently/never: 35.
Survey team too small to conduct a thorough survey:
Always/frequently: 21; Sometimes: 33; Infrequently/never: 42.
State Agency Directors: In this state survey agency, how frequently do
the following occur?
Survey team not given sufficient time to conduct a thorough survey:
Always/frequently: 25; Sometimes: 29; Infrequently/never: 42.
Survey teams have a sufficient number of experienced surveyors:
Always/frequently: 62; Sometimes: 34; Infrequently/never: 4.
Survey teams are of sufficient size to conduct thorough surveys:
Always/frequently: 74; Sometimes: 18; Infrequently/never: 8.
Survey teams are given sufficient time to conduct thorough surveys:
Always/frequently: 78; Sometimes: 16; Infrequently/never: 6.
Source: GAO.
[End of table]
In interviews, six state agency directors commented that inexperienced
surveyors possessed different skills or needed more time than
experienced surveyors to complete surveys and that workforce shortages
resulted in constant recruiting, over-burdened experienced surveyors,
or the need for additional supervision and training resources. Four
states--Kentucky, Nevada, New Mexico, and Virginia--reported not having
enough dedicated training staff to handle the initial training for new
surveyors.
[Sidebar: Surveyor Quotation about Insufficient Time to Complete
Surveys:
"Frequently G level or I/J level are not cited [due to a] lack [of]
staff time." End of sidebar]
Workload. States' inability to complete their workload was, in part, an
outgrowth of the workforce shortages and reliance on inexperienced
surveyors. More than two-thirds of state agency directors reported on
our questionnaire that staffing posed a problem for completing
complaint surveys, and more than half reported that staffing posed a
problem for completing standard or revisit surveys.[Footnote 54] In
addition, 46 percent of state agency directors reported that time
pressures always, frequently, or sometimes contributed to
understatement in their states. In response to our questionnaire, 16
percent of surveyors nationwide reported that workload burden
influenced the citation of deficiencies--including 14 states in which
20 percent or more of surveyors so reported. More than 50 percent of
surveyors identified insufficient team size or time pressures as having
an effect on the thoroughness of surveys. Surveyors' comments
reiterated these concerns--over 15 percent of surveyors who wrote
comments complained about the amount of time allotted to complete
surveys or survey paperwork, and 11 percent indicated that staffing was
insufficient to complete surveys.[Footnote 55] One state agency
director suggested to us that CMS establish a national team of
surveyors to augment states' staff when they fell behind on their workload or
had staffing shortages. He thought the availability of national
surveyors could assist states experiencing workforce shortages and help
ensure state workloads were completed. This state had experience with a
similar arrangement when it hired a national contractor to complete its
surveys of Intermediate Care Facilities for the Mentally Retarded.
Training Inadequacies May Compromise Survey Accuracy:
Surveyors, state agency directors, and state performance on federal
observational surveys indicated that inadequacies in initial or ongoing
training may compromise the accuracy of nursing home surveys and lead
to the understatement of deficiencies. In addition, workload affected
surveyors' ability to attend training.
Initial Surveyor Training. As noted earlier, even though CMS has
established specific training requirements, including coursework and
the SMQT certification test, states are responsible for preparing their
new surveyors for the SMQT. According to CMS, 94 percent of new
surveyors nationally passed the SMQT test in 2008 and, on average,
surveyors answered about 77 percent of the questions correctly. That
nearly all new surveyors passed despite state agency directors'
assertions that initial training was insufficient suggests that the
bar for passing the test may be set too low. Even though we cannot be certain whether
the inadequacies are with the federal or state components of the
training, reported differences among states in satisfaction with the
initial surveyor training also could reflect gaps in state training
programs. About 29 percent of surveyors in high-understatement states
reported that initial training was not sufficient to cite appropriate
scope and severity levels, compared with 16 percent of surveyors in low-
understatement states (see table 7). Similarly, 28 percent of surveyors
in high-understatement states, compared with 20 percent of those in low-
understatement states, indicated that initial training was not
sufficient to identify deficiencies for nursing homes. Further, 18
percent of state agency directors linked the occurrence of
understatement always, frequently, or sometimes with insufficient
initial training. From 16 to 20 percent of state agency directors
indicated that initial training was insufficient to (1) enable
surveyors to identify deficiencies and (2) assign the appropriate level
of scope and severity.
Table 7: Responses from Surveyors and State Agency Directors to Key
Questions on Training:
Initial training is not sufficient: To ensure surveyors are able to
cite appropriate scope and severity levels;
Percentage of surveyors' responses: All states: 24;
Percentage of surveyors' responses: Low-understatement states: 16;
Percentage of surveyors' responses: High-understatement states: 29;
Percentage of directors' responses: 20.
Initial training is not sufficient: To enable surveyors to identify
deficiencies;
Percentage of surveyors' responses: All states: 26;
Percentage of surveyors' responses: Low-understatement states: 20;
Percentage of surveyors' responses: High-understatement states: 28;
Percentage of directors' responses: 16.
Additional training is needed to: Interview nursing home residents;
Percentage of surveyors' responses: All states: 13;
Percentage of surveyors' responses: Low-understatement states: 9;
Percentage of surveyors' responses: High-understatement states: 16;
Percentage of directors' responses: 36[A].
Additional training is needed to: Identify scope and severity levels;
Percentage of surveyors' responses: All states: 26;
Percentage of surveyors' responses: Low-understatement states: 16;
Percentage of surveyors' responses: High-understatement states: 34;
Percentage of directors' responses: 56[B].
Additional training is needed to: Document deficiencies;
Percentage of surveyors' responses: All states: 32;
Percentage of surveyors' responses: Low-understatement states: 27;
Percentage of surveyors' responses: High-understatement states: 35;
Percentage of directors' responses: 62.
Source: GAO.
[A] Two state agency directors did not respond to this question.
[B] One state agency director did not respond to this question.
[End of table]
Ongoing Training. Ongoing training programs are the purview of state
agencies; therefore, differences among states in the sufficiency of
this training also may point to gaps in state training programs.
On our questionnaire, about 34 percent of surveyors in high-
understatement states indicated a need for additional training on (1)
identifying appropriate scope and severity levels and (2) documenting
deficiencies. This was significantly more than the percentages of
surveyors in low-understatement states who indicated such a need--16
and 27 percent, respectively. Among state
agency directors, 10 percent attributed understatement always or
frequently to insufficient ongoing training, while 14 percent indicated
that insufficient ongoing training sometimes gave rise to
understatement. Although 74 percent of state agency directors indicated
that the state had ongoing annual training requirements, the required
number of hours and the type of training varied widely by state in
2007. Among the 33 states that reported a required amount of annual
state training, hours ranged from 0 to 120 per year. Meanwhile, 37
states reported one or more types of required training: 32
states required surveyors to attend periodic training, 22 required on-
the-job training, 10 required online computerized training, and 13
states required some other type of training.
State agency directors indicated that they relied on CMS materials for
ongoing training of experienced surveyors, yet many reported additional
training needs and suggested that use of electronic media could make
continuing education and new guidance more accessible. While 98 percent
of states indicated that the CMS written guidance materials and
resources were useful, over 50 percent of all state agency directors
identified additional training needs in documenting deficiencies,
citing deficiencies at the appropriate scope and severity level, and
applying CMS guidance. On federal observational surveys, an average of
17 percent of survey teams in high-understatement states received
below-satisfactory ratings for Deficiency Determination, and an
average of 12 percent received below-satisfactory ratings for General
Investigation--two skills critical for preventing understatement. In
contrast, an average of 4 percent of survey teams in low-understatement
states received below-satisfactory ratings for both skills.
Furthermore, of the 476 surveyors who commented about training needs,
one-quarter indicated a need for training support from either CMS or
state agencies, and from 7 to 12 percent identified specific topics
such as documenting deficiencies, identifying scope and severity, CMS
guidance, and medical knowledge.
[Footnote 56]
Inability to Attend Training. States' workload requirements and
workforce shortages affected the surveyors' ability to attend initial
and ongoing training. Seven of the eight state agency directors we
interviewed linked workforce shortages and resource constraints to
their state's ability to complete the survey workload or allow staff to
participate in training courses. One director stated that workload
demands compromised comprehensive training for new staff, and another
reported difficulty placing new staff in CMS's initial training
programs. Due to workload demands, a third state agency director stated
that she could not allow experienced staff time away from surveying to
attend training courses even when staff paid their own way. Five of
these seven state agency directors suggested that it would be more
efficient for training activities to be conducted locally, such as in
their states, or to be made available through online, video, or other
electronic media, and several emphasized the need to reduce or
eliminate travel for training. Although four states also expressed a preference for
interactive training opportunities, one state believed that
technological solutions could allow for more accessible training that
was also interactive.
State Supervisory Reviews Often Are Not Designed to Identify
Understatement:
State supervisory reviews, which generally occurred more frequently on
higher-level deficiencies, often are not designed to identify
understated deficiencies. State agencies generally conducted more
supervisory reviews on surveys with higher-level deficiencies, compared
to surveys with deficiencies at the potential for more than minimal
harm level (D through F)--the deficiencies most likely to be
understated. While focus on higher-level deficiencies enables states to
be certain that such deficiencies are well documented, not reviewing
surveys with deficiencies at lower levels represents a missed
opportunity to ensure that all serious deficiencies are cited. State
surveyors who reported having frequent changes made to their survey
reports during supervisory reviews also more often reported they were
burdened by other factors contributing to understatement, such as
workforce shortages and survey methodology weaknesses.
Supervisory Reviews Often Focused on Higher-Level Deficiencies:
According to state agency directors' responses to our questionnaire,
states generally focused supervisory review on surveys with higher-
level deficiencies, rather than on the surveys with deficiencies at the
potential for more than minimal-harm level (D through F)--the
deficiencies most likely to be understated. During supervisory reviews,
either direct-line supervisors or central state agency staff may review
draft survey records.[Footnote 57] On average, surveys at the D through
F level underwent about two steps of review, while surveys with
deficiencies at the immediate jeopardy level (J through L) went through
three steps.[Footnote 58] For example, Washington reviews its surveys
using either a two-step review that includes survey team and field
manager reviews or a three-step process that includes both these
reviews and an additional review by central state agency staff for
serious deficiencies. As a result, central state agency staff in
Washington do not review deficiencies below the level of actual harm.
In addition, we found that five states--Alaska, Hawaii, Illinois,
Nebraska, and Nevada--did not review all surveys with deficiencies at
the D through F levels. In fact, Hawaii did not report supervisory
review of deficiencies at any level (see figure 3).[Footnote 59] It is
difficult to know if additional supervisory reviews--the second, third,
or fourth review--help make survey records more accurate and less
likely to be understated, or if these reviews result in more frequent
changes to deficiency citations. However, if deficiency citations with
the potential for more than minimal-harm level (D through F) are not
reviewed, states miss the opportunity to assess whether these
deficiencies warrant a higher-level citation, for example, the level of
actual harm or immediate jeopardy.
Figure 3: Number of State Supervisory Reviews at the Potential for More
than Minimal Harm (D-F) and Immediate Jeopardy Levels (J-L):
[Refer to PDF for image: two maps of the U.S. with associated data]
States with D-F supervisory reviews:
Alabama: 4;
Alaska: 0;
Arizona: 3;
Arkansas: 2;
California: 2;
Colorado: 4;
Connecticut: 4;
Delaware: 2;
Florida: 3;
Georgia: 1;
Hawaii: 0;
Idaho: 1;
Illinois: 0;
Indiana: 1;
Iowa: 4;
Kansas: 2;
Kentucky: 3;
Louisiana: 1;
Maine: 4;
Maryland: 2;
Massachusetts: 1;
Michigan: 1;
Minnesota: 2;
Mississippi: 2;
Missouri: 3;
Montana: 2;
Nebraska: 0;
Nevada: 0;
New Hampshire: 3;
New Jersey: 2;
New Mexico: 3;
New York: 3;
North Carolina: 1;
North Dakota: 3;
Ohio: 3;
Oklahoma: 2;
Oregon: 2;
Pennsylvania: 4;
Rhode Island: 4;
South Carolina: 2;
South Dakota: 2;
Tennessee: 3;
Texas: 4;
Utah: 1;
Vermont: 2;
Virginia: 1;
Washington: 2;
West Virginia: 1;
Wisconsin: 3;
Wyoming: 1.
States with J-L supervisory reviews:
Alabama: 4;
Alaska: 1;
Arizona: 3;
Arkansas: 3;
California: 3;
Colorado: 4;
Connecticut: 4;
Delaware: 2;
Florida: 5;
Georgia: 1;
Hawaii: 0;
Idaho: 1;
Illinois: 4;
Indiana: 5;
Iowa: 4;
Kansas: 3;
Kentucky: 4;
Louisiana: 3;
Maine: 3;
Maryland: 3;
Massachusetts: 2;
Michigan: 3;
Minnesota: 5;
Mississippi: 2;
Missouri: 3;
Montana: 2;
Nebraska: 1;
Nevada: 0;
New Hampshire: 3;
New Jersey: 3;
New Mexico: 2;
New York: 5;
North Carolina: 4;
North Dakota: 3;
Ohio: 4;
Oklahoma: 2;
Oregon: 3;
Pennsylvania: 5;
Rhode Island: 4;
South Carolina: 3;
South Dakota: 2;
Tennessee: 6;
Texas: 5;
Utah: 2;
Vermont: 2;
Virginia: 1;
Washington: 2;
West Virginia: 1;
Wisconsin: 6;
Wyoming: 1.
Source: GAO. Map: Copyright © Corel Corp. All rights reserved.
Note: Hawaii did not report conducting supervisory reviews. Forty
states review a sample of all draft surveys. Such reviews may include
additional examination of surveys with deficiencies at either the D
through F or J through L levels.
[End of figure]
Because a majority of states are organized into geographically based
district or regional offices, review by central state agency staff,
particularly quality assurance staff, is critical to help ensure
consistency and detect understatement. However, 26 states reported that
no central state agency staff reviews were conducted for surveys with
deficiencies at the potential for more than minimal harm (D through F).
These results are consistent with a finding from our 2003 report--that
half of the 16 states we contacted for that report did not have a
quality assurance process to help ensure that the scope and severity of
less serious deficiencies were not understated.[Footnote 60]
According to most of the eight state officials we interviewed,
supervisory reviews commonly focused on documentation principles or
evidentiary support, not on reducing understatement. For example, all
eight states used supervisory reviews to assess the accuracy and
strength of the evidence surveyors used to support deficiency
citations, and three of these states reported that they emphasized
reviewing survey records for documentation principles. Furthermore,
seven out of eight states indicated that surveys with serious
deficiencies--those that may be subject to enforcement proceedings--
went through additional steps of review compared with surveys citing
deficiencies with the potential for more than minimal harm (D through
F).
Reports of Changes to Deficiencies during Supervisory Reviews May Be
Related to Other Factors That Contribute to Understatement:
Surveyor reports of changes to deficiency citations during supervisory
reviews may be related to other factors the state is experiencing that
also contribute to understatement, such as workforce shortages and
survey methodology weaknesses.
[Sidebar: Surveyor Quotation about Supervisory Review:
"We have problems at times with nonclinical supervisors and district
managers, [and with] a past branch chief not understanding clinical
issues and thus not supporting surveyor findings. We've had
deficiencies tossed out for surveys and IDR deficiencies deleted, not
for lack of documentation, but for lack of understanding of the issues
involved." End of sidebar]
Changes to Deficiencies. Fifty-four percent of surveyors nationwide
reported on our questionnaire that supervisors at least sometimes
removed the deficiency that was cited, and 53 percent of surveyors
noted that supervisors at least sometimes changed the scope and
severity level of cited deficiencies. Thirteen percent of surveyors
reported that supervisors always or frequently removed
deficiencies--including 12 states with 20 percent or more of their
surveyors reporting that deficiencies were removed.
Surveyor reports of changes in deficiency citations alone make it
difficult to know whether the original deficiency citation or the
supervisor's revised citation was a more accurate reflection of a
nursing home's quality of care. Additionally, there are many reasons
that survey records might be changed during supervisory review. When a
surveyor fails to provide sufficient evidence for deficient practices,
it may be difficult to tell whether the deficiency was not
appropriately cited or if the surveyor did not collect all the
available evidence. Kentucky's state agency director offered one
possible explanation--that changes to surveys often reflected a need
for more support for the deficiencies cited, such as additional
evidence from observations. Nevada's state agency director stated that
changes to survey records often occurred when it was too late to gather
more evidence in support of deficiencies.
Surveyors who reported that supervisors frequently changed deficiencies
also more often reported experiencing other factors that contribute to
understatement. We found associations between surveyor reports of
changes to deficiencies and workforce shortages and survey methodology
weaknesses.
* Workforce shortages. Surveyors reporting workforce shortages,
including survey teams with too many new surveyors and survey teams
that were either too small or given insufficient time to conduct
thorough surveys, more often also reported that supervisors frequently
removed deficiencies or changed the scope and severity of deficiency
citations during supervisory reviews.
* Survey methodology weaknesses. Surveyors reporting weaknesses in the
current survey methodology more often also reported that supervisors
frequently removed deficiencies or changed the scope and severity of
deficiency citations during supervisory reviews.
Supervisory Reviews and Understatement. In certain cases, survey agency
directors and state performance on federal comparative surveys linked
supervisory reviews to understatement. Twenty-two percent of state
agency directors reported that inadequate supervisory review processes
at least sometimes contributed to understatement in their
states.[Footnote 61] In
addition, significant differences existed between zero-understatement
states and all other states, including high-understatement states, in
the percentage of surveyors reporting frequent changes to citations
during supervisory reviews. Only about 4 to 5 percent of surveyors in
zero-understatement states reported that supervisors always or
frequently removed or changed deficiencies or changed the scope and
severity cited, while about 12 to 13 percent of surveyors in all other
states indicated the same (see table 8). In response to such concerns,
Nevada recently reduced its review process from two steps to a single
step of review by survey team supervisors.
Table 8: Percentage of Surveyors Reporting Changes in Deficiency
Citations during Supervisory Review:
Surveyors reporting that: Supervisors always or frequently remove or
change deficiency cited;
Percentage of surveyors' responses: Zero-understatement states: 5;
Percentage of surveyors' responses: All other states: 13.
Surveyors reporting that: Supervisors always or frequently changed the
scope and severity cited;
Percentage of surveyors' responses: Zero-understatement states: 4;
Percentage of surveyors' responses: All other states: 12.
Source: GAO.
[End of table]
In addition, we observed a relationship between state practices to
notify surveyors of changes made during supervisory reviews and
surveyor reports of deficiency removal and explanation of changes.
Specifically, compared to surveyors in states that require supervisors
to notify surveyors of changes made during supervisory review,
surveyors from states where no notification is required reported more
often that supervisors removed deficiencies and less often that
explanations for these changes, when given, were reasonable.
Similarly, we found an association between the frequency of explained
and reasonable changes and zero-understatement states, possibly
demonstrating the positive effect of practices to notify surveyors of
changes made during supervisory reviews. Nursing home surveyors from
zero-understatement states more often reported that supervisors
explained changes and that their explanations seemed reasonable
compared to surveyors in all other states. State agency directors in
Massachusetts and New Mexico stated that explanations of changes to the
survey record provided opportunities for one-on-one feedback to
surveyors and discussions about deficiencies being removed.
State Agency Practices and External Pressure May Compromise Survey
Accuracy and Lead to Understatement in a Few States:
Nursing home surveyors and state agency directors in a minority of
states told us that in isolated cases issues such as a state agency
practice of noncitation, external pressure from the nursing home
industry, and an unbalanced IDR process may have led to the
understatement of deficiencies. In a few states, surveyors more often
identified problems with noncitation practices and IDR processes
compared to state agency directors. Yet, a few state agency directors
acknowledged either noncitation practices, external pressure, or an IDR
process that favored nursing home operators over resident welfare.
Although not all the issues raised by surveyors were corroborated by
the state agency directors in their states, the clustering of surveyor
reports in a few states gives credence to the notion that such
conditions may lead to understatement.
Surveyors Reported Noncitation Practices in a Small Number of States:
[Sidebar: Surveyor Quotation about State Noncitation Practices:
"I have been criticized by my supervisor on more than one occasion for
citing too many deficiencies at facilities [that] have an ongoing
history of repeat tags from survey to survey and many complaints
surveys between annual surveys. My supervisor states that citing too
many deficiencies 'confuses' the facility and creates a 'hostile'
environment." End of sidebar]
Approximately 20 percent of surveyors nationwide and over 40 percent of
surveyors in five states reported that their state agency had at least
one of the following noncitation practices: (1) not citing certain
deficiencies, (2) not citing deficiencies above a certain scope and
severity level, and (3) allowing nursing homes to correct deficiencies
without receiving a citation (see figure 4). On our questionnaire,
only four state agency directors acknowledged the existence of such
practices in their states, and only one of them was from the five
states most often identified by surveyors. One of these directors
commented on our questionnaire that one of these practices occurs only
in "rare individual cases." Another director commented that a
particular federal quality standard is not related to patient outcome
and therefore should not be cited above a Level F. According to CMS
protocols, when noncompliance with a federal requirement has been
identified, the state agency should cite all deficiencies associated
with the noncompliance. CMS regional officials we interviewed were not
aware of any current statewide noncitation practices.[Footnote 62]
Figure 4: Percentage of Surveyors in Each State Reporting at Least One
Noncitation Practice:
[Refer to PDF for image: U.S. map and associated data]
No data:
Pennsylvania[A].
0-10 percent:
Alabama;
Alaska;
Georgia;
Hawaii;
Idaho;
Montana;
Oklahoma;
Tennessee;
Vermont;
West Virginia.
11-20 percent:
Colorado;
Connecticut;
District of Columbia;
Florida;
Illinois;
Iowa;
Kentucky;
Maine;
Massachusetts;
Michigan;
Minnesota;
Mississippi;
Missouri;
New York;
North Dakota;
Rhode Island;
South Carolina;
Virginia;
Washington;
Wisconsin.
21-30 percent:
Arkansas;
California;
Indiana;
Louisiana;
Maryland;
New Jersey;
North Carolina;
Ohio;
Oregon;
South Dakota;
Texas;
Utah;
Wyoming.
31-40 percent:
Arizona;
Kansas.
41 percent or more:
Delaware;
Nebraska;
Nevada;
New Hampshire;
New Mexico.
Source: GAO. Map: Copyright © Corel Corp. All rights reserved.
[A] Responses from Pennsylvania surveyors could not be included because
the state agency directed nursing home surveyors not to respond to our
questionnaire.
[End of figure]
Not citing certain deficiencies. Nationally, 9 percent of surveyors
reported a state agency practice that surveyors not cite certain
deficiencies. However, in four states over 30 percent of surveyors
reported their state agency had this noncitation practice, including
over 60 percent of New Mexico surveyors. In some cases, surveyors
reported receiving direct instructions from supervisors not to cite
certain deficiencies. In other cases, surveyors' reports of noncitation
practices may have been based on their interpretation of certain
management practices. For instance, surveyors commented that some state
agency practices--such as providing inadequate time to observe and
document deficiencies or frequently deleting deficiency citations
during supervisory review--seemed like implicit or indirect leadership
from the agency to avoid citing deficiencies. One state agency director
we interviewed agreed that surveyors may report the existence of
noncitation practices when their citations are changed during
supervisory review. This official told us that when surveyors'
deficiencies are deleted or downgraded, the surveyors may choose not to
cite similar deficiencies in the future because they perceive being
overruled as an implicit state directive not to cite those
deficiencies.
Not citing deficiencies above a certain scope and severity level.
Although nationwide less than 8 percent of surveyors reported a state
agency practice that surveyors not cite deficiencies above a certain
scope and severity level, in two states over 25 percent of surveyors
reported that their state agency used this type of noncitation
practice. One reason state agencies might use this noncitation
practice is to help manage the agency's workload. In particular, citing
deficiencies at a lower scope and severity might help the agency avoid
additional work associated with citing higher-level deficiencies, such
as survey revisits or IDR.[Footnote 63] In one of the two states
mentioned above, 54 percent of surveyors indicated that the workload
burden influenced their citations. Additionally, as we described
earlier, 16 percent of surveyors nationwide indicated that workload
burden influenced the citation of deficiencies and more than half of
state agency directors (including those from the two states mentioned
above) responded that staffing was not sufficient to complete revisit
surveys. While our questionnaire focused on not citing deficiencies
above a certain scope and severity level, a few surveyors commented on
being discouraged from citing lower-level deficiencies due to time
pressures to complete surveys. Agency officials in two states told us
that surveyors may miss some deficiencies due to limited survey time
and resources.
Allowing nursing homes to correct deficiencies without citing them on
the survey record. Nationwide, approximately 12 percent of surveyors
reported this type of noncitation practice. However, in five states, at
least 30 percent of surveyors reported their state agency allowed
nursing homes to correct deficiencies without citing those deficiencies
on the official survey record. Comments from surveyors suggest that
state agencies may use this type of practice to avoid actions that
nursing homes or the industry would dispute or interpret as excessive.
Similarly, several surveyors commented that they were instructed to
cite only one deficiency for a single type of negative outcome, even
when more than one problem existed. However, CMS guidance requires
state agencies to cite all problems that lead to a negative outcome.
The decrease in G-level citations that occurred after CMS implemented
the double G immediate sanctions policy in January 2000 also suggests
that some states may have avoided citing deficiencies that would result
in enforcement actions for the nursing home.[Footnote 64] The total
number of G-level deficiency citations nationwide dropped from
approximately 10,000 in 1999 to 7,700 in 2000.[Footnote 65]
State Agency Directors in a Few States Reported That External Pressure
Contributed to Understatement:
[Sidebar: Surveyor Quotation about External Pressure:
"The larger corporations often pressure our [state agency] Central
Office to change and delete citations. Our Central Office changes not
only wording but content and intent of the citation, when they were not
on site. There is a great deal of political push and pull--the
interference from State Senators and Representatives protecting their
re-electability and not the rights of the residents (who don't vote)."
End of sidebar]
State agency directors from 12 states reported experiencing external
pressure from at least one of the following stakeholder groups: (1) the
nursing home surveyed, (2) the nursing home industry, or (3) state or
federal legislators. Examples of such external pressure include
pressure to reduce federal or state nursing home regulation or to
delete specific deficiencies cited by the state agency. Of the 12 state
agency directors, 7 reported that external pressure at least sometimes
contributed to the understatement of deficiencies in their states,
while the other 5 indicated that it infrequently or never contributed
to understatement.
Adversarial attitude toward nursing home surveys. Officials from two
states we interviewed--State A and State B--commented on the
adversarial attitude that industry and legislative representatives at
times had toward nursing home surveys.[Footnote 66] For instance, state agency officials
from State A told us that the state nursing home association organized
several forums to garner public and legislative support for curtailing
state regulation of facilities. According to officials in this state,
the influential industry groups threatened to request legislation to
move the state agency to a different department and to deny the
confirmation of the director's gubernatorial appointment if the
citations of G level or higher deficiencies increased. CMS regional
office officials responsible for State A told us that the state may be
experiencing more intense external pressure this year given the current
economy, because providers have greater concerns about the possible
financial implications of deficiency citations--fines or increased
insurance rates.
Similarly, officials from State B told us that when facilities are
close to termination, the state agency receives phone calls from state
delegates questioning the agency's survey results. Officials from State
B also told us that the Governor's office instructed the state agency
not to recommend facilities for enforcement actions. Officials from the
CMS regional office responsible for State B told us that this situation
was not problematic because CMS was ultimately responsible for
determining enforcement actions based on deficiency citations. However,
this regional office's statement is inconsistent with (1) language in
the SOM that calls for states to recommend enforcement actions to the
regional office, and (2) assertions from the regional office
responsible for State A that it infrequently disagrees with state
recommendations for sanctions. A third state agency director commented
that the agency had been called before state legislative committees in
2007-2008 to defend deficiency citations that led to the termination of
facilities. A fourth state agency director also commented on our
questionnaire that legislators had pressured the state agency on behalf
of nursing homes to get citations reduced or eliminated and prevent
enforcement actions for the facilities. In addition, a few surveyors
commented that, when nursing homes were unhappy with their survey
results, the homes or their state legislators would at times ask state
agency management to remove the citations from the survey record,
resulting in the deletion or downgrading of deficiencies. Further,
comments from a few surveyors indicated that they may steer clear of
citing deficiencies when they perceive the citation might cause a home
to complain or exert pressure for changes in the survey record.
Interference in the survey process. In a few cases, external pressure
appeared to directly interfere with the nursing home survey process.
State agency officials from two states--State A and an additional fifth
state--reported that state legislators or industry representatives had
appeared on-site during nursing home surveys. Although in some cases
the legislators just observed the survey process, officials from these
two states explained that third parties also have interfered with the
process by questioning or intimidating surveyors. The state agency
director from the fifth state commented on our questionnaire that the
nursing home industry sent legal staff on-site during surveys to
interfere with the survey process. Similarly, officials from State A
told us that during one survey, a home's lawyer was on-site reviewing
nursing home documentation before surveyors were given access to these
documents. Officials from State A also told us that state legislators
have attended surveys to question surveyors about their work and
whether state agency executives were coercing them to find
deficiencies. We discussed this issue with the CMS regional officials
responsible for State A, who acknowledged that this type of
interference had occurred.
States' need for support from CMS. In the face of significant external
pressure, officials from States A and B suggested that they need
support from CMS; however, CMS regional office officials did not always
acknowledge external pressure reported by the states. This year, State
A terminated a survey due to significant external pressure from a
nursing home and requested that the CMS regional office complete the
revisit survey for them. Six weeks later, the federal team completed
the survey and found many of the same problems that this state team had
previously identified before it stopped the survey. Officials from
State A suggested the need for other support as well, such as a federal
law that would require state agencies to report external pressure,
ensure whistleblower protections for state officials who report such
pressure, and allow sanctions for inappropriate conduct. CMS
officials from the regional office responsible for State A stated that
external pressure might indirectly contribute to understatement by
increasing surveyor mistakes from the additional stress, workload,
focus on documentation, and supervisory reviews. Conversely, CMS
regional officials did not acknowledge that State B experienced
external pressure and officials from State B thought that CMS should be
more consistent in its requirements and enforcement actions.
Unbalanced IDR Processes Might Have Contributed to Understatement in a
Few States:
States with unbalanced IDR processes may experience more
understatement. IDR processes vary across states in structure, volume
of proceedings, and resulting changes. According to state agency
directors' responses to our questionnaire, 16 IDRs were requested per
100 homes in fiscal year 2007, with this number ranging among states
from 0 to 57 per 100 homes.[Footnote 67] For IDRs occurring in fiscal
year 2007, 20 percent of disputed deficiencies were deleted and 7
percent were downgraded in scope or severity, but in four states, at
least 40 percent of disputed deficiencies were deleted through this
process.[Footnote 68] CMS does not provide protocols on how states
should operate their IDR processes, leaving IDR operations to state
survey agencies' discretion. For example, states may choose to conduct
IDR meetings in writing, by telephone, or through face-to-face
conferences. State agencies also have the option to involve outside
entities, including legal representation, in their IDR operations.
[Sidebar: Surveyor Quotation about the IDR Process:
"The IDR process is inconsistent. Over the years we have been on all
ends of the spectrum--between having involved panels who have an
understanding of the survey process versus people who know nothing
about the process and have no idea how to apply the federal
regulations. (The latter is the current make-up.) When we have a panel
made up of the latter, the word spreads throughout the state and there
is a very large increase in requests for IDR. The reason is that this
type of panel tends to delete most everything for 'insufficient
evidence' but cannot coherently explain how they came [to] that
decision." End of sidebar]
On the basis of responses from surveyors and state agency directors
clustered in a few states, problems with the IDR processes--such as
frequent hearings, deficiencies that are frequently deleted or
downgraded through the IDR process, or outcomes that favor nursing home
operators over resident welfare--may have contributed to the
understatement of deficiencies in those states. Although reports of
such problems were not common--only 16 percent of surveyors nationwide
reported on our questionnaire that their state's IDR process favored
nursing home operators--in four states over 40 percent of surveyors
reported that their IDR process favored nursing home operators (see
figure 5), including one state where a substantial percentage of
surveyors identified at least one noncitation practice. While only one
state agency director reported that the IDR process favored nursing
home operators, three other directors acknowledged that frequent IDR
hearings at least sometimes contributed to the understatement of
deficiencies. For example, in some states surveyors may hesitate to
cite deficiencies that they believe will be disputed by the nursing
home.
Figure 5: Percentage of Surveyors in Each State Reporting the IDR
Process Favored Concerns of Nursing Home Operators over Resident
Welfare:
[Refer to PDF for image: U.S. map and associated data]
No data: Pennsylvania[A].
0-10% of Surveyors: Alaska, Arkansas, California, Colorado,
Connecticut, District of Columbia, Georgia, Hawaii, Idaho, Illinois,
Indiana, Kansas, Maine, Michigan, Mississippi, Montana, New Hampshire,
New Mexico, North Dakota, Oklahoma, Oregon, Rhode Island, South Dakota,
Utah, Vermont, Wisconsin.
11-20% of Surveyors: Alabama, Arizona, Florida, Iowa, Minnesota,
Missouri, Nevada, New York, North Carolina, South Carolina, Tennessee,
West Virginia.
21-30% of Surveyors: Kentucky, Maryland, Nebraska, New Jersey, Ohio,
Texas, Washington, Wyoming.
31-40% of Surveyors: None.
41% or more of Surveyors: Delaware, Louisiana, Massachusetts, Virginia.
Source: GAO. Map: Copyright © Corel Corp. All rights reserved.
[A] Responses from Pennsylvania surveyors could not be included because
the state agency directed nursing home surveyors not to respond to our
questionnaire.
[End of figure]
In isolated cases, a lack of balance in the IDR process appeared to
result from external pressure. In one state, the state agency
director reported that the nursing home industry sent association
representatives to the IDR, which increased the contentiousness of the
process. In another state, officials told us that a large nursing home
chain worked with the state legislature to set up an alternative to the
state IDR process, which has been used only by facilities in this
chain. Through this alternative appeals process, both the state agency
and the nursing home have legal representation, and compliance
decisions are made by an adjudicator. According to agency officials in
this state, the adjudicators for this alternative appeals process do
not always have health care backgrounds. While CMS gives states the
option to allow outside entities to conduct the IDR, the states should
maintain ultimate responsibility for IDR decisions.[Footnote 69] CMS
regional officials stated that the agency would not consider the
outcome of this alternative appeals process when assessing deficiencies
or determining enforcement actions. Regardless, these actions may have
affected
surveyors' perceptions of the balance of the states' IDRs, because over
twice the national average of surveyors in this state reported that
their IDR process favored nursing home operators.
Conclusions:
Reducing understatement is critical to protecting the health and safety
of vulnerable nursing home residents and ensuring the credibility of
the survey process. Federal and state efforts will require a sustained,
long-term commitment because understatement arises from weaknesses in
several interrelated areas--including CMS's survey process, surveyor
workforce and training, supervisory review processes, and state agency
practices and external pressure.
* Concerns about CMS's Survey Process. Survey methodology and guidance
are integral to reliable and consistent state nursing home surveys, and
we found that weaknesses in these areas were linked to understatement
by both surveyors and state agency directors. Both groups reported
struggling to interpret existing guidance, and differences in
interpretation were linked to understatement, especially in determining
what constitutes actual harm. Surveyors noted that the current survey
guidance was too lengthy, complex, and subjective. Additionally, they
had fewer concerns about care areas for which CMS has issued revised
interpretive protocols. In its development of the QIS, CMS has taken
steps to revise the nursing home survey methodology. However,
development and implementation of the QIS in a small group of states
has taken approximately 10 years, and full implementation of the new
methodology is not expected to be completed until 2014. Experience with
the QIS was mixed regarding improvements in survey quality,
and the independent evaluation generated a number of recommendations
for improving the QIS. CMS concluded that it needed to focus future QIS
development efforts on improving survey consistency and giving
supervisors more tools to assess performance of surveyor teams.
* Ongoing Workforce and Surveyor Training Challenges. Workforce
shortages in state survey agencies increase the need for high-quality
initial and ongoing training for surveyors. Currently, high vacancy
rates can place pressure on state surveyors to complete surveys under
difficult circumstances, including compressed time frames, inadequately
staffed survey teams, and too many inexperienced surveyors. States are
responsible for hiring and retaining surveyors and have grappled with
pervasive and intractable workforce shortages. State agency directors
struggling with these workforce issues reported the need for more
readily accessible training for both their new and experienced
surveyors that did not involve travel to a central location. Nearly 30
percent of surveyors in high-understatement states stated that initial
surveyor training, which is primarily a state activity that
incorporates two CMS on-line computer courses and a 1-week federal
basic training course culminating in the SMQT, was not adequate to
identify deficiencies and cite them at the appropriate scope and
severity level. State agency directors reported that workforce
shortages also impede states' ability to provide ongoing training
opportunities for experienced staff and that additional CMS online
training and electronic training media would help states maintain an
experienced, well-informed workforce. They noted that any such support
should be cognizant of states' current resource constraints, including
limited funding of travel for training.
* Supervisory Review Limitations. Currently, CMS provides little
guidance on how states should structure supervisory review processes,
leaving the scope of this important quality-assurance tool exclusively
to the states and resulting in considerable variation throughout the
nation in how these processes are structured. We believe that state
quality assurance processes are a more effective preventive measure
against understatement because they have the potential to be more
immediate and cover more surveys than the limited number of federal
comparative surveys conducted in each state. However, states conducted
relatively fewer reviews of deficiencies at the D through F level--the
level most frequently understated throughout the nation--than of
serious deficiencies, to assess whether such deficiencies were cited at
too low a scope and severity level. In
addition, we found that frequent changes to survey results made during
supervisory review were symptomatic of workforce shortages and survey
methodology weaknesses. For example, surveyors who reported that survey
teams had too many new surveyors more often also reported either
frequent changes to or removals of deficiencies during supervisory
reviews--indicating that states with inexperienced workforces may rely
more heavily on supervisory reviews. In addition, variation existed in
the type of feedback surveyors receive when deficiencies are changed or
removed during supervisory reviews, providing surveyors with
inconsistent access to valuable feedback and training. CMS did not
implement our previous recommendation to require states to have a
quality assurance process that includes, at a minimum, a review of a
sample of survey reports below the actual harm level to assess the
appropriateness of the scope and severity cited and help reduce
understatement.
* State Agency Practices and External Pressure. In a few states,
noncitation practices, challenging relationships with the industry or
legislators, or unbalanced IDR processes--those that surveyors regard
as favoring nursing home operators over resident welfare--may have had
a negative effect on survey quality and resulted in the citation of
fewer nursing home deficiencies than was warranted. In one state, both
the state agency director and over 40 percent of surveyors acknowledged
the existence of a noncitation practice such as allowing a home to
correct a deficiency without receiving a citation. Forty percent of
surveyors in four other states also responded on our questionnaire that
noncitation practices existed. Currently, CMS does not explicitly
address such practices in its guidance to states, and its oversight is
limited to reviews of citation patterns, feedback from state surveyors,
state performance reviews, and federal monitoring surveys to determine
if such practices exist. Twelve state agency directors reported on our
questionnaire experiencing some kind of external pressure. For example,
in one state a legislator attended a survey and questioned surveyors as
to whether state agency executives were coercing them to find
deficiencies. Under such circumstances, it is difficult to know if the
affected surveyors are consistently enforcing federal standards and
reporting all deficiencies at the appropriate scope and severity
levels. States' differing experiences regarding the enforcement of
federal standards and collaboration with their CMS regional offices in
the face of significant external pressure also may confuse or undermine
a thorough and independent survey process. If surveyors believe that
CMS does not fully or consistently support the enforcement of federal
standards, these surveyors may choose to avoid citing deficiencies that
they perceive may trigger a reaction from external stakeholders. In
addition, deficiency determinations may be influenced when IDR
processes are perceived to favor nursing home operators over resident
welfare.
Because many aspects of federal and state operations contribute to the
understatement of deficiencies on nursing home surveys, mitigating this
problem will require the concerted effort of both entities. The
interrelated nature of these challenges suggests a need for increased
CMS attention on the areas noted above and additional federal support
for states' efforts to enforce federal nursing home quality standards.
Recommendations for Executive Action:
To address concerns about weaknesses in CMS survey methodology and
guidance, we recommend that the Administrator of CMS take the following
two actions:
* make sure that action is taken to address concerns identified with
the new QIS methodology, such as ensuring that it accurately identifies
potential quality problems; and:
* clarify and revise existing CMS written guidance to make it more
concise, simplify its application in the field, and reduce confusion,
particularly on the definition of actual harm.
To address surveyor workforce shortages and insufficient training, we
recommend that the Administrator of CMS take the following two actions:
* consider establishing a pool of additional national surveyors that
could augment state survey teams or identify other approaches to help
states experiencing workforce shortages;
* evaluate the current training programs and division of responsibility
between federal and state components to determine the most cost-
effective approach to: (1) providing initial surveyor training to new
surveyors, and (2) supporting the continuing education of experienced
surveyors.
To address inconsistencies in state supervisory reviews, we recommend
that the Administrator of CMS take the following action:
* set an expectation through guidance that states have a supervisory
review program as a part of their quality-assurance processes that
includes routine reviews of deficiencies at the level of potential for
more than minimal harm (D-F) and that provides feedback to surveyors
regarding changes made to citations.
To address state agency practices and external pressure that may
compromise survey accuracy, we recommend that the Administrator of CMS
take the following two actions:
* reestablish expectations through guidance to state survey agencies
that noncitation practices--official or unofficial--are inappropriate,
and systematically monitor trends in states' citations; and:
* establish expectations through guidance to state survey agencies to
communicate and collaborate with their CMS regional offices when they
experience significant pressure from legislators or the nursing home
industry that may affect the survey process or surveyors' perceptions.
Agency and AHFSA Comments and Our Evaluation:
We provided a draft of this report to HHS and AHFSA for comment. In
response, the Acting Administrator of CMS provided written comments.
CMS noted that the report adds value to important public policy
discussions regarding the survey process and contributes ideas for
solutions on the underlying potential causes of understatement. CMS
fully endorsed five of our seven recommendations and indicated it would
explore alternate solutions to our remaining two recommendations, one
of which the agency did not plan to implement on a national scale.
(CMS's comments are reprinted in appendix II.) AHFSA's comments noted
that several states agreed with one of our recommendations, but did not
directly express agreement or disagreement with the other
recommendations. AHFSA made several other comments on our findings and
recommendations as summarized below.
CMS:
CMS agreed with five of our recommendations that called for: (1)
addressing issues identified with the new QIS methodology, (2)
evaluating current training programs, (3) setting expectations that
states have a supervisory review program, (4) reestablishing
expectations that noncitation practices are inappropriate, and (5)
establishing expectations that states communicate with their CMS
regional office when they experience significant pressure from
legislators or the nursing home industry. In its comments, the agency
cited several ongoing efforts as mechanisms for addressing some of our
recommendations. While we acknowledge the importance of these ongoing
efforts, in some areas we believe more progress and investigation are
likely needed to fully address our findings and recommendations. For
example, we recommended that CMS ensure that measures are taken to
address issues identified with the new QIS methodology, such as
ensuring that it accurately identifies potential quality problems;
CMS's response cited Desk Audit Reports that enable supervisors to
provide improved feedback to surveyors and quarterly meetings of a user
group as evidence of efforts under way to continuously improve the QIS
and to increase survey consistency. However, we noted that a 2007
evaluation of the QIS did not find improved survey accuracy compared to
the traditional survey process and recommended that CMS evaluate how
well the QIS accurately identifies areas in which there were potential
quality problems. While improving the consistency of the survey process
is important, CMS must also focus on addressing the accuracy of QIS
surveys.
For the remaining two recommendations, CMS described alternative
solutions that it indicated the agency would explore:
* Guidance. The agency agreed in principle with our recommendation to
clarify and revise existing written guidance to make it more concise,
simplify its application in the field, and reduce confusion. However,
CMS disagreed with shortening the guidance as the preferred method for
achieving such clarification. Instead, the agency suggested an
alternative--the creation of some short reference documents for use in
the field that contain cross-links back to the full guidance--that we
believe would fulfill the intent of our recommendation.
* National surveyor pool. CMS indicated it did not plan to implement
our recommendation to consider establishing a pool of additional
national surveyors that could augment state survey teams experiencing
workforce shortages, at least not on a national scale. The agency
stated that the establishment of national survey teams was problematic
for several reasons, including that it (1) would blur the line between
state accountability for meeting performance expectations and
compensating states for problematic performance due to state management
decisions, and (2) would be improper for CMS to tell states how to make
personnel decisions. While the agency noted that it used national
contractors to perform surveys for other types of facilities such as
organ transplant centers, it expressed concern about their use to
compensate for state performance issues because of the more frequent
nursing home surveys.
We believe that state workforce shortages are a separate issue from
state performance on surveys. Since 2003, we have reported pervasive
state workforce shortages and this report confirms that such shortages
continue.[Footnote 70] For example, we reported that one-fourth of
states had vacancy rates higher than 19 percent and that one state
reported a 72 percent vacancy rate. We also believe that addressing
workforce shortages is critical to creating an effective system of
oversight for nursing homes and reducing understatement throughout the
nation.
However, CMS noted that it would explore this issue with a state-
federal work group in order to identify any circumstances in which a
national pool may be advisable and to identify any additional
solutions. Reflecting this comment from CMS, we have revised our
original recommendation to include other potential solutions as well as
a national pool of surveyors. One suggestion in AHFSA comments may be
worth exploring in this regard--providing funds to state survey
agencies for recruitment and retention activities.
AHFSA:
AHFSA commented that vigorous oversight and enforcement are essential
to improving the quality of life and quality of care for health care
consumers and are critical if improvements already achieved are to be
maintained. The association noted that several states agreed with our
recommendation on the need for CMS to revise existing written guidance
to make it more concise. While the association did not directly express
agreement or disagreement with our other recommendations, it did note
that most states would need additional funding to meet any new staffing
requirements associated with our recommendation that CMS set an
expectation for states to have a supervisory review program.
However, AHFSA noted what it considered to be conflicting assertions
within the report. For example, it noted that we cited inexperienced
staff as a factor that contributes to understatement but also appeared
to take issue with the practice of supervisors changing reports
prepared by inexperienced staff. While our report identifies a wide
variety of factors that may contribute to understatement, we did not
and could not meaningfully prioritize among these factors based on the
responses of nursing home surveyors and state agency directors. We did
find that many states were attempting to accomplish their survey
workload with a large share of inexperienced surveyors and that state
agency directors sometimes linked this reliance on inexperienced staff
to the understatement of nursing home deficiencies. In addition, we
found that frequent changes made during supervisory review were
symptomatic of workforce shortages and survey methodology weaknesses.
For example, surveyors who reported that survey teams had too many new
surveyors more often also reported either frequent changes to or
removals of deficiencies during supervisory reviews. We believe that
state quality assurance processes have the potential to play an
important role in preventing understatement, and states with
inexperienced workforces may rely more heavily on supervisory reviews
for this purpose.
AHFSA also stated that our report did not address limitations of
federal monitoring surveys, specifically the potential inconsistency
among CMS regional offices in how these surveys are conducted.
Assessing CMS's performance on federal monitoring surveys was beyond
the scope of this report. However, our May 2008 report noted several
improvements CMS had made since fiscal years 2002 and 2003 in federal
comparative surveys intended to make them more comparable to the state
surveys they are assessing; these improvements include (1) reducing the
time between the state and federal surveys to ensure that they more
accurately capture the conditions at the time of the state survey, (2)
including at least half of the residents from state survey
investigative samples to allow for a more clear-cut determination of
whether the state survey should have cited a deficiency, and (3) using
the same number of federal surveyors as the corresponding state survey,
again to more closely mirror the conditions under which the state
survey was conducted.[Footnote 71]
Finally, AHFSA questioned whether the information that we received from
surveyors about the IDR process was universally valid because their
input about quality assurance reviews might be biased. Our methodology
did not rely solely on surveyor responses to our questionnaire but used
a separate questionnaire sent to state survey agency directors to help
corroborate their responses. Thus we reported both that (1) over 40
percent of surveyors in four states indicated that their IDR process
favored nursing home operators and (2) one state survey agency director
agreed and three others acknowledged that frequent IDR hearings
sometimes contributed to the understatement of deficiencies. We also
collected and reported data on the number of deficiencies modified or
overturned, which AHFSA said was a more accurate measure of the effect
of IDRs.
We also incorporated technical comments from AHFSA as appropriate.
As arranged with your offices, unless you publicly announce its
contents earlier, we plan no further distribution of this report until
30 days after its issue date. At that time, we will send copies to the
Administrator of the Centers for Medicare & Medicaid Services and
appropriate congressional committees. In addition, the report will be
available at no charge on GAO's Web site at [hyperlink,
http://www.gao.gov].
If you or your staffs have any questions about this report, please
contact me at (202) 512-7114 or dickenj@gao.gov. Contact points for our
Offices of Congressional Relations and Public Affairs may be found on
the last page of this report. GAO staff who made major contributions to
this report are listed in appendix II.
Signed by:
John E. Dicken:
Director, Health Care:
[End of section]
Appendix I: Scope and Methodology:
This appendix describes the data and methods we used to identify the
factors that contribute to the understatement of serious deficiencies
on nursing home surveys.[Footnote 72] This report relies largely on the
data collected through (1) two GAO-administered Web-based
questionnaires to nursing home surveyors and state agency directors and
(2) analysis of federal and state nursing home survey results as
reported in the federal monitoring survey database and the On-Line
Survey, Certification, and Reporting (OSCAR) system. Summary results
from the GAO questionnaires are available as an e-supplement to this
report. See Nursing Homes: Responses from Two Web-Based Questionnaires
to Nursing Home Surveyors and State Agency Directors (GAO-10-74SP), an
E-supplement to GAO-10-70. To augment our quantitative analysis, we
also interviewed officials at the Centers for Medicare & Medicaid Services (CMS)
Survey and Certification Group and select regional offices;[Footnote
73] reviewed federal regulations, guidance, and our prior work; and
conducted follow-up interviews with eight state agency directors and a
select group of surveyors. Except where otherwise noted, we used data
from fiscal year 2007 because they were the most recently available
data at the time of our analysis.
Development of Questionnaires and Analysis of Responses:
We developed two Web-based questionnaires--one for the nursing home
surveyors and one for the state agency directors.
Development of the Questionnaires:
The questionnaires were developed and the data collection and analysis
conducted to (1) minimize errors arising from differences in how a
particular question might be interpreted and in the sources of
information available to respondents and (2) reduce variability in
responses that should be qualitatively the same. GAO social science
survey specialists aided in the design and development of both
questionnaires. We pretested the two questionnaires with six surveyors
from a local state and five former or current state agency directors,
respectively. Based on feedback from these pretests, the questionnaires
were revised to improve clarity and the precision of responses, and
ensure that all questions were fair and unbiased. Most questions were
closed-ended, which limited the respondent to answers such as yes or
no, or to identifying the frequency that an event occurred using a
scale--always, frequently, sometimes, infrequently, or never. For
reporting purposes, we grouped the scaled responses into three
categories--always/frequently, sometimes, and infrequently/never. Both
questionnaires included some open-ended questions to allow respondents
to identify specific training needs or other concerns.
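As a hypothetical illustration (not GAO's actual analysis code), the collapsing of the five-point frequency scale into the three reporting categories described above could be sketched as follows; the dictionary and function names are our own assumptions:

```python
from collections import Counter

# Illustrative mapping of the five-point frequency scale to the three
# reporting categories described in the text; names are assumptions.
GROUPS = {
    "always": "always/frequently",
    "frequently": "always/frequently",
    "sometimes": "sometimes",
    "infrequently": "infrequently/never",
    "never": "infrequently/never",
}

def group_response(answer: str) -> str:
    """Collapse one scaled answer into its reporting category."""
    return GROUPS[answer.strip().lower()]

# Tally a handful of hypothetical answers to one closed-ended question.
answers = ["Always", "sometimes", "never", "frequently", "infrequently"]
tally = Counter(group_response(a) for a in answers)
# tally: 2 always/frequently, 1 sometimes, 2 infrequently/never
```

Grouping at analysis time, rather than on the questionnaire itself, preserves the finer-grained responses for any later re-analysis.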
With few exceptions, respondents entered their responses directly into
the Web-based questionnaire databases.[Footnote 74] These
questionnaires were sent to the eligible population of nursing home
surveyors and all state agency directors. We performed computer
analyses to identify illogical or inconsistent responses and other
indications of possible error. We also conducted follow-up interviews
with select respondents to clarify and gain a contextual understanding
of their responses.[Footnote 75]
Questionnaire for Nursing Home Surveyors:
This questionnaire was designed to gather information from nursing home
surveyors nationwide about the process for identifying and citing
nursing home deficiencies. It included questions about various aspects
of the survey process identified by our prior work that may contribute
to survey inconsistency and the understatement of deficiencies. Such
aspects included survey methodology and guidance, deficiency
determination, surveyor training, supervisory review of draft surveys,
and state agency policies and procedures.[Footnote 76]
We fielded the questionnaire from May through July 2008 to 3,819
eligible nursing home surveyors. To identify the eligible population,
we downloaded a list of identification numbers for surveyors who had
conducted at least one health survey of a nursing home in fiscal years
2006 or 2007 from CMS's OSCAR database and obtained surveyors' e-mail
addresses from state survey agencies. We received complete
responses from 2,340 state surveyors, for a 61 percent response
rate.[Footnote 77] The state-level response rates were above 40 percent
for all but three states--Connecticut, Illinois, and
Pennsylvania.[Footnote 78] We excluded Pennsylvania from our analysis
because Pennsylvania's Deputy Secretary for Quality Assurance
instructed the state's surveyors not to respond to our survey and few
responded. (For response rates by state, see table 9.)
Table 9: Response Rates to GAO's Questionnaire of Nursing Home
Surveyors, 2008:
State: Alabama;
Number of respondents: 36;
Number of eligible surveyors: 52;
Response rate: 69%.
State: Alaska;
Number of respondents: 4;
Number of eligible surveyors: 6;
Response rate: 67%.
State: Arizona;
Number of respondents: 19;
Number of eligible surveyors: 28;
Response rate: 68%.
State: Arkansas;
Number of respondents: 28;
Number of eligible surveyors: 54;
Response rate: 52%.
State: California;
Number of respondents: 306;
Number of eligible surveyors: 544;
Response rate: 56%.
State: Colorado;
Number of respondents: 16;
Number of eligible surveyors: 38;
Response rate: 42%.
State: Connecticut;
Number of respondents: 17;
Number of eligible surveyors: 61;
Response rate: 28%.
State: Delaware;
Number of respondents: 13;
Number of eligible surveyors: 16;
Response rate: 81%.
State: District of Columbia;
Number of respondents: 6;
Number of eligible surveyors: 10;
Response rate: 60%.
State: Florida;
Number of respondents: 128;
Number of eligible surveyors: 226;
Response rate: 57%.
State: Georgia;
Number of respondents: 47;
Number of eligible surveyors: 54;
Response rate: 87%.
State: Hawaii;
Number of respondents: 4;
Number of eligible surveyors: 7;
Response rate: 57%.
State: Idaho;
Number of respondents: 6;
Number of eligible surveyors: 11;
Response rate: 55%.
State: Illinois;
Number of respondents: 34;
Number of eligible surveyors: 171;
Response rate: 20%.
State: Indiana;
Number of respondents: 92;
Number of eligible surveyors: 101;
Response rate: 91%.
State: Iowa;
Number of respondents: 37;
Number of eligible surveyors: 59;
Response rate: 63%.
State: Kansas;
Number of respondents: 34;
Number of eligible surveyors: 59;
Response rate: 58%.
State: Kentucky;
Number of respondents: 44;
Number of eligible surveyors: 86;
Response rate: 51%.
State: Louisiana;
Number of respondents: 79;
Number of eligible surveyors: 134;
Response rate: 59%.
State: Maine;
Number of respondents: 24;
Number of eligible surveyors: 28;
Response rate: 86%.
State: Maryland;
Number of respondents: 29;
Number of eligible surveyors: 47;
Response rate: 62%.
State: Massachusetts;
Number of respondents: 39;
Number of eligible surveyors: 88;
Response rate: 44%.
State: Michigan;
Number of respondents: 50;
Number of eligible surveyors: 80;
Response rate: 63%.
State: Minnesota;
Number of respondents: 58;
Number of eligible surveyors: 85;
Response rate: 68%.
State: Mississippi;
Number of respondents: 21;
Number of eligible surveyors: 35;
Response rate: 60%.
State: Missouri;
Number of respondents: 175;
Number of eligible surveyors: 192;
Response rate: 91%.
State: Montana;
Number of respondents: 20;
Number of eligible surveyors: 21;
Response rate: 95%.
State: Nebraska;
Number of respondents: 26;
Number of eligible surveyors: 33;
Response rate: 79%.
State: Nevada;
Number of respondents: 21;
Number of eligible surveyors: 25;
Response rate: 84%.
State: New Hampshire;
Number of respondents: 9;
Number of eligible surveyors: 15;
Response rate: 60%.
State: New Jersey;
Number of respondents: 35;
Number of eligible surveyors: 77;
Response rate: 45%.
State: New Mexico;
Number of respondents: 18;
Number of eligible surveyors: 25;
Response rate: 72%.
State: New York;
Number of respondents: 108;
Number of eligible surveyors: 250;
Response rate: 43%.
State: North Carolina;
Number of respondents: 53;
Number of eligible surveyors: 84;
Response rate: 63%.
State: North Dakota;
Number of respondents: 12;
Number of eligible surveyors: 16;
Response rate: 75%.
State: Ohio;
Number of respondents: 77;
Number of eligible surveyors: 145;
Response rate: 53%.
State: Oklahoma;
Number of respondents: 63;
Number of eligible surveyors: 92;
Response rate: 68%.
State: Oregon;
Number of respondents: 31;
Number of eligible surveyors: 45;
Response rate: 69%.
State: Rhode Island;
Number of respondents: 20;
Number of eligible surveyors: 27;
Response rate: 74%.
State: South Carolina;
Number of respondents: 15;
Number of eligible surveyors: 27;
Response rate: 56%.
State: South Dakota;
Number of respondents: 20;
Number of eligible surveyors: 22;
Response rate: 91%.
State: Tennessee;
Number of respondents: 52;
Number of eligible surveyors: 79;
Response rate: 66%.
State: Texas;
Number of respondents: 201;
Number of eligible surveyors: 281;
Response rate: 72%.
State: Utah;
Number of respondents: 16;
Number of eligible surveyors: 25;
Response rate: 64%.
State: Vermont;
Number of respondents: 11;
Number of eligible surveyors: 16;
Response rate: 69%.
State: Virginia;
Number of respondents: 36;
Number of eligible surveyors: 42;
Response rate: 86%.
State: Washington;
Number of respondents: 60;
Number of eligible surveyors: 85;
Response rate: 71%.
State: West Virginia;
Number of respondents: 14;
Number of eligible surveyors: 22;
Response rate: 64%.
State: Wisconsin;
Number of respondents: 66;
Number of eligible surveyors: 83;
Response rate: 80%.
State: Wyoming;
Number of respondents: 10;
Number of eligible surveyors: 10;
Response rate: 100%.
State: Total;
Number of respondents: 2,340;
Number of eligible surveyors: 3,819;
Response rate: 61%.
Source: GAO.
[End of table]
Questionnaire for State Agency Directors:
The questionnaire for state agency directors was designed to gather
information on the nursing home survey process in each state. Directors
were asked many of the same questions as the surveyors, but the survey
agency directors' questionnaire contained additional questions on the
overall organization of the survey agency, resource and staffing
issues, CMS's Quality Indicator Survey (QIS), and experience with CMS's
federal monitoring surveys.[Footnote 79] In addition, the questionnaire
for state agency directors asked them to rank the degree to which
several factors, derived from our previous work, contributed to
understatement.[Footnote 80] This questionnaire was fielded from
September to November 2008 to all 50 state survey agency directors and
the survey agency director for the District of Columbia. We received
completed responses from 50 of 51 survey agency directors, for a 98
percent response rate. The District of Columbia survey agency director
did not respond.
Analysis of Responses:
To analyze questionnaire results across respondent groups, we used
standard descriptive statistics. In addition, we looked for
associations between questions through correlations and tests of the
differences in means for groups. For certain open-ended questions, we
used a standard content review method to identify topics that
respondents mentioned such as "applying CMS guidance," "on-the-job
training," "time to complete survey onsite," or "time to complete the
survey paperwork." Our coding process involved one independent coder
and an independent analyst who verified a random sample of the coded
comments. For open-ended questions that enabled respondents to provide
additional general information, we used similar standard content review
methods, including independent coding by two raters who resolved all
disagreements through discussion.
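The verification step in the coding process above can be sketched in code. The following is a minimal, hypothetical illustration (the function names, seed, and sample sizes are assumptions, not a description of the actual procedure):

```python
# Illustrative sketch of drawing a reproducible random sample of coded
# comments for an independent analyst to re-check, and of measuring
# coder agreement. Names and the fixed seed are assumptions.
import random

def draw_verification_sample(coded_comments, sample_size, seed=0):
    """Select a reproducible random subset for a second analyst to check."""
    rng = random.Random(seed)
    return rng.sample(coded_comments, min(sample_size, len(coded_comments)))

def agreement_rate(original_codes, verified_codes):
    """Share of sampled comments where both coders assigned the same code."""
    matches = sum(a == b for a, b in zip(original_codes, verified_codes))
    return matches / len(original_codes)
```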
Validity and Reliability of Data:
In addition to the precautions taken during the development of the
questionnaires, we performed automated checks on these data to identify
inappropriate answers. We also reviewed the data for missing or
ambiguous responses.[Footnote 81] Where comments on open-ended
questions provided more detail or contradicted responses to categorical
questions, we corrected the categorical responses. On the basis of the
strength of
our systematic survey processes and follow-up procedures, we determined
that the questionnaire responses were representative of the experience
and perceptions of nursing home surveyors and state agency directors
nationally and at the state level, with the exception of Pennsylvania
surveyors and the survey agency director of the District of Columbia.
On the basis of the response rates and these activities, we determined
that the data were sufficiently reliable for our purposes.
We also interviewed directors and other state agency officials in eight
states to better understand unusual or interesting circumstances
related to surveyor workforce and training, supervisory review, or
state policies and practices. We selected these eight states based on
our analysis of questionnaire responses from state agency directors and
nursing home surveyors.
Analysis of Federal Comparative and Observational Surveys:
We used information from our May 2008 report on federal comparative
surveys nationwide for fiscal years 2002 through 2007 to categorize
states into groups.[Footnote 82] We used these results to identify
states with high and low percentages of serious missed deficiencies.
[Footnote 83] We classified nine states as high-understatement states--
those in which 25 percent or more of federal comparative surveys
identified at least one missed deficiency at the actual harm or
immediate jeopardy level across all years. These states were Alabama,
Arizona, Missouri, New Mexico, Oklahoma, South Carolina, South Dakota,
Tennessee, and Wyoming. Zero-understatement states were those that had
no federal comparative surveys identifying missed deficiencies at the
actual harm or immediate jeopardy levels. These seven states were
Alaska, Idaho, Maine, North Dakota, Oregon, Vermont, and West Virginia.
Low-understatement states were the 10 with the lowest percentage of
missed serious deficiencies (less than 6 percent)--Arkansas, Nebraska,
Ohio, and all seven zero-understatement states.
Response rates among the high-, low-, and zero-understatement states--
approximately 77, 62, and 71 percent, respectively--supported
statistical testing of associations and differences among these state
groupings. Therefore, in addition to descriptive statistics, we used
correlations and tests of the differences in means for groups to
identify questionnaire responses that were associated with differences
in understatement.[Footnote 84] We reported the statistically
significant results for tests of association and differences between
group averages at the 5 percent level, unless otherwise noted.
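The two kinds of statistical comparison described above--correlations and tests of differences in group means--can be sketched as follows. This is a simplified, self-contained illustration (Welch's t statistic and Pearson's r computed from scratch); the report does not specify the exact procedures or software used:

```python
# Simplified sketch of a difference-in-means statistic (Welch's t) and
# a Pearson correlation, the two kinds of tests described above. This
# is an illustration only, not the actual analysis code.
import math
import statistics

def welch_t(group_a, group_b):
    """Welch's t statistic for the difference between two group means."""
    m1, m2 = statistics.mean(group_a), statistics.mean(group_b)
    v1, v2 = statistics.variance(group_a), statistics.variance(group_b)
    return (m1 - m2) / math.sqrt(v1 / len(group_a) + v2 / len(group_b))

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired values."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

In practice the t statistic would then be compared against a critical value at the 5 percent significance level; that final step is omitted here.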
In a previous report, we found a possible relationship between the
understatement of nursing home deficiencies on the federal comparative
surveys and surveyor performance in General Investigation and
Deficiency Determination on federal observational surveys--that is,
high-understatement states more often had below-satisfactory ratings in
General Investigation and Deficiency Determination than low-
understatement states.[Footnote 85] For this report, we applied the
same statistical analysis to identify when responses to our
questionnaires were associated with satisfactory performance on General
Investigation and Deficiency Determination skills on the federal
observational surveys. We interpreted such relationships as an
indication of additional training needs.
Analysis of OSCAR:
We used information from OSCAR and the federal monitoring survey
databases to (1) compare the deficiencies cited by state and federal
surveyors, (2) analyze the timing of nursing home surveys, and (3)
assess trends in deficiency citations. OSCAR is a comprehensive
database that contains information on the results of state nursing home
surveys. CMS reviews these data and uses them to compute nursing home
facility and state performance measures. When we analyzed these data,
we included automated checks of data fields to ensure that they
contained complete information. For these reasons, we determined that
the OSCAR
data were sufficiently reliable for our purposes.
* We used OSCAR and the federal monitoring survey database to compare
average facility citations on state survey records with the average
citations on federal observational survey records for the same
facilities during fiscal years 2002 through 2007.[Footnote 86] We
computed the average number of serious deficiencies cited on federal
observational surveys for fiscal years 2002 through 2007, and for
the same facilities and time period, calculated the average number of
serious deficiencies cited on state surveys. Next, we determined which
facilities had greater average serious deficiency citations on federal
observational surveys compared to state standard surveys for fiscal
years 2002 through 2007. For these facilities, we computed the
percentage difference between the average number of serious
deficiencies cited on federal observational surveys and those cited on
state surveys.
* We used OSCAR to determine the percentage of the most recent state
surveys that were predictable because of their timing. Our analysis of
survey predictability compared the time between state agencies' current
and prior standard nursing home surveys as of June 2008. According to
CMS, states consider 9 months to 15 months from the last standard
survey as the window for completing standard surveys because it yields
a 12-month average. We considered surveys to be predictable if (1)
homes were surveyed within 15 days of the 1-year anniversary of their
prior survey or (2) homes were surveyed within 1 month of the maximum
15-month interval between standard surveys.
* We calculated the number of serious deficiencies on state surveys in
OSCAR from calendar years 1999 through 2007. We examined the trend in
G-level and higher deficiencies to assess whether CMS's expanded
enforcement policy appeared to affect citation rates. Effective January
2000, CMS completed the implementation of its immediate-sanctions
policy, requiring the referral of homes that caused actual harm or
immediate jeopardy on successive standard surveys or intervening
complaint investigations.
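The predictability test described in the second bullet above amounts to a simple date calculation. The sketch below approximates a month as roughly 30 days (and the maximum 15-month interval as 456 days); those conversions are our assumptions, since the windows are stated only in months:

```python
# Hedged sketch of the survey-predictability test described above.
# Approximating a month as ~30 days (15 months as 456 days) is an
# assumption; the report defines the windows only in months.
from datetime import date, timedelta

ONE_YEAR = timedelta(days=365)
FIFTEEN_MONTHS = timedelta(days=456)       # approx. 15 months
ANNIVERSARY_WINDOW = timedelta(days=15)    # within 15 days of 1 year
MAX_INTERVAL_WINDOW = timedelta(days=30)   # within ~1 month of 15 months

def is_predictable(prior_survey, current_survey):
    """True if the current survey falls near the one-year anniversary of
    the prior survey or near the maximum 15-month interval."""
    interval = current_survey - prior_survey
    near_anniversary = abs(interval - ONE_YEAR) <= ANNIVERSARY_WINDOW
    near_max_interval = abs(interval - FIFTEEN_MONTHS) <= MAX_INTERVAL_WINDOW
    return near_anniversary or near_max_interval
```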
[End of section]
Appendix II: Comments from the Department of Health & Human Services:
Department Of Health & Human Services:
Office Of The Secretary:
Assistant Secretary for Legislation:
Washington, DC 20201:
October 30, 2009:
John Dicken:
Director, Health Care:
U.S. Government Accountability Office:
441 G Street N.W.
Washington, DC 20548:
Dear Mr. Dicken:
Enclosed are comments on the U.S. Government Accountability Office's
(GAO) report entitled: "Nursing Homes: Addressing the Factors
Underlying Understatement of Serious Care Problems Requires Sustained
CMS and State Commitment" (GAO-10-70).
The Department appreciates the opportunity to review this report before
its publication.
Sincerely,
Signed by:
Andrea Palm:
Acting Assistant Secretary for Legislation:
Enclosure:
[End of letter]
Department Of Health & Human Services:
Centers for Medicare & Medicaid Services:
Administrator:
Washington, DC 20201:
Date: October 30, 2009:
To: Andrea Palm:
Acting Assistant Secretary for Legislation:
Office of the Secretary:
From: [Signed by] Charlene Frizzera:
Acting Administrator:
Subject: Government Accountability Office (GAO) Draft Report: "Nursing
Homes: Addressing the Factors Underlying Understatement of Serious Care
Problems Requires Sustained CMS and State Commitment" (GAO-10-70):
The Centers for Medicare & Medicaid Services (CMS) appreciates the
opportunity to review and comment on the subject GAO Draft Report. The
GAO was asked to examine four factors relevant to potential
understatement of nursing home deficiencies including:
1. CMS' survey process;
2. Workforce shortages and training;
3. Supervisory reviews of surveys; and;
4. State agency practices.
We believe that the GAO study adds value to the important public policy
discussions regarding the survey process. The report derives its
information primarily from opinion surveys of State surveyors and
directors of State survey agencies. We particularly appreciate the
GAO's effort to identify underlying causes of issues in the survey
process and contribute ideas for solutions. We fully endorse five of
the seven GAO recommendations. For the remaining two, we will convene a
workgroup of State and Federal officials to explore alternate solutions
that may be responsive to the intent of the GAO recommendations.
We offer the following, more detailed responses to the GAO
recommendations.
GAO Recommendation 1: Quality Indicator Survey (QIS):
To improve the survey process, CMS should ensure that measures are
taken to address issues identified with the new QIS methodology.
CMS Response:
We agree with this recommendation. We have designed the QIS
specifically to improve consistency of survey processes and to provide
both States and CMS with tools for continuous improvement. Since the
tools for continuous improvement were not part of the original QIS
design, their development has slowed QIS implementation somewhat, but
the investment has been very worthwhile. A set of QIS "Desk Audit
Reports" (DARs) represents one such tool. The DARs contain over 30 key
investigative and decision-making factors for each QIS survey. The DARs
also enable State survey supervisors to provide improved feedback to
their surveyors, identify patterns in deficiency citation, and
strengthen the consistency of the survey process between State surveyor
teams. These tools provide information for the CMS regional offices and
become part of a quarterly review teleconference in which each State
can review the patterns of survey results together with the CMS
regional office.
Finally, we established a user group that convenes quarterly to
continue to address QIS issues as they arise. This user group may also
serve as a useful forum for sharing methods that facilitate
implementation and for generating ideas for refinements to the system
or the implementation process. For example, Florida has just
completed its statewide conversion from the traditional survey to the
QIS and completely revised its surveyor orientation program
materials. We plan to share these materials with the other States, so
they can upgrade their orientation efforts to more effectively address
the investigation and deficiency determination skills of newly hired
surveyors.
GAO Recommendation 2: CMS Guidance for Surveyors:
To improve guidance to surveyors, CMS should clarify and revise
existing CMS written guidance to make it more concise, simplify its
application in the field, and reduce confusion, particularly on the
definition of actual harm.
CMS Response:
We will seek alternate methods to address these issues, and will work
with a workgroup of State and Federal officials to do so. We agree that
it is desirable to clarify any areas of Federal guidance that may be
ambiguous, but do not agree that shortening the guidance is necessarily
the preferable method of doing so, or that greater conciseness would
have the desired effect. One method of striking a balance between full
guidance and conciseness may be to create some short reference
documents for use in the field that contain cross-links back to the
full guidance.
Based on previous GAO recommendations to ensure consistency, CMS
embarked on a multi-year project to upgrade and clarify our
interpretive guidelines for key regulations (Tags) in order to provide
surveyors with accurate and up-to-date information in each topic area
(such as nutrition, infection control, and incontinence). The
guidance was developed with assistance from expert panels and is much
more informative than previous guidance. We believe it is imperative
for all surveyors to master the guidance and that the GAO's
recommendation to bolster training efforts (recommendation #4) is the
superior method, rather than seeking to shorten the guidance.
Many survey agencies report that the revised guidance has enhanced the
ability of their surveyors to correctly investigate concerns and select
the correct levels of severity for non-compliance, including making
distinctions as to when "actual harm" (Level 3) is reached for common
types of deficient practices that fall under the various Tag topics. An
example of a positive impact from revised guidance (in addressing
deficiency understatement issues) can be seen in CMS' issuance of
improved pharmacy guidelines. The new guidance clearly improved
surveyors' ability to identify the use of unnecessary medications.
Widely publicized concerns about the use of unnecessary medications had
previously failed to alter the extent to which such problems were
identified by surveyors. CMS issued the expanded guidance in late 2006
and combined the issuance with national training of surveyors. The
percent of standard surveys in which unnecessary medications were
identified by surveyors subsequently increased from the previously
consistent rate of 13-14 percent in fiscal years (FY) 2000 through 2006
to 18 percent in FY 2007. These results are portrayed in Figure 1.
Figure 1: Percent of Surveys Citing Unnecessary Drug Use:
[Refer to PDF for image: vertical bar graph]
Calendar years 2000 through 2007 plotted versus Percentage of Surveys
9% through 19%.
[End of figure]
GAO Recommendation 3: Establish a National Pool of Surveyors for
States:
To address surveyor workforce shortages, CMS should consider
establishing a pool of additional national surveyors that could augment
State survey teams experiencing workforce shortages.
CMS Response:
We do not plan to implement this recommendation, at least on any scale
that would make a national difference. Section 1864 of the Social
Security Act (the Act) directs the Secretary of the Department of
Health and Human Services to enter into agreement with any State that
is willing and capable of carrying out certain survey and certification
responsibilities identified in the Act. According to the 1864
Agreement, the responsibility for conducting inspections and making
recommendations for enforcement rests with the State survey agency.
Establishing national survey teams to augment State surveys is
problematic for a variety of reasons. First, it begins to blur the
lines between (a) holding States accountable for meeting performance
expectations versus (b) compensating for problematic performance due to
State management decisions. While it is proper for CMS to set survey
performance expectations and to establish qualifications for surveyor
knowledge and training, we believe it is improper to tell States how to
make personnel decisions, establish pay scales, recruit staff, or hire
staff, or for CMS to try to alleviate State performance problems that
arise because of State personnel management decisions. It is incumbent
upon each State to determine alternate methods to fulfill the terms of
the 1864 Agreement; it is not up to CMS to compensate for those State
decisions.
While we have national contractors for some provider types (such as
organ transplant centers and psychiatric hospitals), the use of
national contractors has generally been limited to areas of specialty
knowledge. The number of providers in many States may be so small that
it is more cost-effective to have a national contractor than to
contract with States and seek to maintain specialty surveyors for
infrequent surveys. In those instances, we have offered the contracting
to States and utilized a national contractor where States declined the
offer. However, we regard as questionable the significant expansion of
the use of national contractors to perform the more frequent nursing
home surveys, in order to compensate for State performance issues.
We will explore this issue with our State-Federal workgroup in more
detail in order to identify any circumstances in which a national pool
may be advisable, and to identify any additional solutions. In the
past, CMS has examined several promising practices with States and
published policy briefs to assist States on issues such as surveyor
recruitment, surveyor retention, and strategies to promote consistent
surveyor performance.
GAO Recommendation 4: Training of Surveyors:
To address insufficient training, CMS should (a) evaluate the current
training programs and division of responsibility between Federal and
State components to determine the most cost-effective approach; and (b)
support the continuing education of experienced surveyors.
CMS Response:
We agree with this recommendation. Training is critically important in
fulfilling the mission of the survey and certification program. We have
made significant investments to increase: (a) the number and types of
CMS surveyor training courses, (b) the use of distance learning
(satellites and Web-based training), and (c) the accessibility of
training in geographical areas with large numbers of surveyors ("magnet
areas").
To increase the accessibility of training, we successfully inaugurated
"Magnet Area Training (MAT)" in Florida and California. Evaluations of
the training showed that MAT instruction produced results comparable to
our other training events. We also initiated Web-based training (WBT)
for one course in FY 2008, and we continue to develop new training
courses that employ Web technology. These efforts have improved the
survey and certification training profile. Figure 2, for example, shows
the results of our annual survey of State training coordinators. The
percent of respondents who indicated they were not satisfied with the
number of CMS training courses declined from 30 percent in FY 2005 to 6
percent in FY 2008. The percent of respondents indicating they were
satisfied increased from 30 percent in FY 2005 to 41 percent in FY 2008.
Figure 2: Percent State Training Coordinators Not Satisfied with the
Number of CMS Training Courses:
[Refer to PDF for image: vertical bar graph]
FY 2005: 30%;
FY 2006: 10%;
FY 2008: 6%.
[End of figure]
While the trendline is very positive, the FY 2008 results still left 53
percent of participants indicating that they were only "somewhat
satisfied." Gaps remain in both the number and types of courses.
As part of our ongoing dialogue with States, we will (a) poll States to
complete a needs assessment for the continuing education of experienced
surveyors; (b) offer selected continuing education opportunities based
on the needs assessment; and (c) continue to expand the above
initiatives.
GAO Recommendation 5: State Supervisory Reviews:
To address inconsistencies in State supervisory reviews, CMS should set
an expectation through guidance that States have a supervisory review
program as a part of their Quality Assurance processes that includes
routine reviews of deficiencies at the level of potential for more than
minimal harm (D-F) and that provides feedback to surveyors regarding
changes made to citations.
CMS Response:
We agree with the recommendation. We will start by working with a State-
Federal workgroup to identify promising practices, and initiate the
process of setting more defined expectations for quality review.
Historically, CMS has set forth expectations for documentation and for
the quality of surveys; however, ensuring that deficiency determination
and severity selection are consistent and accurate has been left to
State management. We believe many States have developed effective
programs or methods of supervisory review. Such methods include offsite
review of citations and/or at least occasional onsite participation by
a supervisor in surveyors' deficiency decision-making meetings, in
order to coach survey teams and correct mistaken processes.
In addition, CMS regional offices conduct validation reviews of State
surveys and, particularly through the follow-along surveys, provide
feedback to the State survey teams. We will build on these existing
processes and identify additional steps that will be taken.
GAO Recommendation 6: Guidance on Non-Citation Practices:
To address State agency practices and external pressure, CMS should
reestablish expectations through guidance to State survey agencies that
non-citation practices--official or unofficial--are inappropriate and
systematically monitor trends in States' citations.
CMS Response:
We agree with this recommendation. We will issue a Survey and
Certification policy letter outlining our expectations regarding the
survey and certification program as well as avenues to pursue if
inappropriate practices are applied.
GAO Recommendation 7: Guidance on Communicating with CMS when External
Pressures Exist:
To address State agency practices and external pressure, CMS should
establish expectations through guidance to State survey agencies to
communicate and collaborate with their CMS Regional offices when they
experience significant pressure from legislators or the nursing home
industry that may have an impact on the survey process or surveyors'
perceptions.
CMS Response:
We agree with this recommendation. We will provide guidance for
communicating surveyor concerns and provide feedback to State surveyors
outlining our expectations regarding the survey and certification
program, including avenues to pursue if inappropriate pressures are
applied.
We appreciate the effort that went into this report and look forward to
working with the GAO on this and other issues.
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact:
John E. Dicken, (202) 512-7114 or dickenj@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, Walter Ochinko, Assistant
Director; Stefanie Bzdusek; Leslie V. Gordon; Martha R. W. Kelly;
Katherine Nicole Laubacher; Dan Lee; Elizabeth T. Morrison; Dan Ries;
Steve Robblee; Karin Wallestad; Rachael Wojnowicz; and Suzanne Worth
made key contributions to this report.
[End of section]
Related GAO Products:
Nursing Homes: Opportunities Exist to Facilitate the Use of the
Temporary Management Sanction. [hyperlink,
http://www.gao.gov/products/GAO-10-37R]. Washington, D.C.: November 20,
2009.
Nursing Homes: CMS's Special Focus Facility Methodology Should Better
Target the Most Poorly Performing Homes, Which Tended to Be Chain
Affiliated and For-Profit. [hyperlink,
http://www.gao.gov/products/GAO-09-689]. Washington, D.C.: August 28,
2009.
Medicare and Medicaid Participating Facilities: CMS Needs to Reexamine
Its Approach for Funding State Oversight of Health Care Facilities.
[hyperlink, http://www.gao.gov/products/GAO-09-64]. Washington, D.C.:
February 13, 2009.
Nursing Homes: Federal Monitoring Surveys Demonstrate Continued
Understatement of Serious Care Problems and CMS Oversight Weaknesses.
[hyperlink, http://www.gao.gov/products/GAO-08-517]. Washington, D.C.:
May 9, 2008.
Nursing Home Reform: Continued Attention Is Needed to Improve Quality
of Care in Small but Significant Share of Homes. [hyperlink,
http://www.gao.gov/products/GAO-07-794T]. Washington, D.C.: May 2,
2007.
Nursing Homes: Efforts to Strengthen Federal Enforcement Have Not
Deterred Some Homes from Repeatedly Harming Residents. [hyperlink,
http://www.gao.gov/products/GAO-07-241]. Washington, D.C.: March 26,
2007.
Nursing Homes: Despite Increased Oversight, Challenges Remain in
Ensuring High-Quality Care and Resident Safety. [hyperlink,
http://www.gao.gov/products/GAO-06-117]. Washington, D.C.: December 28,
2005.
Nursing Home Quality: Prevalence of Serious Problems, While Declining,
Reinforces Importance of Enhanced Oversight. [hyperlink,
http://www.gao.gov/products/GAO-03-561]. Washington, D.C.: July 15,
2003.
Nursing Homes: Quality of Care More Related to Staffing than Spending.
[hyperlink, http://www.gao.gov/products/GAO-02-431R]. Washington, D.C.:
June 13, 2002.
Nursing Homes: Sustained Efforts Are Essential to Realize Potential of
the Quality Initiatives. [hyperlink,
http://www.gao.gov/products/GAO/HEHS-00-197]. Washington, D.C.:
September 28, 2000.
Nursing Home Care: Enhanced HCFA Oversight of State Programs Would
Better Ensure Quality. [hyperlink,
http://www.gao.gov/products/GAO/HEHS-00-6]. Washington, D.C.: November
4, 1999.
Nursing Homes: Proposal to Enhance Oversight of Poorly Performing Homes
Has Merit. [hyperlink, http://www.gao.gov/products/GAO/HEHS-99-157].
Washington, D.C.: June 30, 1999.
Nursing Homes: Additional Steps Needed to Strengthen Enforcement of
Federal Quality Standards. [hyperlink,
http://www.gao.gov/products/GAO/HEHS-99-46]. Washington, D.C.: March
18, 1999.
California Nursing Homes: Care Problems Persist Despite Federal and
State Oversight.
[hyperlink, http://www.gao.gov/products/GAO/HEHS-98-202]. Washington,
D.C.: July 27, 1998.
[End of section]
Footnotes:
[1] Medicare is the federal health care program for elderly and certain
disabled individuals. Medicaid is a joint federal-state health care
financing program for certain categories of low-income individuals.
[2] In addition to the oversight of nursing homes, CMS and state survey
agencies are responsible for oversight of other Medicare and Medicaid
providers, such as home health agencies, intermediate care facilities
for the mentally retarded, and hospitals.
[3] See a list of related GAO products at the end of this report.
[4] See GAO, Nursing Homes: Federal Monitoring Surveys Demonstrate
Continued Understatement of Serious Care Problems and CMS Oversight
Weaknesses, [hyperlink, http://www.gao.gov/products/GAO-08-517]
(Washington, D.C.: May 9, 2008).
[5] CMS's Survey and Certification Group is responsible for ensuring
the effectiveness of state survey activities and managing the federal
monitoring survey program.
[6] See GAO, Nursing Home Quality: Prevalence of Serious Problems,
While Declining, Reinforces Importance of Enhanced Oversight,
[hyperlink, http://www.gao.gov/products/GAO-03-561] (Washington, D.C.:
July 15, 2003) and GAO, Nursing Home Reform: Continued Attention Is
Needed to Improve Quality of Care in Small but Significant Share of
Homes, [hyperlink, http://www.gao.gov/products/GAO-07-794T]
(Washington, D.C.: May 2, 2007). In response to our recommendation to
finalize the development, testing, and implementation of a more
rigorous survey methodology, CMS evaluated and is currently
implementing a revised survey methodology.
[7] See GAO, Nursing Homes: Despite Increased Oversight, Challenges
Remain in Ensuring High-Quality Care and Resident Safety, [hyperlink,
http://www.gao.gov/products/GAO-06-117] (Washington, D.C.: Dec. 28,
2005).
[8] See [hyperlink, http://www.gao.gov/products/GAO-03-561]. Our
analysis of survey predictability considered surveys to be predictable
if (1) homes were surveyed within 15 days of the 1-year anniversary of
the prior survey or (2) homes were surveyed within 1 month of the
maximum 15-month interval between standard surveys. We used this
rationale because homes know the maximum allowable interval between
surveys, and those whose prior surveys were conducted 14 or 15 months
earlier are aware that they are likely to be surveyed soon.
[9] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[10] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[11] Eligible surveyors are those who had conducted at least one health
survey of a nursing home in fiscal year 2006 or 2007 and for whom we
could obtain an e-mail or other address from their state survey agency.
[12] We excluded Pennsylvania from our analysis because Pennsylvania's
Deputy Secretary for Quality Assurance instructed the state's surveyors
not to respond to our survey. Two other states had response rates below
40 percent--Connecticut (28 percent) and Illinois (20 percent).
Illinois' response rate probably reflected that surveyors' access to
their e-mail accounts, and our Web-based survey, was limited to only 1
day per month.
[13] The District of Columbia agency director did not respond to our
questionnaire.
[14] We did not ask nursing home surveyors a similar question because
survey agency directors, as a result of their positions, were a more
consistent source of knowledge about the influence of these factors on
understatement.
[15] See [hyperlink, http://www.gao.gov/products/GAO-08-517]. This
database captures the results of two types of federal monitoring
surveys. Federal comparative surveys are conducted independently by
federal surveyors to evaluate state surveys. Federal surveyors resurvey
a home that was recently inspected by state surveyors and compare the
deficiencies identified during the two surveys. When federal surveyors
accompany state surveyors to directly observe them during a nursing
home survey, it is considered a federal observational survey.
[16] We use the term survey record to refer to CMS's Form 2567, which
is the official statement of deficiencies with respect to federal
quality standards.
[17] Other areas include Admission, Transfer and Discharge Rights,
Resident Rights, Resident Behavior and Facility Practices, Nursing
Services, Pharmacy Services, Dietary Services, Physician Services,
Specialized Rehabilitative Services, Dental Services, Infection
Control, and Physical Environment. Surveys also examine compliance with
federal fire safety requirements.
[18] Revisits are not required for most deficiencies cited below the
actual-harm level--that is, A through F.
[19] Nursing homes can also appeal deficiency citations; such appeals
result in hearings before an administrative law judge. Nursing homes
may also request review by HHS's Departmental Appeals Board.
[20] On-site sources include observations, interviews, and records
review.
[21] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[22] The federal government funds state surveys through the Medicare
and Medicaid programs. States contribute a share of Medicaid and non-
Medicaid funds to support survey activities. State non-Medicaid
contributions are to reflect the benefit states derive from health care
facilities that meet federal quality standards as well as the cost of
assessing compliance with state licensing requirements. See GAO,
Medicare and Medicaid Participating Facilities: CMS Needs to Reexamine
Its Approach for Funding State Oversight of Health Care Facilities,
[hyperlink, http://www.gao.gov/products/GAO-09-64] (Washington, D.C.:
Feb. 13, 2009).
[23] See [hyperlink, http://www.gao.gov/products/GAO-09-64].
[24] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[25] CMS commented on the importance of quality-assurance processes and
noted it had already incorporated such reviews into CMS regional
offices' reviews of the state performance standards. However, the
agency did not require states to initiate an ongoing process that would
evaluate the appropriateness of the scope and severity of documented
deficiencies, as we recommended. See [hyperlink,
http://www.gao.gov/products/GAO-03-561].
[26] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[27] In addition to the federal monitoring surveys, CMS established
annual state performance reviews in fiscal year 2001 to measure a
state's compliance with specific standards. These standards generally
focus on the timeliness and quality of surveys, complaint
investigations, and enforcement actions. CMS's state performance
reviews include (1) an examination of the quality of state survey
agency investigations and decision making and (2) the timeliness and
quality of complaint investigations.
[28] In 1998, the Health Care Financing Administration, the HHS agency
now known as CMS, acknowledged the need to perform a greater number of
comparative surveys and has since done so. Between October 1998 and
July 1999, only about 9 percent (64) of federal monitoring surveys were
comparative. However, in our May 2008 report, we found that for the
period of fiscal years 2002 through 2007 about 20 percent (976) of
federal monitoring surveys were comparative surveys and the remaining
80 percent were observational surveys. By statute, comparative surveys
must be conducted within 2 months of the completion of the state
survey.
[29] CMS began requiring regional offices to make this determination in
fiscal year 2002, and it is captured by a yes/no validation question.
[30] See [hyperlink, http://www.gao.gov/products/GAO-08-517].
[31] In May 2008, we found that understatement also occurred when state
survey teams cited deficiencies at too low a level of scope and
severity. At that time, CMS did not require federal surveyors to
evaluate scope and severity differences between state and federal
comparative surveys. However, as of October 2008, CMS began requiring
such assessments.
[32] See [hyperlink, http://www.gao.gov/products/GAO-08-517].
[33] For purposes of this report, we defined the federal nursing home
survey process as both the traditional methodology used to evaluate
compliance of nursing homes with federal requirements and the written
guidance provided by CMS to help state agencies carry out survey
activities.
[34] Survey methodology is defined as the traditional approach used to
evaluate nursing home compliance with federal regulations as outlined
by CMS in Appendix P of the SOM.
[35] For purposes of this report, we defined CMS written guidance as
the information in the SOM on the long-term care survey process,
including the survey protocol for long-term care facilities in Appendix
P and the guidance on federal quality standards in Appendix PP as well
as any additional materials provided by CMS to assist surveyors, such
as Survey and Certification letters.
[36] Guidance for determining actual-harm level deficiencies is
provided in Section IV, Appendix P of the SOM.
[37] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[38] Federal nursing home quality standards detail requirements for the
delivery of care, resident outcomes, and facility conditions. State
survey teams use these federal quality standards to assess compliance
during state nursing home surveys.
[39] In October 2000, CMS began revising investigative protocols for
assessing specific deficiencies. The intent of this initiative is to
enable surveyors to better (1) identify specific deficiencies, (2)
investigate whether a deficiency is the result of poor care, and (3)
document the level of harm resulting from a home's identified deficient
care practices. See [hyperlink, http://www.gao.gov/products/GAO-03-
561].
[40] Our questionnaire included 13 topics of the approximately 200
federal quality standards. Seven of these were taken from the Quality
of Care category of federal quality standards, the others originated
from different categories such as Resident Assessment and Dietary
Services. See [hyperlink, http://www.gao.gov/products/GAO-08-517].
[41] Beginning January 1, 1999, CMS directed states to avoid scheduling
surveys for the same month of the year as a nursing home's previous
survey. However, surveys can also be considered predictable if
occurring at a time other than near the 1-year anniversary or 15-month
maximum date. For example, nursing home operators could be alerted when
the state agency is surveying a facility in a nearby area if all the
facilities in that area were surveyed at about the same time.
[42] In 2003, we found that 34 percent of nursing home surveys were
predictable. See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[43] According to CMS, states consider 9 months to 15 months from the
last standard survey as the window for completing standard surveys
because it yields a 12-month average. Thus, to maintain an average
survey interval of 12 months, given that some facilities are not
surveyed until near or after 15 months, more surveys would need to
occur within 9 months of the last standard survey. See GAO-06-117.
[44] At the time of our survey, the QIS methodology was being
implemented in eight state agencies: Connecticut, Florida, Kansas,
Louisiana, Minnesota, New Mexico, North Carolina, and Ohio. According
to a CMS official, Connecticut is the only one of these states that has
implemented the QIS statewide. As of May 2008, CMS projected that the
QIS would be fully implemented in all states in 2014.
[45] The QIS evaluation was conducted to answer questions about
accuracy, documentation, changes in the number and types of
deficiencies, and whether the QIS process is more efficient. HHS, CMS,
Evaluation of the Quality Indicator Survey (QIS), Final Report
(December 2007), [hyperlink,
http://www.cms.hhs.gov/CertificationandComplianc/Downloads/QISExecSummary.pdf]
(accessed July 17, 2009).
[46] Scope refers to the number of residents potentially or actually
affected and has three levels--isolated, pattern, and widespread. A
pattern scope refers to deficiencies at the B, E, H, and K levels and a
widespread scope refers to deficiencies at the C, F, I, and L levels.
[47] We asked all 42 directors who had not participated in the QIS to
provide their opinions on the new methodology; we received comments
from 18 of the 42 directors.
[48] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[49] Virginia did not provide the information needed to compute a
vacancy rate.
[50] Michigan and Illinois did not provide this information.
[51] See [hyperlink, http://www.gao.gov/products/GAO-09-64].
[52] This information was last updated in June 2009 before the governor
of California signed the state's budget revisions.
[53] Seven states did not report the number of surveyors with less than
2 years of experience--Illinois, Michigan, Minnesota, Texas,
Washington, West Virginia, and Wisconsin.
[54] Revisit surveys are generally conducted in facilities when a G-
level or higher deficiency is cited by a survey team, to verify that
serious deficiencies have been corrected by the home.
[55] In an open-ended question at the end of the questionnaire, 842
surveyors commented on a wide range of topics related to surveys. These
comments represented about 36 percent of all nursing home surveyor
respondents; 15 percent of the comments represented about 6 percent of
all respondents.
[56] The 476 surveyors who responded to this open-ended question about
training needs constituted about 20 percent of all respondents.
[57] For supervisory review processes, we defined direct-line
supervisors as including survey team leaders, direct supervisors, and
supervisors at district or regional offices. Central state agency staff
was defined as including quality assurance teams, legal counsel, state
training coordinators, and compliance specialists.
[58] These review steps could be done at the direct-line supervisor
level or by central state agency staff. The difference between
supervisory review levels for surveys with J through L citations and
those for surveys with D through F citations was significant at the 1
percent level.
[59] Forty states review a sample of all draft surveys. Such reviews
may include additional examination of surveys with deficiencies at
either the D through F or J through L levels.
[60] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[61] Seventy-four percent of state agency directors indicated that
inadequate supervisory review processes infrequently or never
contributed to understatement in their state, and 4 percent of state
agency directors were unsure or had no opinion on this topic.
[62] CMS previously identified the existence of a potential noncitation
practice in one state that had an unusually high number of homes with
no deficiencies on their standard surveys. Contrary to federal
guidance, surveyors in that state were not citing all identified
deficiencies but rather brought them to the homes' attention with the
expectation that the deficiencies would be corrected. See GAO-03-561.
[63] CMS requires on-site revisits for any noncompliance identified at
level F (with a finding of substandard quality of care) or any level
higher than F.
[64] Effective January 2000, CMS expanded its immediate sanctions
policy, requiring referral of homes found to have harmed one or a small
number of residents (G-level deficiencies) on successive routine
surveys or intervening complaint investigations.
[65] CMS officials previously acknowledged that the double G policy may
have had an unintended negative effect on the rate of deficiency
citations. See GAO, Nursing Homes: Efforts to Strengthen Federal
Enforcement Have Not Deterred Some Homes from Repeatedly Harming
Residents, [hyperlink, http://www.gao.gov/products/GAO-07-241]
(Washington, D.C.: Mar. 26, 2007).
[66] In this section, we refer to two states we interviewed as State A
and State B to maintain confidentiality for the officials from these
state agencies. The corresponding regional offices are referred to as
regional office responsible for State A or State B, respectively.
[67] Alaska, Hawaii, Illinois, and West Virginia did not report
information on the number of IDRs.
[68] The following states did not report the number of deficiencies
deleted or downgraded through the IDR process: Alaska, Hawaii,
Illinois, Kentucky, Nebraska, New Jersey, New Mexico, Vermont, and West
Virginia. Maine, Oklahoma, Utah, Washington, and Wyoming provided the
number of deficiencies deleted but not the number that were downgraded.
[69] According to CMS guidance, if an outside entity conducts the IDR,
the results of the process may serve only as a recommendation to the
state survey agency of noncompliance or compliance with the federal
requirements for nursing homes.
[70] See [hyperlink, http://www.gao.gov/products/GAO-03-561].
[71] See [hyperlink, http://www.gao.gov/products/GAO-08-517].
[72] This report follows and expands on our May 2008 report, which
examined (1) the information contained in federal monitoring surveys
about understatement nationwide and (2) CMS management and oversight
of the federal monitoring survey program; see GAO-08-517.
[73] CMS's Survey and Certification Group is responsible for ensuring
the effectiveness of state survey activities and managing the federal
monitoring survey program.
[74] We mailed paper copies of the questionnaire to 15 surveyors in
Arkansas who did not have state-issued e-mail addresses; on request,
an additional copy was faxed to a surveyor. Seven of the 16 paper
copies were completed and returned to GAO.
[75] Although nursing home surveyors' responses were anonymous to
preserve their confidentiality, a few surveyors voluntarily provided
their contact information and agreed to be interviewed.
[76] Questions about CMS's survey methodology directed surveyors to
respond about the traditional survey methodology, not the new Quality
Indicator Survey (QIS) methodology, which had been implemented in eight
states.
[77] Respondents who indicated that they did not conduct health safety
surveys of nursing homes, and therefore should not have been included
in the population of eligible nursing home surveyors, were excluded
along with their responses.
[78] The Illinois response rate likely reflects that surveyors' access
to their e-mail accounts, and our Web-based survey, was limited to only
1 day per month.
[79] All state agency directors were asked about CMS's traditional
survey methodology, which all states used in 2008. However, eight state
agency directors, who indicated that the QIS had been implemented in at
least part of their states, were asked additional questions
specifically about the QIS.
[80] We did not ask nursing home surveyors a similar question because
survey agency directors, as a result of their positions, were a more
consistent source of knowledge about the influence of these factors on
understatement.
[81] Where responses to particular questions were fewer than the
overall number of responses for the questionnaire, this limitation is
indicated in the text.
[82] During this period, fiscal year 2002 was the first year that the
database contained all the information needed to assess the results of
federal comparative surveys. See [hyperlink,
http://www.gao.gov/products/GAO-08-517].
[83] Federal comparative surveys are done on a small group of
facilities that are not randomly selected, and the understatement of
deficiencies identified through comparative surveys is not
representative of all nursing home surveys or survey teams within each
state.
[84] For our descriptive statistics, we computed means, the minimum and
maximum responses, responses at the 25th, 50th, and 75th percentiles,
frequencies among categories of respondents, such as those from high-
and low-understatement states, and frequencies across two or more
categories of respondents. Correlations were computed as Pearson
correlation coefficients. T-tests were used to identify when the mean
responses from two different categories of respondents, such as high-
and low-understatement states, were significantly different from each
other.
[85] [hyperlink, http://www.gao.gov/products/GAO-08-517]. The General
Investigation segment assesses the effectiveness of state survey team
actions such as collection of information, discussion of survey
observations, interviews with nursing home residents, and
implementation of CMS investigative protocols. The Deficiency
Determination segment evaluates the skill with which the state survey
teams (1) integrate and analyze all information collected and (2) use
the guidance for surveyors and identify deviations from regulatory
requirements.
[86] We use the term survey record to refer to CMS's Form 2567, which
is the official statement of deficiencies with respect to federal
quality standards.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: