This is the accessible text file for GAO report number GAO-05-644
entitled 'Information Quality Act: National Agricultural Statistics
Service Implements First Steps, but Documentation of Census of
Agriculture Could Be Improved' which was released on September 23,
2005.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Committees:
September 2005:
Information Quality Act:
National Agricultural Statistics Service Implements First Steps, but
Documentation of Census of Agriculture Could Be Improved:
GAO-05-644:
GAO Highlights:
Highlights of GAO-05-644, a report to congressional committees:
Why GAO Did This Study:
The Information Quality Act (IQA) required the Office of Management and
Budget to issue guidelines for ensuring the quality, objectivity,
utility, and integrity of information disseminated by federal agencies.
As part of our long-term examination of the quality of federal
information, under the Comptroller General's authority, we reviewed how
the act was implemented by the National Agricultural Statistics Service
(NASS), and assessed the transparency of the documentation supporting
its Census of Agriculture. NASS is part of the U.S. Department of
Agriculture (USDA).
What GAO Found:
NASS fulfilled its various procedural responsibilities and reporting
requirements under the Office of Management and Budget's (OMB)
guidelines for implementing the act. For example, NASS drafted its own
implementation guidance, and developed a mechanism allowing affected
parties to request the correction of information they believe is of
poor quality. As a result of our review, NASS has also taken steps to
better document the criteria it uses to evaluate data users' input on
the content of the Census of Agriculture.
The Census of Agriculture Provides a Detailed Picture of U.S. Farms and
Ranches:
[See PDF for image]
[End of figure]
Building on these efforts, better documentation could improve the
transparency of census data products. For example, the nine key
products from the 2002 Census we examined lacked, among other things,
discussions of any data limitations. This is contrary to NASS's own
guidelines for ensuring transparency, which stress the importance of
describing the methods, data sources, and other items to help users
understand how the information was designed and produced.
Although NASS complied with OMB's requirement to establish a mechanism
under IQA to address requests to correct information, NASS has not
documented its approach for handling correction requests not filed
under IQA (NASS handles these correction requests using an existing,
informal method). Agency officials told us that data users have been
satisfied with the way NASS has responded to these requests. However,
because NASS does not document its informal procedures for handling
correction requests and lacks a recordkeeping system to log and track
them, NASS could not provide us with specific data on the number of
such requests it has handled, the nature of those requests, and whether
and how they were addressed.
What GAO Recommends:
To help enhance the transparency of the Census of Agriculture, we
recommend that the Secretary of Agriculture direct NASS to
(1) ensure that census products fully address NASS's guidelines for
data documentation or at least contain links to such information, and
(2) document and post on NASS's Web site its procedures for handling
data correction requests not filed under IQA. NASS agreed with our
findings and described the steps it is taking in response to our
recommendations. Additional actions, consistent with our
recommendations, would enhance NASS's efforts.
www.gao.gov/cgi-bin/getrpt?GAO-05-644.
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Orice Williams at (202)
512-6806 or williamso@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Objectives, Scope, and Methodology:
NASS Met the Procedural and Reporting Requirements of OMB's IQA
Guidelines:
Better Documentation Could Improve the Transparency of Data Products
and Correction Procedures:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendixes:
Appendix I: Comments from the Department of Agriculture:
Appendix II: How Census of Agriculture Reports Address Various
Documentation Elements:
Appendix III: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: NASS Addressed OMB's Agencywide Guidelines for Implementing
IQA:
Table 2: Census Reports Need More Robust Documentation:
Table 3: NASS Is Using More Extensive Outreach to Develop the 2007
Census Compared to 2002:
Letter September 23, 2005:
The Honorable Saxby Chambliss:
Chairman:
The Honorable Tom Harkin:
Ranking Democratic Member:
Committee on Agriculture, Nutrition and Forestry:
United States Senate:
The Honorable Collin C. Peterson:
Ranking Democratic Member:
Committee on Agriculture:
House of Representatives:
The information disseminated by federal agencies is a critical
strategic asset. For example, data collected for statistical purposes
provide indicators of the economic and social well-being of the nation,
while health, safety, environmental, and other scientific data help
inform agencies' rule-making activities. Given the widespread use and
impact of federal information, it is important for it to meet basic
quality standards.
Section 515 of the Treasury and General Government Appropriations Act
for Fiscal Year 2001--legislation that has come to be known as the
Information Quality Act (IQA)[Footnote 1]--required the Office of
Management and Budget (OMB) to issue governmentwide guidelines that
provide policy and procedural guidance to federal agencies for ensuring
and maximizing the quality, objectivity, utility, and integrity of
information disseminated by federal agencies.
OMB's guidelines, issued in final form in February 2002, directed
agencies covered by the act to issue their own quality guidelines, and
noted that, where appropriate, agencies should support their data with
transparent documentation. OMB's guidelines also required agencies to,
among other actions, report annually to the Director of OMB on the
number and nature of complaints received regarding compliance with the
guidelines, and establish an administrative mechanism whereby affected
parties can request the correction of information they deem to be of
poor quality.
As part of our long-term examination of the collection, dissemination,
and quality of federal information, we are reviewing the governmentwide
implementation of the IQA. As an initial step in our research on the
quality of statistical data, under the Comptroller General's authority,
we conducted a case study of the National Agricultural Statistics
Service (NASS), and its Census of Agriculture. NASS is a statistical
agency within the U.S. Department of Agriculture (USDA). We selected
the census because it is one of the largest government surveys with a
universe of 2.3 million respondents and an estimated paperwork burden
in excess of 1.3 million burden hours.[Footnote 2] The last census took
place in 2002 and the next census is scheduled for 2007.
Specifically, our objectives were to (1) review how NASS met OMB's
guidelines covering the IQA, and (2) examine the transparency of the
documentation behind the Census of Agriculture's processes and
products, both for the recently completed work on the 2002 Census and
the efforts underway for the 2007 Census. To achieve both objectives,
we reviewed OMB's and NASS's information quality guidelines and other
relevant documents. We also interviewed senior agency officials and
other personnel responsible for implementing the census including the
NASS Administrator, Associate Administrator, and Deputy Administrator
for Programs and Products.
To evaluate the transparency of census products, we reviewed nine
census products--eight reports and the Frequently Asked Questions (FAQ)
section on NASS's 2002 Census Web site--to determine the extent to
which NASS followed its own documentation guidelines. To obtain an
external perspective on how NASS's processes and products address the IQA
guidelines, we interviewed six data users from different types of
agricultural and research organizations. We selected these six because
they use census data on a regular basis and have attended NASS's
outreach meetings. Additional information on our approach is provided
in the Objectives, Scope, and Methodology section below.
We performed our work in Washington, D.C., from August 2004 through
August 2005 in accordance with generally accepted government auditing
standards.
Results in Brief:
NASS fulfilled its various procedural responsibilities and reporting
requirements under OMB's guidelines. For example, NASS (1) drafted its
own implementation guidelines and posted them on its Web site, (2)
developed an administrative mechanism allowing affected persons to seek
the correction of information, (3) designated an official responsible
for ensuring NASS's compliance with OMB's requirements, and (4)
reported to OMB the number of data quality complaints it received
under the IQA.[Footnote 3]
With respect to the transparency of the documentation underlying the
Census of Agriculture's data products and processes, as a result of our
review, NASS has taken steps to better document the criteria it uses to
evaluate data users' suggestions on the questionnaire content. The
development of the 2002 and 2007 Censuses was led by the Census Content
Team, which consisted of experienced NASS statisticians. The 2002 Team
assessed users' input using a documented set of criteria, which
considered such factors as whether the questionnaire items were
mandated by Congress or whether they would provide data on current
agricultural issues. However, because of staff turnover and
reassignments, the 2007 Team was unaware of the 2002 criteria, and
initially relied on professional judgment rather than documented
factors to evaluate input on the 2007 Census. According to NASS, our
review raised the 2007 Team's awareness of the earlier criteria, and it
has since developed similar documentation that it will use in the
future. This approach is more consistent with NASS's own IQA guidelines
concerning transparency, and could help create a closer link between
the questions included in the census and evolving agricultural policy
requirements, and thus a more cost-effective data collection program.
Documenting the content selection criteria will also guard against the
loss of institutional memory to the extent there is further turnover in
Content Team membership.
Building on these efforts, better documentation could improve the
transparency of census data products. Although NASS's guidelines for
ensuring transparency stress the importance of describing the methods,
data sources, assumptions, and other items in order to help users
understand how information was designed and produced, the eight census
reports as well as the FAQ section we examined lacked a discussion of
such important documentation practices as the limitations of the data;
the impact of imputations, by item; and whether any of the collected
data have been suppressed for data quality reasons.
The transparency of NASS's procedures for handling data correction
requests filed outside of the IQA could also be improved. NASS complied
with OMB's requirement to establish a mechanism to process requests for
correction of disseminated information under the IQA. As part of this
process, an individual must state that their request is being submitted
under the IQA. To date, no individual has done so. NASS handles all
other correction requests using its existing informal, undocumented
procedures. Agency officials told us that NASS has resolved the
informal requests it has handled so far to the data users'
satisfaction.
Nevertheless, because NASS does not document its informal procedures
for handling correction requests and lacks a recordkeeping system to
log and track them, NASS could not provide us with specific data on the
number of requests it has handled, the nature of those requests, and
whether and how they were addressed.
To help enhance the transparency of the Census of Agriculture's
processes and products, we recommend that the Secretary of Agriculture
direct NASS to (1) ensure its products fully address its own
requirements for transparent data documentation or at least contain
links to such information and (2) document and post on its Web site its
procedures for handling data correction requests not filed under the
IQA and track the disposition of those requests.
The NASS Administrator provided written comments on a draft of this
report (see app. I). NASS said the information was insightful, and
noted it will be used to strengthen the transparency of its methods and
procedures. In particular, NASS agreed with our findings and,
consistent with one of our two recommendations, said it will take steps
to better document its specialized reports. NASS also said it plans to
make a list of "common issues" raised by data users available on its
Web site, which is in line with our second recommendation to improve
the transparency of its procedures for handling data correction
requests not filed under the IQA. NASS's commitment to continually
improve its products is commendable, and its efforts to improve the
transparency of its processes and products would be further enhanced
if, consistent with our recommendations, it (1) ensures that all of its
census products fully address NASS's own guidelines for data
documentation and (2) posts on its Web site its procedures for handling
correction requests not filed under the IQA.
Background:
The IQA directed OMB to issue guidelines to federal agencies covered by
the Paperwork Reduction Act designed to ensure the "quality,
objectivity, utility, and integrity" of information disseminated to the
public.[Footnote 4] The IQA also directed OMB to include in its
guidelines requirements for agencies to (1) develop their own
information quality guidelines, (2) establish administrative mechanisms
for affected persons to seek correction of information that does not
comply with OMB's guidelines, and (3) annually report to OMB the number
and nature of complaints they receive regarding the accuracy of the
information they disseminate.[Footnote 5]
Prior to the IQA, there were several governmentwide actions aimed at
improving agency data. For example, Statistical Policy Directive No. 2,
first issued in 1952, required statistical agencies to inform users of
conceptual or other limitations of the data, including how the data
compare with similar statistics. In 1996, the Federal Committee on
Statistical Methodology--an OMB-sponsored interagency committee
dedicated to improving the quality of federal statistics--established a
subcommittee to review the measurement and reporting of data quality in
federal data collection programs. The results of the subcommittee's
work were published in a 2001 report that addressed such issues as what
information on sources of error federal data collection programs should
provide, and how they should provide it.[Footnote 6] For all federal
government information collections, the 1995 amendments to the
Paperwork Reduction Act called on federal agencies to manage
information resources with the goal of improving "the integrity,
quality, and utility of information to all users within and outside the
agency."[Footnote 7]
OMB's IQA guidelines were issued in final form in February
2002.[Footnote 8] They required agencies subject to the IQA to take
such steps as:
* issue information quality guidelines designed to ensure the quality,
objectivity, utility, and integrity of information disseminated to the
public;
* establish administrative mechanisms for affected persons to seek
correction of information they believe is not in compliance with the
guidelines;
* report annually to the Director of OMB on the number and nature of
complaints received regarding compliance with the guidelines and how
the agencies handled those complaints; and
* designate an official responsible for ensuring compliance with OMB's
guidelines.
The OMB guidelines defined quality as an encompassing term comprising:
* utility, which is the usefulness of the information to its intended
users;
* integrity, which refers to the security of information and its
protection from unauthorized access or revision; and
* objectivity, which addresses both presentation (i.e., whether the
information is being presented in an accurate, clear, complete, and
unbiased manner) and substance (i.e., whether the information is
accurate, reliable, and unbiased).
In addition, OMB addresses transparency within the definition of
objectivity and utility. As recognized in OMB's guidelines, agencies
that disseminate influential scientific, financial, or statistical
information must demonstrate a high degree of transparency about data
and methods. This transparency is intended to facilitate the
information's reproducibility by an outside party, or the reanalysis
of an agency's results.
The National Research Council of the National Academies considers
transparency a key principle for federal statistical agencies, and
stated in a recent report that transparency, which it defines as "an
openness about the sources and limitations of the data," is
particularly important for instilling credibility and trust among data
users and providers.[Footnote 9]
As an agency within USDA, NASS is required to comply with the IQA. One
statistical program administered by NASS is the quinquennial Census of
Agriculture. According to NASS, the census provides a detailed picture
of U.S. farms and ranches every 5 years and is the only source of
uniform, comprehensive agricultural data at the county level. The
results are published in 18 reports divided among three categories:
Geographic Area Series, Census Quick Stats, and Specialty Products and
Special Studies. Users of this information include federal agencies
(for program and statistical purposes), farm organizations, businesses,
universities, state departments of agriculture, elected
representatives, legislative bodies at all levels of government, and
academia. The next Census of Agriculture is scheduled for 2007.
Objectives, Scope, and Methodology:
Our objectives were to (1) review how NASS met OMB's guidelines
covering the IQA and (2) examine the transparency of the documentation
behind the Census of Agriculture's processes and products, including
the recently completed work on the 2002 Census, and the efforts
currently underway for the 2007 Census.
To achieve both of these objectives, we reviewed OMB's and NASS's
information quality guidelines, Census of Agriculture reports,[Footnote
10] submissions to OMB, and other relevant documents. We also
interviewed NASS officials about how NASS conducted the 2002 Census and
how it is planning for the 2007 Census. The officials included the NASS
Administrator, Associate Administrator, and Deputy Administrator for
Programs and Products.
In addition, to evaluate the transparency of Census of Agriculture
products, we reviewed eight census reports and the Frequently Asked
Questions area of the 2002 Census Web site, to determine the extent to
which NASS followed its own procedures for ensuring the transparency of
its information products. NASS's IQA guidelines define transparency as
"a clear description of the methods, data sources, assumptions,
outcomes, and related information that allows a data user to understand
how an information product was designed and produced."
NASS's guidelines state that its survey work includes such activities
as sample design, questionnaire design, pre-testing, analysis of
sampling, and imputation of missing data. However, the
guidelines were not clear as to the specific activities to be
documented. Consequently, we reviewed the practices employed by such
organizations as the National Academies, the International Monetary
Fund, and the U.S. Census Bureau, and developed a
set of 20 practices associated with transparent documentation that
encompassed the items NASS laid out in its own guidelines. The
practices include such actions as defining data items, discussing
sample design, and describing how the content of the survey differs
from past iterations (see app. II).
We looked for the presence or absence of these practices in the nine
products, drawn from the 18 census reports and related data that NASS
disseminates, and verified the results with a second, independent
analysis. In instances where a product did not include a particular
documentation practice, we reviewed whether it instead informed data
users where to obtain this information. We chose these nine products
because they all stem from the original census data collection,
represent different product categories, and were available on the
census Web site as of February 1, 2005.
To obtain an external perspective on how NASS's processes and products
address the IQA guidelines, we interviewed six data users from
different types of agricultural and research organizations. We selected
these data users from lists of registrants for USDA and NASS outreach
meetings within the past 5 years. We selected these six data users
because they use information from the census on a regular basis.
Moreover, these data users attended the most recent NASS outreach
meeting, which specifically addressed the 2002 and 2007 Censuses. Some
data users had also provided NASS with feedback on the content of the
agricultural census. Their views cannot be projected to the larger
population of census data users.
We requested comments on a draft of this report from the Secretary of
Agriculture. On September 8, 2005, we received the NASS Administrator's
written comments and have reprinted them in appendix I. They are
addressed in the Agency Comments and Our Evaluation section of this
report.
NASS Met the Procedural and Reporting Requirements of OMB's IQA
Guidelines:
NASS fulfilled the various procedural responsibilities and reporting
requirements under OMB's guidelines. For example, NASS released its own
IQA guidelines for public comment on March 27, 2002. NASS officials
stated they received no substantive comments on them and OMB approved
the guidelines with only minimal changes. The officials also noted that
no revisions have been made since then. Table 1 shows in greater detail
how NASS addressed OMB's guidelines.
Table 1: NASS Addressed OMB's Agencywide Guidelines for Implementing
IQA:
OMB directed agencies to:
* Prepare a draft report explaining how their guidelines will ensure
and maximize the quality of information;
* Publish a notice in the Federal Register announcing the availability
of this report on the agency's Web site for public comment;
NASS's response:
* NASS posted its draft report and guidelines to its Web site from
March 27, 2002, until September 30, 2002;
* Notice of availability was published in the June 4, 2002, Federal
Register[A] and the report itself was available on NASS's Web site.
OMB directed agencies to:
* Post their final report and guidelines to their Web sites;
NASS's response:
* Final guidelines are on the NASS Web site.
OMB directed agencies to:
* Develop administrative mechanisms allowing affected persons to
correct disseminated information that does not comply with the OMB
guidelines;
NASS's response:
* NASS outlines its correction procedures in detail on its Web site.
OMB directed agencies to:
* Submit a report by January 1 of each year on the number and nature of
complaints received in the prior fiscal year;
NASS's response:
* USDA submitted a report to OMB in each of the two years since the
guidelines took effect; NASS reported no complaints.
OMB directed agencies to:
* Designate an official to be responsible for the agency's compliance
with OMB's guidelines;
NASS's response:
* USDA has designated its Chief Information Officer (CIO) as this
official. The CIO in turn delegates compliance questions to lower-level
offices, including the Standards Officer within the office of the
Associate Administrator at NASS.
Source: GAO analysis of OMB and NASS documents.
[A] 67 Fed. Reg. 38,467.
[End of table]
Better Documentation Could Improve the Transparency of Data Products
and Correction Procedures:
NASS's IQA guidelines define transparency as "a clear description of
the methods, data sources, assumptions, outcomes, and related
information that allows a data user to understand how an information
product was designed and produced." NASS's guidelines also note that
"NASS will make the methods used to produce information as transparent
as possible" and that its "internal guidelines call for clear
documentation of data and methods used in producing estimates and
forecasts. . . ."
To assess the extent to which NASS processes help ensure the
transparency of the information it publishes, we examined key
publications from the 2002 Census of Agriculture. Census reports vary
in terms of scope and intended audience (see table 2). On the one hand,
the United States Summary and State Data report contains over 100 data
tables, an introduction, and four appendices. On the other hand, County
Profile reports summarize each county's agricultural situation on two
pages.
Overall, we assessed eight census reports within three product
categories, as well as the Frequently Asked Questions (FAQ) section of
the 2002 Census Web site, to determine the extent to which NASS
followed its own guidelines for ensuring the transparency of its
products. As shown in table 2, the transparency of the data
documentation in the reports we reviewed varied between the Geographic
Area Series reports--which are the most comprehensive of NASS's
products and addressed 15 of the 20 data documentation practices--and
the Specialty Products and Special Studies, which, depending on the
specific product, addressed no more than 1 of the practices.
Table 2: Census Reports Need More Robust Documentation:
Product category: Geographic Area Series:
Product title: United States Summary and State Data;
General description: Contains over 100 national and state data tables;
Portion of 20 documentation practices addressed: 15.
Product title: State and County Data;
General description: Contains over 100 state and county data tables;
Portion of 20 documentation practices addressed: 15.
Product category: Census Quick Stats: Ag Statistics Data Base:
Product title: 2002 Census of Agriculture Downloadable Application;
General description: A database for users to download and generate data
tables at the national, state, and county levels;
Portion of 20 documentation practices addressed: 14.
Product category: Specialty Products and Special Studies:
Product title: State and County Profiles;
General description: Two-page reports containing summary data about a
state or county;
Portion of 20 documentation practices addressed: 0.
Product title: Quick Facts from the 2002 Census of Agriculture;
General description: This report presents national data in 16 charts or
graphs;
Portion of 20 documentation practices addressed: 1.
Product title: Ranking of Market Value of Agricultural Products Sold;
General description: This report contains state tables that rank the
agricultural products sold by market value. The report also includes
table definitions;
Portion of 20 documentation practices addressed: 1.
Product title: Congressional District Profiles;
General description: Each profile is a two-page report that contains
summary data about one congressional district;
Portion of 20 documentation practices addressed: 0.
Product title: Ranking of Congressional Districts;
General description: Contains tables for 46 data items, such as number
of farms, and ranks the congressional districts for each of these data
items;
Portion of 20 documentation practices addressed: 1.
Product category: Additional Information:
Product title: Frequently Asked Questions;
General description: This section of the census Web site contains
questions and answers grouped into four categories;
Portion of 20 documentation practices addressed: 6.
Source: GAO analysis of NASS reports.
[End of table]
All eight reports and the FAQ Web site lacked a discussion of the
following four documentation practices:
1. Questionnaire testing. NASS produced a separate, internal report
that discusses questionnaire testing in detail; however, publicly
available census publications do not address this topic.
2. Limitations of the data. NASS does not discuss data limitations in
the census reports we reviewed.
3. Impact of imputations, by item. When a statistical agency receives a
report form with missing values, it normally estimates or "imputes"
those values based on comparable data sources, such as a similar farm
operation (an illustrative sketch of this technique follows this
list). Although NASS uses a complex editing and imputation process to
estimate missing values, and describes this process in the appendix to
the United States Summary and State Data report, it does not quantify
the impact of imputations by item in its reports.
4. Whether any of the collected data have been suppressed for data
quality reasons. Without information on whether any data were
suppressed because their quality was lacking, data users must assume
that the reports include all data items collected in the census and
that those items met agency publication standards.
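As referenced in practice 3 above, the following minimal sketch (in
Python) illustrates one common imputation approach--"hot-deck"
imputation--in which a missing value is filled in from the most
similar complete record, and the impact of imputations is then
quantified by item. The field names and similarity measure are
hypothetical assumptions for illustration only; they do not depict
NASS's actual, more complex editing and imputation system.

    # Hypothetical hot-deck imputation sketch; the field names and the
    # similarity measure are illustrative assumptions, not NASS's system.
    records = [
        {"acres": 500, "cattle": 120},
        {"acres": 480, "cattle": None},  # missing value to impute
        {"acres": 60, "cattle": 15},
    ]

    def impute(records, item, match_on):
        # Fill each missing `item` from the complete record ("donor")
        # that is closest on `match_on`.
        donors = [r for r in records if r[item] is not None]
        imputed = 0
        for r in records:
            if r[item] is None:
                donor = min(donors,
                            key=lambda d: abs(d[match_on] - r[match_on]))
                r[item] = donor[item]
                imputed += 1
        # Quantify the impact of imputations for this item.
        return imputed / len(records)

    rate = impute(records, item="cattle", match_on="acres")
    print(f"cattle: {rate:.0%} of values imputed")  # cattle: 33% of values imputed

A published report could accompany each data item with such an
imputation rate, giving users a direct measure of how much of that
item was estimated rather than reported.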
Although NASS appropriately recognizes the variation in data user needs
by publishing several types of specialized reports, none of the reports
we reviewed directs data users to where they can find complete or
additional documentation. For example, given the short
length and summary format of the County Profile reports, it is not
surprising that they lack documentation. However, in order for users to
assess the quality of the data contained in the reports, it is
important for NASS to at least provide links on its Web site or to
other publications where users can access definitions, response rates,
and other relevant information.
NASS Should Document Its Procedures for Handling Correction Requests
Not Filed under the IQA:
NASS has two methods for handling data correction requests, depending
on how they are submitted: a formal approach prescribed by OMB for
correction requests filed under IQA, and an informal approach that NASS
uses to address correction requests that are not filed under IQA.
NASS's informal correction procedures lack transparency because they
are not documented and individual cases are not tracked. As a result,
we could not determine the nature of these correction requests or
whether or how they were addressed.
Consistent with OMB's guidelines, NASS detailed its procedures to
request corrections under IQA on its Web site, and posted appropriate
Federal Register notices. For example, NASS's Web site explains that to
seek a correction under IQA, petitioners must, among other steps: (1)
state that their request for correction is being submitted under IQA,
(2) clearly identify the information they believe to be in error, and
(3) describe which aspects of NASS's IQA guidelines were not followed
or were insufficient.[Footnote 11]
According to the instructions posted on its Web site, NASS's IQA
procedures are triggered only when petitioners explicitly state they
are submitting a correction request under IQA. To date, none have done
so. NASS addresses all other correction requests using informal,
undocumented procedures that were in place before IQA was enacted. NASS
officials explained that such requests are forwarded to the agency
official responsible for preparing the report containing the
information in question. That official, in turn, determines if the
request can be resolved by clarifying the data, or whether a correction
is needed. If a data item needs to be corrected, NASS has a set of
procedures for documenting errors and issuing errata reports that are
detailed in its Policy and Standards Memorandum No. 38. The memorandum
describes the circumstances under which errata reports will be printed,
and provides a mechanism for NASS staff to describe the nature of the
error, its cause, and the action taken to resolve it.
According to the Administrator, Associate Administrator, and other
senior NASS officials we interviewed, the requests NASS has handled
the 2002 Census have so far been resolved to the petitioners'
satisfaction, and none resulted in any corrections to the data from the
2002 Census. However, because NASS does not document its informal
procedures for handling inquiries and data correction requests, and
lacks a recordkeeping system to log and track them, NASS could not
provide us with firm information on the number of inquiries it has
handled, the nature of those inquiries, and whether and how they were
addressed.
This is not to say that all complaints should follow the same
procedures required by the IQA mechanism. For efficiency's sake, it is
important for agencies to respond to complaints in accordance with the
magnitude of the problem. However, to provide a more complete picture
of the questions NASS receives about its data and how those questions
were handled, it will be important for NASS to better document its
approach for handling correction requests not filed under IQA, and
track their disposition (one lightweight way to do so is sketched
below).
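The sketch that follows (in Python) shows one such lightweight
approach: each request is logged with its nature and disposition so
that the number, characteristics, and outcomes of requests can be
reported later. The file name and fields are hypothetical assumptions,
not a description of any NASS system.

    import csv
    import os
    from datetime import date

    # Hypothetical correction-request log; the file name and fields are
    # illustrative assumptions, not an actual NASS system.
    LOG = "correction_requests.csv"
    FIELDS = ["received", "product", "nature", "disposition", "resolved"]

    def log_request(product, nature, disposition, resolved=False):
        # Append one request to the log, writing a header if the file
        # is new.
        new_file = not os.path.exists(LOG)
        with open(LOG, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if new_file:
                writer.writeheader()
            writer.writerow({
                "received": date.today().isoformat(),
                "product": product,
                "nature": nature,
                "disposition": disposition,
                "resolved": resolved,
            })

    # Example: a request resolved by clarifying the data rather than
    # correcting it.
    log_request("State and County Data", "questioned a county cattle total",
                "clarified data definition; no correction needed",
                resolved=True)

Even a log this simple would let an agency report the number and
nature of the requests it handles and whether and how they were
addressed.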
NASS Has Taken Steps to Better Document Its Criteria for Assessing
Input on Census Content:
The 2002 Census of Agriculture was the first in which NASS developed
the questionnaire (the 1997 Census of Agriculture was moved from the
Census Bureau to NASS after the content had been determined). In doing
so, NASS went to great lengths to obtain input from data users on what
questions to ask, and evaluated their suggestions using a documented
set of criteria. In preparing for the 2007 Census, NASS sought feedback
on the questionnaire content from a broader spectrum of data users, in
part because NASS solicited suggestions via the Internet. However,
unlike the 2002 cycle, the criteria NASS used to assess the feedback
were not initially documented, which is contrary to NASS's IQA
guidelines. As a result of our review, however, NASS has since
developed documented criteria similar to those used during the
previous census.
Under the Paperwork Reduction Act, agencies must obtain OMB's approval
prior to collecting information from the public. As part of this
process, agencies must certify to OMB that, among other things, the
effort is necessary for the proper performance of agency functions,
avoids unnecessary duplication, and reduces burden on small entities.
Agencies must also provide an estimate of the burden the information
collection would place on respondents.[Footnote 12]
For the 2002 Census, NASS submitted its request for approval--a form
called "OMB 83-I"--in August 2001, and OMB approved it in October 2001.
NASS estimated that the census would require a cumulative total of more
than 1.3 million hours for respondents to complete and would cost them,
in terms of their time, in excess of $21 million.
OMB's approval process also requires agencies to solicit input from
external sources. NASS obtained input on the 2002 Agricultural Census
content through a Federal Register notice, through meetings with data
users, and by contacting federal and state agencies that use census
statistics to discuss data needs.
Likewise, NASS is obtaining input on the content of the 2007 Census
through a variety of channels. According to an agency official, the
process began around June 2004, when NASS began releasing publications
from the 2002 Census. NASS sent an evaluation form to its state offices
requesting feedback on the census, including their suggestions for
changing the content. NASS also asked the state offices to identify
users from whom it could obtain additional feedback.
NASS solicited further input by reaching out to data users within USDA
and other federal agencies, querying organizations included in a list
of "typical" data users maintained by NASS's Marketing and Information
Services Office, and holding periodic regional meetings with data
users. NASS also has a "hot button" on its Web site where visitors are
asked what items, if any, should be added to or deleted from the
census.[Footnote 13]
In all, NASS obtained input on the 2007 Census through 10 distinct
channels. Moreover, compared to the process used to develop the content
of the 2002 Census, its 2007 efforts were open to a wider spectrum of
customers, and involved more direct contact with data users during the
planning phase. Indeed, as shown in table 3, NASS's outreach via the
Internet, regional meetings, and queries to data users was over and
above the steps it took when developing the 2002 Census. This openness
was reflected in the comments of the six data users we interviewed.
Five of the six users said NASS's approach to eliciting input was
adequate, while three of the six had requested new content items for
the 2007 Census to better meet the needs of their organizations.
The content evaluation process began in December 2004, and NASS is
currently testing the questionnaire content. Following any refinements,
mail-out of the actual census is scheduled for December 2007.
Table 3: NASS Is Using More Extensive Outreach to Develop the 2007
Census Compared to 2002:
Method of outreach: Posted Federal Register notices;
2002 Census: Yes;
2007 Census: Yes.
Method of outreach: Solicited input from state agricultural statistical
offices;
2002 Census: Yes;
2007 Census: Yes.
Method of outreach: Solicited input from state governors;
2002 Census: Yes;
2007 Census: No.
Method of outreach: Solicited input from Advisory Committee on
Agricultural Statistics;
2002 Census: Yes;
2007 Census: Yes.
Method of outreach: Solicited input from land grant universities;
2002 Census: Yes;
2007 Census: Yes.
Method of outreach: Solicited input from federal data users;
2002 Census: Yes;
2007 Census: Yes.
Method of outreach: Held federal data user working group meetings;
2002 Census: Yes;
2007 Census: Yes.
Method of outreach: Held USDA-wide national data user outreach meeting;
2002 Census: Yes;
2007 Census: Yes.
Method of outreach: Solicited input from a list of "typical" census
users maintained by NASS's Marketing and Information Services Office;
2002 Census: No;
2007 Census: Yes.
Method of outreach: Solicited input via Web site feedback form;
2002 Census: No;
2007 Census: Yes.
Method of outreach: Held NASS-specific, regional data user meeting;
2002 Census: No;
2007 Census: Yes.
Source: GAO analysis of NASS data.
[End of table]
For both the 2002 and 2007 Census cycles, the solicitation, review, and
ultimate determination of the questionnaire content were led by the
Census Content Team, a group consisting of experienced NASS
statisticians representing different segments of the agency such as
livestock, crops, and marketing. The 2002 Content Team used specific,
documented criteria to inform its decisions. Specifically, suggestions
were assessed according to the following factors, which were also made
available to data users:
* items directly mandated by Congress or items that had strong
congressional support;
* items proposed by other federal agencies where legislation called for
that agency to provide data for Congress;
* items needed for evaluation of existing federal programs;
* items that, if omitted, would result in additional respondent burden
and cost because other agencies or users would need to field a new
survey;
* items required for classification of farms by historical groupings;
* items needed for improving coverage in the census; and
* items that would provide data on current agricultural issues.
However, the criteria the 2007 Team used to assess input on the
questionnaire content were not initially documented. According to
agency officials we interviewed, NASS largely relied on professional
judgment to evaluate the feedback it received, considering such factors
as the need to keep the data comparable to past censuses and not
increase the length of the questionnaire.
Although a certain amount of professional judgment will invariably be
used in making determinations on questionnaire content, the absence of
documented assessment criteria is inconsistent with NASS's guidelines.
Indeed, these guidelines note that transparent documentation "allows a
data user to understand how an information product was designed and
produced." Moreover, without documented criteria, it is not clear
whether members of the Content Team are considering the same set of
factors, or even if they are weighing those factors in the same manner.
According to NASS, the shift in approach stemmed from staff turnover
and reassignments of members of the 2002 Team and, as a result, the
2007 Team was not aware of the criteria used in 2002. Our review made
the 2007 Team aware of the earlier set of criteria, and the Team has
since developed similar documentation. NASS noted that all future
content teams will use and update these criteria when developing the
content of subsequent censuses.
It will be important for NASS to continue with this approach because it
is more consistent with its own IQA guidelines, and will also help NASS
to do the following:
Ensure the utility and relevance of information. A key principle for
federal statistical agencies is to provide information relevant to
issues of public policy.[Footnote 14] However, the nation's information
needs are constantly evolving, and it is important for statistical
agencies to adapt accordingly. This is particularly true with
agriculture, where a variety of factors such as changing technology and
agricultural trends can affect what information should be collected.
Rigorous content selection criteria could help NASS methodically
evaluate the needs of different users, establish priorities, and keep
the census synchronized with changing public policy requirements.
Maximize cost-effectiveness and reduce public burden. As with all
federal surveys, there are financial and nonfinancial costs to
conducting the Census of Agriculture. These costs include the direct
expenditures related to planning, implementing, and analyzing the
census, as well as disseminating the information. There is also a cost
to respondents in terms of the time they take to complete the
questionnaire. Additionally, there are opportunity costs in that for
every question that is included in the census, another question might
need to be excluded so as not to increase the length of the census.
Rigorous, consistently applied criteria can help promote cost-
effectiveness because they help ensure that only those questions that
meet a particular, previously identified need are included in the
census. Applying such criteria also helps inform decisions on the
appropriate role of the federal government in collecting the data, and
whether a particular question might be more appropriately addressed by
a different survey, a different government organization, or the
private sector.
Maintain credibility. Content selection criteria provide a basis for
consistent decision making on what to include in the census and what
gets left off. This is especially important for maintaining NASS's
credibility given the input it receives from various sources. Without
documented criteria, NASS's actions could be perceived as arbitrary or
disproportionately swayed by one particular interest or another; with
documented criteria, NASS's decisions will be more defensible.
Further, documented criteria will guard against the loss of
institutional memory to the extent there is further turnover in Content
Team membership.
Conclusions:
NASS satisfied the procedural responsibilities and reporting
requirements under OMB's IQA guidelines. Moreover, to the extent that
NASS continues to use the documented criteria it developed to inform
future decisions on the content of the Census of Agriculture, doing so
could help establish a closer alignment between the questions included
in the
census and evolving agricultural policy requirements, resulting in a
more cost-effective data collection program.
Building on these efforts, the transparency of census data products
could be improved with more robust documentation. NASS's procedures for
addressing correction requests not filed under IQA could be more
transparent as well. More than just a paperwork issue, greater
transparency will help enhance NASS's accountability to public data
users and increase the credibility of census information.
Recommendations for Executive Action:
To help enhance the transparency of the Census of Agriculture's
processes and products, we recommend that the Secretary of Agriculture
direct NASS to take the following two steps:
1. Ensure that census products fully address NASS's own guidelines for
data documentation or at least contain links to such information. The
list of 20 documentation practices that we developed, while not
necessarily exhaustive, represents sound actions used by other
statistical agencies and could form a starting point for NASS.
2. Document and post on NASS's Web site its procedures for handling
data correction requests not filed under IQA, and track the disposition
of those requests.
Agency Comments and Our Evaluation:
The NASS Administrator provided written comments on a draft of this
report on September 8, 2005, which are reprinted in appendix I. NASS
noted that our "report and recommendations are insightful and will be
used to further strengthen the transparency of NASS methods and
procedures."
In particular, NASS concurred with our finding that the methods and
procedures in its specialized reports should be better documented and,
consistent with our recommendation, stated that these products "will
now provide links to this information." NASS's efforts, if fully
implemented, should make it easier for data users to understand how
these products were designed and produced, and NASS should be commended
for its actions to continually improve its products and better meet the
needs of its customers.
While NASS's more comprehensive products were better documented, our
analysis found that they could also benefit from more robust
documentation. Thus, in keeping with our recommendation, it will be
important for NASS to ensure that all of its census products--its
larger reports and more focused studies--fully address NASS's own
guidelines for data documentation.
In commenting on our recommendation for NASS to document and post on
its Web site its procedures for handling data correction requests not
filed under IQA, NASS concurred with our view that this information
would provide it with a better sense of the questions it receives about
its data, but added that "a detailed recordkeeping system to log and
track every inquiry" would not be the best use of its resources.
Instead, NASS plans to "compile a listing of the more common issues"
and make them available on its Web site in the form of frequently asked
questions. NASS believes this approach would be useful for future
planning, as well as provide answers to questions most likely to arise
among other data users.
As noted in our report, our recommendation stemmed from our finding
that NASS could not provide us with information on the number of
inquiries not filed under IQA, the characteristics of those inquiries,
and how they were addressed. Although the details remain to be seen,
NASS's proposed approach could provide this information and, consistent
with the intended outcome our recommendation, address the need for
greater transparency. NASS's efforts will be further strengthened if,
consistent with our recommendation, it posts on its Web site its
procedures for handling correction requests not filed under IQA.
We will send copies of this report to other interested congressional
parties, the Secretary of Agriculture, and the NASS Administrator.
Copies will be made available to others on request. This report will
also be available at no charge on GAO's Web site at [Hyperlink,
http://www.gao.gov].
If you or your staff have any questions about this report, please
contact me at (202) 512-6806 or [Hyperlink, williamso@gao.gov]. Contact
points for our Offices of Congressional Relations and Public Affairs
may be found on the last page of this report. GAO staff who made major
contributions to this report are listed in appendix III.
Signed by:
Orice M. Williams:
Director:
Financial Markets and Community Investments:
[End of section]
Appendixes:
Appendix I: Comments from the Department of Agriculture:
United States Department of Agriculture:
National Agricultural Statistics Service:
1400 Independence Avenue, SW
Washington, DC 20250-2000:
September 8, 2005:
Ms. Orice M. Williams:
Director, Financial Markets and Community Investments:
U.S. General Accountability Office:
Washington, D.C. 20548:
Dear Ms. Williams:
This is in response to the draft General Accountability Office (GAO)
report on Information Quality Act: National Agricultural Statistics
Service Implements First Steps but Documentation of Census of
Agriculture Could be Improved. NASS appreciates the detail and
professionalism exhibited by the General Accountability Office
auditors. The draft report and recommendations are insightful and will
be used to further strengthen the transparency of NASS methods and
procedures.
For clarity, the NASS comments have been organized under each of the
two recommendations to the Secretary of Agriculture.
1. Ensure that census products fully address NASS's own guidelines for
data documentation, or at least contain links to such information. The
list of 20 documentation practices that we developed, while not
necessarily exhaustive, represents sound actions used by other
statistical agencies and could form a starting point for NASS.
The draft GAO report correctly depicts the NASS guidelines on data
documentation. The NASS goal is to provide a clear description of the
methods and procedures used so a data user will understand how an
information product was designed and produced.
NASS concurs that it has done a much more thorough job of documenting
methods and procedures in its comprehensive products, such as the
"United States Summary: Geographic Area Series." NASS also agrees that
specialized reports, though generally short in length, should inform
data users where to obtain similar procedural information.
NASS will continue to provide thorough documentation of its methods and
procedures in its comprehensive products and will now provide links to
this information in the shorter, specialized reports.
2. Document and post on NASS's Web site its procedures for handling
data correction requests not filed under IQA, and track the disposition
of those requests.
NASS strives to handle all data correction requests in an unbiased and
thorough manner. These requests usually result from an individual's
professional opinion differing from the official estimate. The official
estimate is the product of predefined, consistent statistical
procedures. As stated in the draft report, none of the data correction
requests received to date for the 2002 Census of Agriculture resulted
in corrections to the data.
NASS concurs with the draft report observation that it might be useful
to obtain "a more complete picture of the questions NASS receives about
its data." However, NASS does not believe it would be the best use of
resources to attempt to maintain a detailed recordkeeping system to log
and track every inquiry. An alternative approach, which NASS will
implement, is to compile a listing of the more common issues raised
after census publication and disseminate, via the NASS Web site, a post-
census summary of frequently asked questions (FAQ) that seek data
clarification. This will allow NASS to develop a better picture of the
questions received for future planning as well as provides answers to
those questions most likely to arise among other data users.
We appreciate the opportunity to review and provide comments on the
draft report.
Sincerely,
Signed by:
R. Ronald Bosecker:
Administrator:
[End of section]
Appendix II: How Census of Agriculture Reports Address Various
Documentation Elements:
Documentation practices: Discussion of how NASS developed the content of the census;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database; Frequently Asked Questions.

Documentation practices: Discussion of 2002 Census content consistency with 1997;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database; Frequently Asked Questions.

Documentation practices: Description of how the content differs;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database; Frequently Asked Questions.

Documentation practices: Discussion of why NASS made content changes;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Definition of the population;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database; Quick Facts; Frequently Asked Questions.

Documentation practices: Definition of data items;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database; Quick Facts; Ranking of Congressional Districts.

Documentation practices: Discussion of sample design;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Discussion of questionnaire design;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Discussion of questionnaire testing;
Product titles: None of the products we examined.

Documentation practices: Copy of the questionnaire;
Product titles: United States Summary and State Report; State and County Reports (Alabama).

Documentation practices: Discussion of data collection procedures;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Discussion of nonresponse followup;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Discussion of data entry procedures;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Discussion of data editing procedures;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Quantify the impact of imputations, by item;
Product titles: None of the products we examined.

Documentation practices: Discussion of response rates;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database; Frequently Asked Questions.

Documentation practices: Discussion of nonsampling error;
Product titles: United States Summary and State Report; State and County Reports (Alabama); Quick Stats: Ag Statistics Database.

Documentation practices: Discussion of the limitations of the data;
Product titles: None of the products we examined.

Documentation practices: Discussion of whether any of the collected data have been suppressed for data quality reasons;
Product titles: None of the products we examined.

Documentation practices: Comparison of census results to other survey results for consistency;
Product titles: Frequently Asked Questions.
Source: GAO analysis of NASS data.
Note: See the Objectives, Scope, and Methodology section of this report
for a complete explanation of our analysis.
[End of table]
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact:
Orice M. Williams, (202) 512-6806:
Acknowledgments:
In addition to the contact named above, Robert Goldenkoff, Assistant
Director; David Bobruff; Jennifer Cook; Richard Donaldson; Andrea
Levine; Robert Parker; John Smale; and Michael Volpe made key
contributions to this report.
(450344):
FOOTNOTES
[1] Consolidated Appropriations Act, 2001, Pub. L. No. 106-554, § 515,
114 Stat. 2763A-153 to 2763A-154 (2000) (44 U.S.C. § 3516 note).
[2] Under the Paperwork Reduction Act, agencies must estimate the
burdens their data collections impose on the public.
[3] NASS officials reported that they have not received any complaints
under the IQA.
[4] Agencies covered by the IQA include all agencies subject to the
Paperwork Reduction Act (PRA) - cabinet departments, independent
regulatory agencies (e.g., Federal Communications Commission), and
other independent agencies (e.g., the Environmental Protection Agency).
44 U.S.C. § 3502(1).
[5] Discussion of the IQA often centers on its impact on agencies'
regulatory activities. Supporters of IQA, many of whom represent
businesses and other regulated entities, maintain that IQA could
enhance the quality of agency science and improve the rule-making
process. Critics of IQA, including some environmental and public
interest groups, view the law as a device to curtail health, safety,
and other regulations.
[6] OMB, Measuring and Reporting Sources of Error in Surveys,
Statistical Policy Working Paper 31, July 2001.
[7] 44 U.S.C. § 3506(b)(1)(C).
[8] 67 Fed. Reg. 8452 (Feb. 22, 2002).
[9] Margaret E. Martin, Miron L. Straf, and Constance F. Citro, eds.,
Principles and Practices for a Federal Statistical Agency, 3rd ed.
(Washington, D.C.: National Academies Press, 2005), pp. 8, 29.
[10] Reports can be obtained from USDA's Web site; see
www.usda.gov/nass.
[11] See www.usda.gov/nassinfo/infocorrection.htm.
[12] 44 U.S.C. § 3506(c).
[13] See http://www.nass.usda.gov/census/census02/feedback.htm.
[14] Margaret E. Martin, Miron L. Straf, and Constance F. Citro, eds.,
Principles and Practices for a Federal Statistical Agency, 3rd ed.
(Washington, D.C.: National Academies Press, 2005), p. 4.
GAO's Mission:
The Government Accountability Office, the investigative arm of
Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director,
NelliganJ@gao.gov
(202) 512-4800
U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548: