Higher Education: Institutions' Reported Data Collection Burden Is
Higher Than Estimated but Can Be Reduced through Increased Coordination
GAO ID: GAO-10-871, August 13, 2010
This is the accessible text file for GAO report number GAO-10-871
entitled 'Higher Education: Institutions' Reported Data Collection
Burden Is Higher Than Estimated but Can Be Reduced through Increased
Coordination' which was released on August 13, 2010.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Committees:
United States Government Accountability Office:
GAO:
August 2010:
Higher Education:
Institutions' Reported Data Collection Burden Is Higher Than Estimated
but Can Be Reduced through Increased Coordination:
GAO-10-871:
GAO Highlights:
Highlights of GAO-10-871, a report to congressional committees.
Why GAO Did This Study:
The Integrated Postsecondary Education Data System (IPEDS) is the
federal government's core postsecondary data collection program.
Approximately 6,800 postsecondary schools are required to complete
annual IPEDS surveys on topics including enrollment, graduation rates,
and finances. As policymakers have sought additional data to increase
accountability in postsecondary education, the number and complexity
of questions on the IPEDS surveys have increased. GAO was mandated to
examine: (1) the time and cost burden for schools completing the IPEDS
surveys, (2) options for reducing this burden, and (3) the potential
benefits and challenges of collecting additional graduation rate data.
To do this, GAO interviewed staff from 22 postsecondary schools,
reviewed existing estimates of the IPEDS time and cost burden,
interviewed officials at the Department of Education (Education) and
Office of Management and Budget, and interviewed higher education
associations and higher education software providers.
What GAO Found:
The IPEDS burden reported by schools to GAO varies widely but was
greater than Education's estimates for 18 of the 22 schools
interviewed. Over half of these institutions reported time burdens
that were more than twice Education's estimates. Schools reported time
burdens ranging from 12 to 590 hours, compared with the 19 to 41 hours
Education estimated for this group of institutions (see figure). Staff
experience and school characteristics such as organizational structure
appear to affect the burden. Education's official burden estimates may
be lower than those reported to GAO because officials rely on
potentially outdated baseline estimates and consult with few survey
respondents (known as keyholders) about the impact of survey changes.
Figure: Time Burdens Reported by 22 Institutions Compared with
Education's Official Estimates by Institution Type:
[Refer to PDF for image: horizontal bar graph]
Type of institution: Less than 2-year;
Reported burden hours:
Education estimate (upper bound): 18.7;
Reported to GAO by individual institutions: 21; 21; 23.1; 140; 353.5.
Type of institution: 2-year;
Reported burden hours:
Education estimate (upper bound): 40.9;
Reported to GAO by individual institutions: 11.6; 23; 23; 47; 72; 72;
78; 81.5.
Type of institution: 4-year;
Reported burden hours:
Education estimate (upper bound): 39.4;
Reported to GAO by individual institutions: 36; 50.6; 95.4; 120;
125.3; 129; 195.5; 298; 368; 456.5; 590.
Source: GAO analysis of Education documents and interviews.
[End of figure]
Training, software, and administrative supports can reduce the IPEDS
reporting burden and would be enhanced by increased coordination among
institutions, Education, and software providers. Education is
developing training modules targeting new keyholders, but some
keyholders at career and technical schools are unaware of available
training, which may be due to challenges Education faces in reaching
these types of schools. Campus data systems may also reduce the burden
through automated reporting features; however, few schools GAO
interviewed use these features due to concerns that they do not always
work correctly. One factor contributing to this is the lack of direct
and timely coordination between software providers and Education to
incorporate changes to the IPEDS surveys.
Collecting additional graduation rate data disaggregated by race,
ethnicity, and income could be useful but would increase the IPEDS
burden. Graduation rates could be used to study achievement gaps, but
they are a limited measure because they only account for first-time,
full-time students. All 4- and 2-year schools are already required to
report some graduation rates disaggregated by race and ethnicity to
IPEDS, and staff at all types of schools told GAO they could do so at
a modest additional burden. Reporting graduation rates by income is
more challenging because income data are available only for the 71
percent of full-time students who apply for federal student aid.
Keyholders said calculating graduation rates by income for these
students would add a considerable burden by potentially requiring
institutions to merge separate student records and financial aid
databases.
What GAO Recommends:
GAO recommends that Education reevaluate official IPEDS burden
estimates, communicate IPEDS training opportunities to a wider range
of schools, and coordinate with education software providers to help
improve the quality and reliability of IPEDS reporting features.
Education agreed with GAO's recommendations and plans to address these
issues.
View GAO-10-871 or key components. For more information, contact
George A. Scott at (202) 512-7215 or scottg@gao.gov.
[End of section]
Contents:
Letter:
Background:
Schools' Reported IPEDS Burdens Exceed Official Estimates, and
Education Lacks a Robust Process for Estimating the Burden:
Training, Software, and Administrative Supports Can Reduce the IPEDS
Burden and Would Be Enhanced by Increased Coordination:
Additional Graduation Rate Data, Although of Some Use, Is an
Incomplete Measure of Student Outcomes and Would Add to Schools'
Burden:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Comments from the Department of Education:
Appendix III: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: IPEDS Survey Components and Collection Period:
Table 2: Uses of IPEDS data:
Table 3: IPEDS Graduation Rate Data Currently Collected by Race and
Ethnicity:
Table 4: List of Institutions Included in Study:
Figures:
Figure 1: The IPEDS Reporting Process:
Figure 2: Time Burdens Reported by 22 Institutions Compared with
Education's Official Estimates by Institution Type:
Figure 3: Frequency School Officials Reported Feeling Various Degrees
of Burdens from IPEDS:
Figure 4: Frequency with Which Each Survey Was Rated the Most
Burdensome by 22 Institutions:
Figure 5: Hypothetical Graduation Rate Calculation Example for 4-Year
Institution:
Abbreviations:
Education: Department of Education:
FAFSA: Free Application for Federal Student Aid:
IPEDS: Integrated Postsecondary Education Data System:
NCES: National Center for Education Statistics:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
August 13, 2010:
The Honorable Tom Harkin:
Chairman:
The Honorable Michael B. Enzi:
Ranking Member:
Committee on Health, Education, Labor, and Pensions:
United States Senate:
The Honorable George Miller:
Chairman:
The Honorable John P. Kline:
Ranking Member:
Committee on Education and Labor:
House of Representatives:
The Integrated Postsecondary Education Data System (IPEDS) is the
federal government's core postsecondary data collection program. Every
college, university, and career and technical institution that
participates in federal student financial aid programs is required to
complete this group of annual surveys on a variety of topics including
enrollments, graduation rates, staffing, finances, and financial aid.
[Footnote 1] The National Center for Education Statistics (NCES) at
the Department of Education (Education) compiles these survey data
from approximately 6,800 institutions and uses them to research trends
in postsecondary education and inform policy decisions. The data are
made publicly available to allow researchers and federal and state
agencies to analyze higher education issues and help students and
parents make informed choices about postsecondary educational
opportunities.
Over the last several years, Education has increased the number and
complexity of questions on the IPEDS surveys as policymakers have
sought additional data in an effort to increase transparency and
accountability in postsecondary education. For example, additional
questions about institutions' graduation rates were added to the
survey in 2009.[Footnote 2] However, the expansion of the surveys has
raised questions about the burden the surveys impose on participating
institutions. As required under the Paperwork Reduction Act,[Footnote
3] Education estimates the time and cost burden associated with
completing the surveys. For the 2009-2010 reporting cycle, Education
estimated an average IPEDS time burden ranging from 15 to 41 hours,
depending on the type of institution, and total estimated salaries and
computer costs of over $6 million. However, several postsecondary
institutions and associations have noted that these projections
substantially underestimate the actual survey burden. Moreover,
certain types of institutions, such as community colleges and
technical schools, must report more data because their enrollments
have risen as a result of current economic conditions. In this
context, Congress mandated in the Higher Education Opportunity Act
that GAO study the time and cost burdens on institutions of completing
the IPEDS surveys.[Footnote 4] Accordingly, we examined the following
questions:
* What is known about the time and cost burden of completing the IPEDS
surveys for postsecondary institutions?
* What options exist for reducing this burden for these institutions?
* What are the potential benefits and challenges of collecting
additional data on institutions' graduation rates?
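For a sense of scale, Education's official cost figure can be roughly
reproduced from the numbers above. The back-of-envelope sketch below
assumes the constant $30-per-hour staff rate Education applies and
uses the midpoint of the 15-to-41-hour range as a stand-in for
Education's actual weighted average across school types:

```python
# Back-of-envelope sketch of Education's 2009-2010 IPEDS cost estimate.
# The midpoint of the hour range is an assumption; Education weights
# its estimate by school type, so the true average differs somewhat.
schools = 6800                 # approximate number of reporting schools
avg_hours = (15 + 41) / 2      # 28 hours, rough midpoint of the range
hourly_rate = 30               # dollars per staff hour (Education's rate)
salary_cost = schools * avg_hours * hourly_rate
print(f"${salary_cost:,.0f}")  # prints $5,712,000
```

Salary costs of roughly $5.7 million, plus computer costs, are broadly
consistent with the more than $6 million Education estimated.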
To understand the time and cost burden of completing the IPEDS
surveys, we interviewed institution staff from 22 postsecondary
institutions who are responsible for entering data into the IPEDS
surveys and are known as keyholders. This nonprobability sample of 22
institutions represented a mix of 4-year, 2-year, and less than 2-year
institutions, as well as public, not-for-profit, and for-profit
(proprietary) institutions in different geographic areas of the
country. While limiting our sample to 22 schools precluded us from
generalizing our findings to the entire population of about 6,800
postsecondary schools that complete IPEDS, our approach allowed us to
conduct detailed, in-person interviews with keyholders and relevant
staff without substantially burdening the schools. We also reviewed
existing estimates of the IPEDS time and cost burden and interviewed
officials from Education and the Office of Management and Budget about
the methodology and assumptions used to create Education's official
burden estimates. To examine options for reducing the IPEDS reporting
burden, we interviewed Education officials, higher education
associations, higher education software providers, and keyholders. To
assess the potential benefits and challenges of collecting additional
data on graduation rates, we interviewed keyholders as well as
researchers and Education officials.
We conducted this performance audit from August 2009 to August 2010,
in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives. For more
information on our objectives, scope, and methodology, see appendix I.
Background:
Estimating the Burden of Federal Information Collections:
To better manage the federal government's imposition on the public
with information collections, the Paperwork Reduction Act requires
federal agencies like Education to estimate the burden, or the amount
of time, effort, and financial resources that the public expends to
comply with an agency's information collection.[Footnote 5] The time
burden is generally measured as the amount of time it takes
respondents to review instructions, search data sources, complete and
review their responses, and transmit or disclose information.
Agencies must inform respondents of a collection's estimated time
burden, which helps them plan how long the collection will take to
complete.
The Office of Information and Regulatory Affairs, within the Office of
Management and Budget, was created by the Paperwork Reduction Act to
approve information collections subject to the Paperwork Reduction
Act, which are generally all those collecting information from 10 or
more respondents.[Footnote 6] The Office of Information and Regulatory
Affairs weighs the value to society of collecting the data against the
burden imposed by collecting them to approve or deny information
collection requests. Once a collection has been approved, the agency
may carry out the information collection for 3 years or until there
are substantial changes to the collection, at which time the Office of
Information and Regulatory Affairs requests that agencies revise their
estimates.
IPEDS:
IPEDS is a set of surveys completed annually by institutions of higher
education in the United States since 1986. It is the successor to the
Higher Education General Information Survey, which collected
information on postsecondary institutions from 1966 to 1985. About
6,800 institutions completed IPEDS surveys in academic year 2008-2009.
Institutions that fail to report IPEDS data face fines as high as
$27,500 per violation and the loss of their eligibility for federal
student financial aid. IPEDS collects information on institutions and
their students and is composed of nine surveys administered throughout
the year.[Footnote 7] IPEDS collects information on institutional
characteristics, degrees earned, finance, human resources, enrollment,
graduation rates, and financial aid, as illustrated in table 1.
Institutions report data on either the current or the prior year,
depending on the survey.
Table 1: IPEDS Survey Components and Collection Period:
Collection period: Fall; September-October;
Survey: Institutional Characteristics;
Description of survey content: General information on the institution
such as degrees offered, admission requirements, and tuition.
Collection period: Fall; September-October;
Survey: Completions;
Description of survey content: Degrees conferred by field of study.
Collection period: Fall; September-October;
Survey: 12-Month Enrollment;
Description of survey content: Unduplicated count of all enrolled
students for the prior year by gender and race/ethnicity.
Collection period: Winter; December-January;
Survey: Human Resources;
Description of survey content: Institutional staff by full-time or
part-time, assigned position, salary, gender, and race/ethnicity.
Collection period: Spring; December-April;
Survey: Fall Enrollment;
Description of survey content: Fall student enrollment by level of
study, part-time or full-time, gender, and race/ethnicity.
Collection period: Spring; December-April;
Survey: Finance;
Description of survey content: Financial data on assets, liabilities,
revenues, and expenses.
Collection period: Spring; December-April;
Survey: Student Financial Aid;
Description of survey content: Average financial aid amounts and
percentages of students receiving various types of assistance.
Collection period: Spring; December-April;
Survey: Graduation Rates; Graduation Rates 200[A];
Description of survey content: Percentages of first-time, full-time
students who graduate within specific time periods.
Source: Education.
[A] IPEDS collects graduation rates through two separate surveys, the
Graduation Rates and Graduation Rates 200 Surveys, which use different
cohorts of students as the basis for the calculations.
[End of table]
Much of the information IPEDS collects from postsecondary institutions
is required by federal laws. For example, institutions report student
racial and ethnic data to implement the Civil Rights Act of 1964,
[Footnote 8] and data on vocational program completions were added to
meet a requirement of the Carl D. Perkins Vocational Education Act.
[Footnote 9] Due to new statutory requirements and design changes, the
content and format of IPEDS has changed many times throughout its
history.
IPEDS Data Collection Process:
NCES contracts with RTI International, a nonprofit organization that
provides research and technical expertise to governments and
businesses, to administer the IPEDS surveys. RTI International uses an
online survey instrument to collect and validate IPEDS data from
institutions. At the institutional level, one individual is designated
the "keyholder" and has the authority to "lock" the institution's
data for a given survey, signaling to RTI International that it is
complete. The keyholder may be someone in the school's institutional
research office or, especially in small institutions, may be the
institution's general manager. The keyholder will work with other
individuals and offices in his or her institution as necessary to
collect and report the institutional data. In addition to reviewing
the data submitted by the keyholders, an IPEDS state coordinator can
also help coordinate reporting activities for a specified group of
schools within a state. Figure 1 illustrates the general process for
collecting and reporting IPEDS data.
Figure 1: The IPEDS Reporting Process:
[Refer to PDF for image: illustration]
Reporting process:
Prepare:
Get data ready for IPEDS:
Clean student data for accuracy. Study the IPEDS data definition for
each question, and check it against the data definitions used in the
school's database. Gather data from various school departments and
electronically manipulate the numbers into the IPEDS preferred format.
Report:
Transmit data to IPEDS:
Enter the data into the online interface. Clear any automatic edit
checks. Lock the survey.
Train:
Read NCES e-mails and check the IPEDS Web site. Attend higher
education association conferences. Take online IPEDS training.
Reporting calendar for 2009-2010 collection:
Fall collection (Due Oct. 14, 2009):
Institutional Characteristics; Completions; 12-month Enrollment.
Winter collection (Due Jan. 20, 2010):
Human Resources.
Spring collection (Due April 14, 2010):
Fall Enrollment; Student Financial Aid; Finance; Graduation Rates;
Graduation Rates 200.
Source: GAO analysis of Education documents and interviews.
[End of figure]
The IPEDS survey online interface runs several automated checks of the
data before keyholders are able to lock and submit any data. This
review, known as edit checks, compares certain questions against data
from other questions or previous years. These edit checks help improve
the reliability of the data by flagging inconsistencies.
RTI also provides additional services to support the IPEDS reporting
process. It runs a help desk from which many institutions receive
guidance on completing surveys. In addition, RTI maintains a Technical
Review Panel of IPEDS experts that it convenes several times a year
to discuss IPEDS-related issues. This panel consists of individuals
representing the federal government, state government, institutions,
data users, and higher education associations.
IPEDS Graduation Rates:
The IPEDS Graduation Rates Survey collects data from institutions in
accordance with the Student Right-to-Know and Campus Security Act of
1990.[Footnote 10] The measure, as defined in statute, is based on the
number of full-time, first-time, degree/certificate-seeking,
undergraduate students entering an institution in a particular year
who complete their programs within certain time periods.
Part-time and transfer-in students are excluded from the calculation.
Graduation rates are calculated at several points once a cohort of
students enrolls, but the most widely cited rates are based on the
number of students who completed their program within 150 percent of
normal time, or 6 years for a 4-year bachelor's degree.[Footnote 11]
For example, the most recent Graduation Rates Survey required 4-year
institutions to report on the percentage of students that originally
enrolled in fall 2003 that had graduated as of August 31, 2009.
[Footnote 12] These graduation rates are reported by gender or race
and ethnicity depending on the reporting requirements for each type of
institution.
The Graduation Rates Survey also collects transfer-out rates from
institutions whose missions include providing substantial preparation
for students to enroll in another eligible institution without having
completed their program. A school is required to report only those
students it knows have transferred to another institution.
Transfer-out rates are reported separately from a school's graduation
rates.
In 2009, the Graduation Rates 200 Survey was added to the IPEDS spring
collection cycle for all institutions in order to comply with
requirements added by the Higher Education Opportunity Act.[Footnote
13] The Graduation Rates 200 Survey uses full-time, first-time,
degree/certificate-seeking student cohorts like the original
Graduation Rates Survey, but tracks students for the longer period of
200 percent of normal completion time. For example, the most recent
Graduation Rates 200 Survey required 4-year institutions to report on
the number of first-time, full-time students that originally enrolled
in fall 2001 and had graduated as of August 31, 2009.
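The 150 and 200 percent windows described above reduce to a simple
cohort calculation. The sketch below uses an invented four-student
cohort of first-time, full-time students; the August 31 status date
follows the report's examples:

```python
from datetime import date

def grad_rate(cohort, normal_years, pct, entry_year, as_of):
    """Share of a first-time, full-time cohort completing within
    pct percent of normal time (e.g. pct=150 -> 6 years for a
    4-year bachelor's degree; pct=200 -> 8 years)."""
    window_years = normal_years * pct // 100
    deadline = date(entry_year + window_years, 8, 31)
    completed = sum(1 for s in cohort
                    if s["completion_date"] is not None
                    and s["completion_date"] <= min(deadline, as_of))
    return completed / len(cohort)

# Invented cohort entering in fall 2003 (part-time and transfer-in
# students would be excluded before this point).
cohort_2003 = [
    {"completion_date": date(2007, 5, 15)},  # finished in 4 years
    {"completion_date": date(2009, 5, 15)},  # finished in 6 years
    {"completion_date": None},               # never completed
    {"completion_date": date(2010, 5, 15)},  # outside the 150% window
]
rate_150 = grad_rate(cohort_2003, normal_years=4, pct=150,
                     entry_year=2003, as_of=date(2009, 8, 31))
print(f"{rate_150:.0%}")  # 2 of 4 completed within 6 years -> prints 50%
```

The same function with pct=200 and a later status date implements the
Graduation Rates 200 calculation.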
Value of IPEDS data:
IPEDS data are used by government agencies, postsecondary
institutions, businesses, and citizens for a variety of research and
policy purposes. The general consensus among Education officials and
higher education experts we interviewed was that IPEDS provides the
public with essential information on the nation's higher education
system. It is a premier source for higher education data. Some of the
uses of IPEDS data are depicted in table 2.
Table 2: Uses of IPEDS data:
User of IPEDS data: Education;
Examples of use:
* Inform budgetary and policy decisions;
* Determine institutions' eligibility for grants;
* Identify samples for other postsecondary surveys.
User of IPEDS data: Parents and students;
Examples of use:
* Compare tuition, academic programs, and financial aid when selecting
a school to attend.
User of IPEDS data: Researchers;
Examples of use:
* Track trends in enrollment, completions, and costs.
User of IPEDS data: Postsecondary institutions;
Examples of use:
* Inform internal decision making;
* Compare salaries and tuition at peer institutions.
User of IPEDS data: Private-sector businesses;
Examples of use:
* Identify locations of skilled graduates.
User of IPEDS data: Other federal agencies;
Examples of use:
* Plan recruitment activities;
* Project future labor supply and demand.
User of IPEDS data: State government;
Examples of use:
* Inform budgetary and legislative decisions.
Source: Education.
[End of table]
Schools' Reported IPEDS Burdens Exceed Official Estimates, and
Education Lacks a Robust Process for Estimating the Burden:
Institutions' Reported Burdens Substantially Exceed Education's
Estimates:
The IPEDS burden reported by many schools in our sample exceeds
Education's official estimates, often to a substantial degree. The
time burdens schools reported were greater than Education's official
estimates for 18 of the 22 schools in our sample. Twelve schools
reported burdens more than twice Education's estimates. As illustrated
in figure 2, schools reported time burdens ranging from 12 to 590
hours, compared with the 19 to 41 hours Education estimated for these
22 institutions.
Figure 2: Time Burdens Reported by 22 Institutions Compared with
Education's Official Estimates by Institution Type:
[Refer to PDF for image: horizontal bar graph]
Type of institution: Less than 2-year;
Reported burden hours:
Education estimate (upper bound): 18.7;
Reported to GAO by individual institutions: 21; 21; 23.1; 140; 353.5.
Type of institution: 2-year;
Reported burden hours:
Education estimate (upper bound): 40.9;
Reported to GAO by individual institutions: 11.6; 23; 23; 47; 72; 72;
78; 81.5.
Type of institution: 4-year;
Reported burden hours:
Education estimate (upper bound): 39.4;
Reported to GAO by individual institutions: 36; 50.6; 95.4; 120;
125.3; 129; 195.5; 298; 368; 456.5; 590.
Source: GAO analysis of Education documents and interviews.
[End of figure]
The higher-than-estimated burdens reported by schools in our sample
are corroborated by the findings of a recent internal NCES study that
examined the burden at nine institutions. The NCES
study found the burden reported by all nine institutions to be much
higher than Education estimated. Eight of these institutions reported
burdens more than twice Education's estimates. In addition, 40 higher
education associations representing a wide range of institutions
signed a letter to the Office of Management and Budget in March 2007
commenting that Education's official IPEDS time burdens were serious
underestimates.
In addition to being time-consuming, keyholders generally perceive
IPEDS reporting to be a relatively demanding task. The majority of
keyholders we interviewed told us IPEDS is either moderately or very
burdensome and is more burdensome than their other external reports.
[Footnote 14] However, the amount of time keyholders reportedly spent
completing IPEDS did not always correspond with their subjective
attitudes on the level of burden. Figure 3 illustrates keyholders'
attitudes toward IPEDS. For example, the keyholder at a large
institution that reportedly spent over 350 hours completing IPEDS said
the surveys were only slightly burdensome, while the keyholder at a
small institution that reportedly spent less than 25 hours said the
surveys were extremely burdensome. These discrepancies may arise
because burden is a complex and subjective concept: keyholders'
evaluations might weigh the perceived value of IPEDS data and
institutional reporting as well as the level of effort and difficulty
that reporting requires.
Figure 3: Frequency School Officials Reported Feeling Various Degrees
of Burdens from IPEDS:
[Refer to PDF for image: vertical bar graph]
Degree of burden reported: Not burdensome at all;
Number of schools: 0.
Degree of burden reported: Slightly burdensome;
Number of schools: 5.
Degree of burden reported: Moderately burdensome;
Number of schools: 7.
Degree of burden reported: Very burdensome;
Number of schools: 7.
Degree of burden reported: Extremely burdensome;
Number of schools: 3.
Source: GAO analysis of interview results.
[End of figure]
In part because there is some overlap among the reporting that schools
do, it was challenging for school officials to estimate the time they
spent exclusively on IPEDS. For example, one of our selected
schools produced a large package of data that the statewide central
office used to fulfill the school's multiple reporting requirements,
including state reporting and IPEDS. Since the institution was
submitting data for multiple purposes, it was hard to identify the
time spent compiling data for IPEDS rather than other reporting
requirements. However, some school officials commented that some of
the data they compile to report to IPEDS is useful to have when
fulfilling other reporting requirements and, as a result, may reduce
their burden for other reporting requirements. Individuals mentioned
state reporting, reporting for accreditation, and reporting to college
ranking publications as the main other reporting requirements they had
to fulfill. As noted above, most keyholders we interviewed found IPEDS
more burdensome than any other external reporting. For example, one
keyholder said that while accreditation and state reports are only due
once a year, IPEDS surveys are due three times a year.
Since most schools in our sample reported time burdens higher than
Education estimated, the cost burden of IPEDS reporting may also be
more than Education estimated for those schools. The cost burden of
IPEDS reporting is determined almost entirely by staff time. Education
calculates the cost of IPEDS reporting at a constant rate of $30 per
hour, which is based on an average clerical salary and associated
computer costs for running programs to extract data. Only one of the
schools we interviewed had additional IPEDS-related expenses, which
was the cost of a contractor who completed its IPEDS Finance Survey.
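Because staff time is nearly the entire cost, Education's cost figure
reduces to hours multiplied by a flat rate. A minimal sketch of that
arithmetic (the $30 rate is from the report; the function name is
illustrative):

```python
# Education values IPEDS reporting time at a flat $30 per hour, based
# on an average clerical salary plus associated computer costs.
HOURLY_RATE = 30  # dollars per hour, per the report

def ipeds_cost(burden_hours):
    """Estimated cost burden: staff time is essentially the whole cost."""
    return burden_hours * HOURLY_RATE

# Education's upper-bound estimate for 4-year schools (39.4 hours)
# versus the highest time burden reported to GAO (590 hours):
print(round(ipeds_cost(39.4)))  # 1182
print(ipeds_cost(590))          # 17700
```

Under this model, any gap between estimated and reported hours
translates directly into a proportional gap in estimated cost.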
Staff Experience and School Characteristics Greatly Influence the
IPEDS Burden Reported:
Staff experience and school characteristics are strong determinants of
the IPEDS burden. The majority of schools indicated that keyholder
experience, respondents' technical skills, organizational structure,
and institutional size were either moderately or extremely important
in determining the time burden of IPEDS reporting.
* Keyholder experience--The burden of completing the IPEDS surveys
generally declines as the keyholder becomes more familiar with the
reporting process, according to keyholders we interviewed. The first
year is generally the hardest because keyholders have to learn the
IPEDS data definitions and find the corresponding information in their
internal databases. For example, one keyholder said that the first
time he reported IPEDS data, it took him twice as long as it does now.
The school reporting the highest burden in our sample also had a
new keyholder. This school had recently undergone significant staff
turnover, so there was no institutional knowledge for the keyholder to
draw on while sifting through the school's data systems searching for
the appropriate data to report to IPEDS.
* Technical skills--The efficiency with which staff can operate
software and work with data affects a school's IPEDS reporting burden.
Cleaning, manipulating, and double checking a school's data to produce
the information IPEDS requires can be a time-consuming process, and
every school reported spending time on such work. These tasks are
often easier if keyholders have the technical skills to design
computer programs for sorting and calculating the data. For example,
the keyholder at a large community college was able to quickly process
large data files for the IPEDS Enrollment Survey because he had
advanced statistical programming skills.
* Organizational structure--It can be more burdensome to complete
IPEDS surveys when keyholders have to collaborate with other offices
at an institution to get the necessary information. In most
institutions in our sample, the keyholder had to collaborate with
other individuals in the school to report Human Resources, Student
Financial Aid, and Finance data to IPEDS. As illustrated in figure 4,
schools frequently reported these surveys to be the most burdensome.
Such collaboration sometimes entailed meetings between the keyholder
and these other stakeholders because the keyholder may not have access
to the data (e.g., payroll information for the Human Resource Survey)
or does not have subject matter expertise (e.g., accounting knowledge
for the Finance Survey). While the content of these surveys may be
more complex than that of others, meetings also expand the burden by
requiring the time of multiple individuals simultaneously. This
necessary collaboration makes it important for keyholders to establish
effective working relationships with other institutional offices.
Figure 4: Frequency with Which Each Survey Was Rated the Most
Burdensome by 22 Institutions:
[Refer to PDF for image: horizontal bar graph]
Survey: Student Financial Aid;
Frequency schools reported surveys as the most burdensome: 7.
Survey: Human Resources;
Frequency schools reported surveys as the most burdensome: 6.
Survey: Finance;
Frequency schools reported surveys as the most burdensome: 4.
Survey: Graduation Rates;
Frequency schools reported surveys as the most burdensome: 3.
Survey: Fall Enrollment;
Frequency schools reported surveys as the most burdensome: 3.
Survey: 12-Month Enrollment;
Frequency schools reported surveys as the most burdensome: 2.
Survey: Completions;
Frequency schools reported surveys as the most burdensome: 2.
Survey: Institutional Characteristics;
Frequency schools reported surveys as the most burdensome: 0.
Source: GAO analysis of interview results.
Note: Some schools gave more than one survey the same burden rating,
in some cases resulting in more than one "most burdensome" survey per
school.
[End of figure]
* Institution size--The size of an institution can have both positive
and negative effects on the reporting burden. The 22 schools in our
sample had enrollments ranging from less than 60 to more than 40,000
students. IPEDS reporting can sometimes be more time-consuming for
large institutions since there are more students and staff to report
on. However, larger institutions in our sample did not always have
higher burdens than their smaller counterparts, potentially because
large schools generally have more specialized staff than small
schools. The large schools we visited had institutional research
offices with full-time staff dedicated to regularly collecting,
analyzing, and reporting information on the institution for management
and planning purposes. At the smaller institutions, the keyholder was
generally a high-level school administrator for whom institutional
reporting was a minor aspect of his or her responsibilities. Among the
keyholders in our sample were two Directors and two Presidents.
Keyholders we interviewed at smaller
schools might also handle the school's finances and payroll, as well
as teach classes when teachers are absent. Compared with full-time
institutional research professionals at larger schools, these staff
may have less sophisticated IT skills or expertise in working with
institutional data, so IPEDS reporting may be more time-consuming even
though these schools have small numbers of students and staff to
report on.
Education Does Not Have a Robust Process for Estimating Time and Cost
Burden:
Education's official burden estimates may be lower than those reported
to us because officials are still using the potentially unreliable
original baseline burden estimates for current burden calculations.
Education officials we spoke to attempted but were unable to ascertain
whether any systematic methodology was used to develop the original
baseline burden estimates. Officials said the original baseline was
developed in the late 1980s or early 1990s, and that some members of
the IPEDS Technical Review Panel were consulted at that time. They did
not know of any other steps taken to determine whether the burden
estimates were ever accurate. As a requirement for Office of
Management and Budget approval of the IPEDS information collection
request, Education updates its estimates of the burden imposed by each
survey form every 3 years, or whenever there are substantial changes,
by taking into account changes to the survey or its administration.
For example, when it became possible to complete and
submit IPEDS surveys through the Web, Education lowered the burden
estimates. Education also publishes a notice in the Federal Register
to solicit public comments on new burden estimates. Office of
Management and Budget officials told us they do not independently
verify the accuracy of Education's burden estimates.
Education officials said the impact of survey changes on the burden is
estimated through ratio adjustments made relative to the baseline
estimates. For example, if the baseline estimate is 5 hours for a
survey form, and 20 percent of the questions on that survey are
removed, Education might estimate the new burden of that survey to be
4 hours. Before finalizing and submitting revised estimates to the
Office of Management and Budget for changes to required race and
ethnicity reporting, officials said they spoke with two schools in
addition to consulting with the IPEDS Technical Review Panel for an
indication of the impact the changes would have on the reporting
burden. If the wide variation of reported burdens in our sample is
indicative of the general population of institutions, it would be
difficult for Education to get a reliable assessment of the burden by
consulting with as few as two institutions.
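The ratio adjustment officials described amounts to scaling the
baseline estimate by the share of the survey that remains. A sketch of
that arithmetic, using the 5-hour example from the report (the
function name is illustrative):

```python
def adjusted_burden(baseline_hours, fraction_removed):
    """Scale a baseline burden estimate in proportion to the survey
    content removed, mirroring the ratio adjustment Education uses."""
    return baseline_hours * (1.0 - fraction_removed)

# A 5-hour survey form losing 20 percent of its questions is
# re-estimated at 4 hours:
print(adjusted_burden(5.0, 0.20))  # 4.0
```

Because every adjustment is made relative to the original baseline,
any error in that baseline propagates unchanged into every subsequent
estimate.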
Accurately estimating the IPEDS reporting burden is challenging, but
other federal agencies use methodologies that can serve as examples
for NCES. Currently, burden estimates are associated with the survey
forms an institution completes; however, the characteristics of
institutions in our sample influenced their reported burdens as much
or more than the forms they completed. As we have previously reported,
burden-hour estimates are not a simple matter.[Footnote 15] It is
challenging to estimate the amount of time it will take for a
respondent to collect and provide information, particularly when there
is a high degree of variability like we found in our sample of
institutions. In addition, like all estimates, burden estimates are
not precise. Despite these challenges, at least one other federal
agency has developed a more systematic methodology for estimating the
reporting burden. We have previously reported on the statistical model
the Internal Revenue Service uses to improve the accuracy and
transparency of taxpayer burden estimates.[Footnote 16] According to
the Office of Management and Budget, rather than estimating burden on
a form-by-form basis, the Internal Revenue Service's methodology takes
into account broader and more comprehensive taxpayer characteristics
and activities, considering how the taxpayer prepares the return
(e.g., with or without software or a paid preparer), as well as the
taxpayer's activities, such as gathering tax materials, completing
forms, recordkeeping, and tax planning. NCES officials told us they
are planning to examine the information collections of other federal
agencies to learn about the methodologies they use for establishing
reporting burden estimates. Any methodology NCES uses to estimate the
IPEDS burden will still have limitations, but there appears to be
substantial room for improvement over the current estimates. Without
reliable burden estimates, policymakers will not be able to
effectively weigh the benefits of IPEDS against the costs it imposes
on institutions.
Training, Software, and Administrative Supports Can Reduce the IPEDS
Burden and Would Be Enhanced by Increased Coordination:
Expanding Training Could Reduce the Burden, but Some Keyholders Are
Not Aware of Current Training Opportunities:
According to NCES officials and institutional keyholders we
interviewed, expanding training could reduce the IPEDS reporting
burden at certain schools, but some keyholders are not aware of
current training opportunities. The Paperwork Reduction Act requires
agencies to reduce, to the extent practicable and appropriate, the
burden to respondents.[Footnote 17] Training is one way to achieve
this goal, according to institutional research experts we interviewed.
NCES currently offers in-person and online training on topics such as
leading or managing an IPEDS cycle and step-by-step guidance for
completing each IPEDS survey.[Footnote 18] NCES plans to expand its
current training options and is developing a training module targeting
new keyholders. New or inexperienced keyholders may face increased
reporting burdens because they are less familiar with the IPEDS
reporting process, according to keyholders, Education officials, and
higher education associations we interviewed. To address this, NCES's
proposed new keyholder training module and resources will include the
following:
* Communications directly targeted to new keyholders through a welcome
e-mail and phased e-mails outlining opportunities for training.
* Welcome packets specifically for new keyholders, which would include
training schedules and calendars to help keyholders keep track of key
dates.
* A new keyholder manual containing information on the importance of
data quality, keyholder responsibilities, and tips from veteran
keyholders.
* A new in-person workshop for new keyholders, supplemented by online
tutorials.
* Assistance from state IPEDS coordinators in targeting communications
to new keyholders.
Current training opportunities are not being effectively communicated
to all institutions, according to NCES officials and keyholders we
interviewed. Keyholders at five schools in our sample were unaware of
currently available training resources; this lack of awareness may
stem from challenges NCES faces in reaching career and technical
schools. Three of the five keyholders who were unaware of training
options represented career and technical schools. According to NCES
officials,
reaching these types of schools is particularly challenging because
they do not generally participate in the channels NCES uses to
communicate with keyholders. NCES communicates with keyholders
primarily through e-mails and through their connections with national
higher education associations and networks. For example, NCES e-mails
all keyholders periodic newsletters titled, "This Week in IPEDS," that
include details about training opportunities. Even though all
keyholders presumably receive these e-mails, their length may cause
some keyholders to ignore them, according to members of the IPEDS
Technical Review Panel. NCES also offers an optional e-mail listserv
that keyholders and others can join to discuss IPEDS-related questions
and topics, but very few career and technical schools have joined it.
NCES works with one higher
education association that represents career and technical,
proprietary schools, but many of these schools do not participate in
any national associations. Without receiving effective communications
about training resources that can increase their skills and knowledge,
keyholders at these schools may face larger time burdens completing
the surveys and risk missing reporting deadlines or reporting
inaccurate data.
Campus Data Systems Include Automated IPEDS Reporting Tools That Could
Reduce the Burden, but Keyholders Are Concerned About Their
Reliability:
Campus data systems could reduce the IPEDS reporting burden, but some
keyholders we interviewed are concerned about the reliability of the
systems' automated IPEDS reporting features. Some schools develop
their own internal data systems, while other schools purchase campus
data systems primarily to manage a wide range of campus business
functions, such as student records, financial aid, human resources,
and finance. To assist keyholders with IPEDS reporting, many campus
data systems can extract data from schoolwide databases and create
reports that schools can use to complete IPEDS surveys. Some features
produce electronic data files that can be uploaded directly into
IPEDS, sparing keyholders the time of entering data manually.
However, some keyholders do not use the IPEDS reporting functions
available in their campus data systems to complete IPEDS surveys due
to concerns about their reliability. Keyholders at 12 schools we
interviewed used software programs that included IPEDS reporting
functions. Among these 12 schools, 9 keyholders did not use these
functions for IPEDS reporting. Keyholders cited concerns with the data
these functions produce as one reason for not using them. For example,
keyholders at four schools felt more comfortable with their own
calculations because they doubted the accuracy of the automated
output, and two keyholders stated outright that it was unreliable. An
NCES-funded study of campuswide reporting software also found that
most keyholders surveyed do not use these reporting functions to
gather data needed for IPEDS.[Footnote 19] The keyholders surveyed in
this study did not use these functions because they were unsure of the
results produced and because the functions did not align with recent
changes to IPEDS.
One contributing factor to the limitations of these automated
reporting features is the lack of direct and timely coordination
between campus data system software providers and Education to
incorporate upcoming changes to the IPEDS surveys. Although Education
is not responsible for developing these IPEDS reporting functions,
NCES is mandated to assist institutions in improving and automating
statistical and data collection activities.[Footnote 20] Many schools
use campus data systems to manage other campus functions, but
keyholders are reluctant to use the systems' IPEDS reporting features
because of concerns about how well those features perform. Improving
the reliability of these reporting functions could encourage
keyholders to use them and, in turn, reduce the IPEDS reporting
burden. Without direct and
frequent coordination with Education, software providers risk
misinterpreting reporting requirements and do not have time to fully
test automated IPEDS features before their release to schools. All
four major higher education software providers we interviewed
indicated they have limited or no direct coordination with Education
to learn about upcoming changes to IPEDS. These companies instead rely
on alternative means such as communications from their client schools,
attending conferences, or checking the IPEDS Web site. According to
these companies, these means are less effective than direct contact
with NCES. Software providers may not fully understand certain IPEDS
reporting requirements, according to one expert, which may further
shorten the time available to fully test their updates. Two
software providers we interviewed indicated that it was challenging to
deliver timely updates to IPEDS features because they did not receive
information about upcoming changes in IPEDS early enough. According to
one software provider, the company was not able to fully test the
updated automated IPEDS reporting functions, and it was unclear if the
functions were going to work properly upon their release to clients.
If IPEDS reporting functions are not always fully tested, they may not
align with reporting requirements. This deters keyholders from using
tools that could potentially reduce their burden or may negatively
affect the reported data. The software providers we spoke with cited
examples of coordination with Education that could be expanded or
replicated with regard to IPEDS. For example, Education holds an
annual conference on student financial aid that some software
providers attend to stay up-to-date on changing eligibility rules.
This conference includes sessions on reporting student financial aid
data to IPEDS but does not address other IPEDS surveys. Education also
works with the Postsecondary Electronic Standards Council, an
association which includes software providers, colleges and
universities, and state and federal government agencies.
Respondents Are Generally Pleased with Components of Survey
Administration That Help Reduce the Reporting Burden:
Keyholders we interviewed are generally pleased with current
components of the IPEDS surveys' administration that help reduce the
reporting burden. They cited several components of the surveys'
administration that have been particularly effective at reducing the
burden:
* IPEDS Help Desk--Nearly all keyholders we interviewed reported high
levels of satisfaction with the IPEDS Help Desk in resolving
difficulties they had with completing the surveys. The IPEDS Help Desk
is a call center that NCES operates to assist keyholders with
completing the IPEDS surveys.[Footnote 21] Keyholders have contacted
the Help Desk for assistance on a range of issues, including
recovering a lost password, clarifying data definitions, and clearing
problems found in the data before the surveys are locked.
* Survey instructions--Both new and experienced keyholders in our
sample reported that the instructions were sufficient and helpful in
completing the IPEDS surveys. For example, one new keyholder referred
to the instructions to learn how to report data, while another
experienced keyholder reviewed them periodically to learn about
reporting changes.
* Collection schedule--Keyholders in our sample are generally
satisfied with the three-phase data collection schedule of IPEDS
surveys. The IPEDS surveys are collected during the fall, winter, and
spring reporting periods, distributing the survey burden throughout
the academic year. Some keyholders, however, indicated that they would
like survey deadlines extended or collection periods opened earlier to
give them additional time.
Additionally, Education has modified IPEDS survey forms to lower the
reporting burden on nondegree-granting schools. For example, the
survey forms for nondegree-granting institutions do not include
standard questions about student charges for room and board since
these schools do not typically offer these services. Several data
elements in the Finance Survey for both proprietary and not-for-profit
nondegree-granting schools have also been eliminated to reduce the
reporting burden for these schools. Education also recently hosted an
IPEDS Technical Review Panel to discuss new tools and resources it is
developing for reducing the IPEDS burden. Education presented several
new initiatives to the panel that are intended to reduce institutions'
reporting burden. These included training for new keyholders, which we
previously discussed, and an aggregation tool that could help schools
convert their student data into a file that can be uploaded to the
IPEDS data collection system.
Additional Graduation Rate Data, Although of Some Use, Is an
Incomplete Measure of Student Outcomes and Would Add to Schools'
Burden:
IPEDS Graduation Rates Only Account for a Subset of Students, but
Additional Data Could Be Useful to Researchers and Students:
IPEDS graduation rates are a limited measure because they only track
outcomes for a subset of students. IPEDS graduation rates only measure
the outcomes for first-time, full-time, degree/certificate seeking
students, who make up 49 percent of entering students nationwide
according to IPEDS data. Students who attend part-time or transfer in
are not counted toward a school's graduation rate. All nongraduates
are treated as dropouts, even if they go on to graduate from another
institution. Figure 5 illustrates how certain types of students are
counted by the measure:
Figure 5: Hypothetical Graduation Rate Calculation Example for 4-Year
Institution:
[Refer to PDF for image: illustration]
Initial cohort:
Only a subgroup of the students who enroll are included in the initial
cohort used to calculate IPEDS graduation rates.
First-time, full-time students: represented in the illustration as six
students.
Transfer-in students: represented in the illustration as two students.
Part-time students: represented in the illustration as two students.
6 years later: 150 percent of normal completion time:
Completers:
Even if all 10 students graduate, only 2 students in the initial
cohort are counted toward the college's IPEDS graduation rate.
First-time, full-time students:
IPEDS Grad. Rate: 2 of 6 students (33%).
Transferred out and graduated from another college: 2 students.
Took time off but graduated after 9 years.
Transfer-in students: graduated.
Part-time students: graduated.
Source: GAO analysis of Education documents.
[End of figure]
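The cohort logic behind figure 5 can be sketched as a filter-then-
count calculation. The records below mirror the hypothetical entering
class in the figure (field names are illustrative):

```python
# Hypothetical entering class from figure 5: 10 enrollees, of whom only
# the 6 first-time, full-time students form the IPEDS cohort.
students = (
    [{"first_time_full_time": True,  "completed_within_150pct": True}] * 2   # counted completers
    + [{"first_time_full_time": True,  "completed_within_150pct": False}] * 4  # counted as noncompleters
    + [{"first_time_full_time": False, "completed_within_150pct": True}] * 4   # transfer-in/part-time: excluded
)

cohort = [s for s in students if s["first_time_full_time"]]
completers = [s for s in cohort if s["completed_within_150pct"]]
grad_rate = len(completers) / len(cohort)

# 2 of 6 cohort students completed within 150 percent of normal time,
# so the IPEDS rate is 33 percent even if all 10 enrollees eventually
# graduate somewhere.
print(f"{grad_rate:.0%}")  # 33%
```

The exclusions happen entirely in the cohort filter: part-time and
transfer-in students never enter the denominator, and nongraduates in
the cohort count against the rate regardless of later outcomes.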
Since many students are excluded from the IPEDS graduation rate
calculation, it is an incomplete measure of student outcomes.
According to Education, the consensus is that IPEDS graduation rates
in their present form are an inadequate measure for school
accountability. The IPEDS graduation rate measure is less effective at
institutions that serve large proportions of nontraditional students,
like community colleges. Many community college students attend part-
time or enroll in multiple institutions. As a result, only about 32
percent of entering students at 2-year public institutions are
included in the first-time, full-time cohorts used to calculate
graduation rates, according to IPEDS data. The IPEDS Technical Review
Panel has
considered using a separate measure for part-time students, but such
data would still exclude transfer-in students.
These limitations in IPEDS graduation rates, which are widely
acknowledged by Education, schools, and researchers, are primarily due
to the structure of the IPEDS collection process. IPEDS data are
collected at the institution level, and there is generally no way at
present to track outcomes for students who transfer from one
institution to another. Some states have developed their own
postsecondary data systems capable of tracking students who move among
schools--at least within the state. While the Higher Education
Opportunity Act explicitly prohibited Education from developing,
implementing, or maintaining a federal database that tracks individual
students (including a student unit record system), it also provided
explicitly that a state or a consortium of states could do so.
[Footnote 22]
Despite the limitations of IPEDS graduation rates, disaggregating
graduation rate data by race, ethnicity, and income could still be
somewhat beneficial for examining achievement gaps among schools and
assisting prospective students in the college selection process. IPEDS
is the primary federal source for comparable institution-level data on
graduation rates. Other sources of graduation rate information, such
as the Beginning Postsecondary Students Longitudinal Survey, can be
used to track nationwide trends in graduation rates, but the sample
size is too small for examining individual institutions.[Footnote 23]
Postsecondary education researchers told us IPEDS data on graduation
rates capture the wide range of variability in graduation rates among
institutions that is missed by other surveys. Additional graduation
rate data would still be limited to first-time, full-time, degree/
certificate seeking students, but disaggregating IPEDS graduation rate
data by race, ethnicity, and income would provide researchers with a
starting point for identifying schools that are doing comparatively
effective or ineffective jobs at graduating certain types of students.
This information could be used to increase transparency or to solicit
best practices from institutions with higher graduation rates. The
information could also assist students and parents in selecting
schools that have done a more effective job of graduating certain
types of students. For example, students can currently use Education's
College Navigator Web site to search for existing graduation rate data
on prospective schools.[Footnote 24] More detailed graduation rate
data would provide these students with further information before
making their decisions.
Schools Could Use Existing Data to Calculate Graduation Rates by Race
and Ethnicity at a Modest Burden:
Schools already collect data on student race and ethnicity that they
could use to report more detailed graduation rate data at a modest
burden. All schools that complete IPEDS are required to collect
student race and ethnicity data and report it in the Fall Enrollment,
12-Month Enrollment, and Completions Surveys. In addition, 4- and 2-
year schools are already required to report some race and ethnicity
data on the Graduation Rates Survey. Table 3 describes the graduation
rate data schools were required to submit during the 2009-2010 IPEDS
collection.
Table 3: IPEDS Graduation Rate Data Currently Collected by Race and
Ethnicity:
Level of institution: 4-year;
Graduation rate data reported by race and ethnicity:
* 150% normal time to completion;
* Completed bachelor's degree or equivalent in:
- 4 years or less;
- 5 years;
Graduation rate data not reported by race and ethnicity:
* 200% normal time to completion.
Level of institution: 2-year;
Graduation rate data reported by race and ethnicity:
* 150% normal time to completion;
Graduation rate data not reported by race and ethnicity:
* 100% normal time to completion;
* 200% normal time to completion.
Level of institution: Less than 2-year;
Graduation rate data reported by race and ethnicity: [Empty];
Graduation rate data not reported by race and ethnicity:
* 100% normal time to completion;
* 150% normal time to completion;
* 200% normal time to completion.
Source: GAO analysis of IPEDS surveys.
[End of table]
Although less than 2-year institutions do not currently report any
IPEDS graduation rate data by race and ethnicity, they were required
to report these data prior to the 2004-2005 IPEDS collection.
Education officials told us they shortened the Graduation Rates Survey
for less than 2-year institutions to help lower their reporting
burden. In addition, many less than 2-year schools have small numbers
of students, so disaggregating graduation rates into multiple
categories can produce small subgroups that are statistically
unreliable and risk revealing personally identifiable information,
according to Education officials.[Footnote 25] For example, if only
one female Asian/Pacific Islander is enrolled in a school, reporting a
separate graduation rate for this subgroup would not yield any
statistically useful information.
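One common way to handle the small-subgroup problem Education
describes is a suppression rule that withholds any disaggregated rate
whose cohort falls below a minimum cell size. A sketch of that
approach (the threshold of 5 is an assumed value for illustration, not
an IPEDS rule):

```python
MIN_CELL_SIZE = 5  # assumed disclosure threshold; not an actual IPEDS rule

def safe_rate(completers, cohort_size):
    """Return a graduation rate, or None when the subgroup is too small
    to be statistically meaningful or to protect student privacy."""
    if cohort_size < MIN_CELL_SIZE:
        return None  # suppress, e.g., a subgroup with a single student
    return completers / cohort_size

print(safe_rate(1, 1))     # None
print(safe_rate(40, 100))  # 0.4
```

A rule of this kind would let schools report disaggregated rates where
they are meaningful while withholding cells like the single-student
example above.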
Keyholders we spoke with said reporting all graduation rate data by
race and ethnicity would increase their reporting burden by a modest
amount. The majority of keyholders we interviewed said reporting race
and ethnicity for every graduation rate they report would be either
slightly or moderately burdensome.[Footnote 26] For example, a
keyholder from a less than 2-year institution told us that graduation
rates could be calculated using race and ethnicity data the school
already collects, but it would be more time-consuming. The additional
burden would arise because schools would have to make additional
calculations and enter data into more survey cells.
Calculating Graduation Rates by Income Would Be Limited to Students
Who Applied for Federal Student Aid and Would Be Very Burdensome for
Schools to Report:
Collecting graduation rates by income for all students would be
difficult because income data are only available on students that
apply for federal financial aid. In general, schools only collect
income data from students that complete the Free Application for
Federal Student Aid (FAFSA), which includes questions about students'
and parents' income.[Footnote 27] According to Education data, 71
percent of full-time undergraduate students apply for federal
financial aid nationwide, but the percentage varies substantially by
type of institution. Obtaining income information on the remaining
students would be difficult because students may be unwilling to
voluntarily disclose this information, and the data could be
unreliable, according to researchers and keyholders. Unlike FAFSA
income data, which are based on IRS forms and subject to verification,
alternative methods of collecting income data depend on self-reported
information that is prone to errors. In light of these challenges,
schools currently could only reliably report graduation rates by
income for the subgroup of students that complete a FAFSA.[Footnote
28] These data would provide information on students that receive
federal assistance, but they may not be representative of all
students. In addition, using FAFSA income data for unintended purposes
may raise privacy concerns.
The majority of keyholders we interviewed said reporting graduation
rates by income would be either very or extremely burdensome. The
results were consistent across all levels of institutions. Calculating
these graduation rates may require institutions to merge financial aid
databases containing income data with student record databases
containing enrollment and completion data. These databases can be
maintained in different offices at an institution and, as previously
discussed, coordination with other offices is an important factor in
determining the burden of IPEDS reporting.
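The merge step described above can be sketched as a join on a shared student identifier. IDs, field names, and values here are assumptions for illustration only:

```python
# Illustrative sketch of the merge step: joining a financial aid extract
# (income for FAFSA filers only) to enrollment/completion records on a shared
# student ID. IDs, field names, and values are assumptions for illustration.
aid_records = {"S1": {"income": 18000}, "S2": {"income": 95000}}
student_records = [
    {"id": "S1", "completed": True},
    {"id": "S2", "completed": False},
    {"id": "S3", "completed": True},  # no FAFSA on file, so no income data
]

merged = []
for record in student_records:
    aid = aid_records.get(record["id"])
    merged.append({
        **record,
        # Students without a FAFSA would fall into a separate "income
        # unknown" category rather than an income bracket.
        "income": aid["income"] if aid else None,
    })

print([r["income"] for r in merged])  # [18000, 95000, None]
```

Even this simplified join assumes the two databases share a common identifier; when they are maintained by different offices, reconciling records is part of the burden keyholders describe.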
Researchers and some keyholders we interviewed suggested that rather
than using income data, schools could report graduation rates based on
whether or not students received federal Pell Grants. Since Pell
Grants are awarded to low-income students, a student's Pell Grant
status could be used as a proxy for income. Although the data are not
collected through IPEDS, the Higher Education Opportunity Act included
a new provision that requires institutions to disclose graduation
rates disaggregated by Pell Grant recipient status to prospective
and enrolled students upon request.[Footnote 29] Some state higher
education systems are already using Pell Grant status to analyze
graduation rates and voluntarily reporting the information through a
mechanism other than IPEDS. Half of the keyholders we interviewed said
it would be easier to calculate graduation rates by Pell Grant status
than income. For example, one keyholder told us Pell Grant status is a
simple yes/no question compared with more complex income data.
However, other keyholders told us reporting graduation rates by Pell
Grant status would present the same challenges and be just as
burdensome as reporting the data by income.
Conclusions:
When the federal government collects information from the public, the
usefulness of the information must be balanced against the burden it
imposes. This trade-off is clearly apparent with IPEDS, which provides
Education and the public with valuable information on postsecondary
education but also imposes a burden on all institutions that
collect and report the data. To effectively weigh the benefits of
IPEDS against the collection costs, it is essential for policymakers
to have reasonable estimates of the reporting burden. However,
Education's current IPEDS estimates appear to be low. As a result,
policymakers run the risk of making future decisions about IPEDS
without knowing how those decisions will affect the burden on
postsecondary institutions. Accurate burden estimates are therefore
essential when considering collecting additional data, such as
detailed graduation rates, or scaling back particular survey sections.
It is also important to minimize the burden imposed by data
collections. Several options exist for reducing the IPEDS reporting
burden without sacrificing valuable data. These options, including
improving communication about training opportunities, would be
particularly beneficial to small schools that generally have a higher
relative burden. These institutions may not have the resources to
devote staff to institutional research and reporting full-time.
Minimizing the burden on these schools would free up staff to focus on
their numerous other duties that are essential to operating a
postsecondary institution.
When considering the expansion of existing information collections, it
is important that policymakers also understand the strengths and
limitations of available data. In the case of IPEDS graduation rates,
there are significant limitations with the current collection of data
that reduce their usefulness as an accountability measure for schools.
Until these underlying issues are addressed and, for example,
postsecondary data systems are developed that are capable of tracking
all students who transfer among schools, additional graduation data
will provide insights into the outcomes of only a single subgroup of
students, albeit a large one.
Recommendations for Executive Action:
We recommend that the Secretary of Education direct the Commissioner
of NCES to take the following three actions:
To improve the availability of reliable information to Congress and
postsecondary institutions about postsecondary institutions' data
collection efforts, reevaluate the official IPEDS burden estimates and
establish new baseline estimates as appropriate.
To help reduce the reporting burden on postsecondary institutions:
* Improve how NCES communicates IPEDS training opportunities to a
wider range of institutions, particularly smaller career and technical
institutions outside of traditional higher education networks.
* Coordinate with higher education software providers to help enhance
the quality and reliability of IPEDS reporting features.
Agency Comments and Our Evaluation:
We provided a draft of this report to Education for review and comment
and received a written response from NCES, which is reprinted in
appendix II. NCES generally agreed with our recommendations and
highlighted several steps it has taken, or intends to take, to address
issues raised in our report. For example, NCES has already initiated a
review of its IPEDS burden estimates, which includes a study of the
methodologies used by other federal agencies that might assist NCES in
making more accurate estimates. To communicate training opportunities
to a wider range of institutions, NCES plans to send dedicated e-mails
about training opportunities to keyholders and expand its outreach
among networks of career and technical institutions. In response to
our recommendation to coordinate with higher education software
providers, NCES noted, as we do in this report, that some schools do
not use commercially available campus data systems. NCES stated that
it will take steps to coordinate with software providers and others
that assist institutions with IPEDS reporting by creating a central
online source of relevant IPEDS information for software providers and
enabling them to register for e-mails about IPEDS updates.
We are sending copies of this report to the appropriate congressional
committees, the Secretary of Education, and other interested parties.
The report also is available at no charge on the GAO Web site at
[hyperlink, http://www.gao.gov].
If you or your staff members have any questions about this report,
please contact me at (202) 512-7215 or scottg@gao.gov. Contact points
for our Offices of Congressional Relations and Public Affairs may be
found on the last page of this report. GAO staff who made major
contributions to this report are listed in appendix III.
Signed by:
George A. Scott, Director:
Education, Workforce, and Income Security Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
The objectives of this report were to identify (1) the time and cost
burden for postsecondary schools completing the Integrated
Postsecondary Education Data System (IPEDS) surveys, (2) options for
reducing this burden, and (3) the potential benefits and challenges of
collecting additional data on institutions' graduation rates. To
address these questions, we analyzed existing estimates of the IPEDS
time and cost burden; reviewed relevant laws and documents; and
interviewed Department of Education (Education) and Office of
Management and Budget officials, higher education researchers, higher
education associations, and higher education software providers. We
also interviewed institution staff, known as keyholders, who are
responsible for entering data into the IPEDS surveys from 22
postsecondary institutions.
Document Review:
To understand the IPEDS time and cost burdens, we reviewed Education
documents on existing estimates of the IPEDS time and cost burdens. We
reviewed Education's January 2009 Paperwork Reduction Act submission
to the Office of Management and Budget that established the official
burden estimates published in the Federal Register. We compared these
estimates with time burdens reported by 22 schools we contacted as
described below. We also reviewed a 2009 National Center for Education
Statistics (NCES) internal study that evaluated the reported time
burden at nine institutions and relevant GAO reports on the Paperwork
Reduction Act.
To examine options for reducing the burden, we also reviewed documents
from the IPEDS Technical Review Panel meetings. To examine the
feasibility of collecting additional graduation rate data, we examined
the 2009-2010 IPEDS graduation rates surveys to identify what data are
currently collected.
Analysis of Sample of Postsecondary Schools and Keyholders:
To collect information on all three of our objectives, we selected a
nonprobability sample of 22 postsecondary schools.[Footnote 30] While
limiting our sample to 22 schools precluded us from generalizing our
findings to the entire population of postsecondary schools, our
approach allowed us to conduct detailed, in-person interviews with
keyholders and relevant staff without substantially burdening the
schools. This sample of 22 institutions represented a range of 4-year,
2-year, and less than 2-year institutions, as well as public, not-for-
profit, and proprietary institutions in four different geographic
areas of the country and the District of Columbia, as illustrated in
table 4. We selected our sample of 22 schools to generally correspond
with the proportion of 4-year, 2-year, and less than 2-year schools
and sectors (public, private not-for-profit, and private for-profit)
in the population of postsecondary schools receiving funding under
Title IV of the Higher Education Act. Our sample included schools with
relatively large and small enrollments for each major category of
institutions. Our 22 institutions also included one Historically Black
College and University, one Predominantly Black Institution, one
Hispanic Serving Institution, and two Tribal Colleges. To understand
the unique challenges faced by new keyholders, we included 6 new
keyholders in our sample. While new keyholders comprised 11 percent of
all keyholders in 2008, oversampling new keyholders in our study
allowed us to analyze new keyholder experiences over a broad range of
types of schools. Because we found that staff experience is a
determinant of the burden, this oversampling likely increased the
overall level of burden our sample reported relative to a sample with
fewer new keyholders.
Table 4: List of Institutions Included in Study:
Name of institution: Belmont Abbey College;
Location: Belmont, NC;
Sector: Private not-for-profit;
Level: 4-year or above.
Name of institution: Blue Hills Regional Technical School;
Location: Canton, MA;
Sector: Public;
Level: Less than 2-year.
Name of institution: Boston University;
Location: Boston, MA;
Sector: Private not-for-profit;
Level: 4-year or above.
Name of institution: Brookstone College;
Location: Charlotte, NC;
Sector: Private for-profit;
Level: Less than 2-year.
Name of institution: Bunker Hill Community College;
Location: Boston, MA;
Sector: Public;
Level: 2-year.
Name of institution: Central Piedmont Community College;
Location: Charlotte, NC;
Sector: Public;
Level: 2-year.
Name of institution: Chicago State University;
Location: Chicago, IL;
Sector: Public;
Level: 4-year or above.
Name of institution: City Colleges of Chicago[A];
Location: Chicago, IL;
Sector: Public;
Level: 2-year.
Name of institution: College of Santa Fe;
Location: Santa Fe, NM;
Sector: Private for-profit;
Level: 4-year or above.
Name of institution: Coyne American Institute Inc;
Location: Chicago, IL;
Sector: Private for-profit;
Level: 2-year.
Name of institution: FINE Mortuary College LLC;
Location: Norwood, MA;
Sector: Private for-profit;
Level: 2-year.
Name of institution: Institute of American Indian and Alaska Native
Culture;
Location: Santa Fe, NM;
Sector: Public;
Level: 4-year or above.
Name of institution: Livingstone College;
Location: Salisbury, NC;
Sector: Private not-for-profit;
Level: 4-year or above.
Name of institution: Navajo Technical College;
Location: Crownpoint, NM;
Sector: Public;
Level: 2-year.
Name of institution: Pine Manor College;
Location: Chestnut Hill, MA;
Sector: Private not-for-profit;
Level: 4-year or above.
Name of institution: Strayer University[A];
Location: Washington, DC;
Sector: Private for-profit;
Level: 4-year or above.
Name of institution: Taylor Business Institute;
Location: Chicago, IL;
Sector: Private for-profit;
Level: 2-year.
Name of institution: Universal Therapeutic Massage Institute;
Location: Albuquerque, NM;
Sector: Private for-profit;
Level: Less than 2-year.
Name of institution: University of Aesthetics;
Location: Chicago, IL;
Sector: Private for-profit;
Level: Less than 2-year.
Name of institution: University of New Mexico[A];
Location: Albuquerque, NM;
Sector: Public;
Level: 4-year or above.
Name of institution: University of North Carolina, Charlotte;
Location: Charlotte, NC;
Sector: Public;
Level: 4-year or above.
Name of institution: Vandercook College of Music;
Location: Chicago, IL;
Sector: Private not-for-profit;
Level: 4-year or above.
Source: GAO.
[A] Keyholders at these schools were responsible for reporting for
multiple campuses.
[End of table]
We conducted in-person interviews with keyholders and relevant staff
at each institution in our sample.[Footnote 31] We conducted these
interviews from January to March 2010, which allowed us to interview
keyholders at the end of the fall and winter IPEDS reporting cycles,
while the surveys were relatively fresh in keyholders' minds. During
these structured interviews, we asked the institution staff to
estimate the time it took to prepare for and complete each survey
component. To limit the potential for self-reported over- or
underestimates of the burden, we structured our interviews to ask a
detailed series of both open- and closed-ended questions about the
processes and staff resources required to complete each survey.
also conducted a second round of follow-up phone interviews with
keyholders in May 2010 to confirm keyholders' initial time estimates
and to collect time estimates for the spring collection cycle, which
many keyholders had not completed at the time of our in-person
interviews. This second round of follow-up interviews also enabled us
to ask keyholders about the spring surveys' time estimates soon after
the spring collection closed, while these surveys were still fresh in
their minds. We also used these two rounds of interviews to examine
options for reducing the burden and to understand the benefits and
challenges involved in collecting additional graduation rate data,
disaggregated by race, ethnicity, and income.
Additional Interviews:
To examine the methodology and assumptions used to create Education's
burden estimates, we interviewed officials from NCES and the Office of
Management and Budget's Office of Information and Regulatory Affairs.
We also interviewed staff from two organizations that Education
contracts with to operate and support IPEDS, RTI International and the
Association for Institutional Research. We also used these interviews
to examine options for reducing the IPEDS burden, as well as to
understand the benefits and challenges of collecting additional
information on graduation rates.
To understand the IPEDS reporting burden for schools and to understand
options for reducing this burden, we interviewed experts from a broad
range of higher education associations including the American Council
on Education, the American Indian Higher Education Consortium, the
Career College Association, the State Council of Higher Education for
Virginia, the State Higher Education Executive Officers, and The
Institute for College Access and Success. To examine challenges with
software's IPEDS reporting features, we interviewed representatives
from four major higher education software providers. We selected these
providers based on the findings of an NCES-sponsored study examining
the prevalence of software use among keyholders.
To examine the benefits and challenges of collecting additional data
on graduation rates, in addition to the groups listed above, we
interviewed experts from Education Sector, the Association of Public
Land Grant Universities, Education Trust, the Institute for Higher
Education Policy, and the Delta Cost Project.
We conducted this performance audit from August 2009 to August 2010,
in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives.
[End of section]
Appendix II: Comments from the Department of Education:
U.S. Department of Education:
Institute of Education Sciences:
National Center for Education Statistics:
August 2, 2010:
Mr. George A. Scott:
Director:
Education, Workforce, and Income Security Issues:
United States Government Accountability Office:
Washington, DC 20548:
Dear Mr. Scott:
Thank you for providing the Department of Education with a draft copy
of the U.S. Government Accountability Office's (GAO's) report
entitled, "Higher Education: Institutions' Reported Data Collection
Burden Is Higher Than Estimated but Can Be Reduced through Increased
Coordination" (GAO-10-871).
This study looks at the reporting burden for postsecondary
institutions through the Integrated Postsecondary Education Data
System (IPEDS), including Education's estimates of that burden and
opportunities to reduce burden for institutions. It also examines the
feasibility and burden associated with collecting additional
graduation rate data.
Regarding the recommendations made in the report, the National Center
for Education Statistics (NCES) responds as follows:
Recommendation #1: To improve the availability of reliable information
to Congress and postsecondary institutions about postsecondary
institutions' data collection efforts, re-evaluate the official IPEDS
burden estimates and establish new baseline estimates as appropriate.
Response: NCES agrees with this recommendation and has begun steps to
address it. Burden estimates have been made available for comment for
90 days through postings in the Federal Register. However, it was not
until 2007 that data providers first commented on burden estimates in
response to Federal Register postings, suggesting that this process
does not adequately elicit data providers' evaluations of burden
estimates. To address this problem, we will include specific burden
estimates in the Technical Review Panel summaries that are posted to
the IPEDS Web site for comment, and will alert all keyholders of the
opportunity to comment on those estimates via an announcement in our
"This Week in IPEDS" electronic newsletter.
We have a two-year process of burden estimate review underway. In
2008, we commissioned an internal study to examine our burden
estimates, and we have initiated a 2010 follow-up study to examine
methodologies used by other federal agencies that might assist NCES in
making more accurate estimates.
Recommendation #2: To help reduce the reporting burden on
postsecondary institutions, improve how NCES communicates IPEDS
training opportunities to a wider range of institutions, particularly
smaller career and technical institutions outside of traditional
higher education networks.
Response: NCES agrees with this recommendation. The IPEDS program
offers extensive training opportunities both by NCES staff and through
its training subcontract with the Association for Institutional
Research. All IPEDS keyholders are alerted to these training
opportunities through announcements in "This Week in IPEDS." However,
to draw greater attention to them, we will send separate e-mails
exclusively about training. In addition, as noted in your report, many
small career and technical colleges do not belong to the national
associations through which we have targeted training. We will continue
to work with the Career College Association to provide better
outreach, and will also expand our efforts into new networks, for
example, the National Accrediting Commission of Cosmetology Arts and
Sciences, which accredits approximately 1,300 institutions.
Recommendation #3: To help reduce the reporting burden on
postsecondary institutions, coordinate with higher education software
providers to help enhance the quality and reliability of IPEDS
reporting features.
Response: NCES agrees with the goal of "enhancing the quality and
reliability of IPEDS reporting features." We note, however, that
institutions may opt not to purchase reporting applications that are
commercially developed. This is true of both small and large
institutions. Many small institutions do not have a need for a
sophisticated student data system or the resources to invest in one.
They often maintain their student records using more common software
such as Microsoft Excel or Microsoft Access. Larger institutions often
have homegrown data systems or have developed their own programming
code and methods for reporting to IPEDS that complement the vendor-
provided software they use for other purposes on their campuses. In
addition, because different types of institutions report different
types of data to IPEDS, even if vendors improve the IPEDS modules
within their product, it is likely that they will still need to
customize them for different institutions, often at additional costs
to those institutions.
NCES will take steps to better coordinate with software vendors and
others that assist institutions in reporting to IPEDS. We will create
a vendor page within the IPEDS Web site that will include a link to
the IPEDS Data Provider Center, with descriptions of resources
available (e.g., collection schedule, proposed changes, and survey
materials) and the timetable for availability. This area of the Web
site will also provide a link to IPEDS training opportunities and
descriptions of the training that is available. To alert these third
parties to changes to that area of the Web site, we will offer them
the option to "register" as an IPEDS-related vendor and e-mail them
about any new information that is posted.
I appreciate your examination of this issue. NCES is committed to
providing more accurate burden estimates for this data collection and
improving communication and coordination of training opportunities to
help institutions reduce reporting burden.
Sincerely,
Signed by:
Stuart Kerachsky:
Deputy Commissioner, NCES:
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact:
George A. Scott, (202) 512-7215, or scottg@gao.gov:
Staff Acknowledgments:
In addition to the individual named above, the following staff made
key contributions to this report: Gretta L. Goodwin, Assistant
Director, and William Colvin, Analyst-in-Charge, managed all aspects
of this assignment, and Grace Cho, Analyst, and Edward Leslie,
Analyst, made significant contributions to all phases of the work. In
addition, Carl M. Ramirez and Beverly Ross provided methodological
assistance; Craig H. Winslow provided legal counsel; Susannah Compton
assisted in message and report development; and James Bennett and Mimi
Nguyen drafted the report's graphics.
[End of section]
Footnotes:
[1] 20 U.S.C. § 1094(a)(17). IPEDS was initiated in 1986 and replaced
several surveys that collected similar information. IPEDS has been
conducted through a Web-based system since 2000.
[2] Institutions were required to collect graduation rate data under
the Student Right-to-Know and Campus Security Act to increase
information about institutions to students and parents. Pub. L. No.
101-542, § 103(a), 104 Stat. 2381, 2381-84. Education added the
Graduation Rates Survey to IPEDS in 1997 to help institutions satisfy
these requirements.
[3] 44 U.S.C. § 3506(c)(1)(B)(iii)(III).
[4] Pub. L. No. 110-315, § 1103, 122 Stat. 3078, 3492-93.
[5] 44 U.S.C. § 3506(c)(1)(B)(iii)(III).
[6] 44 U.S.C. §§ 3503(a) and 3502(3), respectively.
[7] Some of the nine IPEDS surveys have more than one form associated
with them to account for different school characteristics. For
example, nonprofit, for-profit, and public schools all complete
different Finance Survey forms, each with a different time burden
estimate associated with them. Education estimates the time burden for
each separate form. The total time Education estimates it takes
institutions to complete IPEDS is equal to the sum of the time-burden
estimates on all the survey forms applicable to an institution.
[8] Pub. L. No. 88-352, 78 Stat. 241.
[9] Pub. L. No. 98-524, § 421, 98 Stat. 2435, 2472-73.
[10] Pub. L. No. 101-542, § 103(a), 104 Stat. 2381, 2381-84.
[11] The Graduation Rates Survey requires less than 2-year and 2-year
institutions to report on the number of full-time, first-time, degree/
certificate-seeking students that complete within 100 percent and 150
percent normal time. Four-year institutions are required to report on
the number of full-time, first-time, bachelor's or equivalent degree-
seeking students that complete in 4 years, 5 years, and 6 years (100,
125, and 150 percent normal completion time).
[12] Institutions are only allowed to remove students from an initial
cohort if they left the institution for one of the following reasons:
death or total and permanent disability; service in the armed forces
(including those called to active duty); service with a foreign aid
service of the federal government, such as the Peace Corps; or service
on official church missions.
[13] 20 U.S.C. § 1092(a)(7)(A).
[14] Keyholders were asked to consider the amount of time they spend
on IPEDS reporting, the time frame they have to do that work in, any
difficulty they have in collecting or submitting IPEDS data, and the
overall level of effort IPEDS reporting requires, and then rank the
IPEDS burden using a scale from 1 to 5: (1) not at all burdensome, (2)
slightly burdensome, (3) moderately burdensome, (4) very burdensome,
(5) extremely burdensome.
[15] GAO, Paperwork Reduction Act: Increase in Estimated Burden Hours
Highlights Need for New Approach, [hyperlink,
http://www.gao.gov/products/GAO-06-974T] (Washington, D.C.: July 18,
2006).
[16] [hyperlink, http://www.gao.gov/products/GAO-06-974T].
[17] 44 U.S.C. § 3506(c)(3)(C).
[18] NCES offers keyholder training under contract through the
Association for Institutional Research.
[19] Crissie M. Grove, "Features of Campus Data Systems and Reporting
to IPEDS" (July 2009), [hyperlink,
http://www.airweb.org/images/Grove_Final_Report_2010.pdf].
[20] 20 U.S.C. § 9543(a)(4).
[21] NCES has contracted with RTI International to administer the
IPEDS Help Desk.
[22] 20 U.S.C. § 1015c.
[23] The Beginning Postsecondary Students Longitudinal Study is conducted
by Education and follows students who first begin their postsecondary
education. These students are asked questions about their experiences
during, and transitions through, postsecondary education and into the
labor force, as well as family formation. Transfers, dropouts, and
vocational completers are among those included in the studies.
[24] See [hyperlink, http://www.nces.ed.gov/collegenavigator/].
[25] NCES does not publicly disclose personally identifiable IPEDS
data.
[26] Keyholders were asked to estimate the potential burden that would
be imposed by collecting and reporting additional types of graduation
rate data through IPEDS using a scale from 1 to 5: (1) not at all
burdensome, (2) slightly burdensome, (3) moderately burdensome, (4)
very burdensome, (5) extremely burdensome.
[27] The FAFSA only collects information on parents' income if the
student is classified as financially dependent on their parents. The
FAFSA also collects income data on a student's spouse if applicable.
[28] Schools could group all students for whom income data are not
available into a separate category for analyzing graduation rates.
[29] 20 U.S.C. § 1092(a)(7)(A). The provision also requires that
completion or graduation rates must be disaggregated by recipients of
a subsidized Stafford Loan who did not receive a Pell Grant, as well
as students who did not receive either a Pell Grant or a subsidized
Stafford Loan. The requirement for disaggregation does not apply to 2-
year degree-granting institutions until academic year 2011-2012.
[30] Results from nonprobability samples cannot be used to make
inferences about a population because in a nonprobability sample some
elements of the population being studied have no chance or an unknown
chance of being selected as part of the sample.
[31] We conducted a preliminary site visit at the University of
Maryland, College Park, in November 2009 to help develop the keyholder
interview protocol.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: