This is the accessible text file for GAO report number GAO-06-6
entitled 'Education's Data Management Initiative: Significant Progress
Made, but Better Planning Needed to Accomplish Project Goals' which was
released on October 28, 2005.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Committees:
United States Government Accountability Office:
GAO:
October 2005:
Education's Data Management Initiative:
Significant Progress Made, but Better Planning Needed to Accomplish
Project Goals:
GAO-06-6:
GAO Highlights:
Highlights of GAO-06-6, a report to congressional committees:
Why GAO Did This Study:
As a condition of receiving federal funding for elementary and
secondary education programs, states each year provide vast amounts of
data to Education. While this information is essential for program
evaluation (particularly with the No Child Left Behind Act),
Education's data gathering has long presented problems. It has been
burdensome to states because there are multiple
and redundant requests administered by a number of offices. In
addition, the resulting data supplied by states has not been accurate,
timely, or conducive to assessing program performance. To improve the
information by which it evaluates such programs and also to ease
states' reporting burden, Education in 2002 initiated an ambitious,
multiyear plan to consolidate elementary and secondary data collections
into a single, departmentwide system focused on performance. Given its
importance, we prepared a study, under the authority of the Comptroller
General, to provide Congress with information on its progress.
What GAO Found:
Through its Performance-Based Data Management Initiative (PBDMI),
Education has consolidated and defined much of the data it anticipates
collecting under a unified system. Education reports that many data
definitions have been agreed upon and data redundancies eliminated.
PBDMI officials said, however, that to date they have not been able to
resolve all remaining differences among the program offices that manage
many of the different data collections.
PBDMI officials have conducted extensive outreach to the states to
advance the initiative. The outreach to states involved regional
conferences, two rounds of site visits, and according to officials,
$100,000 in grants to most states to help offset their costs. State
data providers responding to our survey expressed general satisfaction
with the department's outreach, but some were not optimistic that the
initiative would ease their reporting burden or enhance their own
analytic capacity. The states were not able to produce enough data
during test submissions in 2003 and 2004 to enable data quality
verification or the phasing out of the department's multiple data
collections. With regard to the lack of sufficient data from many
states, Education officials said some lack the technical capacity
needed to meet new performance data requirements. State data providers
reported having
competing demands for their time and resources, given other federal
initiatives.
Education officials have decided to proceed with the undertaking and
have developed a draft interim strategy for moving forward. But they
currently have no formal plan for how they would overcome obstacles
such as the lack of state data and other technical and training delays
to the initiative.
Reporting to Education: A Sample of Data Collections Seeking
Information on Elementary and Secondary Programs in One State in 2004:
[See PDF for image]
[End of figure]
What GAO Recommends:
GAO recommends that Education (1) develop a strategy to help states
provide quality data, (2) develop a process within the department to
resolve critical, outstanding issues, and (3) develop a clear plan for
completing final aspects of PBDMI, including specific time frames and
indicators of progress toward the initiative's goals. Education agreed
with our recommendations.
www.gao.gov/cgi-bin/getrpt?GAO-06-6.
To view the full product, including the scope and methodology, click on
the link above. For more information, contact David Bellis at (415) 904-
2272 or bellisd@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Education Has Made Progress Defining Data to Be Collected under a
Consolidated System, but Project Officials Are Unable to Reconcile All
Differences:
PBDMI Officials Have Worked Extensively with States on Data Preparation
and Submissions, but Most States Cannot Produce the Requested Data:
Education Is Proceeding with Implementation despite the Data Shortage
and without a Detailed Plan of Action:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Scope and Methodology:
Appendix II: Comments from the Department of Education:
Appendix III: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Data Collections That Could Be Reduced or Eliminated as a
Result of PBDMI:
Table 2: Education's Outreach Activities to Improve Data Quality under
the PBDMI:
Table 3: State Survey Responses on Education's Outreach Activities
through PBDMI:
Table 4: Percentage of States Reporting Which Goals Were Most
Important, Attainable, and Difficult to Achieve:
Table 5: Percentage of States Reporting the Extent to Which PBDMI Would
Improve Their Analytical Capacity:
Figures:
Figure 1: Time and Money--Estimated Annual State Burden Hours and Costs
for Select Elementary and Secondary Education Data Collections:
Figure 2: Proposed Design for PBDMI's Web-based Network:
Figure 3: Illustration of Key Actors Involved in Development of the
PBDMI:
Figure 4: Fiscal Year 2002-2009 Funding for Education's Data Management
Initiative, including Key Activities Planned for Project Initiation
through Implementation:
Figure 5: School Year 2003-2004 Performance Data Submitted by States to
PBDMI as of June 3, 2005:
Abbreviations:
CIO: chief information officer:
FAPE: free and appropriate public education:
IDEA: Individuals with Disabilities Education Act:
IES: Institute of Education Sciences:
IG: Inspector General:
NCLBA: No Child Left Behind Act of 2001:
OCR: Office for Civil Rights:
OELA: Office of English Language Acquisition:
OESE: Office of Elementary and Secondary Education:
OMB: Office of Management and Budget:
OSDFS: Office of Safe and Drug-free Schools:
OSERS: Office of Special Education and Rehabilitative Services:
OVAE: Office of Vocational and Adult Education:
PBDMI: Performance-Based Data Management Initiative:
PRA: Paperwork Reduction Act of 1995:
SEA: state education agency:
United States Government Accountability Office:
Washington, DC 20548:
October 28, 2005:
The Honorable Michael B. Enzi:
Chairman:
The Honorable Edward M. Kennedy:
Ranking Minority Member:
Committee on Health, Education, Labor, and Pensions:
United States Senate:
The Honorable John A. Boehner:
Chairman:
The Honorable George Miller:
Ranking Minority Member:
Committee on Education and the Workforce:
House of Representatives:
Each year, state education agencies provide vast amounts of information
to the U.S. Department of Education (Education) in order to fulfill
reporting requirements for federal programs supporting elementary and
secondary education. While this information is important for managing
programs, it has been accompanied by some problems. Reporting has been
burdensome for the state data providers because the department makes
its data requests through multiple, ongoing, and uncoordinated data
collections. By Education's own account, there are currently 200 active
data collections for elementary and secondary programs--each resulting
in approximately 10,000 "person hours" for design, administration,
collection, and reporting. From the vantage point of the department and
its program offices, the information it receives has customarily been
compromised because the schools, districts, and states reporting data
employ their own definitions and, in some cases, report data that is
inaccurate, incomplete, and not timely. Finally, in terms of program
evaluation, much of the data that Education has traditionally requested
has not necessarily focused on program performance. Yet the need for
evaluative data has grown, particularly with passage of laws such as
the No Child Left Behind Act of 2001, which requires states receiving
assistance under the act to report on, among other things, the
achievement of their students on academic assessments required under
that law.
To address these problems and better evaluate its programs, Education
in 2002 began an initiative to consolidate and improve the information
it requests from states on elementary and secondary education and to
seek more consistency and quality in the data states supply. The
Performance-Based Data Management Initiative (PBDMI) is a large-scale
effort within the department to combine more than a dozen separate data
collections into a single collection system, and better focus the
information Education requests from states by eliminating duplication,
conflicting definitions, and information that is not useful for the
evaluation of its programs. The PBDMI represents an important step
forward for Education in its efforts to monitor the performance of the
nation's elementary and secondary schools. The initiative is also a
large-scale undertaking for state education agencies, which are
volunteering to help develop uniform data and test the new data
collection system while they continue to meet their ongoing reporting
requirements. The PBDMI was scheduled to begin phasing out the old data
collections by September 2005, following final testing of the new
system and training of department staff.
In view of the initiative's importance and inherent challenges, we
prepared a study under the authority of the Comptroller General to
provide Congress with information about Education's progress with the
PBDMI. We have examined Education's work to (1) define what performance-
related data it will collect from states on behalf of the program
offices, (2) assist states in their efforts to submit quality
information, and (3) utilize performance-related data to provide
enhanced analytic capacity within the program offices.
To address our objectives we reviewed relevant documents, including
Education's business plans, information collected by Education on
states' capacity to supply data, various contracts for key pieces of
the initiative, Education's submissions to the Office of Management and
Budget (OMB) justifying the various data collections, the department's
concept of operations, and other information related to the development
of PBDMI. We also interviewed Education officials overseeing PBDMI,
officials from most of the participating program offices, and key
stakeholders in PBDMI, including a standards-setting organization, an
advocacy group, and contractors, to obtain their perspectives on the
progress of the initiative and to verify the information we reviewed.
Finally, we surveyed data coordinators in the 50 states, the District
of Columbia, and Puerto Rico--52 in all--about their experiences with PBDMI, and we
received 50 responses. We performed this work between April 2004 and
September 2005 in accordance with generally accepted government
auditing standards. See appendix I for additional information on our
scope and methodology.
Results in Brief:
Through its PBDMI, Education has, according to project officials,
identified and defined much of the data to be collected under a unified
data collection system.
project officials have engaged in an ambitious effort with the program
offices to identify data needed for program administration and
oversight. They also developed performance-related data that would meet
those needs, particularly for evaluating the effectiveness of federal
programs. They further worked to develop common definitions and
eliminate redundancy for data that would be collected through the
system. The end result of this work was a body of performance-based
data elements designed to better position the department to monitor the
performance of its elementary and secondary education programs.
However, officials responsible for the initiative told us that they
were unable to resolve all data differences among Education's program
offices, given the traditional, diffused control of information
collected throughout the department. PBDMI officials estimated that the
majority of the work to define these data elements had been completed,
although we found that they did not develop baseline data that would
allow them to track the full extent of their progress. We were also
told that the hard-to-resolve differences that remain would
ultimately be settled at higher levels within the department, but the
department has no formally agreed-upon process for how or when such
decisions would occur.
PBDMI officials have conducted extensive outreach to the states to help
them meet Education's data requests and in some cases upgrade their
collection and submission systems, but after 2 years of testing, most
states have not, for a variety of reasons, been able to provide
Education with enough reliable data to proceed with the initiative.
This outreach involved two rounds of site visits to all the
participating states to confer about data elements developed with the
programs and offer technical assistance, and $100,000 in grants to most
states to help offset their costs. In addition, Education sponsored
regional conferences and developed a call center to help states prepare
and submit data. The department's activities were focused largely on
state-level agencies, but did involve some educational organizations.
State data providers responding to our survey expressed general
satisfaction with the department's outreach, but about 75 percent
nevertheless predicted that the burden of collecting and reporting data
would increase or remain the same once PBDMI was completed. Many states
also expressed doubts that PBDMI could enhance their analytic
abilities. Only about 20 percent of states expected PBDMI to improve or
greatly improve their analytic capacity. Despite the extensive
outreach, most states were not able to produce enough data during test
submissions in 2002-2003 and again in 2003-2004 for the department to
validate its quality and consider phasing out its standing collection
systems. Thus, the department has decided to keep the latter collection
phase open longer. According to PBDMI officials, some states lack the
technical capacity to collect and report the requested data
electronically and others need to modify their existing processes to
meet the new specifications. Still others wanted clarification from the
department for data definitions. State data providers also reported
having competing demands for their time and attention, given other
federal initiatives.
Education officials have decided to proceed with PBDMI's implementation
despite a shortage of data, other delays, and reservations among a few
program offices; however, they do not have a specific plan for
addressing these obstacles. Currently, the department expects to
complete its systems development efforts, which include the full
implementation of its data analysis and reporting system, by spring
2006--1 year later than its initial completion date--primarily because
of the lack of state data and the failure of some of Education's
contractors to meet scheduled delivery dates. To the degree that it has
been able to proceed, the department has begun developing a set of
quality checks, although a few program offices expressed concern about
their adequacy for maintaining the value of the data. Meanwhile,
Education officials have said they are developing strategies to address
these obstacles, including exempting states from certain reporting
requirements, but they had no specific plan for providing further
assistance to the states or for meeting state expectations for phasing
out multiple data collections.
We are making recommendations to Education to improve its planning and
decision-making processes supporting PBDMI. Education's Assistant
Secretary for the Office of Planning, Evaluation and Policy Development
provided written comments on a draft of this report. In its comments,
Education agreed with our
findings and recommendations. Copies of the written comments are in
appendix II.
Background:
The Department of Education annually administers data collections to
gather information from states about elementary and secondary education
programs receiving federal assistance. When it administers a data
collection, Education, like most federal agencies, is required to
follow the provisions of the Paperwork Reduction Act (PRA)[Footnote 1]
in order to maximize the utility of information to the federal agency
and minimize the level of burden incurred by the states and agencies
from whom it solicits the information. Traditionally, the department's
program offices, which have responsibility for the administration and
oversight of federal education programs, have developed and operated
similar data collections independent of one another, in a continuous
year-round process. In addition, much of the data requested from states
has been focused on compliance and procedural matters, and overlooked
performance and the impact of programs in the classroom. Moreover, the
collection of this data has been complex and prone to error, given that
it typically passes from about 94,000 public schools to more than
14,000 school districts and then to state education agencies before
Education receives it.
Collecting data can be both time-intensive and costly. Education
estimated, for example, that in 2004 states spent approximately
45,000 hours and nearly $1.2 million responding to the department's
requests for certain elementary and secondary education data. (See fig.
1.) Data collections are also costly for Education: in 2004, the
department spent over $5 million administering certain data
collections, a figure that includes federal funds both for the staff
who administer the collections and, in many instances, for contractors
who analyze the data.
Figure 1: Time and Money--Estimated Annual State Burden Hours and Costs
for Select Elementary and Secondary Education Data Collections:
[See PDF for image]
[A] State Education Agencies (SEA).
[B] Individuals with Disabilities Education Act (IDEA).
[C] Free and appropriate public education (FAPE).
Note: Figure includes burden estimates for ongoing collections for
which data were available.
[End of figure]
Initiated in 2002, Education's PBDMI has four goals: to improve the
quality of the data Education collects about elementary and secondary
education in terms of accuracy, consistency, and timeliness; to reduce
the burden that states incur in reporting data to the department; to
improve the focus of data analysis on program performance; and to
improve Education's data-sharing relationship with the states. While
this initiative is not the department's first attempt to overhaul the
way it collects data, it nonetheless represents a fundamental change to
its data management in that it is agencywide as opposed to program
specific.[Footnote 2] As envisioned, the new collection would
consolidate 16 separate collections heretofore conducted by seven
program offices.[Footnote 3] Given the additional reporting effort that
development and testing of the system would require of states,
Education sought and received OMB approval to collect data from the
states through PBDMI.[Footnote 4] (See table 1 for a list of the
separate collections the PBDMI is designed to supplant.)
Table 1: Data Collections That Could Be Reduced or Eliminated as a
Result of PBDMI:
Program office: Office of Elementary and Secondary Education (OESE);
Data collections:
1. Consolidated State Performance Report;
2. State Data Collection for the McKinney-Vento Homeless Assistance
Act;
3. Elementary and Secondary Education Act Title I, Part C Migrant Child
Count Report.
Program office: Institute of Education Sciences (IES);
Data collections: 4. Common Core of Data Surveys.
Program office: Office of English Language Acquisition (OELA);
Data collections: 5. Title III Biennial Evaluation Report Required of
State Education Agencies Regarding Activities under the No Child Left
Behind Act of 2001;
6. Biennial Report Form for the Emergency Immigration Education
Program.
Program office: Office of Special Education and Rehabilitative Services
(OSERS);
Data collections: 7. Report of Children with Disabilities Receiving
Special Education under Part B of the Individuals with Disabilities
Education Act;
8. Part B, Individuals with Disabilities Education Act Implementation
of "Free and Appropriate Public Education" (FAPE) Requirements;
9. Personnel Employed to Provide Special Education and Related Services
for Children with Disabilities;
10. Report of Children with Disabilities Exiting Special Education
During the School Year;
11. Report of Children with Disabilities Unilaterally Removed or
Suspended/Expelled for More than 10 Days;
12. Part B of the Individuals with Disabilities Education Act Annual
Performance Report;
13. Consolidated Data Collection on Students with Disabilities [A].
Program office: Office of Safe and Drug-free Schools (OSDFS);
Data collections:
14. Gun-Free Schools Act Report.
Program office: Office for Civil Rights (OCR);
Data collections:
15. Elementary and Secondary School Civil Rights Compliance Report[A].
Program office: Office of Vocational and Adult Education (OVAE);
Data collections:
16. Carl D. Perkins Vocational and Technical Act Annual Performance
Report.
Source: GAO analysis of Department of Education documents.
[A] Collections that have been replaced by PBDMI.
[End of table]
In addition to defining the information to be collected, the initiative
involves the development of a Web-based data exchange network that
will provide states and others with the ability to submit school-based
data into one unified system to be stored in a data repository. The
network will comprise three separate but interrelated systems. The
first system, the submission system, developed in late 2004, is used
to collect data from states, check data for quality, and store the
data in
the data repository. The second system, the survey tool, which was also
developed in 2004, enables Education to collect supplemental data from
states and others that is also stored in the data repository. The third
system, the data analysis and reporting system, which is not yet
operational, will allow users (i.e., program office staff and the
public) to, among other things, query the data repository to analyze
retrieved data and generate ad hoc reports. Education envisions that
states and school districts would be able to use the data to assess
their own program performance while also providing an opportunity for
them to verify the quality of data submitted through the system. Figure
2 depicts the system design for the data network.
Figure 2: Proposed Design for PBDMI's Web-based Network:
[See PDF for image]
[End of figure]
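For illustration, the following minimal sketch (in Python) models the
division of labor among the three systems described above: a submission
step that applies a quality check and stores records in a repository,
and an analysis step that queries the repository. All names, record
fields, and checks here are hypothetical assumptions for exposition;
they do not represent Education's actual PBDMI implementation.

from dataclasses import dataclass, field

# Hypothetical sketch only: names and rules are illustrative
# assumptions, not Education's actual PBDMI implementation.

@dataclass
class DataElement:
    name: str          # element name under the common, department-wide definition
    value: object      # reported value
    school_year: str   # e.g., "2003-2004"

@dataclass
class Repository:
    """Data repository: stores records accepted by the submission system."""
    records: list = field(default_factory=list)

    def store(self, state, elements):
        self.records.extend((state, e) for e in elements)

    def query(self, element_name):
        # Stands in for the analysis and reporting system's ad hoc queries.
        return [(state, e.value) for state, e in self.records
                if e.name == element_name]

def submit(repository, state, elements):
    """Submission system: apply a quality check, then store accepted records."""
    accepted = [e for e in elements if e.value is not None]  # toy quality check
    repository.store(state, accepted)
    return len(accepted), len(elements) - len(accepted)

repo = Repository()
submit(repo, "State A", [DataElement("math_proficiency_pct", 64.2, "2003-2004")])
print(repo.query("math_proficiency_pct"))  # [('State A', 64.2)]

The survey tool described above would feed the same repository through
a separate path; it is omitted from the sketch for brevity.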
Education had originally planned to have all components of the data
exchange network fully operational in the spring of 2005 following the
completion of key activities, such as (1) defining the data to be
collected through in-depth consultations with department program
offices and with state data providers, (2) populating the database with
school-based data submitted by the states so that the quality of the
stored data can be checked, and (3) training program staff on how to
use the new network.
PBDMI's efforts to define what data were to be collected included
forging agreements among Education's individual program offices about
which data would be essential to administration and oversight,
particularly as performance indicators, and also developing common
definitions for those elements that had been redundant. As a
collaborative project, this involved developing consensus and receiving
feedback from many parties--program offices, state policymakers and data
providers, and organizations that develop data standards in the field
of education. Within the department, the office responsible for the day-
to-day work of the project and for ensuring its success is the
Strategic Accountability Service, which also has responsibility for
developing and disseminating agencywide performance indicators.
However, a number of other offices and boards within the department
have been charged with providing oversight and guidance: a steering
committee--consisting of the PBDMI managers and other senior officials
within the participating program offices--convened to share information
on the development of the initiative; the Chief Information Officer
(CIO); a data information working group; and Education's investment
review board. The data information working group, which is headed by
Education's CIO, has responsibility for ensuring the consistency and
quality of new data collections and for facilitating the integration
and sharing of information between program offices. The department's
investment review board has overall responsibility for reviewing,
approving, and prioritizing department investments in technology,
including the new network. As voluntary participants, stakeholders such
as data coordinators from each of the 50 state education agencies, the
District of Columbia, and Puerto Rico were provided with opportunities
to give their input and feedback on the development of the initiative.
The Education Information Advisory Committee, established by the
Council of Chief State School Officers, facilitates this exchange.
Figure 3 depicts
the various groups involved in the initiative.
Figure 3: Illustration of Key Actors Involved in Development of the
PBDMI:
[See PDF for image]
[End of figure]
Once departmental data requirements were identified, Education planned
a series of data collections to be followed by extensive testing of the
quality of that data by the program offices. Specifically, Education
planned to have states submit the newly defined data for the 2002-2003
and 2003-2004 school years. (States would voluntarily make these
submissions to PBDMI while also maintaining their current multiple
reporting obligations under Education's program offices.) In
conjunction with the program offices, PBDMI officials then anticipated
validating and verifying the quality of the new data submitted using a
number of checks and evaluations. Also at this time the development of
the system that staff would use to analyze data and generate reports
was to be finalized. Once these activities were completed, the program
offices were to assess whether the new system would be an adequate
substitute for their existing data collections.
Education has projected that it would spend just over $30 million
through 2005, and initial estimates indicate that the data network will
cost--beginning in 2006--just over $4 million annually to maintain. See
figure 4 for project time frames and projected costs through 2009.
Figure 4: Fiscal Year 2002-2009 Funding for Education's Data Management
Initiative, including Key Activities Planned for Project Initiation
through Implementation:
[See PDF for image]
[A] Includes $2 million to develop a Web-based survey tool designed to
collect supplemental data from schools, districts, and states.
[End of figure]
Education Has Made Progress Defining Data to Be Collected under a
Consolidated System, but Project Officials Are Unable to Reconcile All
Differences:
Education officials spearheading PBDMI told us they have made progress
defining the data to be collected. To do this, project officials worked
with the program offices to identify their existing data needs. They
also worked with program offices to translate these needs into
performance-related data, such as math and reading achievement scores
for different groups of students. Officials told us they had eliminated
data elements collected by the program offices that are more indicative
of process than performance. PBDMI officials encouraged program offices
to identify performance-related data by using requirements specified in
laws such as the No Child Left Behind Act and using the goals in the
department's strategic plans.
PBDMI officials also worked with the program offices to reach agreement
on common definitions for the data elements selected and to eliminate
redundancy. For example, some programs needed information on charter
schools, and PBDMI officials coordinated efforts within the department
to develop one standard definition for them. The end result of these
efforts is a unified body of data elements that includes definitions
for each of the data elements and identifies the program with primary
stewardship over decisions about that element. According to one
department official managing the initiative, this collection will
improve the quality of the data by assuring more consistency in what
states provide.
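To illustrate the kind of record this unified body of data elements
implies, the sketch below (in Python) pairs a single agreed-upon
definition with the program office holding stewardship over it. The
field names and the sample entry are hypothetical, offered only for
exposition; they are not the actual PBDMI data dictionary.

from dataclasses import dataclass

# Hypothetical sketch of a data dictionary entry; field names and the
# sample entry are illustrative, not the actual PBDMI data dictionary.

@dataclass(frozen=True)
class ElementDefinition:
    name: str        # department-wide element name
    definition: str  # the single definition agreed to by the program offices
    steward: str     # program office with primary stewardship over the element

charter_school_status = ElementDefinition(
    name="charter_school_status",
    definition="Whether a school operates under a charter, per the one "
               "standard definition coordinated across the department.",
    steward="OESE",  # assumed steward, for illustration only
)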
Although PBDMI officials reported progress in identifying performance-
related data and establishing common data definitions, project
officials have not fully documented these achievements by establishing
a baseline and thus cannot be certain of the full extent of the
progress made toward achieving their goal to enhance the department's
focus on outcomes and accountability. For example, while PBDMI
officials were able to provide a list of 161 data elements focused on
performance, they were unable to provide us with a comprehensive list of
"process-oriented" elements that had been eliminated. Similarly, while
PBDMI managers reported that the program offices had agreed to
definitions for the bulk of the data elements--one official estimated
that they reached agreement for about 90 percent of the data--they
could not provide us with a complete list of redundant elements that
had been eliminated or those that remain because they had not tracked
them.
While PBDMI officials could not provide a full list of disputed data
elements, they reported that some differences still remain among
program offices. Although PBDMI officials encouraged the use of
strategic plans and statutory requirements to justify the selection of
performance-based data, they told us that program offices had final say
over what data to collect. For example, one office uses similar
although somewhat broader criteria that allow it to collect "data that
can be reliably obtained from states or that Education has a documented
need for."[Footnote 5] Additionally, according to initiative officials,
some differences remain due to differences in legislative requirements
for the particular programs, while others resulted from preferences of
some offices to continue using the same definitions as in the past.
Officials responsible for carrying out the PBDMI told us they were
unable to reconcile all differences. Officials told us they were
working with the program offices to reach agreements, but said the
programs maintain primary control for defining their data needs and
would make final decisions. Additionally, Education's CIO, who is
required to review all data collections and has a primary role within
the Data Information Working Group, told us that the CIO's office has
no role in resolving data disputes between program offices to ensure
uniformity. However, an official also said
that any differences that could not be resolved between the program
offices would ultimately be arbitrated at the assistant secretary level
within Education.
PBDMI Officials Have Worked Extensively with States on Data Preparation
and Submissions, but Most States Cannot Produce the Requested Data:
PBDMI officials have conducted extensive outreach to the states to help
unify their data definitions and upgrade their collection and
submission systems. State data providers responding to our survey
expressed general satisfaction with the department's outreach. However,
the majority thought that the burden of data collection and reporting
would either increase or remain the same with implementation of the
PBDMI. In addition, fewer than half expected the initiative to improve
their ability to conduct their own in-state analyses more than somewhat.
Despite the extensive outreach, the states were not able to produce
enough data during test submissions in 2002-2003 and again in 2003-2004
for the department to validate its quality and consider phasing out its
standing collection systems.
Education Has Conducted Extensive Outreach to States to Improve Data
Quality:
To ensure that states could meet Education's requests for the quality
data required as part of PBDMI, officials conducted extensive outreach
to state agencies, their data providers, and data standards
organizations. After Education developed its body of data elements, it
consulted in 2002 with a task force consisting of a small number of
state data providers to advise the department on the availability of
the data it intended to collect. The department then conducted site
visits beginning in April 2003 to 50 states, the District of Columbia,
and Puerto Rico to obtain feedback on the ability of states to provide
needed data and to prepare for testing the states' ability to submit
data. Education officials said they also made $50,000 grants to all 52
states to offset costs of overhauling information systems or obtaining
additional staff. At the culmination of these visits, Education
originally planned for states to transmit 2002-2003 school year data
that could be tested for quality.
However, Education scaled back the scope of this first data collection
after recognizing that states would not, as yet, be able to offer
certain types of data, such as data needed to meet requirements of the
NCLBA. Consequently, Education delayed its plans to assess the quality
of the data states submitted and focused instead on the ability of
states to electronically transmit as much PBDMI data as they could to
the department. Also, Education decided to remove from PBDMI's
prospective collection some data elements that states reported were not
available at that time. Under this transmission pilot test, 50 states,
including the District of Columbia and Puerto Rico, were able to submit
some data to Education demonstrating that PBDMI was technically
feasible.
After establishing this technical feasibility, Education began
preparing in 2004 for its collection of 2003-2004 school year data by
providing additional outreach to the states.
conducted a second round of site visits beginning in April and provided
further guidance to help states align their data definitions with PBDMI
standards. By aligning definitions with PBDMI, Education attempted to
minimize possible confusion about what data to submit and when, further
assisting the department's efforts to improve data quality. Department
officials have said that establishing a unified body of data elements
across the department and states--so that all involved parties use the
same "language" when analyzing and sharing data[Footnote 6]--is a
priority. Education officials attribute the lack of quality in the data
the department currently collects from states and others to a variety
of factors, such as the absence of common data definitions: definitions
developed over time in response to the specific information needs of
the program offices and to data requirements arising at the state
level.
Officials with the initiative also conducted a limited number of
quality assessments of state information systems to identify better
ways of collecting and reporting data to the department. To serve
states on a broader scale, Education conducted regional meetings,
providing them with updates and feedback on the progress of the
initiative. Officials also established a call center to answer states'
questions about the data to be submitted. Most states also received
another $50,000 in grants for their continued participation in the
initiative.[Footnote 7] Education began collecting 2003-2004 school
year data in November 2004.
To increase the likelihood that its definitions would be adopted by
states and other data providers, PBDMI officials also collaborated with
advocacy groups that establish data standards and influence the
development of technical standards.
Council of Chief State School Officers to coordinate PBDMI conferences,
help states prepare and submit data, and provide feedback as PBDMI
developed data definitions. Education also collaborated with the
Schools Interoperability Framework, a group that develops data-sharing
standards and software primarily designed for schools and districts. By
working with the Schools Framework, Education officials said they could
improve data quality by increasing the likelihood that departmental
definitions and other standards would be incorporated into software
used by schools and districts. This interaction with the Schools
Framework is Education's primary attempt to deal with the long-standing
problem of poor data provided by schools and districts.[Footnote 8]
(See table 2 for a list of some of Education's outreach activities.)
Table 2: Education's Outreach Activities to Improve Data Quality under
the PBDMI:
Activity: Site visits (1st round);
Description: Met with states in summer 2003 to discuss data definitions
and availability for test data collection in November 2003;
Purpose: To introduce standards, encourage consistency, and assess data
availability and technical capacity of state information systems.
Activity: Site visits (2nd round);
Description: Met with states in spring 2004 to confirm data definitions
and availability for initial data collection in November 2004.
Activity: $50,000 PBDMI Participation Grants;
Description: Officials report awarding funds to 52 state education
agencies in 2003-2004 and to 46 states in 2004-2005;
Purpose: To obtain state buy-in and offset costs.
Activity: State taskforce;
Description: This advisory group, made up of a small number of states,
has provided input concerning available data that can be submitted by
states and how collections could yield better quality data;
Purpose: To solicit initial state input on data issues such as
availability and capacity.
Activity: Technical assistance;
Description: Call center available to all states, meetings,
conferences, and data quality assessments provided to 10 volunteer
states;
Purpose: To provide states with answers to questions, updates on the
status of the initiative, and information to help improve data systems.
Activity: Outreach to software vendors;
Description: Education coordinates with key standard setting
organizations such as the Schools Framework;
Purpose: To work collaboratively to develop educational data standards.
Source: Department of Education.
[End of table]
Most States Expressed Satisfaction with Education's Outreach, but Had
Mixed Views on PBDMI's Potential Benefits:
States were generally satisfied with Education's outreach activities.
(See table 3.) Most state data providers--72 percent--rated Education's
site visits effective in improving the partnership with the states. One
state data provider characterized his exchanges with the department as
open and non-defensive, and further reported that the department had
been responsive. More than half rated as effective or very effective
Education's technical assistance (57 percent) and regional meetings (52
percent). While most states thought Education's activities to improve
its partnership with states were effective, some suggested areas for
improvement. For example, 72 percent thought the site visits provided
only some or little information on successes achieved in other states.
Table 3: State Survey Responses on Education's Outreach Activities
through PBDMI:
[See PDF for image]
Source: GAO, Survey of States on the Department of Education's
Performance Based Data Management Initiative (PBDMI).
[End of table]
In their survey responses, half of the states expressed the view that
reducing their reporting burden was the most important PBDMI goal;
however, fewer than a third of the states said they believe the
initiative will do so. (See table 4.) Some states emphasized their
burden had increased in the short term as they continued dual reporting
in order to meet the still ongoing data collection requirements of the
program offices. Three states reported to us their cost estimates of
systems development projects needed to support PBDMI, which ranged from
approximately $120,000 to as much as $5 million. Moreover, about 75
percent of the states reported that they thought the burden to collect
data would remain the same or increase once PBDMI was implemented. Some
state respondents expressed the opinion that until there is a firm
commitment by Education to halt multiple data collections, their
reporting burden would not likely lessen. "We are asked from the
federal government for more and more information ... [which] opens the
flood gate for more and more reporting," noted one official, adding
that it is "hard to see the benefit at this time."
Table 4: Percentage of States Reporting Which Goals Were Most
Important, Attainable, and Difficult to Achieve:
PBDMI goals: Reducing collecting and reporting burden;
Most important goal: 50%;
Most attainable goal: 30%;
Most difficult goal to achieve: 36%.
PBDMI goals: Improving data quality;
Most important goal: 26;
Most attainable goal: 16;
Most difficult goal to achieve: 30.
PBDMI goals: Improving the partnership with states based on common data
standards;
Most important goal: 12;
Most attainable goal: 24;
Most difficult goal to achieve: 14.
PBDMI goals: Focusing on outcomes;
Most important goal: 0;
Most attainable goal: 16;
Most difficult goal to achieve: 8.
PBDMI goals: Not sure;
Most important goal: 12;
Most attainable goal: 14;
Most difficult goal to achieve: 12.
Source: GAO, Survey of States on the Department of Education's
Performance Based Data Management Initiative (PBDMI).
[End of table]
Some states also had reservations about the benefits of PBDMI for
evaluation. One respondent cautioned, for example, that support within
his state had weakened because of the lack of perceived benefits. Only
about 20 percent of states expected PBDMI to improve or greatly improve
their analytic capacity--that is, the ability to meet their own state
reporting requirements, analyze program effectiveness, analyze student
outcomes, and compare outcomes within states. Their reasons varied.
For example, five states reported that they would continue to use their
own systems. A few elaborated that their own information systems allow
more detailed analyses of state performance than the information to be
collected through PBDMI. Additionally, almost equal numbers of states
did and did not see PBDMI as an effective tool for informing
stakeholders. Table 5
lists the extent to which state data providers expect PBDMI to enhance
their analytical capacity in a variety of areas.
Table 5: Percentage of States Reporting the Extent to Which PBDMI Would
Improve Their Analytical Capacity:
Inform stakeholders;
Very great/great extent: 35%;
Little to no/some extent: 33%.
Meet state reporting requirements;
Very great/great extent: 22;
Little to no/some extent: 59.
Analyze student outcomes;
Very great/great extent: 22;
Little to no/some extent: 53.
Make budgetary decisions;
Very great/great extent: 16;
Little to no/some extent: 61.
Analyze program effectiveness;
Very great/great extent: 20;
Little to no/some extent: 45.
Compare outcomes within states;
Very great/great extent: 20;
Little to no/some extent: 51.
Source: GAO, Survey of States on the Department of Education's
Performance Based Data Management Initiative (PBDMI).
[End of table]
Many States Are Not Prepared to Meet Education's Data Requests:
As of June 3, 2005, only 9 states had submitted more than half of the
requested 2003-2004 school year data, while 29 states had submitted
less than 20 percent (see fig. 5). Although PBDMI officials said they
will wait until August 2005 for states to submit the 2003-2004 data,
they also acknowledged that many states would not be able to provide
significant portions. The lack of state data is particularly acute in
some programmatic areas. For example, many states have been unable to
provide data on homeless and migrant students or students with limited
English proficiency. States told Education officials early in the
process that changes to state data collection processes, systems, and
definitions would be needed to provide these types of information.
Figure 5: School Year 2003-2004 Performance Data Submitted by States to
PBDMI as of June 3, 2005:
[See PDF for image]
[End of figure]
We found that there were various reasons why states could not provide
data. Some states reported that they wanted better documentation from
the department in areas such as clarifying established data definitions
and file format specifications needed to transmit data. States needed
to make major modifications to their existing data collection and
reporting processes in order to provide new information required by
PBDMI. States also reported that they would not provide certain data
elements that were inapplicable, hard to collect, or available
elsewhere. Some also reported that there was still some confusion over
multiple or unclear definitions. Department officials said that many
states had initially overestimated their capabilities and that the data
states said would be available differed greatly from what they have
produced thus far. States have also noted competing demands for their
time and resources stemming from NCLBA. Some states reported they
lacked resources, such as staff and money, to implement changes
specific to the initiative. Fifty-six percent of the state survey
respondents said that all or a portion of the $50,000 in grants they
received from Education was used to contract for additional personnel,
and a quarter of the states said that these funds were used to
improve their information systems. Some states noted, however, that
these funds were insufficient to make changes necessary for their
participation in PBDMI.
Recognizing that obtaining state data has been problematic, Education
has recently developed a preliminary strategy for working more closely
with states to ensure that it obtains 100 percent of the requested data
from all states. While the strategy is not final, Education is
currently considering actions such as issuing regulations requiring
states to submit PBDMI data and exempting states that provide
acceptable amounts of "high quality" data under PBDMI from existing
data collections. For
example, states that submit data to PBDMI that are also currently
collected through the Consolidated State Report--one of many data
collections required under the NCLBA--would not have to submit the same
data under this data collection. Officials have also tentatively
proposed collecting data of lesser quality that are readily available
and obtaining data through other systems to supplement what has been
provided thus far. It is not clear to what extent this proposal would
undermine efforts to improve data quality and maintain program office
buy-in. Another option under consideration at Education is to
target departmental resources, such as $25 million in grants for system
improvements from the Institute of Education Sciences, at states that
actively participate in PBDMI.
Education Is Proceeding with Implementation despite the Data Shortage
and without a Detailed Plan of Action:
Education is proceeding with efforts toward full implementation of
PBDMI--using the data for analysis and reporting--despite the limited
amount of data collected. To do so, program offices decide whether the
quality of the data (in terms of accuracy, consistency, timeliness, and
utility) collected through PBDMI meets their needs. Once program
offices validate the quality of the data, Education would begin to
phase out existing data collections. Additionally, staff will be
trained on how to access and use the data collected to date.
Originally, Education expected to complete all of these activities by
the spring of 2005. To the degree that it has been able to proceed, the department
has developed a set of quality checks designed to ensure the accuracy
and completeness of the data states submit.
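As an illustration of the kinds of accuracy and completeness checks
described above, the sketch below (in Python) flags required elements
missing from a submission and percentage values outside the valid
range. The rules shown are hypothetical examples for exposition, not
PBDMI's actual quality checks.

# Hypothetical examples of accuracy and completeness checks; these
# rules are illustrative, not PBDMI's actual quality checks.

def check_completeness(submission, required_elements):
    """Return required elements missing from a state's submission."""
    return [name for name in required_elements if name not in submission]

def check_accuracy(submission):
    """Return percentage-valued elements outside the valid 0-100 range."""
    return [name for name, value in submission.items()
            if name.endswith("_pct") and not 0 <= value <= 100]

submission = {"math_proficiency_pct": 104.0}  # out-of-range value
print(check_completeness(submission,
                         ["math_proficiency_pct", "reading_proficiency_pct"]))
# ['reading_proficiency_pct']
print(check_accuracy(submission))  # ['math_proficiency_pct']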
Nevertheless, two program offices--which, as two of the seven principal
offices included in the initiative, have a role in determining whether
the data are accurate and complete for their purposes--expressed
concern that PBDMI's procedures to ensure data quality may not be
adequate. An official in the Office of Special
Education and Rehabilitative Services (OSERS), which has collected
almost 30 years of longitudinal data about the effectiveness of the
nation's special education programs, told us that PBDMI had been
provided with information about the nearly 200 data quality checks used
in special education collections, but was not sure that PBDMI adopted
them all. PBDMI officials said they adopted those that were universally
relevant. Further, this official expressed concern that PBDMI would not
meet the office's special needs. Specifically, unlike other program
offices, OSERS programs base student assessment on age as opposed to
grade level attained. Additionally, this official was concerned about the
timeliness of the data collected through PBDMI because that office
generated a number of congressionally mandated reports at specific
times of the year. Consequently, this office plans to compare the
quality of its own data with the data collected through PBDMI.
Officials in the Office for Civil Rights expressed similar reservations
about PBDMI's administration of the office's large elementary and
secondary survey of schools and districts, which is used to assess
compliance with civil rights laws and identify trends. Historically,
district superintendents have responded to this survey in large enough
numbers to allow Education to generalize from its findings with a high
degree of confidence.[Footnote 9] However, when PBDMI administered the survey,
fewer superintendents responded and, according to the Office for Civil
Rights, PBDMI did not have a readily available plan that adequately
outlined steps needed to raise the response rate. As of June 10, 2005,
the response rate for this survey was lower than previous surveys.
Final implementation has also been hampered by delays in training and
delivery of the analysis and reporting system. Both are more than a
year behind schedule. An official responsible for overseeing the
training efforts told us that staff could not focus on training because
considerable time was spent addressing states' problems submitting data
through PBDMI. The data analysis and reporting system is more than a
year behind schedule because of the lack of data and the failure of
Education's contractor to meet its scheduled delivery of the system.
Education officials now expect to fully implement the system by March
31, 2006. In the meantime, pending completion of the data analysis and
reporting system and of training, PBDMI has offered staff presentations
previewing the new system's capabilities and keeping them apprised of
the initiative's progress.
Despite the many obstacles confronting the PBDMI, Education officials
said they expect to proceed with implementation of the initiative,
albeit with some activities postponed. In August, project officials
developed a preliminary strategy designed to address the problem of
collecting data from the states, such as providing exemptions from
certain reporting requirements for some states. However, this strategy
has not been finalized, and Education has not developed a specific plan
of action for how it might (1) help states that are deficient, (2)
deal with state expectations for phasing out the multiple data
collections, or (3) meet the expectations of its own program offices.
Conclusions:
The PBDMI represents an important step forward for the Department of
Education in its efforts to monitor the performance of the nation's
elementary and secondary schools. By developing the ability to collect
data that are more accurate, timely, consistent, and focused on key
national performance indicators, Education will be much better
positioned to make its many policy and programmatic decisions. By
asking for a clearly defined set of information that is submitted only
once, the initiative has the potential to substantially reduce states'
reporting burden for elementary and secondary programs and to help
states develop better data systems. However, PBDMI is an
ambitious and risky undertaking that requires the continued cooperation
of a number of internal and external stakeholders.
For PBDMI to be successful, the department must rely on states to
provide new information at a time when they are busy implementing
large new federal initiatives, such as the No Child Left Behind Act.
While some states have been able to provide significant amounts of
data, others continue to lag far behind. Success requires that all
states submit timely, reliable, accurate, and consistent data.
Consequently, it is important for the department to have a clear plan
for assisting states that have problems providing data and to continue
to offer a proper combination of support and incentives for states to
participate. Having worked closely with the states on their collection
systems, PBDMI officials have the information they would need to
develop a plan of action to help move states forward.
Because PBDMI represents a significant change in the way the Department
of Education conducts business, it can be accomplished effectively and
efficiently only if accompanied by a change in management practices.
However, program offices still retain much discretion over what data
they will collect, how they will define those data, and whether PBDMI's
data will meet their needs. While it is the initiative's responsibility
to make sure it collects data that meet the program offices'
requirements, PBDMI is also responsible for developing a data
collection system focused on program performance and quality data. To
the extent that programmatic differences, such as those over data
definitions, inhibit PBDMI's goals, there should be a clear process for
reconciling those differences. If PBDMI truly represents a new way of
doing business, Education should be able to ensure that its
organizational units go along. It is difficult to see PBDMI achieving
its full potential without a clear process for furthering the
initiative's goals.
Fundamental to the success of any large, complex effort is a well-
thought-out plan that tracks its progress against a set of clearly
defined and
measurable goals. PBDMI has not put in place such a planning and
tracking system. State governments and Education's program offices have
devoted much time, effort, and money participating in PBDMI with the
idea that they would see benefits as a result. A lack of demonstrated
progress and benefits potentially erodes state support, undermining the
viability of this important initiative. Some states are already
beginning to lose sight of the potential benefits of PBDMI. As the
department goes past its original completion deadline, it is important
for it to lay out a clear plan for how it will now proceed.
Recommendations for Executive Action:
To address the issues we have identified with regard to planning,
decision-making, and improving data quality, we recommend that the
Secretary of Education develop:
* a strategy to help states improve their ability to provide quality
data, given the challenges that many states face in doing so;
* a clear process for reconciling differences between the program
offices and the PBDMI oversight office to ensure that decisions
critical to the success of PBDMI are made; and
* a clear plan for completing final aspects of PBDMI, including
specific time frames and indicators of progress toward the initiative's
goals.
Agency Comments and Our Evaluation:
We received written comments on a draft of this report from the
Department of Education. Education agreed with our findings and
recommendations and stated that it has devoted additional resources to
the initiative and plans to issue a detailed project plan that outlines
the steps needed to complete the initiative. These comments are
reprinted in appendix II.
Education also provided technical corrections and comments that we
incorporated where appropriate.
We are sending copies of this report to the Secretary of Education, the
Office of Strategic Accountability Services, the Director of the Office
of Management and Budget, and appropriate congressional committees.
Copies will also be made available to other interested parties upon
request. Additional copies can be obtained at no cost from our Web site
at www.gao.gov.
If you or your staff have any questions, please contact me at (415)
904-2272 or bellisd@gao.gov. Contact points for our Offices of
Congressional Relations and Public Affairs may be found on the last
page of this report.
GAO staff who made major contributions to this report are listed in
appendix III.
Signed by:
David Bellis:
Director, Education, Workforce and Income Security Issues:
[End of section]
Appendix I: Scope and Methodology:
The objective of our review of the Performance Based Data Management
Initiative (PBDMI) was to assess the progress Education has made in its
implementation of the initiative, particularly with regard to (1)
defining what performance-related data it will collect from states on
behalf of the program offices, (2) assisting states in their efforts to
submit quality information, and (3) utilizing performance-related data
to provide enhanced analytic capacity within the program offices. We
conducted our review between April 2004 and September 2005 in
accordance with generally accepted government auditing standards.
Overall Approach:
To assess the department's progress in each of these areas, we reviewed
documents relating to the implementation of the initiative, relevant
laws, and information provided by the Strategic Accountability Service
(SAS), the office responsible for PBDMI, and by others. We interviewed key
staff responsible for the initiative as well as officials in each of
the offices that are participating in PBDMI. We also interviewed senior-
level Education officials to determine their role in the implementation
of PBDMI. To gain insight into state perspectives on the initiative, we
administered a Web-based survey to state officials responsible for
providing these data to Education. We received responses from 50 of the
52 state data coordinators surveyed, including Puerto Rico's. We also
interviewed a variety of external stakeholders, including an official
from the Council of Chief State School Officers, as well as a data
standards organization and three contractors involved in the
initiative. We also reviewed previously issued reports
by Education's Office of the Inspector General (IG) as well as GAO
reports and testimonies.
In addition to interviewing departmental officials, we reviewed
documentation on the initiative to gain a better understanding of what
actions Education was undertaking to implement its goals, including the
initiative's data quality contract, data dictionary, and business
plans, as well as the justification reports to the Office of Management
and Budget (OMB) that the Paperwork Reduction Act requires for data
collections. We also reviewed summary information about the state
performance data obtained from the 2004 site visits to states in order
to analyze what those efforts had yielded.
Education provided information on states' submission of requested data
elements to PBDMI as of June 3, 2005. States were expected to provide
data for 64 data elements, including dropout rates; student performance
on reading, science, and writing assessments; teacher certification;
and many others. For each of these elements, Education determined
whether each state had submitted the information, had not submitted it,
or did not collect it. We incorporated
into our report Education's calculated percentages of elements
submitted for each state. We determined that these data were sufficient
for the purposes of this engagement.
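To illustrate the kind of calculation involved, the following minimal
Python sketch shows one way per-state submission percentages could be
computed from a per-element status table. The element names, statuses,
and data structure here are hypothetical and are not drawn from
Education's actual system.

    # Hypothetical sketch: per-state percentage of the 64 requested
    # data elements submitted, from a (state, element) -> status table.
    from collections import defaultdict

    TOTAL_ELEMENTS = 64

    # Status values mirror the three outcomes described above.
    submissions = {
        ("WY", "dropout_rate"): "submitted",
        ("WY", "reading_assessment"): "not_submitted",
        ("NC", "dropout_rate"): "submitted",
        ("NC", "teacher_certification"): "did_not_collect",
        # ... one entry per state per element, 64 elements in all
    }

    submitted_counts = defaultdict(int)
    for (state, _element), status in submissions.items():
        if status == "submitted":
            submitted_counts[state] += 1

    for state in sorted(submitted_counts):
        count = submitted_counts[state]
        pct = 100 * count / TOTAL_ELEMENTS
        print(f"{state}: {count}/{TOTAL_ELEMENTS} elements ({pct:.0f}%)")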
In order to document the burden hours associated with certain
elementary and secondary data collections, we accessed 14 data
collection justifications prepared by the department's program offices
and submitted to the chief information officer. These reports
had received OMB approval or were seeking approval to collect data from
states and others. We talked with an official responsible for
maintaining these documents at the department's Web site to verify that
these were the most recent data available for analysis. From each
document we obtained the estimated state burden hours and costs and
federal administrative costs associated with each data collection. Each
estimate was based on a formula that we adjusted to reflect these costs
for the 52 states participating in the initiative. In some instances
where an average was used, we assumed that the 52 states were similar
in characteristics to the overall population of states included in
Education's estimates. However, we did not find it feasible to prorate
the formulas for the federal administrative costs (based on 52 states)
for each of the collections. A statistician verified each of the
calculated estimates for accuracy.
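As a simple illustration of the proration described above, the sketch
below applies a per-respondent average from a hypothetical OMB filing
to the 52 participating states. The figures are invented for the
example, and the calculation rests on the same assumption stated above:
that the 52 states resemble the overall respondent population.

    # Hypothetical proration of an OMB-filed burden estimate to the 52
    # states participating in PBDMI. Figures are illustrative only.
    def prorate_burden(total_hours: float, total_respondents: int,
                       participating: int = 52) -> float:
        """Apply the per-respondent average to the participating states."""
        return (total_hours / total_respondents) * participating

    # A collection filed as 5,600 burden hours across 56 respondents:
    print(prorate_burden(5_600, 56))  # 5200.0 hours for the 52 states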
We also surveyed all 52 state data coordinators using a Web-based
survey instrument in order to obtain their perspectives on various
aspects of the initiative. Our survey instrument was developed based on
information obtained during interviews with state data coordinators in
Pennsylvania, Virginia, Washington, and Oregon. Additionally, other
internal stakeholders specializing in technology and education were
asked to review and comment on our draft survey instrument. The survey
was pre-tested with Wyoming, North Carolina, and Illinois to determine
if the questions were clear and unbiased and whether the terms were
accurate and precise. We included these three states in our pretests
because they varied in size and technical capacity for data
transmission as determined by an earlier Education survey. Based on
their comments, we refined the questionnaire as appropriate.
Our final survey instrument asked a combination of questions that
allowed for closed-ended as well as open-ended responses and included
questions about state perspectives on PBDMI's ability to achieve its
goals. The survey was conducted using a self-administered electronic
questionnaire posted on the Internet. We sent e-mail notifications
about the upcoming survey to all 52 state data coordinators (50 states,
the District of Columbia, and Puerto Rico) on November 15, 2004, and
activated the survey shortly thereafter. Each potential respondent was
provided a unique password and username by e-mail to limit
participation to members of the target population. To encourage
respondents to complete the questionnaire, we sent an e-mail message to
prompt each non-respondent approximately 2 weeks after the survey was
activated and followed up by e-mail or phone with each non-respondent
several times thereafter. We closed the survey on January 21, 2005,
after the 50th respondent had replied.
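For reference, the final response rate implied by these figures (50
responses from the 52 coordinators contacted) can be computed directly,
as in the trivial sketch below.

    # Final survey response rate from the figures above.
    recipients = 52   # 50 states, the District of Columbia, and Puerto Rico
    responses = 50
    print(f"{responses / recipients:.1%}")  # 96.2%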
Because this was not a sample survey, there are no sampling errors.
However, the practical difficulties of conducting any survey may
introduce errors, commonly referred to as non-sampling errors. For
example, difficulties in how a particular question is interpreted, in
the sources of information available to respondents, or in how the
data are entered into a database or analyzed can introduce unwanted
variability into the survey results. We took steps in the
development of the survey instrument, the data collection, and the data
analysis to minimize these non-sampling errors. For example, a survey
specialist designed the survey instrument in collaboration with GAO
staff with subject matter expertise. Then, as stated earlier, it was
pre-tested to ensure that the questions were clear, unbiased, and
accurate. When the data were analyzed, a second, independent analyst
checked all computer programs. Because this was a Web-based survey,
respondents entered their answers directly into the electronic
questionnaire, eliminating the need to have the data keyed into a
database, thus removing an additional source of error. [Footnote 10]
[End of section]
Appendix II: Comments from the Department of Education:
UNITED STATES DEPARTMENT OF EDUCATION:
ASSISTANT SECRETARY:
OFFICE OF PLANNING, EVALUATION AND POLICY DEVELOPMENT:
October 17, 2005:
David D. Bellis:
Director, Education, Workforce, and Income Security Issues:
United States Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Mr. Bellis:
Thank you for providing the Department of Education with a draft copy
of the U.S. Government Accountability Office's report entitled,
"Education's Data Management Initiative: Significant Progress Made but
Better Planning Needed to Accomplish Project Goals" (GAO-06-6). We
agree with your recommendations and value your observations on this
continuing initiative.
The Department's performance-based data management initiative is
essential to the Department's efforts to improve the use of data to
inform policy making and program management, to increase the focus on
student achievement outcomes rather than on process, to reduce the
burden on States of reporting data to the Department, and to improve
the accuracy, timeliness, and utility of data collected. As you note in
the report, the initiative is an ambitious undertaking, and we
recognize the important challenges to achieving our goals.
You recommend that the Department develop 1) a strategy to help States
improve their capability to provide quality data, 2) a process within
the Department to resolve critical, outstanding issues, and 3) a clear
plan for completing final aspects of the initiative. We have also
identified these as priority focus areas and began to address them
shortly after I joined the Department this summer. Since that time, we
have committed additional leadership and financial resources to the
initiative, among other actions, and have begun developing the detailed
project plan needed to successfully complete the initiative. We will
include a full description of planned actions in our corrective action
plan.
Sincerely,
Signed by:
Tom Luce:
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact:
David Bellis (415) 904-2272:
Acknowledgments:
In addition to the contact named above, the following individuals made
important contributions to this report: Bryon Gordon, Assistant
Director; Carla Craddock, Analyst-in-Charge; Susan Bernstein; David
Dornisch; Mary Dorsey; Kimberly Gianopoulos; Brandon Haller; Stuart
Kaufman; Jonathan McMurray; Valerie Melvin; James Rebbe; Gloria
Hernandez Saunders; Kimberly Siegel; Michelle Verbrugge; and Elias
Walsh.
FOOTNOTES
[1] The PRA was originally enacted in 1980 and most recently
reauthorized and amended in 1995 (Pub. L. No. 104-13, May 22, 1995).
Generally, the law requires each agency's chief information officer
(CIO) to review program offices' proposed collections to ensure that
they meet PRA standards before submission to the Office of Management
and Budget (OMB) for its approval. See also GAO, Paperwork Reduction
Act: New Approach May be Needed to Reduce Government Burden on Public,
GAO-05-424 (Washington, D.C.: May 20, 2005). The scope of this report
did not include a review of Education's compliance with the PRA.
[2] One earlier attempt, known as the Integrated Performance
Benchmarking System, was a two-state demonstration project designed to
consolidate department reporting requirements, but was terminated in
2000 without an assessment of its feasibility.
[3] Two collections that were formerly administered by the offices for
Civil Rights and Special Education and Rehabilitative Services have
already been subsumed into PBDMI.
[4] This approval will expire on September 30, 2005. Currently,
Education is seeking approval for further PBDMI data collections
beginning in 2006 through 2008. OMB's approval of this extension of
PBDMI data collection efforts was pending as of the end of August 2005.
Additionally, any subsequent data collections would also be subject to
the PRA process, including CIO review and OMB approval.
[5] Office of Elementary and Secondary Education, "Criteria for
Inclusion of Data Elements in the Consolidated State Report," 2005.
[6] Some of the work to establish a consistent standard has been
ongoing throughout the department prior to the development and
implementation of PBDMI such as the work undertaken by Education's
National Center on Education Statistics to establish a departmentwide
data dictionary. PBDMI has a role in contributing to some of these
other efforts.
[7] According to department officials, all 50 states, the District of
Columbia, and Puerto Rico accepted grants in 2003; in 2004, 46 states
received another $50,000. The grants were awarded solely on the
basis of state participation in PBDMI, and states were allowed wide
latitude in their usage.
[8] Past reports issued by GAO (published jointly with other education
agencies) and Education's Inspector General (IG) document that
inadequate data quality practices by schools and districts have
adversely affected the states' ability to produce quality data. In
2002, GAO and Education's IG reported that states had problems entering
accurate data and lacked sufficient supervisory review procedures to
check data received from schools and districts. GAO et al., A Joint
Audit Report on the Status of State Student Assessment Systems and the
Quality of Title I Accountability Data, SAO-02-064 (Austin, Tex.:
2002). OIG, Department of Education, Improving Title I Data Integrity
for Schools Identified for Improvement, ED-OIG/A03-B0025 (Philadelphia,
Pa.: March 2002).
[9] Officials have told us that historically 97 percent or more of
randomly selected school districts have responded to this survey.
[10] Source: GAO Intranet, ARM Guidance, "Evaluating and Reporting of
Non-sampling Errors in Surveys."
GAO's Mission:
The Government Accountability Office, the investigative arm of
Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director,
NelliganJ@gao.gov
(202) 512-4800
U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548: