This is the accessible text file for GAO report number GAO-10-644
entitled 'Department Of Education: Improved Dissemination and Timely
Product Release Would Enhance the Usefulness of the What Works
Clearinghouse' which was released on July 23, 2010.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Committees:
United States Government Accountability Office:
GAO:
July 2010:
Department Of Education:
Improved Dissemination and Timely Product Release Would Enhance the
Usefulness of the What Works Clearinghouse:
GAO-10-644:
GAO Highlights:
Highlights of GAO-10-644, a report to congressional committees.
Why GAO Did This Study:
In connection with the Omnibus Appropriations Act, 2009, GAO was
required to study the What Works Clearinghouse (WWC), a federal source
of evidence about effective education practices. Operating through a 5-
year contract awarded by the U.S. Department of Education's Institute
of Education Sciences (IES), the WWC reviews education research and
disseminates its findings. GAO examined: (1) the extent to which the
WWC review process meets accepted standards for research evaluation
and how the WWC has responded to recommendations and criticism, (2)
how WWC output and costs have changed over time and how its
performance is measured, and (3) how WWC products are disseminated and
how useful educators find them to be. To conduct its work, GAO
reviewed WWC-related documents, analyzed cost and performance data,
surveyed all states and a nationally representative sample of school
districts, and interviewed IES officials, WWC contractors,
researchers, and others.
What GAO Found:
GAO, as well as a congressionally mandated panel of experts, found that
the WWC's review process, which includes screening studies to
determine if they meet WWC criteria, follows accepted standards for
evaluating research on the effectiveness of education interventions.
WWC is responding to recommendations made by the expert panel to
further improve its review and reporting processes. For example, the
panel recommended improvements in the way the WWC presents information
to readers on the reasons why studies do not qualify for review. The
WWC is revising a report template to include a table summarizing which
studies met or did not meet WWC criteria for evaluating research. The
WWC has also responded to researchers who have criticized the WWC for
presenting limited information because its screening criteria exclude
some rigorous research designs that may be appropriate for evaluating
certain education programs, such as special education. The WWC
responded to this criticism by creating new standards that include two
additional study designs and by creating a new product, called a
practice guide, which includes a wider range of research.
WWC's report output and scope increased under the current contract.
For example, the WWC increased its production of various reports,
introduced new products, and developed new processes for evaluating
research. However, IES had a substantial backlog in its product review
process from January 2009 to May 2010. The backlog generally decreased
the timeliness of WWC reports, with 20 reports being delayed by up to
6 months. To support the increases in output and scope, WWC's costs
doubled from the previous contract to the current one. Both contracts
designated about 60 percent of costs to production, while the other 40
percent of costs supported other tasks, such as communications,
dissemination, and process development. IES's performance goals for the
WWC primarily relate to the number of reports produced. However, IES
has not developed performance measures related to the cost or
usefulness of WWC products.
Education uses WWC contractors, Regional Educational Laboratories
(RELs), and the Doing What Works (DWW) Web site to disseminate
information about WWC products; however, awareness and use of the WWC
vary among states, districts, teachers, and principals. WWC
contractors disseminate product information in various ways, including
e-mail alerts and presentations. The RELs host events featuring WWC
products for state, district, and school officials, and DWW provides
resources to educators based on WWC products. Officials from 33 of the
38 state education agencies that responded to GAO's survey and an
estimated 42 percent of school districts have heard of the WWC. Those
states and school districts generally used the WWC to a small or
moderate extent to inform decisions on effective practices. States and
school districts also reported that they would likely increase their
use of the WWC if it included a broader array of information or more
timely information.
What GAO Recommends:
GAO recommends that IES: develop and implement strategies to avoid
backlogs in WWC product reviews; establish performance measures
related to costs and usefulness; and improve dissemination efforts to
promote awareness and use of the WWC. Education generally agreed with
GAO's recommendations.
View [hyperlink, http://www.gao.gov/products/GAO-10-644] or key
components. For more information, contact Cornelia Ashby at (202) 512-
7215 or ashbyc@gao.gov.
[End of section]
Contents:
Letter:
Background:
WWC Reviews Research in Accordance with Accepted Standards and Has
Responded to Recommendations and Criticisms:
WWC's Output and Costs Increased; However, IES Has Not Developed
Adequate Performance Measures Related to Cost or Product Usefulness:
Education Has Three Primary Ways to Disseminate Information about WWC
Products, but Awareness and Use Vary among Target Audiences:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Other Sources of Information Districts Use To Identify
Effective Education Practices:
Appendix III: IES and WWC Response to Expert Panel Recommendations:
Appendix IV: Comments from the Department of Education:
Appendix V: GAO Contact and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Products and Registries Available on the WWC Web Site:
Table 2: Two Study Designs That Meet the WWC Standards with or without
Reservations:
Table 3: New WWC Publications and Reports:
Table 4: Task Category Definitions and Changes between Contracts:
Table 5: Estimates and Confidence Intervals for Figure 8:
Table 6: Estimates and Confidence Intervals for Figure 10:
Table 7: Estimates and Confidence Intervals for Figure 12:
Table 8: Estimates and Confidence Intervals for Figure 14:
Table 9: Conferences Attended to Administer Questionnaires to Teachers
and Principals:
Figures:
Figure 1: WWC Research Review Process for Interventions:
Figure 2: Percentage of Interventions with Positive or Potentially
Positive Ratings Categorized by the Amount of the Evidence Supporting
Those Ratings:
Figure 3: Studies Reviewed That Meet WWC Evidence Standards:
Figure 4: Publication Quantities, by Contract Year (CY) for Current
WWC Contract:
Figure 5: Average Time for IES Peer Reviews of Released Intervention
Reports and Quick Reviews, by Contract Year (CY) and Quarter (Q) for
Current WWC Contract:
Figure 6: IES Peer Review Backlog for Intervention Reports and Quick
Reviews, by Contract Year (CY) and Quarter (Q) for Current WWC
Contract:
Figure 7: WWC Costs, by Task Categories and Contracts:
Figure 8: Sources from Which District Officials Heard of the WWC:
Figure 9: Extent to Which States Use WWC for Various Purposes:
Figure 10: Extent to Which School Districts That Have Used the
Clearinghouse Used It for Various Purposes:
Figure 11: Extent to Which States Use Specific WWC Products:
Figure 12: Extent of Specific Product Use among Districts That Use the
Clearinghouse:
Figure 13: Number of States That Reported They Would Likely Increase
Their Use of WWC Given Certain Changes:
Figure 14: Estimated Percent of School Districts That Have Accessed
the WWC That Would Likely Increase Their Use of the WWC Given Various
Changes:
Figure 15: GAO's Web-based Survey of State Departments of Education
and Local Educational Agencies in the 50 States and the District of
Columbia:
Abbreviations:
AYP: adequate yearly progress:
DWW: Doing What Works:
Education: U.S. Department of Education:
ESEA: Elementary and Secondary Education Act of 1965:
IES: Institute of Education Sciences:
LEA: local educational agency:
Recovery Act: American Recovery and Reinvestment Act of 2009:
REL: Regional Educational Laboratories:
WWC or Clearinghouse: What Works Clearinghouse:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
July 23, 2010:
Congressional Committees:
The U.S. Department of Education's What Works Clearinghouse (WWC or
Clearinghouse) was established as a federal source of scientific
evidence about "what works" in education. The Institute of Education
Sciences (IES), a division of the U.S. Department of Education
(Education), created the WWC in 2002, in part to help educators
identify and use scientifically-based practices as specified in the
Elementary and Secondary Education Act of 1965 (ESEA).[Footnote 1] The
WWC, which is operated by an independent contractor, conducts
systematic reviews of education research and disseminates information
on its Web site about the effectiveness of the practices reported in
these research studies. Currently operating under a $50 million 5-year
contract, the Clearinghouse has generated criticism in the education
research evaluation field on the timeliness of its reviews, its
standards for study inclusion, and the methodological soundness of its
research review process.[Footnote 2]
An explanatory statement submitted in lieu of a conference report for
the Omnibus Appropriations Act, 2009, directed GAO to examine how the
WWC reviews education research and to address concerns about the
operation, cost, and usefulness of the WWC.[Footnote 3] Specifically,
GAO was required to determine whether the WWC review process met
current standards for evaluating research and to examine the output
and cost for completing reviews, the degree of consistency of review
procedures across the various topics addressed, and the usefulness of
the Clearinghouse for practicing educators. To conduct this work, we
examined (1) the extent to which the WWC review process meets accepted
standards for research evaluation and how the WWC has responded to
recommendations and criticisms of its processes, (2) how the WWC's
output and costs have changed over time and how IES measures WWC
performance, and (3) how WWC products are disseminated and how useful
education professionals find them to be.
To address all of our objectives, we interviewed and obtained
information from IES officials and the current and former WWC
contractors, as well as representatives from various educational
organizations. In addition, to address objective 1, we reviewed a
prior GAO report that examined WWC procedures and standards; an expert
panel report that previously assessed the validity of the WWC review
process; relevant literature; and procedures used by other
organizations that conduct systematic reviews of research. We also reviewed the
Clearinghouse's response to the expert panel and to specific
criticisms in education research literature. To determine how
performance and costs changed over time (objective 2), we analyzed the
costs and productivity of the WWC contractors by reviewing budget,
expenditure, and performance data. For objective 3, we administered a
Web-based survey to state education agencies in all 50 states and the
District of Columbia and a nationally representative sample of school
districts;[Footnote 4] interviewed IES's 10 Regional Educational
Laboratories; and gathered nongeneralizable information from teachers
and principals at four conferences. Appendix I explains our scope and
methodology in more detail. We performed our work from September 2009
to July 2010 in accordance with generally accepted government auditing
standards. Those standards require that we plan and perform the audit
to obtain sufficient, appropriate evidence to provide a reasonable
basis for our findings and conclusions based on our audit objectives.
We believe that the evidence obtained provides a reasonable basis for
our findings and conclusions based on our audit objectives.
Background:
The mission of the WWC is to be a central source of scientific
evidence for what works in education.[Footnote 5] To accomplish this,
the WWC reviews existing education research and posts information
based on its reviews on the WWC Web site, http://ies.ed.gov/ncee/wwc/.
The types of products currently available on the WWC Web site are
described in table 1.
Table 1: Products and Registries Available on the WWC Web Site:
Product: Intervention reports;
Description: Summarize all of the research reviewed for a particular
intervention within a topic area. Each report offers an overview of
the intervention, summarizes all relevant research, and provides a
rating of effectiveness. Studies featured in intervention reports must
meet WWC evidence standards with or without reservations;
Number: 130;
Example: Accelerated Reader: WWC reviewed the evidence pertaining to
the effectiveness of this specific curriculum with respect to certain
reading outcomes.
Product: Practice guides;
Description: Contain recommendations for educators to address
challenges in their classrooms. Assign strength of evidence ratings to
each recommendation (strong, moderate, low). Rely to some extent on
expert opinion;
Number: 12;
Example: Structuring Out-of-School Time to Improve Academic
Achievement: WWC published general recommendations on how to design
out-of-school time programs that will increase student learning.
Product: Quick reviews;
Description: Assess the quality of research evidence from single
studies recently featured in the media to determine if they meet WWC
evidence standards;
Number: 40;
Example: Recess and Classroom Behavior: WWC reviewed a study profiled
in the news that examined whether providing daily recess to third
graders improves their classroom behavior.
Product: Multimedia;
Description: Audio files, video files, presentations, and transcripts
from WWC events;
Number: N/A;
Example: Reducing Behavior Problems in the Elementary School
Classroom: WWC held a webinar featuring a practice guide on this topic.
Product: Registry of evaluation researchers;
Description: An online database of researchers who conduct evaluations
of the effectiveness of educational interventions to help schools,
school districts, and educational program developers identify
potential researchers;
Number: N/A;
Example: Individual researchers and various organizations.
Product: Registry of randomized controlled trials;
Description: An online database of completed and in-progress
randomized controlled trials in education. This resource is designed
to help schools, school districts, and educational program developers
identify research regarding the effectiveness of educational
interventions;
Number: N/A;
Example: A Randomized Trial of Two Promising Interventions for
Students with Attention Problems: WWC included this randomized
controlled trial in its registry.
Source: GAO analysis of WWC information.
Note: This table summarizes the WWC products and registries available
as of May 18, 2010. The Clearinghouse previously published topic
reports summarizing findings from all studies on all relevant
interventions for a particular topic, such as beginning reading. These
reports were replaced by dynamically generated summaries of evidence.
[End of table]
In addition to the Clearinghouse, Education provides other technical
assistance and research-related resources to assist states, districts,
and schools. Examples of research-related resources include the
Regional Educational Laboratories (REL) and the Doing What Works (DWW)
Web site (http://dww.ed.gov):
Regional Educational Laboratories. IES's Regional Educational
Laboratory Program is a network of 10 laboratories that conduct
research and provide policymakers and practitioners with expert
advice, training, and technical assistance on how to interpret
findings from scientifically valid research.[Footnote 6]
Doing What Works. Led by Education's Office of Planning, Evaluation
and Policy Development, DWW is a Web-based resource intended to help
teachers, schools, districts, states, and technical assistance
providers implement research-based instructional practice.
Initial Years of the What Works Clearinghouse:
In 2002, IES awarded a $27 million 5-year contract to the initial
contractors to operate the Clearinghouse.[Footnote 7] The WWC
contractors developed the Clearinghouse's research review standards
with IES and reviewed research related to topic areas considered to be
pressing issues in education.[Footnote 8] One of the goals of the
Clearinghouse was to promote informed education decision making
through a Web-based dissemination system that featured rigorous
reviews of studies on the effectiveness of educational interventions.
The WWC experienced a slow start due in part to the amount of work
involved in developing a research review and reporting process that
was valid, transparent, and replicable, according to the initial
contractors. In developing the research review process, the
contractors and IES addressed over 60 technical issues, such as
determining what constitutes an acceptable level of participant loss
(attrition) from a study and what methods should be in place to
accommodate common education research techniques. In addition, initial
plans for topic areas and reporting formats were modified. For
example, IES decided to drop one planned topic area because IES
officials determined it to be too broad.[Footnote 9] The WWC and IES
also spent a substantial amount of time developing and refining a
reporting format to communicate research results to a lay audience. As
a result, the WWC began releasing reports in 2006. By September 2007,
the WWC had released 89 intervention reports, six topic reports, and
three practice guides.
WWC Research Review Process for Intervention Reports:
The WWC uses a three-step review process to assess the quality of
studies and report on what the research indicates about the
effectiveness of interventions. The WWC definition of interventions
includes programs (such as whole school reform), products (such as a
textbook or curriculum), practices (such as mixed-age grouping), or
policies (such as class size reduction).[Footnote 10] The process
begins with an initial screening of published and unpublished studies
relevant to the intervention being reviewed. Studies are collected
from electronic databases, journals, conference proceedings, and
nominations solicited from the general public. The studies that pass
initial screens are reviewed to determine whether they provide valid
evidence of an intervention's effectiveness. Using these studies, the
WWC then synthesizes the evidence about the effectiveness of an
intervention and publishes a report describing its findings. The
Clearinghouse categorizes interventions as having positive
effects, potentially positive effects, mixed effects, no discernible
effects, potentially negative effects, or negative effects (see figure
1).
Figure 1: WWC Research Review Process for Interventions:
[Refer to PDF for image: illustration]
Step 1: Initial screening:
Is the study:
* Randomized or quasi-experimental?
* Published within 20 years of the beginning of the topic area review?
* Focused on an intervention relevant to the topic under review?
And does the study:
* Target students in the topic area‘s age or grade range and specified
location?
* Focus on populations relevant to the topic area (e.g., students with
learning disabilities, English language learners)?
* Report on at least one outcome relevant to the review?
Screened out:
Studies not meeting initial screening criteria.
Step 2: Quality review:
Studies meeting initial screening criteria:
Two Ph.D.-level research analysts independently rate studies using a
codebook that considers study design and execution, validity and
reliability of outcome measures, and data analysis and reporting to
evaluate the strength of the evidence in the study.
Screened out:
Studies not meeting evidence standards.
Step 3: Synthesize evidence:
Studies meeting evidence standards (with or without reservations):
Evidence from all studies meeting standards is synthesized and
summarized for use in reports;
Intervention reports summarize evidence on the effects of a specific
intervention.
Source: GAO analysis of WWC guidelines.
[End of figure]
The WWC uses evidence standards to assess the strengths and weaknesses
of a study's methodology, such as the type of design it uses, the
quality of the study's data, and the appropriateness of the study's
statistical procedures. Until recently, the WWC accepted two types of
study designs--randomized experiments and quasi-experimental studies.
[Footnote 11] Only randomized controlled trials (or randomized
experiments) that WWC has determined to be well-designed and well-
implemented are considered strong evidence and can receive the highest
rating of "meets evidence standards without reservations." The WWC
also considers evidence from quasi-experiments it has determined to be
well-designed and well-implemented. The highest rating a study using
quasi-experimental design can receive is "meets evidence standards
with reservations." This rating category is intended to inform
educators to interpret the study results with caution, as the results
may reflect other factors, in addition to the impact of the
intervention (see table 2).
Table 2: Two Study Designs That Meet the WWC Standards with or without
Reservations:
Study design: Randomized control-group experiments;
Description: Compare the outcomes of groups that were randomly
assigned either to the intervention group or to a nonparticipating
control group before the intervention. Such an assignment helps ensure
that any differences in outcomes can be attributed to the intervention;
Highest rating category WWC will assign if well-conducted, and why:
Meets evidence standards: Considers randomized experiments as the
design that is most likely to yield unbiased estimates of a program's
impact on student outcomes.
Study design: Comparison-group quasi-experiments;
Description: Compare the outcomes of groups in which individuals are
assigned to an intervention or control group in a way that minimizes
observable differences between the groups that could affect outcomes.
The researcher must demonstrate that the groups are equivalent on
observable participant characteristics, such as age, grade level,
prior academic achievement, or pretest results;
Highest rating category WWC will assign if well-conducted, and why:
Meets evidence standards with reservations: Even with equivalent
observable characteristics, there may be differences in other
participant characteristics related to the desired outcomes--for
example, certain family or social structures that are unknown to the
researcher.
Source: GAO analysis of WWC information.
[End of table]
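The screening and rating logic described above can be expressed as a short illustrative sketch. The code below is a simplified rendering of the flow in figure 1 and the design-based rating ceilings in table 2, not WWC software; the field names and the single quality check are assumptions, and the actual WWC codebook applies many additional criteria, such as attrition and the reliability of outcome measures.

```python
# Illustrative sketch only -- not WWC software. Field names and the single
# quality check are assumptions; the actual WWC codebook applies many
# additional criteria (e.g., attrition, outcome reliability, analysis).

def wwc_disposition(study):
    """Return a simplified disposition following the three-step flow in
    figure 1 and the design-based rating ceilings in table 2."""
    # Step 1: initial screening on design, recency, and relevance.
    eligible_designs = {"randomized experiment", "quasi-experiment"}
    if study["design"] not in eligible_designs:
        return "Screened out: ineligible design"
    if not (study["within_20_years"] and study["relevant_topic"]
            and study["relevant_population"] and study["relevant_outcome"]):
        return "Screened out: does not meet initial screening criteria"

    # Step 2: quality review against the evidence standards.
    if not study["well_designed_and_implemented"]:
        return "Does not meet WWC evidence standards"

    # Step 3: the study design caps the highest possible rating (table 2).
    if study["design"] == "randomized experiment":
        return "Meets WWC evidence standards (without reservations)"
    return "Meets WWC evidence standards with reservations"


example = {
    "design": "quasi-experiment",
    "within_20_years": True,
    "relevant_topic": True,
    "relevant_population": True,
    "relevant_outcome": True,
    "well_designed_and_implemented": True,
}
print(wwc_disposition(example))
# Meets WWC evidence standards with reservations
```

In this simplification, only a well-designed and well-implemented randomized experiment can reach the highest rating, while an otherwise comparable quasi-experiment tops out at "meets evidence standards with reservations," mirroring table 2.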
IES Oversight and Support of the WWC:
The WWC is administered by IES through a contract with a private
research organization. IES monitors implementation of the specific
tasks detailed in the WWC contract by reviewing an annual work plan
and monthly performance and expenditure reports submitted by the
contractor. IES tracks implementation of the tasks, completion of
performance goals, and adherence to the budget outlined in the
contractor work plan.[Footnote 12] The contractor monitors the work of
any subcontractors that it uses to perform services such as research
reviews, technological support, and communications support.
IES is also involved in the development and dissemination of WWC
products. IES reviews and approves proposed topics for WWC products,
product formats, and the research review procedures. It also
coordinates a group of independent researchers to peer review WWC
products and reviews and approves all WWC products prior to public
release. IES required the contractor to develop a communications plan
to inform WWC customers about features of the Web site.
WWC Reviews Research in Accordance with Accepted Standards and Has
Responded to Recommendations and Criticisms:
WWC Follows Accepted Review Standards and Is Improving Its Review
Process in Response to a Congressionally Mandated Expert Panel Report:
We found that the WWC review process follows generally accepted
practices for reviewing research. Specifically, GAO's November 2009
report reviewing federally supported efforts to identify effective
interventions found that the WWC determines whether a study provides
credible evidence on effectiveness based on several dimensions,
including the quality of the research design, how the study was
implemented, and other technical considerations.[Footnote 13] Our 2009
report also noted that WWC follows a common approach to conducting its
reviews,[Footnote 14] and provides information to help educators
understand the body of existing research on specific interventions.
[Footnote 15]
Additionally, a congressionally mandated panel of experts found in
October 2008 that WWC's research review process was based on
appropriate methods for judging the strength of the evidence regarding
the effectiveness of interventions.[Footnote 16] For example, the
panel agreed that the minimum qualifications a study must meet in
order to be reviewed by the WWC are appropriate. The panel also found
that WWC's reporting process is reasonable and that the WWC provides
succinct and relevant evidence on the effectiveness of education
interventions. While the panel concluded that the WWC's processes are
generally appropriate, the panel made several recommendations to the
WWC for continued improvement. The recommendations primarily related
to establishing or clarifying procedures, reviewing statistical
methods, and documenting the screening process.
The WWC implemented or is considering implementing 14 of the panel's
17 recommendations.[Footnote 17] The WWC implemented nine
recommendations, in part by modifying some procedures and creating a
procedures and standards handbook.[Footnote 18] For example, in
response to the panel's recommendation that the WWC include a table of
study dispositions (e.g., whether studies meet WWC evidence standards)
at the front of intervention reports, the WWC is modifying the report
template to include a summary table along with the existing listing of
dispositions in the reference section. The WWC also addressed panel
concerns about technical issues in its review process by making its
treatment of study attrition--the rate at which subjects drop out of a
study--more consistent across topic areas. The panel noted that the
WWC's practice of determining acceptable attrition levels by topic
area led to arbitrary inconsistencies across the topic areas. In
response to the panel's recommendation that the WWC reconsider this
practice, the WWC took steps to increase its consistency by developing
attrition guidance that applies to all topic areas.[Footnote 19] (See
appendix III for a table detailing the recommendations, WWC and IES's
response, and the status of any changes made in response to
recommendations.)
In addition, the WWC is considering implementing five other panel
recommendations. For example, the panel raised concerns that the WWC
does not document some potential conflicts of interest for the studies
it reviews. In response to this concern, the WWC is considering
tracking and publishing whether studies of a program are funded or
conducted by the program's developers.[Footnote 20] Further, in
response to the panel's concern that the WWC's screening process may
exclude some eligible studies, the WWC is undertaking an evaluation of
the reliability of its screening process. IES officials told us that
they will postpone decisions on these recommendations until the newly
appointed Commissioner for the WWC is on board and actively involved
in the decision making.
WWC Also Responded to Criticism That It Produces Limited and
Potentially Misleading Information:
Some researchers claim that the WWC presents potentially misleading
information by including brief experiments involving small numbers of
students when evaluating interventions.[Footnote 21] As a result,
according to critics, educators may accept the WWC's rating of the
intervention's effectiveness, even though the evidence behind the
rating is limited. One researcher suggested the WWC emphasize larger
studies that span significant periods of time and set a minimum sample
size requirement. According to WWC staff, such changes would exclude
valuable research and prevent the WWC from providing educators with
research-based information about some interventions.[Footnote 22]
Instead of changing its treatment of sample size and study duration,
the WWC began publishing information on the extent of the evidence
supporting its findings in 2007. The WWC's "extent of evidence" rating
alerts educators when the WWC effectiveness ratings are based on a
small amount of evidence. As figure 2 shows, 76 percent of
interventions with positive or potentially positive ratings of
effectiveness are based on a small amount of evidence.
Figure 2: Percentage of Interventions with Positive or Potentially
Positive Ratings Categorized by the Amount of the Evidence Supporting
Those Ratings:
[Refer to PDF for image: pie-chart]
Small (51 interventions): 76%;
Medium to large (13): 19%;
Not rated (3): 4%.
Source: GAO analysis of WWC data.
Note: The figure excludes seven interventions that were rated with
different amounts of evidence as of April 27, 2010. Currently, the
extent of evidence rating has two categories: small and medium to
large. A rating of "medium to large" requires at least two studies and
two schools across studies and a total sample size across studies of
at least 350 students or 14 classrooms. Otherwise, the rating is
"small."
[End of figure]
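The threshold rule in the note above can be restated as a brief illustrative sketch, provided only to make the rule concrete; the parameter names are assumptions rather than WWC terminology.

```python
# Illustrative restatement of the extent-of-evidence rule in the figure
# note; parameter names are assumptions, not WWC terminology.

def extent_of_evidence(num_studies, num_schools, total_students, total_classrooms):
    """Return "medium to large" when the rating is supported by at least
    2 studies and 2 schools across studies, with a combined sample of at
    least 350 students or 14 classrooms; otherwise return "small"."""
    sample_met = total_students >= 350 or total_classrooms >= 14
    if num_studies >= 2 and num_schools >= 2 and sample_met:
        return "medium to large"
    return "small"


# A single large study still rates "small" because it fails the two-study test.
print(extent_of_evidence(num_studies=1, num_schools=5,
                         total_students=800, total_classrooms=30))  # small
```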
Further, researchers suggested that the WWC presents misleading
information by rating interventions based on studies in which measures
of student performance closely match the content taught to the
intervention group, but not the control group.[Footnote 23] In such
studies, higher test scores among the intervention group may not
accurately represent the effectiveness of the intervention more
generally. The researchers suggested that the WWC exclude such
measures, or at least report on them separately. However, the WWC
includes these measures because, according to IES officials, they
answer questions about whether different interventions lead to
different content knowledge. The WWC agrees that there is a concern
regarding the reliability of outcome measures that are overly similar
to the intervention, but maintains that WWC procedures attempt to
exclude such measures. In addition, in response to researcher concerns
that tests created by intervention developers may be biased,[Footnote
24] the WWC added information to the intervention reports noting
whether outcome measures are based on tests created by the developer.
Some researchers and education professionals we interviewed suggested
that the WWC produces limited information because its screening
criteria are too restrictive--currently screening out about 90 percent
of studies initially identified as potentially relevant (see figure
3). Until recently, the WWC reviewed only two types of study designs--
randomized experiments and quasi-experimental studies--and according
to critics, this limited the amount and type of information available
to educators.[Footnote 25] For example, staff from one REL noted that
educators may not be able to find reviews of the interventions they
are using or considering because so few studies meet WWC standards.
[Footnote 26] Staff from another REL told us that if educators cannot
find relevant and useful information, they may be discouraged from
using evidence-based practices. Staff from a third lab noted that the
narrow focus prevents educators from learning from less rigorous but
nonetheless useful research, such as case studies describing an
intervention's costs and implementation requirements.
Figure 3: Studies Reviewed That Meet WWC Evidence Standards:
[Refer to PDF for image: illustration]
Initially reviewed: 2,669 studies;
Met evidence standards with or without reservations: 226 studies;
Out of scope or did not meet evidence standards: 2,443 studies.
Source: GAO analysis of WWC data.
[End of figure]
The WWC maintains that its screening criteria and study inclusion
standards focus on studies that provide strong evidence of an
intervention's effectiveness, and that lowering these standards could
undermine the validity of the findings reported by the WWC. Although
the Clearinghouse screens out most studies, many of its reports have
identified interventions with positive effects. Data from the
contractor indicate that 58 percent of WWC's intervention reports
identify positive or potentially positive effects of interventions.
While the WWC plans to continue using its methodological standards for
reviewing randomized and quasi-experimental studies, the Clearinghouse
acknowledges that the emphasis on randomized experiments and quasi-
experiments can exclude useful information on interventions in certain
topic areas, such as special education, that do not lend themselves to
these study designs. The WWC created new standards to include
additional study designs.[Footnote 27]
The WWC also introduced practice guides in 2007 in response to
criticisms that its intervention reviews exclude too much research and
consequently provide limited information to educators. Written by a
panel of experts, practice guides include recommendations for
educators on various topics, such as reducing high school drop-out
rates and reducing behavioral problems in the classroom.[Footnote 28]
Whereas WWC's intervention reviews are based entirely on studies that
meet WWC evidence standards, practice guides also incorporate the
views of experts and studies whose designs are not eligible for WWC
review or that were reviewed but did not meet WWC evidence standards.
To develop recommendations, the practice
guide panel reviews available literature about the particular topic
and then meets several times to discuss the topic. Through consensus,
the panel identifies effective practices based on the evidence. Once
the practice guide is developed, it undergoes a quality assurance
review by WWC and IES staff and external peer review. The following
text box provides an example of practice guide recommendations and the
level of evidence supporting them.
Table: Example of Practice Guide Recommendations and Evidence
Levels[A]:
In 2009, the WWC published a practice guide to help educators assist
students struggling with reading in the primary grades. The practice
guide authors used an early detection and prevention framework known
as Response to Intervention. The panel that authored the practice
guide consisted of six researchers and one expert in implementation of
the Response to Intervention model. Two WWC staff also assisted in the
practice guide development. The panel's recommendations follow.
Recommendation:
Screen all students for potential reading problems twice per year and
monitor those with higher risk;
Basis for recommendation: Numerous studies with designs that did not
meet WWC evidence standards or that did not use samples that
adequately resembled the population of interest;
Level of evidence[B]: Moderate.
Recommendation:
Provide time for differentiated reading instruction for all students
based on assessments of students' current reading level;
Basis for recommendation: One descriptive study and expert opinion;
Level of evidence[B]: Low.
Recommendation:
Provide intensive, systematic instruction on foundational reading
skills in small groups to students who score below the benchmark score;
Basis for recommendation: 11 studies that met WWC evidence standards;
Level of evidence[B]: Strong.
Recommendation:
Monitor the progress of these students at least once a month;
Basis for recommendation: 3 studies that met WWC evidence standards,
but did not evaluate the effectiveness of monitoring so no conclusive
inferences could be made, and expert opinion;
Level of evidence[B]: Low.
Recommendation:
Provide intensive instruction on a daily basis to students who show
minimal progress after reasonable time in small group instruction;
Basis for recommendation: 5 studies that met WWC evidence standards
but did not report statistically significant impacts on reading
outcomes;
Level of evidence[B]: Low.
Source: GAO review of a WWC practice guide.
[A] A strong rating indicates that studies supporting the
recommendation generally meet WWC standards. A moderate rating
indicates that studies supporting the recommendation generally meet
WWC standards with reservations. A low rating indicates the
recommendation is based on expert opinion, derived from theory or
experience, and supported with evidence that does not rise to the
moderate or strong levels.
[B] Our analysis of practice guide recommendations found that almost
half of the 67 recommendations made in the 12 practice guides released
as of May 2010 were based on a low level of evidence.
[End of table]
WWC's Output and Costs Increased; However, IES Has Not Developed
Adequate Performance Measures Related to Cost or Product Usefulness:
WWC Increased Output and Introduced New Products:
WWC's report output increased under the current contract, and its
scope expanded to include new products and processes to support
production. Under the current contract, the WWC increased its total
number of publications from the first contract year to the second
contract year and generally kept pace with its increased scope, as
specified in the Clearinghouse's annual plans.[Footnote 29] For
example, the current contract calls for the WWC to increase the number
of topic areas and intervention reports. Under the current contract,
the WWC added three new topic areas and released 60 intervention
reports, including 5 in the new topic areas as of June 2010.[Footnote
30] In addition, the WWC produces practice guides and quick reviews
and increased its production of both of these products between the
first and second year of the current contract. Figure 4 shows the
production of all three WWC products as of June 30, 2010, the end of
the third contract year.
Figure 4: Publication Quantities, by Contract Year (CY) for Current
WWC Contract:
[Refer to PDF for image: 3 horizontal bar charts]
Intervention reports:
Contract year 1: 4 publications;
Contract year 2: 37 publications;
Contract year 3: 19 publications.
Practice guides:
Contract year 1: 0 publications;
Contract year 2: 3 publications;
Contract year 3: 3 publications.
Quick reviews:
Contract year 1: 5 publications;
Contract year 2: 21 publications;
Contract year 3: 16 publications.
Source: GAO analysis of WWC data.
Notes: CY is July 1 to June 30. CY1 covers this period for 2007 to
2008, CY2 for 2008 to 2009, and CY3 for 2009 to 2010.
[End of figure]
The WWC's scope of work increased under the current contract with the
addition of new products and work processes, as well as
responsibilities related to the American Recovery and Reinvestment Act
of 2009 (Recovery Act), which provided additional innovation and
improvement funding.[Footnote 31] The WWC is developing three new
types of publications and conducts an annual review of Education-
sponsored studies for IES's internal use. Specifically, the WWC is
developing research briefs, research perspectives, and practice
briefs, which will focus on Education policy priorities. Like practice
guides, the new publications will incorporate expert opinion and a
broad range of research. Table 3 provides more information on these
new initiatives.[Footnote 32]
Table 3: New WWC Publications and Reports:
Product: Research briefs;
Description: Short summaries of what research indicates about the
effectiveness and implementation challenges of policies, practices, or
issues in education;
Status: In process and template has been approved. First publications
projected for release in 2010.
Product: Research perspectives;
Description: Researchers' perspectives on what research has shown to
work in addressing pressing educational issues. Topics will initially
focus on issues relevant to the Recovery Act;
Status: In process and template has been approved. First publications
projected for release in 2010.
Product: Practice briefs;
Description: Provide explicit information on how to implement one
practice from a WWC practice guide, including research-based, how-to
steps, strategies for overcoming roadblocks, and tools for educators;
Status: Template has been drafted but further work on this product is
on hold pending direction from IES.
Product: Reviews of IES-sponsored studies (annual);
Description: IES uses this report to evaluate the research it funds.
WWC reviews this research using WWC standards and reports on whether
the research studies identify effective or promising practices;
Status: First produced in 2008, with plans for annual reporting to IES.
Source: GAO review of WWC contracts and annual plans.
[End of table]
IES's Reviews Have Delayed the Release of Some Reports:
While the WWC contractor increased its report production, IES's review
process did not keep pace with output. IES is responsible for
administering independent peer reviews of all products and conducting
final reviews and approvals before products are released, and has
internal time frame estimates used in scheduling and completing such
reviews. For example, according to IES planning documents, IES
estimates 15 business days for the completion of peer reviews for
intervention reports and 6 business days for WWC quick reviews.
However, throughout 2009, IES took progressively more time to schedule
and coordinate the completion of peer reviews for some intervention
reports and quick reviews. As a result, the release of 20 reports--11
intervention reports and nine quick reviews--was delayed by more than
6 months. For example, in the first quarter of the current contract
year (third contract year, 2009 to 2010), IES took an average of over
50 business days to have intervention reports and quick reviews peer
reviewed, compared to an average of 7 business days during the first
quarter of the second contract year (see figure 5).
Figure 5: Average Time for IES Peer Reviews of Released Intervention
Reports and Quick Reviews, by Contract Year (CY) and Quarter (Q) for
Current WWC Contract:
[Refer to PDF for image: 2 vertical bar charts]
IES peer review goal:
Intervention reports: 15;
Quick reviews: 6.
Quarter released: CY2-Q1;
Intervention reports: 7;
Quick reviews: 7.
Quarter released: CY2-Q2;
Intervention reports: 10;
Quick reviews: 6.
Quarter released: CY2-Q3;
Intervention reports: 13;
Quick reviews: 33.
Quarter released: CY2-Q4;
Intervention reports: 16;
Quick reviews: 45.
Quarter released: CY3-Q1;
Intervention reports: 54;
Quick reviews: 56.
Quarter released: CY3-Q2;
Intervention reports: 55;
Quick reviews: NA.
Quarter released: CY3-Q3;
Intervention reports: 54;
Quick reviews: 134.
Quarter released: CY3-Q4;
Intervention reports: 111;
Quick reviews: 99.
Source: GAO analysis of WWC data.
Notes: CY is July 1 to June 30. CY1 covers this period for 2007 to
2008, CY2 for 2008 to 2009, and CY3 for 2009 to 2010. In CY3-Q2, no
quick reviews were released.
[End of figure]
These delays in the IES-administered peer review process resulted in
significant backlogs of intervention reports and quick reviews
awaiting release. For example, as shown in figure 6, reports that
entered the peer review process in the first quarter of the second
contract year (CY2-Q1) were generally completed within that quarter.
However, the majority of reports entering review in the first quarter
of the third contract year (CY3-Q1) remained in process for subsequent
quarters.
While the backlog persisted through the third quarter of the third
contract year (CY3-Q3), the number of reports that completed peer
review in the third and fourth quarters increased from prior quarters.
Figure 6 shows that 11 intervention reports completed peer review in
CY3-Q3 and an additional 27 completed peer review in CY3-Q4, compared
with 4, 5, and 8 intervention reports in the prior three quarters.
Figure 6: IES Peer Review Backlog for Intervention Reports and Quick
Reviews, by Contract Year (CY) and Quarter (Q) for Current WWC
Contract:
[Refer to PDF for image: horizontal bar graph]
CY2-Q1:
Intervention reports entering peer review: 11;
Intervention reports completing peer review: 10;
Intervention reports remaining in peer review: 1;
Quick reviews entering peer review: 3;
Quick reviews completing peer review: 3;
Quick reviews remaining in peer review: 0.
CY2-Q2:
Intervention reports entering peer review: 10;
Intervention reports completing peer review: 10;
Intervention reports remaining in peer review: 1;
Quick reviews entering peer review: 7;
Quick reviews completing peer review: 5;
Quick reviews remaining in peer review: 2.
CY2-Q3:
Intervention reports entering peer review: 19;
Intervention reports completing peer review: 14;
Intervention reports remaining in peer review: 6;
Quick reviews entering peer review: 7;
Quick reviews completing peer review: 4;
Quick reviews remaining in peer review: 5.
CY2-Q4:
Intervention reports entering peer review: 11;
Intervention reports completing peer review: 8;
Intervention reports remaining in peer review: 9;
Quick reviews entering peer review: 11;
Quick reviews completing peer review: 7;
Quick reviews remaining in peer review: 9.
CY3-Q1:
Intervention reports entering peer review: 8;
Intervention reports completing peer review: 5;
Intervention reports remaining in peer review: 12;
Quick reviews entering peer review: 3;
Quick reviews completing peer review: 1;
Quick reviews remaining in peer review: 11.
CY3-Q2:
Intervention reports entering peer review: 6;
Intervention reports completing peer review: 4;
Intervention reports remaining in peer review: 14;
Quick reviews entering peer review: 4;
Quick reviews completing peer review: 3;
Quick reviews remaining in peer review: 12.
CY3-Q3:
Intervention reports entering peer review: 16;
Intervention reports completing peer review: 11;
Intervention reports remaining in peer review: 19;
Quick reviews entering peer review: 3;
Quick reviews completing peer review: 10;
Quick reviews remaining in peer review: 5.
CY3-Q4:
Intervention reports entering peer review: 10;
Intervention reports completing peer review: 27;
Intervention reports remaining in peer review: 2;
Quick reviews entering peer review: 7;
Quick reviews completing peer review: 11;
Quick reviews remaining in peer review: 1.
Source: GAO analysis of WWC data.
Notes: CY is July 1 to June 30. CY1 covers this period for 2007 to
2008, CY2 for 2008 to 2009, and CY3 for 2009 to 2010. At the end of
each quarter, any report remaining in peer review would carry over to
the next quarter. For example, for intervention reports in CY2-Q3, 19
new reports entered peer review, joining the 1 report that remained
from the previous quarter. Fourteen of these 20 reports completed peer
review, and 6 remained.
[End of figure]
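The carry-over arithmetic described in the note can be checked with a short illustrative calculation. The sketch below applies that arithmetic to the intervention-report counts in figure 6; it is a reader's verification aid, not part of IES's or the contractor's tracking systems.

```python
# Illustrative check of the carry-over arithmetic in the figure note:
# reports remaining at the end of a quarter equal those carried in from
# the prior quarter, plus those entering review, minus those completing.
# Counts below are the intervention-report figures from figure 6.

quarters = ["CY2-Q1", "CY2-Q2", "CY2-Q3", "CY2-Q4",
            "CY3-Q1", "CY3-Q2", "CY3-Q3", "CY3-Q4"]
entering = [11, 10, 19, 11, 8, 6, 16, 10]
completing = [10, 10, 14, 8, 5, 4, 11, 27]

remaining = 0
for name, in_count, out_count in zip(quarters, entering, completing):
    remaining = remaining + in_count - out_count
    print(f"{name}: {remaining} intervention reports remaining in peer review")
# The run ends with "CY3-Q4: 2 ...", matching the backlog shown in figure 6.
```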
IES attributed these delays to several factors and recently took steps
to eliminate the backlog. IES officials told us that delays were, in
part, attributable to difficulty in identifying and scheduling
independent peer reviewers, vacancies in WWC-related positions at IES,
and an increasing amount of research that met WWC standards.[Footnote
33] For example, IES officials told us identifying and scheduling a
sufficient number of qualified, independent peer reviewers had become
increasingly difficult because several former peer reviewers were now
associated in some way with the WWC and therefore were no longer
independent. To reduce the delays and eliminate the backlog, IES
recently implemented a new database to help staff track and manage the
work of peer reviewers and other WWC-related tasks. IES officials also
told us that they began identifying additional potential peer
reviewers using the WWC online registry of researchers. In addition,
IES increased a staff member's responsibilities related to scheduling
and coordinating peer reviews. These efforts reduced the amount of
time reports remain in the IES peer review process and eliminated the
backlog as of June 2010.[Footnote 34]
In addition to delays in the peer review process, WWC contractors told
us that many of their daily decisions need IES approval, and slow
responses delayed contractor processes. For example, the contractor
needs IES approval on the format and content of the products in
development, hindering further work when responses are delayed. IES
officials acknowledged that some delays in the approval process
occurred during contract year three and told us that this was largely
due to staff vacancies that they anticipate filling.
The Cost of the Current WWC Contract Has Increased from the Previous
One:
WWC's contracted costs have doubled from about $5.3 million per year
under the previous 5-year contract to the current level of about $10.7
million per year.[Footnote 35] The increase in contracted costs
reflects the expanded scope--more publications and new products and
processes--of the second contract compared to the first. IES's
contract for the WWC makes the contractor responsible for a variety of
tasks, including report production and
product development. Table 4 provides a description of six broad task
categories and how they changed between contracts.
Table 4: Task Category Definitions and Changes between Contracts:
Task category: WWC products;
Task category includes expenditures related to: Conducting research
reviews and developing and publishing WWC products;
Changes from first contract to current contract: New product types,
expanded practice guide review process;
Stopped producing topic reports (2008).
Task category: Strategic planning and coordination with IES;
Task category includes expenditures related to: Preparing annual
plans, managing reporting requirements, and communications and
workflow with IES;
Changes from first contract to current contract: New contractor
database increased process documentation and reporting capabilities to
IES.
Task category: Communications, collaboration, and dissemination of WWC
products;
Task category includes expenditures related to: Maintaining WWC Help
Desk; Promoting WWC through various means; Developing/implementing a
communications/dissemination plan;
Changes from first contract to current contract: WWC staff attend
conferences and coordinate some dissemination efforts with other IES
departments in the current contract.
Task category: WWC development, process revisions, and maintenance;
Task category includes expenditures related to: Revising and
developing review processes and policies; Administrating and
supporting technical staff training, technical advisory group, online
registries, and conflict of interest procedures;
Changes from first contract to current contract: Enhanced review
processes and standards, added new research designs; Developed policy
and procedures handbooks, new products, and staff training.
Task category: Web site and technical maintenance;
Task category includes expenditures related to: Coordinating content,
maintaining databases/search functions, and processing federal data
collection forms;
Changes from first contract to current contract: New online searchable
system and database.
Task category: Award fees;
Task category includes expenditures related to: Fixed and performance-
based contractor award fees based on a percentage of the overall
contract total;
Changes from first contract to current contract: No change.
Source: GAO review of WWC contracts, annual plans, and budget
documents.
[End of table]
Our analyses of costs associated with these six broad task categories
shows that the proportion of funds dedicated to producing WWC reports
was about 60 percent under both contracts (see figure 7).[Footnote 36]
Figure 7: WWC Costs, by Task Categories and Contracts:
[Refer to PDF for image: 2 pie-charts]
First 5-year contract ($26.5 million):
WWC products: 60%:
- Intervention and topic reports: 57%;
- Practice guides: 3%;
Other costs: 41%:
- A: 6%;
- B: 12%;
- C: 5%;
- D: 9%;
- E: 9%.
Second 5-year contract ($53.3 million):
WWC products: 60%:
- Intervention and topic reports: 29%;
- Practice guides: 21%;
Other costs: 41%:
- A: 7%;
- B: 6%;
- C: 11%;
- D: 7%;
- E: 9%.
Breakdown of other costs:
A: Strategic planning and coordination with Education.
B: Communications, collaboration, and dissemination.
C: WWC development, process revisions, and staff training.
D: Web site and technical maintenance.
E: Award fees.
Source: GAO analysis of WWC budget data.
Notes: Figures reflect actual expenditures for the first 5-year
contract and updated budgeted expenditures for the current 5-year
contract, which began on July 1, 2007. Percentages do not add to 100%
due to rounding. Total WWC expenditures for the first 5-year contract
were about $26,527,760, but cost category percentages do not reflect
$1,222,714 billed by a co-contractor but not itemized by task.
Category proportions for the first contract are estimates because IES
could not provide documentation that included final adjusted
expenditures by tasks. The WWC budget is $53,315,166 for the current 5-
year contract, of which $23,643,891 had been spent as of October 31,
2009. Cost category proportions for the current contract do not
include $104,559 related to transition from the first contract to the
second contract.
[End of figure]
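As a rough illustration of how the category percentages in figure 7 translate into dollar amounts, the short sketch below (in Python) multiplies selected category shares by the contract totals given in the notes above. It assumes the shares apply to the itemized first-contract total (the reported total less the $1,222,714 not itemized by task) and to the full second-contract budget; the variable names and resulting amounts are illustrative approximations, not figures reported by IES.

# Approximate dollar amounts behind selected figure 7 percentages.
# Assumption: shares apply to the itemized first-contract total and the
# full second-contract budget, so these are rough illustrations only.
first_contract_itemized = 26_527_760 - 1_222_714  # co-contractor costs not itemized by task
second_contract_budget = 53_315_166

shares = {
    "Intervention and topic reports": (0.57, 0.29),
    "Practice guides": (0.03, 0.21),
    "Award fees": (0.09, 0.09),
}

for category, (first_share, second_share) in shares.items():
    print(f"{category}: about ${first_share * first_contract_itemized:,.0f} "
          f"(first contract), about ${second_share * second_contract_budget:,.0f} "
          f"(second contract)")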
The proportion of funds dedicated to some tasks changed from the first
contract to the second. For example, costs for tasks related to
process development and revisions more than doubled, from 5 percent to 11
percent, supporting various activities such as expanding the practice
guide review process and revising the Clearinghouse's procedures and
standards handbook. According to IES officials, the current WWC
contractor developed and implemented new or enhanced processes that
affect all publications and deliverables. For example, the current
contractor developed a standardized system for conducting and
recording the WWC's searches of research studies.[Footnote 37]
Most WWC cost increases supported additional output and expansions in
product scope. While under both contracts more resources were devoted
to intervention reports than any other product, the proportion devoted
to practice guides increased significantly, currently comprising about
21 percent of total budgeted costs. IES noted that practice guides
were only added during the last year of the prior contract, but are
now a primary product. Other new WWC products make up a relatively
small proportion of budgeted costs in the current contract,
representing about 9 percent of the total contract budget combined.
IES Has Not Developed Performance Measures Related to Production Costs
or Product Usefulness:
IES established performance goals, which the WWC met or exceeded;
however, these goals do not address production costs or the usefulness
of WWC products. IES established WWC-related performance goals in its
annual organizational assessment, but Education discontinued the use
of these performance measures for fiscal year 2010.[Footnote 38] In
addition, IES established performance goals for its WWC contractor in
the contractor award fee plan, which IES uses to determine the amount
of performance-based funds awarded to the contractor.[Footnote 39]
IES measured WWC program performance from fiscal year 2003 to fiscal
year 2009, as part of Education's Organizational Assessment--its
departmentwide performance management system. The WWC-related
performance goals included in Education's Organizational Assessment
focused on WWC Web site visits and the quantity of publications, both
of which were areas of concern as the WWC was getting established.
[Footnote 40] Specifically, these performance goals included increased
WWC Web site visits, sustained productivity in the release of
intervention reports and quick reviews, and increased practice guide
production. The WWC met or exceeded these performance goals annually;
however, according to IES officials, these performance goals will not
be included in Education's fiscal year 2010 Organizational Assessment,
in part because IES is now satisfied with WWC activity in these areas.
[Footnote 41]
IES has not developed performance measures related to the cost of
specific WWC products.[Footnote 42] IES officials noted that the costs
per WWC publication vary greatly depending on the amount of available
research relevant to the specifications of a product. For example,
intervention reports based on a large number of studies meeting WWC
standards take longer and cost more to produce than do reports for
which few studies qualify for review. IES has tasked the current WWC
contractor to develop ways to streamline production processes and to
conduct a cost study, the results of which would improve budget
estimates and strengthen IES's monitoring of the contract. While the
contractor has begun this work, IES officials told us that they do not
know when cost-related performance measures, such as acceptable cost
ranges for each type of product, will be established.
WWC does not currently have a performance measure related to product
usefulness. While Web site visits were tracked as a measure of WWC
utilization in IES's Organizational Assessment through fiscal year
2009, this metric did not assess the degree to which WWC products were
reaching their target audience and did not provide any information on
the extent to which educators find WWC products to be useful. IES's
2010 budget justification calls for a representative survey of WWC use
among education decision makers to be conducted by 2013. However, IES
officials told us that they were unsure whether the survey would take
place, and IES does not currently have a plan in place to implement
this survey.[Footnote 43]
Education Has Three Primary Ways to Disseminate Information about WWC
Products, but Awareness and Use Vary among Target Audiences:
Education Has Various Ways to Disseminate Information about WWC
Products, but Awareness of the Clearinghouse Is Generally Limited:
Education uses the WWC contractor, RELs, and DWW to disseminate
information about the Clearinghouse to its target audience, which
includes state and school district officials, as well as teachers and
principals. In accordance with its contract, the WWC contractor
disseminates information about its products electronically and through
various events, such as formal presentations at conferences. The
Clearinghouse's electronic dissemination methods include an e-mail
listserv, Web-based seminars (webinars), and newsletters. For example,
the WWC sends out notices to its e-mail listserv, alerting subscribers
of the availability of new products, including intervention reports,
practice guides, and quick reviews.[Footnote 44] WWC staff told us
that the webinars cover the same topics as their reports and are a
relatively cost-effective way to disseminate information about
products and methodology. In addition, WWC staff disseminate
information about WWC products at education conferences, such as
teacher, principal, and researcher conferences. At these conferences,
WWC staff may conduct formal presentations, have an exhibit featuring
their products, or both. At conference exhibits, Clearinghouse staff
answer questions about their products and provide literature to
conference attendees. From July 2009 through June 2010, WWC staff were
scheduled to present or have an exhibit at 14 conferences. WWC staff
also told us that they work with other groups, such as education,
research, and legislative organizations, in order to further
disseminate information about WWC products to their members.
In addition, Education disseminates information about WWC products
through IES's 10 RELs, which hold events that may feature information
based on practice guides and refer educators to Clearinghouse
products. Officials at all 10 RELs told us that they spent time
disseminating information about WWC, in part by holding events that
bridge research and practice. According to REL officials, these bridge
events are attended primarily by school-, district-, and state-level
education professionals and provide an opportunity for educators to
discuss ways to implement research-based practices. Officials at all
10 RELs told us that bridge events focused on practice guides to some
extent, and 7 indicated that WWC practice guides were the primary
focus of these events. According to REL officials and WWC staff, these
events sometimes included a WWC staff member to discuss methodology
and panelists who helped develop the practice guides. RELs also
disseminate research from WWC when responding to educator questions or
concerns.[Footnote 45] Officials from 7 of the 10 RELs told us their
respective RELs generally use relevant WWC products (practice guides
and others) when searching for research-based information to address
educator questions.
In addition, Education's Office of Planning, Evaluation, and Policy
Development disseminates information about WWC practice guides on its
DWW Web site, which provides an online library of resources designed
to help educators implement research-based instructional practice.
This Web site uses different formats to present content based
primarily on WWC practice guides and provides examples of possible
ways educators might apply WWC research findings. For instance, to
help educators implement the recommendations from the practice guide
on dropout prevention, the DWW Web site features slideshows with
examples of supportive academic environments and interviews with
educators and experts on dropout prevention. In addition, the Web site
includes sample materials, such as lesson plan templates, that provide
an example of how to implement recommendations. The DWW also includes
information on the research behind the recommendations and a link to
the WWC Web site and the individual practice guides. According to IES
officials, a recent analysis of the DWW Web site traffic showed that
49 state Web sites have links to the DWW Web site, which helps
disseminate WWC products further to the education community.
We found that 33 of the 38 states[Footnote 46] that responded to our
survey reported that they had heard of the WWC. Based on our survey
results, we estimate that 42 percent of school districts have heard of
the WWC and that the percentage is greater for school districts that
rely to a very large extent on external sources for information on
research-based practices.[Footnote 47]
School districts identified several sources of information about the
Clearinghouse, including conferences and Education (see figure 8).
Figure 8: Sources from Which District Officials Heard of the WWC:
[Refer to PDF for image: vertical bar graph]
Estimated percentage of school districts:
Potential sources of WWC information: Education;
Not a source of WWC information: 23%;
Don't know: 5%;
Source of WWC information: 72%.
Potential sources of WWC information: Conferences;
Not a source of WWC information: 22%;
Don't know: 5%;
Source of WWC information: 73%.
Potential sources of WWC information: Peers;
Not a source of WWC information: 28%;
Don't know: 6%;
Source of WWC information: 66%.
Potential sources of WWC information: RELs;
Not a source of WWC information: 45%;
Don't know: 8%;
Source of WWC information: 47%.
Potential sources of WWC information: Other;
Not a source of WWC information: 24%;
Don't know: 40%;
Source of WWC information: 37%.
Source: GAO analysis of school district survey responses.
Notes: Estimates shown are based upon a probability survey. See
appendix I for associated confidence intervals. The responses for
other sources included state departments of education, Internet
searches, and journals.
[End of figure]
While the majority of states have accessed the WWC Web site, we
estimate that only 34 percent of school districts have done so.
Specifically, among the states that responded to our survey, 33 of 38
states[Footnote 48] reported that they had accessed the WWC Web site
at least once. In addition, 19 of these states reported visiting the Web
site at least seven times per year.[Footnote 49] In contrast, an
estimated 34 percent of school districts accessed the WWC Web site at
least once.[Footnote 50] Further, we estimate that only 11 percent of
school districts visited the Web site at least seven times per year.
[Footnote 51] States and school districts that visited the WWC Web
site less than seven times per year most often cited time constraints
as the primary reason for their infrequent use.[Footnote 52]
In addition to the WWC, states and school districts use a variety of
other sources of information to identify effective education
practices. Most states and school districts use several broad sources
of information, such as academic journals, education periodicals, and
associations of educators. For example, 37 states reported using
academic journals to identify such practices, and we estimate that
about 97 percent of school districts used academic journals.[Footnote
53] Overall, more school districts and states that responded to our
survey used the WWC than used other research synthesis organizations.
[Footnote 54]
While the WWC also includes teachers and principals in its target
audience, we found that relatively few of the teachers and principals
we contacted at education conferences had heard of the WWC. While not
a generalizable sample, we found that out of a total of 391 teachers
who completed our questionnaire at four education conferences, only 18
had accessed the WWC Web site.[Footnote 55] In addition, 341 teachers
who had not accessed the WWC Web site told us they had not heard of
the Web site. Similarly, among the 208 principals and other school
administrators who completed the questionnaire, only 32 had accessed
the WWC Web site. Further, 135 principals and other school
administrators told us they had not heard of the WWC.
States and School Districts Generally Used the Clearinghouse to a
Small or Moderate Extent to Inform Decisions and Used Specific WWC
Products to Varying Extents:
Based on our survey, most states and school districts that reported
accessing the WWC Web site used it to inform decisions on effective
education practices--a stated purpose of the WWC--to a small or
moderate extent. Specifically, 25 of the 33 states that use the
Clearinghouse indicated that they use it to a small or moderate extent
to inform their decisions, while 6 reported using it to a large or
very large extent.[Footnote 56] We estimate that 72 percent of school
districts that have accessed the Clearinghouse use the WWC to inform
education decisions to a small or moderate extent, while only 18
percent use it to a large or very large extent.[Footnote 57]
States that used the WWC to inform decisions reported that they used
the Clearinghouse for various purposes, including informing
professional development and curriculum decisions. For example, 25
states reported using the Clearinghouse to inform professional
development programs for teachers, and 22 reported using it to inform
curriculum decisions. Fewer states used the Clearinghouse to advise
districts that were not making adequate yearly progress (AYP) in
meeting academic targets or to develop improvement plans for such
districts. (Figure 9 provides a breakdown of the extent to which these
states use the Clearinghouse for various purposes.)
Figure 9: Extent to Which States Use WWC for Various Purposes:
[Refer to PDF for image: vertical bar graph]
Number of states:
Inform curriculum decisions:
Not at all: 4;
Small extent: 2;
Moderate extent: 12;
Large or very large extent: 8.
Inform professional development of teachers:
Not at all: 2;
Small extent: 4;
Moderate extent: 12;
Large or very large extent: 9.
Develop school improvement plan:
Not at all: 4;
Small extent: 6;
Moderate extent: 9;
Large or very large extent: 4.
Advise schools not making AYP on potential interventions:
Not at all: 4;
Small extent: 4;
Moderate extent: 11;
Large or very large extent: 5.
Source: GAO analysis of state responses.
Note: "I Don't Know" was also a response option, and is not displayed
in the figure.
[End of figure]
In addition, we estimate that among school districts that use the WWC
to inform decisions on effective education practices, about 90 percent
used it to inform curriculum decisions at least to a small extent,
similar to the percentage that used the WWC to inform professional
development decisions. However, fewer school districts used it to
advise schools that did not meet academic goals or to develop school-
level plans to help such schools improve.[Footnote 58] Figure 10
provides a breakdown of the extent to which these school districts use
the Clearinghouse for various purposes.
Figure 10: Extent to Which School Districts That Have Used the
Clearinghouse Used It for Various Purposes:
[Refer to PDF for image: vertical bar graph]
Estimated percentage of school districts:
Inform curriculum decisions:
Not at all: 3%;
Small extent: 20%;
Moderate extent: 37%;
Large or very large extent: 37%.
Inform professional development of teachers:
Not at all: 6%;
Small extent: 19%;
Moderate extent: 38%;
Large or very large extent: 32%.
Develop school improvement plan:
Not at all: 13%;
Small extent: 19%;
Moderate extent: 35%;
Large or very large extent: 27%.
Advise schools not making AYP on potential interventions:
Not at all: 29%;
Small extent: 11%;
Moderate extent: 28%;
Large or very large extent: 23%.
Source: GAO analysis of school district survey responses.
Note: Estimates shown are based upon a probability survey. See
appendix I for associated confidence intervals. "I Don't Know" was
also a response option, and is not displayed in the figure.
[End of figure]
States reported using specific WWC products--intervention reports and
practice guides--more than quick reviews. Specifically, of the states
that had used the Clearinghouse, 21 reported that they used
intervention reports and 20 reported using practice guides, while only
12 reported using quick reviews. States used intervention reports and
practice guides to a similar extent to inform education decisions. For
example, for each product, six states reported using them to a large or
very large extent to inform such decisions (see figure 11).
Figure 11: Extent to Which States Use Specific WWC Products:
[Refer to PDF for image: vertical bar graph]
Number of states:
Intervention reports:
Not at all: 4;
Small extent: 3;
Moderate extent: 12;
Large or very large extent: 6.
Practice guides:
Not at all: 3;
Small extent: 3;
Moderate extent: 11;
Large or very large extent: 6.
Quick reviews:
Not at all: 10;
Small extent: 3;
Moderate extent: 7;
Large or very large extent: 2.
Source: GAO analysis of state survey responses.
[End of figure]
However, the relative use of specific WWC products was different among
school districts. We estimate that among school districts that use the
Clearinghouse to inform decisions on effective education practices,
more school districts use intervention reports relative to practice
guides or quick reviews. Specifically, we estimate that 74 percent of
those school districts that use the WWC have used its intervention
reports to inform education decisions,[Footnote 59] while practice
guides and quick reviews were each used by about half of such
districts.[Footnote 60] Based on our survey, an estimated 21 percent
of school districts that use the WWC have used intervention reports to
a large or very large extent,[Footnote 61] while about 10 percent use
practice guides to a large or very large extent (see figure 12).
[Footnote 62]
Figure 12: Extent of Specific Product Use among Districts That Use the
Clearinghouse:
[Refer to PDF for image: vertical bar graph]
Estimated percentage of school districts:
Intervention reports:
Not at all: 21%;
Small extent: 24%;
Moderate extent: 29%;
Large or very large extent: 21%.
Practice guides:
Not at all: 42%;
Small extent: 17%;
Moderate extent: 28%;
Large or very large extent: 10%.
Quick reviews:
Not at all: 40%;
Small extent: 25%;
Moderate extent: 17%;
Large or very large extent: 13%.
Source: GAO analysis of school district survey responses.
Note: Estimates shown are based upon a probability survey. See
appendix I for associated confidence intervals.
[End of figure]
States and School Districts Would Likely Increase Their Use of the
Clearinghouse If the WWC Made Certain Changes:
Many states and school districts that had accessed the Clearinghouse
reported that they would likely increase their use of the WWC if the
Clearinghouse provided a broader array of information, such as reviews
of more studies, additional topics, or more relevant or timely reports.
For example, 21 of the 33 states that had used the Clearinghouse
reported that they would be somewhat or very likely to use the
Clearinghouse more often if it had reviews that were more timely (see
figure 13).[Footnote 63]
Figure 13: Number of States That Reported They Would Likely Increase
Their Use of WWC Given Certain Changes:
[Refer to PDF for image: horizontal bar graph]
Potential changes:
A greater number of intervention reports showing positive effects;
Likely or very likely: 26.
Reviews of efficacy of programs being used or considered in my state;
Likely or very likely: 23.
Additional practices with positive reviews;
Likely or very likely: 22.
Reviews that are more timely;
Likely or very likely: 21.
Additional topic areas;
Likely or very likely: 19.
Additional studies reviewed;
Likely or very likely: 19.
Additional practice guides;
Likely or very likely: 18.
Additional information on interventions based on studies that may not
meet WWC standards;
Likely or very likely: 15.
A broader definition of what studies meet WWC standards;
Likely or very likely: 15.
An easier web site to navigate;
Likely or very likely: 14.
Source: GAO analysis of state survey responses.
[End of figure]
In addition, based on our survey, we estimate that about two-thirds
school districts that had accessed the Clearinghouse would likely
increase their use if it included reviews of programs or interventions
being used or considered in their school district.[Footnote 64] An
estimated 50 percent of school districts would likely increase their
use of the Clearinghouse if it had reviews that were more timely (see
figure 14).[Footnote 65]
Figure 14: Estimated Percent of School Districts That Have Accessed
the WWC That Would Likely Increase Their Use of the WWC Given Various
Changes:
[Refer to PDF for image: horizontal bar graph]
Potential changes:
Reviews of efficacy of programs being used or considered in my
district;
Likely or very likely: 68%.
A greater number of intervention reports showing positive effects:
Likely or very likely: 67%.
Additional practices with positive reviews:
Likely or very likely: 60%.
Additional topic areas:
Likely or very likely: 57%.
Additional information on interventions based on studies that may not
meet WWC standards:
Likely or very likely: 56%.
Additional practice guides:
Likely or very likely: 56%.
Additional studies reviewed:
Likely or very likely: 55%.
Reviews that are more timely:
Likely or very likely: 50%.
A broader definition of what studies meet WWC standards:
Likely or very likely: 47%.
An easier web site to navigate:
Likely or very likely: 47%.
Source: GAO analysis of school district survey responses.
Note: Estimates shown are based upon a probability survey. See
appendix I for associated confidence intervals.
[End of figure]
Conclusions:
In 2007, Education substantially increased its financial investment in
the WWC, and the Clearinghouse is significantly expanding its scope in
an effort to better serve its target audience. Some of the new
products aim to be more responsive to educators and education decision
makers by providing timely information about evidence-based practices
relevant to pressing needs. Such information could help states and
districts identify strategies as they implement educational reform
efforts--such as reforming low-performing schools or improving
professional development--under ESEA and the Recovery Act. For
example, WWC research perspectives, still in development, are intended
to help education decision makers as they address challenges related
to spending Recovery Act funds. However, the development of these
products and the release of other products were delayed, in part, by a
substantial backlog in IES's review and approval processes. These
delays hindered the timely release of several publications, and some
products were released months after they were completed by the
contractor. While IES recently eliminated the backlog, educators need
to be able to rely on the Clearinghouse for timely and relevant
information. According to our survey, many states and school districts
reported that they would likely increase their use of the
Clearinghouse if it released information more quickly.
While IES has increased annual report production, it has not
established reasonable production cost ranges or performance measures
related to the cost of each product type. Without acceptable
per product cost ranges, it is difficult for IES to assess the
reasonableness of costs associated with certain products, even as IES
takes steps to streamline production. IES's current study on costs may
help IES establish acceptable cost ranges that could inform IES's
performance measurements related to the WWC. In addition, such
information could inform cost comparisons between the WWC and other
research evaluation organizations or provide baselines for future
contractor work.
In addition, IES has not established meaningful performance measures
related to product usefulness. Until fiscal year 2010, IES tracked
visits to its Web site and annual report production as a way to
measure the productivity of the Clearinghouse. While these measures
were important to accurately track the WWC's initial growth, they did
not evaluate the degree to which the products were meeting the needs
of educators. Specifically, IES currently does not have a way to gauge
user satisfaction with WWC products, which is a common practice when
developing and providing new products. Further, while IES currently
incorporates some feedback from WWC Web site users to inform future
topic areas, it does not systematically gauge its target
audience's major areas of interest or concern--such as gathering
information on interventions currently being used or considered in
specific school districts or states. IES decides how to spread its
limited resources across the various product types without directly
measuring the extent to which educators use the WWC or how useful they
find the various products to be. Measuring the use and usefulness of
its products could help IES continue to improve content, develop
products, and respond to the needs of educators and policymakers.
While some educators and policymakers find WWC products useful, many
other educators are not familiar with the Clearinghouse. IES has spent
a substantial amount of money, time, and effort producing various
summaries of evidence-based practices, which cover both specific
education interventions and general practices. This investment in the
WWC was made in order to inform education professionals at all levels--
from classroom teachers to policymakers--as they make decisions on how
best to educate the nation's children. Improved dissemination of WWC
products could increase awareness and use of the WWC. Increased use of
the Clearinghouse could help education professionals identify and
implement effective educational interventions and practices, and
potentially lead to increased student achievement.
Recommendations for Executive Action:
We are making the following four recommendations based on our review.
To consistently release WWC products in a timely manner, we recommend
the Secretary of Education direct IES to develop and implement
strategies that help avoid future backlogs and ensure that IES's
review and approval processes keep pace with increased contractor
production. Strategies could include shifting IES resources to ensure
sufficient staff time for managing the peer review process and
streamlining its approval processes.
To better track the costs and usefulness of the WWC, we recommend that
the Secretary of Education direct IES to:
* incorporate findings from its cost studies to develop performance
measures related to costs, such as identifying a range of acceptable
costs per product and using that information to monitor contractor
spending; and:
* develop performance measures related to product usefulness and
periodically assess whether WWC products are meeting the needs of
target audiences by gathering information on product usefulness in the
proposed survey or through other means.
To reach more members of the target audience, we recommend the
Secretary of Education direct IES to assess and improve its
dissemination efforts to promote greater awareness and use of the WWC,
for example, by developing a way to inform school districts of new
products or encouraging educator professional development programs to
focus on research-based practices such as those discussed in practice
guides.
Agency Comments and Our Evaluation:
We provided a draft of this report to the U.S. Department of Education
for review and comment. Education officials provided written comments
on a draft of this report, which are reproduced in appendix IV.
Education also provided technical comments, which we incorporated into
the report as appropriate.
Education generally agreed with our recommendations. Specifically,
Education agreed to our recommendations on consistently releasing WWC
products in a timely manner and assessing and improving its
dissemination efforts. In its response to our recommendation on
tracking the cost and usefulness of the WWC and its products,
Education noted that IES has taken some steps that address the
recommendation. With regard to costs, Education stated that it intends
to incorporate the results of current cost studies into future work
plans and monitoring efforts. We continue to recommend that these
results be used to inform performance measures related to costs for
future operations. With regard to tracking the usefulness of the WWC,
Education noted that it uses a variety of tools to gather consumer
input, such as a Help Desk and online voting for future report topics.
While such feedback provides some information to the WWC, it relies on
existing users and reflects the views of those users who provide
feedback, rather than those of the broader population. Moreover, as
shown in our survey, only an estimated 34 percent of school districts
have accessed the WWC Web site at least once--and fewer have used the
Web site frequently. Education also noted that it would include a
customer satisfaction survey in IES's review of its own performance,
but it is unclear whether the survey would be directed at current
Clearinghouse customers or a broader audience, or whether it would
identify how useful various WWC products are and how the WWC could be
improved. More nationally representative information could help IES
prioritize topics for intervention reports and practice guides and
inform budget priorities.
We are sending copies of this report to the appropriate congressional
committees, the Secretary of the U.S. Department of Education, and
other interested parties. In addition, the report will be available at
no charge on the GAO Web site at [hyperlink, http://www.gao.gov].
If you or your staffs have any questions regarding this report, please
contact me at (202) 512-7215 or ashbyc@gao.gov. Contact points for our
Offices of Congressional Relations and Public Affairs may be found on
the last page of this report. GAO staff who made major contributions to
this report are listed in appendix V.
Signed by:
Cornelia M. Ashby:
Director, Education, Workforce, and Income Security Issues:
List of committees:
The Honorable Tom Harkin:
Chairman:
The Honorable Thad Cochran:
Ranking Member:
Subcommittee on Labor, Health and:
Human Services, Education and Related Agencies:
Committee on Appropriations:
United States Senate:
The Honorable David Obey:
Chairman:
The Honorable Todd Tiahrt:
Ranking Member:
Subcommittee on Labor, Health and:
Human Services, Education and Related Agencies:
Committee on Appropriations:
House of Representatives:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
To address all three objectives, we interviewed officials from the
Institute of Education Sciences (IES), What Works Clearinghouse (WWC
or Clearinghouse) contractors, and representatives from various
educational organizations. To assess the research review process used
by IES's WWC, we reviewed WWC standards and procedures, reviewed
an expert panel report that assessed the validity of the WWC review
process, and collected information about the extent to which the WWC
has implemented the panel's recommendations. To determine how
performance and costs changed over time, we analyzed the costs and
productivity of the two WWC contractors. To obtain information about
the usefulness of WWC products, we conducted a Web-based survey of all
state education agencies and a nationally representative sample of
school districts. We also collected information about the usefulness
of the WWC from teachers and principals at four education conferences.
We conducted our work from September 2009 through July 2010 in
accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives.
Assessment of WWC Research Review Process:
GAO previously assessed the procedures and criteria used by the WWC by
reviewing documents and interviewing IES officials and WWC
contractors.[Footnote 66] We reviewed WWC standards and procedures and
examined the degree of consistency of these standards and procedures
across education topic areas. We also reviewed the findings and
recommendations from an expert panel report that assessed the validity
of the WWC review process.[Footnote 67] We obtained information from
IES officials and WWC contractors on the extent to which the WWC has
implemented the panel's recommendations. Further, we identified other
concerns about the WWC review process through a literature review and
interviews with researchers, and we interviewed IES officials and WWC
contractors to assess the extent to which the Clearinghouse has
addressed these concerns. We also examined the degree to which the
WWC's review process is similar to that used by other entities engaged
in systematic research review efforts.
Performance and Cost Data Analyses:
To determine how performance and costs changed over time, we analyzed
the costs and productivity of the two WWC contractors. We reviewed
budget data and product release dates to analyze cost and productivity
trends of the WWC. To examine performance, we interviewed the two
contractors, as well as IES officials, and compared IES's performance
measures and goals to actual outcomes.
We assessed the reliability of the WWC performance and cost data by
(1) reviewing existing information about the data and the system that
produced them and (2) interviewing agency officials knowledgeable
about the data. We determined that the data were sufficiently reliable
for the purposes of this report.
Survey of States and School Districts:
To determine how WWC products are disseminated, we interviewed
officials from IES and all 10 RELs, as well as WWC contractors. To
determine how useful education professionals find WWC products to be,
we designed and administered a Web-based survey of state education
agencies in the 50 states and the District of Columbia and a
nationally representative sample of local educational agencies (LEA).
Specifically, the survey asked officials about (1) their general
sources of information on effective educational practices, (2) the
extent to which they use WWC products to inform curriculum decisions
(including questions on specific intervention reports and practice
guides), (3) how useful the officials find the information in the WWC,
(4) the likelihood they would increase their usage if certain changes
were made to the WWC Web site, and (5) the extent to which the
officials use the Doing What Works and Best Evidence Encyclopedia Web
sites to inform curriculum decisions and how useful the officials find
these other information sources to be. We reproduce the questions we
used in our analysis in figure 15. The survey was administered from
February 18, 2010 to April 14, 2010.
To determine how the WWC was being used at the state level, we
surveyed the state Secretary, Commissioner, or Superintendent of
Education in the 50 states and the District of Columbia. Out of the 51
state officials surveyed, 38 responded to the survey.
To determine how the WWC was being used at the school district level,
we surveyed a nationally representative sample of school districts
across the country. We selected a stratified random sample of 625 LEAs
from the population of 17,620 LEAs included in our sample frame of
data obtained from the Common Core of Data for the 2007-08 school
year. A total of 454 LEAs responded, resulting in a final response
rate of 74 percent. Because we surveyed a sample of LEAs, survey
results for school districts are estimates of a population of LEAs and
thus are subject to sampling errors that are associated with samples
of this size and type. Our sample is only one of a large number of
samples that we might have drawn. As each sample could have provided
different estimates, we express our confidence in the precision of our
particular sample's results as a 95 percent confidence interval (e.g.,
plus or minus 10 percentage points). We excluded 12 of the sampled
LEAs for various reasons--6 were closed, 3 did not administer any
schools, 2 managed schools in a correctional facility, and 1 was a
private school--and therefore considered them out of scope. All
estimates produced from the sample and presented in this report are
representative of the in-scope population.
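To illustrate how a margin of error of this size relates to the sample, the short sketch below (in Python) computes an approximate 95 percent confidence interval for one district-level estimate. It is a minimal illustration that assumes a simple random sample with a finite population correction; the function name and inputs are illustrative, and GAO's published intervals reflect the stratified, weighted design described above, so they differ from this approximation.

import math

def proportion_margin(p, n, population, z=1.96):
    # Approximate 95 percent margin of error for an estimated proportion
    # from a simple random sample, with a finite population correction.
    # Illustrative only: the report's estimates come from a stratified,
    # weighted sample, so the published confidence intervals differ.
    fpc = math.sqrt((population - n) / (population - 1))
    standard_error = math.sqrt(p * (1 - p) / n) * fpc
    return z * standard_error

# Example using figures from this appendix: 454 responding districts
# drawn from a frame of 17,620 LEAs, and the estimate that 42 percent
# of school districts have heard of the WWC.
margin = proportion_margin(p=0.42, n=454, population=17620)
print(f"42% plus or minus {margin:.1%}")  # roughly plus or minus 4 to 5 percentage points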
The practical difficulties of conducting any survey may introduce
nonsampling errors, such as difficulties interpreting a particular
question, which can introduce unwanted variability into the survey
results. We took steps to minimize nonsampling errors by pretesting
the questionnaire over the phone with officials from two school
districts and one state department of education in November and
December 2009. We conducted pretests to verify that (1) the questions
were clear and unambiguous, (2) terminology was used correctly, (3)
the questionnaire did not place an undue burden on officials, and (4)
the questionnaire was comprehensive and unbiased. An independent
reviewer within GAO also reviewed a draft of the questionnaire prior
to its administration. We made revisions to the questionnaire based on
feedback from the pretests and independent review before administering
the survey.
The survey-related data used in this report are based on the state and
school district responses to the survey questions.
Figure 15: GAO's Web-based Survey of State Departments of Education
and Local Educational Agencies in the 50 States and the District of
Columbia:
[Refer to PDF for image: survey]
3. To what extent, if at all, does your ______ rely on external
evidence-based research to inform curriculum decisions?
(Check only one answer)
1. Very large extent:
2. Large extent:
3. Moderate extent:
4. Small extent:
5. Does not use external evidence-based research:
6. Don't know:
4. How useful, if at all, are each of the following research evaluation
resources to you or your staff in identifying effective practices to
implement in your ______? (Please choose one response for each
resource.)
4a. Best Evidence Encyclopedia (Johns Hopkins University):
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
4b. Child Trends:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
4c. Coalition for Evidence Based Policy:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
4d. Doing What Works:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
4e. RAND's Promising Practices:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
4f. What Works Clearinghouse:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
4g. Other research synthesis clearinghouses:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5. Recognizing that research evaluation resources are not necessarily
the primary sources of information used to identify effective
education practices, GAO is also interested in the role of other
information sources.
How useful, if at all, are each of the following sources to you or your
staff in identifying effective practices to implement in your ______?
(Please choose one response for each resource.)
5a. Academic journals:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5b. Education-related periodicals:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5c. Online databases (ERIC or others):
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5d. University-based research institutions:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5e. Non-profit organizations:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5f. Associations of educators or researchers:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5g. Peer conferences:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5h. Regional Education Laboratories (Department of Ed):
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5i. Other federal outreach center:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5j. State government offices and/or outreach centers:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5k. Local data and/or internal research:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5l. Community and parent input:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5m. Mentors/Colleagues:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5n. Personal experience:
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
5o. Other resource (Please specify below):
Very useful:
Useful:
Somewhat useful:
Slightly useful:
Not at all useful:
No opinion:
Have not used this source of information:
Other resource:
7. Have you or your staff heard of the What Works Clearinghouse (WWC)?
(Check only one answer)
1. Yes:
2. No (Go to Section 3: Use of Doing What Works):
3. Don't know (Go to Section 3: Use of Doing What Works):
8. From which of the following source(s) did you or your staff hear
about the WWC?
(Please choose one response for each source.)
8a. Conferences:
Yes:
No:
Don't know:
8b. Peers:
Yes:
No:
Don't know:
8c. Regional Education Labs:
Yes:
No:
Don't know:
8d. U.S. Department of Education:
Yes:
No:
Don't know:
8e. Other source (Please specify below):
Yes:
No:
Don't know:
Other source:
9. How frequently, if at all, do you or your staff access the WWC
website? (Check only one answer)
1. Never (Go to question 28):
2. Less than twice a year (Go to question 10):
3. Between 2 and 6 times per year (Go to question 10):
4. Between 7 and 11 times per year (Go to question 11):
5. Monthly (Go to question 11):
6. More than once a month (Go to question 11):
10. You indicated that you or your staff access the WWC website less
than 7 times per year. Which of the following reasons best describes
why you and your staff do not access the website more frequently?
(Check only one answer)
1. Time Constraints:
2. Content is not relevant to our decisions:
3. Disagree with recommendations on the site:
4. Site is difficult to navigate:
5. Other reason (Please specify below) Other reason:
11. To what extent, if at all, have you or your staff used the WWC
website to inform decisions on effective education practices?
(Check only one answer)
1. Very large extent (Go to question 13):
2. Large extent (Go to question 13):
3. Moderate extent (Go to question 13):
4. Small extent (Go to question 13):
5. Have not used the WWC to inform any decisions (Go to question 12):
6. Don't know (Go to question 12):
13. To what extent, if at all, have you or your staff used information
in the WWC to do any of the following? (Please choose one response for
each action.)
13a. Inform professional development of teachers:
To a very large extent:
To a large extent:
To a moderate extent:
To a small extent:
Not at all:
Don't know:
13b. Advise ______s that are not making AYP on potential interventions:
To a very large extent:
To a large extent:
To a moderate extent:
To a small extent:
Not at all:
Don't know:
13c. Develop ______improvement plans:
To a very large extent:
To a large extent:
To a moderate extent:
To a small extent:
Not at all:
Don't know:
13d. Inform curriculum decisions:
To a very large extent:
To a large extent:
To a moderate extent:
To a small extent:
Not at all:
Don't know:
13e. Other use (Please specify below):
To a very large extent:
To a large extent:
To a moderate extent:
To a small extent:
Not at all:
Don't know:
Other use:
14. To what extent have you or your staff used the WWC's Intervention
Reports to inform decisions on effective education practices?
Intervention Reports provide an assessment of the efficacy of
interventions based on existing research that meets certain standards.
(Check only one answer)
1. Very large extent:
2. Large extent:
3. Moderate extent:
4. Small extent:
5. Don't know:
6. Have not used the WWC Intervention Reports to inform any decisions
(Go to question 18):
18. To what extent have you or your staff used the WWC's Practice
Guides to inform decisions on effective education practices?
Practice Guides are developed by a panel of experts and provide
recommendations to help educators address common classroom or school-
wide challenges. (Check only one answer)
1. Very large extent:
2. Large extent:
3. Moderate extent:
4. Small extent:
5. Don't know:
6. Have not used the WWC Practice Guides to inform any decisions (Go
to question 22):
22. To what extent have you or your staff used the WWC's Quick Reviews
to inform decisions on effective education practices?
Quick Reviews are designed to help educators and policy makers assess
the quality of recently released research papers and reports.
(Check only one answer)
1. Very large extent:
2. Large extent:
3. Moderate extent:
4. Small extent:
5. Don't know:
6. Have not used the WWC Quick Reviews to inform any decisions (Go to
question 25):
25. How likely or unlikely would you or your staff be to increase your
usage of the WWC if any of the following information were added to the
website? (Please choose one response for each type of information.)
25a. Additional topic areas:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25b. Additional practice guides:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25c. Additional studies reviewed:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25d. Additional information on interventions based on studies that may
not meet WWC standards:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25e. A broader definition of what studies meet WWC standards:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25f. Additional practices with positive reviews:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25g. Reviews of efficacy of programs being used or considered in
my ______:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25h. Reviews that are more timely:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25i. A greater number of intervention reports showing positive effects:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25j. An easier website to navigate:
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
25k. Other information (Please specify below):
Very Likely:
Somewhat Likely:
Neither likely nor unlikely:
Somewhat unlikely:
Very unlikely:
No opinion:
Other information:
Source: GAO survey of states' and school districts' use of educational
clearinghouses.
[End of figure]
The following tables contain the estimates and associated confidence
intervals for the data displayed in figures 8, 10, 12, and 14.
Table 5: Estimates and Confidence Intervals for Figure 8:
Q8. From which of the following source(s) did you or your staff hear
about the WWC?
Label: Conferences;
Response: Yes;
Percentage: 72.72;
Lower bound: 65.26;
Upper bound: 80.19.
Label: Conferences;
Response: No;
Percentage: 21.83;
Lower bound: 15.17;
Upper bound: 29.77.
Label: Conferences;
Response: Don't know;
Percentage: 5.45;
Lower bound: 2.36;
Upper bound: 10.50.
Label: Peers;
Response: Yes;
Percentage: 66.43;
Lower bound: 58.39;
Upper bound: 74.47.
Label: Peers;
Response: No;
Percentage: 27.80;
Lower bound: 20.22;
Upper bound: 35.37.
Label: Peers;
Response: Don't know;
Percentage: 5.77;
Lower bound: 2.32;
Upper bound: 11.62.
Label: Regional Education Labs;
Response: Yes;
Percentage: 47.14;
Lower bound: 38.40;
Upper bound: 55.89.
Label: Regional Education Labs;
Response: No;
Percentage: 45.24;
Lower bound: 36.51;
Upper bound: 53.97.
Label: Regional Education Labs;
Response: Don't know;
Percentage: 7.62;
Lower bound: 3.81;
Upper bound: 13.34.
Label: U.S. Department of Education;
Response: Yes;
Percentage: 71.93;
Lower bound: 64.41;
Upper bound: 79.46.
Label: U.S. Department of Education;
Response: No;
Percentage: 22.85;
Lower bound: 16.17;
Upper bound: 30.72.
Label: U.S. Department of Education;
Response: Don't know;
Percentage: 5.22;
Lower bound: 2.00;
Upper bound: 10.80.
Source: GAO analysis.
[End of table]
Table 6: Estimates and Confidence Intervals for Figure 10:
Q13: To what extent, if at all, have you or your staff used
information in the WWC to do any of the following?
Label: Inform professional development of teachers:
Response: To a very large extent;
Percentage: 12.33;
Lower bound: 6.24;
Upper bound: 21.15.
Response: To a large extent;
Percentage: 19.49;
Lower bound: 11.74;
Upper bound: 29.45.
Response: To a moderate extent;
Percentage: 38.12;
Lower bound: 28.25;
Upper bound: 48.00.
Response: To a small extent;
Percentage: 19.48;
Lower bound: 11.77;
Upper bound: 29.35.
Response: Not at all;
Percentage: 5.90;
Lower bound: 2.40;
Upper bound: 11.80.
Response: Don't know;
Percentage: 4.67;
Lower bound: 1.09;
Upper bound: 12.44.
Label: Advise schools that are not making adequate yearly progress on
potential interventions:
Response: To a very large extent;
Percentage: 13.23;
Lower bound: 6.90;
Upper bound: 22.23.
Response: To a large extent;
Percentage: 10.12;
Lower bound: 5.12;
Upper bound: 17.48.
Response: To a moderate extent;
Percentage: 28.21;
Lower bound: 18.82;
Upper bound: 39.24.
Response: To a small extent;
Percentage: 11.42;
Lower bound: 5.69;
Upper bound: 19.82.
Response: Not at all;
Percentage: 28.65;
Lower bound: 19.55;
Upper bound: 39.23.
Response: Don't know;
Percentage: 8.36;
Lower bound: 3.54;
Upper bound: 16.16.
Label: Develop school improvement plans:
Response: To a very large extent;
Percentage: 12.07;
Lower bound: 6.19;
Upper bound: 20.56.
Response: To a large extent;
Percentage: 14.79;
Lower bound: 7.59;
Upper bound: 24.99.
Response: To a moderate extent;
Percentage: 34.83;
Lower bound: 24.68;
Upper bound: 44.98.
Response: To a small extent;
Percentage: 19.15;
Lower bound: 11.72;
Upper bound: 28.63.
Response: Not at all;
Percentage: 13.37;
Lower bound: 7.35;
Upper bound: 21.73.
Response: Don't know;
Percentage: 5.79;
Lower bound: 1.80;
Upper bound: 13.36.
Label: Inform curriculum decisions:
Response: To a very large extent;
Percentage: 14.95;
Lower bound: 8.53;
Upper bound: 23.61.
Response: To a large extent;
Percentage: 21.99;
Lower bound: 14.15;
Upper bound: 31.65.
Response: To a moderate extent;
Percentage: 36.50;
Lower bound: 26.71;
Upper bound: 46.28.
Response: To a small extent;
Percentage: 19.50;
Lower bound: 12.02;
Upper bound: 29.00.
Response: Not at all;
Percentage: 3.31;
Lower bound: 0.92;
Upper bound: 8.23.
Response: Don't know;
Percentage: 3.75;
Lower bound: 0.63;
Upper bound: 11.47.
Source: GAO analysis.
[End of table]
Table 7: Estimates and Confidence Intervals for Figure 12:
Q14: To what extent have you or your staff used the WWC's intervention
reports to inform decisions on effective education practices?
Q18: To what extent have you or your staff used the WWC's practice
guides to inform decisions on effective education practices?
Q22: To what extent have you or your staff used the WWC's quick reviews
to inform decisions on effective education practices?
Label: Intervention reports:
Response: Very large or large extent;
Percentage: 20.98;
Lower bound: 13.66;
Upper bound: 29.99.
Response: Moderate extent;
Percentage: 29.42;
Lower bound: 20.13;
Upper bound: 38.71.
Response: Small extent;
Percentage: 23.99;
Lower bound: 15.26;
Upper bound: 34.67.
Response: Don't know;
Percentage: 4.60;
Lower bound: 1.48;
Upper bound: 10.51.
Response: Have not used the WWC intervention report to inform
decisions;
Percentage: 21.01;
Lower bound: 13.23;
Upper bound: 30.73.
Label: Practice guides:
Response: Very large or large extent;
Percentage: 10.06;
Lower bound: 5.28;
Upper bound: 16.95.
Response: Moderate extent;
Percentage: 27.78;
Lower bound: 18.84;
Upper bound: 38.23.
Response: Small extent;
Percentage: 16.77;
Lower bound: 10.12;
Upper bound: 25.44.
Response: Don't know;
Percentage: 2.91;
Lower bound: 0.70;
Upper bound: 7.75.
Response: Have not used the WWC practice guides to inform decisions;
Percentage: 42.48;
Lower bound: 32.26;
Upper bound: 52.70.
Label: Quick reviews:
Response: Very large or large extent;
Percentage: 12.93;
Lower bound: 6.66;
Upper bound: 21.92.
Response: Moderate extent;
Percentage: 16.66;
Lower bound: 9.79;
Upper bound: 25.70.
Response: Small extent;
Percentage: 24.65;
Lower bound: 15.93;
Upper bound: 35.21.
Response: Don't know;
Percentage: 5.78;
Lower bound: 2.35;
Upper bound: 11.55.
Response: Have not used the WWC quick reviews to inform decisions;
Percentage: 39.98;
Lower bound: 30.16;
Upper bound: 49.81.
Source: GAO analysis.
[End of table]
Table 8: Estimates and Confidence Intervals for Figure 14:
Q25: How likely or unlikely would you or your staff be to increase
your usage of the WWC if any of the following information were added
to the Web site?
Label: Reviews of efficacy of programs being used or considered in my
district;
Response: Very likely or somewhat likely;
Percentage: 67.92;
Lower bound: 59.77;
Upper bound: 76.06.
Label: A greater number of intervention reports showing positive
effects;
Response: Very likely or somewhat likely;
Percentage: 66.54;
Lower bound: 58.30;
Upper bound: 74.77.
Label: Additional practices with positive reviews;
Response: Very likely or somewhat likely;
Percentage: 60.49;
Lower bound: 51.95;
Upper bound: 69.03.
Label: Additional topic areas;
Response: Very likely or somewhat likely;
Percentage: 57.27;
Lower bound: 48.66;
Upper bound: 65.88.
Label: Additional information on interventions based on studies that
may not meet WWC standards;
Response: Very likely or somewhat likely;
Percentage: 55.97;
Lower bound: 47.43;
Upper bound: 64.51.
Label: Additional practice guides;
Response: Very likely or somewhat likely;
Percentage: 55.80;
Lower bound: 47.12;
Upper bound: 64.47.
Label: Additional studies reviewed;
Response: Very likely or somewhat likely;
Percentage: 54.89;
Lower bound: 46.24;
Upper bound: 63.53.
Label: Reviews that are more timely;
Response: Very likely or somewhat likely;
Percentage: 49.95;
Lower bound: 41.32;
Upper bound: 58.57.
Label: A broader definition of what studies meet WWC standards;
Response: Very likely or somewhat likely;
Percentage: 47.38;
Lower bound: 38.70;
Upper bound: 56.05.
Label: An easier Web site to navigate;
Response: Very likely or somewhat likely;
Percentage: 47.01;
Lower bound: 38.30;
Upper bound: 55.73.
Label: Other information;
Response: Very likely or somewhat likely;
Percentage: 17.37;
Lower bound: 8.69;
Upper bound: 29.58.
Source: GAO analysis.
[End of table]
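The lower and upper bounds in the tables above are 95 percent confidence intervals around each estimated percentage. As a rough illustration of how such bounds relate to an estimate, the sketch below computes a normal-approximation interval for a proportion from a simple random sample. This is a simplified sketch only: GAO's published bounds reflect its stratified sample design and survey weights, and the sample size and proportion shown here are hypothetical.

    # Illustrative only: normal-approximation 95 percent confidence
    # interval for a sample proportion. GAO's published bounds reflect
    # its stratified design and survey weights, so these numbers will
    # not reproduce the tables exactly.
    import math

    def proportion_ci(p_hat, n, z=1.96):
        """Return (lower, upper) bounds, in percent, for a sample proportion."""
        se = math.sqrt(p_hat * (1.0 - p_hat) / n)
        return 100.0 * max(0.0, p_hat - z * se), 100.0 * min(1.0, p_hat + z * se)

    # Hypothetical example: 38 percent of 100 responding districts.
    print(proportion_ci(0.38, 100))  # roughly (28.5, 47.5)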
Information from Teachers, Principals, and Researchers:
In addition to interviews with teacher, principal, and research
organizations, we obtained information about the usefulness of the WWC
by administering a questionnaire at four conferences of teachers and
principals. Table 9 provides more information about the conferences we
attended.
Table 9: Conferences Attended to Administer Questionnaires to Teachers
and Principals:
Conference: National Council of Teachers of Mathematics;
Regional/national: Regional;
Location: Nashville, Tennessee;
Conference dates: November 18-20, 2009;
Attendance dates: November 19-20, 2009.
Conference: National Council of Teachers of English;
Regional/national: National;
Location: Philadelphia, Pennsylvania;
Conference dates: November 19-24, 2009;
Attendance dates: November 21-22, 2009.
Conference: ASCD (formerly the Association for Supervision and
Curriculum Development);
Regional/national: National;
Location: San Antonio, Texas;
Conference dates: March 6-8, 2010;
Attendance dates: March 7-8, 2010.
Conference: National Association of Secondary School Principals;
Regional/national: National;
Location: Phoenix, Arizona;
Conference dates: March 12-14, 2010;
Attendance dates: March 12-13, 2010.
Source: GAO.
[End of table]
We selected these conferences because they reached segments of the
WWC's target population that our survey did not cover and because they
were held at times that coincided with our reporting time frames. At
each conference, the organizers allowed GAO to staff a table either
inside or just outside the exhibit hall. The questionnaires asked about
awareness and use of the WWC, including use of specific products and of
other information sources for identifying effective educational
practices. We also asked respondents who had not used the WWC to
specify why. The information gathered through the questionnaires is not
generalizable and does not represent the views of teachers and
principals nationwide.
[End of section]
Appendix II: Other Sources of Information Districts Use To Identify
Effective Education Practices:
Each entry below shows the estimated percentage of district officials
giving the rating indicated, with 95 percent confidence intervals in
parentheses.
Source of information: Personal Experience;
Very useful or useful: 80.6 (76.46, 84.79);
Somewhat or slightly useful: 16.6 (12.73, 20.53);
Not at all useful or no opinion: 2.3 (0.90, 4.67);
Not used: 0.5 (0.06, 1.67).
Source of information: Local data and/or internal research;
Very useful or useful: 77.4 (72.85, 81.84);
Somewhat or slightly useful: 14.9 (11.30, 19.13);
Not at all useful or no opinion: 6.2 (3.79, 9.58);
Not used: 1.5 (0.48, 3.55).
Source of information: Peer Conferences;
Very useful or useful: 77.3 (72.93, 81.74);
Somewhat or slightly useful: 16.7 (12.88, 20.58);
Not at all useful or no opinion: 4.2 (2.28, 7.14);
Not used: 1.7 (0.60, 3.73).
Source of information: Mentors/Colleagues;
Very useful or useful: 74.5 (69.97, 79.08);
Somewhat or slightly useful: 21.0 (16.74, 25.17);
Not at all useful or no opinion: 3.2 (1.53, 5.84);
Not used: 1.3 (0.42, 3.05).
Source of information: Education-related periodicals;
Very useful or useful: 70.9 (66.00, 75.72);
Somewhat or slightly useful: 23.0 (18.50, 27.50);
Not at all useful or no opinion: 3.2 (1.48, 6.02);
Not used: 2.9 (1.34, 5.49).
Source of information: Associations of educators or researchers;
Very useful or useful: 65.0 (59.94, 70.04);
Somewhat or slightly useful: 23.1 (18.59, 27.57);
Not at all useful or no opinion: 6.3 (3.83, 9.66);
Not used: 5.6 (3.39, 8.72).
Source of information: Academic journals;
Very useful or useful: 64.3 (59.22, 69.41);
Somewhat or slightly useful: 27.7 (22.90, 32.45);
Not at all useful or no opinion: 4.7 (2.59, 7.81);
Not used: 3.3 (1.64, 5.85).
Source of information: Online databases (ERIC or others);
Very useful or useful: 54.4 (49.10, 59.63);
Somewhat or slightly useful: 30.6 (25.62, 35.53);
Not at all useful or no opinion: 7.9 (5.12, 11.59);
Not used: 7.1 (4.58, 10.52).
Source of information: Community and parent input;
Very useful or useful: 51.7 (46.48, 56.97);
Somewhat or slightly useful: 39.9 (34.67, 45.06);
Not at all useful or no opinion: 6.2 (3.82, 9.53);
Not used: 2.2 (0.95, 4.19).
Source of information: University-based research institutions;
Very useful or useful: 50.3 (45.11, 55.49);
Somewhat or slightly useful: 35.4 (30.25, 40.45);
Not at all useful or no opinion: 9.3 (6.30, 13.16);
Not used: 5.0 (2.94, 7.98).
Source of information: Regional Educational Laboratories;
Very useful or useful: 45.9 (40.66, 51.03);
Somewhat or slightly useful: 31.3 (26.32, 36.18);
Not at all useful or no opinion: 11.5 (8.18, 15.59);
Not used: 11.4 (8.16, 15.36).
Source of information: State government offices and/or outreach centers;
Very useful or useful: 39.8 (34.68, 45.01);
Somewhat or slightly useful: 37.8 (32.64, 42.88);
Not at all useful or no opinion: 15.79 (11.87, 20.10);
Not used: 6.7 (4.33, 9.91).
Source of information: Other federal outreach centers;
Very useful or useful: 17.9 (13.99, 21.83);
Somewhat or slightly useful: 34.7 (29.65, 39.77);
Not at all useful or no opinion: 24.0 (19.41, 28.67);
Not used: 23.3 (18.82, 27.86).
Source of information: What Works Clearinghouse;
Very useful or useful: 24.4 (20.07, 28.76);
Somewhat or slightly useful: 14.8 (11.20, 19.09);
Not at all useful or no opinion: 16.1 (12.25, 20.57);
Not used: 44.7 (39.37, 49.96).
Source of information: Doing What Works;
Very useful or useful: 22.8 (18.40, 27.10);
Somewhat or slightly useful: 13.8 (10.42, 17.68);
Not at all useful or no opinion: 14.2 (10.52, 18.50);
Not used: 49.3 (44.03, 54.62).
Source of information: Non-profit organization;
Very useful or useful: 21.7 (17.63, 25.83);
Somewhat or slightly useful: 43.9 (38.58, 49.11);
Not at all useful or no opinion: 17.0 (13.06, 21.60);
Not used: 17.4 (13.22, 21.59).
Source of information: RAND's Promising Practices;
Very useful or useful: 10.1 (7.34, 13.36);
Somewhat or slightly useful: 14.0 (10.48, 18.09);
Not at all useful or no opinion: 18.9 (14.54, 23.25);
Not used: 57.1 (51.77, 62.40).
Source of information: Child Trends;
Very useful or useful: 9.7 (6.74, 13.29);
Somewhat or slightly useful: 14.5 (10.93, 18.77);
Not at all useful or no opinion: 16.73 (12.84, 21.25);
Not used: 59.1 (53.84, 64.33).
Source: GAO analysis of survey results from the following questions:
(4) How useful, if at all, are each of the following research
evaluation resources to you or your staff in identifying effective
practices to implement in your district?; and (5) Recognizing that
research evaluation resources are not necessarily the primary sources
of information used to identify effective education practices, GAO is
also interested in the role of other information sources. How useful,
if at all, are each of the following sources to you or your staff in
identifying effective practices to implement in your district?
[End of table]
[End of section]
Appendix III: IES and WWC Response to Expert Panel Recommendations:
Expert panel recommendation: 1. Full review. IES should commission a
full review of the WWC, including a review of the Clearinghouse's
mission and of the WWC practice guides, which the panel did not
attempt to evaluate. The panel also recommends that IES consider
instituting a regular review process to ensure that the WWC is using
the most appropriate standards in its work;
IES/WWC response: IES is considering an appropriate mechanism and time
for conducting a complete review of the WWC. IES believes that the
first 2 years of the current contract necessitated a tremendous
development effort: in year one, to transfer the infrastructure of the
Clearinghouse from one contractor to another, and in year two, to
complete, in a consistent manner, reviews that had begun under the
original contract. Now that the Clearinghouse is more clearly in the
production phase, this may be the appropriate time to plan for a
complete review;
Implementation status: Under consideration.
Expert panel recommendation: 2(i). Protocol templates. WWC should
develop standards for crossover and assignment noncompliance, and for
adjusting intention to treat effects across studies;
IES/WWC response: IES is currently considering having the WWC develop
a standard for assessing crossover compliance, following the process
recently used to revise its attrition standard; Currently, the WWC
documents crossover reported in studies. Principal investigators have
discretion to use this information to determine whether a study
represents a reasonable test of the intervention. Evidence of
crossover and assignment noncompliance is documented in the
intervention report and its appendix table A.1. Readers can use that
information to assess the findings. IES agrees there is value in
adjusting intent-to-treat effects for compliance, but believes this
adjustment is inconsistent with its goal of having the WWC be
transparent in how it reports findings. Making its own estimates to
account for compliance will lead to differences between what the WWC
reports and what is found in publicly available literature; Currently
the WWC does adjust for clustering when authors report their findings
incorrectly. However, the purpose of the clustering adjustment is to
correct for an analytic problem in the methods authors use to estimate
variances, which generally causes them to overstate the precision of
their findings. In contrast, adjusting for compliance will yield an
alternate estimate of effects that may differ from the one reported by
the study;
Implementation status: Under consideration.
Expert panel recommendation: 2(ii). Protocol templates. Develop
standards for documenting the program received in the control arm of
randomized experiments (or by members of the comparison group in quasi-
experimental designs), and potentially incorporating this information
in making comparisons across studies and/or interventions;
IES/WWC response: Though not based on a standard, WWC practice is for
reviewers to document the counterfactual in study review guides and in
intervention reports (the information is reported in appendix table
A.1). Reviewers routinely send author queries for this information, if
it is not provided in the study; IES has asked the WWC to assess how
other review organizations report counterfactual information and the
utility of incorporating this information into its reports. IES
officials are also considering an alternative approach that would code
information about the counterfactual in a study into the study
database, which then would generate summary tables that would report
results for studies that have similar counterfactuals. This approach
has downsides as well, since the set of counterfactuals could be quite
varied and many assumptions would have to be made to group
counterfactuals together. We are therefore proceeding cautiously in
making any changes to current WWC practice;
Implementation status: Considered but not planning to implement.
Expert panel recommendation: 2 (iii). Protocol templates. Revise
standards for multiple comparisons in light of the recent research
report by Peter Schochet entitled Guidelines for Multiple Testing in
Experimental Evaluations of Educational Intervention;
IES/WWC response: WWC staff consulted with Dr. Schochet to investigate
the possibility of revising the multiple comparison standards;
Dr. Schochet indicated that his report focused on issues related to
multiple comparisons within single studies; it did not address multiple
comparison issues that may arise when synthesizing evidence across a
set of studies. WWC procedures for handling multiple comparisons within
a study are consistent with his report;
Implementation status: Considered but not planning to implement.
Expert panel recommendation: 2(iv). Protocol templates. Reconsider the
current process of setting different attrition standards in different
topic areas;
IES/WWC response: At the time of the National Board of Education
Sciences Expert Panel's data collection, the WWC was already reviewing
its attrition standards. The WWC released new attrition standards in
December 2008 in the Procedures and Standards Handbook. The new
standards require a principal investigator in a topic area to choose
one of two well-specified attrition boundaries, and the standards
include guidance on how to choose between the boundaries based on the
nature of research in the topic area; The attrition discussion is in
the WWC Procedures and Standards Handbook (Version 2.0) posted on the
Clearinghouse's Web site;
Implementation status: Implemented.
Expert panel recommendation: 2(v). Protocol templates. Establish a
protocol to keep track of potential conflicts of interest, such as
cases where a study is funded or conducted by a program developer, and
consider making that information available in its reports;
IES/WWC response: IES is considering options for collecting and
documenting potential conflicts of interest; Sources of funding are
rarely included in published documents beyond government and
foundation support. An alternate source of information for tracking
potential conflicts of interest would be for the WWC to request that
study authors identify their source of funding, which would provide
the WWC with a basis for flagging a potential conflict of interest.
Any effort would depend on cooperation from authors because the WWC
has no leverage to formally require authors to declare potential
conflicts (which some academic journals require as a condition for
publication). WWC's experience to date is that study authors
frequently fail to respond to requests for additional information, and
IES officials expect that many study authors likewise will not respond
to requests for information about funding sources, or may judge that
it is not in their proprietary interest to provide the information.
Currently the WWC only queries authors in cases where the
Clearinghouse needs additional information. Querying all authors and
tracking their responses would increase costs for intervention reports;
Another potential option is to ask developers, when they are reviewing
the list of studies WWC found during the literature search for
comprehensiveness, to note any studies that they funded;
Implementation status: Under consideration.
Expert panel recommendation: 2(vi). Protocol templates. Define
precisely the standards for "randomization" in a multilevel setting;
IES/WWC response: The current version of the handbook gives guidance
on standards for random assignment in simple cases. The next version
of the handbook (forthcoming in 2010) will provide guidance and
examples for multilevel settings, with explicit guidance on acceptable
practice and potential issues with random assignment in a multilevel
setting;
Implementation status: Implemented.
Expert panel recommendation: 3. Documentation of search process. WWC
should expand the protocol templates to specify more explicit
documentation of the actual search process used in each topic area and
maintain a record of the results of the process that can be used to
guide decision making on future modifications;
IES/WWC response: IES asked the WWC to review the search process. The
WWC now takes steps to ensure that search records are maintained. Each
team and the library maintain a record of conducted searches. More
documentation on the process will be included in the forthcoming
revision of the handbook;
Implementation status: Implemented.
Expert panel recommendation: 4. Reliability of eligibility screening.
WWC should conduct regular studies of the reliability of the
eligibility screening process, using two independent screeners, and
use the results from these studies to refine the eligibility screening
rules and screening practices;
IES/WWC response: The WWC is undertaking a pilot using five recent
evidence reports in different topic areas. Because WWC screeners are
encouraged to pass to the next stage any study for which they are
uncertain about eligibility, the proportion of eligible studies that
are excluded is the salient error rate (the other source of error is
when screeners include an ineligible study in a review, but this error
is then corrected during the full review);
IES officials are not aware of any established standards for
acceptable error rates (there are tradeoffs between making Type I vs.
Type II errors relating to cost), but will examine this issue further.
If the screening error rate is larger than IES and the WWC believe is
acceptable, IES officials will assess whether additional training or
the use of two screeners is appropriate given the different costs and
benefits of each approach;
Implementation status: Under consideration.
Expert panel recommendation: 5. Documentation of screening process.
WWC reports should include a flow chart documenting the flow of
studies through each review and number of studies excluded at each
point, and a table of excluded studies, listing specific reasons for
exclusion for each study;
IES/WWC response: Currently, reference lists for WWC intervention
reports include all studies, both eligible and ineligible, located in
the search process. Ineligible studies are flagged with the primary
reason for not qualifying for further WWC review. Intervention reports
do not list materials such as product descriptions or reviews of
products that are deemed not relevant to the intervention being
reviewed; To make the number of studies (both eligible and ineligible)
more apparent to readers, the WWC will add a text box to
intervention reports located in front of the listing of reports. The
text box will summarize the number of studies that met different
conditions (this approach currently is used for reports in which none
of the studies meet standards). The box will serve the same purpose as
a flow chart but the codes used to describe the final status for
reports will be the same ones currently used in the citation appendix.
The WWC plans to begin including the text box in reports released in
2010 and thereafter;
Implementation status: Implemented.
Expert panel recommendation: 6. Misalignment adjustment. In cases
where a study analysis is "misaligned," WWC staff should request that
study authors reanalyze their data correctly, taking into account the
unit of randomization and clustering. The panel recommends that the
results from the process be compared to the adjustment procedure
currently specified, to develop evidence on the validity of the latter;
IES/WWC response: Ideally, the primary source for reanalyses of data
would be study authors. However, as noted above in response to
recommendation 2(v), it is common for authors not to respond to the
WWC's requests for additional information. Reanalyzing the data also
would require additional effort by the authors and would run into
difficulties when studies are dated or are based on data that has been
destroyed to comply with confidentiality or privacy restrictions; The
WWC recently undertook a survey of published clustering estimates. It
found that the WWC's current default clustering correction is
consistent with published estimates for achievement and behavioral
outcomes. The WWC will continue to monitor research developments on
this topic;
Implementation status: Considered but not planning to implement.
Expert panel recommendation: 7. Combining evidence across multiple
studies. WWC should re-evaluate its procedures for combining evidence
across studies, with specific attention to the issue of how the rules
for combining evidence can be optimally tuned, given the objectives of
the WWC review process and the sample sizes in typical studies for a
topic area;
IES/WWC response: There are, of course, many possible ways to
summarize evidence. Given its intended broad and primarily
nontechnical audience, the WWC's current approach is designed to be
transparent and easily explained. IES believes that having the WWC
conduct its own analyses to estimate intervention effects, as
statistical meta-analyses do, would be inconsistent with these goals;
However, as an alternative to modifying the WWC's main approach for
reporting findings, IES is considering having the WWC conduct
supplemental meta-analyses related to specific questions of interest,
and releasing these findings as a separate report that would
complement intervention reports. For example, a report could analyze
whether computer mediated approaches to teaching reading are more or
less effective than approaches that rely solely on teachers, based on
already-released intervention reports. Having a separate report
enables the WWC to continue using its current transparent approach,
while also using statistical techniques that combine evidence in other
ways;
Implementation status: Under consideration.
Expert panel recommendation: 8. Reporting. (i) Published reports on
the Web site should include the topic area protocols, as well as more
information on the screening process results that led to the set of
eligible studies actually summarized in the topic area reports. (ii)
WWC should make available its Standards and Procedures Handbook,
including appendixes, as well as all other relevant documents that
establish and document its policies and procedures;
IES/WWC response: Topic area protocols are available on the topic area
home pages; Just after the expert panel's report, the WWC released its
Procedures and Standards Handbook in December 2008. A revision
currently is under way that will include more detail on the screening
process. See the responses to recommendations 3 and 5 above regarding
the results of the screening process;
Implementation status: Implemented.
Expert panel recommendation: 9. Practice guides. Clearly separate
practice guides from the topic and intervention reports;
IES/WWC response: IES agrees that these products need to remain
distinct. Practice guides appear under their own tab on the Web site,
separate from intervention reports. The next revision of the
handbook (forthcoming in 2010) will include a chapter describing the
practice guide development process and how it is different from that
of the evidence reports; The recently released guide on What Works for
Practitioners also provides more information on reports and practice
guides, and the WWC is preparing a video tutorial that will explain
the differences to users;
Implementation status: Implemented.
Expert panel recommendation: 10. Outreach and collaboration with other
organizations. (i) The WWC should build and maintain a relationship
with national and international organizations focusing on systematic
reviews to engage in the broader scientific community and learn about
the latest standards and practices. (ii) The WWC should convene
working groups with a mixture of researchers (including specialists in
education research and systematic reviews) to address the development
of new standards for the review and synthesis of studies;
IES/WWC response: The WWC tries to keep abreast of developments in the
field, for example, by routinely checking materials from the Cochrane
Collaboration when developing new standards or approaches; Most
recently, the WWC has undertaken the following outreach efforts to
connect with other organizations conducting systematic reviews:
* The WWC sponsored a forum on research methods in December 2008 that
featured speakers from the National Academy of Sciences, the National
Cancer Institute, and the Cochrane Collaboration;
* In June 2009, WWC staff attended the Cochrane conference on practice
guides to learn about state-of-the-art methods in research synthesis
and practice guides;
* The WWC is presenting a workshop on WWC standards at the upcoming
annual conference of the Association of Public Policy and Management;
The WWC has also met with six international contacts (from Sweden,
Denmark, Hungary, England, the Inter-American Development Bank, and
Trinidad and Tobago) in response to inquiries about how governments or
organizations could implement their own clearinghouse operations;
Recently the WWC began a webinar series to disseminate its new
practice guides. The webinar includes researchers and practitioners in
its audience; The WWC convened two groups of researchers to develop
its forthcoming standards on single-subject designs and regression
discontinuity designs. It will continue to bring researchers together
as needs for new standards are identified;
Implementation status: Implemented.
Source: GAO analysis of IES and WWC data.
[End of table]
[End of section]
Appendix IV: Comments from the Department of Education:
United States Department Of Education:
Institute Of Education Sciences:
The Director:
555 New Jersey Ave., NW:
Washington, D.C. 20208:
July 6, 2010:
Ms. Cornelia M. Ashby:
Director, Education, Workforce, and Income Security Issues:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Ms. Ashby:
Thank you for the opportunity to respond to the Government
Accountability Office (GAO) draft report, Improved Dissemination and
Timely Product Release Would Enhance the Usefulness of the What Works
Clearinghouse (GAO-10-644).
This report indicates that GAO, as well as a congressionally mandated
panel of experts, found that the What Works Clearinghouse's review
process follows accepted standards for evaluating research on the
effectiveness of education interventions. It notes that the
Clearinghouse is responding to recommendations made by the expert
panel to further improve its review and reporting processes, and to
researchers who have criticized the WWC for excluding some types of
research designs that may be appropriate for evaluating certain
education programs, such as special education. It also recognized that
product output and scope of the WWC have increased.
While the report emphasized the need for more timely release of WWC
reports, it noted, but did not highlight, other important findings
from the school district survey that point to important avenues for
improvement. For example, a higher proportion of district respondents
cited having evidence reviews of programs that they might use in their
local areas (68 percent) or that show positive effects (67 percent) as
a reason to use WWC products than cited getting reviews more quickly
(50 percent) (Figure 15, page 36). Given that the timetable of WWC
reports is not posted publicly, we are not sure how to interpret
districts' concerns with timeliness, but we have heard their need for
more relevant evidence reports covering more educational areas and
programs. As GAO noted, a backlog of reports that built up during a
time of senior leadership turnover has been resolved, and many new
reports are soon to be released. In addition, the Institute of
Education Sciences is continuing its efforts to ensure that expansion
into new topic areas is responsive to the needs of the field.
We appreciate the opportunity to comment on the draft report. The
Department of Education has prepared the attached comments in response
to your draft report. If you have any questions regarding this
response, please contact Dr. Rebecca Maynard, Commissioner for
Education Evaluation and Regional Assistance, in the Institute of
Education Sciences at (202) 208-1289.
Sincerely,
Signed by:
John Q. Easton:
Director:
Attachments:
[End of letter]
Attachment 1:
IES Response to GAO Report: Improved Dissemination and Timely Product
Release Would Enhance the Usefulness of the What Works Clearinghouse,
Draft:
Recommendation 1:
To consistently release WWC products in a timely manner, we recommend
the Secretary of Education direct IES to develop and implement
strategies that help avoid future backlogs and ensure that IES' review
and approval processes keep pace with increased contractor production.
Strategies could include shifting IES resources to ensure sufficient
staff time for managing the peer review process and streamlining its
approval process.
Response:
The Department of Education (Department) agrees with GAO's
recommendation and is developing and implementing strategies to ensure
that IES' review and approval processes keep pace with production. The
Director of the Institute of Education Sciences has recently appointed
a Commissioner of the National Center for Education Evaluation and
Regional Assistance who will oversee the content and operations of the
What Works Clearinghouse.
Recommendation 2:
To better track the usefulness and costs of the WWC, we recommend that
the Secretary of Education direct IES to:
* Incorporate findings from its cost studies to develop performance
measures related to costs, such as identifying a range of acceptable
costs per product and using that information to monitor contractor
spending.
* Develop performance measures related to product usefulness and
periodically assess whether WWC products are meeting the needs of
target audiences, by gathering information on product usefulness in
the proposed survey or through other means.
Response:
First, as noted in the report, WWC is currently conducting assessments
for streamlining procedures and costs at IES' request. The results,
due at the end of October 2010, will be incorporated into future WWC
work plans and IES's monitoring of contractor performance and costs.
IES intends to undertake such assessments on an ongoing basis for
continuous improvement.
Second, the WWC will continue to monitor and evaluate consumer use and
assessment of WWC products, as well as gather information regarding
suggestions for improvements. In addition, the WWC will be included as
part of IES' review of its own performance, which will include
surveying customers on satisfaction.
In addition, the Clearinghouse already has in place a number of tools
for users to provide input, and we will continue to emphasize their
availability to the education community. The WWC Help Desk allows
users to contact the WWC to send suggestions and ask questions about
specific products. Users contact the Help Desk through an 800 number
or through the WWC Web site. Since 2007, the WWC has responded to over
2,100 inquiries to the Help Desk.
Users have suggested over 200 interventions for the WWC to review, and
have suggested over 30 different topics for practice guides. This
information is continuously reviewed by WWC staff and informs
priorities. The WWC has used Web-based tools for soliciting specific
feedback from users. For example, the WWC has posted potential
practice guide topics on a Web-based survey, having users "vote" for
their favorite topics. The forthcoming practice guides on Reading
Comprehension, Fractions, and Teaching Writing were all identified
through this process.
In 2007, 2008, and 2009, the WWC conducted focus groups to solicit
feedback from educators on several aspects of its work. These included
such topics as the usability of the WWC Web site, which led to
improvement to the site functionality, and the format of practice
guides, which led to changes in the layout of the guides.
Visits to the WWC Web site have increased substantially over time.
Between fiscal years 2008 and 2009, the number of visits rose by 45
percent to 772,000. And, to date in 2010, there have been 618,000
visits to the Web site, an increase of 21 percent relative to the same
period in fiscal year 2009. IES will continue to track all of these
measures to ensure that we are using solid information on customer use
and satisfaction.
Recommendation 3:
To reach more members of the target audience, we recommend the
Secretary of Education direct IES to assess and improve its
dissemination efforts to promote greater awareness and use of the WWC,
for example, by developing a way to inform school districts of new
products or encouraging educator professional development programs to
focus on research-based practices such as those discussed in practice
guides.
Response:
The Department agrees with GAO's recommendation and is developing
further means to assess and improve dissemination efforts. IES'
monitoring of the WWC contract requires implementation of an Annual
Communication Plan that specifies how the contractor will conduct
outreach, dissemination, and communication activities that include,
for example, the WWCFlash, media and trade organization outreach,
development of dissemination partnerships with organizations and the
education community, WWC sponsored events and forums, and targeted
product outreach. IES has made all WWC products available through the
Department's Educational Resources Information Center (ERIC), an
online digital library of education research and information, and
provides RSS feeds of WWC products to ERIC users. The Department's
National Library of Education makes all WWC products, together with
all items indexed in ERIC, available to Google, MSN, and Yahoo search
engines for harvesting and, therefore, accessible to Web users
worldwide. All WWC products are now available to WorldCat, an
international bibliographic database that is used by libraries around
the world to identify and locate resources.
IES will continue to develop and implement the Communication Plan
vehicle to assess and improve dissemination, as well as fully utilize
other Department dissemination resources. One example of a new vehicle
that the Clearinghouse has in the works to expand its dissemination
partnerships with organizations and the education community is a
monthly WWC Newsletter on up-to-date information on recently released
Clearinghouse products as well as those in line for release in the
month ahead.
[End of section]
Appendix V: GAO Contact and Staff Acknowledgments:
GAO Contact:
Cornelia M. Ashby, (202) 512-7215, ashbyc@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, individuals making key
contributions to this report include Elizabeth Morrison (Assistant
Director), Nagla'a El-Hodiri (analyst-in-charge), James Ashley, Carl
Barden, James Bennett, Valerie Caracelli, Laura Henry, Geoffrey King,
Jill Lacey, Luann Moy, Robert Owens, Cathy Roark, Stephanie Shipman,
Kate Van Gelder, and Craig Winslow.
[End of section]
Related GAO Products:
Program Evaluation: A Variety of Rigorous Methods Can Help Identify
Effective Interventions. [hyperlink,
http://www.gao.gov/products/GAO-10-30]. Washington, D.C.: November 23,
2009.
Teacher Quality: Sustained Coordination among Key Federal Education
Programs Could Enhance State Efforts to Improve Teacher Quality.
[hyperlink, http://www.gao.gov/products/GAO-09-593]. Washington, D.C.:
July 6, 2009.
Teacher Preparation: Multiple Federal Education Offices Support
Teacher Preparation for Instructing Students with Disabilities and
English Language Learners, but Systematic Departmentwide Coordination
Could Enhance This Assistance. [hyperlink,
http://www.gao.gov/products/GAO-09-573]. Washington, D.C.: July 20,
2009.
No Child Left Behind Act: Education Actions Could Improve the
Targeting of School Improvement Funds to Schools Most in Need of
Assistance. [hyperlink, http://www.gao.gov/products/GAO-08-380].
Washington, D.C.: February 29, 2008.
Program Evaluation: Strategies for Assessing How Information
Dissemination Contributes to Agency Goals. [hyperlink,
http://www.gao.gov/products/GAO-02-923]. Washington, D.C.: September
30, 2002.
The Evaluation Synthesis. [hyperlink,
http://www.gao.gov/products/PEMD-10.1.2]. Washington, D.C.: March 1992.
[End of section]
Footnotes:
[1] 20 U.S.C. §§ 6301-7941. The mission and functions to be performed
by IES are set out at 20 U.S.C. § 9511.
[2] This is the second 5-year contract for the Clearinghouse. The
first contract for about $27 million expired in 2007.
[3] H.R. Comm. on Appropriations, 111th Cong., Omnibus Appropriations
Act, 2009: Comm. Print of the Comm. on Appropriations U.S.
Representatives on H.R. 1105/Public Law 111-8, at 1483 (Comm. Print
2009).
[4] Thirty-seven states and the District of Columbia and 74 percent of
surveyed districts responded to our survey.
[5] WWC's mission is consistent with IES's broader mission to bring
rigorous and relevant research, evaluation, and statistics to the
nation's education system.
[6] The current priority for the 2006-2010 REL contract period is
providing policymakers and practitioners with expert advice, training,
and technical assistance on how to interpret the latest findings from
scientifically valid research pertaining to requirements of the ESEA.
[7] This contract was awarded to the American Institutes for Research
and the Campbell Collaboration. In 2007, Education awarded the second
5-year contract to Mathematica Policy Research, Inc. to operate the
WWC.
[8] The initial topic areas chosen in 2003 were: beginning reading, K-
12 math achievement, dropout prevention, adult literacy, peer-assisted
learning, reducing delinquency, and English language acquisition.
[9] IES dropped the peer-assisted learning topic area, which would
have covered interventions related to students learning with and from
other students--generally in the same class and at a similar academic
level.
[10] The WWC intervention reports primarily focus on branded products.
[11] In June 2010 IES made public its standards for two additional
study designs: regression discontinuity and single case design studies.
[12] IES uses some of this information to determine a performance-
based award for the contractor.
[13] GAO, Program Evaluation: A Variety of Rigorous Methods Can Help
Identify Effective Interventions, [hyperlink,
http://www.gao.gov/products/GAO-10-30] (Washington, D.C.: Nov. 23,
2009).
[14] For example, WWC rates the credibility of a study's evidence
along a continuum.
[15] To help educators understand the research behind a WWC report,
the WWC (1) combines information on the size and number of studies
reviewed to rate the extent of evidence as small or medium/large; (2)
includes an overall rating of effectiveness on each measured outcome,
which combines the size and direction of effects, statistical
significance, and the quality of the research designs; and (3) reports
the average improvement index across studies as the expected change in
percentile rank for an average control group student if the student
had received the intervention.
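The improvement index mentioned in footnote 15 can be thought of as converting a standardized effect size into a change in percentile rank using the standard normal distribution. The sketch below illustrates that general idea; it is a simplified illustration rather than the WWC's exact computation, and the effect size used is hypothetical.

    # Illustrative conversion of a standardized effect size into an
    # improvement index (expected change in percentile rank for an
    # average control group student). Simplified sketch, not the WWC's
    # exact computation; the effect size below is hypothetical.
    from statistics import NormalDist

    def improvement_index(effect_size):
        """Percentile rank implied by the effect size, minus 50."""
        return 100.0 * NormalDist().cdf(effect_size) - 50.0

    print(round(improvement_index(0.25), 1))  # about +9.9 percentile points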
[16] H. Brown, D. Card, K. Dickersin, J. Greenhouse, J. Kling, and J.
Littell, Expert Report on the What Works Clearinghouse, a report
prepared by the National Board for Education Sciences, 2008. The
expert panel was convened by the National Board for Education Sciences
in 2008 in response to the Senate Appropriations Committee. S. Rep.
No. 110-410 at 228-29. The National Board for Education Sciences,
consisting of 15 voting members appointed by the President with the
advice and consent of the Senate, provides IES guidance and oversight.
20 U.S.C. § 9516. The mandate directed the Board to convene leading
experts in rigorous evaluations to assess the WWC, specified that
panel members should be free of conflicts of interest and that a
report with any recommendations was to be submitted within 4 months.
Expert panel members included economists, statisticians, and
professors with expertise in other systematic review efforts in the
fields of health care and social policy.
[17] The WWC considered but is not implementing three of the panel's
recommendations. One recommendation suggested the WWC develop
standards for documenting the program received by the comparison group
that did not receive the intervention and potentially incorporating
this information when making comparisons across studies and/or
interventions. The other two related to WWC procedures for combining
evidence across studies and asking study authors to reanalyze their
data to correct a common error associated with the use of classrooms
rather than individual students in data analysis. This mismatch can
result in an overstatement of the statistical significance of the
effects of the intervention. The WWC maintains that its current
procedures are consistent with standard practices and has elected not
to ask authors to reanalyze their data. See appendix III for more
detail.
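The mismatch described in footnote 17 overstates statistical significance because students in the same classroom tend to have correlated outcomes, so analyzing them as independent observations understates the variance of the estimated effect. A textbook way to gauge the overstatement is to deflate the reported t-statistic by the square root of the design effect, 1 + (m - 1) x rho, where m is the average cluster size and rho is the intraclass correlation. The sketch below shows that standard adjustment; it is not necessarily the WWC's exact correction, and the values used are hypothetical.

    # Textbook design-effect adjustment for a mismatched analysis
    # (classrooms assigned, students analyzed as if independent).
    # Not necessarily the WWC's exact correction; values are hypothetical.
    import math

    def adjusted_t(t_reported, avg_cluster_size, icc):
        """Deflate a reported t-statistic by the square root of the design effect."""
        design_effect = 1.0 + (avg_cluster_size - 1.0) * icc
        return t_reported / math.sqrt(design_effect)

    # A reported t-statistic of 2.4 with 20 students per classroom and an
    # intraclass correlation of 0.20 falls to about 1.1 after adjustment.
    print(round(adjusted_t(2.4, 20, 0.20), 2))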
[18] The handbook documents the actions that WWC staff must take when
reviewing research and the items that must be included in the reports,
among other things. It is available at [hyperlink,
http://ies.ed.gov/ncee/wwc/pdf/wwc_procedures_v2_standards_handbook.pdf]
[19] WWC requires reviewers to select one of two levels of attrition
(higher or lower) depending on the topic area and context. WWC allows
a higher level of attrition for topic areas in which it assumes that
attrition is due to factors that are not strongly related to the
intervention. WWC allows a lower level of attrition for topic areas in
which attrition may be due to certain individuals choosing not to
participate in the intervention.
[20] In cases in which a study of a program is funded by the program
developer, the study authors may have incentives to find positive
effects of the program. Such incentives could call the validity of the
study's results into question.
[21] For example, see Robert Slavin and Dewi Smith, "The Relationship
Between Sample Sizes and Effect Sizes in Systematic Reviews in
Education," Educational Evaluation and Policy Analysis, vol. 31, no. 4
(2009): 500-506.
[22] WWC staff also contend that there is no statistical basis for
setting a minimum sample size and doing so would arbitrarily ignore
available evidence and potentially bias the findings of a systematic
review.
[23] Robert E. Slavin and Nancy A. Madden, "Measures Inherent to
Treatments in Program Effectiveness Reviews," paper presented at the
annual meetings of the Society for Research on Effective Education,
Crystal City, Virginia, March 3-4, 2008; and Robert E. Slavin, "What
Works? Issues in Synthesizing Educational Program Evaluations,"
Educational Researcher, vol. 37, no. 1 (Jan./Feb. 2008): 5.
[24] Intervention developers may intentionally or unintentionally
create a test that is more likely to favor the intervention because
they have financial or other interests in the success of the
intervention.
[25] While randomized control trials and quasi-experiments are
considered to be rigorous approaches in assessing program
effectiveness, they are not the only rigorous research designs
available and may not always be appropriate. For example, such
comparison group designs may not be appropriate for research on small
numbers of students receiving special education services in a self-
contained classroom. In such a case, an in-depth case study may be
more appropriate. Examples of other research methods include
statistical analyses of observational data, such as student records,
or analyses of surveys of an intervention's participants.
[26] The WWC produces intervention reports noting when no studies meet
standards.
[27] Specifically, the WWC developed standards--which were made
publicly available in June 2010--for reviewing single-case and
regression discontinuity designs. The WWC anticipates reviewing many
studies with single-case designs--studies that involve repeated
measurement of a single subject (e.g., a student or a classroom)--as
it evaluates interventions for special education. Regression
discontinuity designs compare outcomes for a treatment and control
group that are formed based on the results of a preintervention
measure.
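As a concrete illustration of the regression discontinuity logic described in footnote 27, the sketch below simulates students who receive an intervention when a pretest score falls below a cutoff and estimates the effect as the jump in outcomes at the cutoff, using separate linear fits on each side within a bandwidth. It is a toy illustration of the design, not the WWC's review standard or a prescribed estimator, and all values are hypothetical.

    # Toy regression discontinuity illustration: students scoring below a
    # pretest cutoff receive the intervention; the effect is estimated as
    # the jump in outcomes at the cutoff. Hypothetical data only.
    import numpy as np

    rng = np.random.default_rng(0)
    cutoff = 50.0
    pretest = rng.uniform(20, 80, 2000)
    treated = pretest < cutoff                      # assignment rule
    outcome = 0.6 * pretest + 8.0 * treated + rng.normal(0, 5, 2000)

    bandwidth = 10.0
    window = np.abs(pretest - cutoff) <= bandwidth  # observations near the cutoff

    def value_at_cutoff(side):
        """Linear fit of outcome on pretest within the window, evaluated at the cutoff."""
        keep = window & side
        slope, intercept = np.polyfit(pretest[keep], outcome[keep], 1)
        return slope * cutoff + intercept

    effect = value_at_cutoff(treated) - value_at_cutoff(~treated)
    print(round(effect, 1))  # close to the simulated effect of 8.0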
[28] The WWC has published 12 practice guides as of May 2010. The
topics of these 12 practice guides are using data to support decision
making, helping students navigate the path to college, structuring out-
of-school time, assisting students in math, assisting students in
reading, reducing behavior problems, dropout prevention, improving
literacy, turning around low-performing schools, instruction for
English language learners, encouraging girls in math and science, and
organizing instruction and study.
[29] IES requires the contractor to file WWC annual plans that outline
planned product releases and other deliverables. IES and the
contractor update these plans once a year with revised estimates.
[30] Under the first contract (2002 to 2007), the WWC released 89
intervention reports, six topic reports, and three practice guides.
[31] Pub. L. No.111-5, 123 Stat. 115, 182.
[32] The current contract also requires WWC to create and maintain
other resources on its Web site, such as registries of researchers and
randomized trials and the WWC Policy and Procedures Handbook. IES
noted that these deliverables are either new or significantly enhanced
from those produced under the first WWC contract.
[33] IES indicated that the amount of research available meeting WWC
inclusion standards for a given report varied and had an impact on the
number of staff hours required in the production of reports. Reports
based on larger numbers of studies took more staff hours to complete
than those based on less available evidence.
[34] We discussed the backlog and its causes with IES officials in
February and May 2010. For the first six months of 2010, IES completed
the review of 59 report products (intervention reports and quick
reviews)--compared to 46 for the entirety of calendar year 2009--thus
eliminating the backlog.
[35] IES also spends about $200,000 per year on noncontracted WWC
expenses--including internal salaries, independent peer review
honorariums, and Web site support--which have not changed
significantly between the two 5-year contracts. In addition, three
practice guides were completed outside of the WWC contract, at a total
cost of about $319,000. IES noted that these preliminary guides were
produced through a less thorough process than the current process.
[36] Both contractors dedicated the same proportion of funds to WWC
products. The first contractor primarily published products in the
final year of the first contract (2007); however, products were
produced, reviewed, and modified--but not published--prior to that
year. As a result, despite limited publication in the first 4 years, a
large portion of the first contractor's expenditures were designated
for direct product costs.
[37] According to WWC staff, this system allows them to use results
from prior literature searches for related topics, rather than
conducting new searches. The current contractor also designed and
implemented standardized training for staff and subcontractors who
evaluate research. All WWC research evaluators complete 2 days of
training; are tested on WWC products, review standards, and policy;
and have their initial reviews monitored before working independently.
[38] The Organizational Assessment--Education's performance management
system--was developed in response to the requirements of Executive
Order 13450, Improving Government Program Performance, as well as the
Office of Personnel Management's requirement that each federal agency
evaluate its principal offices on an annual basis.
[39] The contract award fee plan includes performance measures related
to production, business management of the contract, and timeliness.
Business management of the contract includes cost management, business
relationships, efforts to meet small business subcontracting goals,
and accurate billing. These measures are linked to work specified in
the WWC contract and annual plans.
[40] While the WWC annually exceeded performance targets, it is
difficult to interpret these results because the performance measures
changed annually and, according to IES officials, the criteria for
meeting them were negotiated well into the fiscal year.
[41] In addition, IES determined that the current and prior
contractors generally met the award fee plan performance measures.
[42] The WWC's award fee plan includes cost management components but
has no cost per product measurements.
[43] This survey would ascertain whether IES has met its goal that at
least 25 percent of decision makers surveyed will have consulted the
Clearinghouse prior to making decisions on interventions in reading,
writing, mathematics, science, or teacher quality by 2013.
[44] WWC staff told us that the listserv had over 10,000 subscribers
and that Web site visits increase after conferences.
[45] IES's Web site hosts an "Ask A REL" page, where educators can
submit questions. "Ask A REL" is described on the Web site as being a
collaborative reference desk service provided by the 10 RELs that
functions much in the same way as a technical reference library.
[46] Thirty-seven states and the District of Columbia responded to our
survey. While the District of Columbia is not a state, we will refer
to the survey respondents as representing 38 states.
[47] The 95 percent confidence interval for this estimate is (36.7,
46.6).
[48] While based on 38 state-level respondents, this analysis provides
the minimum number (33) of states (overall) that have accessed the WWC
Web site. Regardless of whether or not the 13 states that did not
respond to our survey have accessed the Clearinghouse Web site, 33 is
about two-thirds of the 51 states and constitutes a majority of states.
[49] Our survey asked respondents to indicate the number of times they
had accessed the WWC Web site. Answer choices included never; less
than twice a year; between 2 and 6 times per year; between 7 and 11
times per year; monthly; and more than once a month.
[50] The 95 percent confidence interval for this estimate is (29.4,
38.8).
[51] The 95 percent confidence interval for this estimate is (8.3,
14.8).
[52] Fourteen states reported accessing the Clearinghouse six or fewer
times a year, as did an estimated 72 percent of districts. In addition
to time constraints (cited by 7 of the 14 states), five states
reported that they did not access the WWC more frequently because its
content was not relevant to their decisions.
[53] The 95 percent confidence interval for this estimate is (94.2,
98.4).
[54] Our survey asked states and districts to report how useful
certain sources of information were in identifying effective education
practices. The sources ranged from general (personal experience,
education periodicals) to specific (RELs, federal outreach centers).
In addition, we listed several research synthesis organizations by
name, including the WWC and the Best Evidence Encyclopedia. See
appendix II for more details.
[55] Between November 2009 and March 2010, we attended a regional
conference of the National Council of Teachers of Mathematics in
Nashville, Tenn., as well as three national conferences: the National
Council of Teachers of English, ASCD (formerly the Association for
Supervision and Curriculum Development), and the National Association
of Secondary School Principals.
[56] One state official responded that he and his staff had not used
the WWC to inform any decisions on effective education practices,
while another state official responded "Don't know."
[57] The 95 percent confidence intervals for these estimates are
(55.5, 87.7) and (10.3, 29.8), respectively.
[58] The estimates and their 95 percent confidence intervals were as
follows: inform curriculum decisions--93 percent (85.5, 97.3); inform
professional development of teachers--89.4 percent (81.7, 94.7).
[59] The 95 percent confidence interval for this estimate is (64.3,
82.8).
[60] The estimates and their 95 percent confidence intervals were as
follows: practice guides--54.6 percent (44.4, 64.8); quick reviews--
54.2 percent (44.4, 64.3).
[61] The 95 percent confidence interval for this estimate is (13.7,
30.0).
[62] The 95 percent confidence interval for this estimate is (5.3, 17.0).
[63] Twenty-six states that had used the Clearinghouse reported that
they would be somewhat likely or very likely to increase their use of
the WWC if it had a greater number of intervention reports showing
positive effects--a number that depends both on how many interventions
the WWC reviews and on whether the available research meeting WWC
standards shows positive effects.
[64] The 95 percent confidence interval for this estimate is (59.8,
76.1).
[65] The 95 percent confidence interval for this estimate is (41.3,
58.6).
[66] GAO, Program Evaluation: A Variety of Rigorous Methods Can Help
Identify Effective Interventions, [hyperlink,
http://www.gao.gov/products/GAO-10-30] (Washington, D.C.: Nov. 23,
2009).
[67] The expert panel was convened by the National Board for Education
Sciences in 2008 in response to a mandate from the Senate
Appropriations Committee. The National Board for Education Sciences,
consisting of 15 voting members appointed by the President with the
advice and consent of the Senate, provides IES guidance and oversight.
The mandate directed the board to convene leading experts in rigorous
evaluations to assess the WWC and specified that panel members should
be free of conflicts of interest. Expert panel members included
economists, statisticians, and professors with expertise in other
systematic review efforts in the fields of health care, social policy,
and education.
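Note on the confidence intervals for district estimates reported in the
notes above: GAO's intervals reflect the design and weighting of its
probability sample of school districts, which are not reproduced here.
As a rough illustration only, the sketch below shows how a two-sided 95
percent interval for a survey proportion is commonly approximated; the
sample size and design effect in the example are hypothetical
assumptions, not GAO's actual survey parameters.

    # Python sketch (illustrative only): an approximate 95 percent
    # confidence interval for a survey proportion. The sample size and
    # design effect below are hypothetical, not GAO's actual survey
    # parameters.
    import math

    def proportion_ci(p_hat, n, design_effect=1.0, z=1.96):
        """Return an approximate (lower, upper) 95 percent interval, in percent."""
        se = math.sqrt(p_hat * (1.0 - p_hat) / n) * math.sqrt(design_effect)
        lower = max(0.0, p_hat - z * se)
        upper = min(1.0, p_hat + z * se)
        return round(lower * 100, 1), round(upper * 100, 1)

    # Example: a 42 percent estimate from a hypothetical sample of 400
    # responding districts with an assumed design effect of 1.1.
    print(proportion_ci(0.42, 400, design_effect=1.1))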
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation, and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: