No Child Left Behind Act

Education Could Do More to Help States Better Define Graduation Rates and Improve Knowledge about Intervention Strategies

GAO ID: GAO-05-879

September 20, 2005

About one third of students entering high school do not graduate and face limited job prospects. The No Child Left Behind Act (NCLBA) requires states to use graduation rates to measure how well students are being educated. To assess the accuracy of states' graduation rates and to review programs that may increase these rates, GAO was asked to examine (1) the graduation rate definitions states use and how the Department of Education (Education) helped states meet legal requirements, (2) the factors that affect the accuracy of graduation rates and Education's role in ensuring accurate data, and (3) interventions with the potential to increase graduation rates and how Education enhanced and disseminated knowledge of intervention research.

As of July 2005, 12 states used a graduation rate definition--referred to as the cohort definition--that tracks students from when they enter high school to when they leave, and by school year 2007-08 a majority plan to use this definition. Thirty-two states used a definition based primarily on the number of dropouts over a 4-year period and graduates. The remaining states used other definitions. Because the cohort definition is more precise, most states not using it planned to do so when their data systems can track students over time, a capability many states do not have. Education has assisted states primarily on a case-by-case basis, but it has not provided guidance to all states on ways to account for selected students, such as for students with disabilities, thus creating less consistency among states in how graduation rates are calculated. The primary factor affecting the accuracy of graduation rates was student mobility. Students who come and go make it difficult to keep accurate records. Another factor was whether states verified student data, with fewer than half of the states conducting audits of data used to calculate graduation rates. Data inaccuracies can substantially raise or lower a school's graduation rate. Education has taken steps to help states address data accuracy issues. However, Education officials said that they could not assess state systems until they had been in place for a while. Data accuracy is critical, particularly since Education is using state data to calculate graduation rate estimates to provide consistency across states. Many interventions are used to raise graduation rates, but few are rigorously evaluated. GAO identified five that had been rigorously evaluated and showed potential for improving graduation rates, such as Project GRAD. In visits to six states, GAO visited three schools that were using such interventions. 
Other schools GAO visited were using interventions that experts and officials considered promising; these focused on issues such as self-esteem and literacy at various grade levels. Education has not acted on GAO's 2002 recommendation that it evaluate intervention research, a recommendation the agency agreed with, and has done little to disseminate such research.
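The summary's point that data inaccuracies can substantially raise or lower a reported graduation rate can be illustrated with a short sketch. The formula below is a simplified stand-in for the dropout-based ("departure classification") calculations the report describes, and the school figures are invented for illustration; no state's actual definition is this simple:

```python
def departure_style_rate(graduates, counted_dropouts):
    """Simplified dropout-based rate: graduates divided by graduates
    plus the dropouts counted over the 4-year period. (Illustrative
    sketch only; actual state definitions vary.)"""
    return 100.0 * graduates / (graduates + counted_dropouts)

# Invented school: 300 graduates and 100 actual dropouts.
accurate = departure_style_rate(300, 100)     # 75.0 percent
# If half the dropouts are misrecorded (for example, as transfers),
# the reported rate rises by more than 10 percentage points.
undercounted = departure_style_rate(300, 50)  # about 85.7 percent
```

The gap between the two results shows why the report treats verification of dropout counts as central to data quality: the formula itself is trivial, so the reported rate is only as good as the counts fed into it.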

This is the accessible text file for GAO report number GAO-05-879, entitled 'No Child Left Behind Act: Education Could Do More to Help States Better Define Graduation Rates and Improve Knowledge about Intervention Strategies,' which was released on September 21, 2005.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version.

We welcome your feedback. Please e-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.
Report to Congressional Requesters:

United States Government Accountability Office:

GAO:

September 2005:

No Child Left Behind Act: Education Could Do More to Help States Better Define Graduation Rates and Improve Knowledge about Intervention Strategies:

GAO-05-879:

GAO Highlights:

Highlights of GAO-05-879, a report to congressional requesters:

Why GAO Did This Study:

About a third of students entering high school do not graduate and face limited job prospects. The No Child Left Behind Act requires states to use graduation rates to measure how well students are educated. To assess the accuracy of states' rates and to review programs that may increase rates, GAO was asked to examine (1) the graduation rate definitions states use and how the Department of Education (Education) helped states meet legal requirements, (2) the factors that affect the accuracy of states' rates and Education's role in ensuring accurate data, and (3) interventions with the potential to increase graduation rates and how Education enhanced and disseminated knowledge of intervention research.

What GAO Found:

As of July 2005, 12 states used a graduation rate definition--referred to as the cohort definition--that tracks students from when they enter high school to when they leave, and by school year 2007-08 a majority plan to use this definition. Thirty-two states used a definition based primarily on the number of dropouts over a 4-year period and graduates. The remaining states used other definitions. Because the cohort definition is more precise, most states not using it planned to do so when their data systems can track students over time, a capability many states do not have. Education has assisted states primarily on a case-by-case basis, but it has not provided guidance to all states on ways to account for selected students, such as for students with disabilities, thus creating less consistency among states in how graduation rates are calculated.
States' Planned Definitions by School Year 2007-08:

[See PDF for image]

[End of figure]

The primary factor affecting the accuracy of graduation rates was student mobility. Students who come and go make it difficult to keep accurate records. Another factor was whether states verified student data, with fewer than half of the states conducting audits of data used to calculate graduation rates. Data inaccuracies can substantially raise or lower a school's graduation rate. Education has taken steps to help states address data accuracy issues. However, Education officials said that they could not assess state systems until they had been in place for a while. Data accuracy is critical, particularly since Education is using state data to calculate graduation rate estimates to provide consistency across states. Many interventions are used to raise graduation rates, but few are rigorously evaluated. GAO identified five that had been rigorously evaluated and showed potential for improving graduation rates, such as Project GRAD. In visits to six states, GAO visited three schools that were using such interventions. Other schools GAO visited were using interventions considered by experts and officials to show promise and focused on issues such as self-esteem and literacy at various grades. Education has not acted on GAO's 2002 recommendation that it evaluate intervention research, a recommendation the agency agreed with, and has done little to disseminate such research.

What GAO Recommends:

GAO recommends Education provide information to all states on ways to account for different types of students in graduation rate calculations, assess the reliability of state data used to calculate interim rates, and establish a timetable to implement the recommendation in GAO's 2002 report to evaluate research and also to disseminate such research. Education agreed with GAO's recommendations on accounting for different types of students and the need for research.
On GAO's other recommendation, Education noted steps it was taking to assess data reliability, though it is unclear that such steps address data to be used for interim rates.

www.gao.gov/cgi-bin/getrpt?GAO-05-879:

To view the full product, including the scope and methodology, click on the link above. For more information, contact Marnie S. Shaul at (202) 512-7215 or shaulm@gao.gov.

[End of section]

Contents:

Letter:
Results in Brief:
Background:
Many States Moving toward Using A Definition That Follows Students over Time; Education's Guidance Regarding NCLBA Requirements Is Limited:
Several Factors Affected the Accuracy of Graduation Rates, and Data Quality Remains a Key Challenge:
Few Interventions Have Been Rigorously Evaluated, and Education Has Done Little to Evaluate and Disseminate Existing Research:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Objectives, Scope, and Methodology:
Appendix II: Comments from the Department of Education:
Appendix III: GAO Contact and Staff Acknowledgments:

Related GAO Products:
Bibliography:

Tables:
Table 1: Number of States That Allow Schools to Maintain Previous Year's Rate or Show Progress toward Graduation Rate Targets to Make AYP, as of July 2005:
Table 2: Number of Interventions Visited by School Level and Type:
Table 3: Key Features of the Check and Connect Model:
Table 4: States Selected for Site Visits and Phone Interviews by Purpose:

Figures:
Figure 1: Student Mobility and Graduation Outcome for a Hypothetical High School Class:
Figure 2: Cohort Formula Definition:
Figure 3: Departure Classification Definition:
Figure 4: Definitions by State, as of April 2005, and Planned to Use by State, School Year 2007-08:
Figure 5: State Graduation Rate Targets, as of July 2005:
Figure 6: Estimated School Graduation Rates under Varying Assumptions of Errors in Counting Dropouts:
Figure 7: Project GRAD Structural Model:
Figure 8: Aviation High School Presentation by the Blue Angels:

Abbreviations:

AYP: Adequate Yearly Progress
GED: General Education Development
ESOL: English for Speakers of Other Languages
HOSTS: Help One Student to Succeed
IASA: Improving America's Schools Act of 1994
NCES: National Center for Education Statistics
NCLBA: No Child Left Behind Act of 2001
Project GRAD: Project Graduation Really Achieves Dreams

United States Government Accountability Office:
Washington, DC 20548:

September 20, 2005:

The Honorable Edward M. Kennedy:
Ranking Minority Member:
Committee on Health, Education, Labor, and Pensions:
United States Senate:

The Honorable Lamar Alexander:
Chairman:
Subcommittee on Education and Early Childhood Development:
Committee on Health, Education, Labor, and Pensions:
United States Senate:

The Honorable Jeff Bingaman:
United States Senate:

The Honorable Patty Murray:
United States Senate:

The Honorable Olympia J. Snowe:
United States Senate:

About one third of students who enter high school do not graduate and face limited job opportunities. The No Child Left Behind Act of 2001 (NCLBA) was passed in part to increase the likelihood that all of the 48 million students in our nation's public school systems will graduate and requires states to use high school graduation rates, along with test scores, to assess how much progress high schools are making in educating their students. Graduation rates--used in conjunction with test scores--provide a more complete picture of school performance than test scores alone, because a school's test proficiency rate will be higher if low-performing students drop out and do not have their scores included with their peers. Graduation rates are used as part of the determination about whether schools meet federal requirements for school progress. If schools do not meet such requirements, their students may be eligible to transfer to another school or receive tutoring.
Currently, the Department of Education (Education), National Governors Association, and several national education organizations and foundations are working on high school reform initiatives to address issues, such as school structure and curriculum, which may help low-performing students and increase the likelihood of graduation. In addition, our 2002 report on high school dropouts identified the need for better information on the success of interventions designed to increase the likelihood of students staying in school until they graduate.[Footnote 1]

NCLBA defines graduation rates as the percentage of students who graduate from high school with a regular diploma in the standard number of years. Education's regulations do not permit states to count an alternative degree that is not fully aligned with the state's academic standards, such as a certificate of attendance or a General Educational Development certificate (GED). Each state has flexibility, however, in determining how its graduation rate will be specifically calculated as long as the rate is, as the law requires, "valid and reliable."

In response to congressional requests, we are providing information on: (1) the definitions states have developed for graduation rates and how Education supports states in meeting the law's requirements for defining and measuring graduation rates; (2) the factors, such as student mobility, that affect the accuracy of the data used to calculate graduation rates for all students and those in designated groups, and what Education does to ensure accuracy of rates reported by states; and (3) what is known about the success of interventions with the potential to increase graduation rates and how Education has enhanced and disseminated knowledge about these practices. To address these objectives, we used a variety of methodological approaches.
We analyzed the plans states were required to submit to Education to identify the graduation rate definitions states used and graduation rate goals set by states, reviewed updates to plans through July 2005, and letters from Education to states regarding its decisions about state plans and updates. We also surveyed officials in 50 states, the District of Columbia, and Puerto Rico[Footnote 2] to obtain information about the extent to which states verify school and district data used to calculate high school graduation rates and use unique student identifiers. We selected and contacted 20 states for further analysis. States were selected to capture variation in high school graduation rate definitions, geographic location, and types of interventions with the potential to increase graduation rates. We conducted a case study in 1 state to calculate graduation rates; site visits in 3 states to review data accuracy; site visits in 6 states to observe interventions and interview staff at 16 schools; and phone interviews in all 20 states to obtain information on definitions used, implementation status, and guidance provided. To identify which interventions have the potential to increase graduation rates, we reviewed the research on interventions and interviewed Education officials and dropout prevention experts. We also reviewed available evaluations of the types of interventions we observed to assess their findings and methodological approaches. To determine how Education assists states, we reviewed Education regulations, guidance, and other documents and interviewed Education and state agency officials. We also interviewed Education and state officials to determine the degree to which Education has enhanced and disseminated knowledge about interventions. To determine the extent to which reported dropout rates may be understated, we interviewed experts in this area and reviewed research on the topic. 
Finally, we interviewed officials from the National Governors Association, national education organizations, and other experts in the area of high school graduation rates and reviewed related research to obtain an understanding of the issues surrounding these rates and high school reform efforts to address them. For a more detailed explanation of our methodology, see appendix I. We conducted our work between September 2004 and July 2005 in accordance with generally accepted government auditing standards.

Results in Brief:

A majority of states used or planned to use a graduation rate definition, referred to as the "cohort" definition, which follows a group of students over time from when they entered high school until they left. Education has assisted states; however, it has not provided guidance on ways to account for certain students. The cohort definition, used by 12 states as of spring 2005, compares the number of 12th grade graduates with the number of students enrolled as 9th graders 4 years earlier, while also taking into account the number of students who left the school, such as those who transferred in and out. Thirty-two states used a definition of high school graduation rate based primarily on the number of dropouts over a 4-year period and graduates, referred to as the "departure classification definition." The remaining eight states used a variety of other definitions. Many states using the departure or other definitions are planning to move to the cohort definition by school year 2007-08 or when their data systems can accommodate its use. This definition may help schools provide more precise graduation rates; however, it requires data systems that can track students or groups of students over time. Most states used these definitions to set graduation rate targets (for example, 80 percent a year).
Although states generally set numerical targets, many considered a school as meeting state graduation rate requirements if the school showed progress toward these targets. The progress states allowed generally ranged from any progress up to 1 percent, with two states allowing schools to maintain the graduation rate of the previous year. Education has supported states' efforts to develop definitions that are intended to produce more precise results, developed some guidance, and provided support such as on-site peer reviews, conferences, and information on its Web site. Education also commissioned a task force that published a report identifying the advantages and disadvantages of different definitions. States also encountered challenges in resolving common issues, such as how to account for students with disabilities who graduate with a regular diploma in more than the standard number of years based on their Individualized Education Plans. Education has not provided guidance to all states on how to account for students in such programs; instead, Education's approach has been to provide such information to states on a case-by-case basis. As a result, some states were not aware of the modifications available to count such students in their graduation calculation, and there is less consistency among states, even those using similar definitions, in how their rates are calculated. Difficulty tracking mobile student populations was the primary factor affecting the accuracy of graduation rates; while Education has taken some steps to help states address this challenge, concerns about data accuracy still exist. According to state, school district, and school officials and experts we interviewed, the more that a school's students come and go, the more challenging it is for a school to maintain accurate records on whether students leave school by transferring or dropping out. 
Other factors--such as the degree to which states verify school and district data--also affect the accuracy of graduation rates. For example, fewer than half of the states reported conducting audits that verify these data. Data inaccuracies, such as miscounting the number of dropouts, can significantly raise or lower a school's reported graduation rate. Because most states were in the process of adopting a different graduation rate definition, Education officials told us that they could not examine the reliability of the data used to calculate such rates until after the new definitions had been in place for multiple years. Such time would allow them to determine if the rates produced consistent results. Also, Education enhanced its state monitoring by adding a review component to examine data states used for graduation rates, among other aspects of states' participation in the Title I program. Furthermore, in response to recommendations from GAO and Education's Inspector General, the agency contracted with a firm to develop a guide by the end of 2005 to help states improve data collection processes. In July 2005, Education announced that it planned to calculate and report interim graduation rate estimates for each state to provide a nationwide perspective. However, in our review we found that data problems exist, and it is unclear whether the department's monitoring efforts are sufficient for states to provide accurate data for Education's estimates. Few of the interventions that states and school districts have implemented to increase high school graduation rates have been rigorously evaluated, and Education has done little to evaluate and disseminate existing knowledge about effective interventions. We identified five interventions that had been rigorously evaluated and showed potential for improving graduation rates. In our visits to six states we visited three schools that were using such interventions. 
For example, Check and Connect, an intensive mentoring program, showed increased levels of educational attainment for students with emotional and behavioral disabilities. Another program, Project GRAD, a comprehensive kindergarten-to-12 reform program, demonstrated some promise in improving test scores and graduation rates. In addition to the programs we visited, recently completed rigorous evaluations of two other programs, the Talent Development High School Model and First Things First, suggest that these interventions may also increase graduation rates. Most other programs we visited fell into one of three categories--restructuring schools, providing supplemental services, such as tutoring, and creating alternative learning environments--similar to findings in our 2002 report on high school dropouts. While these had not been rigorously evaluated, research and program officials noted some promising results that may lead to improving student outcomes, including high school graduation. With the NCLBA requirement that interventions be research-based, there is a need in the education community for additional scientifically based research. However, Education's efforts to evaluate and disseminate existing knowledge on interventions have been minimal. We are recommending that the Secretary of Education develop approaches to provide information on how to account for different types of students to all states rather than providing this information on a state-by-state basis and assess the reliability of data submitted by states that Education plans to use to develop interim graduation rates. We are also recommending that the Secretary establish a timetable to carry out the recommendation in our 2002 report regarding evaluating research on dropout interventions, including those that focus on increasing graduation rates, and that the Secretary disseminate research on programs shown to be effective in increasing graduation rates.
In comments on a draft of this report, Education concurred with our recommendations about accounting for different types of students and the need for evaluating and disseminating research on dropout interventions. On our recommendation to assess the reliability of data submitted by states, Education noted that it was taking steps to assess data reliability; however, it is not clear that these steps apply to data that Education plans to use to calculate interim rates.

Background:

Despite the increasing importance of a high school education, only an estimated two thirds of students graduate from high schools nationwide. Students in certain subgroups, such as the economically disadvantaged and certain racial and ethnic groups, have historically graduated from high school at substantially lower rates than their peers. Students who do not graduate from high school are at a serious disadvantage compared to their peers who do. They are much less likely to obtain good jobs or attend college. The NCLBA includes several requirements for states to improve school and student performance, including measuring high school graduation rates.

NCLBA Requirements:

NCLBA expanded the requirements of the Improving America's Schools Act of 1994 (IASA) for states, school districts, and schools to demonstrate that their students are making adequate progress toward their state's academic goals. IASA required testing in each of three grade spans to determine whether a school made adequate yearly progress (AYP). NCLBA requires, by the 2005-06 school year, that annual tests in math and reading be administered to students in grades 3 through 8 and once in high school; by 2007-08, students must also be tested in science. In order to make AYP, schools are to show that increasing numbers of students reach the proficient level on state tests and that every student is proficient by 2014. NCLBA also designated specific groups of students for particular focus.
These four groups are students who (1) are economically disadvantaged, (2) represent major racial and ethnic groups, (3) have disabilities, and (4) are limited in English proficiency.[Footnote 3] For a school to make AYP, its student body as a whole and each of the student groups must, at a minimum, meet the state targets for testing proficiency. Under NCLBA, schools must also use at least one other academic indicator, in addition to annual tests, to measure AYP. High schools must use graduation rate as one of their other academic indicators. The law defines graduation rate as the percentage of students who graduate from secondary school with a regular diploma in the standard number of years. Education officials told us that standard number of years is determined by a state and is generally based on the structure of the school. For example, a high school with grades 9 through 12 would have 4 as its standard number of years while a school with grades 10 through 12 would have 3 as its standard number of years. NCLBA regulations specifically require a high school, in order to make AYP, to meet or exceed its other academic indicators, including what the state has set as the graduation rate for public high schools. NCLBA does not specify a minimum graduation rate that states must set. States have used a variety of methods to measure AYP on their graduation rate indicator. For example, states have set graduation rate targets or goals or have allowed schools to show progress toward a target or goal as a way for schools to meet the graduation rate indicator requirement. The law does not require states to increase their graduation rate over time. The law requires states to demonstrate that their definitions produce graduation rates that are valid and reliable. A valid rate would be one that measures what it intends to measure. 
A reliable rate is one which, with repeated data collections and calculations, produces the same result each time such collections and calculations are performed. A key aspect of the reliability of graduation rates is the quality of the data used to calculate them. The National Center for Education Statistics (NCES), Education's chief statistical agency, has funded a document that describes the following dimensions for ensuring that data are of high quality:

* Accuracy. The information must be correct and complete. Data entry procedures must be reliable to ensure that a report will have the same information regardless of who fills it out.

* Security. The confidentiality of student and staff records must be ensured and data must be safe.

* Utility. The data must provide the right information to answer the question asked.

* Timeliness. Deadlines are discussed, and data are entered in a timely manner.[Footnote 4]

This document suggests that school staff members are responsible for entering data accurately and completely and maintaining data security. It provides ideas for assisting staff to accomplish these tasks, such as sharing best practices with a peer and implementing school-district policies on data security, such as changing passwords frequently. If schools receiving funding under Title I, Part A of the act do not make AYP--including meeting the state's requirements for graduation rates--for 2 consecutive years or more, they are "identified for improvement." They must take certain actions such as offering parents an opportunity to transfer students to a school that had made AYP (school choice). If these schools continue not to make AYP, they must take additional actions, such as providing supplemental services to students--including transportation, tutoring, and training.[Footnote 5] States and school districts are required to provide funding for such actions up to a maximum specified in law.
However, according to Education officials, most high schools do not receive Title I funding, and therefore, if these schools do not make AYP, they are not required to take improvement actions, such as offering school choice or supplemental services. Nevertheless, NCLBA requires each school district receiving Title I funds to prepare a report card that must contain graduation rates for high school students and be available to the public.

Education's Responsibilities:

Education has responsibility for general oversight of Title I of NCLBA. As part of its oversight effort, Education has implemented the Student Achievement and School Accountability Program for monitoring each state's administration of Title I programs. This monitoring effort was designed to provide regular and systematic reviews and evaluations of how states provide assistance in terms of funding, resources, and guidance to school districts to ensure that they administer and implement programs in accordance with the law. Monitoring is conducted on a 3-year cycle and addresses high school graduation rates among other requirements. Teams of federal officials visit state offices, interview state officials, and review documentation on how states comply with federal law and regulations. NCLBA also requires the Secretary of Education to report to the Congress annually regarding state progress in implementing various requirements, including the number of schools identified for improvement. Education has required states to report their graduation rates for the state as a whole and for designated student groups. All states submitted plans to Education as required under NCLBA, which were to include their definitions of graduation rates. By June 2003, Education reviewed and approved all state plans, including their definitions of graduation rates and their statements regarding how such rates were valid and reliable.
Education provided many states with approval to use a definition of their choosing until they are able to develop ones that better meet the law's requirements for defining and measuring graduation rates. Education has also reviewed and approved many amendments to plans submitted by states, including those that make changes to the state's definition of its graduation rate. Additionally, NCES commissioned a task force to review issues about definitions, data, and implementation. In its report, the Task Force discussed the data challenges faced by states in calculating their graduation rates.[Footnote 6] Regarding data used to measure student performance generally, GAO and Education's Inspector General have commented on the importance of data accuracy.[Footnote 7]

Dropout Prevention:

To attempt to improve graduation rates in high schools or keep students from dropping out of school, Education, state governments, school districts, schools, and foundations have funded or implemented various interventions to address the educational needs of students. Such interventions are based on the idea that many factors influence a student's decision to drop out of school, such as low grades, socioeconomic challenges, and disciplinary problems. These factors may be evident as early as elementary school, and therefore some interventions are designed for these students. During the late 1980s and through the mid-1990s, Education supported dropout prevention programs across the country. In an attempt to determine which programs effectively reduced the dropout rate, Education conducted several evaluations of these programs. The largest of these was the evaluation of the second phase (1991 to 1996) of the School Dropout Demonstration Assistance Program. This evaluation looked at more than 20 dropout prevention programs, including school-within-a-school programs, alternative middle and high schools, restructuring initiatives, tutoring programs, and GED programs.
While two of these programs showed promise in reducing dropout rates--alternative high schools and middle schools--the major finding was that most programs did not reduce dropping out.[Footnote 8] In our 2002 report, we identified three intervention approaches to prevent students from dropping out of school:[Footnote 9]

* Restructuring schools. This approach modifies a school or all schools in a district through such initiatives as curriculum reform or dividing schools into smaller, more individualized learning communities.

* Providing supplemental services. This approach provides additional services, such as tutoring or mentoring in language and math; interventions attempt to raise student academic achievement and self-esteem.[Footnote 10]

* Creating alternative learning environments. These interventions target at-risk students and attempt to create personalized learning environments, such as career academies that focus the entire school around a specific career theme.

However, our 2002 report found that additional research was needed to document which interventions were particularly successful for certain groups of students. Education agreed that additional rigorous evidence is needed and said that it would consider commissioning a systematic review of the literature.

Many States Moving toward Using a Definition That Follows Students over Time; Education's Guidance Regarding NCLBA Requirements Is Limited:

A majority of states used or planned to use a graduation rate definition based on the group of students entering high school who graduate on time, referred to as the cohort definition. Education has assisted states, approved their graduation rate definitions, and given some states more time to develop planned definitions intended to produce more precise results. However, states faced challenges in resolving common data issues and in obtaining information on how to modify definitions to better account for certain students, such as those with disabilities.
A Majority of States Used or Planned to Use a Definition That Follows Students over Time:

According to state plans, 12 states used a definition that followed a group of students over time from when they entered high school until they left--referred to as the cohort definition. An additional 18 states using other definitions planned to adopt the cohort definition no later than the 2007-08 school year.[Footnote 11] The cohort definition compares the number of 12th-grade students graduating with a standard diploma with the number of students enrolled as 9th graders 4 years earlier, while also taking into account those who left the cohort, such as those who transferred in and out.[Footnote 12] A study commissioned by NCES found that a cohort definition designed to track individual students over time--from when they enter high school until they leave--could result in a more precise high school graduation rate than one calculated with other definitions.[Footnote 13] The data in figure 1 show a hypothetical high school class from the time students enrolled in 9th grade until they graduated with a standard diploma, including those who dropped out, transferred, received alternative degrees, continued in school, or took 5 years to graduate.

Figure 1: Student Mobility and Graduation Outcome for a Hypothetical High School Class:

[See PDF for image]

[End of figure]

If the school was in a state that used the cohort definition and considered 4 years to be on-time graduation, its graduation rate would be 60 percent. The 60 percent figure comes from using the number of students who started (100), the net number of transfers over the 4 years, and the number who graduated in 4 years (60).[Footnote 14] Figure 2 shows the formula of the cohort definition. The year students in the cohort graduate is denoted by "y," while "T" signifies the net number of students who transfer in and out in any given year. The cohort definitions actually used by states may vary somewhat from the basic definition.
For example, Kansas used dropout and transfer data in its definition. Additionally, some states track individual students, while others track groups of students based on the entering 9th-grade cohort.

Figure 2: Cohort Formula Definition:

[See PDF for image]

[End of figure]

According to state plans, 32 states used a definition of high school graduation rate, referred to as the departure classification definition, based primarily on the number of graduates and the number of dropouts over a 4-year period. Essentially, this definition looks back from a 12th-grade class at those who (1) graduated (regardless of when they started high school), (2) dropped out in 9th, 10th, 11th, and 12th grades (including those who enrolled in GED programs), and (3) did not graduate but received some form of alternative completion certificate.[Footnote 15] Using this definition, the data from the high school shown in figure 1 would result in a graduation rate of 65 percent. The 65 percent figure comes from using the number of students who graduated (65), the number who received an alternative certificate (5), and the number who dropped out (30), as shown in figure 3. Unlike the cohort definition, this definition does not take into consideration the number of students entering high school 4 years earlier. As noted earlier, some of these states (13) planned to adopt the cohort definition by school year 2007-08.

Figure 3: Departure Classification Definition:

[See PDF for image]

[End of figure]

The departure classification definition includes students who drop out. Each of the "D" terms refers to the number of dropouts during one year. For example, "D y-2,g10" stands for the number of students who dropped out in the 10th grade, 2 years before the graduation year "y." Prior to NCLBA, many states had been using a similar version of this formula, which NCES developed in collaboration with several states. However, earlier definitions used by states may have also included as graduates those who received GED certificates.
Under NCLBA, Education required states to modify the formula so that GED recipients were not counted as graduates. Different data systems accommodated the use of different definitions. The departure classification definition allowed many states to continue using existing data systems, according to Education officials. Such systems generally collect aggregate data, rather than data at the student level. The cohort definition generally requires states to implement a state-level student tracking system, often with a mechanism that can uniquely identify each student. Such a system identifies students in the 9th grade and tracks them throughout high school, indicating whether they graduate, transfer, or drop out. This system also allows students who transfer into a school to be placed in the proper cohort. The more specific information required by the cohort definition may result in the calculation of more precise graduation rates than those produced by the departure classification definition. Since the cohort definition follows students entering high school, either as individual students or as groups of students, it can better be used to include only on-time graduates. However, how it is implemented may affect the level of precision of the rate calculated. Tracking individual students may result in a more precise rate than tracking groups of students. In our analysis of one state's school year 2002-03 data, we found that the variations in data collection and calculation between the two types of definitions produced different graduation rates. Our analysis showed that the departure classification definition produced a graduation rate that was 12 percent greater than the rate we calculated using the cohort definition.[Footnote 16] Because the departure classification definition does not track the entering cohort, it does not account for students who were held back, and therefore differences may result.
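The two calculations described above can be sketched in a short Python fragment. This is a hypothetical illustration only: the class counts come from the text's example, and the net number of transfers is assumed to be zero because figure 1 is not reproduced here.

```python
def cohort_rate(on_time_grads, entering_9th, net_transfers):
    # Cohort definition: on-time graduates divided by the entering
    # 9th-grade class, adjusted for students transferring in and out.
    return on_time_grads / (entering_9th + net_transfers)

def departure_rate(grads, alt_certificates, dropouts):
    # Departure classification definition: graduates (whenever they
    # entered) divided by graduates plus alternative completers plus
    # dropouts counted in grades 9-12. Under NCLBA, GED recipients
    # are counted as dropouts, not graduates.
    return grads / (grads + alt_certificates + dropouts)

# Hypothetical class from the text: 100 entering 9th graders,
# 60 on-time graduates, 65 total graduates, 5 alternative
# certificates, 30 dropouts; net transfers assumed to be zero.
print(f"cohort rate:    {cohort_rate(60, 100, 0):.0%}")     # 60%
print(f"departure rate: {departure_rate(65, 5, 30):.0%}")   # 65%
```

The 5-point gap between the two rates for the same class reflects the difference described above: the departure classification counts graduates regardless of when they entered and does not look back at the size of the entering cohort.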
Our findings are consistent with observations made by other researchers that show differences in graduation rates based on the definition used.[Footnote 17] In addition, NCES plans to complete a study this year that examines high school graduation rate definitions and how rates differ depending on the definition used. According to state plans, the remaining eight states that did not use either a departure classification or cohort definition used a variety of other definitions. Five of these states plan to adopt cohort definitions no later than 2007-08.[Footnote 18] Figure 4 shows the definitions each state used as of April 2005 and planned to use by school year 2007-08.

Figure 4: Definitions by State, as of April 2005, and Planned to Use by State, School Year 2007-08:

[See PDF for image]

Panel A: Definitions by State, April 2005:

Panel B: Definitions by State, Planned for School Year 2007-08:

[End of figure]

Most States Allowed Schools to Show Progress toward State Graduation Rate Targets in Order to Meet Graduation Rate Requirements:

Most states set graduation rate targets, and many allowed schools to show progress toward these targets as a way for schools to make AYP. NCLBA requires that states set a graduation rate indicator. Most states have set such rates to help determine which schools make AYP. Additionally, many states allow schools to make AYP even if their graduation rates are not as high as the state's required rate, so long as the school shows progress toward the required rate. States' graduation rate targets ranged from 50 percent in Nevada to 100 percent in South Carolina, with about half at 80 percent or greater, as shown in figure 5.
Figure 5: State Graduation Rate Targets, as of July 2005:

[See PDF for image]

Notes: These state graduation rate targets were drawn from state plans on Education's Web site (http://www.ed.gov/admins/lead/account/stateplans03/index.html) as of July 7, 2005, for all states except Arizona, Colorado, District of Columbia, Louisiana, Maine, Mississippi, Missouri, New Hampshire, New Jersey, New York, Oregon, and Puerto Rico. Education provided information on these states. This figure includes only those states that were using graduation rates at the time of our review. States that used other rates, such as dropout rates, were not included. These states are Arkansas, Indiana, Louisiana, Massachusetts, and New Jersey. Florida is also not included in this chart because its requirement is that schools show a 1 percent annual increase in their graduation rates.

[End of figure]

Valid comparisons of graduation rate targets across states cannot be made, in part, because of differences in the rates used. For example, Alabama and North Carolina both had targets of 90 percent graduation rates. However, Alabama arrived at its target by using a departure classification definition that accounted for dropouts, while North Carolina used a definition that did not account for dropouts. According to state plans, 36 states considered their schools as meeting their graduation rate requirements if the schools increased their graduation rates from the previous year, known as "showing progress." In addition, two states allowed their schools to meet such requirements if they maintained the previous year's rates. A majority of states that allowed progress as a way for schools to demonstrate they met state graduation rate requirements had set no minimum rate of progress. We found instances in which very little progress, less than 1 percent, enabled a school to meet such requirements.
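As a rough sketch of how these rules interact, the following Python fragment combines the ways a school could satisfy a state graduation rate requirement described above: meeting the target, maintaining the prior year's rate (where allowed), or showing a required minimum amount of progress. The function, rates, and thresholds are hypothetical and invented for illustration; actual state rules vary.

```python
def meets_grad_indicator(rate, prev_rate, target,
                         min_progress=0.0, allow_maintain=False):
    # A school meets the graduation rate indicator by reaching the
    # target, or (in states that allow it) by maintaining last
    # year's rate or by improving by at least min_progress
    # percentage points over last year's rate.
    if rate >= target:
        return True
    if allow_maintain and rate >= prev_rate:
        return True
    if min_progress > 0 and rate - prev_rate >= min_progress:
        return True
    return False

# Hypothetical school below a 66% target that still meets the
# indicator by showing 1 percentage point of progress.
print(meets_grad_indicator(58.0, 57.0, 66.0, min_progress=1.0))  # True
```

In states with no minimum rate of progress, any positive gain, however small, would satisfy the third condition.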
Table 1 shows the number of states that allow schools to show progress toward state goals as a means of meeting state graduation rate requirements, as of the time we completed our review.

Table 1: Number of States That Allow Schools to Maintain Previous Year's Rate or Show Progress toward Graduation Rate Targets to Make AYP, as of July 2005:

Number of states[B]:
Maintain previous year's rate: 2;
Any progress allowed: 28;
Progress must be of a specific amount--0.1 percent: 3;
Progress must be of a specific amount--1 percent: 4;
Progress must be of a specific amount--Other[A]: 1;
Total: 38.

Source: State plans on Education's Web site as of July 7, 2005, with exceptions (see note).

Note: This information was drawn from state plans found on Education's Web site (http://www.ed.gov/admins/lead/account/stateplans03/index.html) as of July 7, 2005, for all states except Arizona, Colorado, District of Columbia, Indiana, Louisiana, Maine, Mississippi, Missouri, New Hampshire, New Jersey, New York, Oregon, and Puerto Rico. Education provided information for these states.

[A] Reduce the difference between the actual and target rate by 10 percent over a 2-year period.

[B] This table does not include 14 states--the 5 states that did not use a graduation rate (but instead used a different rate, such as a dropout rate) and the 9 states that did not allow schools to show progress toward the state graduation rate target to make AYP, but instead required the schools to meet the target.

[End of table]

By showing progress toward state graduation rate targets, schools can still make AYP even though they do not meet target rates.[Footnote 19] For example, our analysis of one state's data from the 2002-03 school year showed that 46 out of 444 high schools made AYP by increasing their graduation rates toward the state graduation rate target of 66 percent rather than by meeting or exceeding this target.
Specifically, these schools met or exceeded the state's requirement of 1 percentage point of progress in increasing the graduation rate, even though the schools were below the 66 percent target. Another 232 schools made AYP for the year by meeting or exceeding the target of 66 percent. In addition, allowing schools to use progress as the NCLBA graduation rate indicator could result in schools making AYP annually while not meeting state graduation rate targets for decades, if at all. For example, a hypothetical school with a graduation rate of 56 percent can meet the state high school graduation indicator by increasing its graduation rate by 0.1 percentage point each year. At this rate, the school would not reach the state graduation rate target of 66 percent for 100 years.

Education's Guidance Did Not Specify Modifications Available to Account for Certain Students:

Education provided states with assistance with their graduation rate definitions; however, Education's guidance did not specify modifications available to account for certain types of students. To help states with their definitions, Education developed some guidance and provided support such as on-site peer reviews, conferences, and information posted on its Web site. Education also commissioned a task force that published a report identifying the advantages and disadvantages of different definitions. In addition, Education officials told us they granted states time to develop definitions that better met the law's requirements for defining and measuring graduation rates. Education has provided information on how to account for students in special programs and students with disabilities to states that have requested it. Education's approach has been to provide such information on a case-by-case basis rather than to all states. Education officials stated that they preferred to work with each state's specific circumstances.
However, we found that issues raised, such as students enrolled in 5-year programs, were common to many states.[Footnote 20] States varied in how they included students enrolled in these programs in their graduation rate definitions. For example, one state counted students in 5-year programs who graduated as dropouts until it received approval to count them as graduates. Another state planned to count such students as graduates without requesting approval to do so. Officials in that state said that since it was unclear what the actual requirements for counting graduates were, they were doing what they believed was allowable under the law. Without guidance on how to account for students in special programs and students with disabilities, there is less consistency among states in how students in these programs are included in graduation rates. Education also has not provided information to all states on how their definitions can be modified to better accommodate students with disabilities. State plans in 16 of the 52 states indicated that Education approved allowing students with disabilities in these states more than the standard number of years to graduate, based on the number of years in their Individualized Education Plans.[Footnote 21] The 20 states we contacted varied in whether they sought approval from Education on how to include students with disabilities in their graduation rate definitions. For example, six of the states we contacted had sought approval from Education to include in their graduation rate definitions students with disabilities who need more than the standard number of years to graduate. In contrast, officials in seven other states contacted told us they did not seek approval for the same issue. Officials in the remaining seven states provided no information on this topic or said it did not apply to them.
Several Factors Affected the Accuracy of Graduation Rates, and Data Quality Remains a Key Challenge:

State, school district, and school officials and experts we interviewed reported several factors that affect the accuracy of data used to calculate graduation rates, especially student mobility. While Education has taken steps to assist states and districts in improving the quality of their data, the Department has not reviewed the accuracy of all states' data because, at the time of our review, many states were in the process of implementing new definitions, data collection strategies, or both.

Several Factors, Especially Student Mobility, Compromise the Accuracy of Data Used to Calculate High School Graduation Rates:

Officials in six schools, three school districts, and three states we visited, and several experts we interviewed, cited challenges in tracking student mobility, the key factor in calculating accurate high school graduation rates. Some inaccuracies may lead to the reporting of lower graduation rates, such as recording all students with "unknown" status as dropouts or counting students who drop out, return to school, and then drop out again as a dropout each time, as may happen in schools in states that use the departure classification definition. Other inaccuracies may lead to the reporting of higher graduation rates, such as schools' recording students who drop out as transfers.
This may occur when school staff record such students as transfers before they receive documentation that the student actually enrolled in a different school.[Footnote 22] Since the number of dropouts counts against a school in calculating its graduation rate in many states, schools that record such students as transfers--because they were unaware that the students had actually dropped out--may be reporting inflated graduation rates.[Footnote 23] A second factor that affects data accuracy is how staff members understand and follow policies and procedures for recording students as transfers to other schools. For example, staff members in schools in two states reported that they electronically record a student as having transferred to another school on the day that student withdraws from their schools. However, the policy in these states is that a student is to be recorded as having transferred only upon receipt of a request for records from the school to which the student transfers. In one of these schools, staff assigned to record student data reported contradictory practices and beliefs about state policy regarding when to record a student as a transfer. One staff member stated that the policy, and her practice, was to record the student as a transfer upon receiving the records request, while another staff member said that no such policy existed and that she recorded the student as a transfer on the day of withdrawal. Therefore, how a student transferring out of the school was counted depended on which staff member recorded the student's data. The accuracy of data may be further compromised when schools have large numbers of students who transfer in a given year because the more students come and go, the more difficult it is for schools to accurately account for them. Some schools are in areas where families tend to move more frequently.
For example, officials in one school we visited near an Army base reported that their school had an enrollment of about 1,200 students and that 187 students had left the school by December of the academic year. The status of 19 of those 187 students was recorded as "unknown" because of difficulty in maintaining contact with their families. The policy in that state was for students whose status is "unknown" (because they could not be contacted) to be counted as dropouts, even if, in fact, a student had transferred to another school. Staff in another school reported the presence of several children from another country. Their experience has been that these particular students report plans to return to their country of origin, but the school often does not know the status of these students once they leave. The school's procedure is to record such students as having an "unknown" status, and these students are eventually counted as dropouts unless another school requests their records. Research has shown higher mobility rates among certain subgroups of students compared with all other students, including those who are African-American, Hispanic, or Native American and those classified as having limited English proficiency or as children from migrant families.[Footnote 24] Consequently, schools with higher concentrations of these subgroups would likely report less accurate graduation rates. Another factor affecting the accuracy of graduation rate data is the absence of state audits or verification checks. For example, in our survey of state officials, over half (27) reported that their states did not audit the data received from local officials that the state used to calculate high school graduation rates.
The lack of such auditing or verification implies that states were likely to be unaware of the extent of certain errors in data--such as students' indicating they were transferring to another school but not actually doing so--and consequently were unable to ensure that data they received from schools and districts were accurate. Officials in only one of the six schools we visited reported that their data on student transfers had been audited or verified by an outside party, leaving the accuracy of transfer data in the other schools uncertain. A fourth factor that contributes to challenges in assuring accurate data is the lack of a unique identifier for each student. In our survey, officials in 22 states reported that their state did not have a unique identifier for each of their students. Concerns about using student identifiers include the cost of implementing data systems that support such identifiers and privacy issues. The lack of a unique identifier for students made it difficult to obtain accurate data. Officials in one state that did not use unique identifiers stated that they had to compute graduation rates based on aggregating student data and as a result, they could not track on-time graduates. Officials in another state estimated that they were only 90 percent accurate in identifying students, because, without a unique identifier for each student, they had to use other information. Using this information, such as the student's name or birth date, can lead to identifying more than one student with the same characteristics, resulting in inaccurate data used in calculating graduation rates. A fifth factor we found that may affect data accuracy is variation in security and accountability practices. For example, we found that while some schools restricted the ability to change student enrollment information (such as transfers) to one or two people in the building (e.g., a registrar), others allowed many staff members to do so. 
Further, while some schools' data systems kept a record of each person who accessed a student's record and the changes made, other systems did not maintain such information. Without sufficient security and record monitoring, there is a greater risk of inaccurate data being entered and used to calculate graduation rates.

Data Inaccuracies May Affect Schools' Meeting State Graduation Rate Goals:

We analyzed data from one state to estimate the effect of errors of various sizes in reporting dropouts on school graduation rates and found that such errors could raise or lower a school's graduation rate substantially. This state used a high school graduation definition that incorporated the number of graduates and dropouts in calculating its graduation rate. For example, its median high school in school year 2002-03, with 924 students, reported 41 dropouts and had a graduation rate of 75 percent.[Footnote 25] We re-estimated its graduation rate after assuming that the school had more dropouts, up to twice as many as reported.[Footnote 26] In this case, if the school had 82 dropouts, its graduation rate fell to 64 percent. We also re-estimated its graduation rate after assuming that it had fewer dropouts, as few as half as many as reported. Thus, if it had 21 dropouts, its graduation rate rose to 88 percent. Figure 6 shows how the estimates of graduation rates were affected by assumed errors in counting dropouts for this school.

Figure 6: Estimated School Graduation Rates under Varying Assumptions of Errors in Counting Dropouts:

[See PDF for image]

[End of figure]

We performed this analysis for all high schools in the state. As expected, when we assumed the number of dropouts was higher than what schools reported, their estimated graduation rates decreased. Our analysis also found that the extent to which schools miscount their dropouts affects their likelihood of reaching the state's graduation rate target.
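The direction of this sensitivity can be sketched with a departure-style calculation in Python. This is a hypothetical illustration only: the state's actual formula and the median school's full counts are not given in the text, so the graduate count below is chosen so that the baseline matches the 75 percent rate, and the recomputed rates will not exactly match the report's 64 and 88 percent figures.

```python
def departure_rate(grads, completers, dropouts):
    # Graduates divided by graduates plus alternative completers
    # plus dropouts (departure classification style).
    return grads / (grads + completers + dropouts)

# 123 graduates and 41 reported dropouts give a 75% baseline;
# alternative completers are assumed to be zero for simplicity.
grads = 123
for label, dropouts in [("half reported", 21),
                        ("as reported", 41),
                        ("double reported", 82)]:
    print(f"{label}: {departure_rate(grads, 0, dropouts):.0%}")
```

As in our analysis, doubling the dropout count pushes the estimated rate down sharply and halving it pushes the rate up, which is why verifying dropout counts matters for AYP determinations.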
We estimated that an additional 70 of 444 high schools in the state in school year 2002-03 would not have reached the state target if they had in fact been reporting only half of their dropouts. On the other hand, an additional 77 high schools would have reached the state target if their dropout counts had in fact been overreported at twice the actual level. According to NCLBA, high schools that do not meet the state's requirements for its graduation rate are designated as not making AYP. Such designations, if made for 2 or more consecutive years, would result in the district's providing technical assistance to schools that receive Title I funding. Thus, schools that undercount their dropouts may be precluded from receiving the attention and assistance from the state that they need to improve students' school retention and graduation, while those with overcounts may receive such services unnecessarily.

Education Has Taken Some Steps to Help States with Data Issues, but Data Accuracy Remains a Key Challenge:

Education has taken steps to help states address data collection issues. First, Education helped states prepare information to address how their graduation rate definitions were valid and reliable. Education gave instructions in its regulations and in a template provided to each state to help states prepare the accountability plans they were to submit to Education for approval in 2003. Education also worked with states on an as-needed basis when state officials had questions about what information the Department needed to review. Education officials indicated that they reviewed information in each state's plan when they conducted site visits to states as part of the state plan approval process.
According to Education, most states were in some stage of transition in calculating their graduation rates: some were implementing plans to transition from their current definition to a cohort indicator, others were improving their data systems, and some were collecting information on designated student groups for the first time. For these states, Education reported that it was unable to meaningfully examine the reliability of data used to calculate the graduation rate because the definitions of such rates had not been in place for the number of years necessary to determine whether they would produce consistent results. Second, Education, as part of its state monitoring, introduced a data review component to examine the data states used for graduation rates, among other aspects of their participation in the Title I program. As of August 2005, Education had monitored and reported on 29 states and expected to monitor the remaining states by the end of fiscal year 2006 as part of its 3-year monitoring plan. This monitoring consisted of broad questions intended to collect information about how states corrected or addressed errors in student data received from districts and schools, including data used to calculate graduation rates. The monitoring was also designed to identify the written procedures states used to confirm the accuracy of their data, the extent to which these procedures were communicated to districts, and how data validity issues related to schools and districts have been addressed. According to Education officials, their reviews of the nine states identified no significant problems with the data systems these states used to calculate high school graduation rates. Third, in response to recommendations from GAO and Education's Inspector General, Education contracted with a firm to develop a guide to help states improve data collection processes. According to Education officials, this guide is to consist of three parts.
One part is designed for state officials and is to focus on the design and implementation of data systems. A second part, which focuses on data management issues, such as methods for verifying the accuracy of data, is designed for district and school officials. A third part summarizes the first two parts and is to be suitable for oral presentation to state, district, and school officials. According to department officials, this guide will be issued by the end of 2005.[Footnote 27] Although Education monitors states to determine whether they have written procedures for ensuring data quality and methods to address data quality issues, it does not evaluate other means of ensuring data accuracy. For example, it does not assess whether states ensure that districts and schools have effective controls to accurately record student status, including transfers. Further, Education's monitoring approach does not capture whether states ensure that schools have computer controls that allow only authorized staff to make changes to student data. Department officials said that the guide under development is planned to address these issues. However, departmental efforts have not resolved immediate data accuracy problems. In July 2005, Education announced that it planned to calculate and report interim graduation rate estimates for each state to provide a nationwide, comprehensive perspective. Education stated that the interim rate it developed, based on data NCES collects from states, will provide more accurate on-time graduation rates. Some states' graduation rates rely on the same data reported to NCES, while other states rely on different data; however, these states also provide the data that NCES requests. The quality of the data states provide to NCES varies across states depending, in part, on the extensiveness and rigor of their internal controls and other data verification checks.
Because Education plans to rely on state-reported data to calculate interim graduation rates, the accuracy of such data is critical.[Footnote 28]

Few Interventions Have Been Rigorously Evaluated, and Education Has Done Little to Evaluate and Disseminate Existing Research:

While states and school districts have implemented numerous interventions designed to increase high school graduation rates, few of these programs have been rigorously evaluated, and Education has done little to evaluate and disseminate existing research. Several of the interventions that have been rigorously evaluated have shown potential to increase graduation rates. In addition to these interventions, schools are trying other approaches to enhance students' chances of success, though the effectiveness of these approaches has not been demonstrated. About one third of students who enter high school do not graduate; compared with high school graduates, they are likely to earn less money, are more frequently unemployed, and are more likely to receive public assistance. In response, some schools and districts have implemented programs to address the factors that influence a student's decision not to complete high school. Research has shown that a student's decision to leave school may be affected by experiences that begin as early as elementary school. For example, studies have shown that students who are not at least moderately skilled at reading by the end of 3rd grade are less likely to graduate from high school.[Footnote 29] Besides basic literacy skills, a variety of other academic and family-related factors contribute to whether a student graduates. For example, poor grades and attendance, school disciplinary problems, and failure to advance to the next grade can all gradually lead to disengagement from school and result in a student not finishing high school.
In addition to these academic factors, students from low-income backgrounds, students with low self-esteem, and students with a learning or behavioral disability drop out at a much higher rate than other students. Schools and districts have implemented a range of interventions to address these factors, and the interventions vary in scope from redesigning the structure of an entire school to a single school's mentoring program. While there is variability among interventions, most generally fall into one of the three following categories that we identified in our 2002 report[Footnote 30]: (1) schoolwide restructuring efforts; (2) alternative forms of education for students who do not do well in a regular classroom; and (3) supplemental services, such as mentoring or tutoring, for at-risk students. While most of the schools we visited used interventions from only one of the three categories identified above, some schools combined aspects of these categories. (See table 2 for a complete list.)

Table 2: Number of Interventions Visited by School Level and Type:

School level: Elementary schools; Schools visited: 1; Supplemental services: 1.
School level: Elementary/middle school; Schools visited: 1; Supplemental services: 1.
School level: Middle schools; Schools visited: 2; School restructuring efforts: 1; Supplemental services: 1.
School level: Middle/high school; Schools visited: 1; Supplemental services: 1.
School level: High schools[A]; Schools visited: 9; School restructuring efforts: 4; Alternative learning environment: 7; Supplemental services: 1.
School level: Elementary/middle/high schools[A]; Schools visited: 2; School restructuring efforts: 1; Alternative learning environment: 1; Supplemental services: 2.

Source: GAO analysis of interventions visited.

[A] One of these schools/programs used more than one approach. 
[End of table] Few Interventions Have Been Rigorously Evaluated, Though Some Showed Potential to Increase Graduation Rates: Several of the programs at schools we visited have been evaluated for their effects on high school completion, while others report positive results on other outcomes, such as attendance or academic performance. We identified and reviewed five intervention evaluations that used a rigorous research design and showed potential to increase graduation rates. We visited schools that had implemented three of these programs.[Footnote 31] In addition, we visited other schools that were trying interventions that experts and Education officials described as promising for improving high school graduation rates. While the effectiveness of these approaches in increasing graduation rates had not been demonstrated, research points toward the possibility that they may help increase high school completion. The evaluations of the three programs we visited that displayed positive results all used a rigorous research design. However, these evaluations are not as strong as they need to be for results to be conclusive; for example, we found design limitations and data collection concerns during our review. It is worth keeping in mind that research of this nature is limited in the education field due to a variety of factors, and these studies represent some of the most promising research on graduation rate interventions available. Promising Approaches: Check and Connect, Project GRAD, Help One Student to Succeed (HOSTS), Talent Development, and First Things First: In our visits to 16 school programs in 6 states, we observed 3 interventions for which research has indicated potential for improving high school graduation rates. These interventions addressed a variety of student risk factors and provided services to students in elementary through high school. 
One school we visited in Minneapolis, Minnesota, had implemented the Check and Connect program, which provides mentoring services in an alternative-learning environment. The program began in 1990 with a model developed for urban middle school students with learning and behavioral challenges. It has since been expanded to serve additional at-risk populations as well. This intervention is designed around a mentor who acts as both an advocate and service coordinator for students who have been referred into the program due to excessive absences combined with poor academic performance and behavioral problems. Program officials noted that the mentors offer around-the-clock services, including monitoring school performance, regularly checking student data (attendance, grades, and suspensions), and identifying and addressing out-of-school issues. The mentor also regularly communicates with the student's parents or relatives to ensure that the whole family is engaged in the student's education. The mentoring is built into a program model that relies on several interrelated features, including relationship building, individualized and timely intervention, and long-term commitment. A complete listing of program features can be seen in table 3.

Table 3: Key Features of the Check and Connect Model:

Feature: Relationship building; Definition: Mutual trust and open communication, nurtured through a long-term commitment that is focused on the student's educational success.
Feature: Routine monitoring of alterable indicators; Definition: Systematically checking warning signs of withdrawal (attendance, academic performance, behavior) that are readily available to school personnel and that can be altered through intervention.
Feature: Individualized and timely intervention; Definition: Support that is tailored to individual student needs, based on level of engagement with school, associated influences of home and school, and the leveraging of local resources. 
Feature: Long-term commitment; Definition: Committing to students and families for at least 2 years, including the ability to follow highly mobile youth from school to school and program to program.
Feature: Persistence plus; Definition: Refers to a persistent source of academic motivation, a continuity of familiarity with the youth and family, and a consistency in the message that "education is important for your future."
Feature: Problem solving; Definition: Designed to promote the acquisition of skills to resolve conflict constructively and to look for solutions rather than a source of blame.
Feature: Affiliation with school and learning; Definition: Facilitating the student's access to and active participation in school-related activities and events.

Source: Check and Connect Web site, http://ici.umn.edu/checkandconnect/.

[End of table] The school we visited in Minneapolis had 220 students in the program during the 2004-05 school year. Program officials noted that students in the program were divided among four mentors and had two separate classrooms they could use to meet with their mentor or to study between classes. The program had no set schedule for the student--it was the responsibility of the mentor to follow up regularly with the students, parents, teachers, courts, or counselors. A student in the program noted that Check and Connect helps because it "provides someone who cares how you do and keeps after you about coming to school and doing well academically." A school official remarked that both attendance and retention rates had improved significantly since the program was implemented. 
An evaluation of program impacts on students with emotional and behavioral disabilities showed that students participating in Check and Connect were more likely than nonparticipating students to have either completed high school, including GED certification, or be enrolled in an educational program.[Footnote 32] While graduation rates are not yet available for the first Check and Connect cohort at the school we visited, a teacher at the school commented that the staff knows the program is working "because the students are coming to class every day." School officials noted that the program is funded through a renewable grant from a private foundation. Another program we visited, Project GRAD (Graduation Really Achieves Dreams), seeks to ensure a quality public education for students in economically disadvantaged communities through school restructuring, curriculum reform, and social services. The goal of the program is to increase high school graduation rates in Project GRAD schools to at least 80 percent, with 50 percent of those students entering and completing college. Originally established in 1989 as a scholarship program, it has since developed into a replicable and comprehensive K-12 school reform model. The reform design relies on two components--a structural model and an instructional model. Structural components include an independent local organization to provide implementation oversight, and community involvement such as mentoring, tutoring, and financial support. Figure 7 shows Project GRAD's structural components. Figure 7: Project GRAD Structural Model: [See PDF for image] [End of figure] Local Project GRAD sites--such as one located in Atlanta--also used the instructional component of the model, which emphasizes specific reading and math programs for students in kindergarten through 8th grade. 
Program officials commented that this component also incorporates campus-based social services (which focus on dropout prevention as well as family case management), classroom management techniques, and college scholarships for all high school students who qualify. In 2004, the local Atlanta site served 29 schools and approximately 17,000 students in the inner city. Officials at one of Atlanta's schools noted that the program provided additional outreach staff to advocate on behalf of students and address other issues that may interfere with a student's ability to attend school and learn. Students at the school, commenting on the program's effect on their lives, noted that the program should be expanded to all of the schools in the district because of the opportunities it offers students. Project GRAD-Atlanta officials noted that the effectiveness of the program has been demonstrated through higher test scores and increased college attendance since Project GRAD was implemented in these schools. Additionally, the results of an independent evaluation of Project GRAD also suggest an increase in students' test scores and graduation rates.[Footnote 33] However, aspects of the study's design may limit the strength of its findings. The Project GRAD-Atlanta model relies on a mix of public funding and private local fundraising. As of school year 2003-04, Project GRAD had also been replicated in feeder systems in Akron, Ohio; Brownsville, Tex.; Cincinnati, Ohio; Columbus, Ohio; Houston, Tex.; Kenai Peninsula, Alaska; Knoxville, Tenn.; Lorain, Ohio; Los Angeles, Calif.; Newark, N.J.; and Roosevelt, N.Y. We also visited a school that had implemented the language arts component of the HOSTS program, an intervention focused on literacy, an area that research has linked to students' graduating. This program is a structured tutoring program in reading and language arts that targets low-performing elementary students whose reading skills are below grade level. 
School officials at the elementary school we visited noted that they had been using the program for 7 years to increase at-risk students' reading scores as well as raise their self-esteem. The 90 students in the program worked individually with a tutor 4 days a week for 30 minutes each day. School officials considered the program a success because of the number of students who successfully transitioned into grade-level reading in the regular classroom. The program, which has been replicated in schools or districts in 12 states, was cited in the report language of the NCLBA as a scientifically based intervention that has assisted schools in improving student achievement. A recent study of the program in nine Michigan elementary schools suggests reading improvement for students at schools participating in HOSTS programs.[Footnote 34] While this study displayed some promising results for elementary literacy, students were not tracked over time to determine the program's effect on high school graduation rates. Two recently completed rigorous program evaluations also displayed promising results for increasing graduation rates. These two programs, the Talent Development Model and First Things First, are both comprehensive school reform initiatives with numerous components. The Talent Development program in Philadelphia, Pennsylvania, is designed to improve large urban high schools that face serious problems with attendance, discipline, achievement scores, and graduation rates. The program has been implemented in 20 districts nationwide and consists of several components, including a separate career academy for all 9th graders, career academies for students in 10th through 12th grades, block scheduling (4 courses a semester, each 80-90 minutes long), and an after-hours program for students with attendance or behavioral problems. 
An evaluation of the first five schools in Philadelphia to implement the Talent Development program suggests that it may have contributed to increasing the graduation rate for two high schools compared with other high schools in the district that did not implement the program.[Footnote 35] The First Things First program was first launched in Kansas City, Kansas, and has since been tested in 12 middle schools and high schools in four additional districts. The program has three central components: small learning communities of up to 350 students, a family advocate system that pairs students with a staff member who monitors their progress, and instructional improvement that aims to make lessons more rigorous and better aligned with state and local standards. A recent evaluation in Kansas City schools suggests that students in the four high schools with First Things First had increased reading and math scores, improved attendance, lowered dropout rates, and increased graduation rates compared with schools that did not participate in the program.[Footnote 36] For middle schools in Kansas City, the study found increased reading and math scores and somewhat improved attendance compared with other schools. However, the research did not show significant differences in the First Things First schools when compared with other schools in two other school districts. Approaches Selected Schools Are Trying to Enhance Students' Chances for Success: In addition to the 3 school programs we visited whose rigorous evaluations displayed potential for increasing graduation rates, we also visited 13 other school programs that experts, Education officials, and evaluations noted were promising. While the effectiveness of these approaches has not been demonstrated, research points toward the possibility that these interventions may help increase high school completion. 
These other school programs focused on one specific approach that generally fell into one of three categories--school restructuring, alternative learning environment, and supplemental services. Selected programs that illustrate these approaches are discussed below. School-Restructuring Efforts: Making Schools Smaller: Schools and districts used schoolwide restructuring to change a school, or all schools in the district, to provide a more personalized education and increase graduation rates. Schoolwide restructuring efforts are generally implemented in schools or districts that have a history of high dropout rates. One restructuring approach is to create many small schools from larger low-performing schools. For example, the New Century High Schools Consortium for New York City is a New York City public schools small-schools initiative funded by the Bill and Melinda Gates Foundation, the Carnegie Corporation of New York, and the Open Society Institute. School officials commented that the project began in the Bronx with the conversion of six low-performing high schools that served between 1,500 and 3,000 students each. This intervention began in 2001 and, as of September 2004, New York City had created 77 small schools. One of those schools, Morris High School, has been part of this program since the small schools program began in 2001. School officials noted that the school has been divided into several small schools, including the Bronx International High School and the Bronx Leadership Academy, which serve 300 and 252 students, respectively. While housed in the same building, each school has a different curriculum and student population. For example, the Bronx International High School provides an intensive English language program for recent immigrants, while the Bronx Leadership Academy offers a science-based curriculum for college-bound students. 
The core concepts for both these programs are the small school size, a team approach to teaching, and school-based learning that also has relevance within the community. A student at the school noted that the small groups they work in allow students to help and support each other, something that did not happen in junior high school. School officials commented that teacher investment in the school is expected and is often displayed by working overtime, serving as counselors to students, and participating in school governance. Additionally, the project-based curriculum is developed by teacher teams who work collaboratively to plan activities for incoming students. School officials did not indicate a plan for a formal outcome-based evaluation of the schools; however, they did consider the intervention a success based on positive improvement in a number of areas, including higher percentages of students meeting state standards, higher attendance rates, and higher passing grades. The New York City Department of Education reported similar results for small schools throughout the district, including more students advancing from 9th to 10th grade and higher attendance rates. While these results provide a snapshot of some possible benefits of New York's school reform initiative, it is still too early to assess longer-term student outcomes such as graduation. The Gates Foundation has commissioned an 8-year evaluation of the small schools program. Alternative-Learning Environment: Providing Individualized Education: States and school districts are also using alternative learning environments for students at risk of school failure. These interventions are designed to foster a supportive school environment through small enrollments, one-on-one interaction, flexible schedules and structures, and a curriculum that appeals to students' interests.[Footnote 37] Often, enrollment is limited and the programs are tailored to individual students' needs to ensure that they graduate. 
One type of alternative learning environment, the career academy, is focused on keeping students in school by providing an interesting curriculum focused on a specific career theme. For example, Aviation High School in Washington State is an aviation-themed public high school housed at a local community college. School officials noted that the school addresses a range of student risk factors, including those related to academics (learning and literacy), social issues (attendance and behavior), and family (counseling and strategies for living with drug-addicted family members). With a 2004 enrollment of only 103 students, Aviation High School offers small class sizes, an aviation-themed curriculum, and mentoring opportunities. (See figure 8 for an example of a school event focused on aviation.) Figure 8: Aviation High School Presentation by the Blue Angels: [See PDF for image] [End of figure] Additionally, school officials report that each teacher at the high school serves as a student advisor who assists students with academic, social, and emotional development. Students noted that while transportation to the school was challenging due to its distance from their homes, they still selected the program because of the aviation curriculum, the personalized attention they received, and the highly motivated students at the school. Aviation High School officials indicated that it is too soon to tell the impact of the program, but they noted that the school will be included in a national evaluation to be conducted by the Gates Foundation. Research on career academies has demonstrated positive gains in employment and earnings for graduates but has also found that high school completion rates of career academy and non-academy students were not significantly different.[Footnote 38] Alternative learning environments may also allow students to tailor their learning experience to individual needs that are not being met in traditional schools. 
For example, we visited an alternative high school in Atlanta, Georgia, that uses a computer-based instructional program designed for students to learn the state-certified curriculum at their own pace. Students rotate through classrooms, each of which contains a different computer module for the particular subject being taught. Students receive assistance from teachers as needed. According to officials, the school is made up of a team of 6 teachers and 75 at-risk 11th and 12th grade students (for the 2004-05 school year). The school's enrollment is composed of students who were referred to the school by other schools, the courts, or parents. School officials noted that the program also includes a motivational component. For example, each school morning begins with an assembly where students discuss the obstacles they have had to overcome and the people who have helped make a difference in the world. After the assembly, students get up and shake hands with each other and then move to their first-hour class. School personnel stated that this allows students to begin each day with confidence and prepares them to learn. School officials noted that the school's graduation rate, which they stated was consistently over 90 percent, indicated that the program was effective. Research on alternative programs in general has shown some promising outcomes. 
For example, an evaluation of 8 middle school dropout prevention programs showed some positive impacts on dropout rates, grade promotion, grades, and test scores for students in alternative programs.[Footnote 39] The same study also looked at five alternative high school programs and found limited evidence that these programs reduced dropout rates, but it did note that alternative programs oriented toward GED certificates were more effective than those oriented toward high school diplomas.[Footnote 40] Supplemental Services: Targeting Literacy and Self-Esteem: Several schools we visited used targeted supplemental services to provide at-risk students with extra help. These services aim to improve students' academic performance, acclimate them to a new culture, or increase their self-esteem. Supplemental service programs are offered at all grade levels, with research showing the importance of building academic and social skills at an early age. Supplemental services can focus on the needs of a specific group of students, such as immigrant students or students with limited English proficiency. One such intervention we visited in Georgia was designed to provide educational and cultural services to immigrant students with low-level English skills and limited formal schooling. These interventions, often referred to as "newcomer" models, provide intensive language development courses and may also offer a cultural orientation component. Newcomer programs can take place within a school or at a separate site and vary in the amount of time a student is enrolled. The benefits of the newcomer program are supported by research on English language learners, which notes that one major factor that decreases the risk of dropping out of school is understanding and mastery of the English language.[Footnote 41] At the program we visited, international students who were new to the district were registered, tested, and placed depending on their skill level. 
Students with no English language skills were placed in an intensive 3- to 6-week English program that helped ease the transition into school. Students who were 14 years or older and had fewer than 7 years of formal schooling in their native country were placed in the English for Speakers of Other Languages (ESOL) lab program. School officials noted that the lab served 132 students in school year 2004-05 and is designed to help students achieve grade-level proficiency within 3 years. The ESOL lab focused on listening, speaking, reading, and writing English in addition to other core high school courses such as math, science, and social studies. Additionally, several district schools have added Saturday school tutorials for parents and students. Students can study language arts while their parents attend citizenship classes, orientation, and career awareness sessions. School officials noted that they believe the number of ESOL students graduating has increased, based on state-reported rates as well as the numbers of students who pass the ESOL tests and exit the program. Other supplemental services incorporate cultural elements as a means of addressing student self-esteem. For example, a K-8 school located on the Arapahoe Indian reservation in Wyoming offers all students services that include after-school academic programs, drug awareness events, and a 2-week summer cultural camp focusing on Native American traditions. School personnel emphasized that the path to high school graduation begins with helping students address their self-esteem issues. School officials mentioned that students already have a mindset that they are not going to graduate from high school and do not have a future on or off the reservation. The cultural element of the school's programs is a significant component of building up students' self-esteem and instilling pride in their Native American identity. 
Students commented that they participated in the program because of the Native American cultural activities offered, including clogging, dancing, and drumming. Program officials noted that since implementing interventions designed specifically to address the issues of Native Americans, they have noticed general improvement in student attitudes and performance. While studies suggest that self-esteem affects dropout rates,[Footnote 42] a longitudinal study of the intervention programs used by the Arapahoe school would be needed to determine their effectiveness. Education Has Done Little to Evaluate and Disseminate Knowledge about Interventions: Graduation rates have become increasingly important since the passage of NCLBA, but Education has done little to evaluate and disseminate knowledge about interventions that could help increase such rates. The increased interest in high school reform by the National Governors Association, combined with concerns about low graduation rates, has set the stage for designing strategies that encourage more students to graduate. While many types of interventions are available to school districts, most have not been rigorously evaluated, and there is little information on which are successful and for what student subgroups. Most officials from the 20 states we included in our study told us that such information would be useful. For example, one school official noted that little information exists on what interventions increase graduation rates among Native American students and that such information would be helpful in designing interventions. Education has made some efforts to address the problem of high school completion by sponsoring research and disseminating information through conferences and on its Web site. For example, Education officials noted that Education's Office of Special Education Programs has supported research papers on dropout interventions for youth with disabilities. 
These studies are currently being completed and will be available in late 2005. In terms of dissemination, Education's 2nd Annual High School Leadership Summit, held in December 2004, included sessions on dropout prevention and recovery as well as strategies for creating higher-performing schools. Additionally, Education's Office of Vocational and Adult Education has dedicated a part of its Web site to the High School Initiative. The pages on the Web site contain information on high school reform models and adolescent literacy initiatives, as well as information on research-based practices that may help high schools. While Education has made some efforts to help states and districts address the dropout problem, the agency has not acted on its commitment to implement the recommendation, contained in our 2002 report on interventions, that Education evaluate results from research. Agency officials have commented several times that they plan to evaluate the research on dropout prevention efforts and then disseminate the results through the agency's What Works Clearinghouse. However, the Web space for this effort still contains placeholder information.[Footnote 43] Agency officials indicated that reviews of other topics, such as elementary reading and math, have come before the reviews necessary for the dropout section of the Web site. Conclusions: The nation's public school systems are responsible for educating 48 million students, the majority of our future workforce. Providing them with the skills needed to succeed is vital to the nation's economic strength and ability to compete in a global economy. NCLBA was passed to ensure that all students have access to a high-quality education and to increase the likelihood that these students will graduate. In particular, the act seeks to make significant changes in public education by asking federal, state, and local education officials to reconsider how they assess the academic achievement of the nation's students. 
NCLBA specifies that states must set high school graduation rate indicators as an additional benchmark, along with test results, for measuring schools' progress. However, increasing and accurately calculating graduation rates have been formidable challenges for many states and districts. Many states have used their flexibility to define their indicators as both numerical goals and progress toward those goals, where progress has generally ranged from no increase to a 1 percent increase from the previous year. Therefore, some states have set expectations that their schools may not graduate many more students than previously. Education has addressed these challenges by developing some guidance and providing support such as on-site peer reviews, conferences, and information on its Web site. However, because Education's approach has been to provide guidance on how to deal with specific student circumstances on a case-by-case basis, not all states have received such guidance. Without guidance, state officials may not appropriately include students in these specific circumstances in their graduation rate definitions, resulting in graduation rates that may be inaccurate. Such inconsistent calculations raise questions about the quality of graduation rates reported by states. A key challenge for states is to ensure that student data used for calculating state graduation rates, as well as data provided to NCES, are accurate and that state systems have the internal controls and data verification checks to promote data reliability. As some states transition to new graduation rate definitions, it is important that they ensure that such controls are part of new student data systems. Student data accuracy is particularly important because Education plans to use state data reported to NCES to develop interim graduation rate estimates, which are intended to promote consistency across states and provide a nationwide perspective. 
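The stakes of recording student status correctly can be seen in a simple hypothetical. In a cohort-style calculation, a student who verifiably transfers to another school is removed from the denominator, while a dropout remains in it, so misrecording dropouts as transfers inflates the reported rate. The counts below are invented for illustration and do not reproduce any particular state's formula:

```python
def cohort_graduation_rate(entering, graduates, transfers_out):
    """Cohort-style rate: graduates divided by the entering class,
    with verified transfers removed from the denominator.
    (Hypothetical sketch; actual state formulas vary.)"""
    return graduates / (entering - transfers_out)

# Suppose 200 students enter 9th grade and 140 graduate on time.
# If all 60 leavers actually dropped out, the true rate is 70%;
# if 30 of those dropouts are misrecorded as transfers, the
# reported rate rises to about 82% for the same school.
true_rate = cohort_graduation_rate(entering=200, graduates=140, transfers_out=0)
reported  = cohort_graduation_rate(entering=200, graduates=140, transfers_out=30)
print(f"{true_rate:.0%} vs. {reported:.0%}")  # prints "70% vs. 82%"
```

A 12-percentage-point swing from a single misclassified group of leavers illustrates why controls over how schools record student status, including transfers, matter so much for the reliability of reported rates.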
Finally, little is known about the success of interventions that are designed to increase high school graduation rates. While some programs have shown potential to increase such rates, few have been rigorously evaluated. Some intervention programs have undergone limited evaluations of a variety of outcomes (attendance, test scores, job attainment), but more comprehensive evaluations are necessary to understand programs' effects on graduation rates. As a result, schools and districts may not be using the most effective approaches to help their students stay in school and graduate. Education could play an important role in evaluating existing research, as we recommended in our 2002 dropout report. Although Education agreed with this recommendation, the agency has not established a clear plan or timetable for carrying it out. Additionally, Education should disseminate the results of research, since such information will be critical as high school reform moves forward. Recommendations for Executive Action: To assist states in improving their definitions of high school graduation rates and enhancing the consistency of these rates, we recommend that the Secretary of Education make information available to all states on modifications available to account for students in special programs and students with disabilities in their graduation rate calculations. This information could include fuller explanations or examples of available flexibilities. We recommend that the Secretary of Education, before developing interim graduation rate estimates, assess the reliability of data submitted by states for this purpose. This assessment could include specific criteria demonstrating that states' data systems can produce accurate data. 
We recommend that the Secretary establish a timetable for carrying out the recommendation in our 2002 report that Education evaluate research on dropout interventions, including those interventions that focus on increasing graduation rates. In addition, we recommend that the Secretary disseminate research on programs shown to be effective in increasing graduation rates. Agency Comments and Our Evaluation: We provided a draft of this report to Education for review and comment. In its letter, Education concurred with two of our three recommendations: (1) about making information available to all states on modifications available to account for students in special programs and students with disabilities in their graduation rate calculations and (2) about evaluating research on dropout interventions and disseminating such research on those programs shown to be effective in increasing graduation rates. Regarding our recommendation that the department assess the reliability of data submitted by states that it plans to use to develop interim graduation rate estimates, Education noted that it has taken a number of steps to conduct such reliability assessments. However, it is not clear whether these efforts include the data that Education will be using to develop interim graduation rate estimates. Although data submitted to Education are publicly available and have been reported by states for years, their reliability has not been determined. We believe that Education should take additional steps to ensure the reliability of these data before they are used in calculating such estimates. Education officials also provided technical comments that we incorporated into the report where appropriate. Education's written comments are reproduced in appendix II. We are sending copies of this report to the Secretary of Education, relevant congressional committees, and other interested parties. We will also make copies available to others upon request.
In addition, the report will be made available at no charge on GAO's Web site at http://www.gao.gov. Please contact me at (202) 512-7215 if you or your staff have any questions about this report. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Other contacts and major contributors are listed in appendix III. Signed by: Marnie S. Shaul, Director: Education, Workforce, and Income Security Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: To address the objectives of this study, we used a variety of methodological approaches. We analyzed the plans states were required to submit to Education to identify the graduation rate definitions states used and the graduation rate indicators set by states, reviewed updates to plans submitted through July 2005, and reviewed letters from Education to states regarding its decisions about state plans and updates. As part of another GAO review, we surveyed officials in 50 states, the District of Columbia, and Puerto Rico to obtain information about two issues--the extent to which (1) states verify school and district data used to calculate high school graduation rates and (2) states have unique student identifiers. The surveys were conducted using self-administered electronic questionnaires posted on the World Wide Web. We sent e-mail notifications to all 52 state Performance Based Data Management Initiative coordinators (50 U.S. states, the District of Columbia, and Puerto Rico) beginning on November 15, 2004. We closed the survey on January 13, 2005, after the 50th respondent had replied. Washington state and the District of Columbia did not complete the survey in time to be included in our analysis. We selected 20 states for further analysis. States were selected to capture variation in high school graduation rate definitions, geographic location, and types of interventions with the potential to increase graduation rates.
We conducted: * a case study in 1 state (Washington state) to calculate graduation rates; * site visits in 3 states (Georgia, North Carolina, and Washington) to review data accuracy; * site visits in 6 states (Georgia, Illinois, Minnesota, New York, Washington, and Wyoming) to observe interventions and interview program staff; and * semistructured telephone interviews in all 20 states to obtain information on definitions used, implementation status, and guidance provided by Education. See table 4 for a list of states selected for site visits and phone interviews based on the research objective we studied.

Table 4: States Selected for Site Visits and Phone Interviews by Purpose:

First research question (data definitions and calculations): Washington[A]. (1 state)

First research question (rationale for selecting definitions): California; Colorado; Connecticut; Delaware; Florida; Georgia[A]; Illinois; Indiana; Kansas; Massachusetts; Minnesota; Mississippi; New Hampshire; New Mexico; New York; North Carolina[A]; Pennsylvania; Washington[A]; Wisconsin; Wyoming. (20 states)

Second research question (data accuracy): Georgia[A]; North Carolina[A]; Washington[A]. (3 states)

Third research question (interventions): Georgia[A]; Illinois[A]; Minnesota[A]; New York[A]; Washington[A]; Wyoming[A]. (6 states)

Source: GAO Analysis. [A] States where GAO team conducted site visits. [End of table]

In our case study we used student data from Washington state for the 2002-03 school year, the most recent school year for which data were available at the time of our review.
Using these data, we conducted an analysis comparing the results of calculating the high school graduation rate using two different graduation rate definitions--the cohort definition and the departure classification definition. Washington state used a modified cohort formula that was based on tracking student dropouts rather than on tracking student transfers.[Footnote 44] It also required all students with "unknown" status to be reported as dropouts. We also used these data to analyze the effects on the graduation rate of (1) allowing schools to make progress toward the graduation rate target as a means of making AYP and (2) an estimated miscount of the number of dropouts. We interviewed experts to determine reasonable rates at which dropouts may be in error. We analyzed data using a set of 444 of the state's 547 high schools. The 103 high schools that were not included in our analysis were those with graduation rates of 10 percent or less. These were generally alternative high schools, such as those designed to serve students who had committed serious crimes. We also interviewed a state official who confirmed our understanding of the omitted schools and agreed with the reasonableness of the criterion. Although our analyses were based on a 4-year period, we used the 1 year of student data and estimated information for the 3 prior years. We did not obtain student data from prior years because state officials told us that data accuracy had improved significantly in the 2002-03 school year. We assessed the reliability of the Washington state data by (1) performing electronic testing of required data elements for missing data and for obvious errors, (2) reviewing existing information about the data and the system that produced them, and (3) interviewing Washington state officials knowledgeable about the data. However, we did not verify the data against source information. We determined that the data were sufficiently reliable for the purposes of this report.
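Washington's modified cohort formula, as described above, tracked dropouts rather than transfers and counted students of "unknown" status as dropouts. The report does not give the state's exact arithmetic, so the sketch below makes a simplifying assumption: the rate is graduates divided by graduates plus cumulative dropouts (including unknowns) over the 4-year period.

```python
def modified_cohort_rate(graduates, dropouts_by_year, unknowns=0):
    """Simplified sketch of a dropout-based cohort graduation rate.

    Assumption (not the state's exact formula): every cohort member is
    ultimately either a graduate or a dropout, and students with
    "unknown" status are counted as dropouts, as Washington required.
    """
    total_dropouts = sum(dropouts_by_year) + unknowns
    return graduates / (graduates + total_dropouts)

# 80 graduates and 5 dropouts in each of 4 years yields an 80% rate;
# reclassifying 20 "unknown" students as dropouts pulls the rate down.
print(modified_cohort_rate(80, [5, 5, 5, 5]))              # 0.8
print(modified_cohort_rate(80, [5, 5, 5, 5], unknowns=20))
```

The second call also shows how a miscount of dropouts would move the computed rate, which is the sensitivity the case study examined.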
To identify interventions with the potential to increase graduation rates, we used a "snowballing" approach. Using this approach, we reviewed the literature on interventions, interviewed Education officials and dropout prevention experts, and reviewed Web sites, such as the National Dropout Prevention Centers Web site (http://www.dropoutprevention.org/), to identify interventions with the potential to increase high school graduation rates. Based on the research we reviewed and on recommendations from experts, we selected several interventions at various locations around the country. For those interventions we selected to visit, we reviewed available evaluations, including the findings related to outcomes, such as increased graduation rates and improved literacy. We also assessed the methodological approaches of these evaluations. Based on our review, we identified 3 interventions that had been rigorously evaluated and had shown potential to increase graduation rates and visited 3 schools that had implemented these programs. (Rigorous evaluations of 2 other interventions that showed promising results were released subsequent to our field work. We reviewed the results of these evaluations and reported their findings.) We also visited schools that had implemented 13 other interventions that, according to experts and research, showed promise in affecting factors that may improve graduation rates. However, rigorous evaluations of these programs had not been done at the time of our review. To determine how Education assists states, we reviewed Education regulations, guidance, and other documents and interviewed Education and state agency officials. We also interviewed these officials to determine the degree to which Education's actions have enhanced and disseminated knowledge about interventions.
Finally, we interviewed officials from the National Governors Association, national education organizations, and other experts in the area of high school graduation rates and reviewed related research to obtain an understanding of the issues surrounding these rates and high school reform efforts to address them. We conducted our work between September 2004 and July 2005 in accordance with generally accepted government auditing standards. [End of section] Appendix II: Comments from the Department of Education: UNITED STATES DEPARTMENT OF EDUCATION: THE ASSISTANT SECRETARY: OFFICE OF ELEMENTARY AND SECONDARY EDUCATION: August 25, 2005: Ms. Marnie S. Shaul: Director, Education, Workforce and Income: Security Issues: Government Accountability Office: 441 G Street, NW: Washington, DC 20548: Dear Ms. Shaul: I am writing in response to your request for comments on the Government Accountability Office (GAO) draft report (GAO-05-879), dated September 2005, and entitled "No Child Left Behind Act: Education Could Do More to Help States Better Define Graduation Rates and Improve Knowledge about Intervention Strategies." I appreciate the opportunity to comment on the draft report and provide insight on actions the Department of Education is taking to help States better define graduation rates and improve knowledge about intervention strategies. Governors and education leaders across the country have acknowledged the need for a more accurate graduation rate across the States. As Deputy Secretary Raymond Simon stated at the National Governors Association meeting in June 2005, "There is no doubt that this nation needs a better way to get a handle on how many students graduate from high school. Right now, each state calculates and reports graduation rates differently, which prevents us from seeing the big picture of the country's education level." 
For this reason, the Department will calculate the Averaged Freshman Graduation Rate for each State and report this rate alongside the graduation rates reported by the States. I believe that by using this approach the nation will achieve a more comprehensive and accurate assessment of the percentage of students who graduate from high school in four years. Improving the accuracy of our graduation rate statistics will allow the Department, States, and school districts to better target resources and modify instructional practices for children who might otherwise become school dropouts. Regarding the recommendations contained in the draft report, I provide the following responses: GAO Recommendation 1: We recommend that the Secretary of Education provide information to all States on ways to account for different types of students in graduation rate calculations. We agree with this recommendation and will work with various offices in the Department to provide additional policy guidance to States on ways to account for different types of students in graduation rate calculations. We initiated this discussion with the States during the first round of accountability system plan peer reviews conducted during the spring of 2003. During those reviews, Department staff and peer reviewers discussed with each State which students should be included as graduates for NCLB accountability purposes as States developed their NCLB accountability system plans. In addition, the Office of Elementary and Secondary Education has incorporated several measures in its Title I monitoring process to address this concern. Our Title I monitors, during their State monitoring visits, collect evidence on whether a State has established clear criteria and quality control mechanisms for collecting data from schools and school districts that are used for accountability purposes. 
Department staff specifically focus on the graduation rate indicator in the accountability section of the Title I monitoring instrument to determine how States are calculating the graduation rate and to ensure that, as required by NCLB, dropouts and students who earn equivalency and special diplomas are not counted as regular diploma graduates. The Department is also working with States to consolidate and streamline State data collections and to establish a set of common definitions across many of the programs we fund. Although not yet operational in every State, we believe the Performance Based Data Management Initiative will greatly improve data collection and result in significantly improved, more consistent data. I believe that the efforts undertaken by the Department reflect our leadership in addressing data quality concerns that are raised by your draft report. GAO Recommendation 2: We recommend that the Secretary of Education, before developing interim graduation rate estimates, assess the reliability of data submitted by states for this purpose. This assessment could include specific criteria that demonstrate that states' data systems can produce accurate data. We believe that the Department's decision to calculate the Averaged Freshman Graduation Rate for each State and to report this rate alongside the graduation rates reported by the State will help to enhance the reliability of the graduation rate data reported. We agree with the recommendation of the National Institute of Statistical Sciences (NISS)/Education Statistics Service Institute (ESSI) Task Force on Graduation, Completion, and Dropout Indicators, that the most accurate measure of an on-time graduation rate is a cohort rate that is computed from a student record data system that includes verified data on the status of individual students.
Data from such a student record data system could also be used to calculate five-year graduation rates, and rates that allow time accommodations for students with Individualized Educational Programs (IEPs) that specify longer than four years for the student to complete a high school program. At the present time several States have student record data systems up and running and many more States are in the planning and development stages for student record systems. To assist in these development activities, the Department will be awarding grants to States this year under the new Statewide Data Systems program, for which Congress provided $24.8 million. By definition, a cohort rate for a four-year on-time graduation rate requires four years of data. Because some States are still in the early planning stages, it will be some years before cohort graduation rates will be available for all States. There is some variability in what States have been approved to report in the NCLB accountability reports as their data collection efforts in this area evolve. However, in the interest of having a common metric to use across States, the Department turned to an analysis conducted by NCES to select an interim graduation rate that would be independent of the graduation rates States calculate for determining adequate yearly progress. That analysis examined the range of alternative proposed graduation rates using publicly available data from the NCES Common Core of Data (CCD), and using data provided by two States that have had student record data systems in place for a number of years. The results of the analysis pointed to the Averaged Freshman Graduation Rate as the best available graduation indicator that can be computed on an interim basis using cross-sectional data currently reported in CCD.
The Averaged Freshman Graduation Rate uses the State's report of regular diploma recipients in the numerator, and the denominator is the average of the number of 8th graders five years earlier, 9th graders four years earlier, and 10th graders three years earlier. One of the positive qualities of this interim measure is that it relies on basic data elements that have been reported to NCES for the CCD for a number of years. This measure does not require the use of dropout data, and thus avoids the problems that GAO, NCES, and others have identified with unverified dropout data. GAO also points to possible differences in definitions of graduates across States resulting from the continued use of existing data collection systems. The Department agrees with the NISS/ESSI Task Force recommendation that new energies should focus primarily on the development of student record data systems in the States as opposed to efforts to retool or improve existing systems; however, in the interest of transparency, NCES is currently conducting a review of the individual States' reported practices for identifying and categorizing regular diploma recipients and other types of high school completers. This information will be provided with the interim rates to ensure that any definitional differences are available. Additionally, the Department has taken a number of steps to assess the reliability of data submitted by States, including a review of the data elements contained in State information management systems. The Department's Strategic Accountability Service office conducted a review of the characteristics of individual States as related to their technological readiness for participation in the pilot of the Department's Performance Based Data Management Initiative in 2003 and again in the Spring of 2004.
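The Averaged Freshman Graduation Rate described above is fully specified by its numerator and denominator, so it can be computed directly from CCD-style enrollment counts. A minimal sketch (the function and argument names are ours, not NCES's):

```python
def averaged_freshman_graduation_rate(diplomas, g8_5yr_ago, g9_4yr_ago, g10_3yr_ago):
    """Averaged Freshman Graduation Rate (AFGR).

    Numerator: regular diploma recipients in the graduating year.
    Denominator: the average of the number of 8th graders five years
    earlier, 9th graders four years earlier, and 10th graders three
    years earlier--an estimate of the size of the freshman class.
    """
    estimated_freshmen = (g8_5yr_ago + g9_4yr_ago + g10_3yr_ago) / 3
    return diplomas / estimated_freshmen

# 750 diplomas against earlier-grade counts averaging 1,000 gives 75%.
print(averaged_freshman_graduation_rate(750, 1000, 1050, 950))  # 0.75
```

Averaging three grade counts helps smooth out enrollment bulges, such as a 9th grade count inflated by retained students, that would distort an estimate based on a single year.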
We learned through the site visits to the 50 States, the District of Columbia and Puerto Rico that State education agencies (SEAs) are moving steadily toward automated data collections at the individual student (and staff) level. However, major hurdles must be overcome that affect the accuracy and reliability of the data collected, which include building and maintaining the technical infrastructure of hardware, software, networks, and staff, and facilitating the inclusion, education, and reassurance of stakeholders about the controls that can exist within an automated system. During the SEA site visits, teams composed of Department staff and consultants provided technical assistance to the SEAs on the Department's efforts to streamline its data collection efforts. We also collected information on State information system data element definitions as a part of the process of learning how States define such data elements as graduation rate, and how they ensure via the data collection process that the data they are collecting are accurate. In our Title I, Part A Report Card Guidance, issued during September 2003, the Department provided information on State and school district responsibilities for ensuring that the information on report cards, including graduation rates, is statistically reliable and does not reveal personally identifiable information about individual students. The Report Card Guidance also presents information regarding how States and local educational agencies can ensure the accuracy of report card data.
In response to the September 2004 GAO report (GAO-04-734), entitled "No Child Left Behind Act: Improvements Needed in Education's Process for Tracking States' Implementation of Key Provisions," the Department initiated a contract task order to develop data quality guidelines for States and school districts that will provide guidance and suggestions for improving their internal quality control systems to reduce errors and increase reliability, as well as improve data quality monitoring procedures. The Department will disseminate this guide to States, along with a PowerPoint presentation that States can use to train district and school personnel on the application of information contained in the guide. GAO Recommendation 3: We recommend that the Secretary of Education establish a timetable for carrying out the recommendation in our 2002 report that Education evaluate research on dropout interventions, including those interventions that focus on increasing graduation rates. In addition, we recommend that the Secretary disseminate research on programs shown to be effective in increasing graduation rates. We agree with this recommendation and began, in April 2005, through the IES What Works Clearinghouse, to review and identify research on effective intervention strategies for dropout prevention. The review, Interventions for Preventing High School Dropout, is examining secondary school (middle, junior, and high school) interventions designed to keep students in school and contribute to high school completion, and will address the following questions: * Which dropout prevention programs are effective in keeping students in school and helping them progress in school? * Are some components and types of dropout prevention programs more effective than others? * Are some dropout prevention programs more effective for some types of students, such as minority students or special education students?
The Clearinghouse has developed drafts of the protocol, coding guide, and intervention list that are initial steps in the review process. A preliminary literature search for this review has yielded more than 1,700 articles and, of those articles, 1,038 studies have been identified as meeting initial relevancy for inclusion in the review. So far, the review team has identified approximately 15 to 18 potential interventions. The first release of reports on Interventions for Preventing High School Dropout is planned for the early part of 2006 for dissemination on the What Works Clearinghouse website. The completion date for this review is contingent on the final number of interventions that have studies that pass the Clearinghouse's evidence standards, but the Department is projecting completion by the end of 2006. The Department continues to provide guidance and technical assistance to States that are refining their graduation rate definitions. We are also reviewing information that will provide States with best practices for intervening with students at risk of dropping out of school. The Department acknowledges that there is still much work to be done in increasing the accuracy of graduation rates and in improving instructional practices that will promote school completion. We look forward to continuing to work with States as they refine their graduation rates and implement programs that will keep at-risk students in school. The Department will also continue to support States in their efforts to improve data quality and accountability. Thank you again for the opportunity to comment. Sincerely, Henry L. Johnson: [End of section] Appendix III: GAO Contact and Staff Acknowledgments: GAO Contact: Marnie S. Shaul, (202) 512-7215, shaulm@gao.gov: Staff Acknowledgments: Harriet Ganson (Assistant Director), Julianne Hartman Cutts (Analyst-in-Charge), and Jason Palmer (Senior Analyst) managed all aspects of the assignment.
Dan Klabunde made significant contributions to this report, in all aspects of the work. In addition, Sheranda Smith-Campbell, Nagla'a El-Hodiri, and Greg Kato provided analytic assistance. Jean McSween, Karen O'Conor, and Beverly Ross provided technical support. Jim Rebbe and Sheila McCoy provided legal support, and Corinna Nicolaou assisted in the message and report development. [End of section] Related GAO Products: No Child Left Behind Act: Improvements Needed in Education's Process for Tracking States' Implementation of Key Provisions. GAO-04-734. Washington, D.C.: September 30, 2004. No Child Left Behind Act: Additional Assistance and Research on Effective Strategies Would Help Small Rural Districts. GAO-04-909. Washington, D.C.: September 23, 2004. Special Education: Additional Assistance and Better Coordination Needed among Education Offices to Help States Meet the NCLBA Teacher Requirements. GAO-04-659. Washington, D.C.: July 15, 2004. Student Mentoring Programs: Education's Monitoring and Information Sharing Could Be Improved. GAO-04-581. Washington, D.C.: June 25, 2004. Title I: Characteristics of Tests Will Influence Expenses; Information Sharing May Help States Realize Efficiencies. GAO-03-389. Washington, D.C.: May 8, 2003. Title I: Education Needs to Monitor States' Scoring of Assessments. GAO-02-393. Washington, D.C.: April 1, 2002. School Dropouts: Education Could Play a Stronger Role in Identifying and Disseminating Promising Prevention Strategies. GAO-02-240. Washington, D.C.: February 1, 2002. Elementary School Children: Many Change Schools Frequently, Harming Their Education. GAO/HEHS-94-45. Washington, D.C.: February 4, 1994. [End of section] Bibliography: Burns, Matthew K., Barbara V. Senesac, and Todd Symington. "The Effectiveness of the HOSTS Program in Improving the Reading Achievement of Children At-Risk for Reading Failure." Reading Research and Instruction, vol. 43, no.
2 (2004): 87-103. Dynarski, Mark, Philip Gleason, Anu Rangarajan, and Robert Wood. Impacts of Dropout Prevention Programs, Final Report. Princeton, New Jersey: Mathematica Policy Research, Inc., 1998. Dynarski, Mark, Philip Gleason, Anu Rangarajan, and Robert Wood. Impacts of School Restructuring Initiatives, Final Report. Princeton, New Jersey: Mathematica Policy Research, Inc., 1998. Dynarski, Mark and Philip Gleason. How Can We Help? What We Have Learned From Evaluations of Federal Dropout Prevention Programs? A Research Report from the School Dropout Demonstration Assistance Program Evaluation. Princeton, New Jersey: Mathematica Policy Research, Inc., 1998. Gingras, Rosano, and Rudy Careaga. Limited English Proficient Students at Risk: Issues and Prevention Strategies. Silver Spring, Maryland: National Clearinghouse for Bilingual Education, 1989. Greene, J. P. and Marcus A. Winters. Public School Graduation Rates in the United States. New York: Manhattan Institute for Policy Research, 2002, http://www.manhattan-institute.org/pdf/cr_31.pdf (accessed June 21, 2005). Kemple, James J. Career Academies: Impacts on Labor Market Outcomes and Educational Attainment. New York: Manpower Demonstration Research Corporation, December 2001. Kemple, James J., Corinne M. Herlihy, and Thomas J. Smith. Making Progress towards Graduation: Evidence from the Talent Development High School Model. New York: Manpower Demonstration Research Corporation, May 2005. Kerbow, David. "Patterns of Urban Student Mobility and Local School Reform." Journal of Education for Students Placed at Risk, vol. 1, no. 2 (1996): 149-171. Lehr, Camilla A. and Cheryl M. Lange. "Alternative Schools Serving Students with and without Disabilities: What Are the Current Issues and Challenges." Preventing School Failure, vol. 47, no. 2 (2003): 59-65. Opuni, K. A. Project GRAD Newark: 2003-2004 Program Evaluation Report. Houston, Texas: Center for Research on School Reform, February 2005. Quint, Janet, Howard S.
Bloom, Alison Rebeck Black, LaFleur Stephens, and Theresa M. Akey. The Challenge of Scaling Up Educational Reform: Findings and Lessons from First Things First. New York: Manpower Demonstration Research Corporation, July 2005. Rumberger, Russell, and Scott Thomas. "The Distribution of Dropout and Turnover Rates among Urban and Suburban High Schools." Sociology of Education, vol. 73, no. 1 (2000): 39-69. Sinclair, M. F., S. L. Christenson, and M. L. Thurlow. "Promoting School Completion of Urban Secondary Youth with Emotional or Behavioral Disabilities." Exceptional Children (in press). Snow, Catherine E., M. Susan Burns, and Peg Griffin, Eds. Preventing Reading Difficulties in Young Children. Washington, D.C.: National Academy Press, 1998. Swanson, Christopher B. Keeping Count and Losing Count: Calculating Graduation Rates for All Students under NCLB Accountability. Washington, D.C.: Urban Institute, 2003, http://www.urban.org/url.cfm?ID=410843 (downloaded June 21, 2005). Shannon, Sue G., and Pete Bylsma. Helping Students Finish School: Why Students Drop Out, and How to Help Them Graduate. Olympia, Washington: Office of Superintendent of Public Instruction, 2003. U.S. Department of Education, National Center for Education Statistics, National Forum on Education Statistics. Forum Guide to Building a Culture of Quality Data: A School and District Resource. NFES 2005-801. Washington, D.C.: 2004. U.S. Department of Education, National Center for Education Statistics. National Institute of Statistical Sciences/Education Statistics Services Institute Task Force on Graduation, Completion, and Dropout Indicators. NCES 2005-105. Washington, D.C.: 2004. Wagner, Mary. Dropouts with Disabilities: What Do We Know? What Can We Do? A Report from the National Longitudinal Transition Study of Special Education Students. Menlo Park, California: SRI International, 1991.
FOOTNOTES [1] GAO, School Dropouts: Education Could Play A Stronger Role in Identifying and Disseminating Promising Prevention Strategies, GAO-02-240 (Washington, D.C.: Feb. 1, 2002). [2] Hereinafter, the term states will refer collectively to the 50 states plus the District of Columbia and Puerto Rico. [3] Students with disabilities refers to students covered under the Individuals with Disabilities Education Improvement Act of 2004, the primary law that addresses the unique educational needs of children with disabilities. [4] U.S. Department of Education, National Forum on Education Statistics, Forum Guide to Building a Culture of Quality Data: A School and District Resource, NFES 2005-801 (Washington, D.C.: 2004). [5] Schools designated as in need of improvement under the IASA had their designation carry over after NCLBA took effect. Also, schools receiving students through the school choice option must not be identified for improvement. [6] U.S. Department of Education, National Center for Education Statistics, National Institute of Statistical Sciences/Education Statistics Services Institute Task Force on Graduation, Completion, and Dropout Indicators, NCES 2005-105 (Washington, D.C.: 2004). [7] GAO, Title I: Education Needs to Monitor States' Scoring of Assessments, GAO-02-393 (Washington, D.C.: Apr. 1, 2002) and Title I Program: Stronger Accountability Needed for Performance of Disadvantaged Students, GAO/HEHS-00-89 (Washington, D.C.: June 1, 2000). U.S. Department of Education, Office of Inspector General, Department of Education Management Challenges (November 2004). [8] Dynarski, Mark and Philip Gleason, How Can We Help? What We Have Learned from Evaluations of Federal Dropout Prevention Programs? A Research Report from the School Dropout Demonstration Assistance Program Evaluation (Princeton, New Jersey: Mathematica Policy Research, Inc., 1998).
[9] The following two approaches we identified--restructuring and supplemental services--do not refer to the specific restructuring and supplemental services provisions in the NCLBA. Instead, these approaches are broader, encompassing a variety of intervention practices that states and districts are attempting.

[10] These included literacy programs, which, although not specifically discussed in our 2002 report, are also examples of how these approaches can be implemented.

[11] In July 2005, governors of 47 states signed a compact agreeing to adopt the National Governors Association's recommended cohort-based graduation rate formula in order to develop a comparable graduation rate definition. However, our analysis was based on the state plans rather than on this agreement.

[12] States may either track individual students from a 9th grade cohort or approximate a cohort, such as by estimating the number of students who enter the 9th grade and who transfer in and out.

[13] U.S. Department of Education, National Center for Education Statistics, National Institute of Statistical Sciences/Education Statistics Services Institute Task Force on Graduation, Completion, and Dropout Indicators, NCES 2005-105 (Washington, D.C.: 2004).

[14] For the purposes of simplifying this example, we set the number of net transfers over the 4-year period at zero. We recognize that cohorts likely would have some number of net transfers.

[15] Ten of these states consider students receiving alternative certificates separately from dropouts, while the remaining 22 states count them as dropouts in their definitions. NCES calculates a high school graduation rate using only diploma recipients as graduates, excluding other high school completers, such as those who earned a certificate of attendance or a GED certificate. It also calculates a "high school completer rate" using diploma recipients and other high school completers, except GED recipients, as completers.
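The cohort approximation described in footnote 12, combined with footnote 14's simplifying assumption of zero net transfers, can be sketched as follows. This is an illustrative sketch only: the function name and all counts are hypothetical, not figures from the report.

```python
def cohort_graduation_rate(entering_9th, transfers_in, transfers_out, graduates):
    """Basic cohort rate: graduates divided by the adjusted cohort
    (students entering 9th grade, plus transfers in, minus transfers out)."""
    adjusted_cohort = entering_9th + transfers_in - transfers_out
    return graduates / adjusted_cohort

# Hypothetical school: 200 entering 9th graders, net transfers of zero
# (footnote 14's simplification), and 150 graduates 4 years later.
rate = cohort_graduation_rate(entering_9th=200, transfers_in=10,
                              transfers_out=10, graduates=150)
print(f"{rate:.0%}")  # prints 75%
```

A state approximating a cohort would estimate these counts from aggregate enrollment data, while a state with a longitudinal data system would track the individual students themselves.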
[16] We followed the state's version of the cohort definition, which used dropout rates and not transfers. The basic cohort definition (fig. 2) accounts for the original number of students in the cohort plus transfers, while the state's version accounts for dropouts.

[17] See, for example, Greene, J. P., Public School Graduation Rates in the United States (New York: Manhattan Institute for Policy Research, 2002), http://www.manhattan-institute.org/pdf/cr_31.pdf (downloaded June 21, 2005); and Swanson, Christopher B., Keeping Count and Losing Count: Calculating Graduation Rates for All Students under NCLB Accountability (Washington, D.C.: Urban Institute, 2003), http://www.urban.org/url.cfm?ID=410843 (downloaded June 21, 2005).

[18] For example, one state's graduation rate definition divides the number of graduates by the number of 12th graders at the beginning of the school year. This definition does not take into consideration the number of students who dropped out in earlier years, resulting in a higher graduation rate than would have been produced using a definition that considered such students. In contrast, 2 states used a dropout rate definition that divides the number of dropouts in grades 9 through 12 by the number of students enrolled in those grades for the current year.

[19] These schools would make AYP, assuming they also met the testing requirements.

[20] This issue is relevant because the number of states that have such a college component is growing. For example, 19 states had Early College High Schools as of September 2004, and 25 states were projected to have them by 2005. These high schools are designed so that students can receive 2 years of college credit at the same time as they earn a high school diploma--up to 5 years after starting 9th grade.

[21] As of July 2005, Education stated that it had received requests from 5 additional states to count as graduates those students with disabilities who receive a regular diploma but take additional years to earn it.
Education also received requests from 4 states for similar consideration for Limited English Proficient students. The remaining plans did not include or did not address this topic.

[22] For example, research has shown that this is particularly true for students with disabilities. See Wagner, Mary, Dropouts with Disabilities: What Do We Know? What Can We Do? (Menlo Park, Calif.: SRI International, 1991), a report based on the National Longitudinal Transition Study of Special Education Students. According to the author, a second phase of the study is under way, and data collected as of June 2005 have shown that this continues to be the case.

[23] States were required to provide an assurance that students who drop out would not be counted as transfers.

[24] Rumberger, Russell, and Scott Thomas, "The Distribution of Dropout and Turnover Rates among Urban and Suburban High Schools," Sociology of Education, vol. 73, no. 1 (2000): 39-67. Kerbow, David, "Patterns of Urban Student Mobility and Local School Reform," Journal of Education for Students Placed at Risk, vol. 1, no. 2 (1996): 147-169. GAO, Elementary School Children: Many Change Schools Frequently, Harming Their Education, GAO/HEHS-94-45 (Washington, D.C.: Feb. 4, 1994).

[25] The median high school in this example is the school in the middle of all the state's schools when they were rank-ordered according to their graduation rates.

[26] Experts we interviewed said that the hypothetical error rates chosen were reasonable given the quality of dropout data typically maintained by schools and school districts.

[27] The National Forum on Education Statistics issued a similar guide, Forum Guide to Building a Culture of Quality Data: A School and District Resource, NFES 2005-801 (Washington, D.C.: 2004).
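Footnotes 16, 23, and 26 together explain why dropout-based rates are sensitive to recording errors: a dropout misrecorded as a transfer simply disappears from the denominator. A minimal sketch of that effect, with all counts hypothetical and the function name our own:

```python
def dropout_based_rate(graduates, recorded_dropouts):
    """Dropout-based rate (the state version noted in footnote 16):
    graduates divided by graduates plus dropouts recorded over 4 years."""
    return graduates / (graduates + recorded_dropouts)

# Hypothetical school: 150 graduates and 50 true dropouts.
true_rate = dropout_based_rate(150, 50)

# If 10 of the 50 dropouts are misrecorded as transfers, they vanish
# from the denominator and the reported rate rises.
reported_rate = dropout_based_rate(150, 40)

print(f"{true_rate:.0%} vs {reported_rate:.0%}")  # prints 75% vs 79%
```

This is why the assurance in footnote 23, that dropouts not be counted as transfers, matters for the accuracy of reported rates.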
[28] Education will calculate the rate based on the number of high school graduates receiving a regular diploma in a given year divided by the average number of students enrolled in 8th grade 5 years earlier, 9th grade 4 years earlier, and 10th grade 3 years earlier.

[29] See, for example, Snow, Catherine E., M. Susan Burns, and Peg Griffin, Eds., Preventing Reading Difficulties in Young Children (Washington, D.C.: National Academy Press, 1998).

[30] GAO-02-240.

[31] Two of these evaluations, the Talent Development Model and First Things First, were released after we had completed our fieldwork.

[32] Sinclair, M. F., S. L. Christenson, and M. L. Thurlow, "Promoting School Completion of Urban Secondary Youth with Emotional or Behavioral Disabilities," Exceptional Children (in press).

[33] Opuni, K. A., Project GRAD Newark: 2003-2004 Program Evaluation Report (Houston, Texas: Center for Research on School Reform, February 2005).

[34] Burns, Matthew K., Barbara V. Senesac, and Todd Symington, "The Effectiveness of the HOSTS Program in Improving the Reading Achievement of Children At-risk for Reading Failure," Reading Research and Instruction, vol. 43, no. 2 (2004): 87-104.

[35] Kemple, James J., Corinne M. Herlihy, and Thomas J. Smith, Making Progress Towards Graduation: Evidence from the Talent Development High School Model (New York: Manpower Demonstration Research Corporation, May 2005).

[36] Quint, Janet, Howard S. Bloom, Alison Rebeck Black, LaFleur Stephens, and Theresa M. Akey, The Challenge of Scaling Up Educational Reform: Findings and Lessons from First Things First (New York: Manpower Demonstration Research Corporation, July 2005).

[37] Lehr, Camilla A., and Cheryl M. Lange, "Alternative Schools Serving Students with and without Disabilities: What Are the Current Issues and Challenges?" Preventing School Failure, vol. 47, no. 2 (2003): 59-65.
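The estimate Education uses, described in footnote 28, can be expressed directly. The enrollment figures below are hypothetical, chosen only to show the arithmetic:

```python
def averaged_freshman_graduation_rate(graduates, grade8_enrollment,
                                      grade9_enrollment, grade10_enrollment):
    """Regular-diploma graduates in a given year, divided by the average
    of 8th-grade enrollment 5 years earlier, 9th-grade enrollment 4 years
    earlier, and 10th-grade enrollment 3 years earlier (footnote 28)."""
    base = (grade8_enrollment + grade9_enrollment + grade10_enrollment) / 3
    return graduates / base

# Hypothetical counts: 150 graduates; earlier enrollments of 190, 210, 200.
rate = averaged_freshman_graduation_rate(150, 190, 210, 200)
print(f"{rate:.0%}")  # prints 75%
```

Averaging three adjacent grades presumably smooths year-to-year enrollment bumps, such as students retained in 9th grade, which would otherwise distort a single-grade baseline.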
[38] Kemple, James J., and Judith Scott-Clayton, Career Academies: Impact on Students' Initial Transitions to Post-Secondary Education and Employment (New York: Manpower Demonstration Research Corporation, December 2001).

[39] Dynarski, Mark, Philip Gleason, Anu Rangarajan, and Robert Wood, Impacts of Dropout Prevention Programs, Final Report (Princeton, New Jersey: Mathematica Policy Research, Inc., 1998).

[40] Ibid.

[41] See, for example, Gingras, Rosano, and Rudy Careaga, Limited English Proficient Students at Risk: Issues and Prevention Strategies (Silver Spring, Md.: National Clearinghouse for Bilingual Education, 1989).

[42] See, for example, Shannon, Sue G., and Pete Bylsma, Helping Students Finish School: Why Students Drop Out and How to Help Them Graduate (Olympia, Wash.: Office of Superintendent of Public Instruction, 2003).

[43] The What Works Clearinghouse, funded by Education, has a Web site that will summarize evidence on the effectiveness of different programs, products, practices, and policies intended to improve student outcomes. The site is planned to include interventions in middle school, junior high school, or high school designed to increase high school completion, including such techniques as the use of incentives, counseling, or monitoring as the prevention/intervention of choice.

[44] Generally, cohort definitions are based on tracking student transfers.

GAO's Mission:

The Government Accountability Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to e-mail alerts" under the "Order GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to:

U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548

To order by Phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs:
Jeff Nelligan, managing director, NelliganJ@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, D.C. 20548
