Child Care

States Have Undertaken a Variety of Quality Improvement Initiatives, but More Evaluations of Effectiveness Are Needed GAO ID: GAO-02-897 September 6, 2002

The demand for child care has increased dramatically in the past several decades as the number of mothers who work outside the home has grown. Welfare reform has further increased this demand. To support low-income parents moving into the workforce, welfare reform established the Child Care and Development Fund (CCDF). In fiscal year 2000, states spent $5.3 billion in CCDF funds to subsidize child care for low-income families. Out of concern for the quality of care supported by CCDF funds, welfare reform legislation also required states to set aside at least 4 percent of the total grant to improve the quality and availability of child care. Department of Health and Human Services (HHS) regulations provide examples of allowable activities, such as providing child care providers with financial incentives for meeting state and local standards, improving the compensation of child care staff, and offering resource and referral services. However, the regulations do not limit states' use of funds to these activities; rather, the fund's block grant structure allows states considerable flexibility in choosing appropriate quality and availability improvements to pursue. Using primarily the 4 percent quality set-aside, states reported undertaking a variety of child care quality improvement initiatives, such as training caregivers, raising the compensation of caregivers, referring parents to child care providers, and enhancing the safety of child care facilities. Although few states have evaluated the effects of their quality improvement initiatives on children's development, some studies provide useful findings about them. The research on child care quality does not evaluate initiatives as actually implemented by states, but a few studies, using rigorous methods, show that some of the attributes of child care quality that these initiatives address, such as caregiver qualifications, are linked to children's social, emotional, and cognitive development.

Recommendations

Our recommendation from this work is listed below with a Contact for more information. Status will change from "In process" to "Open," "Closed - implemented," or "Closed - not implemented" based on our follow-up work.

Recommendation: We recommend that HHS include selected state quality improvement initiatives in a major impact evaluation of state child care subsidy strategies.



GAO-02-897, Child Care: States Have Undertaken a Variety of Quality Improvement Initiatives, but More Evaluations of Effectiveness Are Needed This is the accessible text file for GAO report number GAO-02-897 entitled 'Child Care: States Have Undertaken a Variety of Quality Improvement Initiatives, but More Evaluations of Effectiveness Are Needed' which was released on September 6, 2002. This text file was formatted by the U.S. General Accounting Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. United States General Accounting Office: GAO: Report to Congressional Requesters: September 2002: Child Care: States Have Undertaken a Variety of Quality Improvement Initiatives, but More Evaluations of Effectiveness Are Needed: GAO-02-897: Contents: Letter: Results in Brief: Background: States Undertook a Variety of Initiatives, Primarily Using the 4 Percent Set-Aside: Few States Have Evaluated the Effectiveness of State Quality Improvement Initiatives: Conclusion: Recommendation: Agency Comments: Appendix I: Scope and Methodology: Scope: Methodology: Appendix II: State-Initiated Studies of Quality Improvement: Appendix III: Child Care Quality Research Findings: Child Care Quality Linked to Socio-Emotional Development: Child Care Quality Linked to Cognitive Development: Child Care Quality Linked to Child Development Over Time: Appendix IV: Comments from the Department of Health and Human Services: Appendix V: GAO Contacts and Staff Acknowledgments: GAO Contacts: Staff Acknowledgments: Bibliography: Related GAO Products: Tables: Table 1: Types and Descriptions of Child Care Providers: Table 2: Rules for Obligating and Spending Funds in the CCDF Funding Streams: Table 3: Categories Used to Describe States' Child Care Quality Improvement Initiatives: Table 4: States' Reported CCDF Quality Improvement Expenditures in Fiscal Year 2000: Table 5: States' Reported Expenditures Devoted to Each Provider Type, by Initiative: Table 6: Comparison of Quality Improvement Expenditures Distributed to Individual Providers that Were Devoted to Each Provider Type, with Percentage of CCDF-Subsidized Children, by State: Table 7: Major Reviewers' Findings Regarding Child Care Quality Research: Table 8: Data Quality Criteria: Table 9: Criteria for Assessing Evaluations: Table 10: State-initiated Studies of Quality Improvement: Figures: Figure 1: State Expenditures for Quality Improvement Initiatives in Fiscal Year 2000: Figure 2: Percentage of States that Reported Undertaking Nine
Categories of Initiatives: Figure 3: States' Reported Expenditures for Each Initiative, Fiscal Year 2000: Figure 4: States' Reported Expenditures from Each Funding Source, Fiscal Year 2000: Figure 5: Massachusetts's Expenditures on Quality Improvement Initiatives, Fiscal Year 2000: Figure 6: Tennessee's Fiscal Year 2000 Expenditures on Quality Improvement Initiatives: Figure 7: South Dakota's Fiscal Year 2000 Expenditures on Quality Improvement Initiatives: Figure 8: States' Reported Expenditures on Quality Improvement Initiatives Targeted to Providers: Abbreviations: CCDBG: Child Care and Development Block Grant Act of 1990: CCDF: Child Care and Development Fund: ECERS: Early Childhood Environment Rating Scale: FDCRS: Family Day Care Rating Scale: HHS: Department of Health and Human Services: ITERS: Infant/Toddler Environment Rating Scale: MCHB: Maternal and Child Health Bureau: MOE: maintenance of effort: NAEYC: National Association for the Education of Young Children: NICHD: National Institute of Child Health and Human Development: OPRE: Office of Planning, Research and Evaluation: PRWORA: Personal Responsibility and Work Opportunity Reconciliation Act of 1996: TANF: Temporary Assistance for Needy Families: TEACH: Teacher Education and Compensation Helps: [End of section] United States General Accounting Office: Washington, DC 20548: September 6, 2002: The Honorable Edward M. Kennedy: Chairman: Committee on Health, Education, Labor and Pensions: United States Senate: The Honorable Christopher J. Dodd: Chairman: Subcommittee on Children and Families: Committee on Health, Education, Labor and Pensions: United States Senate: The Honorable Jack Reed: United States Senate: The demand for child care has increased dramatically in the past several decades as the number of mothers who work outside the home has grown. Welfare reform has further increased this demand. To support low-income parents moving into the workforce, welfare reform established the Child Care and Development Fund (CCDF). In fiscal year 2000, states spent $5.3 billion in federal CCDF funds to subsidize child care for low-income families. Out of concern for the quality of care supported by CCDF funds, welfare reform legislation also required states to set aside at least 4 percent of the total grant to improve the quality and availability of child care. Department of Health and Human Services (HHS) regulations provide examples of allowable activities, such as providing child care providers with financial incentives for meeting state and local standards, improving the compensation of child care staff, and offering resource and referral services. However, the regulations do not limit states' use of funds to these activities; rather, the fund's block grant structure allows states considerable flexibility in choosing appropriate quality and availability improvements to pursue. As Congress considers the CCDF's structure and funding level in preparation for reauthorization in 2002, interest has increased in the types of quality improvement initiatives that 4 percent set-aside funds are supporting, the estimated percentage of federal and state funds being spent on such initiatives, and the extent to which states are assessing the initiatives' effects. Accordingly, you asked us to examine (1) what quality improvement initiatives states have undertaken with the 4 percent set-aside and other funding sources and (2) what evidence has been gathered, if any, about the effectiveness of states' initiatives.
To determine what initiatives states have conducted, we surveyed CCDF lead state agencies in the 50 states and the District of Columbia about the use of CCDF and other funds in fiscal year 2000. We received responses from 42 states. We asked them to classify their quality improvement initiatives into nine general categories, which include the major activities identified in the law, HHS's regulations, and the child care literature, and to identify the funding sources for each initiative category and the amount spent. We also conducted case studies of five states (California, Massachusetts, South Dakota, Tennessee, and Wisconsin) to gather data that would amplify information on states' initiatives collected by the survey. We selected states that were diverse geographically and in population density and that represented a variety of child care quality improvement initiatives. We also considered each state's income distribution, licensing caseloads, use of Temporary Assistance for Needy Families funds, and whether state licensing requirements reflected child-to-staff ratios recommended by national child care accrediting bodies. To examine the evidence of effectiveness, we asked state lead agencies for evaluations of their initiatives, contacted HHS and researchers regarding their work, and assessed the evaluations we identified. We also reviewed major summaries and methodological critiques of the research literature on child care quality. Appendix I provides additional details about our scope and methodology. We conducted our work between December 2001 and June 2002 in accordance with generally accepted government auditing standards. Results in Brief: Using primarily the 4 percent quality set-aside, states reported undertaking a variety of child care quality improvement initiatives, such as training caregivers, raising the compensation of caregivers, referring parents to child care providers, and enhancing the safety of child care facilities, as shown in the figure below. State officials in the five case study states cited several factors that influenced the initiatives states undertook, including the perspective of the governor or state legislature about high quality care, recent events in the child care community, and previous research. Figure 1: State Expenditures for Quality Improvement Initiatives in Fiscal Year 2000: [See PDF for image] This figure is a pie-chart depicting the following data: State Expenditures for Quality Improvement Initiatives in Fiscal Year 2000: Resource and referral: 20%; Enhanced inspections: 14%; Meeting state standards: 13%; Caregiver compensation: 12%; Off-site caregiver training: 11%; Safety equipment/improvement: 8%; Incentives for accreditation: 8%; On-site caregiver training: 2%; All other activities: 12%. Source: GAO analysis of GAO survey data. [End of figure] The majority of states reported expenditures exceeding the 4 percent set-aside's minimum requirement. Among the 34 states that tracked the type of provider targeted, child care centers received over two-thirds of expenditures on quality initiatives that distributed funds and resources to providers, while less than a third of such expenditures went to family child care or after-school care. While few states have evaluated the effects of their quality improvement initiatives on children's development, some studies provide useful findings about them.
Officials in four of five states we talked to explained that states must make trade-offs between serving more families and conducting evaluations of their own quality improvement initiatives. Out of the handful of state-sponsored studies, a few had study designs that isolated an initiative‘s effect and survey response rates that provided reliable estimates. The research on child care quality does not evaluate initiatives as actually implemented by states, but a few studies, using rigorous methods, show that some of the attributes of child care quality that these initiatives address, such as caregiver qualifications, are linked to children‘s social, emotional and cognitive development. To provide states with rigorous research evidence about how to modify ongoing initiatives or invest in new ones, we are recommending that HHS include selected state quality improvement initiatives in a major impact evaluation of state child care subsidy strategies. Background: Child care services are supplied by providers operating in varied settings: in center care, a child is cared for in a nonresidential setting and in family child care, a child is cared for in the home of a provider. Child care centers provide care outside of the home, but family child care is provided to a small number of unrelated children”typically fewer than six”in a provider‘s home. Some child care centers and family child care homes also offer school-aged care for children before and after school. (See table 1.) Generally, children in center-based care and family child care have not yet started school and after-school care is offered to children in kindergarten through age 12. Table 1: Types and Descriptions of Child Care Providers: Type of provider: Child care center; Description[A]: Care typically provided for 12 or more children in a nonresidential facility. Type of provider: Family child care; Description[A]: Care provided for a small group of children in a provider‘s home. Type of provider: Informal care; Description[A]: Legally operating care given by adults, including relatives and friends and usually unregulated. [A] Table 1 provides a general description of different types of child care providers. In actuality, states define child care differently and have different licensure and regulatory requirements. Source: U.S. General Accounting Office, States Increased Spending on Low-Income Families, GAO-01-293 (Washington, D.C.: Feb. 2, 2002) and Implications of Increased Work Participation for Child Care, GAO/HEHS-95-75 (Washington, D.C.: May 29, 1997). [End of table] Research on child care quality identified two broad sets of attributes that pertain to quality in all child care settings: structural attributes of the child care environment and children‘s daily interactions with their caregivers. Structural attributes of child care include characteristics such as child-to-staff ratios, the number of children per caregiver in a classroom; group size, the number of children assigned to a team of caregivers in a classroom; caregiver formal education; caregivers‘ specialized training; caregiver wages; staff turnover; the amount of floor space per child; and health and safety features, such as frequent staff and child hand washing. Child-caregiver interactions refers to actual experiences that occur in child care settings, and include such attributes as caregiver sensitivity and responsiveness, caregiver participation in children‘s play and learning activities, and language stimulation by caregivers. 
State and local governments are responsible for the oversight of child care providers that operate in their state. Each state establishes its own child care standards, determining the areas and types of providers that the standards will cover and the specific criteria that will be used to determine provider compliance. Most child care providers are required to meet a state‘s standards to obtain a license to operate legally in a state. State child care standards primarily focus on the structural attributes of care. States can turn to organizations such as the National Association for the Education of Young Children (NAEYC) and the Maternal and Child Health Bureau (MCHB) in HHS‘s Public Health Service that have developed standards based on research and professional practice. NAEYC, the nation‘s largest association of early child professionals, was formed to improve professional practice in early childhood care and education and increase public understanding of high quality early childhood programs. NAEYC also accredits, through a voluntary system, early childhood centers and schools. In 1998, we reported that state licensing standards varied in the extent to which the standards reflected those of NAEYC and MCHB. For example, we found that only two states had standards for caregiver education and training that matched NAEYC standards. Typically, state standards tended to require significantly fewer years of education than the standards set by NAEYC. Thus, to achieve accreditation by a national accrediting body, child care providers may have to meet higher standards than those they would meet to obtain and keep a state operating license. CCDF Structure and Spending Requirements: Title I of the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 (PRWORA) overhauled the nation‘s welfare system by replacing the legal entitlement to cash assistance under the previous welfare program with the Temporary Assistance for Needy Families (TANF) block grant. Title VI of PRWORA amended the Child Care and Development Block Grant Act of 1990 (CCDBG) and combined CCDBG funds with the funding of three other federal child care programs. HHS named the combined set of funds the CCDF. Each state receives an annual CCDF allocation composed of funds from three separate funding streams: discretionary, mandatory and matching. Assessing the portion of CCDF funds states spend on quality improvement is complicated to some extent by the distinct set of rules covering each stream that determine the time period allowed for obligating and spending the funds. (See table 2.) Table 2: Rules for Obligating and Spending Funds in the CCDF Funding Streams: Funding streams: Discretionary; Time period for obligating funds: Within 2 fiscal years after a grant award; Time period for spending funds: Within 3 years after a grant award. Funding streams: Mandatory; Time period for obligating funds: To receive CCDF matching funds, within the fiscal year of a grant award; Time period for spending funds: Available until spent. Funding streams: Matching; Time period for obligating funds: Within the fiscal year of a grant award; Time period for spending funds: Within 2 years after a grant award. [End of table] Each state receives a share of the total amount of money in the discretionary funding stream, which is determined each year by the congressional appropriations process. A state‘s share of discretionary funds is based on a formula stipulated in the statute. 
A state must obligate discretionary funds within 2 fiscal years after a grant award and spend the funds by the end of the following fiscal year. A state‘s share of mandatory funds is based on the amount of funds the state received from a set of federal child care programs in a base year. Mandatory funds are available until they are spent. However, to receive federal matching funds, a state must obligate all mandatory funds by the end of the fiscal year in which they were awarded; maintain program spending of state funds at a specified level, referred to as a state‘s maintenance of effort (MOE); and spend additional state funds above that level. States may spend more of their own funds on child care than the amount actually accounted for under CCDF‘s MOE and matching requirements. Federal and state matching funds must be committed by the end of the fiscal year in which they are received and spent by the end of the following fiscal year. Finally, funds transferred from the TANF block grant represent an additional source of funds for the CCDF. PRWORA allowed states the flexibility to transfer up to 30 percent of TANF funds to the CCDF. Transferred TANF funds are treated as part of the discretionary funding stream and are subject to CCDF rules. States must spend at least 4 percent of their CCDF funds”of discretionary, mandatory and matching, but not of state MOE funds”for a given fiscal year on activities intended to improve the quality and availability of child care. Specifically, the law requires that states use at least 4 percent of these funds for activities to provide comprehensive consumer education to parents and the public, activities that increase parental choice, and activities designed to improve the quality and availability of child care. As stated earlier, HHS, through its regulations, has provided illustrative examples of activities designed to improve the quality of child care. In addition, the regulations permit other expenditures that are consistent with the intent of the regulation. This provision of PRWORA is known as the 4 percent set-aside. Congress also has earmarked money in CCDF‘s discretionary fund for resource and referral services and school-age care, infant and toddler care and quality-related activities. Any funds expended for the activity beyond the designated earmarks can be used to meet the 4 percent set-aside requirement. Earmarked funds must be tracked and reported separately from 4 percent set-aside expenditures. For fiscal year 2001, Congress provided $19,100,000 for the resource and referral services and school-age care earmark, $100,000,000 for the infant and toddler earmark, and $172,600,000 for the quality-related activities earmark. [Footnote 1] These earmark amounts were continued for fiscal year 2002. HHS guidance for expenditure of the quality-related activities earmark includes activities similar to those approved for the 4 percent set-aside, but covers additional suggestions such as specific health activities, special needs child care and activities that support cultural diversity. Methodology: To encompass the broad range of quality improvement initiatives that states are undertaking, including those allowed by the 4 percent set-aside provision and HHS‘s regulations as well as strategies suggested in child care quality research and practice, we developed a framework for describing the initiatives and analyzing states‘ expenditures on them. 
To assess evidence on the effectiveness of states‘ initiatives that has been developed in the research community, we developed criteria for data and research quality that reflect GAO‘s methodological standards and those of the broader policy research community. Framework for Analyzing States‘ Quality Improvement Expenditures: The CCDF 4 percent set-aside provision and HHS regulations specify several types of activities for which quality improvement funds may be expended but also allow states the discretion to include other activities. HHS also requires states to report total expenditures of 4 percent set-aside funds annually but does not require separate reporting for the quality improvement initiatives the states undertake. Thus, to examine states‘ quality improvement expenditures, we developed a framework to guide data collection and analysis. Beginning with examples of the allowable activities included in the 4 percent set-aside provision and HHS regulations, we specified nine categories to characterize states‘ initiatives. The categories are based on several sources. (See table 3.) Most categories”caregiver compensation, meeting state standards, safety equipment or improvement, caregiver education and training and resource and referral”are based on examples in the law and regulations. Because our analytic framework includes the full range of states‘ quality improvement initiatives, including those funded by sources other than CCDF, we identified additional categories based on child care quality literature. On-site training and enhanced inspections were included as categories based on the Department of Defense‘s child development program, which has been widely recognized as a model of high quality care. [Footnote 2] Incentives for achieving accreditation or exceeding standards is a category based on several studies of child care quality improvement strategies that look beyond the scope of activities cited in the law. The final category for other quality-related activities included initiatives that may be unique to a state and those that may foster the availability of high quality care, such as strategies that provide consumer education or increase parental choice. Because activities to provide comprehensive consumer education to parents and the public and increase parental choice were not included among the activities noted by the law or HHS regulations as designed to improve child care quality, we did not include these activities in our framework. However, states were free to report these or other quality-related activities when they construed them as such. Table 3: Categories Used to Describe States‘ Child Care Quality Improvement Initiatives: Initiative: Caregiver compensation; Basis: CCDF regulations; Description: Funding for caregivers or providers to increase caregivers‘ salary or benefits. Initiative: On-site training; Basis: Military child development program; Description: Funding for training of caregivers provided at employment site. Initiative: Meeting State Standards; Basis: CCDF regulations; Description: Funding for the purpose of helping providers meet state standards and consequently become licensed. Initiative: Safety equipment or improvement; Basis: CCDF regulations; Description: Funding for the purpose of helping providers improve safety. Initiative: Incentives for accreditation or exceeding standards; Basis: Literature on child care quality and CCDF regulations; Description: Funding to encourage providers to meet some higher standard. 
Initiative: Caregiver education or training; Basis: CCDF regulations; Description: Funding for caregivers to receive training or education, often in child development or health and safety; may include scholarships, funding of class at a community college, or other training not at the caregivers‘ place of employment. Initiative: Resource and Referral Activities; Basis: CCDBG Act and CCDF regulations; Description: Funding for parent and provider support, including activities to help parents find a provider, coordination of caregiver training, provision of materials and training to caregivers, or provision of technical assistance to caregivers. Initiative: Enhanced Inspections; Basis: Military child development program; Description: Funding to increase the frequency of inspections of child care providers, increase the scope of the inspections, or decrease inspector caseload. Initiative: Other quality-related activities; Basis: CCDBG Act and CCDF regulations; Description: Funding for other state-initiated activities. Source: GAO analysis. [End of table] Methods for Evaluating the Effectiveness of States‘ Quality Improvement Initiatives: Prior to and since CCDF‘s creation, a large body of research on child care that included an analysis of its effects on children‘s development has been accumulating. In 1990, the National Research Council assessed this research, focusing on the costs, effects and feasibility of child care policies and programs. [Footnote 3] As part of this assessment, the council concluded that child care quality is linked to children‘s development. The council emphasized that it would be important for future research to examine exactly how the various components of quality affected children‘s development, what magnitude of improvement in development could be expected from measured improvements in quality and, most importantly, whether the quality of child care has an effect on children‘s development that is separate from that of family characteristics. Noting that studies using random assignment of children to differing child care arrangements of varying quality provide the most rigorous evidence of whether child care quality has an effect that is separate from family characteristics, the council also found that random assignment had been used rarely in studies of community-based child care settings. Pointing out the contributions of experimental designs, the term given to studies that employ random assignment, to research on early interventions for children from disadvantaged families, the council urged that experimental designs be used in future research on child care quality. Other reviews of the research on child care quality, while agreeing on the importance of looking at the effects of child care quality separately from the effects of family characteristics, acknowledged the practical difficulties of random assignment and recommended an alternative approach that uses advanced statistical methods and a comparison group, an approach known as quasi-experimental design. [Footnote 4] In conducting our assessment of research on the effectiveness of states‘ quality improvement initiatives, we also used the criterion that to determine a program‘s effect, an evaluation should employ an experimental or quasi-experimental design. Appendix I provides additional details about these designs and our complete scope and methodology. 
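The following sketch is illustrative only and is not drawn from this report or from any state evaluation; it uses hypothetical, simulated data (all variable names and numbers are invented) to show, in Python, the quasi-experimental logic described above: estimating the association between a child care quality measure and a child outcome while statistically controlling for a family characteristic, so that the quality estimate does not simply reflect family differences.

    # Minimal sketch with hypothetical data: naive versus covariate-adjusted
    # estimates of a child care quality effect, illustrating why separating
    # quality from family characteristics matters. Assumes numpy and statsmodels.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    family_income = rng.normal(50, 15, n)                  # hypothetical family characteristic
    quality = 0.02 * family_income + rng.normal(0, 1, n)   # higher-income families tend to obtain higher-quality care
    outcome = 0.5 * quality + 0.03 * family_income + rng.normal(0, 1, n)  # true quality effect set to 0.5

    # Naive comparison: the quality coefficient absorbs part of the family effect.
    naive = sm.OLS(outcome, sm.add_constant(quality)).fit()

    # Covariate-adjusted (quasi-experimental) estimate: controls for family income.
    adjusted = sm.OLS(outcome, sm.add_constant(np.column_stack([quality, family_income]))).fit()

    print(naive.params[1], adjusted.params[1])  # the adjusted estimate is close to the true 0.5; the naive one is larger

A randomized experiment would achieve the same separation by design, through random assignment, rather than by statistical adjustment.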
States Undertook a Variety of Initiatives, Primarily Using the 4 Percent Set-Aside: Using primarily the 4 percent quality set-aside, states reported undertaking a variety of child care quality improvement initiatives, such as efforts to train caregivers, raise the compensation of caregivers, and enhance the safety of child care facilities. State officials in the five case study states cited several factors that influenced the initiatives they undertook, including state legislators‘ perspectives on what constitutes quality care, recent events in the state child care community, evaluations, and other previous research. While states are required to spend 4 percent of the CCDF on quality, the majority of states reported quality expenditures in excess of this minimum requirement. Among the 34 states that tracked the type of provider targeted, child care centers received over two-thirds of those quality expenditures distributed to providers, while less than a third of such expenditures went to family child care or after-school care. Because initiatives that distributed funds to providers constituted 54 percent of states‘ expenditures for quality improvement, expenditures devoted to centers represented about 39 percent of states‘ total reported expenditures for quality improvement. Resource and Referral Predominates in States‘ Quality Improvement Expenditures: States reported undertaking resource and referral activities more than any other initiative. (See fig. 2.) Resource and referral services are identified in the CCDBG Act as an example of activities for which states may make expenditures for quality improvement. Two of the states we visited described the use of resource and referral agencies to deliver technical assistance to providers. In South Dakota, the state‘s five resource and referral agencies provided child care providers with technical assistance needed to meet regulatory requirements. In Massachusetts, the child care agency used resource and referral agencies to assist providers in caring for children with special needs, such as a child with a disability. In collaboration with the state‘s Department of Public Health, the Massachusetts child care agency developed a consultation program for special needs children. Consultation program representatives helped resource and referral agencies understand what a child‘s needs were when placing the child with a provider. Three states described the use of resource and referral agencies as a vehicle for training. The South Dakota and California child care agencies used resource and referral agencies to deliver all training for child care providers. In Massachusetts, the state‘s resource and referral agencies trained providers in using the Early Childhood Environment Rating Scale for self-assessments of quality, in using information technology and in early literacy. Providers who then implemented early literacy initiatives for their staff were offered rate increases. Initiatives received different proportions of total reported expenditures for quality improvement, using all funding sources. [Footnote 5] In addition to being the most frequently undertaken, states reported that resource and referral activities received a larger share of reported expenditures on quality improvement than did any other initiative, about 20 percent of all expenditures. (See fig. 3.) 
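A note on the arithmetic behind the estimate of about 39 percent cited at the beginning of this section (an illustrative cross-check rather than a calculation reported by the states): using the center share of provider-directed expenditures shown later in figure 8, roughly 72 percent, and the 54 percent of total quality expenditures devoted to provider-directed initiatives, 0.72 x 0.54 is approximately 0.39, or about 39 percent of states' total reported quality improvement expenditures.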
Figure 2: Percentage of States that Reported Undertaking Nine Categories of Initiatives: [See PDF for image] This figure is a vertical bar graph depicting the following approximated data: Category: Resource and referral; Percentage of states: approximately 85%. Category: Off-site caregiver training; Percentage of states: approximately 80%. Category: Meeting state standards; Percentage of states: approximately 72%. Category: Safety equipment improvement; Percentage of states: approximately 70%. Category: Incentives for accreditation; Percentage of states: approximately 60%. Category: Enhanced inspections; Percentage of states: approximately 55%. Category: Caregiver compensation; Percentage of states: approximately 35%; Category: On-site caregiver training; Percentage of states: approximately 30%. Category: All other activities; Percentage of states: approximately 70%. Source: GAO analysis of GAO survey data. [End of figure] Figure 3: States‘ Reported Expenditures for Each Initiative, Fiscal Year 2000: [See PDF for image] This figure is a pie-chart depicting the following data: States‘ Reported Expenditures for Each Initiative, Fiscal Year 2000: Resource and referral: 20%; Enhanced inspections: 14%; Meeting state standards: 13%; Caregiver compensation: 12%; Off-site caregiver training: 11%; Safety equipment/improvement: 8%; Incentives for accreditation: 8%; On-site caregiver training: 2%; All other activities: 12%. Source: GAO analysis of GAO survey data. [End of figure] States reported undertaking several initiatives to improve caregiver qualifications and compensation. Eighty-two percent of the states funded an off-site caregiver training initiative. One example is the Teacher Education and Compensation Helps (TEACH) program, which provides caregivers with scholarships to attend college classes related to child development. TEACH began in North Carolina and has been replicated in 17 states. One-third of the states undertook initiatives to improve caregiver compensation through increased wages or benefits. For example, child care officials in Massachusetts use Quality Awards to reward child care staff and family child care providers with one-time bonuses for excellence in their work. A similar number of states reported funding on-site caregiver training, which provides caregivers with training and education opportunities at their place of employment. Officials in Wisconsin reported funding caregiver training for increased safety in family child care homes. Taken together, these initiatives received about 25 percent of the expenditures states reported. Most states also reported undertaking initiatives to assist providers in meeting state standards and to reward providers for exceeding state standards. Thirty of the 42 responding states reported providing funding to assist child care providers in meeting state licensing standards, such as California‘s provision of funding to assist child care providers with health and safety standards and a variety of training requirements. Additionally, 29 of the 42 responding states reported funding safety improvements. For example, South Dakota‘s health and safety funding offers child care providers up to 75 percent of the cost of safety equipment, such as windows designed to provide an escape route in the event of an emergency. Over half the states reported providing incentives for child care providers to become accredited or exceed state standards. 
Under its child care program, Wisconsin did so by setting the maximum reimbursement rate for providers that met accreditation standards, which exceed licensing standards, 10 percent higher than the regular reimbursement rate. [Footnote 6] Initiatives to assist providers in meeting state standards received about 13 percent of states‘ reported quality improvement expenditures. However, although many states reported funding safety improvements and offering incentives for accreditation or exceeding standards, these initiatives received the smallest shares of funding. Half the states reported initiatives devoted to enhancing inspections of child care facilities, either by increasing the frequency or the thoroughness of such inspections. Although less commonly reported than several other initiatives, these inspection efforts received the second largest proportion of quality funds. Finally, over half the states reported undertaking initiatives in the all other activities category. These included consumer education campaigns and improvement of the quality and availability of care for special populations, such as infants, toddlers, and children with special needs. California, for example, reported funding a variety of other activities including school-age curriculum and material development and a program for infant/toddler caregiver training coordinators. The CCDF 4 percent set-aside funded nearly half of all state-reported expenditures on quality. (See fig. 4.) State funds were the next largest funding source, constituting almost one-third of all expenditures on quality improvement initiatives. States also made use of earmark funds, additional CCDF funds, and money available from TANF. Though some states did make use of funds available from private foundations and other sources, this constituted a negligible proportion of the total. Figure 4: States‘ Reported Expenditures from Each Funding Source, Fiscal Year 2000: [See PDF for image] This figure is a pie-chart depicting the following data: Reported Expenditures from Each Funding Source, Fiscal Year 2000: CCDF 4 percent set-aside: 48%; State funds: 29%; CCDF earmarks: 12%; Additional CCDF: 10%; TANF: 6%. Source: GAO analysis of GAO survey data. [End of figure] State Officials Attributed Quality Improvement Decisions to Two Key Factors: The views of state officials in both the executive and legislative branches of state government are considered in the allocation of federal and state child care funds for quality improvement. Officials in four of the five states we visited cited the views of state officials and previous research as two key factors in their selection of initiatives. In four states, decisions on how child care funds are allocated among the various quality initiatives are determined through the legislative process. For example, in Massachusetts, the state legislature‘s perspective and previous research were cited as reasons that most quality initiative funds were devoted to caregiver compensation. Officials we spoke with said the state legislature supported early literacy, which led officials to offer rate increases to providers that implemented early literacy initiatives. In addition, research led state officials to believe that improving caregiver compensation would increase child care quality. [Footnote 7] Therefore, these rate increases were meant to enable providers to improve caregiver compensation. (See fig. 5.) 
Figure 5: Massachusetts‘s Expenditures on Quality Improvement Initiatives, Fiscal Year 2000: [See PDF for image] This figure is a pie-chart depicting the following data: Massachusetts‘s Expenditures on Quality Improvement Initiatives, Fiscal Year 2000: Resource and referral: 59%; Meeting state standards: 27%; Caregiver compensation: 14%. Source: GAO analysis of GAO survey data. [End of figure] Officials in Tennessee explained that several factors”recent events in the child care community, the state legislature‘s perspective, and research”influenced the state‘s emphasis on enhanced inspections. The Tennessee lead agency director told us that in 1999, the accidental deaths of two children in a child care van prompted the legislature to focus on quality improvement initiatives. Subsequently, Tennessee instituted a policy of criminal background checks and an increase in the number of unannounced inspections of child care facilities. Tennessee‘s distribution of funds emphasized this focus on inspections, as seen in figure 6. Tennessee now conducts six unannounced inspections of each facility per year. [Footnote 8] In addition, Tennessee officials consulted research on child care quality to inform their decisions but did not sponsor evaluation, pointing to the trade-off between funding evaluations and direct services to improve quality. Figure 6: Tennessee‘s Fiscal Year 2000 Expenditures on Quality Improvement Initiatives: [See PDF for image] This figure is a pie-chart depicting the following data: Tennessee‘s Fiscal Year 2000 Expenditures on Quality Improvement Initiatives: Enhanced inspections: 74%; Off-site caregiver education: 25%; Meeting state standards: 1%. Source: GAO analysis of GAO survey data. [End of figure] Wisconsin officials told us that research and gubernatorial proposals influenced the selection of a range of quality improvement initiatives. State officials said they analyzed data on quality improvement programs, and consulted experts in the child care field. The state legislature and the governor also influenced priorities. For example, in January 1999, the governor put forth a proposal to direct $15 million into an Early Childhood Excellence Commission to develop high quality child care in low income neighborhoods. In South Dakota, the decision to emphasize resource and referral agencies was guided by previous research and the governor‘s perspective. State officials relied on existing child care quality research for making choices about how to improve quality because they believed that sponsoring evaluations would be too resource intensive. On the basis of previous research findings, state officials believed training caregivers to be the central mechanism through which child care quality could be improved. After obtaining the governor‘s support, child care officials directed funding to resource and referral centers to train caregivers. (See fig. 7.) Figure 7: South Dakota‘s Fiscal Year 2000 Expenditures on Quality Improvement Initiatives: [See PDF for image] This figure is a pie-chart depicting the following data: South Dakota‘s Fiscal Year 2000 Expenditures on Quality Improvement Initiatives: Resource and referral: 85%; Off-site caregiver training: 14%; Safety equipment/improvement: 1%. Source: GAO analysis of GAO survey data. 
[End of figure] Most States Spend More Than 4 Percent on Quality Improvement: While HHS requires that states spend at least 4 percent of the CCDF on quality improvement, the majority of states reported expenditures for quality in excess of this minimum requirement in fiscal year 2000. In fact, in that year, 23 of 42 states reported expenditures representing 8 percent or more of the CCDF on quality related activities. We estimated that the percentage of the CCDF expended on quality ranged from 3 percent in California, Idaho and New Mexico to 38 percent in Kansas. (See table 4.) These reported expenditures are a snapshot of states‘ expenditures for quality improvement in fiscal year 2000. Because of the distinct set of rules covering each of CCDF‘s three funding streams, expenditures in that year by an individual state may have drawn on funds available from CCDF grants made in fiscal years 1998, 1999, or 2000. The percentage expenditure of funds from a particular fiscal year‘s grant cannot be determined definitively until time limitations on all funding streams have expired. [Footnote 9] Table 4: States‘ Reported CCDF Quality Improvement Expenditures in Fiscal Year 2000: State[A]: Alabama; State reported CCDF quality expenditures: $4,725,482; Average CCDF grant[B]: $77,493,201; Percentage expended based on state report of expenditures[B]: 6. State[A]: Alaska; State reported CCDF quality expenditures: $2,198,373; Average CCDF grant[B]: $23,327,524; Percentage expended based on state report of expenditures[B]: 9. State[A]: Arizona; State reported CCDF quality expenditures: $6,197,000; Average CCDF grant[B]: $88,131,425; Percentage expended based on state report of expenditures[B]: 7. State[A]: Arkansas; State reported CCDF quality expenditures: $1,998,221; Average CCDF grant[B]: $33,560,449 Percentage expended based on state report of expenditures[B]: 6. State[A]: California; State reported CCDF quality expenditures: $20,700,000; Average CCDF grant[B]: $639,666,033; Percentage expended based on state report of expenditures[B]: 3. State[A]: Colorado; State reported CCDF quality expenditures: $1,509,043; Average CCDF grant[B]: $39,379,887; Percentage expended based on state report of expenditures[B]: 4. State[A]: Delaware; State reported CCDF quality expenditures: $871,830; Average CCDF grant[B]: $11,186,263 Percentage expended based on state report of expenditures[B]: 8. State[A]: Georgia; State reported CCDF quality expenditures: $6,683,969; Average CCDF grant[B]: $131,333,851; Percentage expended based on state report of expenditures[B]: 5. State[A]: Hawaii; State reported CCDF quality expenditures: $1,758,003; Average CCDF grant[B]: $20,180,147; Percentage expended based on state report of expenditures[B]: 9. State[A]: Idaho; State reported CCDF quality expenditures: $967,425; Average CCDF grant[B]: $34,699,953; Percentage expended based on state report of expenditures[B]: 3. State[A]: Illinois; State reported CCDF quality expenditures: $22,500,000; Average CCDF grant[B]: $211,895,440; Percentage expended based on state report of expenditures[B]: 11. State[A]: Kansas; State reported CCDF quality expenditures: $14,315,739; Average CCDF grant[B]: $37,897,643; Percentage expended based on state report of expenditures[B]: 38. State[A]: Kentucky; State reported CCDF quality expenditures: $2,670,451; Average CCDF grant[B]: $74,719,865; Percentage expended based on state report of expenditures[B]: 4. 
State[A]: Louisiana; State reported CCDF quality expenditures: $6,349,109; Average CCDF grant[B]: $109,582,724; Percentage expended based on state report of expenditures[B]: 6. State[A]: Maine; State reported CCDF quality expenditures: $4,080,000; Average CCDF grant[B]: $17,754,746; Percentage expended based on state report of expenditures[B]: 23. State[A]: Maryland; State reported CCDF quality expenditures: $29,011,806; Average CCDF grant[B]: $101,462,889; Percentage expended based on state report of expenditures[B]: 29. State[A]: Massachusetts; State reported CCDF quality expenditures: $15,498,039; Average CCDF grant[B]: $171,959,431; Percentage expended based on state report of expenditures[B]: 9. State[A]: Michigan; State reported CCDF quality expenditures: $14,662,330; Average CCDF grant[B]: $178,511,186; Percentage expended based on state report of expenditures[B]: 8. State[A]: Minnesota; State reported CCDF quality expenditures: $8,124,224; Average CCDF grant[B]: $79,371,131; Percentage expended based on state report of expenditures[B]: 10. State[A]: Mississippi; State reported CCDF quality expenditures: $1,770,041; Average CCDF grant[B]: $48,674,596; Percentage expended based on state report of expenditures[B]: 4. State[A]: Missouri; State reported CCDF quality expenditures: $14,360,255; Average CCDF grant[B]: $84,871,476; Percentage expended based on state report of expenditures[B]: 17. State[A]: Montana; State reported CCDF quality expenditures: $1,414,227; Average CCDF grant[B]: $18,137,917; Percentage expended based on state report of expenditures[B]: 8. State[A]: Nebraska; State reported CCDF quality expenditures: $3,000,000; Average CCDF grant[B]: $25,634,209; Percentage expended based on state report of expenditures[B]: 12. State[A]: New Hampshire; State reported CCDF quality expenditures: $1,169,031; Average CCDF grant[B]: $11,343,569; Percentage expended based on state report of expenditures[B]: 10. State[A]: New Jersey; State reported CCDF quality expenditures: $10,700,000; Average CCDF grant[B]: $175,379,185; Percentage expended based on state report of expenditures[B]: 6. State[A]: New Mexico; State reported CCDF quality expenditures: $1,119,790; Average CCDF grant[B]: $40,719,569; Percentage expended based on state report of expenditures[B]: 3. State[A]: North Carolina; State reported CCDF quality expenditures: $9,520,719; Average CCDF grant[B]: $179,122,483; Percentage expended based on state report of expenditures[B]: 5. State[A]: North Dakota; State reported CCDF quality expenditures: $1,404,790; Average CCDF grant[B]: $7,311,957; Percentage expended based on state report of expenditures[B]: 19. State[A]: Ohio; State reported CCDF quality expenditures: $13,446,256; Average CCDF grant[B]: $170,661,715; Percentage expended based on state report of expenditures[B]: 8. State[A]: Oklahoma; State reported CCDF quality expenditures: $3,500,000; Average CCDF grant[B]: $82,646,909; Percentage expended based on state report of expenditures[B]: 4. State[A]: Oregon; State reported CCDF quality expenditures: $4,262,400; Average CCDF grant[B]: $41,411,987; Percentage expended based on state report of expenditures[B]: 10. State[A]: Pennsylvania; State reported CCDF quality expenditures: $12,556,326; Average CCDF grant[B]: $193,953,622; Percentage expended based on state report of expenditures[B]: 6. State[A]: South Carolina; State reported CCDF quality expenditures: $2,204,027; Average CCDF grant[B]: $44,374,421; Percentage expended based on state report of expenditures[B]: 5. 
State[A]: South Dakota; State reported CCDF quality expenditures: $1,408,042; Average CCDF grant[B]: $9,727,797; Percentage expended based on state report of expenditures[B]: 14. State[A]: Tennessee; State reported CCDF quality expenditures: $11,593,876; Average CCDF grant[B]: $120,436,809; Percentage expended based on state report of expenditures[B]: 10. State[A]: Texas; State reported CCDF quality expenditures: $15,183,207; Average CCDF grant[B]: $288,255,772; Percentage expended based on state report of expenditures[B]: 5. State[A]: Utah; State reported CCDF quality expenditures: $4,100,000; Average CCDF grant[B]: $33,632,252; Percentage expended based on state report of expenditures[B]: 12. State[A]: Vermont; State reported CCDF quality expenditures: $1,613,691; Average CCDF grant[B]: $14,941,847; Percentage expended based on state report of expenditures[B]: 11. State[A]: Virginia; State reported CCDF quality expenditures: $5,557,225; Average CCDF grant[B]: $91,906,307; Percentage expended based on state report of expenditures[B]: 6. State[A]: Washington; State reported CCDF quality expenditures: $9,317,551; Average CCDF grant[B]: $162,038,398; Percentage expended based on state report of expenditures[B]: 6. State[A]: Wisconsin; State reported CCDF quality expenditures: $18,500,000; Average CCDF grant[B]: $105,431,604; Percentage expended based on state report of expenditures[B]: 18. State[A]: Wyoming; State reported CCDF quality expenditures: $1,241,670; Average CCDF grant[B]: $10,236,055; Percentage expended based on state report of expenditures[B]: 12. State[A]: Average; State reported CCDF quality expenditures: $$7,470,575; Average CCDF grant[B]: $$94,591,882 Percentage expended based on state report of expenditures[B]: 8. [A] The following states did not reply to the questionnaire: Connecticut, Florida, Indiana, Iowa, Nevada, New York, Rhode Island, and West Virginia. The complete questionnaire submitted by the District of Columbia was not received in time to include the responses in our analyses. [B] In our survey of state CCDF lead agencies, states were asked to consider the amount of 4 percent set-aside funding available in fiscal year 2000, and to report the amount spent on quality in fiscal year 2000. Funds available for expenditure may have included fiscal year 1999 or 2000 matching funds and fiscal years 1998-2000 mandatory and discretionary funds, including TANF transfers. With the exception of mandatory funds, all funds must be expended within three years. Because states‘ expenditures in fiscal year 2000 could have drawn on grants made in fiscal years 1998-2000, the percentage of CCDF expended on quality was estimated by dividing the states‘ response by the average of CCDF grant amounts in fiscal years 1998-2000. In cases where states included earmark or state funds--other than state funds used to match federal funds--in the response, these were removed before calculating a percentage. Thus, the numerator is based on states‘ response to our survey of state CCDF lead agencies. We contacted state officials that reported unusually high or low expenditures to confirm their reports. The denominator is based on CCDF grant information for fiscal years 1998-2000 from HHS, which can be accessed at: [hyperlink, http://www.acf.dhhs.gov/programs/ccb/research/00acf696/summary.htm] and [hyperlink, http://www.acf.dhhs.gov/programs/ccb/research/archive/99acf696/summary.htm]. 
[End of table] Expenditures on Initiatives Directed to Providers Primarily Targeted Child Care Centers: Among the 34 states that tracked the type of provider targeted, child care centers received over two-thirds of all expenditures for six initiatives that states targeted to individual providers. [Footnote 10] (See fig. 8.) However, nationwide, 55 percent of all children whose care involves CCDF assistance are attending child care centers and 32 percent of all children are in center-based care. Thus, centers receive a larger share of quality improvement expenditures targeted to providers than the share of CCDF-subsidized children in their care. In addition, while there was insufficient information in states‘ responses to analyze initiatives devoted to informal care, the policy research community has expressed interest in quality improvement initiatives targeted on these providers because we have the least information about them and a significant number of children are cared for in informal settings. [Footnote 11] Figure 8: States‘ Reported Expenditures on Quality Improvement Initiatives Targeted to Providers: [See PDF for image] This figure is a pie-chart depicting the following data: States‘ Reported Expenditures on Quality Improvement Initiatives Targeted to Providers: Center-based care: 72%; Family child care: 25%; After school care: 3%. Source: GAO analysis of GAO survey data. {End of figure] When we looked at expenditures on individual initiatives by the thirty-four states, we saw the same pattern of emphasis on centers. (See table 5.) For the six initiatives, centers received the majority of funds, followed by family child care. Moreover, for initiatives related to meeting standards, the proportion of expenditures devoted to centers was smaller than for other initiatives, but still greatly exceeded the proportion devoted to family child care and after-school care. Table 5: States‘ Reported Expenditures Devoted to Each Provider Type, by Initiative: Initiative: Incentives for accreditation; Percentage to centers: 87; Percentage to family child care: 13; Percentage to after-school: 0. Initiative: Caregiver Compensation; Percentage to centers: 76; Percentage to family child care: 23; Percentage to after-school: 1. Initiative: Safety equipment/improvements; Percentage to centers: 75; Percentage to family child care: 20; Percentage to after-school: 5. Initiative: On-site caregiver training; Percentage to centers: 74; Percentage to family child care: 21; Percentage to after-school: 5. Initiative: Off-site caregiver training; Percentage to centers: 70; Percentage to family child care: 28; Percentage to after-school: 2. Initiative: Meeting state standards; Percentage to centers: 68; Percentage to family child care: 28; Percentage to after-school: 4. Source: GAO analysis of GAO survey data. [End of table] However, when we examined expenditures on initiatives by individual states, the proportion of expenditures on quality improvement activities devoted to each provider type varied. (See table 6.) For example, Minnesota, Mississippi, Tennessee, Texas and Washington reported devoting 90 percent or more of quality expenditures to centers, and Delaware, Hawaii, Michigan, North Dakota, and Oregon reported devoting less than one-third of quality expenditures to centers. These differences can be explained in part by state-to-state differences in the proportion of children receiving CCDF subsidies that attend each type of provider. 
For example, in Michigan, 19 percent of children receiving CCDF subsidies attend centers, while in Tennessee, 73 percent of children receiving CCDF subsidies attend centers. Given the variation in the proportion of subsidized children attending center-based care, it would be reasonable for Michigan to devote relatively less of its quality expenditures to centers and for Tennessee to devote relatively more of its quality expenditures to centers. Because the CCDF set-aside is intended to improve child care for all children, the law allows states flexibility in developing programs and policies, including quality improvement initiatives and the types of providers targeted. Table 6: Comparison of Quality Improvement Expenditures Distributed to Individual Providers that Were Devoted to Each Provider Type, with Percentage of CCDF-Subsidized Children, by State: State: Alaska; Percentage of quality expenditures to centers: 79; Percentage of CCDF-subsidized children in centers: 35; Percentage of quality expenditures to FCCs: 15; Percentage of CCDF-subsidized children in FCCs: 49; Percentage of quality expenditures to after-school[A]: 6. State: Arizona; Percentage of quality expenditures to centers: 68; Percentage of CCDF-subsidized children in centers: 73; Percentage of quality expenditures to FCCs: 32; Percentage of CCDF-subsidized children in FCCs: 14; Percentage of quality expenditures to after-school[A]: 0. State: Arkansas; Percentage of quality expenditures to centers: 72; Percentage of CCDF-subsidized children in centers: 82; Percentage of quality expenditures to FCCs: 28; Percentage of CCDF-subsidized children in FCCs: 18; Percentage of quality expenditures to after-school[A]: 0. State: California; Percentage of quality expenditures to centers: 79; Percentage of CCDF-subsidized children in centers: 71; Percentage of quality expenditures to FCCs: 16; Percentage of CCDF-subsidized children in FCCs: 17; Percentage of quality expenditures to after-school[A]: 5. State: Colorado; Percentage of quality expenditures to centers: 47; Percentage of CCDF-subsidized children in centers: 57; Percentage of quality expenditures to FCCs: 48; Percentage of CCDF-subsidized children in FCCs: 25; Percentage of quality expenditures to after-school[A]: 5. State: Delaware; Percentage of quality expenditures to centers: 31; Percentage of CCDF-subsidized children in centers: 55; Percentage of quality expenditures to FCCs: 69; Percentage of CCDF-subsidized children in FCCs: 35; Percentage of quality expenditures to after-school[A]: 0. State: Georgia; Percentage of quality expenditures to centers: 70; Percentage of CCDF-subsidized children in centers: 76; Percentage of quality expenditures to FCCs: 28; Percentage of CCDF-subsidized children in FCCs: 17; Percentage of quality expenditures to after-school[A]: 2. State: Hawaii; Percentage of quality expenditures to centers: 27; Percentage of CCDF-subsidized children in centers: 27; Percentage of quality expenditures to FCCs: 64; Percentage of CCDF-subsidized children in FCCs: 20; Percentage of quality expenditures to after-school[A]: 9. State: Kansas; Percentage of quality expenditures to centers: 54; Percentage of CCDF-subsidized children in centers: 36; Percentage of quality expenditures to FCCs: 46; Percentage of CCDF-subsidized children in FCCs: 50; Percentage of quality expenditures to after-school[A]: 0. 
State: Kentucky; Percentage of quality expenditures to centers: 80; Percentage of CCDF-subsidized children in centers: 61; Percentage of quality expenditures to FCCs: 20; Percentage of CCDF-subsidized children in FCCs: 29; Percentage of quality expenditures to after-school[A]: 0. State: Maine; Percentage of quality expenditures to centers: 63; Percentage of CCDF-subsidized children in centers: 29; Percentage of quality expenditures to FCCs: 37; Percentage of CCDF-subsidized children in FCCs: 33; Percentage of quality expenditures to after-school[A]: 0. State: Maryland; Percentage of quality expenditures to centers: 57; Percentage of CCDF-subsidized children in centers: 34; Percentage of quality expenditures to FCCs: 43; Percentage of CCDF-subsidized children in FCCs: 31; Percentage of quality expenditures to after-school[A]: 0. State: Massachusetts; Percentage of quality expenditures to centers: 60; Percentage of CCDF-subsidized children in centers: 56; Percentage of quality expenditures to FCCs: 40; Percentage of CCDF-subsidized children in FCCs: 23; Percentage of quality expenditures to after-school[A]: 0. State: Michigan; Percentage of quality expenditures to centers: 18; Percentage of CCDF-subsidized children in centers: 19; Percentage of quality expenditures to FCCs: 69; Percentage of CCDF-subsidized children in FCCs: 20; Percentage of quality expenditures to after-school[A]: 13. State: Minnesota; Percentage of quality expenditures to centers: 100; Percentage of CCDF-subsidized children in centers: 27; Percentage of quality expenditures to FCCs: 0; Percentage of CCDF-subsidized children in FCCs: 56; Percentage of quality expenditures to after-school[A]: 0. State: Mississippi; Percentage of quality expenditures to centers: 100; Percentage of CCDF-subsidized children in centers: 69; Percentage of quality expenditures to FCCs: 0; Percentage of CCDF-subsidized children in FCCs: 9; Percentage of quality expenditures to after-school[A]: 0. State: Missouri; Percentage of quality expenditures to centers: 80; Percentage of CCDF-subsidized children in centers: 37; Percentage of quality expenditures to FCCs: 20; Percentage of CCDF-subsidized children in FCCs: 42; Percentage of quality expenditures to after-school[A]: 0. State: Montana; Percentage of quality expenditures to centers: 39; Percentage of CCDF-subsidized children in centers: 30; Percentage of quality expenditures to FCCs: 61; Percentage of CCDF-subsidized children in FCCs: 69; Percentage of quality expenditures to after-school[A]: 0. State: Nebraska; Percentage of quality expenditures to centers: 33; Percentage of CCDF-subsidized children in centers: 58; Percentage of quality expenditures to FCCs: 67; Percentage of CCDF-subsidized children in FCCs: 41; Percentage of quality expenditures to after-school[A]: 0. State: New Mexico; Percentage of quality expenditures to centers: 62; Percentage of CCDF-subsidized children in centers: 43; Percentage of quality expenditures to FCCs: 33; Percentage of CCDF-subsidized children in FCCs: 27; Percentage of quality expenditures to after-school[A]: 5. State: North Carolina; Percentage of quality expenditures to centers: 83; Percentage of CCDF-subsidized children in centers: 81; Percentage of quality expenditures to FCCs: 15; Percentage of CCDF-subsidized children in FCCs: 13; Percentage of quality expenditures to after-school[A]: 2. 
State: North Dakota; Percentage of quality expenditures to centers: 28; Percentage of CCDF-subsidized children in centers: 26; Percentage of quality expenditures to FCCs: 72; Percentage of CCDF-subsidized children in FCCs: 71; Percentage of quality expenditures to after-school[A]: 1. State: Oklahoma; Percentage of quality expenditures to centers: 63; Percentage of CCDF-subsidized children in centers: 81; Percentage of quality expenditures to FCCs: 32; Percentage of CCDF-subsidized children in FCCs: 19; Percentage of quality expenditures to after-school[A]: 5. State: Oregon; Percentage of quality expenditures to centers: 0; Percentage of CCDF-subsidized children in centers: 21; Percentage of quality expenditures to FCCs: 100; Percentage of CCDF-subsidized children in FCCs: 52; Percentage of quality expenditures to after-school[A]: 0. State: Pennsylvania; Percentage of quality expenditures to centers: 59; Percentage of CCDF-subsidized children in centers: 59; Percentage of quality expenditures to FCCs: 28; Percentage of CCDF-subsidized children in FCCs: 18; Percentage of quality expenditures to after-school[A]: 13. State: South Dakota; Percentage of quality expenditures to centers: 47; Percentage of CCDF-subsidized children in centers: 27; Percentage of quality expenditures to FCCs: 50; Percentage of CCDF-subsidized children in FCCs: 53; Percentage of quality expenditures to after-school[A]: 3. State: Tennessee; Percentage of quality expenditures to centers: 90; Percentage of CCDF-subsidized children in centers: 73; Percentage of quality expenditures to FCCs: 5; Percentage of CCDF-subsidized children in FCCs: 26; Percentage of quality expenditures to after-school[A]: 4. State: Texas; Percentage of quality expenditures to centers: 95; Percentage of CCDF-subsidized children in centers: 79; Percentage of quality expenditures to FCCs: 5; Percentage of CCDF-subsidized children in FCCs: 6; Percentage of quality expenditures to after-school[A]: 0. State: Utah; Percentage of quality expenditures to centers: 87; Percentage of CCDF-subsidized children in centers: 65; Percentage of quality expenditures to FCCs: 13; Percentage of CCDF-subsidized children in FCCs: 25; Percentage of quality expenditures to after-school[A]: 0. State: Vermont; Percentage of quality expenditures to centers: 69; Percentage of CCDF-subsidized children in centers: 44; Percentage of quality expenditures to FCCs: 31; Percentage of CCDF-subsidized children in FCCs: 50; Percentage of quality expenditures to after-school[A]: 0. State: Virginia; Percentage of quality expenditures to centers: 84; Percentage of CCDF-subsidized children in centers: 54; Percentage of quality expenditures to FCCs: 16; Percentage of CCDF-subsidized children in FCCs: 29; Percentage of quality expenditures to after-school[A]: 0. State: Washington; Percentage of quality expenditures to centers: 100; Percentage of CCDF-subsidized children in centers: 41; Percentage of quality expenditures to FCCs: 0; Percentage of CCDF-subsidized children in FCCs: 23; Percentage of quality expenditures to after-school[A]: 0. State: Wisconsin; Percentage of quality expenditures to centers: 70; Percentage of CCDF-subsidized children in centers: 60; Percentage of quality expenditures to FCCs: 30; Percentage of CCDF-subsidized children in FCCs: 39; Percentage of quality expenditures to after-school[A]: 0.
State: Wyoming; Percentage of quality expenditures to centers: 33; Percentage of CCDF-subsidized children in centers: 31; Percentage of quality expenditures to FCCs: 54; Percentage of CCDF-subsidized children in FCCs: 40; Percentage of quality expenditures to after-school[A]: 13. Total percentage: Percentage of quality expenditures to centers: 71; Percentage of CCDF-subsidized children in centers: [B]; Percentage of quality expenditures to FCCs: 26; Percentage of CCDF-subsidized children in FCCs: [B]; Percentage of quality expenditures to after-school[A]: 3. [A] Information is not available on the percentage of CCDF-subsidized children in after-school care. [B] Average not applicable. Source: Percentage of quality expenditures devoted to centers, family child care and after-school care is based on GAO analysis of states‘ responses to our survey of CCDF lead state agencies. Percentage of CCDF-subsidized children in centers and family child care is based on data reported in U.S. House Of Representatives, Committee On Ways And Means, 2000 Green Book (Washington, D.C., 2000). [End of table] Few States Have Evaluated the Effectiveness of State Quality Improvement Initiatives: While few states have evaluated the effectiveness of state quality improvement initiatives on children‘s development, some studies provide useful findings about them. Officials in four of five states we talked to explained that states must make trade-offs between serving more families and conducting evaluations of their own quality improvement initiatives. Out of a handful of state-sponsored studies, a few had study designs that isolated an initiative‘s effect and survey response rates that provided reliable estimates. The research on child care quality does not evaluate initiatives as actually implemented by states, but a few studies, using rigorous methods, show that some of the attributes of child care quality that these initiatives address, such as caregiver qualifications, affect children‘s social, emotional and cognitive development. HHS has begun to support some analyses of states‘ quality improvement efforts and could play an even more important role in supporting rigorous studies of the initiatives states are undertaking. Of the Handful of Studies on the Effectiveness of States‘ Initiatives, Three Had Conclusive Findings: Of the handful of studies that examined the effectiveness of states‘ initiatives, three had methodological approaches sufficient to produce conclusive findings. In considering studies of the initiatives‘ effectiveness, we looked primarily for studies that analyzed the effect of an initiative on children‘s development. We also considered studies that examined effects on attributes of child care quality, such as caregiver qualifications or turnover. Improvements in attributes of child care quality can be seen as an intermediate step toward strengthening children‘s development. [Footnote 12] One of the three studies with conclusive findings, sponsored by Florida, analyzed how Florida‘s implementation of more stringent child-to-staff ratios and caregiver education requirements in child care centers was related to children‘s cognitive and socio-emotional development over time. The two other studies with conclusive findings, sponsored by Massachusetts and Washington state, examined caregiver compensation and caregiver recruitment and retention rates. 
Taking measures of child care quality and children‘s development before and after Florida instituted more stringent child-to-staff ratios and caregiver education requirements, Florida‘s study found that a reduction in child-to-staff ratios and an increase in early education requirements for center providers contributed to gains in children‘s development and the quality of early education and care they received. The study‘s design allowed the contribution of child-to-staff ratios and caregivers‘ education to children‘s development to be examined but, without a comparison group, was unable to isolate their effects completely. [Footnote 13] However, this limitation did not compromise the study‘s findings. Massachusetts‘s recruitment and retention study examined caregiver compensation, conducting a survey of providers regarding the reasons for the shortage and high turnover of providers in child care centers across Massachusetts. The study confirmed findings of other studies that caregivers who receive low wages are difficult to hire and retain. However, the study design did not rule out explanations other than low salaries for the association between low wages and high turnover. Washington State also evaluated caregiver compensation and retention, using a quasi-experimental design, but found no effect of compensation on retention. It is important to acknowledge that while we looked at all of the studies we identified for evidence of the effectiveness of state initiatives, the studies that states sponsored may not always have been designed for that purpose; in some cases they provided useful data on other issues that they were intended to address. For example, when the data sources used in nonexperimental studies meet data quality standards, as did data collected for the Massachusetts recruitment and retention study, state-sponsored studies can provide reliable information that is needed to address program design issues, such as setting reimbursement rates; to assess program implementation, such as examining the number of caregivers that have acquired training in child development; or to understand the child care market, such as determining the number of providers that offer health benefits to their caregivers. [Footnote 14] Studies that collect this type of descriptive information also help in planning research that employs rigorous designs. We also recognize that more definitive studies are labor and resource intensive; studies that employ experimental designs are difficult and expensive to conduct. Similarly, surveys that involve low-income families, which may be needed for studies using quasi-experimental designs, require special procedures, such as the use of financial incentives or several rounds of follow-up with nonrespondents, to achieve a response rate that meets minimum data quality standards. Moreover, while state child care agencies may partner with universities or contract research organizations to conduct such studies as CCDF funding sources permit, officials in four of the five states we talked to explained that states must make trade-offs between serving more families and conducting evaluations of their own quality improvement initiatives. The remaining studies we identified did not meet our criteria for data quality, because of low survey response rates or self-selected samples.
California conducted a comprehensive study of caregiver compensation, and Massachusetts conducted a second study of caregivers‘ salaries, but both studies had very low response rates. North Carolina examined the reliability of criteria used in the state‘s incentive for accreditation initiative, but the sample of providers it studied was self-selected and included few centers with low quality ratings. California also evaluated a statewide toll-free telephone line using administrative and survey data, but the survey had a very low response rate. The results of our assessments of particular studies are described in greater detail in appendix II. The Broader Literature Suggests, and a Few Studies Confirm, a Link between Child Care Quality Attributes and Children‘s Developmental Progress: The extensive body of research on child care quality that has been developed over the past 20 years has laid the foundation for understanding how the quality of care affects children‘s progress. Child care research has studied a variety of child care quality attributes, and a few studies have provided evidence of the effects of these attributes on children‘s developmental progress. We examined reviews of the broad range of studies in this area to supplement the studies available on states‘ initiatives. While the findings of this research suggest that some states‘ initiatives are attempting to influence aspects of child care that have demonstrable effects on children‘s development, this is not sufficient to determine that these initiatives are necessarily effective as implemented. In 2000, the National Research Council conducted a second methodological review of research on early childhood development that included research on child care quality. [Footnote 15] The council, like a team of reviewers sponsored by HHS‘s Office of the Assistant Secretary for Planning and Evaluation and a team of reviewers sponsored by a foundation, examined the effects of the structural attributes of quality and of child-caregiver interactions on children‘s developmental progress. [Footnote 16] (See table 7.) The reviewers found that structural attributes, such as caregiver qualifications, lower child-to-staff ratios, and smaller group sizes, led to developmental gains directly or fostered supportive and responsive caregiver behavior. All reviewers concluded that child-caregiver interactions that are responsive and supportive have positive effects on children‘s developmental progress. Each of these reviews also emphasized that studies of the effect of child care quality on child development should employ study designs and statistical methods that separate the effects of family characteristics on children‘s development from the quality of the child care setting. [Footnote 17] The third review in table 7 includes a detailed discussion of the study designs and statistical methods, other than experimental design, that can be used to isolate the effect of child care quality. Among the large number of studies that were reviewed, the findings of those that met these criteria are summarized in appendix III. Table 7: Major Reviewers‘ Findings Regarding Child Care Quality Research: Author and review: Jack P. Shonkoff and Deborah A.
Phillips, eds., From Neurons to Neighborhoods; Structural attributes that the review concluded contribute to children‘s developmental progress or caregivers‘ ability to create a developmentally supportive environment: Staff wages; Lower staff turnover; Caregiver education; Caregiver training; Aspects of child-caregiver interactions that the review concluded contribute to children‘s developmental progress: Caregiver continuity fosters the attachments that improve social development. The verbal environment that child care providers create contributes to children‘s cognitive and language development. Author and review: Deborah Vandell and Barbara Wolfe, Child Care Quality: Does It Matter and Does It Need to Be Improved? Structural attributes that the review concluded contribute to children‘s developmental progress or caregivers‘ ability to create a developmentally supportive environment: Smaller group size; Lower child-to-staff ratios; Caregiver education; Caregiver training; Aspects of child-caregiver interactions that the review concluded contribute to children‘s developmental progress: Emotionally supportive and cognitively enriching settings. Author and review: John M. Love, Peter Z. Schochet and Alicia L. Meckstroth, Are They in Any Real Danger? What Research Does--and Doesn‘t--Tell Us About Child Care Quality and Children‘s Well-Being; Structural attributes that the review concluded contribute to children‘s developmental progress or caregivers‘ ability to create a developmentally supportive environment: Smaller group size; Lower child-to-staff ratios; Safer equipment and space; Aspects of child-caregiver interactions that the review concluded contribute to children‘s developmental progress: Appropriate caregiving; Developmentally appropriate practice. [End of table] These studies have shown relationships between structural attributes, child-caregiver interactions and children‘s developmental progress that suggest many state initiatives are targeted on aspects of child care settings that have the potential for enhancing developmental outcomes. However, this is not sufficient to conclude that states‘ initiatives are necessarily effective in enhancing child care quality. Such a conclusion would presume not only that they are targeted on aspects of child care quality with the potential to improve developmental outcomes, but also that they are reaching providers in need of help and that they reflect the individual attributes and the context in which those attributes were originally studied. For example, because many studies were conducted in an earlier time period, the qualifications of the caregivers studied may differ from the pool of caregivers available in the current labor market. In addition, the populations of providers that were drawn at the state or substate level are not necessarily similar to the populations of other states. Thus, while existing research findings help states plan their initiatives, rigorous evaluations of initiatives actually implemented by the states are needed to provide evidence of the initiatives‘ effectiveness. HHS‘s Role in Supporting Studies of States‘ Initiatives: Using CCDF funds set aside for research, demonstration and evaluation by the 2001 Consolidated Appropriations Act, HHS has developed a research agenda that includes studies of child care quality and a commitment to rigorous evaluation. HHS‘s research agenda covers three goals and four categories of activity.
The goals are (1) improve the capacity to respond to policy questions, (2) strengthen data collection and analysis systems for child care research, and (3) increase knowledge about the effectiveness of child care policies and programs in promoting child development and in helping low-income families obtain and retain work. [Footnote 18] HHS supports these goals by funding state research partnerships, field-initiated research, demonstrations and evaluations, and data collection and analysis systems for child care research. Of 23 quality-related ongoing research projects HHS identified for us, components of three projects are investigating quality improvement initiatives. With funding in the state research partnership area, Minnesota and Massachusetts are examining how tiered reimbursement strategies affect child care quality. [Footnote 19] Minnesota‘s study is a component of a child care research partnership grant, and Massachusetts‘s is part of a grant that supports state data and research capacity building. Under the same grant, Massachusetts is evaluating the impact of caregiver compensation strategies on the quality of care. In addition, HHS has undertaken a multiyear evaluation of the implementation, net impact, and benefits of selected state child care policies and strategies that will be conducted using an experimental design to determine whether there are effects. Currently in its first year, this 7-year, $9 million study will examine state strategies in four locations. HHS has taken an important first step by initiating this evaluation. However, to represent the diversity of the 50 states and their quality improvement approaches, more research that employs experimental or quasi-experimental designs will be needed to determine the effectiveness of states‘ quality improvement initiatives. Conclusion: Few states have evaluated the effectiveness of their quality improvement initiatives. While current research provides states with promising directions in which to target their efforts, it offers little specific guidance on how to modify ongoing initiatives or on the most cost-effective placement of additional expenditures to improve quality. This limits states‘ capacity to sustain and enhance initiatives that effectively improve the quality and availability of child care. Having additional rigorous research in this area would provide important information to both policymakers and administrators at all levels of government and support the Congress‘s efforts to improve child care quality. Recommendation: We recommend that HHS include, in its planned multiyear evaluation of the net impact and benefits of state child care policies, an analysis of the effects on children‘s development of selected state quality improvement initiatives, such as off-site caregiver training or enhanced inspections. Agency Comments: We obtained comments on a draft of this report from HHS‘s Administration for Children and Families (ACF). These comments are reproduced in appendix IV. ACF also provided technical clarifications, which we incorporated when appropriate. HHS generally agreed with the findings of our report, expressing appreciation for the work we have done that makes the case for more research that evaluates the effectiveness of state quality improvement initiatives.
HHS also described the department‘s role in supporting studies of states‘ initiatives and mentioned the technical assistance it provides states about relevant research findings through initiatives such as the National Child Care Information Center and the Healthy Child Care America campaign. In reference to our recommendation that HHS initiate research on state quality improvement initiatives, HHS expressed optimism that one or more of the analyses of state child care subsidy strategies in the multisite evaluation it is undertaking will test the effectiveness of state quality improvement initiatives. As requested, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after the date of this letter. At that time we will send copies of this report to the Secretary of Health and Human Services, appropriate congressional committees, and other interested parties. In addition, the report will be available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me on (202) 512-7215 or Betty Ward-Zukerman, Assistant Director on (202) 512-2732. Other staff who contributed to this report are listed in appendix V. Signed by: Marnie S. Shaul: Director, Education, Workforce and Income Security Issues: [End of section] Appendix I: Scope and Methodology: We studied the initiatives that the states have implemented to improve child care quality. Our assessment examined two questions: (1) what initiatives have states undertaken with the 4 percent quality set-aside and other funding sources and (2) what evidence has been gathered, if any, about the effectiveness of states‘ quality improvement initiatives? Scope: The scope of our study was broader and more detailed than CCDF-mandated state reports because we asked states for data beyond what they reported to HHS. States receiving CCDF money must report to HHS aggregate expenditures to meet the set-aside requirement for child care quality improvement. However, states are not required to report to HHS how much they spend on specific quality improvement initiatives or which initiatives have shown evidence of effectiveness. Moreover, state or local governments may spend more money on quality improvements, using state funds or other resources, than is reflected in CCDF-mandated reports to HHS. We asked states to report all expenditures in federal fiscal year 2000 by initiative, including those made with CCDF funds that may have been appropriated in prior fiscal years but spent in federal fiscal year 2000, plus funds from other sources. [Footnote 20] The CCDBG Act and HHS regulations give states discretion in deciding how to spend money to meet the 4 percent set-aside requirement. The set-aside may be spent on activities to provide comprehensive consumer education, parents‘ choice of child care and on activities designed to improve the quality and availability of care children receive. The CCDBG Act defines quality in terms of activities states may undertake to meet the set-aside provision‘s requirements. The act includes two provisions that apply to parents seeking child care: (1) comprehensive consumer education for parents and the public and (2) increased parental choice. The act identifies expenditures on these two activities as appropriate uses of set-aside funds but does not cite them as examples of child care quality and availability improvement activities. 
The provision does cite resource and referral services as an example of an activity designed to improve the quality and availability of care. Similarly, HHS regulations for the CCDF 4 percent set-aside aim at improving parents‘ child care knowledge and choices and at improving the quality and availability of care children receive. The regulations include comprehensive consumer education and increasing parental choice as ’quality“ activities. The regulations state that activities to improve the quality of child care may include, but are not limited to, the following: * Improving resource and referral programs; * Making grants or loans to providers to assist in meeting child care standards; * Improving compliance with and enforcement of state and local licensing requirements; * Providing training and technical assistance to providers in health and safety, nutrition, child abuse detection and prevention, and care of children with special needs; * Improving salaries and compensation for staff who provide child care services. The regulations also include a provision that allows expenditures for any other activities that are consistent with the intent of the 4 percent set-aside section, which grants states considerable discretion. Because of the discretion that the law and HHS regulations allow states in selecting quality improvement initiatives, collecting information on quality improvement across all states required a common set of categories. The starting point for constructing categories for states‘ quality improvement initiatives was the CCDBG Act and its regulations. However, the quality improvement activities specified in the law and regulations did not include all of the initiatives states are undertaking to improve the quality of care children receive. To ensure that our study analyzed all state spending for child care quality improvement, we developed nine categories for states‘ child care quality improvement initiatives and asked states in which categories they funded activity, how much they spent, and the funding source. We developed the categories by combining federally designated activities with initiatives from contemporary child care quality analyses. To address the limitations of the federal activities, to create a more complete picture of state child care quality initiatives and to capture innovations in child care improvement, we added three categories to the federal categories. Two were derived from quality improvement initiatives undertaken by the Department of Defense‘s military child development program and one category was derived from literature analyzing child care quality improvement. An ’other quality-related activities“ category was added because the regulations were not exhaustive and permitted states to develop initiatives not listed in regulations, provided that they were consistent with them. Thus, the category for other quality-related activities included initiatives that may be unique to a state and those that may foster the availability of high quality care, such as strategies that provide consumer education or increase parental choice. Because activities to provide comprehensive consumer education to parents and the public and increase parental choice were not included among the activities noted by the law or HHS regulations as designed to improve child care quality, we did not include these activities in our framework. However, states were free to report these or other quality-related activities when they construed them as such. 
Table 3 in the background section lists the nine initiative categories, their derivation, and descriptions. Survey Data Collection: We surveyed CCDF lead state agency officials in the 50 states and the District of Columbia, asking that they report how much their state spent in each of the nine categories, the percentage of funds spent from each funding source in each category, types of providers and caregivers that initiatives targeted, and other information. For initiatives that included spending in more than one category, we asked state officials to record the spending for that initiative in the predominant category. When the draft data collection instrument was complete, GAO analysts and methodologists conducted a pretest in four states to ensure that the data collection instrument was clear and could be answered accurately in a reasonable amount of time. We made changes in the data collection instrument to incorporate comments from the pretest. Using a list of lead CCDF state agencies that was provided by HHS, we sent the data collection instrument on December 6, 2001, to 50 states and the District of Columbia by facsimile. The survey relied on state self-reporting of quality improvement initiatives and expenditures. While we did not independently verify states‘ reports, we compared state survey responses to data collected from our case study sites to provide some checks on the validity of state responses and cross-checked states‘ estimates of 4 percent set-aside expenditures. We compared state expenditure data reported in the data collection instrument with expenditure data that states reported to HHS and resolved discrepancies through interviews with state officials. We worked with state officials to ensure a uniform understanding of the categories, but the possibility exists that two states might have categorized similar initiatives differently. Forty-two of the 50 states and the District of Columbia responded to the survey, yielding a response rate of 82 percent. However, the District of Columbia‘s complete data collection instrument was not received in time to be included in our analysis. For the analyses of how states devoted expenditures to providers of different types, we supplemented our survey data with data on CCDF-subsidized children from the House Committee on Ways and Means‘s Green Book. Case Studies: We selected California, Massachusetts, South Dakota, Tennessee and Wisconsin as case study states. Our selection criteria included diversity in geography and population density; representation of a variety of child care quality improvement initiatives, such as direct and indirect attempts to improve caregiver compensation and initiatives directed at informal caregiving; and whether a state used tiered reimbursement rates as incentives for quality improvement. We also considered the population‘s income distribution, licensing caseloads, use of Temporary Assistance for Needy Families funds, and whether state licensing requirements reflected NAEYC recommendations for child-to-staff ratios. We excluded states where we pretested our data collection instrument to minimize burden on any single state. The purpose of the case studies was to collect data that would explain or amplify data gathered by the data collection instrument.
The case study protocol allowed state officials to provide explanations about what initiatives had been conducted, what factors influenced the state to undertake particular initiatives, what evaluations the state had performed, what innovations the state was undertaking, and whether the state had any unusual needs or problems in child care. Evidence of the Effectiveness of States‘ Quality Improvement Initiatives: Question 2 asked us to examine the evidence that had been gathered, if any, about the effectiveness of states‘ quality improvement initiatives. We sought evidence of effectiveness in evaluations of the initiatives. We employed several search strategies to identify the evaluations. In our survey of CCDF lead state agencies, we asked states to identify evaluations they had conducted. We contacted HHS officials and child care researchers regarding their efforts to evaluate child care quality and reviewed major research efforts. Our review included a discussion with experts engaged in a study of child care quality funded by HHS‘s Child Care Bureau. [Footnote 21] We also reviewed the literature on child care quality improvement initiatives, including information from previous GAO work, literature suggested by experts and information from electronic searches. Our review included both searching for studies on the effectiveness of quality improvement initiatives and conducting a citation search using a highly regarded state evaluation of child care quality improvement and an electronic search for reviews of research on child care quality. We also reviewed state reports regarding child care subsidies. From our survey and search strategies, we obtained reports from nine states that had sponsored research on quality improvement initiatives. We used a structured data collection instrument to analyze the reports. We collected information on the type of report, the report‘s timeframe, quality improvement initiatives studied, design, data collection and analysis methods and findings. In addition, we assessed the study‘s methodological strengths and limitations. Our assessment included both the quality of the data used in the evaluation and the methodological quality of the research. The criteria we used for assessing the data‘s quality are shown in table 8. While we recognized that the administrative data were not collected to meet research standards, we paid particular attention to the administrative data‘s completeness and the surveys‘ response rates. When 30 percent or more of the administrative or survey data were missing, we looked for analyses showing no important difference between individuals represented in the data and those who were not included. Table 8: Data Quality Criteria: Survey data: Use of a random sample; Administrative data: Correspondence to the entire study population. Survey data: Sample size greater than 30; Administrative data: Sample size not applicable. Survey data: Response rate of 70 to 75 percent or greater; Administrative data: High percentage of the study population for whom information was located in the data base. Survey data: Nonresponse analysis showing no important difference between individuals or families represented in the data and those missing from the data; Administrative data: Comparative analyses showing no important difference between individuals or families represented in the data and those missing from the data if 30 percent or more of the records are missing. 
[End of table] Our assessment of the evaluations focused on the designs and analysis methods required to determine effects. The criteria we used in the assessment are shown in table 9. Table 9: Criteria for Assessing Evaluations: Study component: Design; Criteria: For an experimental design, selecting the group receiving the program and the control group randomly. Study component: Design; Criteria: For both experimental and quasi-experimental designs, using a comparison group. Study component: Data collection; Criteria: Meeting the criteria for survey and administrative data quality shown in table 8. Study component: Data analysis; Criteria: Using a multivariate analysis procedure, as appropriate. Study component: Data analysis; Criteria: Using controls for influences other than the program. Study component: Data analysis; Criteria: Testing and correcting for limitations such as nonrandom selection to the program and comparison group, and missing survey and administrative data. [End of table] An evaluation determines a program‘s effect on its participants by isolating a program‘s contribution from the effects of other influences that could have affected participant outcomes. To isolate the program‘s influences, an evaluation studies two groups: those receiving program services and a similar group not receiving program services. Researchers compare the relevant outcomes of these two groups, such as children‘s socio-emotional development, to determine the program‘s effect. The criteria for study design in table 9 apply to two types of evaluations: an experimental design and a quasi-experimental design. The two designs differ primarily in the way that the comparison groups are developed. In an experimental design, because comparison group members are selected randomly, researchers can compare outcomes to determine the program‘s effect without using statistical controls for other factors that could have influenced the outcomes. In a quasi-experimental design, the comparison group is composed of individuals who share characteristics with program participants, but who have not been randomly selected and who have not received program services. [Footnote 22] With this design, statistical controls, such as those provided by a multivariate analysis procedure, are needed to isolate the program‘s effect from other factors that could influence outcomes. While there can be substantial practical difficulties in implementing experimental designs in evaluations of social programs, because program staff may be reluctant to participate and because comparison group participants tend to leave the study, there is no substantial debate about the desirability of a comparison group of some type in drawing conclusions about program effects. The criteria for assessing the administrative and survey data used in the evaluations were the same data quality criteria we discussed above. The criteria for data analysis in table 9 refer to the need to control for factors other than the program when program participants and comparison group members are not randomly selected. They also encompass additional analyses that may be needed when the group receiving program services and the comparison group were not randomly selected, or to determine whether missing data affect the reliability of the estimates of the program‘s effect.
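The role of statistical controls in a quasi-experimental analysis can be sketched with simulated data; the variable names, the assumed 2-point initiative effect, and the controls below are hypothetical illustrations only and are not drawn from any study discussed here.

# Illustrative only: simulated data for a quasi-experimental analysis in which
# a multivariate regression isolates a hypothetical initiative's effect.
import numpy as np

rng = np.random.default_rng(0)
n = 500
treated = rng.integers(0, 2, size=n)                 # 1 = provider received the initiative
family_income = rng.normal(30_000, 8_000, size=n)    # control: a family characteristic
caregiver_education = rng.normal(13.0, 2.0, size=n)  # control: caregiver years of education

# Simulated developmental outcome with a built-in initiative effect of 2 points.
outcome = (2.0 * treated + 0.0001 * family_income
           + 0.5 * caregiver_education + rng.normal(0.0, 3.0, size=n))

# Ordinary least squares with an intercept, the treatment indicator, and the controls;
# the second coefficient estimates the initiative's effect net of the controls.
design = np.column_stack([np.ones(n), treated, family_income, caregiver_education])
coefficients, _, _, _ = np.linalg.lstsq(design, outcome, rcond=None)
print(f"estimated initiative effect: {coefficients[1]:.2f}")

In an experimental design with random assignment, a simple comparison of mean outcomes between the two groups would serve the same purpose without statistical controls.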
Finally, several of the studies we assessed and the reviews of child care research we examined made reference to scales for measuring child care quality. In child care quality research, the structural attributes of quality are measured directly by, for example, counting the number of children per caregiver in classrooms or the years of education that a caregiver has attained. However, because child-caregiver interactions must be observed and recorded for research purposes, researchers have developed various scales to measure them. These scales contain numerous items that evaluate the areas of personal care routines, furnishings, language-reasoning experiences, motor activities, creative activities, social development, and staff needs. Three of the most well-known scales used in measuring process quality are the Early Childhood Environment Rating Scale (ECERS), the Infant/Toddler Environment Rating Scale (ITERS), and the Family Day Care Rating Scale (FDCRS). [Footnote 23] The ECERS and ITERS scales measure child-caregiver interactions in center-based care, while the FDCRS measures process quality in child care homes. One of the reviews assessed the scales‘ strengths and limitations. The strengths of these three scales are their ease of use, reliability, good psychometric properties, and suitability for cross-study comparisons. However, the reviewer pointed out, these scales also have some limitations. For example, their global composite scores combine features of various environments and influences, even though some of these areas may have greater influence on children‘s development than others. Additionally, these scales are setting-specific, which means it is not possible to make simple comparisons across types of care or to combine scores in meta-analyses. Another review pointed out that none of the existing scales include measures for the aspects of informal care that parents see as important, including such characteristics as shared values and language, a homelike atmosphere, the opportunity for a child to be cared for with siblings, and flexibility about hours and schedule. [End of section] Appendix II: State-Initiated Studies of Quality Improvement: Table 10 presents studies that we identified as attempting to examine the effectiveness of state quality improvement initiatives. They are ordered so that studies that meet GAO‘s criteria for data and research quality are presented first. The table provides the study‘s title, the quality improvement initiative the study examined, major findings, and methodological strengths and limitations. Studies in the table were conducted or sponsored by the state. Table 10: State-initiated Studies of Quality Improvement: Study and quality improvement initiative: The Florida Child Care Quality Improvement Study: 1996 Report: Classroom ratios; Off-site training; Purpose: To determine the effects of Florida‘s new ratio and education requirements on children‘s cognitive and socio-emotional development; Findings: Reduced child-to-staff ratios significantly contributed to gains in children‘s cognitive and language development and attachment to their teachers; Strengths: Before and after study design analyzed effects on children‘s development; Random sample; Representative sample; Multivariate analyses conducted; Limitations: 28 percent of the child care centers in the original sample had to be replaced at the second measurement time; Results not generalizable beyond four counties; No comparison group.
Study and quality improvement initiative: Massachusetts Recruitment and Retention Study: Caregiver compensation; Purpose: To determine the reasons for the shortage and high turnover of child care center providers; Findings: Low wages are associated with difficulty in recruiting and retaining child care center staff; Strengths: Stratified random sample; 100 percent response rate for telephone survey of center directors; Large sample size of center directors; Limitations: Nonexperimental design. Study and quality improvement initiative: Washington State Child Care Career and Wage Ladder Pilot Project: Caregiver compensation; Retention; Off-site training; Purpose: To determine the effects of a career and wage ladder pilot project, which establishes specific job titles and related wages based on teacher education and experience, on staff retention, education, wage, and benefit changes in child care centers; Findings: A statistically significant difference was not found between pilot and comparison centers on retention rates and average length of employment; Strengths: Quasi-experimental study design with a comparison group; High survey response rates; Stratified random sample; Large sample size for mail surveys sent to pilot and comparison center directors; Limitations: Measuring progress toward the goal of improved quality of child care was beyond the scope of this phase of the evaluation; Small sample size of directors interviewed by telephone; Sample of telephone-interviewed directors was judgmentally selected. Study and quality improvement initiative: North Carolina‘s 5-Star Child Care Licensing System: Incentives for accreditation; Purpose: To determine if the state licensing system accurately portrays the quality of child care centers; Findings: Licensing system accurately reflects the overall quality of a child care center. Centers with different ratings exhibit meaningful differences; Strengths: Comparison of ECERS scores, determined by an independent team of university-based researchers, with 5-star ratings constituted an independent validation of the 5-star assessment; Limitations: Nonexperimental design; Participants self-selected. Study and quality improvement initiative: Massachusetts Child Care Center & School Age Program Salary and Benefits Report: Caregiver compensation; Purpose: To determine the starting salary ranges and benefits for different types of child care center staff; Findings: Staff in licensed group child care centers started at $7 to $17 per hour; Staff in licensed school-age child care programs started at $6.50 to $17.30 per hour; Strengths: Data gathered from a variety of populations, which provided a more representative picture of recruitment and retention; Limitations: Nonexperimental design; Low survey response rate. Study and quality improvement initiative: California Quality Improvement Program Evaluation: Healthline; Children‘s health and safety issues; Purpose: To determine whether Healthline, a toll-free telephone line for information on children‘s health and safety, effectively reached providers and parents and promoted child health and safety; Findings: 79 percent of Healthline calls were from providers; 25 percent of providers statewide had heard of the Healthline. Overall, callers matched the distribution of the state population.
Most callers reported that information they received met their needs; Strengths: Use of both administrative and survey data to increase population coverage; Use of multiple measures of Healthline‘s outreach; Limitations: Nonexperimental design; Callers‘ county used to represent callers‘ characteristics; Response rates below standard; Nonresponse analysis not conducted. Study and quality improvement initiative: California Child Care and Development Compensation Study: Towards Promising Policy and Practice: Caregiver compensation; Purpose: To determine to what extent the educational status of child care providers affects their wages, benefits, and turnover in different types of centers; Findings: Approximately one-third of caregivers hold a B.A. or higher, with no statistically significant differences among center types. Teachers in public centers earned $1.55 more per hour than teachers in nonprofit centers and $2.10 per hour more than teachers in for-profit centers. Child care staff received total benefits valued at 23 percent of wages at for-profit centers and 30 percent of wages at nonprofit and public centers. Caregiver turnover in public centers was lower than in for-profit centers. Caregivers in centers with the highest turnover had the lowest wages; Strengths: Quasi-experimental study design with a comparison group. Random samples. Standard tests of significance. Multivariate regression to analyze effects. Findings compared to analogous study results to compensate for data limitations; Limitations: Center survey response rate of 45 percent was below standard. For-profit center response rate of 20 percent was below the standard. No nonresponse analysis or sample weights to adjust for low survey response. Less stringent tests of significance for analyses of effect. Study and quality improvement initiative: The Colorado Expanding Quality Infant and Toddler Care Initiative: Off-site training; Purpose: To determine the effect of a 45-hour infant and toddler training curriculum on the quality of care provided by students participating in the training; Findings: Of the classrooms where a child care provider had participated in the training, 97% achieved quality scores. 99% of the students who completed the post-training assessment survey reported that the training would help them improve the quality of care they provided. All of the training instructors reported they felt the quality of care their students provided improved as a result of the training; Strengths: Random sample. Before and after study design. Tests of significance conducted; Limitations: Nonexperimental design; Survey response rate of 43 percent was below standard; Nonresponse analysis was not conducted. Study and quality improvement initiative: Smart Start and Child Care in North Carolina: Effects on Quality and Changes over Time: Caregiver compensation; On-site training; Incentives for accreditation; Purpose: To determine the effect of Smart Start activities (enhanced subsidies for higher child care quality or higher teacher education; license upgrades; technical assistance; quality improvement and facility grants; teacher education scholarships; and teacher salary supplements) on the quality of child care over time; Findings: Quality of child care increased significantly from 1994 to 1999, with a greater increase from 1994 to 1996 and a smaller increase from 1996 to 1999. Twice as many centers in 1999 as in 1994 scored in the ’good to excellent“ quality range.
Extensive previous participation in Smart Start does not guarantee that a center‘s current quality is high. Number of teachers participating in programs to obtain more education increased. Number of teachers with some college coursework increased. Percentage of centers licensed at higher levels and percentage of nationally accredited centers increased. Benefit levels were positively related to participation in Smart Start. Median teacher turnover remained steady at 17-20 percent. Group sizes and teacher-child ratios have remained fairly constant; Strengths: High response rates in 1994 and 1999 samples; Longitudinal design; Multiple regression analyses; Large sample sizes; Tests of significance conducted; Limitations: Nonexperimental design; Smaller sample in 1999 as compared to 1994 and 1996 (attrition of 52 centers, or 28 percent); Did not correct for selection effects among centers participating in Smart Start; Low response rate in 1996 (68 percent). Study and quality improvement initiative: Oklahoma Tiered Licensing and Differential Quality Study: Incentives for accreditation; Off-site training; Purpose: To examine the variability in child care centers operating within the different regulatory climates; Findings: Accredited centers, whether two-star or not, were more likely than licensed and two-star-by-criteria centers to offer better quality child care. Centers with a smaller proportion of their enrollment receiving subsidies were more likely to offer better quality care. Master teachers who qualified by education were more likely to offer better quality care; Strengths: Used multiple measures to evaluate the quality of care; Tests of significance conducted; Limitations: Nonexperimental design; Nonrandom sample; Sample limited to certain age groups of children. [End of table] [End of section] Appendix III: Child Care Quality Research Findings: Appendix III presents findings from two of the reviews of research on child care quality, discussed in the letter of the report, that found effects of the structural attributes of quality and child-caregiver interactions on children‘s developmental progress. [Footnote 24] These two reviews provided sufficient methodological detail about the studies they assessed to identify those that met the criteria for analyses of the effects of child care quality that we describe in appendix I. The findings from these reviews are broken out in appendix III by those that are linked to children‘s socio-emotional development, cognitive development, and development over time. We present only findings of studies the reviewers examined that could isolate the effect of child care quality on children‘s development. The attributes that underlie the quality improvement initiatives being implemented by the states are primarily structural. These include child-teacher ratios, group or class size, caregiver formal education, caregiver specialized training, classroom structure, and health and safety features. While research shows that child-caregiver interactions are equally important in improving the quality of child care, states‘ initiatives tend to address these attributes only through such initiatives as incentives for achieving accreditation. [Footnote 25] Thus, findings from research examining structural attributes may be more useful for targeting states‘ quality improvement initiatives.
Child Care Quality Linked to Socio-Emotional Development: Several studies have determined that children cared for in high-quality child care settings show positive socio-emotional development. Lower child-to-adult ratios and smaller class sizes improve children‘s social and emotional development. Lower child-to-adult ratios result in children appearing less apathetic and distressed; [Footnote 26] fewer behavior problems at 24 and 36 months of age; [Footnote 27] enhancements in children‘s social development; [Footnote 28] and teachers and children interacting more beneficially. [Footnote 29] Smaller class size has been linked with children being more cooperative and less hostile and conflict-prone in their interactions with others; [Footnote 30] fewer behavior problems at 24 and 36 months of age; [Footnote 31] and enhancements in children‘s social development. [Footnote 32] Additionally, researchers have found that when caregivers have more formal education and specialized training, children are more cooperative, [Footnote 33] have fewer behavior problems at 24 and 36 months of age,[Footnote 34] and have a greater security of attachment. [Footnote 35] Low staff turnover is associated with children being more competent in social development, and higher staff wages are linked with higher-quality centers. [Footnote 36] Finally, children appeared happier and more positively engaged with their classmates when their caregivers were more involved, positive, and responsive with them. [Footnote 37] Children showed greater interest and participation when centers had ECERS scores in the high-quality range. [Footnote 38] Child Care Quality Linked to Cognitive Development: Child care quality research also has found that high-quality care contributes to improvement in children‘s cognitive development. Lower child-to-staff ratios are linked with enhancements in children‘s cognitive development, [Footnote 39] including improvements in general knowledge and receptivity to language,[Footnote 40] and, at 36 months, school readiness and language comprehension scores.[Footnote 41] Smaller groups are associated with enhancements in children‘s cognitive development,[Footnote 42] school readiness, and language comprehension scores.[Footnote 43] Caregiver education and training are also associated with better cognitive development in children. More highly educated or trained caregivers have been found to improve children‘s school readiness and language comprehension scores.[Footnote 44] In addition, low staff turnover is associated with children being more competent in language development.[Footnote 45] Children‘s experiences in the settings where they are cared for are also linked to cognitive development. 
Higher-quality experiences are associated with children performing better on tests of language,[Footnote 46] intelligence,[Footnote 47] and reading.[Footnote 48] In addition, child-caregiver interactions are linked to better cognitive development and improvements in cognitive competence during free play after participating in activities involving art, block play, and dramatic play.[Footnote 49] Caregiver language stimulation (in both centers and homes) is associated with better performance on standardized language tests.[Footnote 50] Child Care Quality Linked to Child Development Over Time: Studies that examined children‘s development over time have shown that high-quality care is a predictor of improvement in children‘s receptive language and functional communication skills, verbal IQ skills, cognitive skills, behavioral skills, and attainment of higher math and receptive language scores. Changes in these skills can be detected with greater certainty when examined over time. When children attend classrooms that meet recommended child-to-staff ratio guidelines, they exhibit better receptive language and communication skills over time.[Footnote 51] However, when children attend classrooms with higher than recommended child-to-staff ratios, the children, once they reach preschool and kindergarten, are rated by their teachers as being more difficult and hostile. In addition, these children tend to engage in less social play and display less positive emotion.[Footnote 52] Caregivers with more specialized training were associated with children having higher math and receptive language scores over time.[Footnote 53] Girls whose caregivers had at least 14 years of education displayed better cognitive and receptive language skills over time.[Footnote 54] On the other hand, once in preschool and kindergarten, children who, during their first 3 years, attended child care where caregivers had no formal child development training, or where they were cared for by more than two primary caregivers in a year, were rated by their teachers as being more difficult and hostile. In addition, those children engaged in less social play and displayed less positive emotion.[Footnote 55] Finally, when more involved and invested caregivers care for children during their first three years, kindergarten teachers report that those children have fewer behavior problems and better verbal IQ scores.[Footnote 56] In addition, higher-quality experiences are associated with children exhibiting better receptive language and communication skills over time. [Footnote 57] [End of section] Appendix IV: Comments from the Department of Health and Human Services: Department Of Health And Human Services: Administration For Children And Families: Office of the Assistant Secretary, Suite 600: 370 L'Enfant Promenade, S.W. Washington, D.C. 20447: August 23, 2002: Ms. Marnie S. Shaul: Director: Education, Workforce, and Income Security Issues: U.S. General Accounting Office: Washington, D.C. 20548: Dear Ms. Shaul: Thank you for the opportunity to comment on the General Accounting Office draft report entitled: "Child Care: States Have Undertaken a Variety of Quality Improvement Initiatives, but More Evaluations of Effectiveness Are Needed" GAO-02-897. Attached are the Administration for Children and Families' comments. If you have any questions regarding our comments, please call Shannon Christian, Associate Commissioner, Child Care Bureau, at (202) 690-6782. Signed by: [Illegible] for: Wade F. Horn, Ph.D. 
Assistant Secretary for Children and Families: Enclosure: Comments Of The Administration For Children And Families On The GAO Draft Report: "Child Care: States Have Undertaken A Variety Of Quality Improvement Initiatives But More Evaluations Of Effectiveness Are Needed" GAO-02-897: The Administration for Children and Families (ACF) appreciates the opportunity to comment on this report, which addresses an important topic. GAO Recommendations: GAO recommends that HHS include in its planned multi-year evaluation of the net impact and benefits of state child care policies, an analysis of the effects on children's development of selected state quality improvement initiatives, such as off-site caregiver training or enhanced inspection. ACF Comments: The Child Care and Development Fund (CCDF) provides states with significant flexibility in determining where to target their quality set-aside funds. As GAO indicates, these decisions often involve governors and state legislatures. While most states use some of their CCDF funds for new and innovative quality initiatives, they also use CCDF for basic services such as resource and referral and ensuring the health and safety of children in care. This is consistent with the GAO finding that 35 percent of reported state expenditures are for enhanced inspections and to assist providers in meeting standards and making safety improvements. ACF, through its Child Care Bureau, provides technical assistance and support to assist states in responding to the competing demands associated with improving child care affordability and supply while making quality investments. Through written materials, meetings of state child care administrators, and technical assistance efforts such as the National Child Care Information Center (NCCIC) and Healthy Child Care America, the Bureau responds to issues of concern to states and provides information about innovative practices and relevant research findings. For example, through NCCIC, the Bureau has produced publications regarding the recruitment and retention of child care staff and state initiatives that provide rate incentives for higher quality care. On-site training and technical assistance have been provided to states regarding strategies to improve the quality of legally exempt care. Through the Bureau's technical assistance efforts, it has become clear that states have an increasing interest in scientifically-based evidence on which to support decisions about improving child care services and systems. In particular, states indicate the desire for research that can help them make the best investments from among possible quality initiatives. ACF has made substantial commitment in research designed to promote better understanding of state and local child care policies and child care markets and the choices low-income parents make within the context of those policies and markets. Much of this work, such as the National Study of Child Care for Low-Income Families, has been descriptive in nature and complements studies sponsored by the National Institutes of Health and the Department of Education. In addition, the Child Care Bureau has embarked on an ambitious effort to build the capacity for child care research at national, state and community levels. This effort includes a Research Consortium comprised of nearly 50 current and prior-year grantees and contractors who have been funded by the Bureau to conduct policy-relevant research on child care issues. 
Through the Research Consortium, researchers come together to address critical definitional, policy, and methodological issues. This work provides the basis for sound, more rigorous research on child care issues. Finally, two research priorities have been initiated to encourage state and community-level research efforts. First, competitive grants are being offered to state CCDF lead agencies to assist them in improving their capacity to conduct policy-relevant research and analysis. This effort is intended to further the design and implementation of cost-effective child care policies and programs that are based on solid evidence. Second, as noted in the GAO study, ACF has contracted for a multi-site evaluation to study the impact, implementation, and costs and benefits of selected child care subsidy strategies. For purposes of this study, "subsidy strategies" are defined broadly to include policies that govern state child care programs and state initiatives to improve child care quality and outcomes for families and children. This evaluation will expand knowledge about child care by investigating causality using experimental methods. Exploratory work is in progress with states to identify sites and topics that may be appropriate for study using experimental methods. Early indications are that states are interested in knowing more about how they can best support child care quality, school readiness, and early literacy. We are optimistic that one or more of these studies will test the effectiveness of state quality initiatives. We appreciate the work your office has done to highlight state efforts to improve child care quality and to make the case for more research that evaluates the effectiveness of these efforts. [End of enclosure] [End of section] Appendix V: GAO Contacts and Staff Acknowledgments: GAO Contacts: Betty Ward-Zukerman, Assistant Director, (202) 512-2732: Sara E. Edmondson, Analyst-in-Charge, (202) 512-8516: Staff Acknowledgments: In addition to those named above, the following individuals made important contributions to this report: Cara Jackson, in collaboration with methodologists from our Advanced Research Methods (ARM) team, designed the data collection instrument used to survey CCDF lead state agencies, oversaw data collection, and designed and conducted the analysis of the states‘ quality improvement initiatives and expenditures; Cara also contributed to selection of the 5 case study states, development of the case study protocol, and collection of case study data; Jyoti Gupta, of GAO‘s Atlanta Field Office, led case study data collection and analysis and played a major role in the research assessment; James Wright and Joel Grossman of ARM led design and development of the data collection instrument; and Bill Keller provided timely insights and consultation on CCDF funding and expenditure patterns and block grant implementation issues. [End of section] Bibliography: Besharov, Douglas J. and Nazanin Samari. "Child Care After Welfare Reform." Unpublished policy paper, n.p., 2000. Besharov, Douglas J. and Nazanin Samari. "'Quality' Child Care? Assessing the Impact on Child Outcomes." Unpublished policy paper, n.p., n.d. Blau, D. M. "The Effects of Child Care Characteristics on Child Development." Journal of Human Resources 34, no. 4 (1999): 786-822. Committee on Early Childhood Pedagogy, National Research Council. Eager to Learn: Educating Our Preschoolers. Eds. Barbara T. Bowman, M. Suzanne Donovan, and M. Susan Burns. Washington, D.C.: National Academy Press, 2001. 
Boyd, Brenda J. and Mary R. Wandschneider. Washington State Child Care Career and Wage Ladder Pilot Project: Phase 1 Final Evaluation Report. Pullman, Wash.: Department of Human Development, Washington State University, 2002. Bryant, Donna. Validating North Carolina‘s 5-Star Child Care Licensing System. Chapel Hill, N.C.: Smart Start Evaluation Team, Frank Porter Graham Child Development Institute, University of North Carolina, 2001. Bryant, Donna, Kathleen Bernier, Ellen Peisner-Feinberg, and Kelly Maxwell. Smart Start and Child Care Changes in North Carolina: Effects on Quality and Changes over Time. Chapel Hill, N.C.: Smart Start Evaluation Team, Frank Porter Graham Child Development Institute, University of North Carolina, 2002. Burchinal, M.R., J.E. Roberts, R. Riggins, S. A. Zeisel, E. Neebe, and D. Bryant. "Relating Quality of Center Child Care to Early Cognitive and Language Development Longitudinally." Child Development (In press). Burton, Alice and Marcy Whitebook. Recruiting and Retaining Low-Income Child Care Workers in Wisconsin: The Wisconsin Child Care Mentor Project Evaluation – A Summary of the Findings. Washington, D.C.: Center for the Child Care Workforce, 2000. Capizzano, Jeffrey, Gina Adams, and Freya Sonenstein. Child Care Arrangements for Children under Five: Variation Across States. (Washington, D.C.: The Urban Institute, 2000). [hyperlink, http://www.urban.org] (downloaded June 3, 2002). Coleman, Richard, Maria Stahla, and Ivan Djambov. Report to the Utah Legislature: A Performance Audit of the Office of Child Care. February 2001. Cost, Quality, and Child Outcomes Study Team. Cost, Quality, and Child Outcomes in Child Care Centers, Public Report. Ed. Suzanne W. Helburn. Denver: University of Colorado, 1995. Dunn, L. "Proximal and Distal Features of Day Care Quality and Children‘s Development." Early Childhood Research Quarterly 8 (1993): 167-192. Dunn, L., S. A. Beach, and S. Kontos. "Quality of the Literacy Environment in Day Care and Children‘s Development." Journal of Research in Childhood Education 9 (1994): 24-34. Fuller, Bruce, Sharon Lynn Kagan, Susanna Loeb, Judith Carroll, Jan McCarthy, Gege Kreicher, Bidemi Carrol, Ginger Cook, Yueh-Wen Chang, and Susan Sprachman. New Lives for Poor Families? Mothers and Young Children Move through Welfare Reform. The Growing Up in Poverty Project – Wave 2 Findings, California, Connecticut, and Florida. Berkeley: University of California, 2002. Galinsky, Ellen, Carollee Howes, and Susan Kontos. The Family Child Care Training Study: Highlights of Findings. New York: Families and Work Institute, 1995. Glantz, Frederic B., and Jean Layzer. The Cost, Quality and Child Outcomes Study: A Critique. (Cambridge: Abt Associates Inc., 2000). [hyperlink, http://www.abtassociates.com] (downloaded Oct. 17, 2001). Goelman, H. "The Relationship between Structure and Process Variables in Home and Day Care Settings on Children‘s Language Development." In The Practice of Ecological Research: From Concepts to Methodology, edited by A. Pence and H. Goelman. N.p., 1988. Helburn, Suzanne W., John R. Morris, and Kathy Modigliani. The Economics of Family Child Care. Forthcoming. Hestenes, L. L., S. Kontos, and Y. Bryan. "Children‘s Emotional Expression in Child Care Centers Varying in Quality." Early Childhood Research Quarterly 8 (1993): 295-307. Holloway, S.D., and M. Reichhart-Erickson. "The Relationship of Day Care Quality to Children‘s Free Play Behavior and Social Problem-Solving Skills." Early Childhood Research Quarterly 3 (1988): 39-53. 
Howes, C. "Can the Age of Entry into Child Care and the Quality of Child Care Predict Adjustment in Kindergarten?" Developmental Psychology 26 (1990): 292-303. Howes, Carollee, Ellen Galinsky, Marybeth Shinn, Leyla Gulcur, Margaret Clements, Annette Sibley, Martha Abbott-Shim, and Jan McCarthy. The Florida Child Care Quality Improvement Study: 1996 Report. New York: Families and Work Institute, 1998. Howes, C., E. Smith, and E. Galinsky. The Florida Child Care Quality Improvement Study: Interim Report. New York: Families and Work Institute, 1995. Kontos, S., and A. Wilcox-Herzog. "Influences on Children‘s Competence in Early Childhood Classrooms." Early Childhood Research Quarterly 12 (1997): 247-262. Love, John M., Paul Ryer, and Bonnie Faddis. Caring Environments - Program Quality in California‘s Publicly Funded Child Development Programs: Report on the Legislatively Mandated 1990-91 Staff/Child Ratio Study. Portsmouth, N.H.: RMC Research Corporation, 1992. Love, John M., Peter Z. Schochet, and Alicia L. Meckstroth. Are They in Any Real Danger? What Research Does - and Doesn‘t - Tell Us about Child Care Quality and Children‘s Well-Being. Princeton, N.J.: Mathematica Policy Research, Inc., 1996. Marsh, Janet. South Carolina Child Care: Survey of the Workforce 2000. A report prepared for the South Carolina Department of Health and Human Services. Clemson, S.C.: Clemson University Institute on Family and Neighborhood Life, 2001. Massachusetts Child Care Resource and Referral Network for the Massachusetts Office of Child Care Services. Massachusetts Child Care Center & School Age Program Salary and Benefits Report. A report prepared for the Massachusetts Office of Child Care Services. October 2000. McCartney, K. "Effect of Quality of Day-Care Environment on Children‘s Language Development." Developmental Psychology 20 (1984): 244-260. Mills and Pardee, Inc. The Massachusetts Early Care and Education Staff Recruitment and Retention Research and Recommendations. A report prepared for the Massachusetts Office of Child Care Services. April 2001. Montgomery, Deborah, Gabriele Phillips, Dennis Zeller, and Helaine Hornby. Quality Improvement Program Evaluation: Final Report, Year 1 Evaluations – Child Care Initiative Project, Healthline, Stipend for Permit. A report prepared for the California Department of Education. Palo Alto, Calif.: Institutes for Research and Hornby Zeller Associates, Inc., 1999. National Research Council and Institute of Medicine. From Neurons to Neighborhoods: The Science of Early Childhood Development. Committee on Integrating the Science of Early Childhood Development. Edited by Jack P. Shonkoff and Deborah A. Phillips. Washington, D.C.: National Academy Press, 2000. NICHD Early Child Care Research Network (ECCRN). "Early Child Care and Self-Control, Compliance, and Problem Behaviors at Twenty-Four and Thirty-Six Months." Child Development 69, no. 4 (1998): 1145-1170. NICHD ECCRN. "Effect Sizes from the NICHD Study of Early Child Care." Paper presented at the Biennial Meeting of the Society for Research in Child Development, Albuquerque, N. Mex., April 1999. NICHD ECCRN. "Infant Child Care and Attachment Security: Results of the NICHD Study of Early Child Care." Symposium presented at the meeting of the International Conference on Infant Studies, Providence, R.I., April 1996. NICHD ECCRN. "The Relation of Child Care to Cognitive and Language Development." Child Development (In press). Norris, Deborah J. and Loraine Dunn. 
Taking a Closer Look: Tiered Licensing and Differential Quality. N.p., 2000. Peisner-Feinberg, E. S., and M. R. Burchinal. "Relations between Preschool Children‘s Child-Care Experiences and Concurrent Development: The Cost, Quality, and Outcomes Study." Merrill-Palmer Quarterly 43 (1997): 451-477. Porter, Toni, Sulaifa Habeeb, Sally Mabon, Anne Robertson, Lee Kreader, and Ann Collins. Assessing Child Care Development Fund (CCDF) Investments in Child Care Quality: A Study of Selected State Initiatives. New York: Institute for a Child Care Continuum, Bank Street College of Education, 2002. Ruopp, R., J. Travers, F. Glantz, and C. Coelen. Children at the Center: Final Report of the National Day Care Study. Cambridge: Abt Associates, 1979. Schliecker, E., D. R. White, and E. Jacobs. "The Role of Day Care Quality in the Prediction of Children‘s Vocabulary." Canadian Journal of Behavioural Science 23 (1991): 12-24. Stuiber, Paul, Victoria Flood, Tamarine Cornelius, Tim Coulthart, Jolie Frederickson, and Jessica Lathrop. An Evaluation: Wisconsin Shares Child Care Subsidy Program. A report prepared for the Wisconsin Legislative Audit Bureau. January 2001. The Center for Human Investment Policy. The Expanding Quality Infant and Toddler Care in Colorado Initiative: An Evaluation Report on Year Two. Denver: University of Colorado, 2001. Vandell, Deborah Lowe and Barbara Wolfe. Child Care Quality: Does it Matter and Does it Need to be Improved? Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation, 2000. Whitebook, Marcy, Alice Burton, Deborah Montgomery, Christine Hikido, Robert Vergun, and Jay Chambers. California Child Care and Development Compensation Study: Towards Promising Policy and Practice – Final Report. Palo Alto, Calif.: American Institutes for Research and National Center for the Early Childhood Workforce, 1996. Whitebook, Marcy, Carollee Howes, and Deborah Phillips. Who Cares? Child Care Teachers and the Quality of Care in America: Final Report: National Child Care Staffing Study. Berkeley: Child Care Employee Project, 1989. [End of section] Related GAO Products: U.S. General Accounting Office. Early Childhood Programs: The Use of Impact Evaluations to Assess Program Effects. GAO-01-542. Washington, D.C.: April 16, 2001. U.S. General Accounting Office. Child Care: States Increased Spending on Low-Income Families. GAO-01-293. Washington, D.C.: February 2, 2001. U.S. General Accounting Office. Child Care: How Do Military and Civilian Center Costs Compare? GAO/HEHS-00-7. Washington, D.C.: October 14, 1999. U.S. General Accounting Office. Child Care: Use of Standards to Ensure High Quality Care. GAO/HEHS-98-223R. Washington, D.C.: July 31, 1998. U.S. General Accounting Office. Welfare Reform: States‘ Efforts to Expand Child Care Programs. GAO/HEHS-98-27. Washington, D.C.: January 13, 1998. U.S. General Accounting Office. Welfare Reform: Implications of Increased Work Participation for Child Care. GAO/HEHS-97-75. Washington, D.C.: May 1997. [End of section] Footnotes: [1] Congress specified that $1,000,000 of the earmark for resource and referral services and school-age care be used for a hotline to be operated by Child Care Aware. Child Care Aware is a national toll-free child care consumer telephone hotline and Web site operated by the National Association of Child Care Resource and Referral Agencies, through a cooperative agreement with the Child Care Bureau in the Department of Health and Human Services. 
[2] Recent studies of the military child care program include Gail L. Zellman and Susan M. Gates, Examining the Cost of Military Child Care (Santa Monica, Calif.: RAND, 2002), [hyperlink, http://www.rand.org/publications/MR/MR1415/] and Gail L. Zellman and Anne S. Johansen, Examining the Implementation and Outcomes of the Military Child Care Act of 1989 (Santa Monica, Calif.: RAND, 1998), [hyperlink, http://www.rand.org/publications/MR/MR665]. [3] The National Research Council is the principal operating body of the National Academy of Sciences, the National Academy of Engineering and the Institute of Medicine. It operates under a charter granted by Congress, to advise the government, the public and the scientific and engineering communities about scientific and technical matters. The National Academies of Science and Engineering are private, nonprofit societies of scholars in the fields of science and engineering. The Institute of Medicine is an association of eminent members of the professions pertaining to public health who advise on medical, research and educational issues. [4] An experimental design requires random assignment of study participants to a group that is receiving services and to a control group that is not. A quasi-experimental design does not require random assignment, but does require statistical controls for factors other than the program that may have influenced the outcome. See appendix I for a discussion of these research methods and considerations in their use. [5] These differences may be explained by the number of states undertaking the initiative or the amount of money individual states allocated to a particular initiative, which in turn reflects the state‘s size, available funds and priorities regarding child care quality. [6] See Background for a discussion of how standards for licensing and accreditation may vary. [7] In Who Cares for America‘s Children, the National Research Council reviewed research showing that children, especially very young children, need enduring and consistent relationships with a caregiver. Yet, a significant number of caregivers at child care centers leave in a given year. Massachusetts‘s recent study of caregiver recruitment and retention in the state confirmed the findings of other studies that caregivers who receive low wages are difficult to retain. [8] Tennessee changed the licensing requirement from one unannounced inspection to six unannounced inspections per year and increased the licensing staff from about 80 to 159. Licensing staff‘s caseload is now about 35 facilities per full time staff person. Child care officials estimated they have spent about $6 million over 2 years on increased inspections, which are performed for all licensed providers. [9] See Background for a description of the funding stream rules. [10] States were asked to estimate the proportion of quality improvement funds, including CCDF and all other funding sources, spent on different types of providers. This analysis refers only to the six initiatives for which funds are distributed to providers: caregiver compensation, on- and off-site training of caregivers, safety equipment and improvements, meeting state standards, and incentives for accreditation. Because these initiatives constitute 54 percent of all expenditures, this analysis accounts for just over half of all reported expenditures for quality improvement. 
Of the 42 states that responded to our survey, 34 were able to provide information about the type of provider targeted by one or more of the six initiatives that they funded. [11] "Informal care" refers to legally operating care given by adults, including friends and relatives, and is usually unregulated. [12] The methodological criterion we used was that, to determine a program‘s effect, an evaluation should employ an experimental or quasi-experimental design. See appendix I for a more detailed discussion of these study designs. In one case, however, we have included a study that used a nonexperimental design, but had very high quality data. [13] C. Howes, E. Smith and E. Galinsky, The Florida Child Care Quality Improvement Study: 1996 Report (New York: Families and Work Institute, 1996). [14] See U.S. General Accounting Office, Early Childhood Programs: The Use of Impact Evaluations to Assess Program Effects, GAO-01-542 (Washington, D.C.: Apr. 16, 2001) for a more detailed discussion of other types and uses of program evaluation. [15] See Background for a description of the council‘s first review. [16] National Research Council and Institute of Medicine. From Neurons to Neighborhoods: The Science of Early Childhood Development. Committee on Integrating the Science of Early Childhood Development, edited by Jack P. Shonkoff and Deborah A. Phillips (Washington, D.C.: National Academy Press, 2000); Deborah Vandell and Barbara Wolfe, Child Care Quality: Does it Matter and Does it Need to be Improved? (Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation, 2000); John M. Love, Peter Z. Schochet, and Alicia L. Meckstroth, Are They in Any Real Danger? What Research Does - and Doesn‘t - Tell Us About Child Care Quality and Children‘s Well-Being (Princeton, N.J.: Mathematica Policy Research, Inc., May 1996). [17] In methodological terms, when analyzing effects on children‘s development, the need to separate the influence of family characteristics from the quality of the child care setting is called controlling for selection bias. Controlling for selection bias in conducting analyses of the effects of child care quality on children‘s development and tying the size of an effect, when it can be determined, to the cost of achieving a change, are two key issues in research on child care quality and child outcomes. [18] HHS‘s Child Care Bureau, which administers the CCDF, oversees this research agenda. In addition, HHS‘s Office of Planning, Research and Evaluation (OPRE) is conducting the National Study of Child Care for Low-income Families. OPRE‘s funding has come historically under Section 110 of the Social Security Act. HHS‘s Office of the Assistant Secretary for Planning and Evaluation focuses on crosscutting issues and filling gaps not covered by other HHS agencies, but has no dedicated budget for child care research. [19] "Tiered reimbursement" is reimbursement that offers higher rates to providers that meet certain quality standards set by the state. This is implemented under federal and state programs that subsidize care for low-income families. [20] Under CCDF provisions, states may spend money appropriated in a prior fiscal year in a later fiscal year. (See Background.) [21] Toni Porter et al., Assessing the Child Care and Development Fund (CCDF) Investment in Child Care Quality: A Study of Selected State Initiatives (New York: Bank Street College of Education, 2002). 
[22] See GAO-01-542 for a detailed description of experimental and quasi-experimental designs. [23] ECERS and FDCRS were developed by Harms and Clifford, 1980, and ITERS was developed by Harms, Cryer, and Clifford, 1990. [24] John M. Love, Peter Z. Schochet, and Alicia L. Meckstroth, Are They in Any Real Danger? What Research Does - and Doesn‘t - Tell Us about Child Care Quality and Children‘s Well-Being (Princeton, N.J.: Mathematica Policy Research, Inc.); Deborah Lowe Vandell and Barbara Wolfe, Child Care Quality: Does it Matter and Does it Need to be Improved? (Washington, D.C.: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation, 2000). [25] Love, Schochet and Meckstroth, Danger; Vandell and Wolfe, Child Care Quality. [26] R. Ruopp, J. Travers, F. Glantz, and C. Coelen, Children at the Center: Final Report of the National Day Care Study (Cambridge: Abt Associates, 1979). [27] NICHD ECCRN, "Effect Sizes from the NICHD Study of Early Child Care," paper presented at the Biennial Meeting of the Society for Research in Child Development, Albuquerque, N. Mex., April 1999. [28] Ruopp, Travers, Glantz and Coelen, Children at the Center. [29] Marcy Whitebook, Carollee Howes, and Deborah Phillips, Who Cares? Child Care Teachers and the Quality of Care in America: Final Report: National Child Care Staffing Study (Berkeley: Child Care Employee Project, 1989). Reviewers indicated study controlled for family characteristics. Study design and analysis procedures were not identified. [30] Ruopp, Travers, Glantz and Coelen, Children at the Center. [31] NICHD ECCRN, "Effect Sizes." [32] Ruopp, Travers, Glantz and Coelen, Children at the Center. [33] Ruopp, Travers, Glantz and Coelen, Children at the Center. [34] NICHD ECCRN, "Effect Sizes." [35] Ellen Galinsky, Carollee Howes, and Susan Kontos, The Family Child Care Training Study: Highlights of Findings (New York: Families and Work Institute, 1995). [36] Whitebook, Howes, and Phillips, Child Care Staffing Study. Reviewers indicated study controlled for family characteristics. Study design and analysis procedures were not identified. [37] L.L. Hestenes, S. Kontos, and Y. Bryan, "Children‘s Emotional Expression in Child Care Centers Varying in Quality," Early Childhood Research Quarterly 8 (1993): 295-307; S.D. Holloway and M. Reichhart-Erickson, "The Relationship of Day Care Quality to Children‘s Free Play Behavior and Social Problem-Solving Skills," Early Childhood Research Quarterly 3 (1988): 39-53; S. Kontos and A. Wilcox-Herzog, "Influences on Children‘s Competence in Early Childhood Classrooms," Early Childhood Research Quarterly 12 (1997): 247-262. [38] E.S. Peisner-Feinberg and M. R. Burchinal, "Relations between Preschool Children‘s Child-Care Experiences and Concurrent Development: The Cost, Quality, and Outcomes Study," Merrill-Palmer Quarterly 43 (1997): 451-477. [39] Ruopp, Travers, Glantz and Coelen, Children at the Center. [40] Ruopp, Travers, Glantz and Coelen, Children at the Center. [41] NICHD ECCRN, "Effect Sizes." [42] Ruopp, Travers, Glantz and Coelen, Children at the Center. [43] NICHD ECCRN, "Effect Sizes." [44] NICHD ECCRN, "Effect Sizes." [45] Whitebook, Howes, and Phillips, Child Care Staffing Study. Reviewers indicated study controlled for family characteristics. Study design and analysis procedures were not identified. [46] L. Dunn, S. A. Beach, and S. 
Kontos, "Quality of the Literacy Environment in Day Care and Children‘s Development," Journal of Research in Childhood Education 9 (1994): 24-34; H. Goelman, "The Relationship between Structure and Process Variables in Home and Day Care Settings on Children‘s Language Development," in The Practice of Ecological Research: From Concepts to Methodology, edited by A. Pence and H. Goelman (N.p., 1988); K. McCartney, "Effect of Quality of Day-Care Environment on Children‘s Language Development," Developmental Psychology 20 (1984): 244-260; NICHD ECCRN, "The Relation of Child Care to Cognitive and Language Development," in Child Development (in press); Peisner-Feinberg and Burchinal, "Preschool Children‘s Child-Care Experiences"; E. Schliecker, D. R. White, and E. Jacobs, "The Role of Day Care Quality in the Prediction of Children‘s Vocabulary," Canadian Journal of Behavioural Science 23 (1991): 12-24. [47] L. Dunn, "Proximal and Distal Features of Day Care Quality and Children‘s Development," Early Childhood Research Quarterly 8 (1993): 167-192. [48] Peisner-Feinberg and Burchinal, "Preschool Children‘s Child-Care Experiences." [49] S. Kontos and A. Wilcox-Herzog, "Influences on Children‘s Competence in Early Childhood Classrooms," Early Childhood Research Quarterly 12 (1997): 247-262. [50] Dunn, Beach, and Kontos, "Quality of the Literacy Environment in Day Care"; Goelman, "The Relationship between Structure and Process Variables"; McCartney, "Effect of Quality of Day-Care"; NICHD ECCRN, "The Relation of Child Care"; Peisner-Feinberg and Burchinal, "Preschool Children‘s Child-Care Experiences"; Schliecker, White, and Jacobs, "The Role of Day Care Quality." [51] M.R. Burchinal, J.E. Roberts, R. Riggins, S. A. Zeisel, E. Neebe, and D. Bryant, "Relating Quality of Center Child Care to Early Cognitive and Language Development Longitudinally," in Child Development (in press). [52] C. Howes, "Can the Age of Entry into Child Care and the Quality of Child Care Predict Adjustment in Kindergarten?" Developmental Psychology 26 (1990): 292-303. [53] D.M. Blau, "The Effects of Child Care Characteristics on Child Development," Journal of Human Resources 34, no. 4 (1999): 786-822. [54] Burchinal et al., "Relating Quality of Center Child Care." [55] Howes, "Adjustment in Kindergarten." [56] Howes, "Adjustment in Kindergarten." [57] Burchinal et al., "Relating Quality of Center Child Care." [End of section] GAO‘s Mission: The General Accounting Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO‘s commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO‘s Web site [hyperlink, http://www.gao.gov] contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics. 
Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today‘s Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to [hyperlink, http://www.gao.gov] and select "Subscribe to daily E-mail alert for newly released products" under the GAO Reports heading. Order by Mail or Phone: The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to: U.S. General Accounting Office: 441 G Street NW, Room LM: Washington, D.C. 20548: To order by Phone: Voice: (202) 512-6000: TDD: (202) 512-2537: Fax: (202) 512-6061: To Report Fraud, Waste, and Abuse in Federal Programs Contact: Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: E-mail: fraudnet@gao.gov: Automated answering system: (800) 424-5454 or (202) 512-7470: Public Affairs: Jeff Nelligan, managing director, NelliganJ@gao.gov: (202) 512-4800: U.S. General Accounting Office: 441 G Street NW, Room 7149: Washington, D.C. 20548:
