Community Services Block Grant Program

HHS Should Improve Oversight by Focusing Monitoring and Assistance Efforts on Areas of High Risk. GAO ID: GAO-06-627. June 29, 2006

The Community Services Block Grant (CSBG) provided over $600 million to states in fiscal year 2005 to support over 1,000 local antipoverty agencies. The Department of Health and Human Services's (HHS) Office of Community Services (OCS) is primarily responsible for overseeing this grant; states have oversight responsibility for local agencies. At the request of Congress, GAO is providing information on (1) HHS's compliance with federal laws and standards in overseeing states, (2) five states' efforts to monitor local agencies, and (3) federal CSBG training and technical assistance funds targeted to local agencies with problems and the results of the assistance. States were selected based on varying numbers of local agencies and grant amounts and recommendations from associations, among other criteria.

In a February 2006 letter (GAO-06-373R), GAO notified OCS that it lacked effective policies, procedures, and controls to help ensure that it fully met legal requirements for monitoring states and internal control standards. At that time, GAO also offered recommendations for improvements. OCS has responded that it intends to take actions to address each of those recommendations. In addition, GAO found that OCS did not routinely collect key information, such as results of state monitoring reports, or systematically use available information, such as state performance data, to assess the states' CSBG management risks and target monitoring efforts to states with the highest risk.

All five states we visited conducted on-site monitoring of local agencies with varying frequency and performed additional oversight efforts. Two state offices visited each local agency at least once between 2003 and 2005, while the other three states visited local agencies less frequently. State officials we visited had different views on what they must do to meet the statutory requirement to visit local agencies at least once during each 3-year period, and OCS has not issued guidance interpreting this requirement. Officials in all five states also provided oversight in addition to monitoring through such activities as reviewing reports and coordinating with other federal and state programs.

OCS targeted some training and technical assistance funds to local grantees with financial or management problems, but information on the results of this assistance is limited. In fiscal years 2002 through 2005, OCS designated between $666,000 and $1 million of its annual $10 million training and technical assistance funds to local agencies with problems, but had no process for strategically allocating these funds to areas of greatest need. In addition, the final reports on awarded grants indicated that some local agencies had improved, but the reports provided no information on the outcomes of assistance for nearly half of the 46 local agencies that GAO identified as being served.

Recommendations

Our recommendations from this work are listed below with a Contact for more information. Status will change from "In process" to "Open," "Closed - implemented," or "Closed - not implemented" based on our follow-up work.



This is the accessible text file for GAO report number GAO-06-627, entitled 'Community Services Block Grant Program: HHS Should Improve Oversight by Focusing Monitoring and Assistance Efforts on Areas of High Risk', which was released on July 11, 2006.
Report to Congressional Requesters:

United States Government Accountability Office:

June 2006:

Community Services Block Grant Program: HHS Should Improve Oversight by Focusing Monitoring and Assistance Efforts on Areas of High Risk:

GAO-06-627:

GAO Highlights: Highlights of GAO-06-627, a report to congressional requesters:

Figure: Results of Assistance to 46 Local Agencies as Reported by Grantees, Fiscal Years 2002 through 2005:

[See PDF for image]

Source: GAO analysis of grantee reports.
[End of figure]

What GAO Recommends:

GAO recommends that the Assistant Secretary for Children and Families direct OCS to conduct a risk-based assessment of state CSBG programs, have policies and procedures to help ensure monitoring focuses on states with the highest risk, issue guidance on state monitoring requirements and training and technical assistance reporting, and implement a strategic plan to guide its training and technical assistance efforts. The agency agreed with our recommendations and has made plans to address them.

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-627]

For more information, contact Marnie S. Shaul at (202) 512-7215 or shaulm@gao.gov.

[End of section]

Contents:

Letter:
Results in Brief:
Background:
OCS Lacks Internal Controls and a Risk Management Framework Needed to Carry Out Effective Monitoring Efforts:
Frequency of State On-Site Monitoring Varied, but Selected States Performed Other Oversight Activities:
OCS Targeted Some Training and Technical Assistance Funds to Grantees with Problems, but Information on Results Is Limited:
Conclusions:
Recommendations for Executive Action:
Comments from the Department of Health and Human Services on Our Evaluation:

Appendix I: Objectives, Scope, and Methodology:
Appendix II: Community Services Block Grant Program: HHS Needs to Improve Monitoring of State Grantees, GAO-06-373R:
Appendix III: The Department of Health and Human Services's Response to GAO-06-373R:
Appendix IV: Ranking of States Based on Percentage of Local CSBG Subgrantees with Single Audit Findings:
Appendix V: Comments from the Department of Health & Human Services:
Appendix VI: GAO Contact and Staff Acknowledgments:

Related GAO Products:

Tables:
Table 1: Local Agency Monitoring Visits Conducted by Select States, 2003-2005:
Table 2: Total CSBG Funds, Expenditures for Administration, and Numbers of CSBG Staff and Local Agencies, Fiscal Year 2005:
Table 3: Office of Community Services CSBG Training and Technical Assistance Funding, Fiscal Years 2002 to 2005:
Table 4: States Visited in Our Study:
Table 5: Single Audit Data for States Related to Findings among Local CSBG Subgrantees Ranked by the Percentage of Agencies with Findings, 2002:
Table 6: Single Audit Data for States Related to Findings among Local CSBG Subgrantees Ranked by the Percentage of Agencies with Findings, 2003:

Figures:
Figure 1: CSBG Network's Total Resources, Fiscal Year 2004:
Figure 2: Results of Assistance to 46 Local Agencies from Special State Technical Assistance and Peer-to-Peer Grants as Reported by Grantees, Fiscal Years 2002 through 2005:

Abbreviations:
ACF: Administration for Children and Families
CAA: community action agency
CDBG: Community Development Block Grant
CSBG: Community Services Block Grant
DOE: Department of Energy
HHS: Department of Health and Human Services
LIHEAP: Low-Income Home Energy Assistance Program
MICA: Mid-Iowa Community Action
NASCSP: National Association of State Community Service Programs
OCS: Office of Community Services
OMB: Office of Management and Budget
ROMA: Results Oriented Management and Accountability
SSTA: Special State Technical Assistance

United States Government Accountability Office:
Washington, DC 20548:

June 29, 2006:

The Honorable Howard P. "Buck" McKeon:
Chairman:
Committee on Education and the Workforce:
House of Representatives:

The Honorable Michael N. Castle:
Chairman:
Subcommittee on Education Reform:
Committee on Education and the Workforce:
House of Representatives:

The Honorable John A. Boehner:
House of Representatives:

The Honorable Tom Osborne:
House of Representatives:

The Community Services Block Grant (CSBG) provided over $600 million to states in fiscal year 2005 to support over 1,000 local antipoverty agencies.
These local agencies, predominantly community action agencies (CAA), often use CSBG to support their institutional frameworks for providing services, including staff and facilities. They also use CSBG dollars to leverage other public and private resources to support a variety of activities, including Head Start programs, low-income home energy assistance programs, and low-income housing. The Office of Community Services (OCS) within the Department of Health and Human Services (HHS) is primarily responsible for overseeing this block grant, and the states are responsible for overseeing local agencies. In our February 2006 letter to HHS (GAO-06-373R), we reported several challenges that OCS faced in ensuring effective oversight of CSBG funds and, at that time, made recommendations for improvements.

The CSBG Act requires OCS to visit several states each year to evaluate the states' use of CSBG funds and report on its findings to the visited states and Congress annually. The law also requires OCS to provide training and technical assistance funds to states in order to, among other purposes, support state monitoring efforts and improve local programs' quality. The law requires states to visit all local agencies at least once during each 3-year period and more often if local agencies fail to meet state-established goals, requirements, and standards. The Office of Management and Budget (OMB) issued guidance to assist auditors in determining whether states are carrying out their CSBG monitoring responsibility to visit each local agency once every 3 years in compliance with the law. The law also requires states to report performance data to OCS annually, such as data on the number of people served by different antipoverty programs. Additionally, other federal laws and standards for ensuring accountability, such as internal control standards and the Single Audit Act, affect CSBG management and reporting.
To better understand the efforts that OCS and states have undertaken to oversee the use of CSBG funds, we agreed to examine (1) the extent to which HHS's oversight of state efforts to monitor local agencies complied with federal laws and standards, (2) the efforts selected states have made to monitor local agencies' compliance with fiscal requirements and performance standards, and (3) the extent to which HHS targeted federal CSBG training and technical assistance funds to efforts to assist local agencies with financial or management problems and what is known about the results of the assistance.

To address the first objective, we reviewed federal laws and standards to obtain information on OCS's requirements for providing oversight to states, interviewed federal officials on their efforts, and obtained available documentation on these efforts from fiscal year 2003 through fiscal year 2005.

To assess state monitoring efforts, we reviewed federal laws and standards to obtain information on states' CSBG oversight responsibilities and interviewed and collected documentation from state and local officials in Illinois, Pennsylvania, Missouri, Texas, and Washington on state oversight efforts from fiscal year 2003 through fiscal year 2005. We selected states that had, among other characteristics, varying grant amounts and numbers of local agencies, state administrative structures that may have allowed for collaboration with other programs that provide funds to CAAs, varying Single Audit results among their local agencies, and recommendations from CSBG associations for promising oversight practices. We also obtained information on state CSBG audit findings from auditors in the five states.
In addition, we interviewed federal and state officials in Head Start, the Low-Income Home Energy Assistance Program (LIHEAP), and the Community Development Block Grant Program (CDBG)--three programs from which CAAs often receive funds--to obtain information on the degree to which those officials collaborate with federal and state CSBG officials with regard to oversight. Our results on selected state monitoring efforts are not generalizable to all states.

For the third objective, we interviewed federal officials and their contractors that provide training and technical assistance to obtain information on whether OCS grants were targeted and how they determined that the grant-supported efforts were effective. We also obtained and reviewed training and technical assistance grant applications and reports for the two programs that support efforts to assist local agencies for fiscal year 2002 through fiscal year 2005 to assess these efforts and their results. Furthermore, we reviewed Single Audit data to assess the extent to which states had local agencies with findings reported in fiscal years 2002 and 2003, the most recent information available. All 50 states, the District of Columbia, and Puerto Rico were included in our review of Single Audit findings. We assessed the reliability of Single Audit and programmatic data by conducting electronic and manual data testing and interviewing officials knowledgeable about the data. We determined that the data were sufficiently reliable for the purposes of this report. (See app. I for a more detailed description of the scope and methodology of our review.) We performed our work between July 2005 and May 2006 in accordance with generally accepted government auditing standards.

Results in Brief:

In its efforts to oversee states, OCS lacked effective policies, procedures, and controls to help ensure that it fully met legal requirements for monitoring states and federal internal control standards.
OCS also lacked a process to assess states' CSBG management risks. Although OCS met statutory requirements by visiting nine states in fiscal years 2003 through 2005, it did not issue reports to states and annual reports to Congress on monitoring visits, as statutorily required. In addition, OCS did not meet internal control standards because it sent monitoring teams without adequate financial expertise and lost documentation from state visits it conducted in fiscal years 2003 and 2004. We notified the Assistant Secretary for Children and Families about OCS's lack of effective CSBG monitoring controls and offered recommendations for improvements in a letter dated February 7, 2006. OCS has responded that it intends to take actions to address each of our recommendations presented in the letter. In addition to issues previously reported, we found that OCS did not systematically use available information, such as state performance data and audit findings, or collect key information, such as the results of state monitoring of local grantees. Such data would allow OCS to assess states' risk related to managing CSBG programs and target its limited monitoring resources to states with the highest risks. Federal officials said that they visited states after learning from state or local officials that some grantees had management challenges, such as financial problems or staff turnover. However, in other instances, OCS officials could not recall their basis for selecting states for monitoring visits.

All five states we visited conducted on-site monitoring of local agencies with varying frequency and performed additional oversight efforts, such as reviewing financial and programmatic reports from local agencies. Officials in Illinois and Texas conducted at least one on-site visit to each local agency between 2003 and 2005. However, officials in the other three states visited their local agencies less frequently.
Pennsylvania and Washington officials monitored over 90 percent of their local agencies between 2003 and 2005. Missouri visited about 20 percent during this time and allowed up to 5 years between visits to some agencies. While state offices varied in the frequency of their monitoring visits, officials in all states told us that they visited local agencies with identified problems more often, and three states conducted risk assessments to determine which local agencies should receive additional visits. State officials we visited have taken different views on what they must do to meet the legal requirement to visit local agencies at least once during each 3-year period, and OCS has not issued guidance interpreting this requirement. During a 2004 audit, Pennsylvania state auditors, using OMB guidance, found the state CSBG program to be out of compliance with federal requirements because it did not monitor local agencies once every 3 years. In contrast, although Missouri officials visited 4 of 19 local agencies between 2003 and 2005, the state CSBG office maintains it is in compliance with monitoring requirements because it plans to visit all local agencies within the 3-year periods of 2001 to 2003 and 2004 to 2006. We also found that state offices varied in their capacity to conduct monitoring visits. Specifically, Missouri, Pennsylvania, and Washington officials told us that challenges, such as staff shortages, affected their ability to monitor local agencies. Nonetheless, each of the five state programs that we visited regularly reviewed local agencies' reports, including their community action plans, budgets, and performance data. State CSBG officials also reviewed Single Audit reports for local agencies when CSBG findings were mentioned. In addition, CSBG state programs established relationships with officials in other programs that provided funds to the same local grantees to learn the results of other monitoring efforts. 
Generally, state programs offered training and technical assistance to local agencies with findings from on-site monitoring visits.

OCS targeted some training and technical assistance funds to local grantees with financial and programmatic management problems, but information on the results of this assistance is limited. In fiscal years 2002 through 2005, OCS designated between $666,000 and $1 million of its annual $10 million training and technical assistance funds to local agencies with problems, but had no process in place to strategically allocate these funds to areas of greatest need. Without systematically tracking which local agencies experienced problems and what those problems were, OCS did not have adequate information to determine whether its training and technical assistance programs and the amounts dedicated to them were appropriate for addressing the greatest needs of local agencies and the state agencies that oversee them. OCS currently allocates training and technical assistance funds based on input from some state and local agencies, but this process has not been guided by a comprehensive assessment of state and local needs. Additionally, there is limited information on the results of OCS's grant programs that target local agencies with problems. The final reports on awarded grants provided no information on the outcomes of assistance for nearly half of the 46 local agencies that we identified as being served.
To provide better oversight of state agencies, we recommended that the Assistant Secretary for Children and Families direct OCS to (1) conduct a risk-based assessment of state programs by systematically collecting and using information, (2) establish policies and procedures to help ensure monitoring is focused on states with the highest risks, (3) issue guidance on state responsibilities with regard to complying with the requirement to monitor local agencies during each 3-year period, (4) establish reporting guidance on training and technical assistance grants that allows OCS to obtain information on outcomes for local agencies, and (5) implement a strategic plan that will focus its training and technical assistance efforts on areas in which states face the greatest needs. In its written comments on a draft of this report, HHS officials agreed with our recommendations and stated that they have made plans to address them.

Background:

The CSBG program provides funds to state and local agencies to support efforts that reduce poverty, revitalize low-income communities, and lead to self-sufficiency among low-income families and individuals.[Footnote 1] CSBG dates back to the War on Poverty of the 1960s and 1970s, which established the Community Action program, under which the nationwide network of local community action agencies was developed. A key feature of Community Action was the direct involvement of low-income people in the design and administration of antipoverty activities through mandatory representation on local agency governing boards. The federal government had direct oversight of local agencies until 1981, when Congress created CSBG and designated states as the primary recipients. States subgrant funds to over 1,000 eligible local agencies that are primarily community action agencies.
In order to ensure accountability, both federal and state program offices have oversight responsibilities, including on-site monitoring of grantees and subgrantees, following up on monitoring findings, and providing technical assistance.

Federal Role:

OCS administers CSBG and is required by law to conduct on-site compliance evaluations of several states in each fiscal year, report to states on the results of these evaluations, and make recommendations for improvements. Upon receiving an evaluation report, states must submit a plan of action that addresses the recommendations. In addition, OCS is required to report annually to Congress on the performance of the CSBG program, including the results of state compliance evaluations. For states to receive CSBG funding, they must submit, at least every 2 years, an application and plan to OCS stating their intention that funds will be used to, among other things, support activities to help families and individuals achieve self-sufficiency, find and retain meaningful employment, attain an adequate education, make better use of available income, obtain adequate housing, and achieve greater participation in community affairs.

The CSBG Act requires OCS to reserve 1.5 percent of annual appropriations (about $10 million in fiscal year 2005) for training and technical assistance for state and local agencies; planning, evaluation, and performance measurement; assisting states with carrying out corrective action activities; and monitoring, reporting, and data collection activities. The fiscal year 2005 Consolidated Appropriations Act conference report directed OCS to develop a 3-year strategic plan to guide its training and technical assistance efforts. OCS has provided assistance to local agencies with problems primarily through two grant programs: Special State Technical Assistance (SSTA) Grants and Peer-to-Peer Technical Assistance and Crisis Aversion Intervention (Peer-to-Peer) Grants.
OCS generally awarded Special State Technical Assistance Grants to states or state associations of community action agencies to provide support to local agencies that have problems. Since 2001, OCS has awarded the Peer-to-Peer Grant solely to Mid-Iowa Community Action (MICA), a community action agency, to offer problem assessment, interim management, and other technical assistance services to local agencies with problems.

Standards for Internal Control in the Federal Government:

In addition to the federal requirements in law, OCS, like other federal agencies, is required to adhere to internal control standards established by the Office of Management and Budget and GAO in order to help ensure efficient and effective operations, reliable financial reporting, and compliance with federal laws.[Footnote 2] Internal controls help government program managers achieve desired results through effective stewardship of public resources. Such interrelated controls comprise the plans, methods, and procedures used to meet missions, goals, and objectives and, in doing so, support performance-based management and should provide reasonable assurance that an organization achieves its objectives of (1) effective and efficient operations, (2) reliable reporting, and (3) compliance with applicable laws and regulations.

The five components of internal control are:

* Control environment: creating a culture of accountability within an entire organization--program offices, financial services, and regional offices--by establishing a positive and supportive attitude toward the achievement of established program outcomes.

* Risk assessment: identifying and analyzing relevant risks, both internal and external, that might prevent the program from achieving objectives, and developing processes that can be used to measure the actual or potential effects of relevant factors and to manage the associated risks.
During such a risk assessment process, managers should consider their reliance on other parties to perform critical program operations.

* Control activities: establishing and implementing oversight processes to address risk areas and help ensure that management's directives--especially about how to mitigate and manage risks--are carried out and program objectives are met.

* Information and communication: using and sharing relevant, reliable, and timely operational and financial information to determine whether the agency is meeting its performance and accountability goals.

* Monitoring: tracking improvement initiatives over time and identifying additional actions needed to further improve program efficiency and effectiveness.

State Role:

The CSBG Act requires each state to designate a lead agency to administer CSBG funds and to provide oversight of local agencies that receive funds. States are required to award at least 90 percent of their federal block grant allotments to eligible local agencies but are allowed to determine how CSBG funds are distributed among local agencies. States may use up to $55,000 or 5 percent of their CSBG allotment, whichever is higher, for administrative costs.[Footnote 3] States may use remaining funds for the provision of training and technical assistance, coordination and communication activities, payments to ensure they target funds to areas with the greatest need, support for innovative programs and activities conducted by local organizations, or other activities consistent with the purposes of the CSBG Act.

In addition, state and local agencies that expend $500,000 or more ($300,000 or more prior to 2004) in total federal awards are required under the Single Audit Act to undergo an audit annually and submit a report to the Federal Audit Clearinghouse.
Furthermore, individual federal funding sources may also be reviewed annually under the Single Audit, depending on the size of these expenditures.[Footnote 4]

The CSBG Act requires states to monitor local agencies to determine whether they meet performance goals, administrative standards, financial management requirements, and other state requirements. States are required to perform this monitoring through a full on-site review of each local agency at least once during each 3-year period and to conduct follow-up reviews, including prompt return visits, to local agencies that fail to meet the goals, standards, and requirements established by the state. OMB has issued Single Audit compliance review guidance for CSBG that explicitly directs auditors reviewing state programs to determine whether states are visiting each local agency once every 3 years in compliance with the law. States must also offer training and technical assistance to failing local agencies.

Local agencies are required to submit to states a community action plan that contains a community needs assessment, a description of the service delivery system for services provided by or coordinated with CSBG funds, a description of how they will partner with other local agencies to address gaps in the services they provide, a description of how funds will be coordinated with other public and private resources, and a description of how funds will be used to support innovative community and neighborhood-based initiatives.

The CSBG Act requires both state and local agencies to participate in a performance measurement system. Results Oriented Management and Accountability (ROMA) is the OCS-sponsored performance management system that states and local agencies use to measure their performance in achieving their CSBG goals. State agencies report annually on ROMA using the CSBG Information System survey, which the National Association for State Community Services Programs (NASCSP) administers.
CSBG Network Resources:

In fiscal year 2004, the network of local CSBG agencies received almost $9.7 billion from all sources. About $7 billion of these funds came from federal sources, including about $600 million from CSBG. Other federal programs funding the CSBG network included Head Start, LIHEAP, CDBG, the Child Care and Development Fund, Temporary Assistance for Needy Families, and the Social Services Block Grant (see fig. 1). HHS's Administration for Children and Families contributed 90 percent of the $4.4 billion in funds provided to local agencies through HHS.

Figure 1: CSBG Network's Total Resources, Fiscal Year 2004:

[See PDF for image]

Source: GAO analysis of data from NASCSP.

Note: Percentages do not sum to 100 because of rounding.

[End of figure]

HHS received about $637 million in CSBG funding for fiscal year 2005 and about $630 million for fiscal year 2006.

OCS Lacks Internal Controls and a Risk Management Framework Needed to Carry Out Effective Monitoring Efforts:

In its efforts to oversee states, OCS did not fully comply with federal laws related to monitoring states and internal control standards, and it lacked a process to assess state CSBG management risks. OCS visited nine states in fiscal years 2003 through 2005. However, as mentioned in our letter to the Assistant Secretary for Children and Families, OCS lacked the policies, procedures, and other internal controls needed to ensure effective monitoring efforts. As a result, states and Congress are not receiving required information on monitoring findings, and states may not have made improvements to how they administer CSBG funds. We recommended that the OCS director establish formal written policies and procedures to improve OCS's monitoring and related reporting, and OCS officials have made plans to address each of the recommendations included in the letter.
We also found that OCS did not systematically use or collect available information that would allow it to assess states' CSBG management risks. Officials told us that they considered a variety of risk-related factors when selecting sites for monitoring visits, including reports from state and local officials about financial management problems and staff turnover, but they did not have a systematic approach to assessing risk or targeting monitoring toward the states with the greatest needs.

OCS Lacked Policies and Procedures to Help Ensure Effective Monitoring of States but Plans Improvements:

OCS lacked the policies, procedures, and internal controls needed to help ensure effective on-site monitoring of state CSBG programs but has made plans to address these issues. OCS officials told us they had visited nine states since 2003: Delaware, Louisiana, Maryland, and North Carolina in 2003; Alabama and Montana in 2004; and Kentucky, New Jersey, and Washington in 2005.[Footnote 5] During these visits, OCS officials told us, they used a monitoring tool to assess the administrative and financial operations of state programs. However, OCS sent monitoring teams that lacked required financial expertise to conduct evaluations of states and did not issue final reports to states as required by law. Consequently, the visited states may have been unaware of potential OCS findings and, therefore, may not have developed corrective action plans where needed. Furthermore, OCS officials told us that they lost documentation for the state visits conducted in fiscal years 2003 and 2004, leaving them unable to report to the states they visited or to perform appropriate follow-up procedures. OCS officials did not include information on their monitoring visits in their most recent CSBG report to Congress, released in December 2005, as statutorily required. In addition, OCS has not issued reports to Congress annually, as required by law.
We reported on OCS's monitoring challenges to the Assistant Secretary for Children and Families on February 7, 2006, and made recommendations for addressing these problems (for a copy of this letter, see app. II). Specifically, we recommended that the OCS director establish formal written policies and procedures to (1) ensure that teams conducting monitoring visits include staff with the requisite skills, (2) ensure the timely completion of monitoring reports to states, (3) maintain and retain documentation of monitoring visits, and (4) ensure the timely issuance of annual reports to Congress. In response to this letter, OCS officials said that they plan to address each of our recommendations by hiring additional monitoring staff with expertise in financial oversight, training all staff on requirements that states must meet prior to visits, establishing a triennial schedule for monitoring visits to states, developing new guidelines for reporting to states and maintaining monitoring documents, and issuing timely reports to Congress, among other efforts. See appendix III for more details on HHS's response to our letter.

OCS Did Not Use or Collect Available Information Needed to Assess States' Risk of Mismanaging Their CSBG Programs:

OCS did not systematically use or collect key information that would allow it to assess states' CSBG management risks and target its limited monitoring resources toward the states at greatest risk. OCS officials told us that they used a risk-based approach to select states to visit, but we found the selection process to be ad hoc and often unexplained. OCS officials explained that they used information received from state and local officials about state CSBG management concerns to decide in which states to conduct compliance evaluation visits. For example, upon learning that local agencies in Louisiana were concerned that they had not received all the funds allotted to them, OCS decided to conduct an evaluation of that state.
OCS officials also mentioned that when selecting states to visit, they considered such risk factors as staff turnover and limited information about a state in general. However, OCS officials could not explain why they visited six of the nine states evaluated since 2003 and had no formal, written criteria for determining which states to visit. Each state provides annual program performance information to OCS, but OCS does not systematically use this information to assess states' risks of not meeting program objectives. Specifically, states annually provide OCS with information about the number of people receiving services and the types of services local agencies provided, categorized according to designated program goals--data that could show OCS whether state and local agencies are performing as expected. OCS also did not systematically use information on the amount of CSBG funds states have expended. OCS officials said they reviewed state Single Audit reports when CSBG was included, but we found that states' CSBG expenditures generally fell below the thresholds that would require the program to be audited annually. OCS does not systematically collect other key information that would allow federal officials to assess risk related to states' oversight efforts and therefore cannot determine whether states are fulfilling their requirement to visit local agencies. For example, although OCS required states to certify in their CSBG applications that they will conduct statutorily required on-site visits of local agencies, it did not require states to submit documentation, such as reports on their monitoring findings, to verify that they had conducted these visits. OCS officials told us that they relied on state Single Audit reports to learn which states did not comply with monitoring requirements. However, we found these audits rarely, if ever, review state CSBG programs.
OCS officials told us that they were not aware of how rarely CSBG is reviewed through the Single Audit. OCS also does not systematically collect information on which local agencies experience management problems or on the extent to which identified problems are being resolved. The federal CSBG director told us that, as a result, OCS may not be fully aware of the extent to which states have local agencies facing challenges in managing CSBG. OCS is aware of some local agencies with problems but has not established regular methods for collecting this information. OCS officials told us that it is the states' responsibility to identify and address problems in local agencies. In our review of Single Audit data, we found that financial management problems were common, with about 30 percent of local agencies reporting findings in 2002 and 2003. However, fewer than 10 percent of all local agencies reported material weaknesses--more severe findings that could result in undetected financial reporting errors and fraud--in either year (see app. IV for Single Audit data).

Frequency of State On-Site Monitoring Varied, but Selected States Performed Other Oversight Activities:

All five states we visited conducted on-site monitoring of local agencies with varying frequency and performed additional oversight efforts, such as reviewing financial and programmatic reports from local agencies. The state programs that we visited had different views on what they must do to meet the federal requirement to monitor local agencies at least once during each 3-year period, and OCS had not issued guidance clarifying the time frames states should use when conducting on-site visits. Specifically, officials in two states conducted on-site visits to each local agency at least once between 2003 and 2005, but officials in the other three states visited their local agencies less frequently.
While states varied in the frequency of their monitoring visits, all five state offices visited local agencies with identified problems more often. Capacity to conduct on-site monitoring varied among the five state offices, particularly in administrative and financial monitoring resources. Officials in all five states that we visited reviewed local agency reports as an additional oversight effort and provided required training and technical assistance to local agencies. In addition, some state offices coordinated with other federal programs that fund local activities to gain further insight into local agencies' management practices.

Frequency of On-site Visits Varied:

The frequency of on-site visits to local agencies varied among the five states we visited, ranging from 1 to 5 years between site visits. State CSBG offices in Illinois and Texas conducted visits to each local agency between 2003 and 2005; specifically, officials in these two states visited at least half of their agencies each year. In contrast, Pennsylvania, Missouri, and Washington officials monitored their local agencies less frequently, with Missouri allowing up to 5 years to pass between monitoring visits to some local agencies. Washington and Pennsylvania officials had visited nearly all of their local agencies from 2003 through 2005, leaving less than 10 percent unmonitored during this period. Conversely, the Missouri state CSBG office visited 4 of 19 local agencies from 2003 to 2005, leaving nearly 80 percent of its agencies unmonitored since 2001 or 2002. While states varied in the frequency of their monitoring visits, officials in all five states told us they visited local agencies with identified problems more often. Illinois, Texas, and Washington assessed local agencies' management risks to prioritize which local agencies they visited more frequently during a monitoring cycle.
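The kind of risk-based prioritization these three states used could, in outline, look something like the following sketch. It is purely illustrative: neither the states nor OCS publish a scoring formula, and every factor name, weight, and dollar figure below is an assumption (the factors loosely mirror those the states we visited reported considering, such as funding amounts, time since the last visit, and identified concerns).

```python
# Illustrative sketch only: factor names, weights, and agency data are
# assumptions, not any state's actual methodology.

def risk_score(agency):
    """Return a simple composite risk score for one local agency."""
    score = 0.0
    # Larger grants mean more funds at risk (scaled to millions of dollars).
    score += agency["csbg_funds"] / 1_000_000
    # Each year since the last on-site review adds risk.
    score += 2.0 * agency["years_since_visit"]
    # Any open concern (e.g., an audit finding) weighs heavily.
    score += 5.0 * agency["open_concerns"]
    return score

agencies = [
    {"name": "Agency A", "csbg_funds": 800_000, "years_since_visit": 1, "open_concerns": 0},
    {"name": "Agency B", "csbg_funds": 400_000, "years_since_visit": 4, "open_concerns": 2},
]

# Visit the highest-scoring agencies first in the monitoring cycle.
by_risk = sorted(agencies, key=risk_score, reverse=True)
print([a["name"] for a in by_risk])  # Agency B outranks Agency A here
```

In practice, a state office would feed in real grant amounts, visit dates, and audit findings, and the weights would reflect the program's own priorities.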
Table 1 below shows how many local agencies these states monitored with on-site CSBG reviews from 2003 through 2005.

Table 1: Local Agency Monitoring Visits Conducted by Select States, 2003-2005:

State         Number of local  Total agencies  Percentage of      Reviewed  Reviewed  Reviewed
              agencies         reviewed[A]     agencies reviewed  in 2003   in 2004   in 2005
Illinois      36               36              100                36        36        29
Missouri      19               4               21                 1         0         4
Pennsylvania  44               40              91                 5         10        25
Texas         47               49[B]           100[B]             39        27        34
Washington    31               29              94                 10        13        11

Source: GAO analysis of data provided by state CSBG programs.

[A] The number of agencies reviewed is not the sum of the reviews conducted each year, since some agencies were reviewed in multiple years.

[B] The number and percentage of agencies reviewed in Texas between 2003 and 2005 exceed the state's totals because two of the local agencies reviewed in 2003 subsequently closed.
[End of table]

Although the CSBG Act states that local agencies should be visited at least once during each 3-year period, the state officials we visited had different views on what is necessary to meet this requirement, and OCS has not issued guidance to states clarifying how the law should be interpreted. During the fiscal year 2004 Single Audit, Pennsylvania state auditors, using OMB guidance stating that reviews of local agencies must be conducted once every 3 years, found the state CSBG program to be out of compliance with federal requirements. However, the Missouri CSBG program manager stated that even though 15 local agencies had not been visited between 2003 and 2005, under the state's interpretation of the CSBG law the CSBG office will meet monitoring requirements because all local agencies will be visited within the two 3-year periods of 2001 to 2003 and 2004 to 2006. For example, Missouri officials visited five agencies in 2001, during the first 3-year period, and plan to visit these agencies again in 2006, during the second 3-year period.

States' Capacity to Conduct On-site Monitoring Varied:

Administrative and financial monitoring resources varied in the five states we visited. Specifically, administrative funding ranged from less than 1 percent ($135,380) of CSBG funds in Missouri to 4 percent ($1.2 million) in Texas. The Missouri program manager told us that the state CSBG office used less than 1 percent for administration because state hiring restrictions prevented the CSBG program from hiring full-time CSBG staff. In addition, state officials in Missouri, Pennsylvania, and Washington told us that staff shortages prevented them from visiting local agencies more frequently. The number of staff available, funding for administration, and other related information are shown in table 2.
Table 2: Total CSBG Funds, Expenditures for Administration, and Numbers of CSBG Staff and Local Agencies, Fiscal Year 2005:

State         Number of      Total CSBG    Administrative  Percentage of CSBG funds  Number of
              CSBG staff[A]  funds         expenditures    for administration        local agencies
Illinois      7              $29,934,237   $1,037,843      3.5                       36
Missouri      5              $17,535,155   $135,380        0.8                       19
Pennsylvania  7              $26,828,424   $722,311        2.7                       44
Texas         15             $30,514,311   $1,226,817      4.0                       47
Washington    4              $7,433,155    $214,790        2.9                       31

Source: GAO analysis of data from states, NASCSP, and HHS.

[A] Includes part-time staff paid with CSBG funds.

[End of table]

State programs generally developed and made use of written monitoring guides, but they varied in their ability to assess local agencies' financial operations. The five state programs we visited all had written guides for monitoring visits that covered such areas as financial controls, governance, personnel, performance outcomes, and previous monitoring findings. However, state auditors in Washington told us that the CSBG office could not provide evidence that the guides were consistently used during monitoring visits because available documentation showed that the guides were often incomplete after a visit.
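As a quick arithmetic check, the administrative-cost percentages in table 2 can be reproduced directly from the reported dollar figures. The snippet below is illustrative only; the numbers are copied from the table.

```python
# Recompute each state's administrative-cost percentage from table 2:
# (administrative expenditures / total CSBG funds) * 100, rounded to
# one decimal place, matching the percentages reported in the table.
table2 = {
    "Illinois":     (29_934_237, 1_037_843),
    "Missouri":     (17_535_155, 135_380),
    "Pennsylvania": (26_828_424, 722_311),
    "Texas":        (30_514_311, 1_226_817),
    "Washington":   (7_433_155, 214_790),
}
for state, (total_funds, admin) in table2.items():
    pct = round(100 * admin / total_funds, 1)
    print(f"{state}: {pct}%")
```

Running this reproduces the table's percentages (3.5, 0.8, 2.7, 4.0, and 2.9).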
Illinois, Texas, and Washington offices regularly used accountants to support their reviews of local agencies' financial operations as part of the on-site monitoring visits. Conversely, Missouri and Pennsylvania officials told us they did not regularly involve accountants in their monitoring efforts but had taken steps to improve the guides they used to review local agencies' finances. Specifically, the Missouri CSBG office, in consultation with the Missouri Association for Community Action (MICA), made changes to its monitoring guide and provided financial training to its staff. The state CSBG office in Pennsylvania, with input from state budget staff, revised the financial aspects of its monitoring guide.

States Made Efforts to Provide Additional Oversight of Local Agencies:

The states that we visited provided oversight in addition to on-site monitoring through such activities as reviewing reports, coordinating with other federal and state programs, and providing formal training and technical assistance. All five state programs collected regular financial and performance reports and reviewed local expenditure reports. In addition, officials in the five states told us that they reviewed local agencies' annual Single Audit reports when those reports included findings related to the CSBG program. For example, a state audit manager in Washington reviewed the audits and regularly notified the CSBG program office when local agency findings were identified, and state CSBG program staff followed up with local agency officials and worked to ensure that the findings were addressed. States also required all local agencies to submit performance data. State officials told us that local agencies established their own performance goals, and the state offices reviewed these goals and sometimes modified them in consultation with local agencies. Additionally, all state CSBG offices reviewed local community action plans.
Illinois, Texas, and Washington officials used information from these additional oversight activities to conduct risk assessments and select local agencies for more frequent on-site monitoring visits. In conducting these risk assessments, the state programs considered such factors as the amount of funds received from the state, the time since the last monitoring visit, and any identified concerns about an agency's competency, integrity, or proficiency. State officials told us that they directed local agencies to use preventive training and technical assistance to address any issues raised by risk assessments. Three of the five state CSBG offices that we visited also coordinated oversight activities with other federal and state programs that fund local agencies. For example, the Missouri, Texas, and Washington offices performed joint monitoring visits with state LIHEAP officials, and Missouri exchanged the results of local agency monitoring visits with the regional Head Start office. Coordination with other federal and state programs that provide funds to local agencies, such as housing-related programs and Head Start, generally consisted of occasional meetings and the sharing of some information. Also, OCS and the Head Start Bureau entered into a memorandum of understanding to foster collaboration and improve oversight of local agencies. While most regional and state officials told us they were aware of the memorandum of understanding, some told us that its intent was unclear and that they needed additional guidance to implement it more usefully. State associations of community action agencies played an important role in providing formal training and technical assistance to local agencies. CSBG officials in Illinois, Missouri, Pennsylvania, and Texas relied on state community action associations to provide technical assistance. 
For example, the Illinois Community Action Association received state training and technical assistance funds to provide online resources, peer coaching, and regular conferences. Missouri's state association for community action agencies, the Missouri Association for Community Action, also received CSBG training and technical assistance funds, which it used to help local agencies improve communications and management information systems and to provide additional technical assistance as needed. In addition, the Missouri association provides networking opportunities for local agencies and has a full-time training expert on staff, supported by the state CSBG contract, who provides one-on-one support to local agencies. Beyond the training provided by its association, the Texas CSBG staff sponsored conferences and workshops that allowed staff members to provide training directly to local agencies. In Washington, the association and state staff sponsored discussion groups for the local agencies. Additionally, during on-site monitoring visits, state CSBG officials provided immediate informal technical assistance and, when necessary, followed up with local agencies on monitoring findings.

OCS Targeted Some Training and Technical Assistance Funds to Grantees with Problems, but Information on Results Is Limited:

While OCS targeted some training and technical assistance funds to local grantees with financial and programmatic management problems, information on the results of this assistance is limited. In fiscal years 2002 through 2005, OCS designated between $666,000 and $1 million of its annual $10 million in training and technical assistance funds to local agencies with problems, but OCS did not have information to determine whether its training and technical assistance programs and their funding amounts were appropriate for addressing the areas of greatest need.
Specifically, the federal CSBG director explained that OCS currently allocates training and technical assistance funds based on input from some state and local agencies, but this process was not guided by a systematic assessment of state and local needs. Information on the results of OCS's current grant programs that target local agencies with problems was limited. However, information provided in progress reports for these grants showed that some of the agencies assisted had improved.

OCS Designated $1 Million or Less of Its Annual Training and Technical Assistance Funding to Assist Local Agencies with Problems but Does Not Know if These Funds Addressed the Greatest Needs:

In fiscal years 2002 through 2005, OCS designated $1 million or less of its annual $10 million in training and technical assistance funds to assist local agencies with problems, but it had no way to determine whether this money was allocated in a way that addressed the greatest needs of state and local agencies. OCS divided its annual $10 million in training and technical assistance funds among program support, contracts, and grants. The Deputy Director of OCS told us that program support funds paid salaries and expenses for the OCS officials who manage CSBG grants, and contract funds paid for logistics costs such as outreach and meetings with grantees, costs related to a management information system, and costs related to grant competitions. Training and technical assistance grants may be used for a variety of purposes, and OCS allocated these funds to support different types of activities each year. For example, OCS frequently funded activities such as supporting the implementation of ROMA, encouraging agencies to share innovative ideas, and providing program and management training opportunities for community action professionals.
OCS designated between $666,000 and $1 million of its annual training and technical assistance grants to assist local agencies with problems through two grant programs: Special State Technical Assistance (SSTA) Grants and Peer-to-Peer Technical Assistance and Crisis Aversion Intervention Grants. These grants were commonly used to address management, financial, and board governance problems at local agencies. Table 3 shows the allocation of CSBG funding for grants, contracts, and program support.

Table 3: Office of Community Services CSBG Training and Technical Assistance Funding, Fiscal Years 2002 to 2005:

Thousands of dollars.

Allocation type                                  2002     2003     2004     2005
Grants to assist local agencies with problems    $666     $845     $1,000   $900
  Special State Technical Assistance             366      545      500      400
  Peer-to-Peer                                   300      300      500      500
Other grants                                     6,158    5,303    6,215    6,571
Contracts                                        1,483    2,436    1,432    1,039
Program support                                  1,871    1,540    1,469    1,557
Totals[A]                                        $10,218  $10,125  $10,117  $10,066

Source: HHS Office of Community Services and GAO analysis.

[A] Column totals may not add because of rounding.

[End of table]

Despite a congressional recommendation, OCS officials told us there is no process in place to strategically allocate its approximately $10 million in training and technical assistance funds among program areas. OCS drafted a strategic plan for allocating these funds--an action directed by congressional conferees in the conference report for the fiscal year 2005 Consolidated Appropriations Act--but did not implement the plan.
The federal CSBG director told us that OCS did not implement the strategic plan because the President's recent budget proposals did not include funding for CSBG, although Congress has continued to fund the program. The director also told us that the draft plan focused resources on such areas as financial integrity and management, leadership enhancement, and data collection. In addition, the director told us that OCS currently allocates training and technical assistance funds based on input from some state and local agencies, but this process was not guided by a systematic assessment of state and local needs and did not involve guidance or specifications on the amounts that should be awarded for particular activities. Specifically, OCS sought input each year from the Monitoring and Assessment Taskforce--a group made up of some state and local CSBG officials and national CSBG associations such as NASCSP--to generate a list of priority activities. OCS then presented this list at national community action conferences, such as those sponsored by NASCSP or the Community Action Partnership, for additional comments. However, OCS does not track which local agencies experience problems or what those problems are. As a result, OCS could not provide us with information on the extent to which its current efforts are addressing areas of greatest need.

Limited Data on the Results of the Assistance Showed That Some Local Agencies Had Improved:

Information on the results of OCS grant programs that targeted local agencies with problems was limited. However, the available information showed that some local agencies had improved their financial and programmatic management as a result of the assistance they received.
Our review of all available grant applications and subsequent progress reports for SSTA and Peer-to-Peer grants identified 68 local agencies that these grants targeted for assistance between 2002 and 2005. Of these 68 agencies, 22 had no results available because the assistance was ongoing and final progress reports were not yet due. We identified outcomes for 25 of the remaining 46 agencies, as shown in figure 2. Of these 25 agencies, 18 reported improvement, and the remaining 7 had unresolved issues, had closed, or had undeterminable results. Results were unknown for the other 21 agencies because their grant progress reports did not include information on outcomes.

Figure 2: Results of Assistance to 46 Local Agencies from Special State Technical Assistance and Peer-to-Peer Grants as Reported by Grantees, Fiscal Years 2002 through 2005:

[See PDF for image]

Source: GAO analysis of grantee progress reports.

Notes: This figure does not include the 22 of the 68 agencies identified as being targeted for assistance for which data on results were not available because final progress reports on the grants awarded to assist these agencies were not yet due. Among the 18 agencies that showed improvement, 5 still had remaining challenges to address.

[End of figure]

OCS officials told us that they hold grantees accountable for conducting the activities under the proposed scope of training and technical assistance grants, not for whether these activities result in successful outcomes for the local agencies they assist. OCS's guidance to training and technical assistance grantees recommends that grantees report whether activities are completed but does not require them to report on outcomes. Further, HHS's guidance on discretionary grant reporting, which covers CSBG training and technical assistance grants, does not specify what information program offices should collect on performance and outcomes.
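The agency counts reported above are internally consistent, which a few lines of arithmetic confirm. All figures below are taken from the report; the snippet is only a sanity check.

```python
# Consistency check of the reported outcome counts for SSTA and
# Peer-to-Peer grant recipients, fiscal years 2002 through 2005.
targeted = 68       # local agencies targeted for assistance
ongoing = 22        # final progress reports not yet due
with_outcomes = 25  # outcomes identified in progress reports
improved, other = 18, 7  # improved vs. unresolved/closed/undeterminable
unknown = 21        # progress reports lacked outcome information

remaining = targeted - ongoing
assert remaining == 46                       # agencies with reports due
assert with_outcomes + unknown == remaining  # 25 + 21 = 46
assert improved + other == with_outcomes     # 18 + 7 = 25
print("counts consistent")
```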
We also spoke with officials in HHS's Office of Inspector General, who said that, on the basis of prior reviews, the office had some concerns about the administration of CSBG discretionary grants. Specifically, these officials had concerns about the completeness and accuracy of progress reports and whether grantees were meeting their goals. Officials involved in efforts to use grants to assist agencies gave mixed reviews of the effectiveness of activities funded by these grants. State and local officials in Texas and Missouri spoke highly of their interaction with MICA in assisting agencies with problems. State officials in Texas said they had used an SSTA grant to assist two local agencies and had hired MICA as the contractor to provide the assistance. Texas officials were pleased with the assistance that MICA provided and said that the state did not have the resources to provide the kind of long-term, on-site assistance that MICA offered. Missouri officials told us that all five local agencies that the state had contracted with MICA to work with had benefited from MICA's expertise, particularly with regard to financial matters. Like Texas, the Missouri office used SSTA grants to provide assistance to four of these agencies. In contrast, we also spoke to national, regional, and state community action association officials who had worked with local agencies that received assistance from MICA and had concerns about MICA's work. Specifically, they told us that MICA was not always effective in resolving local agencies' problems, did not use money efficiently, and had an apparent conflict of interest stemming from its practice of conducting agency assessments and then offering services to correct the problems those assessments identify. For example, an Ohio official who managed a local agency's contract told us that even with 6 months of paid assistance from MICA, the local agency had closed.
In response to these criticisms, a MICA official said that some problems at local agencies were too severe for MICA to address and that MICA tries to be transparent about its costs by issuing a detailed proposal before starting work with an agency. Additionally, the MICA official said that MICA and OCS officials had discussed the conflict-of-interest issue, and OCS had encouraged MICA to continue its efforts to assess agencies and address their problems. Conclusions: Under the CSBG program, federal, state, and local agencies work together to help low-income people achieve self-sufficiency. The federal government's role is to oversee states' efforts to ensure that local agencies properly and effectively use CSBG funds. OCS currently lacks the procedures, information, and guidance to grantees that it needs to effectively carry out its role. Specifically, OCS does not fully use the data it collects and does not collect other key information on state oversight efforts and the outcomes of training and technical assistance grants that could enhance its oversight capabilities. Additionally, OCS has not issued guidance on how often states should visit local agencies. Thus, OCS cannot determine where program risks exist or effectively target its limited resources to where they would be most useful. Consequently, OCS may have missed opportunities to monitor states facing the greatest oversight challenges and to identify common problem areas where it could target training and technical assistance. Recommendations for Executive Action: In order to provide better oversight of state agencies, we recommend that the Assistant Secretary for Children and Families direct OCS to take the following actions: * Conduct a risk-based assessment of state CSBG programs by systematically collecting and using information. 
This information may include programmatic and performance data, state and local Single Audit findings, information on state monitoring efforts and local agencies with problems, and monitoring results from other related federal programs that may be obtained by effectively using the memorandum of understanding with the Head Start program and other collaborative efforts. * Establish policies and procedures to help ensure that its on-site monitoring is focused on states with the highest risk. * Issue guidance on state responsibilities with regard to complying with the requirement to monitor local agencies at least once during each 3-year period. * Establish reporting guidance for training and technical assistance grants that would allow OCS to obtain information on the outcomes of grant-funded activities for local agencies. * Implement a strategic plan that will focus its training and technical assistance efforts on the areas in which states face the greatest needs. OCS should make use of risk assessments and its reviews of past training and technical assistance efforts to inform the strategic plan. Comments from the Department of Health and Human Services on Our Evaluation: We provided a draft of this report to the Department of Health and Human Services and received written comments from the agency. In its comments, HHS agreed with our recommendations and, in response, planned several changes to improve CSBG oversight. Specifically, HHS officials stated that OCS is finalizing a risk-based strategy to identify state and local agencies most in need of oversight and technical assistance based on characteristics identified in state plans, audit reports, previous monitoring and performance reports, and reports from other programs administered by local agencies that receive CSBG funds. HHS officials said that this strategy will result in OCS implementing a triennial monitoring schedule that they plan to have fully operational by fiscal year 2008. 
HHS officials also said that by October 1, 2006, OCS will issue guidance to state CSBG lead agencies to clarify the states' statutory obligation to monitor all local entities receiving CSBG funding within a 3-year period, as well as requirements for states to execute their monitoring programs. Additionally, HHS officials said that OCS has worked with a group of local and state CSBG officials and national CSBG associations to develop a comprehensive training and technical assistance strategic plan focused on issues such as leadership, administration, fiscal controls, and data collection and reporting. See appendix V for HHS's comments. As agreed with your offices, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after its issue date. At that time, we will send copies of this report to the Assistant Secretary for Children and Families, relevant congressional committees, and other interested parties. We will also make copies available to others upon request. In addition, the report will be available at no charge on GAO's Web site at [Hyperlink, http://www.gao.gov]. Please contact me at (202) 512-7215 if you or your staff have any questions about this report. Other major contributors to this report are listed in appendix VI. Signed by: Marnie S. 
Shaul: Director, Education, Workforce, and Income Security Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: To gain a better understanding of oversight efforts undertaken by state and federal program offices to monitor the Community Services Block Grant (CSBG) program and ensure the accountability of funds, we examined (1) the extent to which the Department of Health and Human Services' (HHS) oversight of states' efforts to monitor local agencies complied with federal laws and standards, (2) the efforts selected states have made to monitor local agencies' compliance with fiscal requirements and performance standards, and (3) the extent to which HHS targeted federal CSBG training and technical assistance funds to efforts to assist local agencies with financial or management problems and what is known about the results of the assistance. To address the first objective, we reviewed federal laws and standards to obtain information on the Office of Community Services's (OCS) requirements and responsibilities for the oversight of states and interviewed federal officials about their oversight efforts. In addition, we obtained and reviewed available information on OCS monitoring policies and procedures; documentation of federal monitoring visits of states conducted during fiscal years 2003 through 2005; other information OCS collects from states, including state applications and performance data; and guidance issued by OCS to communicate program-related information, concerns, and priorities to grantees to assess OCS's compliance with laws and standards. We also reviewed available Single Audit data for local agencies and grouped them by state to assess the percentage of local agencies with Single Audit findings at both the national and state levels reported in fiscal years 2002 and 2003, the most recent years for which information was available. The scope of this review included the District of Columbia and Puerto Rico as well as the 50 states. 
We assessed the reliability of Single Audit data by performing electronic and manual data testing to assess whether the data were complete and accurate. We also assessed the reliability of CSBG statistical data by interviewing officials knowledgeable about data collection and maintenance. We determined that the data were sufficiently reliable for the purposes of this report. In addition, we interviewed federal officials with Head Start, the Low-Income Home Energy Assistance Program, and the Community Development Block Grant program, which also distribute funds to local agencies, to learn whether officials from these programs shared information with CSBG officials to support oversight efforts. To address the second objective, we reviewed federal laws and standards to obtain information on states' CSBG oversight responsibilities and conducted site visits. We visited five states, Illinois, Missouri, Pennsylvania, Texas, and Washington, that were selected using several criteria including grant amounts, number of local agencies, state administrative structure, and analysis of Single Audit results among local agencies. CSBG association officials recommended some of these states based on promising efforts to monitor local agencies. Table 4 provides characteristics we considered for each state. Table 4: States Visited in Our Study: Criteria: Recommendations from CSBG associations; Illinois: [Empty]; Missouri: [Empty]; Pennsylvania: [Empty]; Texas: [Empty]; Washington: [Empty]. Criteria: Size of grant (range $2.5 million - 56.5 million); Illinois: $29,934,237; Missouri: $17,535,155; Pennsylvania: $26,828,424; Texas: $30,514,311; Washington: $7,433,155. Criteria: Number of local agencies (Range 1-63); Illinois: 36; Missouri: 19; Pennsylvania: 44; Texas: 47; Washington: 31. 
Criteria: State administrative structure; (few: fewer than five programs providing funds to CAAs are in the same state agency; many: five or more); Illinois: Few; Missouri: Few; Pennsylvania: Many; Texas: Few; Washington: Many. Criteria: Occurrences of financial findings (above or below average national percentage of subgrantees reporting 2003 Single Audit data); Illinois: Below; Missouri: Above; Pennsylvania: Above; Texas: Above; Washington: Below. Criteria: Recently monitored by HHS; Illinois: [Empty]; Missouri: [Empty]; Pennsylvania: [Empty]; Texas: [Empty]; Washington: Yes; (FY 2005). Criteria: Percentage of people in poverty (National Average = 12.1%); Illinois: 11.8; Missouri: 10.1; Pennsylvania: 9.9; Texas: 15.8; Washington: 11.4. Criteria: Census region; Illinois: Midwest; Missouri: Midwest; Pennsylvania: Northeast; Texas: South; Washington: West. Source: GAO analysis of data from the U.S. Census Bureau, Federal Audit Clearinghouse, NASCSP, and the Community Action Partnership. [End of table] During our state site visits, we interviewed and collected information from state and local officials in Illinois, Missouri, Pennsylvania, Texas, and Washington about state oversight efforts from fiscal year 2003 through fiscal year 2005. Specifically, we interviewed state program officials and reviewed related documentation including state guidance and directives to local agencies, application instructions, state on-site monitoring schedules, on-site monitoring guides, sample contracts, and reporting forms for local agencies. We also visited three local agencies in each state and interviewed staff to learn more about state oversight and monitoring efforts, including application processes, fiscal and performance reporting, on-site monitoring, and training and technical assistance. 
In each state we visited, we reviewed program files for six local agencies, the three we visited and three others; these files included community action plans and applications, financial and performance reports, and state monitoring reports and follow-up correspondence. In addition, we obtained information on state audit findings related to CSBG and met with state auditors during site visits to learn more about additional state oversight of CSBG and related programs and local agencies. We also interviewed state officials in the Low-Income Home Energy Assistance Program and the Community Development Block Grant program, as well as regional HHS officials, to learn whether any coordination occurred between the programs to support state oversight efforts. Our results on the five states that we visited are not generalizable to all state CSBG programs. To address the third objective, we interviewed federal officials and contractors that provide training and technical assistance to obtain information on the extent to which OCS grants were targeted to assist agencies with problems and how they determined whether these efforts were effective. We obtained and reviewed training and technical assistance grant applications and progress reports for Special State Technical Assistance (SSTA) Grants and Peer-to-Peer Technical Assistance and Crisis Intervention (Peer-to-Peer) Grants for fiscal year 2002 through fiscal year 2005 to assess efforts to assist local agencies with problems and the results of these efforts. This review included applications for all 39 SSTA Grants awarded during this period, progress reports issued in 6-month intervals for the Peer-to-Peer Grant, and available final progress reports for the SSTA Grants. We were not able to obtain some SSTA Grant progress reports because the assistance was still ongoing, particularly for grants issued recently. 
We also interviewed a national association representative and state and local officials to learn about the results of training and technical assistance efforts. We conducted our work from July 2005 through May 2006 in accordance with generally accepted government auditing standards. [End of section] Appendix II: Community Services Block Grant Program: HHS Needs to Improve Monitoring of State Grantees, GAO-06-373R: Accountability - Integrity - Reliability: United States Government Accountability Office: Washington, DC 20548: February 7, 2006: The Honorable Wade F. Horn: Assistant Secretary for Children and Families: U.S. Department of Health and Human Services: Subject: Community Services Block Grant Program: HHS Needs to Improve Monitoring of State Grantees: Dear Mr. Horn: As you know, the House Committee on Education and the Workforce has asked GAO to review the administration of the Community Services Block Grant (CSBG) program. As part of this review, we are examining the Department of Health and Human Services' (HHS) efforts to monitor states' use of CSBG funds. Specifically, we have been reviewing efforts undertaken by HHS's Office of Community Services (OCS), which has primary responsibility for administering the CSBG program. The purpose of this letter is to bring to your attention concerns that we have about OCS's ability to effectively fulfill its CSBG monitoring responsibilities. Under the Community Services Block Grant Act, as amended, OCS is required to evaluate several states' use of CSBG funds annually and issue reports on the results of those evaluations to the states and Congress. In addition, the standards pursuant to 31 U.S.C. 
section 3512(c), (d), commonly referred to as the Federal Managers' Financial Integrity Act of 1982, require federal agencies to establish and maintain internal controls to provide reasonable assurance that agencies achieve the objectives of effective and efficient operations, reliable financial reporting, and compliance with applicable laws and regulations.[Footnote 6] In order to have sufficient internal controls, managers should have policies and procedures in place that, among other things, ensure that statutory responsibilities are fulfilled; agency information and communications are available, reliable, and timely; and staff have adequate training and skills. In the course of our work to date, we found that OCS does not have the policies, procedures, and internal controls in place needed to carry out its monitoring efforts. Specifically, we found that: * OCS staff who conduct monitoring visits told us they lacked the full range of necessary skills to perform these visits, and procedures are not in place to ensure that monitoring teams collectively have the requisite skills. Specifically, CSBG staff told us they lacked the expertise needed to assess the financial operations of state CSBG programs, a key component of the CSBG monitoring process. * OCS did not issue mandatory reports on monitoring results to any of the six states it visited during fiscal years 2003 and 2004. OCS officials informed us that they intend to issue reports for fiscal year 2005 visits. However, 6 months after completing these visits, OCS has yet to issue a final report to any state. Although OCS monitoring procedures direct staff to write reports after visits and send draft reports on their findings to state agencies for review and comment before final issuance, the procedures do not include instructions with regard to specific time frames for completing these steps. * OCS cannot locate key documents pertaining to its monitoring visits of states conducted in 2003 and 2004. 
Although OCS officials stated that program files should include documentation of its monitoring visits, we found that OCS's file management policies do not include directions on retaining these monitoring documents. Additionally, officials told us that staff responsible for carrying out monitoring activities had retired and their files could not be located. * OCS's most recent CSBG report to Congress, which covers fiscal years 2000 to 2003, did not include information on the results of its state evaluations, as required by statute. In its previous report to Congress, which covered fiscal year 1999, OCS provided information on these results and acknowledged that it is statutorily obligated to do so. Additionally, OCS has not been timely in issuing these reports and has not submitted them annually to Congress, as required. For example, OCS issued a consolidated report for fiscal years 2000 to 2003 in December 2005. Officials in OCS and the Administration for Children and Families' (ACF) Office of Legislative Affairs and Budget said that the report was issued only recently because of time lags in receiving data from states and the length of the agency's internal review process. In conclusion, OCS's procedures and controls are not adequate to ensure that it performs its monitoring responsibilities in a timely and effective manner. As a result, states and congressional oversight committees are not receiving information that is required under law. States may not have made improvements to how they administer CSBG funds because they were not made aware of findings from OCS's evaluations. Furthermore, OCS's failure to provide timely reports that include the results of these evaluations hinders Congress's ability to carry out its oversight responsibilities. Also troubling is that OCS will not be able to produce this information for the years for which monitoring documentation has been lost. 
Finally, by sending staff without sufficient expertise in financial management on monitoring visits, OCS failed to ensure that states spent federal dollars appropriately. In order to ensure that OCS has the internal controls to fulfill its CSBG monitoring responsibilities, we recommend that you instruct the director of OCS to establish formal written policies and procedures for: * ensuring that teams conducting monitoring visits include staff with requisite skills, * ensuring the timely completion of monitoring reports to states, * maintaining and retaining documentation of monitoring visits, and * ensuring the timely issuance of annual reports to Congress. We met with OCS and other ACF staff on February 2, 2006, and presented these findings. These staff had no comments on the findings but stated that they are beginning to undertake actions intended to address issues we raised with their monitoring efforts. In the meantime, we are continuing our review of the CSBG program. We will include the issues raised in this letter, and any actions that your agency has taken to resolve them, along with the other findings in our final report. If you or your staff have any questions about this correspondence, please contact me at (202) 512-7215. Sincerely yours, Signed by: Marnie S. Shaul: Director, Education, Workforce, and Income Security Issues: [End of section] Appendix III: The Department of Health and Human Services's Response to GAO-06-373R: The Secretary Of Health And Human Services: Washington, D.C. 20201: APR 24 2006: The Honorable David M. Walker: Comptroller General of the United States: Washington, DC 20548: Dear Mr. Walker: In accordance with the requirements of OMB Circular A-50, I acknowledge this Department's receipt of the GAO final correspondence entitled, "Community Services Block Grant Program: HHS Needs to Improve Monitoring of State Grantees" (GAO-06-373R), dated February 7, 2006. I am enclosing a copy of our comments. Sincerely, Signed by: Michael O. 
Leavitt: Enclosure: Statement Of Actions Taken By The Department Of Health And Human Services on the U.S. Government Accountability Office's Final Correspondence Entitled: "Community Services Block Grant Program: HHS Needs To Improve Monitoring Of State Grantees" (GAO-06-373R): The Department of Health and Human Services (HHS) appreciates the opportunity to update the U.S. Government Accountability Office (GAO) on actions taken by the Administration for Children and Families since receiving GAO's February 7, 2006, correspondence. GAO Recommendations: In order to ensure that the Office of Community Services (OCS) has the internal controls to fulfill its Community Services Block Grant (CSBG) monitoring responsibilities, we recommend that you instruct the Director of OCS to establish formal written policies and procedures for: * ensuring that teams conducting monitoring visits include staff with requisite skills, * ensuring the timely completion of monitoring reports to states, * maintaining and retaining documentation of monitoring visits, and * ensuring the timely issuance of annual reports to Congress. HHS Comment: Over the past year, a new Director for OCS was appointed (Josephine Bias Robinson). She has focused considerable attention on strengthening the administration of the CSBG program. Ms. Robinson has taken actions to put organizational controls in place to ensure proper oversight of the program, including clarifying staff structure and responsibilities for program oversight and developing enhanced program management policies and procedures to guide OCS program stewardship. 
In terms of strengthening the capacity of OCS to monitor and oversee State administration and fiscal controls for CSBG: * OCS has been granted the authority to hire additional staff with appropriate skills and expertise in financial management to conduct monitoring of State authorities and to provide training and technical assistance to improve State financial management oversight of local agencies receiving CSBG funds. In addition, OCS has been given approval to use additional contract support to assist in increasing the number of on-site monitoring visits conducted this year. * Prior to conducting monitoring reviews, staff will be trained on Office of Management and Budget (OMB) circular requirements, appropriations law, and CSBG program requirements. * Through the continued use of CSBG funds available for technical assistance, OCS will assist States in strengthening their knowledge and use of basic financial management principles to improve their allocation and control of funds, oversight of local agencies, and compliance with OMB and Internal Revenue Service requirements. * We are establishing a triennial monitoring review schedule. The monitoring reviews will be augmented through analyses of OMB Circular A-133 audits, the number and frequency of crisis agency visits, joint agency interest with the Low-Income Home Energy Assistance Program, and regional representation. * We are presently developing guidelines outlining the process for conducting the monitoring reviews, for communicating the results of the reviews to States, and for working with States to ensure corrective action. We expect to finalize these documents no later than May 30, 2006. * We are also developing guidelines for record maintenance and retention of documentation of monitoring visits. We expect to complete the guidelines no later than April 30, 2006. * We are working on the current CSBG Report to Congress and expect that it will be provided in a timely manner. 
[End of section] Appendix IV: Ranking of States Based on Percentage of Local CSBG Subgrantees with Single Audit Findings: Tables 5 and 6 present Single Audit data by state for local CSBG agencies (i.e., community action agencies) for 2002 and 2003, respectively. For each state, we report (1) the number of local agencies for which Single Audit data were available, (2) the percentage of local agencies in the state that had any type of Single Audit finding, (3) the percentage of local agencies that had material weakness findings,[Footnote 7] and (4) the percentage of local agencies that had material noncompliance findings.[Footnote 8] States are ranked in decreasing order by the percentage of local agencies in the state that had any type of Single Audit finding. Table 5: Single Audit Data for States Related to Findings among Local CSBG Subgrantees Ranked by the Percentage of Agencies with Findings, 2002: Rank: 1; State: Puerto Rico; Number of local agencies reporting Single Audits: 3; Percentage of local agencies with Single Audit findings: 100; Percentage of local agencies with material weakness findings: 33; Percentage of local agencies with material noncompliance findings: 33. Rank: 2; State: New Mexico; Number of local agencies reporting Single Audits: 8; Percentage of local agencies with Single Audit findings: 75; Percentage of local agencies with material weakness findings: 0; Percentage of local agencies with material noncompliance findings: 0. Rank: 3; State: West Virginia; Number of local agencies reporting Single Audits: 14; Percentage of local agencies with Single Audit findings: 64; Percentage of local agencies with material weakness findings: 36; Percentage of local agencies with material noncompliance findings: 14. 
Rank: 4; State: Missouri; Number of local agencies reporting Single Audits: 19; Percentage of local agencies with Single Audit findings: 58; Percentage of local agencies with material weakness findings: 5; Percentage of local agencies with material noncompliance findings: 0. Rank: 5; State: Wyoming; Number of local agencies reporting Single Audits: 9; Percentage of local agencies with Single Audit findings: 56; Percentage of local agencies with material weakness findings: 44; Percentage of local agencies with material noncompliance findings: 0. Rank: 6; State: Maryland; Number of local agencies reporting Single Audits: 17; Percentage of local agencies with Single Audit findings: 53; Percentage of local agencies with material weakness findings: 6; Percentage of local agencies with material noncompliance findings: 0. Rank: 7; State: Connecticut; Number of local agencies reporting Single Audits: 12; Percentage of local agencies with Single Audit findings: 50; Percentage of local agencies with material weakness findings: 0; Percentage of local agencies with material noncompliance findings: 25. Rank: 7; State: South Dakota; Number of local agencies reporting Single Audits: 4; Percentage of local agencies with Single Audit findings: 50; Percentage of local agencies with material weakness findings: 0; Percentage of local agencies with material noncompliance findings: 0. Rank: 9; State: Louisiana; Number of local agencies reporting Single Audits: 36; Percentage of local agencies with Single Audit findings: 47; Percentage of local agencies with material weakness findings: 14; Percentage of local agencies with material noncompliance findings: 14. Rank: 10; State: Arizona; Number of local agencies reporting Single Audits: 11; Percentage of local agencies with Single Audit findings: 45; Percentage of local agencies with material weakness findings: 9; Percentage of local agencies with material noncompliance findings: 0. 
Rank: 10; State: Maine; Number of local agencies reporting Single Audits: 11; Percentage of local agencies with Single Audit findings: 45; Percentage of local agencies with material weakness findings: 9; Percentage of local agencies with material noncompliance findings: 0. Rank: 12; State: New Jersey; Number of local agencies reporting Single Audits: 20; Percentage of local agencies with Single Audit findings: 45; Percentage of local agencies with material weakness findings: 5; Percentage of local agencies with material noncompliance findings: 5. Rank: 13; State: Pennsylvania; Number of local agencies reporting Single Audits: 43; Percentage of local agencies with Single Audit findings: 44; Percentage of local agencies with material weakness findings: 9; Percentage of local agencies with material noncompliance findings: 7. Rank: 14; State: Minnesota; Number of local agencies reporting Single Audits: 36; Percentage of local agencies with Single Audit findings: 42; Percentage of local agencies with material weakness findings: 6; Percentage of local agencies with material noncompliance findings: 6. Rank: 15; State: Indiana; Number of local agencies reporting Single Audits: 22; Percentage of local agencies with Single Audit findings: 41; Percentage of local agencies with material weakness findings: 14; Percentage of local agencies with material noncompliance findings: 14. Rank: 16; State: South Carolina; Number of local agencies reporting Single Audits: 16; Percentage of local agencies with Single Audit findings: 38; Percentage of local agencies with material weakness findings: 19; Percentage of local agencies with material noncompliance findings: 6. Rank: 17; State: Texas; Number of local agencies reporting Single Audits: 43; Percentage of local agencies with Single Audit findings: 37; Percentage of local agencies with material weakness findings: 12; Percentage of local agencies with material noncompliance findings: 5. 
(Table continued. Columns: rank; state; number of local agencies reporting Single Audits; percentage of local agencies with Single Audit findings; percentage with material weakness findings; percentage with material noncompliance findings.)

Rank  State                     Agencies  Findings %  Weakness %  Noncompliance %
18    Alabama                      19         37          26            0
18    Oklahoma                     19         37           0            5
20    Nevada                       11         36           0            0
21    Ohio                         50         36          10            4
22    Arkansas                     14         36           0            0
22    Florida                      28         36           7            7
24    Georgia                      24         33           4            4
24    Iowa                         18         33          22            0
24    New Hampshire                 6         33          17            0
24    North Dakota                  6         33           0            0
28    California                   56         32           2            4
29    Virginia                     26         31           4            0
30    Michigan                     27         30           4            4
31    Wisconsin                    17         29           6            6
32    Tennessee                    18         28           6            6
33    Illinois                     33         27           3            3
34    Colorado                     35         26           6            0
35    New York                     47         26           4            0
36    North Carolina               29         24          14            7
37    Washington                   25         24           4            4
38    Mississippi                  17         24           0            0
38    Oregon                       17         24           0            6
40    Nebraska                      9         22           0            0
41    Montana                      10         20           0            0
42    Massachusetts                23         17           4            0
43    Kentucky                     21         14           0            0
43    Utah                          7         14           0            0
45    Kansas                        8         13          13           13
46    Alaska                        1          0           0            0
46    Delaware                      1          0           0            0
46    Hawaii                        4          0           0            0
46    Idaho                         8          0           0            0
46    Rhode Island                  7          0           0            0
46    Vermont                       4          0           0            0
N/A   District of Columbia[A]       0        N/A         N/A          N/A
      National totals             969         34           7            4

Source: GAO analysis of data from the Federal Audit Clearinghouse.

Note: Ties are noted with some states sharing ranks. Ranking was based on percentages rounded to 1/100th; therefore, the table may show other apparent but not actual ties.

[A] Single Audit data for the local agency in the District of Columbia were not available for fiscal year 2002.

[End of table]

Table 6: Single Audit Data for States Related to Findings among Local CSBG Subgrantees, Ranked by the Percentage of Agencies with Findings, 2003:

Rank  State                     Agencies  Findings %  Weakness %  Noncompliance %
1     Delaware                      1        100           0            0
1     District of Columbia          1        100         100          100
3     Puerto Rico                   4         75          25           25
4     Wyoming                       9         67          44           22
5     West Virginia                13         54          31            8
6     Maryland                     15         53          13            0
7     New Hampshire                 6         50          17            0
7     South Dakota                  4         50           0            0
9     Texas                        42         48          17            2
10    Mississippi                  13         46           8           15
11    Nevada                       11         45           9           18
12    Louisiana                    36         44          17           14
12    Missouri                     18         44           6            0
14    New Jersey                   17         41           6            6
15    Pennsylvania                 42         38          10            5
16    Virginia                     27         37           7            4
17    Oklahoma                     19         37           5            5
18    South Carolina               14         36          14            0
19    Alabama                      18         33          11            0
19    Arkansas                     15         33           7            0
19    New Mexico                    6         33           0            0
19    North Dakota                  6         33           0            0
23    Florida                      28         32          11            4
24    Indiana                      22         32          14           14
25    Minnesota                    35         31           6            3
26    Ohio                         48         31          10            4
27    Colorado                     37         30           8            0
28    Iowa                         17         29          18            6
29    Georgia                      22         27           5            0
29    Maine                        11         27           9            0
31    New York                     50         26           4            2
32    Washington                   27         26           4            0
33    California                   58         26           2            0
34    Illinois                     31         26           3            0
35    Arizona                       8         25           0            0
35    Connecticut                  12         25           8            8
35    Hawaii                        4         25          25           25
35    Michigan                     28         25           4            4
35    North Carolina               28         25          11           11
40    Nebraska                      9         22           0            0
41    Montana                      10         20           0            0
42    Kentucky                     21         19          10            5
43    Wisconsin                    17         18           6            0
44    Massachusetts                23         13           4            0
45    Kansas                        8         13           0            0
45    Tennessee                    16         13           0            0
47    Oregon                       18         11           0            0
48    Alaska                        1          0           0            0
48    Idaho                         8          0           0            0
48    Rhode Island                  8          0           0            0
48    Utah                          8          0           0            0
48    Vermont                       4          0           0            0

      National totals             954         31           8            4

Source: GAO analysis of data from the Federal Audit Clearinghouse.
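The rank ordering in the tables above can be reproduced with standard competition ranking: tied states share a rank and consume the positions that follow, which is why, for example, two states ranked 18 are followed by one ranked 20. The sketch below is illustrative only; the per-state counts of agencies with findings are hypothetical values chosen to be consistent with the displayed (rounded) percentages, since the tables report percentages rather than raw counts.

```python
from fractions import Fraction

# Hypothetical (state, agencies reporting, agencies with findings) triples;
# the findings counts are assumed, chosen so each state displays as "36%".
rows = [
    ("Ohio", 50, 18),     # 18/50 = 36.00%
    ("Nevada", 11, 4),    # 4/11  = 36.36%
    ("Arkansas", 14, 5),  # 5/14  = 35.71%
]

# Sort by exact (unrounded) percentage, descending.
rows.sort(key=lambda r: Fraction(r[2], r[1]), reverse=True)

# Standard competition ranking: a state tied with its predecessor
# inherits the predecessor's rank; otherwise its rank is its position.
ranks = {}
for i, (state, n, f) in enumerate(rows):
    if i and Fraction(f, n) == Fraction(rows[i - 1][2], rows[i - 1][1]):
        ranks[state] = ranks[rows[i - 1][0]]
    else:
        ranks[state] = i + 1
```

All three states here would display as 36 percent once rounded, yet receive distinct ranks, matching the tables' note that ranking on percentages carried to 1/100th can produce apparent but not actual ties.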
Note: Ties are noted with some states sharing ranks. Ranking was based on percentages rounded to 1/100th; therefore, the table may show other apparent but not actual ties.

[End of table]

[End of section]

Appendix V: Comments from the Department of Health & Human Services:

Department of Health & Human Services
Office of Inspector General

Ms. Marnie S. Shaul
Director, Education, Workforce and Income Security Issues
U.S. Government Accountability Office
Washington, DC 20548

Dear Ms. Shaul:

Enclosed are the Department's comments on the U.S. Government Accountability Office's (GAO) draft report entitled "Community Services Block Grant Program: HHS Should Improve Oversight by Focusing Monitoring and Assistance Efforts on Areas of High Risk" (GAO-06-627). These comments represent the tentative position of the Department and are subject to reevaluation when the final version of the report is received. The Department provided several technical comments directly to your staff.

The Department appreciates the opportunity to comment on this draft report before its publication.

Sincerely,

Signed by:
Daniel R. Levinson
Inspector General

Enclosure

The Office of Inspector General (OIG) is transmitting the Department's response to this draft report in our capacity as the Department's designated focal point and coordinator for U.S. Government Accountability Office reports. OIG has not conducted an independent assessment of these comments and therefore expresses no opinion on them.

Comments of the Department of Health and Human Services on the U.S. Government Accountability Office Draft Report Entitled "Community Services Block Grant Program: HHS Should Improve Oversight by Focusing Monitoring and Assistance Efforts on Areas of High Risk" (GAO-06-627):

The Department of Health and Human Services (HHS) appreciates the opportunity to comment on the U.S. Government Accountability Office (GAO) draft report.
GAO Recommendations:

In order to provide better oversight of state agencies, we recommend that the Assistant Secretary for Children and Families direct OCS to take the following actions:

* Conduct a risk-based assessment of state CSBG programs by systematically collecting and using information. This information may include programmatic and performance data; state and local Single Audit findings; information on state monitoring efforts and local agencies with problems; and monitoring results from other related federal programs, which may be obtained by effectively using the memorandum of understanding with the Head Start program and other collaborative efforts.

* Establish policies and procedures to help ensure that its on-site monitoring is focused on the states with the highest risk.

* Issue guidance on state responsibilities for complying with the requirement to monitor local agencies at least once during each 3-year period.

* Establish reporting guidance for training and technical assistance grants that would allow OCS to obtain information on the outcomes of grant-funded activities for local agencies.

* Implement a strategic plan that focuses its training and technical assistance efforts on the areas in which states face the greatest needs. OCS should use risk assessments and its reviews of past training and technical assistance efforts to inform the strategic plan.

HHS Response:

The Assistant Secretary for Children and Families, Wade F. Horn, Ph.D., and the Director of the Office of Community Services (OCS), Josephine Bias Robinson, have worked diligently over the past year to strengthen Community Services Block Grant (CSBG) program oversight. The Administration for Children and Families (ACF) has made great gains in restructuring the monitoring component of the program in a way that will improve the administration, accountability, and outcomes of State and local agencies in the provision of CSBG-funded services.
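GAO's first recommendation, a risk-based assessment built from audit and monitoring indicators, can be pictured as a weighted scoring of per-state data, with on-site reviews prioritized by score. The indicator names, weights, and state values below are illustrative assumptions for exposition only; the report states that OCS lacked such a methodology and does not prescribe any particular indicators or weights.

```python
# Illustrative risk-scoring sketch for targeting state monitoring.
# Indicator names and weights are assumptions, not OCS methodology.
WEIGHTS = {
    "pct_findings": 1.0,           # % of local agencies with Single Audit findings
    "pct_material_weakness": 2.0,  # % with material weakness findings
    "years_since_review": 0.5,     # time since last federal on-site review
}

def risk_score(indicators):
    """Weighted sum of whatever indicators are available for a state."""
    return sum(w * indicators.get(k, 0.0) for k, w in WEIGHTS.items())

# Hypothetical per-state indicator values.
states = {
    "State A": {"pct_findings": 54, "pct_material_weakness": 31, "years_since_review": 6},
    "State B": {"pct_findings": 13, "pct_material_weakness": 0,  "years_since_review": 1},
}

# Highest-risk states come first in the monitoring queue.
priority = sorted(states, key=lambda s: risk_score(states[s]), reverse=True)
```

Under these assumed weights, a state with widespread audit findings and a long gap since its last review rises to the top of the monitoring queue, which is the targeting behavior the recommendation describes.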
In addition to the actions ACF identified in its response to the GAO preliminary report in April 2006 (i.e., to build the capacity of the Federal agency to better monitor State and local agencies), OCS has initiated the following management improvements:

* A Risk-Based Triennial Schedule for OCS Monitoring of State Lead Agencies: OCS is finalizing a strategy to identify the States and local agencies most in need of Federal oversight and technical assistance, and to establish a triennial schedule for monitoring State and local agencies informed by risk assessment characteristics. OCS is prepared to institute a trial year of the procedure in fiscal year (FY) 2007 and to establish a fully operational triennial schedule and methodology by FY 2008. While the goal is for each State to receive a review within each triennial monitoring cycle, OCS is working to focus and time the most immediate reviews for the next 3 years based on a risk assessment of the need for Federal oversight of the States' programs. The characteristics that determine "risk" in State and local programs are being culled from State plans, audit reports, previous CSBG monitoring and performance reports, and reports from other programs administered by CSBG-funded local agencies (e.g., Head Start and the Low Income Home Energy Assistance Program) and will be used to select States for review. OCS is also communicating with CSBG State lead-agency officials, ACF regional liaisons, and the ongoing OCS-sponsored Monitoring and Assessment Task Force. This task force is a consortium of representatives from OCS, national community action organizations, State CSBG lead-agency officials, State association directors, and selected local agency executive directors.

* Guidance to States on Statutory Monitoring Responsibilities: On January 13, 2006, OCS issued Information Memorandum Transmittal No. 94, which advises State CSBG authorities of their statutory obligation to monitor local agencies and encourages them to make special efforts to conduct monitoring and technical assistance among those agencies that are scheduled for initial or follow-up Head Start Performance and Registration Information Systems Management Reviews [hyperlink, http://www.acf.dhhs.gov/programs/ocs/csbg/docments/im94.html]. By October 1, 2006, OCS will issue follow-up guidance pursuant to this transmittal that will further clarify to State CSBG lead agencies their statutory obligation to monitor all local entities receiving CSBG funding within a 3-year period and clarify the requirements for executing their monitoring programs.

* Improved OCS Training and Technical Assistance: OCS has worked with the Monitoring and Assessment Task Force to develop a comprehensive strategic plan for providing training and technical assistance to State and local CSBG-funded entities that focuses on:

  - Program Leadership;
  - Program Integrity: Administration and Fiscal Controls; and
  - Program Accountability: Data Collection and Reporting.

The plan has been disseminated for review and comment by all State and local CSBG agencies. OCS is in the process of hiring three Federal staff and two contractors with appropriate skills and expertise in financial management to conduct monitoring of State programs. Fiscal management staff also will provide training and technical assistance to improve State financial oversight of local agencies receiving CSBG funds. OCS expects to make approximately eight grant awards by September 30, 2006, to intermediary organizations that provide technical assistance to State and local entities in need of management, fiscal, or administrative assistance.
Awards will be made to organizations with experience in financial management principles to help troubled CSBG grantees improve their allocation and control of funds, oversight of local agencies, and compliance with Office of Management and Budget and Internal Revenue Service requirements.

As indicated, ACF officials agree that there should be improvements to Federal and State monitoring of the CSBG program. We believe that as these modifications to Federal and State monitoring and oversight of the CSBG program are implemented in full, we will strengthen the capacity and improve the management of the program to better serve children and families in communities across the nation.

[End of section]

Appendix VI: GAO Contact and Staff Acknowledgments:

GAO Contact: Marnie S. Shaul, (202) 512-7215, shaulm@gao.gov

Staff Acknowledgments: In addition to the contact named above, Bryon Gordon (Assistant Director), Danielle Giese (Analyst-in-Charge), Janice Ceperich, Tim Hall, and Andrew Huddleston made significant contributions to this report. Curtis Groves, Matt Michaels, and Luann Moy provided assistance with research methodology and data analysis. Jim Rebbe provided legal counsel, and Jonathan McMurray and Lise Levie assisted with report development.

[End of section]

Related GAO Products:

Community Services Block Grant Program: HHS Needs to Improve Monitoring of State Grantees. GAO-06-373R. Washington, D.C.: February 7, 2006.

Head Start: Comprehensive Approach to Identifying and Addressing Risks Could Help Prevent Grantee Financial Management Weaknesses. GAO-05-465T. Washington, D.C.: April 5, 2005.

Head Start: Comprehensive Approach to Identifying and Addressing Risks Could Help Prevent Grantee Financial Management Weaknesses. GAO-05-176. Washington, D.C.: February 28, 2005.

Internal Control Management and Evaluation Tool. GAO-01-1008G. Washington, D.C.: August 2001.

Standards for Internal Control in the Federal Government. GAO/AIMD-00-21.3.1. Washington, D.C.: November 1999.

Grant Programs: Design Features Shape Flexibility, Accountability, and Performance Information. GAO/GGD-98-137. Washington, D.C.: June 22, 1998.

FOOTNOTES

[1] Territories and tribes also receive CSBG funds but were not included in the scope of our work.

[2] For more information on internal control standards, see GAO, Standards for Internal Control in the Federal Government, GAO/AIMD-00-21.3.1 (Washington, D.C.: November 1999), and Office of Management and Budget, Circular No. A-123, Management's Responsibility for Internal Control (Washington, D.C.: Dec. 21, 2004).

[3] Administrative costs are not directly associated with providing services and can include staff salaries and costs related to reporting program data.

[4] For a specific funding source to be reviewed under a Single Audit annually, its expenditures would have to represent between 0.15 percent and 3 percent of the total amount of federal funds expended by a state or local agency. The threshold varies depending on total expenditures of federal awards.

[5] OCS also visited the Navajo Nation in 2003.

[6] For more information on internal control standards, see GAO, Standards for Internal Control in the Federal Government, GAO/AIMD-00-21.3.1 (Washington, D.C.: November 1999), and Office of Management and Budget, Circular No. A-123, Management's Responsibility for Internal Control (Washington, D.C.: December 21, 2004).

[7] The American Institute of Certified Public Accountants (AICPA) standards define "material weakness" as a reportable condition in which the design or operation of one or more of the internal control components does not reduce to a relatively low level the risk that misstatements caused by error or fraud, in amounts that would be material in relation to the financial statements being audited, may occur and not be detected within a timely period by employees in the normal course of performing their assigned functions.
[8] The Financial Audit Manual published by GAO and the President's Council on Integrity and Efficiency (GAO-01-765G) defines "material noncompliance" as reportable noncompliance in which a failure to comply with laws or regulations results in misstatements that are material to the financial statements.

GAO's Mission:

The Government Accountability Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to e-mail alerts" under the "Order GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard.
Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to:

U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548

To order by phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs:

Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs:

Jeff Nelligan, managing director, NelliganJ@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, D.C. 20548
