Recovery Act

Energy Efficiency and Conservation Block Grant Recipients Face Challenges Meeting Legislative and Program Goals and Requirements GAO ID: GAO-11-379 April 7, 2011

The American Recovery and Reinvestment Act of 2009 (Recovery Act) provided $3.2 billion for the Department of Energy's (DOE) Energy Efficiency and Conservation Block Grant Program (EECBG) to develop and manage projects to improve energy efficiency and reduce energy use and fossil fuel emissions. The Recovery Act requires GAO to review funds made available under the act and to comment on recipients' estimates of jobs created or retained. GAO examined (1) how EECBG recipients used EECBG funds and challenges they faced, if any; (2) DOE and recipients' oversight and monitoring activities and challenges, if any; (3) the extent to which the EECBG program is meeting Recovery Act and program goals for energy savings; and (4) the quality of jobs data reported by Recovery Act recipients, particularly EECBG recipients. GAO also updates the status of open recommendations from previous bimonthly and recipient reporting reviews. GAO analyzed DOE recipient data and interviewed DOE officials and a nonprobability sample of EECBG recipients, among other things.

According to DOE data, EECBG recipients primarily used funds for 3 of the 14 activities eligible for EECBG funding. These activities are energy-efficiency retrofits, financial incentive programs, and buildings and facilities projects. Some DOE officials, recipients, and others identified challenges in obligating and spending funds due to local jurisdictional requirements and staff and resource limitations. In addition, in April 2010 DOE determined that many recipients were not on a trajectory to obligate and spend funds within specified time frames, so DOE issued new milestones for obligating and spending funds. Many recipients reported having had difficulty meeting the new milestones. DOE is taking steps to address these difficulties.

According to DOE officials and documentation, DOE follows a programwide monitoring plan to oversee the use of Recovery Act funds and uses a variety of techniques to monitor recipients. Overall, recipients also use various methods to monitor contractors and subrecipients, but DOE does not always collect information on recipients' monitoring activities. As a result, DOE does not always know whether the monitoring activities of recipients are sufficiently rigorous to ensure compliance with federal requirements. Some DOE officials, recipients, and others have reported to GAO that some DOE staff and recipients faced challenges with overseeing the use of funds, including (1) technical challenges with a Web-based reporting application DOE uses as a primary oversight tool and (2) staffing and expertise limitations, such as some recipients' unfamiliarity with federal grant procedures.

Recipients contacted and some DOE officials reported to GAO that recipients are using EECBG funds to develop projects designed to reduce energy use and increase energy savings in line with Recovery Act and program goals. However, DOE officials have experienced challenges in assessing the extent to which the EECBG program is meeting those goals. Because actual energy savings data are generally available only after a project is completed, DOE officials said that most recipients report estimates to comply with program reporting requirements. DOE takes steps to assess the reasonableness of these estimates but does not require recipients to report the methods or tools used to develop estimates. In addition, while DOE provides recipients with a tool to estimate energy savings, DOE does not require that recipients use the most recent, updated version of its estimating tool.

GAO's analysis of the Recovery.gov data that recipients reported, including jobs funded, shows that data quality this quarter reflects only minor amounts of inconsistent or illogical data. The portion of EECBG recipients reporting some jobs funded has continued to increase. DOE headquarters and field officials continue to address data quality concerns, including ensuring that recipients and reviewers had the updated Office of Management and Budget guidance on narrative descriptions. However, data across reporting periods may not be comparable because, in earlier periods, some confusion existed about methods for calculating jobs funded.

GAO recommends that DOE (1) explore a means to capture information on recipients' monitoring activities, and (2) solicit information on recipients' methods for estimating energy-related impact metrics and verify that recipients use the most recent version of DOE's estimating tool. DOE generally agreed with GAO's recommendations.

Recommendations

Our recommendations from this work are listed below with a contact for more information. The status will change from "In process" to "Open," "Closed - implemented," or "Closed - not implemented" based on our follow-up work.

Director: Franklin W. Rusco
Team: Government Accountability Office: Natural Resources and Environment
Phone: (202) 512-4597


United States Government Accountability Office:
GAO:

Report to the Congress:

April 2011:

Recovery Act:
Energy Efficiency and Conservation Block Grant Recipients Face Challenges Meeting Legislative and Program Goals and Requirements:

GAO-11-379:

GAO Highlights:

Highlights of GAO-11-379, a report to the Congress. View [hyperlink, http://www.gao.gov/products/GAO-11-379] or key components. For more information, contact Mark E. Gaffigan at (202) 512-3841 or gaffiganm@gao.gov or Yvonne D. Jones at (202) 512-6806 or jonesy@gao.gov.
[End of section]

Contents:

Letter:
Background:
Grant Recipients Are Using EECBG Funds Primarily for Three Activities but Face Several Challenges in Obligating and Spending These Funds:
DOE and Recipients Are Taking Actions to Provide Oversight of EECBG Funds but Report Facing Challenges in Meeting Recovery Act and Other Program Requirements:
DOE Has Faced Challenges in Determining the Extent to Which the EECBG Program is Meeting Recovery Act and Program Goals for Energy Savings:
Oversight of Recipient Reporting Data Quality Continues for the Sixth Round of Reporting:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:

Appendix I: Objectives, Scope, and Methodology:
Appendix II: Comments from the Department of Energy:
Appendix III: Status of Prior Open Recommendations and Matters for Congressional Consideration:
Appendix IV: GAO Contacts and Staff Acknowledgments:

Related GAO Products:

Tables:
Table 1: Activities Eligible for EECBG Funding:
Table 2: Number of Projects and Percentage of EECBG Funds Allocated to Activities:
Table 3: Total EECBG Funds Obligated and Spent by Recipients as of December 31, 2010:
Table 4: Recipients That Met DOE's September 30, 2010, Spending Milestone:
Table 5: Planned Frequency of EECBG Monitoring Activities:
Table 6: DOE-Reported EECBG Monitoring Progress as of February 14, 2011:

Figure:
Figure 1: Portion of EECBG Recipients Reporting Funding at Least a Partial FTE in 2010:

Abbreviations:

CFO: Chief Financial Officer:
DOE: Department of Energy:
EECBG: Energy Efficiency and Conservation Block Grant:
EECS: Energy Efficiency and Conservation Strategy:
EISA: Energy Independence and Security Act of 2007:
FTE: full-time equivalent:
HVAC: heating, ventilation, and air conditioning:
LED: light-emitting diode:
NACo: National Association of Counties:
NASEO: National Association of State Energy Officials:
OMB: Office of Management and Budget:
OWIP: Office of Weatherization and Intergovernmental Programs:
PAGE: Performance and Accountability for Grants in Energy:
Recovery Act: American Recovery and Reinvestment Act of 2009:
USCM: U.S. Conference of Mayors:

[End of section]

United States Government Accountability Office:
Washington, DC 20548:

April 7, 2011:

Report to the Congress:

Since the American Recovery and Reinvestment Act of 2009 (Recovery Act) was enacted in February 2009, the Department of the Treasury has paid out approximately $205.4 billion[Footnote 1] of the Recovery Act funds for use by states and localities. The Recovery Act directed states to use the funds for various purposes, including to preserve and create jobs; assist those most affected by the recession; and invest in transportation, environmental protection, and other infrastructure to provide long-term economic benefits.[Footnote 2] Of the Recovery Act funds, about $3.2 billion were provided as grants to states, territories, federally recognized Indian tribes, and local communities through the Department of Energy's (DOE) Office of Energy Efficiency and Renewable Energy's newly funded Energy Efficiency and Conservation Block Grant program (EECBG). DOE provides EECBG funds to grant recipients to develop, promote, and manage projects to improve energy efficiency and reduce energy use and fossil fuel emissions in local communities.
DOE further encourages EECBG recipients to develop new and innovative approaches; prioritize energy efficiency and conservation; develop projects in a cost-effective manner that will stimulate the economy; and to the extent possible, to develop programs that will continue beyond the funding period.

DOE provides EECBG funds to grant recipients in two forms: through formula grants and competitive grants. Of the $3.2 billion, DOE awarded about $2.7 billion through formula grants to local communities and states. About 61 percent of the total EECBG funds ($1.94 billion) was awarded as formula grants to more than 2,000 local communities--including cities, counties, and tribal communities--and about 24 percent of the total EECBG funds ($767 million) was awarded to the states, five territories, and the District of Columbia. About 1 percent of the total EECBG funds ($40 million) was allocated to Administrative and Training/Technical Assistance. In addition to the approximately $2.7 billion in formula grants, DOE awarded about 14 percent of the total EECBG funds ($453 million) through competitive grants to local communities.

This report, the ninth in a series of bimonthly GAO reviews, responds to a mandate in the Recovery Act, which requires that GAO conduct such reviews of funds made available under the act to determine how funds are used, including whether funds are achieving the stated purposes of the act. The Recovery Act also requires GAO to comment on estimates of jobs created or retained as reported by recipients.[Footnote 3] Over the past 2 years, our bimonthly reviews of Recovery Act programs have covered a wide range of programs including Medicaid, education, Head Start, highways and transit, housing construction and tax credit assistance, emergency food and shelter, justice assistance and community-oriented policing, workforce investment, and environmental and energy projects.

In this report, we reviewed and updated recipients' information available on the EECBG program, focusing on the approximately $2.7 billion awarded through formula funding to eligible states and local and tribal communities. Specifically, our objectives were to determine (1) how EECBG funds are being used, and what challenges, if any, EECBG recipients face in obligating and spending their funds; (2) actions DOE officials and EECBG recipients are taking to provide oversight of EECBG funds and challenges, if any, they face in meeting Recovery Act and other requirements; (3) the extent to which EECBG recipients and the EECBG program are meeting Recovery Act and EECBG program goals for energy savings and what challenges, if any, recipients have encountered in measuring and reporting energy savings; and (4) how the quality of estimates of jobs created and retained reported by Recovery Act recipients, particularly EECBG recipients, has changed over time.

To address these objectives, we reviewed relevant federal laws and regulations, as well as DOE guidance documents. We analyzed DOE's EECBG program data from DOE databases. We interviewed EECBG program officials, including about 30 project officers, technical monitors, and contractors in the field offices responsible for managing and monitoring awards. We also interviewed representatives from several energy and public service organizations, including the National Association of State Energy Officials (NASEO), the National Association of Counties (NACo), and the U.S. Conference of Mayors (USCM).
In addition, we e-mailed questions to a sample of 91 purposefully selected city and county recipients that are eligible to receive EECBG funding. We received responses from 49 recipients to questions on various aspects of our objectives, such as obligating and spending funds, guidance, best practices, monitoring, and challenges experienced. The responses from this nonprobability sample of recipients are not generalizable to the 2,185 state, city, county, and tribal grant recipients eligible for EECBG formula funding nationwide.

To comment on recipients' estimates of jobs created or retained, we analyzed the quality of estimates of jobs created or retained provided by recipients. The Recovery Act requires that nonfederal recipients of Recovery Act funds--including grants, contracts, and loans--submit quarterly reports. These reports include a list of each project or activity for which Recovery Act funds were expended or obligated and information concerning the amount and use of funds and jobs created or retained by these projects and activities, among other information. The latest of these recipient reports covered activity from the Recovery Act's passage through the quarter ending December 31, 2010. We assessed these reports for completeness and reliability and found them sufficiently reliable for the purposes of this report.

Our oversight of programs funded by the Recovery Act has resulted in more than 10 related products in or after December 2010 and more than 90 related products with numerous recommendations since we began reporting on the Recovery Act (see the Related GAO Products list for reports issued in or after December 2010 and GAO's Web site for a list of all GAO reports related to Recovery Act funding). In addition to the objectives outlined above, this report updates agency actions in response to recommendations from previous bimonthly and recipient reporting reviews that have not been fully implemented (open recommendations), including our prior recommendations regarding the use of Recovery Act funds for the Weatherization Assistance Program (see appendix III).[Footnote 4]

We conducted this performance audit from September 2010 to April 2011 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Background:

The goals of the Recovery Act are to preserve and create jobs and promote economic recovery; to assist those most impacted by the recession; to provide investments needed to increase economic efficiency by spurring technological advances in science and health; to invest in transportation, environmental protection, and other infrastructure that will provide long-term economic benefits; and to stabilize state and local government budgets, in order to minimize and avoid reductions in essential services and counterproductive state and local tax increases.

The EECBG program was authorized in the Energy Independence and Security Act of 2007 (EISA),[Footnote 5] which was intended to move the United States toward greater energy independence and security and to increase the production of clean renewable fuels, among other things. The EECBG program was funded for the first time by the Recovery Act, through formula and competitive grants.
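The award figures cited above can be cross-checked with simple arithmetic. The short Python sketch below is illustrative only; it hard-codes the approximate amounts reported in this section (about $1.94 billion in formula grants to local communities, $767 million to states, territories, and the District of Columbia, $40 million for administration and training/technical assistance, and $453 million in competitive grants) and recomputes each component's share of the roughly $3.2 billion appropriation.

```python
# Illustrative cross-check of the EECBG funding split described above.
# Figures (in dollars) are the approximate amounts cited in this report.
awards = {
    "Formula grants to local communities": 1_940_000_000,
    "Formula grants to states, territories, and DC": 767_000_000,
    "Administrative and training/technical assistance": 40_000_000,
    "Competitive grants to local communities": 453_000_000,
}

total_appropriation = 3_200_000_000  # approximate Recovery Act EECBG appropriation

for component, amount in awards.items():
    share = amount / total_appropriation * 100
    print(f"{component}: ${amount:,.0f} ({share:.0f}% of total)")

# The components should account for essentially the full appropriation.
print(f"Sum of components: ${sum(awards.values()):,.0f}")
```

Run as written, the shares work out to roughly 61, 24, 1, and 14 percent, and the components sum to the full $3.2 billion, consistent with the figures reported above.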
Through the program, DOE allocates formula grants to the 50 states, the District of Columbia, and five territories; to city and county recipients based on their resident and commuter populations; and to Native American tribes based on population and climatic conditions. Applicants eligible for formula funding include cities or city-equivalent units of government, such as towns or villages with populations of at least 35,000; counties, which include county-equivalent units of local government, such as parishes or boroughs with populations of at least 200,000; and all Indian tribes and any Alaska Native village. A city or county is also eligible for direct funding if it is one of the 10 highest-populated cities or counties of the state in which it is located.

The EECBG program has broad goals for energy-related outcomes. DOE encourages EECBG recipients to develop new and innovative approaches to meet the purposes of the program: to prioritize energy efficiency and conservation; develop projects in a cost-effective manner that will maximize benefits over time; stimulate the economy; leverage other public or private resources; promote energy market transformation; and to the extent possible, to develop programs that will provide sustainable and measurable energy savings, job creation, and economic stimulus benefits that will continue beyond the funding period.

DOE announced the funding opportunity for interested applicants to submit applications for EECBG formula grant funding on March 26, 2009. DOE required applicants to submit an Energy Efficiency and Conservation Strategy (EECS) that described their strategy for achieving the Recovery Act's goals for the program. DOE had 120 days to review and approve or disapprove recipients' EECSs. DOE's funding announcement also required that recipients select projects from the 14 eligible activities identified in Section 544 of EISA, shown in table 1.

Table 1: Activities Eligible for EECBG Funding:

Eligible activity: 1. Energy Efficiency and Conservation Strategy (EECS); Description: Developing a strategy for using EECBG funds to promote energy efficiency and conservation goals. All recipients are required to develop and submit an EECS.

Eligible activity: 2. Technical consultant services; Description: Retaining a technical consultant to assist in developing an EECS.

Eligible activity: 3. Residential and commercial buildings energy audits; Description: Conducting energy audits of residential and commercial buildings.

Eligible activity: 4. Financial incentive programs; Description: Providing programs for energy-efficiency improvements, such as energy-saving performance contracting, on-bill financing, and revolving loan funds.

Eligible activity: 5. Energy-efficiency retrofits; Description: Providing grants to nonprofit organizations and governmental agencies for retrofitting existing facilities to improve energy efficiency.

Eligible activity: 6. Buildings and facilities; Description: Developing and implementing energy efficiency and conservation programs for buildings and facilities within the recipient's jurisdiction, such as measurement and verification protocols, public education, and identifying energy-efficient technology.

Eligible activity: 7. Transportation programs; Description: Developing programs to conserve energy used in transportation, such as bike lanes, synchronizing traffic signals, and programs that reduce commuting.

Eligible activity: 8. Codes and inspections; Description: Developing building codes and inspection services to promote building energy efficiency.

Eligible activity: 9. Energy distribution; Description: Implementing energy distribution technologies that significantly increase energy efficiency, including distributed resources, combined heat and power, and district heating and cooling systems.

Eligible activity: 10. Material conservation programs; Description: Developing and implementing material conservation programs, including source reduction, recycling, and recycled content procurement programs that increase energy efficiency.

Eligible activity: 11. Reduction/capture of methane/greenhouse gases; Description: Purchasing or implementing technologies to reduce and capture methane and greenhouse gases generated by landfills or similar waste-related sources.

Eligible activity: 12. Lighting; Description: Replacing traffic signals and street lighting with energy-efficient lighting technologies, such as light-emitting diodes.

Eligible activity: 13. Renewable energy technologies; Description: Developing on-site renewable energy technology on or in a government building, including solar energy, wind energy, fuel cells, and biomass.

Eligible activity: 14. Other; Description: Undertaking any other appropriate activity that meets the purposes of the program and is approved by DOE, such as using EECBG funds to leverage public and private sector funds and partnering with third party lenders.

Source: EISA, Pub. L. No. 110-140, § 544; DOE, Financial Assistance Funding Opportunity Announcement: Recovery Act - Energy Efficiency and Conservation Block Grants - Formula Grants, DE-FOA-0000013 (Mar. 26, 2009); DOE, Energy Efficiency and Conservation Block Grant Program Notice 10-011 (April 21, 2010); DOE, Energy Efficiency and Conservation Block Grant Program Notice 10-021 (Jan. 4, 2011).

[End of table]

The Recovery Act placed increased importance on transparency and accountability in the use of its funds. Accordingly, DOE requires grant recipients to report grant-level expenditure information and performance information--including hours worked, energy cost savings, and percent of work completed, as well as other figures--through a Web-based application called the Performance and Accountability for Grants in Energy (PAGE) system within 30 calendar days of the end of each quarter year.[Footnote 6] PAGE allows recipients to electronically submit and manage grant performance and financial information to DOE. In addition, grant recipients are required to report through www.FederalReporting.gov within 10 days after the end of each quarter. This information is made available to the general public through the Recovery.gov Web site.

Grant Recipients Are Using EECBG Funds Primarily for Three Activities but Face Several Challenges in Obligating and Spending These Funds:

Grant recipients are using EECBG funds primarily for three activities: energy-efficiency retrofits, financial incentive programs, and buildings and facilities programs. However, recipients have reported several challenges that have delayed their efforts to obligate and spend these funds.

Grant Recipients Have Allocated Almost Two-Thirds of EECBG Funds to 3 of the 14 Eligible Activities:

Grant recipients have allocated most EECBG funds to 3 of the 14 activities that DOE designated as eligible for EECBG funding in accordance with EISA.
As shown in table 2, recipients have allocated nearly two-thirds (65.1 percent) of EECBG funds for three types of activities: (1) energy-efficiency retrofits (36.8 percent), which includes activities such as grants to nonprofit organizations and governmental agencies for retrofitting existing facilities to improve energy efficiency; (2) financial incentive programs (18.5 percent), which includes activities such as rebates, subgrants, and revolving loans to promote energy-efficiency improvements; and (3) energy-efficiency and conservation programs for buildings and facilities (9.8 percent), which includes activities such as installing storm windows or solar hot water technology.

Some EECBG recipients are using their awards to fund projects in all three of these categories. For example, according to information reported through PAGE, New York City plans to use $31 million of its $80.8 million award to fund energy-efficiency retrofits at municipal buildings such as schools, courthouses, police precincts, and firehouses. It has also allocated $16.1 million to a financial incentive program that will provide loans to capital-constrained building owners for energy-efficient retrofits to residential, commercial, or industrial buildings. New York City designated another $2 million for buildings and facilities projects and will fund a retro-commissioning program at city facilities designed to identify efficiency measures and address anomalies in energy use, equipment schedules, and control sequences that may cause energy waste.

As indicated in table 2, these three activities account for 48 percent of all the projects funded through the EECBG program, or 3,674 out of 7,594 total projects.

Table 2: Number of Projects and Percentage of EECBG Funds Allocated to Activities:

Activity: 1. Energy Efficiency and Conservation Strategy (EECS)[A]; Budget: $151,438,205; Percent of budget: 5.2%; Number of projects: 801; Percent of all projects: 10.5%.

Activity: 2. Technical consultant services; Budget: $71,982,608; Percent of budget: 2.5%; Number of projects: 555; Percent of all projects: 7.3%.

Activity: 3. Residential and commercial buildings and audits; Budget: $68,330,175; Percent of budget: 2.4%; Number of projects: 443; Percent of all projects: 5.8%.

Activity: 4. Financial incentive programs; Budget: $534,573,604; Percent of budget: 18.5%; Number of projects: 406; Percent of all projects: 5.3%.

Activity: 5. Energy-efficiency retrofits; Budget: $1,065,779,447; Percent of budget: 36.8%; Number of projects: 2,460; Percent of all projects: 32.4%.

Activity: 6. Buildings and facilities; Budget: $284,711,283; Percent of budget: 9.8%; Number of projects: 808; Percent of all projects: 10.6%.

Activity: 7. Transportation; Budget: $122,182,595; Percent of budget: 4.2%; Number of projects: 528; Percent of all projects: 7.0%.

Activity: 8. Codes and inspections; Budget: $19,765,501; Percent of budget: 0.7%; Number of projects: 117; Percent of all projects: 1.5%.

Activity: 9. Energy distribution; Budget: $35,628,958; Percent of budget: 1.2%; Number of projects: 77; Percent of all projects: 1.0%.

Activity: 10. Material conservation programs; Budget: $35,677,882; Percent of budget: 1.2%; Number of projects: 163; Percent of all projects: 2.1%.

Activity: 11. Reduction/capture of methane/greenhouse gases; Budget: $30,474,297; Percent of budget: 1.1%; Number of projects: 48; Percent of all projects: 0.6%.

Activity: 12. Lighting; Budget: $198,321,849; Percent of budget: 6.9%; Number of projects: 622; Percent of all projects: 8.2%.

Activity: 13. Renewable energy technologies; Budget: $175,410,392; Percent of budget: 6.1%; Number of projects: 478; Percent of all projects: 6.3%.

Activity: 14. Other; Budget: $100,112,235; Percent of budget: 3.5%; Number of projects: 88; Percent of all projects: 1.2%.

Activity: Total; Budget: $2,894,389,031[B]; Percent of budget: 100.0%[C]; Number of projects: 7,594; Percent of all projects: 100.0%.

Source: DOE.

[A] The number of projects funded under this activity is relatively high because all recipients were required to complete an energy efficiency and conservation strategy as part of their grant application, and some recipients used EECBG funds to cover the cost of preparing these strategies.

[B] According to DOE officials, budgeted amounts are reported by recipients at the activity level, and as a result there is some discrepancy between this figure and the total amount awarded (approximately $2.7 billion).

[C] Totals may not add up to 100 percent due to rounding.

[End of table]

According to DOE officials, a number of factors explain why energy-efficiency retrofits, financial incentive programs, and buildings and facilities programs account for such a large portion of all EECBG-funded projects. DOE officials told us that some recipients had previously identified needed improvements to their buildings and facilities. Recipients also told us that EECBG funds allowed them to undertake planned facilities projects that previously lacked the requisite funding. DOE officials told us that other recipients allocated EECBG funds to these projects to save money on future energy bills and that many recipients chose retrofit programs because these programs allowed them to use EECBG funds to engage their broader communities by retrofitting commercial and residential buildings, in addition to government facilities.

DOE also encouraged recipients to pursue these projects. In the EECBG program's funding announcement, DOE asked recipients to "prioritize energy efficiency and conservation first as the cheapest, cleanest, and fastest ways to meet energy demand" and to "develop programs and strategies that will continue beyond the funding period."[Footnote 7] Energy-efficiency retrofits and buildings and facilities projects meet both of these goals.

Although financial incentive programs appear to be the second highest funded activity--receiving over 18 percent of EECBG funds--the data for this activity require further explanation. According to DOE officials, approximately 73 percent of activities classified as financial incentive programs are subgrants made by state governments to units of local government within the state.[Footnote 8] Such subgrant recipients can use these funds for any of the 14 eligible activities, such as lighting, retrofits, and transportation. The state awarding the subgrant can report details of the activities funded by subgrant recipients. However, the state government reports these details in narrative fields within DOE's PAGE system, while the primary activity type, reflected in table 2, is simply classified as "financial incentive programs." High-level summary data on these activities may give the impression that over 18 percent of EECBG funds are allocated to financial incentive programs; however, nearly three-quarters of these funds may ultimately be used for any of the 14 eligible activities. Although DOE collects information on how these funds are ultimately used, these data are not readily available.
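Because roughly 73 percent of the dollars classified under financial incentive programs are reported as state subgrants that may ultimately fund any of the 14 eligible activities, the headline 18.5 percent share overstates the amount committed to financial incentives as such. The Python sketch below is illustrative only (the variable names are ours, not DOE's); it applies the approximately 73 percent subgrant figure cited above to the table 2 budget amounts to show how much of the headline share could end up funding other activities.

```python
# Illustrative adjustment of the "financial incentive programs" share in table 2,
# assuming (per DOE officials) that about 73 percent of that activity's budget
# consists of state subgrants that may ultimately fund any eligible activity.
total_budget = 2_894_389_031               # total budgeted across all activities (table 2)
financial_incentive_budget = 534_573_604   # activity 4 budget (table 2)
subgrant_fraction = 0.73                   # portion reported as pass-through subgrants

headline_share = financial_incentive_budget / total_budget
direct_share = financial_incentive_budget * (1 - subgrant_fraction) / total_budget
pass_through_share = financial_incentive_budget * subgrant_fraction / total_budget

print(f"Headline share of total budget:    {headline_share:.1%}")
print(f"Retained for financial incentives: {direct_share:.1%}")
print(f"Subgranted, end use undetermined:  {pass_through_share:.1%}")
```

Under these assumptions, only about 5 percent of total budgeted funds is clearly retained for financial incentive activity as such, with roughly 13.5 percent flowing through subgrants whose end use is recorded only in PAGE narrative fields.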
DOE has obligated all EECBG funds to recipients, and recipients are beginning to obligate and spend these funds. The Recovery Act required DOE to obligate all funds to recipients by September 30, 2010, and DOE has done so. DOE staff told us that recipients have completed the planning stages of their projects and they expect that recipient spending will soon hit a peak before leveling off as funds are expended. As of December 2010, recipients reported obligating approximately $1.7 billion, 57 percent of their EECBG budgets, and reported spending more than $655 million, approximately 23 percent of their EECBG budgets. DOE officials expect recipients' spending to increase significantly in forthcoming reporting periods as work begins or increases on more projects. The table below shows the total funds budgeted, obligated, and spent and the percentages of budgeted funds spent for each eligible activity.

Table 3: Total EECBG Funds Obligated and Spent by Recipients as of December 31, 2010:

Activity: 1. Energy Efficiency and Conservation Strategy (EECS); Budget: $151,438,205; Obligated by recipients: $103,869,366; Spent by recipients: $41,251,127; Percent of budget spent: 27.2%.

Activity: 2. Technical consultant services; Budget: $71,982,608; Obligated by recipients: $48,167,480; Spent by recipients: $18,209,103; Percent of budget spent: 25.3%.

Activity: 3. Residential and commercial buildings and audits; Budget: $68,330,175; Obligated by recipients: $31,872,116; Spent by recipients: $14,454,762; Percent of budget spent: 21.2%.

Activity: 4. Financial incentive programs; Budget: $534,573,604; Obligated by recipients: $368,791,277; Spent by recipients: $92,384,019; Percent of budget spent: 17.3%.

Activity: 5. Energy-efficiency retrofits; Budget: $1,065,779,447; Obligated by recipients: $569,119,301; Spent by recipients: $249,773,284; Percent of budget spent: 23.4%.

Activity: 6. Buildings and facilities; Budget: $284,711,283; Obligated by recipients: $148,324,716; Spent by recipients: $68,106,739; Percent of budget spent: 23.9%.

Activity: 7. Transportation; Budget: $122,182,595; Obligated by recipients: $49,087,408; Spent by recipients: $29,754,830; Percent of budget spent: 24.4%.

Activity: 8. Codes and inspections; Budget: $19,765,501; Obligated by recipients: $13,710,406; Spent by recipients: $4,961,257; Percent of budget spent: 25.1%.

Activity: 9. Energy distribution; Budget: $35,628,958; Obligated by recipients: $11,075,534; Spent by recipients: $5,568,287; Percent of budget spent: 15.6%.

Activity: 10. Material conservation programs; Budget: $35,677,882; Obligated by recipients: $12,767,443; Spent by recipients: $10,834,821; Percent of budget spent: 30.4%.

Activity: 11. Reduction/capture of methane/greenhouse gases; Budget: $30,474,297; Obligated by recipients: $24,816,635; Spent by recipients: $5,518,184; Percent of budget spent: 18.1%.

Activity: 12. Lighting; Budget: $198,321,849; Obligated by recipients: $97,181,881; Spent by recipients: $57,828,029; Percent of budget spent: 29.2%.

Activity: 13. Renewable energy technologies; Budget: $175,410,392; Obligated by recipients: $79,385,833; Spent by recipients: $43,722,847; Percent of budget spent: 24.9%.

Activity: 14. Other; Budget: $100,112,235; Obligated by recipients: $94,853,487; Spent by recipients: $12,769,706; Percent of budget spent: 12.8%.

Activity: Total; Budget: $2,894,389,031[A]; Obligated by recipients: $1,653,022,884; Spent by recipients: $655,136,996; Percent of budget spent: 22.6%.

Source: DOE.
[A] According to DOE officials, budgeted amounts are reported by recipients at the activity level, and as a result there is some discrepancy between this figure and the total amount awarded (approximately $2.7 billion).

[End of table]

EECBG Recipients Face Several Challenges in Obligating and Spending Recovery Act Funds and Meeting DOE's New Obligating and Spending Milestones:

Some recipients and others have identified several challenges that have delayed spending of Recovery Act funds under this newly funded EECBG program. DOE has made efforts to help recipients address some of these challenges, including launching a Technical Assistance Program and Solution Center to provide recipients with one-on-one assistance, an online resource library, training, webcasts, and a peer-exchange forum for sharing best practices and lessons learned.

Inexperienced DOE Program Administrators:

Because the EECBG program is relatively new--authorized by EISA in 2007 but not funded until the Recovery Act was passed in 2009--some DOE administrators had little previous experience with the program and its requirements. DOE's Inspector General reported that some of the DOE staff assigned to review EECBG grant applications lacked financial assistance experience and failed to obtain the information necessary to issue awards, which required additional requests for documentation that further delayed awards.[Footnote 9] The Inspector General also reported that the program lacked a permanent Program Director until April 2010. Some EECBG project officers--the DOE staff primarily responsible for overseeing and interacting with EECBG recipients--told us that they faced a steep learning curve during the initial months of the program, when they began working with recipients to resolve obstacles to applying for funds and address questions about meeting requirements and reporting outcomes. Several project officers compared managing the EECBG program to flying a plane while it is still being built.

Limitations in Recipient Staff and Resources:

In addition, several DOE project officers told us that some recipients' efforts to effectively manage grants and spend funds have been complicated by staff and resource limitations. Some recipients lack the staff and resources needed to comply with EECBG and Recovery Act requirements. For example, a project officer told us that one county had only two staff members who were entirely responsible for managing the grant and meeting reporting requirements, in addition to their regular workload. The economic downturn exacerbated some recipients' staffing challenges as budget shortfalls led to furloughs and hiring freezes. For example, one county reported to DOE that staffing shortages due to budget cuts had delayed its planned retrofit projects. Another recipient reported project delays due to a furlough that closed the city government and prevented the city council from approving its plans for a financial incentive program.

Jurisdictional Requirements:

Some recipients also told us that local jurisdictional requirements delayed their ability to spend Recovery Act funds. DOE's Inspector General reported that local budget and procurement requirements prevented some recipients from obligating funds until DOE made the entire award amount available.
In addition, several project officers told us that some recipients cannot initiate their proposed EECBG-funded projects until their spending decisions and budgets are approved by local officials, which can delay projects and spending for months or even longer in localities where local officials meet only quarterly or twice a year. In response to questions about spending delays, one recipient told us that although local procedures can be time-consuming, these procedures also protect tax dollars. Another recipient told us that DOE needs to take local procedures into consideration so that spending milestones are more flexible and realistic. Both representatives from NACo told us that although recipients are grateful for the opportunity to implement critical projects that they previously could not have funded, DOE has not adapted guidance and deadlines to the needs, timelines, or procedures of local governments, and that this has created some challenges. Some project officers expressed similar views in our meetings with them, stating that the federal government in general lacked an appreciation of city and county government processes. Both USCM representatives also told us that although DOE's guidance and support had improved significantly, this lack of understanding of how city governments worked had a negative impact on the success of the program.

Reporting Requirements:

Additionally, some recipients told us that meeting the reporting requirements for EECBG Recovery Act funds is time-intensive and that requiring recipients to submit similar information through PAGE and FederalReporting.gov makes the reporting process unnecessarily duplicative. For example, one recipient told us that the required reporting for EECBG takes two to three times longer than for other federal grants. Some recipients told us that the EECBG program's reporting requirements were more cumbersome than those of other federal grant programs. One recipient with decades of federal grant experience told DOE's Recovery Act help line that although DOE staff had been very helpful in providing information, the reporting requirements for EECBG Recovery Act funds were the most onerous he had experienced in 20 years of government work, despite regularly applying for millions of dollars in federal grants. Another recipient told us that his city canceled one of its planned projects, a geothermal system, because the reporting requirements would have been too burdensome. Similarly, project officers told us that some small recipients were so overwhelmed with the reporting requirements that they declined their awards. However, the vast majority of recipients accepted their awards.

Delays in Acquiring Needed Materials and Products:

Some recipients have also faced challenges in acquiring needed materials and products in a timely manner. EECBG recipients have created a large demand for energy-efficient materials. As a result, some of the materials and products needed to complete projects are out of stock or on back order, which can delay the implementation of these projects and the spending of funds allocated for them. For example, project officers told us that shortages of lighting and heating, ventilation, and air conditioning (HVAC) products have delayed some of the projects requiring these items. Both NASEO representatives told us that shortages were also an issue for solar and energy-efficient lighting projects.
DOE Established New Obligating and Spending Milestones for Recipients:

When DOE announced funding opportunities through the EECBG program in March 2009, it stated that recipients must obligate all funds within 18 months of their effective award date[Footnote 10] and spend all funds within 36 months of their effective award date.[Footnote 11] These original time frames require recipients that were awarded grants in fall 2009--the majority of recipients--to obligate 100 percent of their funds by spring 2011 and to spend these funds by fall 2012. However, in April 2010 DOE determined that many recipients were not on a trajectory to obligate and spend all of their funds within this time frame. DOE sent letters to all EECBG recipients outlining new obligation and spending milestones in an effort to increase obligating and spending rates among recipients and ensure that all funds are spent before the 36-month deadline. DOE's new milestones encouraged recipients to obligate 90 percent of their funds by June 25, 2010, spend 20 percent of their funds by September 30, 2010, and spend 50 percent of their funds by June 30, 2011.

Officials from the Office of Weatherization and Intergovernmental Programs (OWIP), the DOE office that manages the EECBG program, told us that DOE and the Administration expressed an urgency to spend funds quickly, thereby creating jobs and stimulating the economy--primary purposes of the Recovery Act. These OWIP officials told us that many recipients found the milestone letters useful to facilitate local procurement processes and overcome other barriers to obligations and payments. DOE initiated Operation Clear Path to meet the September 2010 spending milestone by contacting 600 targeted recipients through telephone calls and helping these recipients develop strategies and tactics to accelerate their spending. An internal DOE newsletter reported that Operation Clear Path was yielding real gains by reaching out to recipients of grants over $3 million. DOE cited recent spending increases among targeted recipients as evidence that this approach was succeeding. For example, one city moved $600,000 from a revolving loan fund to a lighting retrofit, bringing the city's spending up to 31 percent. Additional examples given from other cities include $10 million spent to capitalize a revolving loan fund and $4 million spent on a lighting equipment purchase.

However, many recipients have had difficulty meeting these new milestones. According to DOE's data, about 41 percent of recipients met DOE's new milestone of spending 20 percent of EECBG funds by September 30, 2010 (see table 4).

Table 4: Recipients That Met DOE's September 30, 2010, Spending Milestone:

Award amount: Over $2 million; Total number of grants: 294; Number of recipients that spent over 20% of funds by Sept. 30, 2010[A]: 95; Percent of recipients that spent over 20% of funds by Sept. 30, 2010: 32%.

Award amount: Between $250,000 and $2 million; Total number of grants: 909; Number of recipients that spent over 20% of funds by Sept. 30, 2010[A]: 339; Percent of recipients that spent over 20% of funds by Sept. 30, 2010: 37%.

Award amount: Less than $250,000; Total number of grants: 984; Number of recipients that spent over 20% of funds by Sept. 30, 2010[A]: 467; Percent of recipients that spent over 20% of funds by Sept. 30, 2010: 47%.

Award amount: Total number of grant recipients; Total number of grants: 2,187[B]; Number of recipients that spent over 20% of funds by Sept. 30, 2010[A]: 901; Percent of recipients that spent over 20% of funds by Sept. 30, 2010: 41%.

Source: DOE.

[A] Spending is determined by funds drawn down from the Automated Standard Application for Payments system, which was developed by the Financial Management Service and through which recipient organizations receiving federal funds can draw from accounts pre-authorized by federal agencies.

[B] Two recipients have withdrawn from the program since September 30, 2010. The current number of recipients, as of February 14, 2011, is 2,185.

[End of table]

Furthermore, some project officers told us that DOE's new spending and obligation milestones confused recipients. These project officers stated that some recipients were concerned about the consequences they would face for not being able to meet the new milestones. Although DOE officials told us that there were no repercussions for recipients that failed to meet these milestones, some project officers told us that some recipients did not understand this and were concerned that they might lose their funding. In some cases, project officers told us that they too were unsure of the consequences of recipients' failing to meet these milestones. In addition, some representatives from NASEO told us that these new milestones were not consistent with the timelines set in the terms and conditions of award agreements that DOE had already approved. Some project officers told us they are concerned that DOE is sending conflicting messages by encouraging recipients to spend funds more quickly than the time frame recipients had agreed to in the terms of their grants. Further, DOE's Inspector General reported in August 2010 that DOE's new obligating and spending milestones "may increase the risks associated with ensuring compliance with regulatory requirements--as well as, maintaining effective financial control over the expenditure of funds."[Footnote 12]

DOE has initiated a second round of telephone calls to targeted recipients, some of which may have been contacted during the first round of follow-up telephone calls, in an effort to increase spending to meet DOE's new milestone of spending 50 percent of EECBG funds by June 30, 2011. DOE staff continue to work with recipients and project officers to share best practices, overcome challenges, and ensure that the EECBG program advances the goals of energy efficiency and job creation. However, it is still unclear whether this effort will overcome the challenges recipients face in obligating and spending EECBG funds and meeting DOE's new milestones.

DOE and Recipients Are Taking Actions to Provide Oversight of EECBG Funds but Report Facing Challenges in Meeting Recovery Act and Other Program Requirements:

Both DOE and recipients are taking a variety of actions to provide oversight of EECBG funds, with some recipients providing much more rigorous oversight than others. DOE and recipients reported having experienced technical, staffing, and expertise challenges that hinder their ability to meet Recovery Act and program requirements. DOE is taking steps to address many of these challenges.

DOE and Recipients Are Taking a Variety of Actions to Provide Oversight of EECBG Funds, and the Level of Recipient Oversight Varies:

DOE and recipients are using a variety of oversight actions for EECBG funds, such as program office monitoring and oversight by the DOE Chief Financial Officer (CFO) and the Inspector General.
Recipients are also providing oversight, and the level of this oversight may vary by the recipient's resources and the nature of the project.

DOE's Oversight Actions:

DOE's framework for oversight is its programwide monitoring plan, which was issued in August 2009 and revised in March and June 2010. According to DOE documentation, DOE developed the plan to, among other things, provide a structure for oversight of recipients' procedures and processes, ensure consistent application of program and reporting standards, and provide clear and transparent guidelines. As outlined in the monitoring plan, OWIP is responsible for the administration and oversight of EECBG funds. The office plans and budgets programmatic requirements and resources, develops standardized monitoring practices, provides expertise to review and analyze performance measures, and helps provide guidance and technical expertise to meet programmatic requirements, among other activities. Specifically, OWIP has issued guidance to help recipients and DOE program staff meet Recovery Act and program requirements such as Buy American,[Footnote 13] Davis-Bacon,[Footnote 14] and monitoring requirements, as well as to help ensure that funds are spent efficiently. OWIP also develops and hosts Web seminars on specific topics such as designing retrofit and appliance rebate programs.

To monitor recipients' use of funds, DOE project officers act as the primary support to recipients, as well as liaisons between recipients and DOE. DOE's monitoring plan directs project officers to gather key program data from recipients, provide information on training and technical assistance opportunities, and coordinate monitoring activities. In addition to project officers, DOE also uses other monitoring staff such as technical monitors, contract specialists, and staff accountants.[Footnote 15] The plan and guidance also identify goals and actions for three primary components of oversight and monitoring: (1) desktop monitoring, (2) on-site monitoring, and (3) worksite visits. Specifically:

* Desktop monitoring: According to DOE, all EECBG grant awards are to be reviewed through quarterly desktop monitoring--remote monitoring conducted by the project officer. This includes examining recipients' financial and other reports to assess progress and determine compliance with federal requirements, as well as examining recipients' planned goals and objectives and the reporting and tracking of resources expended by the recipient and subrecipients.[Footnote 16] DOE also directs project officers and other monitors to use information gained from desktop monitoring for further monitoring and potential corrective action.

* On-site monitoring: DOE conducts periodic on-site monitoring of grant recipients. On-site monitoring primarily occurs at the recipient and contractor levels and can also include field inspections of projects that have achieved milestones since the previous visit. Prior to on-site monitoring, monitors also review the previous findings of other DOE project officers and monitoring staff to identify deficiencies and to determine what steps are being taken to resolve the issues. During on-site monitoring, project officers review what specific internal controls[Footnote 17] recipients have in place, such as segregation of duties, accounting standards and practices, and payment procedures. Monitoring staff also interview contractors to determine if follow-up procedures were used and deficiencies were corrected.
* Project worksite reviews: DOE monitoring staff also visit some worksites during on-site monitoring to review the progress of activities at the project, facility, or building being completed. Staff also review the quality of work performed and compliance with federal requirements, such as the Buy American provision. DOE officials stated that worksite reviews were conducted as determined necessary by the project officer, though project officers made an effort to visit at least one worksite during every on-site visit.

Overall, DOE officials stated that they planned to prioritize their monitoring based on the size of the grant rather than on the expected risk of the project. For example, all recipients with over $2 million in Recovery Act funds--which, as a group, represent about 70 percent of all EECBG Recovery Act formula funds--receive significantly more on-site monitoring than recipients of grants of under $250,000. Table 5 shows the frequency of DOE's planned monitoring activities.

Table 5: Planned Frequency of EECBG Monitoring Activities:

Type of monitoring: Percent of total grant funds[A]; Recipients receiving less than $250,000: 4%; Recipients receiving less than $1M but at least $250,000: 16%; Recipients receiving less than $2M but at least $1M: 9%; Recipients receiving $2M or greater: 70%.

Type of monitoring: Desktop reviews; Recipients receiving less than $250,000: Quarterly; Recipients receiving less than $1M but at least $250,000: Quarterly; Recipients receiving less than $2M but at least $1M: Quarterly; Recipients receiving $2M or greater: Quarterly.

Type of monitoring: On-site reviews; Recipients receiving less than $250,000: One in the life of the grant for 10% of the grants; Recipients receiving less than $1M but at least $250,000: One in the life of the grant for 25% of the grants; Recipients receiving less than $2M but at least $1M: One in the life of the grant; Recipients receiving $2M or greater: One to two per year.

Type of monitoring: Worksite reviews; Recipients receiving less than $250,000: As needed; Recipients receiving less than $1M but at least $250,000: As needed; Recipients receiving less than $2M but at least $1M: As needed; Recipients receiving $2M or greater: As needed.

Source: DOE.

[A] Numbers do not add to 100 percent due to rounding.

[End of table]

To conduct and track monitoring activities, DOE monitoring staff make extensive use of a Web-based application, the PAGE system. For example, DOE monitoring staff use PAGE data to inform and target further monitoring actions such as on-site monitoring. Additionally, DOE has created a monitoring tool within PAGE that compiles the checklists used by project officers to determine recipients' compliance with the program's statutes and requirements. DOE guidance states that through the use of PAGE, monitoring staff can target recipients with low spending rates or reporting delays for additional follow-up to help ensure that funds are spent in a timely manner. Project officers also stated that they use financial data reported in PAGE to identify potential areas of concern for further analysis. Overall, DOE officials stated that they have almost met their desktop monitoring goals, with over 90 percent of their planned monitoring complete, and, based on their current rate, are on track to meet their on-site monitoring goals (see table 6).
DOE officials stated that they do not currently track the number of worksite reviews, but that all initial on-site monitoring visits also include a worksite review.

Table 6: DOE-Reported EECBG Monitoring Progress as of February 14, 2011:

Recipients receiving less than $250,000:
* Total recipients: 982;
* Total on-site visits: 67;
* Total desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 915 (790);
* Percentage desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 93.2% (80.4%).

Recipients receiving less than $1M but at least $250,000:
* Total recipients: 733;
* Total on-site visits: 61;
* Total desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 732 (94);
* Percentage desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 99.9% (12.8%).

Recipients receiving less than $2M but at least $1M:
* Total recipients: 176;
* Total on-site visits: 51;
* Total desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 176 (16);
* Percentage desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 100.0% (9.1%).

Recipients receiving $2M or greater:
* Total recipients: 294;
* Total on-site visits: 293;
* Total desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 293 (104);
* Percentage desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 99.6% (35.4%).

All recipients:
* Total recipients: 2,185[A];
* Total on-site visits: 472;
* Total desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 2,116 (1,004);
* Percentage desktop reports submitted--CY 2010 Q3 (CY 2010 Q4): 96.8% (45.9%).

Source: DOE.

Note: "CY" refers to calendar year and "Q" refers to quarter.

[A] Total number of recipients for table 4 and table 6 do not match because two recipients withdrew from the program.

[End of table]

In addition to desktop monitoring and site visits, several DOE officials reported almost meeting their planned staffing goals for project officers and other monitoring staff. Specifically, as of December 20, 2010, these DOE officials reported that approximately 60 project officers, 20 contract specialists, and 35 full-time equivalents (FTE) of contractor support were assigned to administer the EECBG program.[Footnote 18] These staff members are located in Washington, D.C.; Oak Ridge, Tennessee; Golden, Colorado; and Las Vegas, Nevada. As of December 2010, DOE officials stated they had met their staffing goals with the exception of one field office, which had recently lost several of its staff. These DOE officials said they hoped to fill those positions soon.

DOE monitoring also includes specific financial management oversight in DOE field offices, as well as independent oversight through DOE's Inspector General. Specifically, financial management oversight is conducted by the DOE Office of the CFO in the Golden, Colorado, and Oak Ridge, Tennessee, field offices. In addition, the project officers in the Las Vegas, Nevada, field office also oversee disbursements of DOE funds. Data on the disbursement of funds are then made available to DOE employees, including headquarters employees and the CFO. DOE's Inspector General also provides oversight of the EECBG program and has issued a report on the status of implementation of EECBG, as well as the management controls over the development and implementation of PAGE.[Footnote 19] Along with issuing such reports, DOE's Inspector General takes actions to investigate allegations related to fraud, waste, and abuse for the EECBG program.
To date, DOE's Inspector General has received four complaints regarding the use of EECBG funds and is handling two of them through ongoing investigations, while the other two were sent back to the program office for resolution. A DOE Inspector General official stated that the complaints are related to potential conflicts of interest and potential ethics violations. Additionally, while not investigating EECBG directly, DOE's Inspector General recently issued a management alert related to DOE's State Energy Efficient Appliance Rebate Program.[Footnote 20] Some recipients also used EECBG funds to help fund rebate programs. In its alert, DOE's Inspector General identified an incident in which a consumer in Georgia was able to purchase water heaters at a store, return them, and still inappropriately apply for and receive a rebate through the state's rebate program. DOE's Inspector General report noted that the incident showed that if similar process vulnerabilities exist in other jurisdictions, then the rebate program could be exposed to abusive practices on a broad scale. Recipients' Oversight Actions: The extent of recipients' oversight through the monitoring of Recovery Act funds may depend on the individual recipient's resources and the nature of the project. Broadly stated, federal law, federal regulations, and DOE guidance require that recipients comply with applicable laws, including provisions in the Recovery Act, such as Buy American, and other federal requirements, including regulatory and procedural requirements. Recipients are responsible for informing subrecipients of all applicable laws and other federal requirements and ensuring their subrecipients' compliance with program, fiscal, and audit requirements. The Single Audit Act, as amended, requires recipients passing on the funds to subrecipients to monitor the subrecipients' use of the federal funds through site visits, limited scope audits, and other means.[Footnote 21] DOE did not require recipients to conduct any specific oversight actions but instead released guidance on October 26, 2010, identifying a number of best practices for monitoring by recipients to ensure their and their subrecipients' compliance with federal requirements for monitoring. Further, DOE has also provided additional resources regarding the monitoring of subrecipients, such as workshops and guidance. We found that recipients contacted are using various methods to meet applicable laws and other federal requirements, and recipients, DOE officials, and project officers stated that some recipients are providing substantially more rigorous monitoring than others. For example, while it is required that recipients monitor subrecipients, in response to our questions, one recipient did not indicate that it monitored subrecipients and instead stated that it included terms in their contracts that required contractor adherence to applicable federal requirements. Other recipients reported actively monitoring funds but using relatively limited techniques such as reviewing invoices and other reports submitted by vendors. DOE officials acknowledged that many recipients are resource constrained, limiting their ability to monitor and ensure compliance with applicable federal requirements, and DOE would need to evaluate on a case-by-case basis whether the grant recipient's subrecipient monitoring system is sufficient to meet compliance requirements. 
However, officials stated that DOE does not gather specific information on recipient monitoring practices except during on-site visits. Additionally, because fewer than 25 percent of grants under $1 million are to receive on-site visits, DOE does not have specific information on monitoring for many recipients. DOE officials stated that they focused monitoring efforts on recipients of larger grants because they were more likely to have subrecipients. DOE officials also said that they adopted standard, well-accepted audit sampling practices to achieve the most effective coverage in verifying monitoring practices and conducted additional on-site reviews of recipients that have demonstrated cause for concern, such as not filing quarterly reports. Officials stated that if they identified a concern during an on-site visit, they would work with the recipient to develop an alternative system to monitor subrecipients.

While some recipients reported less rigorous monitoring activities, in other instances, recipients reported having detailed monitoring practices with multiple components. For example, one recipient reported collecting monthly progress reports, which included financial reports, from all subrecipients, vendors, and contractors, as well as conducting site visits. Additionally, the recipient reported that its auditor and grant services staff also conducted internal audits of all Recovery Act programs. Another recipient also reported using a variety of monitoring practices such as inspections, site visits, and regular billing of vendors. The recipient's staff also reported that they planned to use third-party oversight of site visits in the coming year and that they had recently hired a grant compliance analyst who would be administering a subrecipient monitoring plan in the coming year.

Some grant sizes and project types may not lend themselves to the same level of oversight as others. DOE project officers stated that some larger projects included specific monitoring requirements in the contract to help ensure that the project met key goals, including achieving project outcomes and ensuring that funds were used efficiently. For example, project officers stated that some recipients were able to use expertise provided by contractors that perform energy-savings work to generate energy-cost savings to help determine estimated versus achieved energy savings. Other smaller contracts, however, were not large enough to include these types of oversight. In another instance, one recipient noted that because its subrecipient grants were so small--ranging from $8,000 to $55,000--it did not see the need to conduct internal and financial audits, although it did report conducting on-site visits on some projects.

DOE Officials and Recipients Reported Technical, Staffing, and Expertise Challenges That Hinder Efforts to Ensure Requirements Are Met:

DOE officials and some recipients reported experiencing technical, staffing, and expertise challenges that hinder the ability of both DOE and recipients to provide oversight to ensure that Recovery Act and program requirements are met, but DOE is taking steps to address many of these challenges.

Technical Challenges:

Some DOE project officers and recipients reported experiencing technical challenges using PAGE. For example, project officers told us that monitoring was challenging because of the extensive amount of time that they spent helping recipients use the PAGE system, which recipients found cumbersome and difficult to use.
In particular, some project officers said that they provide ongoing support during reporting cycles to help fix reporting errors and, in some cases, to walk recipients through the reporting process. For example, some DOE project officers in one field office told us that during the reporting period--the last week of the quarter and the following 2 months--project officers spent approximately 70 to 85 percent of their time helping recipients fill out their PAGE and federal reporting forms.[Footnote 22] Other project officers with the same office also told us that recipients often cannot figure out how to enter data into PAGE and do not understand the reporting system. In another field office, a DOE project officer stated that almost all PAGE reports for each quarter were initially rejected for issues such as missing or incorrect data entries and that recipients reported having difficulties with entering data in every field. Some recipients have also reported difficulties using PAGE--for example, though not asked about PAGE issues specifically in our structured questions, one recipient stated in a response regarding challenges faced that "PAGE reporting is the biggest challenge," while another recipient noted that "technical glitches" in PAGE made reporting more challenging.

Incorrect data entered into PAGE have also limited project officers' ability to monitor recipients. For example, some project officers stated that some recipients continued to enter incorrect information in PAGE for data fields such as total hours worked and estimated energy savings--key project measures used to help gauge project progress and status--in part due to difficulties recipients had understanding PAGE data fields. Specifically, some project officers stated that some PAGE data field names were difficult to interpret, confusing some recipients. Some project officers also stated that recipients entered incorrect job information, using incorrect job calculation formulas, or entered incorrect energy data, and that DOE project officers had to spend time locating the errors and working with the recipient to ensure that the correct data were entered.

In previous work looking at EECBG, as well as the State Energy Program and Weatherization Assistance Program, we have also reported on both DOE monitoring staff and recipients having problems with PAGE that hindered DOE's ability to administer the program.[Footnote 23] Specifically, we previously reported that some DOE staff and recipients noted that the time and effort that project officers spent with recipients to help them understand and navigate the PAGE system significantly increased the administrative burden of the program. In reviewing PAGE's development process, DOE's Inspector General found significant issues, noting that DOE did not seek input from grant recipients in designing PAGE because it had to be developed and implemented so quickly. The Inspector General's report noted that user input significantly increases the likelihood that a system will meet user needs and also helps avoid rework costs due to a lack of functionality. Similarly, we previously reported on the importance of eliciting users' needs to identify and prioritize system requirements.[Footnote 24] Without gathering this information, programs and systems may not meet the needs of their users.
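The data-entry problems described above are the kind that simple field-level checks can flag before a report is accepted. The sketch below is a generic illustration of such validation; the field names, the 520-hour threshold, and the report structure are hypothetical and do not reflect PAGE's actual data model.

    def check_quarterly_report(report):
        """Flag obviously illogical values in a quarterly report (illustrative only).

        `report` is a dict with hypothetical keys such as 'hours_worked',
        'funded_positions', and 'estimated_energy_savings_kwh'."""
        issues = []

        hours = report.get("hours_worked")
        if hours is None or hours < 0:
            issues.append("hours worked is missing or negative")
        elif hours > 520 * report.get("funded_positions", 1):
            issues.append("hours worked exceed a full-time quarterly schedule "
                          "for the number of funded positions reported")

        savings = report.get("estimated_energy_savings_kwh")
        if savings is None or savings < 0:
            issues.append("estimated energy savings is missing or negative")

        return issues

    # Example: a report with negative hours would be returned for correction.
    print(check_quarterly_report({"hours_worked": -40,
                                  "estimated_energy_savings_kwh": 12000}))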
DOE officials stated that they have implemented a number of improvements to both the administration of PAGE, and the system itself, to help address user concerns and that they are currently implementing more changes. For example, DOE established a PAGE hotline to assist users and gather users' feedback and also provided training videos for recipients using PAGE. DOE officials also stated that they recently implemented a new software tool to gather and prioritize user feedback from large numbers of recipients to better address users' major concerns. Additionally, DOE officials said that among other improvements, they plan to incorporate a new interface in the next few months for DOE staff to review data reported in PAGE, which was designed to improve usability of the system. DOE officials stated that these improvements will take time to implement and have to be carefully considered so as not to negatively affect other PAGE functions. For example, PAGE is also used for recipient reporting by DOE's Weatherization Assistance and State Energy programs. DOE officials also stated that they have had to adapt their process to handle the large number of PAGE users, in contrast to the previous system. Specifically, DOE officials stated that approximately 50 state users and a limited number of federal staff used the previous system, while PAGE had 4,292 recipient and 411 federal users as of December 8, 2010. Staffing and Expertise Challenges: Some DOE officials, monitoring staff, and grant recipients, as well as all stakeholders we contacted, stated that some recipients are also encountering staffing and expertise challenges that have limited their ability to monitor the use of Recovery Act funds and ensure effective and efficient use of funds while also meeting Recovery Act and other federal requirements. As noted previously, some project officers stated that some recipients may lack sufficient resources and staffing for program oversight such as monitoring. Some DOE officials also acknowledged the significant recipient workload associated with monitoring. These officials stated that the biggest overall challenge to monitoring was the need to determine the right amount of information to collect from recipients so as to ensure recipient compliance with applicable federal requirements while not burdening recipients with reporting requirements. These officials also stated that they took steps to decrease the reporting burden on recipients, including decreasing certain monitoring requirements. Further, these DOE officials also stated that some recipients said that they might have trouble ensuring compliance with program requirements due to limited staffing and were focused on Recovery Act requirements such as Davis-Bacon and Buy American. Finally, the NASEO and USCM representatives we spoke with stated that some recipients have expressed concerns about complying with Davis-Bacon requirements, especially with limited staff and a large number of projects. For example, both USCM representatives stated that while some city and county projects already had Davis-Bacon or Buy American requirements before the Recovery Act, they were typically on much larger projects in excess of $30 million, such as transportation projects, with dedicated resources to ensure compliance with these requirements. 
Some DOE monitoring staff and the NASEO and USCM representatives stated that some recipients found compliance with federal laws, including Recovery Act provisions, such as those for financial management monitoring and Davis-Bacon, difficult because of a lack of previous expertise with those requirements. Additionally, some DOE monitoring staff stated that some recipients have limited experience with federal grants and faced a steep learning curve with implementing projects, and that this inexperience has limited monitoring efforts, making it more difficult to ensure that funds were spent properly. For example, several DOE staff in one field office stated that, in particular, those recipients that did not have previous experience performing a Single Audit Act report[Footnote 25] tended to have more trouble meeting EECBG requirements than others due to inexperience. Additionally, some DOE project officers also stated that some smaller recipients with fewer staff did not have specific expertise in energy management and that this made it more difficult to monitor programs. Further, these DOE project officers also noted that certain project types, such as energy loan programs, may have fewer tangible outcomes than others such as building retrofits, and are thus more difficult to monitor, especially without specific staff expertise. Additionally, certain Recovery Act requirements can also prove more difficult to ensure compliance with than others. For example, both NASEO representatives stated that some recipients found compliance with the Buy American provision difficult, especially for items with multiple components such as air conditioners. DOE is taking steps to address staffing and expertise challenges by expanding support to recipients. Some DOE officials and the NASEO and USCM representatives stated that DOE's administration of EECBG was initially limited during the program's early stages and that this hindered early program administration including oversight. Overall, the NASEO and USCM representatives stated that while the DOE program office has improved significantly in providing support such as guidance to recipients, the program was significantly impacted by these early delays. DOE has gradually expanded the amount of support provided to recipients. For example, while DOE project officers continue to provide individualized support to recipients, DOE has also developed a Technical Assistance and Solution Center to provide information related to monitoring recipient activities such as project implementation. Through the Technical Assistance and Solution Center, DOE has provided near daily Web seminars on specific energy topics while also helping connect recipients with specific technical expertise. Further, DOE continues to issue programwide guidance to help recipients comply with Recovery Act and program requirements. For example, on January 4, 2011, DOE issued guidance to help recipients determine the eligibility of recipient programs for Recovery Act funds. DOE has also hosted seminars with recipients and subrecipients to help identify and share monitoring best practices. Finally, DOE is working with the Office of Management and Budget (OMB) to update OMB's Compliance Supplement to its guidance for the Single Audit Act (Cir. No. A-133) regarding EECBG program requirements. 
DOE Has Faced Challenges in Determining the Extent to Which the EECBG Program is Meeting Recovery Act and Program Goals for Energy Savings: EECBG program recipients reported using EECBG grant funding to develop projects designed to achieve a variety of benefits in line with Recovery Act and program goals, including reducing total energy use and increasing energy savings for local governments and residents. For example, some recipients we contacted reported anticipating energy savings and reduced overall energy usage from such projects as powering the electrical needs for a large city park with solar panels; maximizing the use of day lighting in government buildings; installing more energy-efficient technology in households and businesses; replacing convention center light fixtures with light-emitting diode (LED) bulbs to reduce energy usage; and updating and remodeling a 40- year-old public building with more energy-efficient products, including new windows and doors, and HVAC systems. Furthermore, DOE officials told us that some recipients are already reporting achieving benefits consistent with Recovery Act and program goals. For example, DOE reported that according to initial recipient self-reporting through December 2010, EECBG recipients have upgraded more than 10,000 buildings, installed 40,000 efficient street lights, and upgraded more than 100,000 traffic signals. DOE has put some examples of program successes on its Web site. For example, a small town in southwestern Wyoming used its EECBG funds to convert its streetlights to LED fixtures. Town officials reported better lighting quality and visibility, less light pollution, and lower energy use that has reduced lighting-related energy costs by almost two-thirds. In North Carolina, local officials used EECBG funds to convert an abandoned grocery store into an energy-efficient community training center and classroom. By installing a more energy-efficient roof, new insulation, and HVAC systems, local officials said they anticipate achieving substantial energy savings. Another recipient in Oklahoma used EECBG funds to purchase five wind turbines. According to the recipient, the new wind energy technology has offset the electrical costs for all town-owned buildings, and the recipient reported anticipating saving $24,000 annually. However, DOE officials have experienced challenges in assessing whether the EECBG program is meeting Recovery Act and program goals for energy savings because most recipients do not measure energy savings by collecting actual data and several factors affect the reasonableness of energy-savings estimates. Additionally, while DOE officials say they have anecdotal examples of program successes, DOE lacks actual programwide data on energy savings. DOE guidance requires that recipients report impact metrics--which include energy savings, energy cost savings, renewable-energy generation, and emissions reductions--on a quarterly basis and verify cumulative totals when grants are closed out, but it does not require that these impact metrics be based on actual, as opposed to estimated, data.[Footnote 26] Furthermore, according to some DOE officials, there have been only a few opportunities for recipients to collect actual energy-savings data because in most cases actual data are only available after a project has been completed, and recipients are just beginning to complete projects. 
These officials said that instead of collecting actual energy-savings data, most recipients report estimates to comply with program reporting requirements. As part of the quarterly review process, DOE's monitoring plan appendix requires project officers to assess whether recipients' estimates of impact metrics, including energy savings, are reasonable[Footnote 27] and to determine whether grant recipients have an adequate procedure in place to collect, verify, and report these data. Project officers are further instructed, as part of this quarterly review process, to review recipients' reported impact metrics and determine whether they are within a range of values that DOE would expect. For example, DOE's EECBG desktop review guidance instructs project officers to consider rejecting recipients' quarterly reports in cases where the estimates entered by recipients appear too low or too high.

According to DOE officials, several factors affect the extent to which estimates are reasonable. One factor is the variance in the type and robustness of methodologies that recipients use to develop estimates. DOE guidance allows recipients flexibility in how they estimate impact metrics, such as energy savings. For example, DOE's Program Notice 10-07B requires that recipients "take care to account for other determinant factors (e.g., weather variation)" when they develop estimates. DOE officials said that they prefer that recipients rely on certain estimation methods and tools that DOE would expect to produce more accurate and sophisticated estimates of grant project results--such as contractor- or engineering-supplied estimates of project savings or estimates calculated with the Environmental Protection Agency's Portfolio Manager tool.[Footnote 28] For recipients that are not able to use methods or tools that are specific to their grant projects, DOE has provided recipients with tools that DOE officials believe are capable of calculating "high-level" estimates of energy-related impact metrics.[Footnote 29] Since the beginning of the EECBG program in 2009, DOE has updated and refined its estimating tool and underlying assumptions several times and may update the tool and assumptions in the future as well. While DOE officials said they believe that estimates calculated with the current version of this tool are more accurate than those calculated with previous versions, DOE does not require recipients who use its estimating tool to use the most updated version when calculating and reporting estimates. Consequently, some recipients may use DOE's earlier, less refined tool to develop estimates of energy-related impact metrics.

Another factor that affects DOE officials' ability to assess reasonableness is the fact that the agency recommends, but does not require, that recipients report the methods or tools used to calculate estimates, so DOE officials do not know which recipients are using older versions of DOE's estimation tool or other methods to estimate energy-related impact metrics. Knowing which method or version of a tool recipients used to calculate estimates may be more important for smaller grant recipients, as DOE officials told us they believe that smaller grant recipients who do not have the expertise at hand or resources to hire energy-efficiency experts are more likely to use a DOE-supplied estimation tool.
Without knowing the methods being used by recipients to estimate energy-related impacts, DOE cannot identify instances where the method along with the associated assumptions being used in calculating estimates may need to be more carefully reviewed. A third factor that DOE officials said can affect the development and reporting of reasonable estimates is the level of expertise available to recipients to develop impact metrics, as this can vary. According to DOE officials, some EECBG recipients are receiving federal grants for the very first time through the EECBG program, which was implemented in March 2009. Some communities may have limited (if any) direct experience with federal grant program requirements, and likewise have limited experience in measuring and reporting impact metrics. As a result, DOE officials reported that some recipients have had difficulty developing their estimates. In addition, DOE officials said that some recipients may also have made errors in reporting their estimates of energy-related impact metrics. For example, DOE officials said that some recipients may be incorrectly aggregating impact metrics from multiple project sites and, as a result, producing errors in the estimated energy-related impacts. In December 2010, DOE issued guidance to recipients outlining the steps DOE and recipients must take to formally close out EECBG grants. As part of this process, project officers are required to review recipients' final quarterly performance report and federal financial report for completeness and reasonableness. However, this guidance does not specify how project officers should assess reasonableness. DOE officials said that the agency is in the process of drafting internal closeout guidance for its project officers that will outline the procedures the project officers must follow to formally close out a grant. According to DOE officials, the upcoming internal closeout guidance intends to recommend, but not require, that recipients confirm they are reporting the most accurate data available at the time of closeout. DOE officials told us that once a grant is closed out, the agency does not require and cannot legally obligate the recipients to capture additional energy-savings data. For the few grants that have been closed, DOE expects its project officers to rely on their own expertise to assess the reasonableness of the estimates, according to DOE officials. For example, one project officer told us that as a project is completed, he determines whether the reported energy savings are reasonable by comparing the estimate when the project is completed to the recipient's original estimates, saying that he expects the estimates to be comparable unless the recipient has changed the scope of the project or received updated information from vendors or engineers. Even if the scope of the project has not changed or no new information has been provided, differences may be observed if recipients used an earlier version of the DOE tool to prepare an initial estimate and then used the updated DOE tool to compute the energy savings at the end of the project. Given that DOE may not know what method the recipient used to estimate energy savings, project officers may not be able to determine the level of review necessary to ensure reasonable reporting. Additionally, some of the project officers noted that they do not have the technical expertise to independently verify energy-savings estimates. 
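To illustrate the kind of estimate at issue, the sketch below works through a simplified lighting-retrofit calculation. It is a generic engineering approximation with hypothetical inputs; it is not DOE's estimating tool, the Portfolio Manager methodology, or any recipient's actual figures.

    def lighting_retrofit_estimate(fixtures, old_watts, new_watts,
                                   hours_per_year, price_per_kwh):
        """Estimate annual energy and cost savings from replacing light fixtures.

        A simplified illustration; real estimates account for many more factors,
        such as weather variation, controls, and maintenance effects."""
        kwh_saved = fixtures * (old_watts - new_watts) * hours_per_year / 1000.0
        return kwh_saved, kwh_saved * price_per_kwh

    # Hypothetical example: 200 streetlights, 250-watt fixtures replaced with
    # 100-watt LEDs, operating 4,100 hours per year at $0.10 per kilowatt-hour.
    kwh, dollars = lighting_retrofit_estimate(200, 250, 100, 4100, 0.10)
    print(f"Estimated savings: {kwh:,.0f} kWh and ${dollars:,.0f} per year")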
To determine overall program outcomes, DOE officials told us they plan to conduct a programwide evaluation, measurement, and verification study after the end of the EECBG program, which will be designed to measure and report the program's energy savings and cost savings. However, this effort is still in the design phase. As part of this study, the officials said that they would need to capture more than a year's worth of data to account for weather and seasonal variations that affect energy needs. While DOE collects some information regarding expected energy savings, officials noted that they were uncomfortable reporting these numbers before completing the study, which may not be finished until 2 to 3 years after the projects are completed.

Oversight of Recipient Reporting Data Quality Continues for the Sixth Round of Reporting:

To meet our mandate to comment on recipient reports, we have continued monitoring data that recipients reported for Recovery.gov, including data on jobs funded. This time we focused our review on the EECBG recipient data in addition to the national data. Analyzing these data can help in improving the accuracy and completeness of the Recovery.gov data and in planning analyses of recipient reports. Overall, this round's results were similar to those we observed in previous rounds. According to Recovery.gov as of January 30, 2011, recipients reported on over 209,400 awards across multiple programs, indicating that the Recovery Act funded approximately 585,654 jobs during the quarter beginning October 1, 2010, and ending December 31, 2010.[Footnote 30] This included 2,051 prime reports associated with EECBG recipients.[Footnote 31] As reported by the Recovery Accountability and Transparency Board, job calculations are based on the number of hours worked in a quarter and funded under the Recovery Act--expressed in FTEs.[Footnote 32]

Analysis of Sixth Round Recipient Reporting Data Shows Data Quality Remains Relatively Stable:

Using the sixth reporting period data, we continued our monitoring of errors or potential problems by repeating the analyses and edit checks reported in our previous reports. We reviewed 71,643 prime recipient report records from all programs posted on Recovery.gov for this sixth round. This was the first decrease in reporting--6,068 fewer prime recipient reports, or about an 8 percent drop from round five. The size of this decline in reporting was somewhat mitigated by the number of prime recipients reporting for the first time in round six. In round five, 7,465 recipients identified that round as their final report and did not report in round six. This was more than three times the number of prime recipients reporting for the first time in round six and suggests that further decreases in the number of recipients reporting in the next quarter are likely. For our analyses, in addition to this sixth round of recipient report data, we also used all the previous rounds of data as posted on Recovery.gov as of February 2, 2011.

In examining recipient reports, we continued to look for progress in addressing limitations we noted in our prior reports. In those prior rounds, we reviewed data logic and consistency and reviewed unusual or atypical data. Data logic and consistency provide information on whether the data are believable, given program guidelines and objectives; unusual or atypical data values indicate potential inaccuracies.
As with previous quarterly report rounds, these reviews included (1) the ability to link reports for the same project across quarters and (2) concerns in the data logic and consistency, such as reports marked final that show a significant portion of the award amount not spent. We continued to see minor variations in the number or percent of reports appearing atypical or showing some form of data discrepancy. For example, we continued to find a small number of prime recipient reports for which there were potential linkage issues across quarters. For this latest round, there was a slight increase, from 1.5 percent to 2.2 percent, in the percentage of prime reports appearing across all quarters that showed a skip in reporting for one or more quarters. This may affect the ability to track project funding and FTEs over quarters. The number of reports marked "final" for which there appeared to be some discrepancy, such as reports marked "final" but for which project status was marked as less than 50 percent completed, continued to be quite small and unchanged from the previous round.

We continued to examine the recipient reports' agency review flag field as part of our examination of data logic and consistency, since we have noted inconsistencies between agencies' accounts of their review process and the data shown in that field. Prime recipient report records include a review flag indicating whether or not a federal agency reviewed the record during the data quality review time frames. Prior analyses suggested that, for some agencies, the data in this field might not correctly reflect the extent of their review process. However, this did not seem to be the case for the EECBG program. EECBG program data in this field in this sixth round showed that 93 percent of the prime recipient reports were marked as reviewed by DOE, which was generally consistent with accounts of agency officials about their review process. However, we continue to observe some inconsistency when another data field on recipient reports, which shows whether or not a correction was initiated, is considered in conjunction with agency and recipient review flags. A correction could be initiated by either the prime recipient or the reviewing agency. Logically, one might expect that if a correction was made, it would have been initiated by a reviewer, and therefore the review flag should also be set to "yes." In this sixth round, as in the prior round, 10 percent of all prime recipient reports for all programs had this correction flag set to "yes" even though the review flags indicated that neither the agency nor prime recipient had reviewed those reports.

As part of our focus on EECBG recipient reports for this sixth round of reporting, we examined reported FTE data since they can provide insight into the use and impact of the Recovery Act funds. Recipient reports of FTEs, however, cover only direct jobs funded by the Recovery Act. They do not include the employment impact on suppliers (indirect jobs) or on the local community (induced jobs).[Footnote 33] Our analyses of EECBG reporting for the last five quarters showed a steady increase in the number of FTEs reported in each quarterly reporting period. As shown in figure 1, there was also a similar steady increase in the percent of EECBG recipients reporting funding at least a partial FTE with Recovery Act funds.

Figure 1: Portion of EECBG Recipients Reporting Funding at Least a Partial FTE in 2010:

[Refer to PDF for image: stacked horizontal bar graph]

Second reporting quarter (Oct.-Dec. 2009):
* Recipients reporting funding at least a partial FTE: 393 (21%);
* Recipients reporting no FTEs: 1,504 (79%);
* Total number of recipients: 1,897.

Third reporting quarter (Jan.-Mar. 2010):
* Recipients reporting funding at least a partial FTE: 685 (33%);
* Recipients reporting no FTEs: 1,375 (67%);
* Total number of recipients: 2,060.

Fourth reporting quarter (Apr.-June 2010):
* Recipients reporting funding at least a partial FTE: 934 (44%);
* Recipients reporting no FTEs: 1,186 (56%);
* Total number of recipients: 2,120.

Fifth reporting quarter (July-Sept. 2010):
* Recipients reporting funding at least a partial FTE: 1,167 (55%);
* Recipients reporting no FTEs: 948 (45%);
* Total number of recipients: 2,115.

Sixth reporting quarter (Oct.-Dec. 2010):
* Recipients reporting funding at least a partial FTE: 1,243 (61%);
* Recipients reporting no FTEs: 808 (39%);
* Total number of recipients: 2,051.

Source: GAO analysis of Recovery.gov data.

Note: We did not include data from the first reporting quarter due to concerns about reliability.

[End of figure]
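The FTE counts behind figure 1 follow the Recovery Act reporting convention of converting funded hours into full-time equivalents. The sketch below shows that calculation in its simplest form; the 520-hour quarterly full-time schedule (40 hours for 13 weeks) is used here only as an illustrative assumption.

    def recovery_act_fte(hours_worked_and_funded, full_time_hours_in_quarter=520):
        """Convert Recovery Act-funded hours for a quarter into FTEs.

        The FTE measure is the hours worked and funded under the Recovery Act
        divided by the hours in a full-time schedule for the quarter."""
        return hours_worked_and_funded / full_time_hours_in_quarter

    # Example: 780 funded hours in a quarter correspond to 1.5 FTEs.
    print(recovery_act_fte(780))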
For this report, we have chosen not to show the count of EECBG FTEs reported, out of concern about the comparability and reliability of the figures across quarterly reporting periods. As we noted in our September 2010 report, our field work had shown that the FTE calculations continued to be difficult for some recipients. Based on information gathered from DOE officials in headquarters and DOE project officers in the field, this concern, while still present, continues to be addressed by DOE officials at all levels. As we noted in September, some confusion may have existed about the acceptability and use of some methods for calculating FTEs over the course of the reporting periods.[Footnote 34] This decision is also based on some irregularities and inconsistencies we observed in our analyses of the FTE data across quarters and the relationship of the hours worked, as reported to DOE by recipients, with the FTE values the recipients directly reported to FederalReporting.gov.[Footnote 35],[Footnote 36] DOE officials indicated that they continue to assess compliance with, and encourage recipients to follow, the DOE and Office of Management and Budget (OMB) guidance on how to correctly report FTEs. Moving forward, as these issues in reporting methods are addressed, the comparability and reliability of the figures may improve.

Quality Reviews Performed on EECBG Data by DOE and Prime Recipients Included a Focus on Updated OMB Guidance Requirements:

Each quarter, DOE performs quality assurance steps on the data that recipients provide to FederalReporting.gov, including checks that are performed centrally across all of its Recovery Act programs and reviews done by EECBG project officers at the program level. Based on these reviews, DOE officials reported that most recipients of Recovery Act funds have reported to FederalReporting.gov in previous rounds and now understand the reporting process, resulting in the reporting proceeding more smoothly. As in previous rounds, DOE performed several checks of the data centrally as information became available. For example, officials compared the amount recipients reported as funds awarded with agency internal records. They also compared jobs data from DOE's PAGE reporting system with FTEs reported to FederalReporting.gov.
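A comparison of that kind can be expressed as a simple tolerance check across the two systems. The sketch below is a hypothetical illustration; the award identifiers, data structures, and tolerance are made up and do not represent DOE's actual procedure.

    def fte_discrepancies(page_fte_by_award, federal_reporting_fte_by_award,
                          tolerance=0.1):
        """Return award IDs whose PAGE-derived and FederalReporting.gov FTE
        values differ by more than `tolerance` FTE (illustrative only)."""
        flagged = []
        for award_id, page_fte in page_fte_by_award.items():
            reported = federal_reporting_fte_by_award.get(award_id)
            if reported is None or abs(page_fte - reported) > tolerance:
                flagged.append(award_id)
        return flagged

    # Hypothetical example: the second award would be flagged for follow-up.
    print(fte_discrepancies({"DE-0001": 2.0, "DE-0002": 5.5},
                            {"DE-0001": 2.0, "DE-0002": 3.0}))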
When discrepancies were found, project officers were instructed to contact recipients to make the necessary corrections. DOE followed up with grant recipients who did not report to FederalReporting.gov. For the sixth round, DOE reported 36 recipients to OMB as not in compliance. Of these, 34 are EECBG grant recipients. Several are tribal recipients that are in remote locations where reporting online is particularly challenging.

EECBG project officers' efforts also helped ensure the quality of information recipients reported to FederalReporting.gov. For example, one group of project officers we interviewed reported spending a large portion of time helping recipients complete reporting requirements and ensuring the quality of reports. Project officers cited helping recipients understand terminology, such as distinguishing between vendors and recipients of subawards. They reported taking steps such as following up when large increases in job numbers were reported, reports were missing, a recipient in a remote location had difficulty submitting reports, or recipients had questions about definitions.

DOE officials notified both recipients and reviewers, such as project officers, of the need to ensure that narrative descriptions met requirements laid out in OMB's September 2010 guidance.[Footnote 37] On September 29, 2010--a few days after OMB's guidance was released but before recipients started reporting for the quarter--DOE e-mailed both recipients and project officers instructions related to the guidance. The e-mail to recipients informed them of the need to provide sufficiently clear descriptions to facilitate the public's understanding and stated that overly general or unclear award descriptions could be considered material omissions. Similarly, the e-mail to reviewers restated the guidance. It instructed reviewers to make sure they read the descriptions in the narrative data fields and provide a comment to the recipient if they believed clarification was required. DOE also included this information in its webinars on recipient reporting designed for grant recipients and contractors. Further, it included a step in the reviewers' checklist to determine if the narrative descriptions provided clear and complete information on the award's purpose, scope, and activities.

DOE officials also reported that during the last three quarters' reviews they have focused on ensuring that reports marked "final" correctly reflect that status. They have reached out to educate recipients on what that designation means and to ensure that those marked "final" are correctly identified. This includes looking at the amount reported as spent. DOE's quality assurance process flags reports in which it appears the designation may not be correct based on financial analyses, and encourages recipients to make needed corrections during the continuous corrections process.

Conclusions:

The Recovery Act pledges unprecedented transparency and accountability in its use of funds. In light of this pledge, the ability of the EECBG program to ensure compliance with applicable laws, including the Recovery Act and program requirements, is critical and will help determine the extent to which the program is meeting Recovery Act and program goals. DOE and recipients are taking steps to monitor the use of funds to help ensure that Recovery Act and program requirements are met, but DOE assesses recipients' monitoring practices only in a limited number of cases.
Because of this limited assessment, DOE is not always able to identify when recipients' monitoring practices are sufficient to ensure compliance with applicable federal requirements. If DOE is not aware of recipients' monitoring practices, it cannot ensure that recipients have effective practices in place. In addition to ensuring that Recovery Act and program requirements are met, DOE must also be able to determine the extent to which the EECBG program is meeting Recovery Act and program goals for energy-related outcomes, such as energy savings. Because actual energy-savings data are often unavailable, DOE must rely on estimates. DOE takes some steps to assess the reasonableness of energy-related estimates, but without knowing which methodology or tool recipients used, it is difficult to do such an assessment. For example, without knowing whether recipients who used DOE's estimating tool--which has been revised in the past and may be revised again in the future--relied on the most recent version, and thus the best available information for calculating metrics, project officers cannot be sure that recipients used sound estimating methods. Without more information regarding recipients' estimating methods, DOE's assessment of the reasonableness of these estimates may not be sufficient to support the defensible development of programwide estimates of energy-related impacts and, therefore, the assessment of progress toward program goals.

Recommendations for Executive Action:

To better ensure that EECBG funds are used to meet Recovery Act and program goals, we are recommending that the Secretary of Energy take the following two actions:

* Explore a means to capture information on the monitoring processes of all recipients to make certain that recipients have effective monitoring practices.

* Solicit information from recipients regarding the methodology they used to calculate their energy-related impact metrics and verify that recipients who use DOE's estimation tool use the most recent version when calculating these metrics.

Agency Comments and Our Evaluation:

We provided a draft of this report to the Department of Energy for review and comment. DOE's comments are reproduced in appendix II. DOE agreed with GAO's recommendations, stating that "implementing the report's recommendations will help ensure that the Program continues to be well managed and executed." DOE also provided additional information on steps it has initiated or planned to implement. In particular, with respect to our first recommendation, DOE elaborated on additional monitoring practices it performs over high dollar value grant recipients, such as its reliance on audit results obtained in accordance with the Single Audit Act and its update to the EECBG program requirements in the Compliance Supplement to OMB Circular No. A-133. However, these monitoring practices focus only on larger grant recipients, and we believe that the program could be more effectively monitored if DOE captured information on the monitoring practices of all recipients.

We are sending copies of this report to appropriate congressional committees, the Secretary of Energy, the Director of the Office of Management and Budget, and other interested parties. The report will also be available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. If you or your staffs have any questions about this report, please contact Mark E. Gaffigan at (202) 512-3841 or gaffiganm@gao.gov, or Yvonne D. Jones at (202) 512-6806 or jonesy@gao.gov.
Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix IV.

Signed by:

Mark E. Gaffigan
Managing Director, Natural Resources and Environment

Signed by:

Yvonne D. Jones
Director, Strategic Issues

List of Committees:

The Honorable Daniel Inouye, Chairman
The Honorable Thad Cochran, Vice Chairman
Committee on Appropriations
United States Senate

The Honorable Harold Rogers, Chairman
The Honorable Norman D. Dicks, Ranking Member
Committee on Appropriations
House of Representatives

The Honorable Joseph I. Lieberman, Chairman
The Honorable Susan M. Collins, Ranking Member
Committee on Homeland Security and Governmental Affairs
United States Senate

The Honorable Darrell E. Issa, Chairman
The Honorable Elijah Cummings, Ranking Member
Committee on Oversight and Government Reform
House of Representatives

The Honorable Fred Upton, Chairman
The Honorable Henry Waxman, Ranking Member
Committee on Energy and Commerce
House of Representatives

The Honorable Jeff Bingaman, Chairman
The Honorable Lisa Murkowski, Ranking Member
Committee on Energy and Natural Resources
United States Senate

[End of section]

Appendix I: Objectives, Scope, and Methodology:

Objectives:

We took a number of steps to address our objectives, which were to determine (1) how Energy Efficiency and Conservation Block Grant (EECBG) funds are being used and what challenges, if any, EECBG recipients face in obligating and spending their funds; (2) what actions Department of Energy (DOE) officials and EECBG recipients are taking to provide oversight of EECBG funds and what challenges, if any, they face in meeting Recovery Act and other requirements; (3) the extent to which EECBG program recipients and the EECBG program are meeting Recovery Act and EECBG program goals for energy savings and what challenges, if any, recipients have encountered in measuring and reporting energy savings; and (4) how the quality of estimates of jobs created and retained reported by Recovery Act recipients, particularly EECBG recipients, has changed over time.

Scope and Methodology:

To address these objectives, we reviewed and analyzed relevant federal laws and regulations, as well as federal agency guidance related to program goals, use of funds, monitoring, and reporting outcomes. We interviewed agency officials and about 30 staff members in DOE field offices that have some role in managing and monitoring awards--including project officers, technical monitors, contract specialists and contractors, and compliance officers, as well as those that provide direct support, including attorneys and accountants--to discuss the roles and responsibilities for managing awards; project management guidance and communication; and activities undertaken and challenges faced in providing support to recipients with obligating and spending funds, monitoring, and reporting outcomes. We also met with officials from the DOE Office of Inspector General to better understand their role in the expanded grant-awarding process and in monitoring recipients. We also interviewed representatives from associations and organizations, including the National Association of Counties (NACo), the National Association of State Energy Officials (NASEO), and the U.S. Conference of Mayors (USCM), to better understand best practices identified and challenges faced by recipients in using funds, monitoring projects, and reporting outcomes.
We also reviewed reports prepared by DOE and analyzed data from DOE's iPortal database on the number of grant awards, amount of obligations and expenditures, and number of users. In addition, we reviewed reports prepared by DOE and analyzed data from DOE's PAGE database on the number and type of program activities. We assessed the reliability of the program data we used by reviewing DOE documentation and Inspector General reports on the Performance and Accountability for Grants in Energy (PAGE) system; interviewing knowledgeable DOE officials about the quality and potential limitations of the data and what checks and controls were in place to ensure data accuracy; and performing edit checks on iPortal and PAGE data. We determined the data were sufficiently reliable for our purposes. In addition, we developed a set of questions to be administered to a nonprobability sample of 50 grant recipients eligible to receive formula funding. These questions addressed various aspects of our first three objectives, such as obligating and spending funds, guidance, best practices, internal controls, monitoring, working with DOE officials, and challenges faced in implementing projects. We pre- tested and revised these questions and sent them in an e-mail to 50 EECBG grant recipients in cities and counties in the United States. Using October 2010 data from DOE's iPortal and PAGE information systems, and from data gathered from Recovery.gov, we purposefully identified and selected a sample of city, county, and tribal recipients that included a range of grants by project activity types (e.g., building retrofit, incentive program); award size;[Footnote 38] state; different DOE project officers and monitors; and different stages of completion. We received responses from 25 of the 50 city and county grant recipients. No tribal communities responded. The responses from recipients in this sample are not generalizable to the 2,185 state, city, county and tribal recipients receiving formula EECBG funds nationwide. To obtain additional information regarding our objective addressing program goals for saving energy and reporting those savings, we selected another nonprobability sample of 41 EECBG grant recipients and sent them a similar e-mail questionnaire. We selected a range of recipients for this sample similar to the sample previously described. In addition, however, these recipients were selected because they had completed many, if not all, projects in their award as of October 2010. We selected these recipients in order to obtain information on best practices, strengths and weaknesses in reporting outcomes and challenges in measuring jobs, and cost and energy savings. The questionnaire sent to this sample of recipients included questions addressing these topics along with a few of the same questions asked of the other recipient sample. We received responses from 24 of the 41 grant recipients in the sample. One tribal grant recipient in our sample responded. The responses from this sample are also not generalizable to the population of EECBG recipients nationwide. In making our selection of grant recipients in both samples, we did not include grant recipients whose grant applications or awards were the subject of data collection efforts for previous GAO or DOE Inspector General reports. For both samples, we sent follow-up e-mails to recipients who did not respond after several weeks to our initial e- mail, encouraging them to complete the questionnaire. 
We were not able to conduct further follow-up activities to improve the response rate in the limited time remaining for us to complete our data collection field work. We reviewed responses to questions on guidance and experiences in obligating and spending funds, oversight and monitoring efforts, and reporting outcomes as well as best practices and challenges faced in managing and monitoring projects. The recipient reporting section of this report responds to the Recovery Act's mandate that we comment on the estimates of jobs created or retained by direct recipients of Recovery Act funds. For our review of the sixth submission of recipient reports, covering the period from October 1, 2010, through December 31, 2010, we built on findings from our five prior reviews of the reports, covering the period from February 2009 through September 30, 2010. To understand how the quality of jobs data reported by EECBG grant recipients has changed over time, we compared the six quarters of recipient reporting data that were publicly available at Recovery.gov on February 2, 2011. We performed edit checks and other analyses on EECBG grant recipient reports, which included matching DOE-provided data from iPortal and PAGE information systems on EECBG recipients. As part of that matching process, we also examined the reliability of recipient data contained in these DOE information systems. Our assessment activities included reviewing documentation of system processes, Inspector General reviews of the systems and conducting logic tests for key variables. Our matches showed a high degree of agreement between DOE recipient information and the information reported by recipients directly to FederalReporting.gov. However, the magnitude of the differences or lack of agreement with regard to the full-time equivalents (FTE) are not insignificant.[Footnote 39] In general, we consider the data used to be sufficiently reliable, with attribution to official sources for the purposes of providing background information and a general sense of the status of EECBG recipient reporting. To update the status of open recommendations from previous bimonthly and recipient reporting reviews, we obtained information from agency officials on actions taken in response to the recommendations. We conducted this performance audit from September 2010 to April 2011 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. [End of section] Appendix II: Comments from the Department of Energy: Department of Energy: Washington, DC 20585: March 17, 2011: Mark Gaffigan: Managing Director: Natural Resources and Environment: US Government Accountability Office: 441 G Street Room 2P47: Washington DC 20548: Dear Mr. Gaffigan: The Office of Energy Efficiency and Renewable Energy (EERE) appreciates the opportunity to review and comment on the results of the GAO Report to Congress "Energy Efficiency and Conservation Block Grant Recipients Face Challenges Meeting Legislative and Program Goals and Requirements". EERE management is committed to continuing to improve the Energy Efficiency and Conservation Block Grant (EECBG) Program. 
Energy efficiency and conservation measures implemented at the local level are important ways to save taxpayer dollars, reduce demand for polluting sources of energy, and reduce America's dependence on fossil fuels. Our partnership with local communities in executing Energy Block Grants is creating clean energy jobs that are contributing to the country's economic growth and recovery. Thanks to this program, new alliances have been forged between local businesses, non-profits, and governments as we transition into a new clean energy economy. Through the adoption of renewable energy and energy efficiency technologies, communities of every size are addressing local energy challenges and saving money by saving energy. The EECBG program has seen enormous success: to date, we have injected nearly $1 billion into the economy. The direct jobs count continues to rise--over 3,400 jobs in Q3 of CY 2010 and over 5,700 jobs in Q4 of CY 2010, a 70% increase over the previous quarter. Grantees are reporting significant impacts on their communities through implementation of their EECBG sustainability strategies. According to initial grantee self-reporting through December 2010, communities have upgraded the energy efficiency of over 10,000 buildings; installed 40,000 efficient street lights; and upgraded more than 100,000 traffic signals.[Footnote 1] Americans are at work making schools, courthouses, and police stations run more efficiently, developing electric charging station networks, upgrading homes and businesses, and installing clean energy solutions like solar panels and waste-to-energy power plants. This translates to millions of dollars of savings for taxpayers today and billions into the future. Over 2,300 communities, from America's rural counties and tribal organizations to our largest cities, and every state and U.S. Territory are moving to a more secure economic and energy future. EERE would also like to comment on the reporting requirements that GAO discusses in the report. Across the Recovery Act, programs are being implemented with unprecedented transparency, and each program must balance the need for detailed reporting to ensure that taxpayer dollars are effectively put to work with the level of burden on grantees. The EECBG program is one of the largest grant programs ever run by the federal government (as measured by the number of grantees), with 2,300 recipients and over 10,000 sub-recipients. Many of these recipients and sub-recipients are receiving federal grant money for the first time. While DOE does not have the resources to visit each recipient, sub-recipient, and project site, we do require recipients to file short but detailed reports on project activities to ensure compliance with EECBG grant terms and conditions. This allows DOE to track and monitor a wide breadth of recipients to ensure they are effectively spending taxpayer dollars and complying with requirements. There are many reporting requirements that are independent of the EECBG-specific requirements. For example, GAO notes the case of an EECBG recipient who canceled a geothermal project because of "reporting requirements" (p. 14). In general, geothermal projects funded with Federal dollars, particularly ARRA funds, may trigger a variety of requirements that necessitate documentation or reporting by the recipient--for example, NEPA, Davis-Bacon, and Buy American requirements.
The additional EECBG reporting requirements are nominal and do not differ based on the activity type, so this particular project would not have required any more EECBG reporting than any other activity. For reporting requirements specific to the EECBG program, EERE solicits regular feedback from grantees, stakeholders, and project officers with the aim of continuously improving our operations, including reporting. Such feedback led EERE to make changes that lowered the reporting burden on grantees while improving the quality of reports the program received. For example, EERE significantly scaled back our monthly reporting requirements. EERE also incorporated lessons learned from formula grant recipients into the reporting requirements for competitive grant recipients. Below are EERE's responses to GAO's two recommendations. Recommendation 1: Explore a means to capture information on the monitoring processes of all recipients to make certain that recipients have effective monitoring practices. The EECBG program manages an unprecedented number of grantees in what is essentially a startup program, with grants ranging in size from $25,000 to $80 million. We agree with GAO's position that EECBG should establish effective monitoring practices that ensure funds are properly spent and controls are in place to prevent waste, fraud, and abuse. However, information gathering and review must be balanced with agency resources. EECBG adopted standard, well-accepted audit sampling practices to achieve the most effective coverage in verifying monitoring practices. These practices include 100% sampling of reviews for high-dollar-value grantees (allocations greater than or equal to $1 million). These are typically our most complex grants, with multiple sub-recipients. EECBG monitors the remaining set of grantees based on the size, complexity, and risk profile of the grantees. Additional on-site reviews are undertaken for grants that demonstrate potential at-risk behaviors, such as not filing quarterly reports, reporting errors or inconsistencies, or a lack of responsiveness to project officers and other EERE personnel, as well as for issues identified through desktop reviews, which are conducted on 100% of grantees quarterly. As of March 7, 2011, EERE has conducted onsite monitoring of 483 grant recipients with a combined allocation of just over $2 billion, or approximately 75% of the EECBG formula program's total $2.7 billion allocation. EERE plans to conduct at least one onsite visit to each recipient with an allocation of $1 million or greater, which accounts for 79% of EECBG's total allocation. The number will increase to over 80% as EERE monitors conduct additional onsite visits. EERE performs onsite monitoring of grantees with allocations of $2 million or greater at least once per year. EERE project officers complete a monitoring checklist and file a narrative report after each onsite monitoring visit. These monitoring checklists and narrative reports provide EERE with information on recipient monitoring practices. During the 483 onsite visits conducted to date, EERE identified a handful of cases where recipients' monitoring of sub-recipients/subcontractors needed improvement. In these cases, EERE informed the recipients of the need to improve and, where appropriate, instituted a Corrective Action Plan (CAP) to help ensure that recipients take the necessary steps to remediate issues. We then continue to track the status of the CAPs at a program level to ensure that issues are driven to resolution.
EERE also has issued guidance on sub-recipient monitoring (EECBG Program Notice 10-019, October 26, 2010). DOE's guidance includes both general principles and specific examples of tools and best practices developed by EECBG recipients. Recipients were encouraged to incorporate both the general principles and the specific tools/best practices into their own monitoring activities. As part of EERE's comprehensive oversight and monitoring approach, we will continue to rely on the results of audits performed in accordance with Office of Management and Budget (OMB) Circular A-133 (single audits). All of the larger EECBG recipients are subject to the A-133 audit requirement, the scope of which includes a review of sub-recipient monitoring practices. For the current fiscal year, EERE is actively working with OMB to include a specific discussion of EECBG program requirements in OMB's annual guidance for the A-133 program in order to strengthen the audit oversight of the EECBG program. Recommendation 2: Solicit information from recipients regarding the methodology they used to calculate their energy-related impact metrics and verify that recipients who use DOE's estimation tool use the most recent version when calculating these metrics. The Department of Energy and recipients use a variety of tools and scientific approaches to estimate the energy savings of specific activities. These tools and approaches include engineering estimates, individually metered equipment, utility bill tracking, and estimation tools such as the calculator made available to grantees by EECBG. EECBG grants represent over 10,000 unique activities, ranging in geography, approach, technology, and size. Due to the diversity of these grants, it is not practical to prescribe one form of measurement and verification to all grantees. EERE has issued guidance that provides grantees with suggested guidelines to plan and conduct appropriate Evaluation, Measurement & Verification (EM&V) efforts for their Recovery-Act-funded EECBG programs and activities.[Footnote 2] It is important that the results achieved with funds provided by the Recovery Act be documented and assessed to the extent practicable for grantees. In order to provide a reasonable estimate of energy savings, the program reviews energy process and impact metrics submitted each quarter for reasonableness, works with grantees to correct unreasonable metrics, and continues to work with grantees through closeout to refine metrics. EECBG will take a scientific approach to overall program evaluation during the formal evaluation process at the conclusion of the program. EERE is deeply committed to the success of the Energy Efficiency and Conservation Block Grant Program and believes that implementing the report's recommendations will help ensure that the Program continues to be well managed and executed. The Department will be pleased to provide any additional documentation upon request. Thank you for the opportunity to comment on the report. We look forward to working with GAO to help achieve the goals of the American Recovery and Reinvestment Act. If you have any questions, please contact Ms. Martha Oliver, Office of Congressional and Intergovernmental Affairs, at (202) 586-2229. Sincerely, Signed by: [Illegible] for: Kathleen B.
Hogan: Deputy Assistant Secretary for Energy Efficiency: Office of Technology Development: Energy Efficiency and Renewable Energy: Appendix II Footnotes: [1] These numbers reflect preliminary grantee-reported metrics that have not necessarily been audited by a third party or the US Department of Energy. [2] EECBG Program Notice 10-017. [End of section] Appendix III: Status of Prior Open Recommendations and Matters for Congressional Consideration: In this appendix, we update the status of agencies' efforts to implement the 26 open recommendations and 5 newly implemented recommendations from our previous bimonthly and recipient reporting reviews.[Footnote 40] New recommendations and agency responses to those recommendations are included in the program section of this report. Recommendations that were listed as implemented or closed in a prior report are not repeated here. Lastly, we address the status of our Matters for Congressional Consideration. Department of Energy: Open Recommendations:[Footnote 41] Given the concerns we have raised about whether program requirements were being met, we recommended in May 2010 that the Department of Energy (DOE), in conjunction with both state and local weatherization agencies, develop and clarify weatherization program guidance that: * clarifies the specific methodology for calculating the average cost per home weatherized to ensure that the maximum average cost limit is applied as intended. * accelerates current DOE efforts to develop national standards for weatherization training, certification, and accreditation, which is currently expected to take 2 years to complete. * develops a best practice guide for key internal controls that should be present at the local weatherization agency level to ensure compliance with key program requirements. * sets time frames for development and implementation of state monitoring programs. * revisits the various methodologies used in determining the weatherization work that should be performed based on the consideration of cost-effectiveness and develops standard methodologies that ensure that priority is given to the most cost-effective weatherization work. To validate any methodologies created, this effort should include the development of standards for accurately measuring the long-term energy savings resulting from weatherization work conducted. In addition, given that state and local agencies have felt pressure to meet a large increase in production targets while effectively meeting program requirements and have experienced some confusion over production targets, funding obligations, and associated consequences for not meeting production and funding goals, we recommended that DOE clarify its production targets, funding deadlines, and associated consequences while providing a balanced emphasis on the importance of meeting program requirements. Agency Actions: DOE generally concurred with all of our recommendations and has begun to take some actions to implement them. With regard to clarifying the methodology to calculate the average cost per home weatherized, DOE has taken some action but has not yet provided specific guidance to clarify this methodology. In response to our recommendation to develop and clarify guidance on developing national standards for weatherization training, certification, and accreditation, according to DOE officials, DOE is making progress toward advancing such standards.
For example, DOE and the Department of Labor released the draft "Workforce Guidelines for Home Energy Upgrades" for single-family homes in November 2010. DOE officials expect to finalize the guidelines by early spring 2011. DOE has taken some steps to address our recommendation that it develop and clarify guidance to generate a best practice guide for key internal controls. According to officials, the Weatherization Assistance Program Technical Assistance Center Web site provides a variety of best practices on program management, administrative procedures, and technical standards. However, while the Web site is a central repository for all relevant resource documents, DOE has not created a dedicated guide on best practices for key internal controls. In response to our recommendation to develop and clarify guidance to set time frames for development and implementation of state monitoring programs, DOE has taken limited action. DOE officials provided current guidance available on state monitoring efforts but did not identify any time frames for development or implementation of state monitoring programs. With regard to our recommendation on developing and clarifying guidance for prioritizing cost-effective weatherization work, DOE has taken some actions. For example, DOE contracted with the Oak Ridge National Laboratory in 2010 to conduct an assessment of aspects of program performance such as costs and benefits for program years 2008 to 2010. The assessment will cover both Recovery Act funds and annual appropriation funds. Preliminary results may be available in late spring 2011. In response to our recommendation that DOE clarify its production targets, funding deadlines, and associated consequences, DOE has taken some steps. According to officials, DOE has communicated directly with recipients about funding, production, and other priorities. For example, the Green Light program is in its fourth round of communications between two DOE offices and recipients. DOE officials cited these calls as helping to identify barriers preventing grantees from increasing production and expenditures. In addition, DOE officials stated that grantees were notified of the requirement to spend all Recovery Act funds by March 31, 2012. However, DOE provided no evidence that it has clarified the consequences. Newly Implemented Recommendations:[Footnote 42] We recommended that DOE, in conjunction with both state and local weatherization agencies, develop and clarify weatherization program guidance that: * establishes best practices for how income eligibility should be determined and documented and issues specific guidance that does not allow the self-certification of income by applicants to be the sole method of documenting income eligibility. * considers and addresses how the weatherization program guidance is impacted by the introduction of increased amounts of multifamily units. Agency Actions: DOE agreed with both of our recommendations and has taken action to implement them. In response to our recommendation on issuing guidance and establishing best practices to determine income eligibility, DOE issued guidance--Weatherization Program Notice 10-18, 2010 Poverty Income Guidelines and Definition of Income--on September 20, 2010. In this guidance, DOE clarified the definition of income and strengthened income eligibility requirements.
For example, the guidance clarified that self-certification of income would only be allowed after all other avenues of documenting income eligibility are exhausted. Additionally, for individuals to self-certify income, a notarized statement indicating the lack of other proof of income is required. Regarding our recommendation on weatherization program guidance for multifamily units, DOE officials identified several issues arising from the increased number of multifamily buildings to be weatherized and issued several guidance documents addressing multifamily buildings. For example, DOE issued Weatherization Program Notice 11-1, Program Year 2011 Weatherization Grant Guidance, in December 2010, which contained two sections related to multifamily units. One section covered the eligibility of multifamily units for the weatherization program, and the other section provided guidance on conducting energy audits on multifamily units. In reviewing the recently issued program guidance for both recommendations, we have concluded that DOE addressed the intent of these recommendations. Environmental Protection Agency: Open Recommendation:[Footnote 43] We recommended that the Environmental Protection Agency (EPA) Administrator work with the states to implement specific oversight procedures to monitor and ensure subrecipients' compliance with the provisions of the Recovery Act-funded Clean Water and Drinking Water State Revolving Fund (SRF) program. Agency Actions: In response to our recommendation, EPA provided additional guidance to the states regarding their oversight responsibilities, with an emphasis on enhancing site-specific monitoring and inspections. Specifically, in June 2010, the agency developed and issued an oversight plan outline for Recovery Act projects that provides guidance on the frequency, content, and documentation related to regional reviews of state Recovery Act programs and regional and state reviews of specific Recovery Act projects. For example, EPA's guidance states that regions and states should review the items included on the EPA "State ARRA Inspection Checklist" or use a state equivalent that covers the same topics. The plan also describes EPA headquarters' role in ongoing Recovery Act oversight and plans for additional webcasts. EPA also reiterated that contractors are available to provide training and to assist with file reviews and site inspections. We are undertaking further review of the states' use of Recovery Act funds for the Clean Water and Drinking Water programs. As part of that work, we will consider EPA's and the states' oversight of Recovery Act funds and, more specifically, progress in implementing EPA's guidance. Department of Health and Human Services: Office of Head Start: Open Recommendation:[Footnote 44] To facilitate understanding of whether regional decisions regarding waivers of the program's matching requirement are consistent with Recovery Act grantees' needs across regions, we recommended that the Director of the Office of Head Start (OHS) should regularly review waivers of the nonfederal matching requirement and associated justifications. Agency Actions: OHS has not conducted a review of waivers of the nonfederal matching requirement, but OHS officials stated that the variation across regions is largely due to differences in regional policies on timing: some regional offices grant waivers at the same time that the grant is made official, whereas other regions grant waivers later.
OHS officials stated that although the OHS central office has not regularly reviewed grantees' justifications for waiver applications for regional variability in the past, they are looking into tracking this data in their Web-based system consistently across regions. The process of tracking waivers is not yet complete. Open Recommendation:[Footnote 45] To oversee the extent to which grantees are meeting the program goal of providing services to children and families and to better track the initiation of services under the Recovery Act, we recommended that the Director of OHS should collect data on the extent to which children and pregnant women actually receive services from Head Start and Early Head Start grantees. Agency Actions: The Department of Health and Human Services (HHS) disagreed with our recommendation. OHS officials stated that attendance data are adequately examined in triennial or yearly on-site reviews and in periodic risk management meetings. Because these reviews and meetings do not collect or report data on service provision, we continue to believe that tracking services to children and families is an important measure of the work undertaken by Head Start and Early Head Start service providers. Open Recommendation:[Footnote 46] To help ensure that grantees report consistent enrollment figures, we recommended that the Director of OHS should better communicate a consistent definition of "enrollment" to grantees for monthly and yearly reporting and begin verifying grantees' definition of "enrollment" during triennial reviews. Agency Actions: OHS issued informal guidance on its Web site clarifying monthly reporting requirements, but has not clarified yearly reporting requirements. Open Recommendation:[Footnote 47] To provide grantees consistent information on how and when they will be expected to obligate and expend federal funds, we recommended that the Director of OHS should clearly communicate its policy to grantees for carrying over or extending the use of Recovery Act funds from one fiscal year into the next. Agency Actions: HHS indicated that OHS will issue guidance to grantees on obligation and expenditure requirements, as well as improve efforts to effectively communicate the mechanisms in place for grantees to meet the requirements for obligation and expenditure of funds. Open Recommendation:[Footnote 48] To better consider known risks in scoping and staffing required reviews of Recovery Act grantees, we recommended that the Director of OHS should direct OHS regional offices to consistently perform and document Risk Management Meetings and incorporate known risks, including financial management risks, into the process for staffing and conducting reviews. Agency Actions: HHS reported OHS is reviewing the risk management process to ensure it is consistently performed and documented in its centralized data system and that it has taken related steps, such as requiring the grant officer to identify known or suspected risks prior to an on-site review. Department of Housing and Urban Development: Open Recommendation:[Footnote 49] Because the absence of third-party investors reduces the amount of overall scrutiny Tax Credit Assistance Program (TCAP) projects would receive and the Department of Housing and Urban Development (HUD) is currently not aware of how many projects lacked third-party investors, we recommended that HUD should develop a risk-based plan for its role in overseeing TCAP projects that recognizes the level of oversight provided by others. 
Agency Actions: HUD responded to our recommendation by saying it will identify projects that are not funded with HOME Investment Partnerships Program funds and projects that have a nominal tax credit award. However, HUD said it will not be able to identify these projects until it can access the data needed to perform the analysis, and it does not receive access to those data until after projects have been completed. HUD currently has not taken any action on this recommendation because it only has data on the small percentage of projects completed to date. It is too early in the process to be able to identify projects that lack third-party investors. The agency will take action once it is able to collect the necessary information from the project owners and the state housing finance agencies. Department of Labor: Open Recommendations:[Footnote 50] To enhance the Department of Labor's (Labor) ability to manage its Recovery Act and regular Workforce Investment Act (WIA) formula grants and to build on its efforts to improve the accuracy and consistency of financial reporting, we recommended that the Secretary of Labor take the following actions: * To determine the extent and nature of reporting inconsistencies across the states and better target technical assistance, conduct a one-time assessment of financial reports that examines whether each state's reported data on obligations meet Labor's requirements. * To enhance state accountability and to facilitate their progress in making reporting improvements, routinely review states' reporting on obligations during regular state comprehensive reviews. Agency Actions: Labor agreed with both of our recommendations and has begun to take some actions to implement them. To determine the extent of reporting inconsistencies, Labor awarded a contract in September 2010 to perform an assessment of state financial reports to examine whether the data reported are accurate and reflect Labor's guidance on reporting of obligations and expenditures. Labor plans to begin interviewing states in February 2011 and will issue a report after the interviews are completed and analyzed. To enhance states' accountability and facilitate their progress in making improvements in reporting, Labor has drafted guidance on the definitions of key financial terms such as obligations, which is currently in final clearance. After the guidance is issued, Labor plans to conduct a systemwide webinar on this topic. Open Recommendation:[Footnote 51] Our September 2009 bimonthly report identified a need for additional federal guidance in defining green jobs, and we made the following recommendation to the Secretary of Labor: * To better support state and local efforts to provide youth with employment and training in green jobs, provide additional guidance about the nature of these jobs and the strategies that could be used to prepare youth for careers in green industries. Agency Actions: Labor agreed with our recommendation and has begun to take several actions to implement it. Labor's Bureau of Labor Statistics has developed a definition of green jobs, which was finalized and published in the Federal Register on September 21, 2010. In addition, Labor continues to host a Green Jobs Community of Practice, an online virtual community available to all interested parties.
As part of this effort, in December 2010, Labor hosted its first Recovery Act Grantee Technical Assistance Institute, which focused on critical success factors for achieving the goals of the grants and sustaining the impact into the future. The department also plans to host a symposium in late Spring 2011 with the green jobs state Labor Market Information Improvement grantees. The symposium will share recent research and other promising practices to inform workforce development and training strategies. In addition, the department anticipates releasing its Internet-based Occupational Information Network (O*NET) Career Profiler tool in the winter of 2011 for those new to the workforce. This tool includes the O*NET green leaf symbol to highlight green occupations. Furthermore, the department's implementation study of the Recovery Act-funded green jobs training grants is still ongoing. The interim report is expected in late 2011. Newly Implemented Recommendation:[Footnote 52] Our September 2009 bimonthly report identified a need for additional federal guidance in measuring the work readiness of youth and we made the following recommendation to the Secretary of Labor: * To enhance the usefulness of data on work readiness outcomes, provide additional guidance on how to measure work readiness of youth, with a goal of improving the comparability and rigor of the measure. Agency Actions: Labor agreed with our recommendation and has taken steps to implement it. Labor issued guidance in May and August 2010 to identify requirements for measuring the work readiness of youth and the methodology for implementing the work readiness indicators for the WIA Youth Program. The guidance clarified the changes to the definition of work readiness by requiring a worksite evaluation conducted by the employer. Executive Office of the President: Office of Management and Budget: Open Recommendation: To leverage Single Audits as an effective oversight tool for Recovery Act programs, we recommended that the Director of the Office of Management and Budget (OMB): 1. provide more direct focus on Recovery Act programs through the Single Audit to help ensure that smaller programs with higher risk have audit coverage in the area of internal controls and compliance;[Footnote 53] 2. take additional efforts to provide more timely reporting on internal controls for Recovery Act programs for 2010 and beyond; [Footnote 54] 3. evaluate options for providing relief related to audit requirements for low-risk programs to balance new audit responsibilities associated with the Recovery Act;[Footnote 55] 4. issue Single Audit guidance in a timely manner so that auditors can efficiently plan their audit work;[Footnote 56] 5. issue the OMB Circular No. A-133 Compliance Supplement no later than March 31 of each year;[Footnote 57] 6. explore alternatives to help ensure that federal awarding agencies provide their management decisions on the corrective action plans in a timely manner;[Footnote 58] and: 7. 
shorten the time frames required for issuing management decisions by federal agencies to grant recipients.[Footnote 59] Agency Actions: (1) To provide more direct focus on Recovery Act programs to help ensure that smaller programs with higher risk have audit coverage in the area of internal controls and compliance through the Single Audit, OMB updated its single audit guidance in the OMB Circular A-133, Audits of States, Local Governments, and Non-Profit Organizations Compliance Supplement in July 2010.[Footnote 60] This compliance supplement requires auditors to consider all federal programs with expenditures of Recovery Act awards to be programs with higher risk when performing standard risk-based tests to select programs to be audited. The compliance supplement also clarified information to assist auditors in determining the appropriate risk levels for programs with Recovery Act expenditures. OMB officials have stated that they are in the process of completing the 2011 Compliance Supplement, which they expected to issue by March 31, 2011. As of April 4, 2011, the 2011 Compliance Supplement had not yet been issued. They also stated that this compliance supplement will continue to provide guidance that addresses some of the higher risks inherent in Recovery Act programs. The most significant of these risks are associated with newer programs that may not yet have the internal controls and accounting systems in place to help ensure that funds are distributed and used in accordance with program regulations and objectives. Since Recovery Act spending is projected to continue through 2016, we believe that it is essential that OMB provide direction in Single Audit guidance so that some smaller programs with higher risk would not be automatically excluded from receiving audit coverage based upon the requirements in the Single Audit Act. In recent discussions with OMB officials, we communicated our concern that future Single Audit guidance provide instruction that helps to ensure that smaller programs with higher risk have audit coverage in the area of internal controls and compliance. OMB officials agreed and stated that they plan to continue including similar language in the Compliance Supplement and performing outreach training throughout the duration of the Recovery Act. (2) To address the recommendation for taking additional efforts to encourage more timely reporting on internal controls for Recovery Act programs for 2010 and beyond, OMB commenced a second voluntary Single Audit Internal Control Project (project) in August 2010 for states that received Recovery Act funds in fiscal year 2010.[Footnote 61] Similar to the prior project (which did not start until October 2009), one of the project's goals is to achieve more timely communication of internal control deficiencies for higher-risk Recovery Act programs so that corrective action can be taken more quickly. Specifically, the project encourages participating auditors of states that received Recovery Act funds to identify and communicate deficiencies in internal control to management 3 months sooner than the 9-month time frame currently required under OMB Circular No. A-133.
The project also requires that management provide, 2 months earlier than required under statute, plans for correcting internal control deficiencies to the cognizant agency for audit for immediate distribution to the appropriate federal agencies.[Footnote 62] The federal agency is then to provide its concerns relating to management's plan of corrective actions in a written decision as promptly as possible and no later than 90 days after the corrective action plan is received by the cognizant agency for audit. According to OMB officials, 14 states volunteered to participate in the second project. Each participating state was to select a minimum of four Recovery Act programs for inclusion in the project. We assessed the results of the first OMB Single Audit Internal Control Project for fiscal year 2009 and found that it was helpful in communicating internal control deficiencies earlier than required under statute. We reported that 16 states participated in the first project and that the states selected at least two Recovery Act programs for the project. We also reported that the project's dependence on voluntary participation limited its scope and coverage and that voluntary participation may also bias the project's results by excluding from analysis states or auditors with practices that cannot accommodate the project's requirement for early reporting of control deficiencies. Overall, we concluded that although the project's coverage could have been more comprehensive, the analysis of the project's results provided meaningful information to OMB for better oversight of the Recovery Act programs selected and information for making future improvements to the Single Audit guidance. OMB's second Single Audit Internal Control Project is in progress, and its planned completion date is June 2011. OMB plans to assess the project's results after its completion date. As of February 9, 2011, OMB officials have stated that the 14 participating states have met the milestones for submitting interim internal control reports by December 31, 2010, and their corrective action plans by January 31, 2011. We believe that OMB needs to continue taking steps to encourage timelier reporting on internal controls through Single Audits for Recovery Act programs. (3) OMB officials have stated that they are aware of the increase in workload for state auditors who perform Single Audits due to the additional funding to Recovery Act programs and the corresponding increase in the number of programs subject to audit requirements. OMB officials stated that they solicited suggestions from state auditors to gain further insights for developing measures to provide audit relief. However, OMB has not yet identified viable alternatives that would provide relief to all state auditors that conduct Single Audits. For state auditors that are participating in the second OMB Single Audit Internal Control Project, OMB has provided some audit relief in that it has modified the requirements under Circular No. A-133 to reduce the number of low-risk programs that are to be included in some project participants' risk assessment requirements. As expenditures of Recovery Act funds are expected to continue through 2016, it is important that OMB look for opportunities and implement various options for providing audit relief in future years.
(4)(5) With regard to issuing Single Audit guidance in a timely manner, and specifically the OMB Circular A-133 Compliance Supplement, we previously reported in December 2010 that OMB officials stated that they intended to issue the 2011 Compliance Supplement by March 31, 2011.[Footnote 63] In January 2011, OMB officials reported that the production of the 2011 Compliance Supplement was on schedule for issuance by March 31, 2011. As of April 4, 2011, the 2011 Compliance Supplement had not yet been issued, and we will continue to monitor OMB's progress to achieve this objective. (6)(7) In October 2010, OMB officials stated that, based on their assessment of the results of the project, they have discussed alternatives for helping to ensure that federal awarding agencies provide their management decisions on the corrective action plans in a timely manner, including possibly shortening the time frames required for federal agencies to provide their management decisions to grant recipients.[Footnote 64] However, OMB officials have yet to decide on the course of action that they will pursue to implement our related recommendations. OMB officials acknowledged that the results of the 2009 OMB Single Audit Internal Control Project confirmed that this issue continues to be a challenge. They stated that they have met individually with several federal awarding agencies that were late in providing their management decisions in the 2009 project to discuss the measures that the agencies will take to improve the timeliness of their management decisions. In March 2010, OMB issued guidance under memo M-10-14, item 7, [hyperlink, http://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_2010/m10-14.pdf] that called for federal awarding agencies to review reports prepared by the Federal Audit Clearinghouse regarding Single Audit findings and submit summaries of the highest-risk audit findings by major Recovery Act programs by September 30, 2010, as well as other relevant information on the federal awarding agency's actions regarding these areas. OMB officials have stated that they plan to use this information to identify trends that may require clarification or additional guidance in the compliance supplement. OMB officials also stated that they are working with the Recovery Accountability and Transparency Board to develop metrics for determining how federal awarding agencies are to use information available in the Single Audit. As of January 2011, according to OMB officials, the metrics project is progressing. OMB officials have stated that they anticipate that the metrics will be available in early spring 2011 and that the metrics could be applied at the agency level, by program, to allow for analysis of Single Audit findings and other uses to be determined. One goal of the metrics project is to increase the effectiveness and timeliness of federal awarding agencies' actions to resolve Single Audit findings. We will monitor the progress of these efforts to determine the extent to which they improve the timeliness of federal agencies' actions to resolve audit findings so that risks to Recovery Act funds are reduced and internal controls in Recovery Act programs are strengthened.
Department of Transportation: Open Recommendations:[Footnote 65] To ensure that Congress and the public have accurate information on the extent to which the goals of the Recovery Act are being met, we recommended that the Secretary of Transportation direct the Federal Highway Administration (FHWA) to take the following two actions: 1. Develop additional rules and data checks in the Recovery Act Data System, so that these data will accurately identify contract milestones such as award dates and amounts, and provide guidance to states to revise existing contract data. 2. Make publicly available--within 60 days after the September 30, 2010, obligation deadline--an accurate accounting and analysis of the extent to which states directed funds to economically distressed areas, including corrections to the data initially provided to Congress in December 2009. Agency Actions: As of the time of this report, the Department of Transportation (DOT) was in the process of developing its plans in response to these recommendations. Open Recommendation:[Footnote 66] To better understand the impact of Recovery Act investments in transportation, we believe that the Secretary of Transportation should ensure that the results of these projects are assessed and a determination made about whether these investments produced long-term benefits. Specifically, in the near term, we recommended that the Secretary direct FHWA and the Federal Transit Administration (FTA) to determine the types of data and performance measures they would need to assess the impact of the Recovery Act and the specific authority they may need to collect data and report on these measures. Agency Actions: In its response, DOT noted that it expected to be able to report on Recovery Act outputs, such as the miles of road paved, bridges repaired, and transit vehicles purchased, but not on outcomes, such as reductions in travel time, nor did it commit to assessing whether transportation investments produced long-term benefits. DOT further explained that limitations in its data systems, coupled with the magnitude of Recovery Act funds relative to overall annual federal investment in transportation, would make assessing the benefits of Recovery Act funds difficult. DOT indicated that, with these limitations in mind, it is examining its existing data availability and would seek additional data collection authority from Congress if it becomes apparent that such authority is needed. DOT plans to take some steps to assess its data needs, but it has not committed to assessing the long-term benefits of Recovery Act investments in transportation infrastructure. We are therefore keeping our recommendation on this matter open. Newly Implemented Recommendation:[Footnote 67] We recommended that the Secretary of Transportation gather timely information on the progress states are making in meeting the maintenance-of-effort requirement and report preliminary information to Congress within 60 days of the certified period (September 30, 2010) on (1) whether states met required program expenditures as outlined in their maintenance-of-effort certifications, (2) the reasons that states did not meet these certified levels, if applicable, and (3) lessons learned from the process. Agency Actions: On January 27, 2011, the Secretary of Transportation sent a report to Congress that addressed each reporting element we recommended. DOT reported that 29 states and the District of Columbia met their planned level of expenditure and 21 states did not.
It also summarized reasons states did not meet the certified levels, such as a reduction in dedicated revenues for transportation or a state legislature approving a lower-than-expected level of transportation funding in the state budget. Finally, DOT's report provided its perspectives on lessons learned from the process, including identifying barriers to effectively implementing the maintenance-of-effort requirement. For example, it noted that the lack of clarity around statutory definitions regarding what constituted "state funding" and the substantial decreases in state dedicated transportation revenues were barriers to states producing an accurate certification and meeting the certified level. Department of the Treasury: Newly Implemented Recommendation:[Footnote 68] The Department of the Treasury (Treasury) should expeditiously provide Housing Finance Agencies (HFA) with guidance on monitoring project spending and develop plans for dealing with the possibility that projects could miss the spending deadline and face further project interruptions. Agency Actions: Treasury officials told us that after they provided additional guidance, every state HFA and the respective property owners complied with the 30 percent spending rule by the end of calendar year 2010. We concluded that Treasury and the state HFAs have addressed the intent of this recommendation. Matters for Congressional Consideration: Matter:[Footnote 69] To the extent that appropriate adjustments to the Single Audit process are not accomplished under the current Single Audit structure, Congress should consider amending the Single Audit Act or enacting new legislation that provides for more timely internal control reporting, as well as audit coverage for smaller Recovery Act programs with high risk. We continue to believe that Congress should consider changes related to the Single Audit process. Matter:[Footnote 70] To the extent that additional coverage is needed to achieve accountability over Recovery Act programs, Congress should consider mechanisms to provide additional resources to support those charged with carrying out the Single Audit Act and related audits. We continue to believe that Congress should consider changes related to the Single Audit process. Matter:[Footnote 71] To provide HFAs with greater tools for enforcing program compliance, in the event the Section 1602 Program is extended for another year, Congress may want to consider directing Treasury to permit HFAs the flexibility to disburse Section 1602 Program funds as interest-bearing loans that allow for repayment. We continue to believe that Congress should consider directing Treasury to permit HFAs the flexibility to disburse Section 1602 Program funds as interest-bearing loans that allow for repayment. [End of section] Appendix IV: GAO Contacts and Staff Acknowledgments: GAO Contacts: Mark E. Gaffigan, (202) 512-3841 or gaffiganm@gao.gov: Yvonne D. Jones, (202) 512-6806 or jonesy@gao.gov: Staff Acknowledgments: In addition to the contacts above, Joshua Akery, Thomas Beall, Andrew Ching, Holly Dye, Kim Gianopoulos, Sharon Hogan, Thomas James, Jonathan Kucskar, Kristen Massey, Karine McClosky, Alison O'Neill, Carol Patey, Brenda Rabinowitz, Beverly Ross, Kelly Rubin, Ben Shouse, Jonathan Stehle, Kiki Theodoropoulos, Nick Weeks, and Ethan Wozniak made key contributions to this report. [End of section] Related GAO Products: The following is a list of 10 related products published since the last mandated GAO report on the Recovery Act. 
[hyperlink, http://www.gao.gov/products/GAO-11-166], December 15, 2010. For a full list of products related to the Recovery Act, see [hyperlink, http://gao.gov/recovery/related-products/]. Medicaid: Improving Responsiveness of Federal Assistance to States during Economic Downturns. [hyperlink, http://www.gao.gov/products/GAO-11-395]. Washington, D.C.: March 31, 2011. State and Local Governments: Knowledge of Past Recessions Can Inform Future Federal Fiscal Assistance. [hyperlink, http://www.gao.gov/products/GAO-11-401]. Washington, D.C.: March 31, 2011. Recovery Act: Status of Department of Energy's Obligations and Spending. [hyperlink, http://www.gao.gov/products/GAO-11-483T]. Washington, D.C.: March 17, 2011. Recovery Act: Broadband Programs Awards and Risks to Oversight. [hyperlink, http://www.gao.gov/products/GAO-11-371T]. Washington, D.C.: February 10, 2011. Department of Education: Improved Oversight and Controls Could Help Education Better Respond to Evolving Priorities. [hyperlink, http://www.gao.gov/products/GAO-11-194]. Washington, D.C.: February 10, 2011. Rail Transit: FTA Programs Are Helping Address Transit Agencies' Safety Challenges, but Improved Performance Goals and Measures Could Better Focus Efforts. [hyperlink, http://www.gao.gov/products/GAO-11-199]. Washington, D.C.: January 31, 2011. Defense Infrastructure: High-Level Federal Interagency Coordination Is Warranted to Address Transportation Needs beyond the Scope of the Defense Access Roads Program. [hyperlink, http://www.gao.gov/products/GAO-11-165]. Washington, D.C.: January 26, 2011. Summary of GAO's Performance and Financial Information Fiscal Year 2010. [hyperlink, http://www.gao.gov/products/GAO-11-3SP]. Washington, D.C.: January 24, 2011. Child Support Enforcement: Departures from Long-term Trends in Sources of Collections and Caseloads Reflect Recent Economic Conditions. [hyperlink, http://www.gao.gov/products/GAO-11-196]. Washington, D.C.: January 14, 2011. Multiple Employment and Training Programs: Providing Information on Colocating Services and Consolidating Administrative Structures Could Promote Efficiencies. [hyperlink, http://www.gao.gov/products/GAO-11-92]. Washington, D.C.: January 13, 2011. [End of section] Footnotes: [1] This amount is current as of March 18, 2011. For updates see [hyperlink, http://gao.gov/recovery]. [2] Recovery Act, Pub. L. No. 111-5, § 3, 123 Stat. 116 (2009). [3] Recovery Act, div. A, § 1512(e), 123 Stat. 288. In this report, we refer to the quarterly reports required by Section 1512 as recipient reports. [4] GAO, Recovery Act: States' and Localities' Uses of Funds and Actions Needed to Address Implementation Challenges and Bolster Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] (Washington, D.C.: May 26, 2010). [5] The Energy Independence and Security Act of 2007 (EISA) was signed into law on December 19, 2007. Pub. L. No. 110-140, 121 Stat. 1492. [6] The first quarter that recipients began reporting grant information through PAGE was July 1, 2009, to September 30, 2009. [7] DOE, Financial Assistance Funding Opportunity Announcement: Recovery Act - Energy Efficiency and Conservation Block Grants - Formula Grants, DE-FOA-0000013 (Mar. 26, 2009). [8] According to EECBG statutory requirements for formula grants, "A State that receives a grant under the program shall use not less than 60 percent of the amount received to provide subgrants to units of local government in the State that are ineligible" for direct formula grants. 
This means that 60 percent of EECBG funds awarded to states must be distributed to local units of government within the state that are not eligible to receive direct formula grants through the EECBG program. [9] DOE, Office of Inspector General, The Department of Energy's Implementation of the Energy Efficiency and Conservation Block Grant Program under the Recovery and Reinvestment Act: A Status Report, OAS-RA-10-16 (Aug. 11, 2010). [10] DOE defines the effective date of award as the date that the DOE contracting officer signed the award document. [11] DOE, Financial Assistance Funding Opportunity Announcement: Recovery Act - Energy Efficiency and Conservation Block Grants - Formula Grants, DE-FOA-0000013 (Mar. 26, 2009) and DOE, Energy Efficiency and Conservation Block Grant Program Notice 10-011 (Apr. 21, 2010). [12] DOE, Office of Inspector General, The Department of Energy's Implementation of the Energy Efficiency and Conservation Block Grant Program under the Recovery and Reinvestment Act: A Status Report, OAS-RA-10-16 (Aug. 11, 2010). [13] The Buy American requirement of the act generally requires that grant recipients use iron, steel, and manufactured goods produced in the United States on all Recovery Act-funded projects. [14] Under Division A, Section 1606 of the Recovery Act, contractors and subcontractors hired with Recovery Act funds are required to pay prevailing wages to laborers and mechanics. [15] Duties and position titles for monitoring staff vary somewhat by DOE office location. For the purposes of this report, contract specialists and technical monitors will also be referred to as project officers. [16] DOE defines subrecipients as those recipients that receive pass-through funds from recipients but are not the ultimate beneficiary of the funds, such as the vendor or contractor who provided the good or service. [17] Internal controls include organization, policies, and procedures and are tools to help program and financial managers achieve results and safeguard the integrity of their programs. [18] A full-time equivalent (FTE) is calculated as the total hours worked divided by the number of hours in a full-time schedule. [19] DOE, Office of Inspector General, The Department of Energy's Implementation of the Energy Efficiency and Conservation Block Grant Program under the Recovery and Reinvestment Act: A Status Report, OAS-RA-10-16 (Aug. 11, 2010). [20] DOE, Office of Inspector General, Management Controls over the Development and Implementation of the Office of Energy Efficiency and Renewable Energy's Performance and Accountability for Grants in Energy System, OAS-RA-10-14 (July 22, 2010). [21] 31 U.S.C. § 7502(f)(2). OMB's implementing guidance, OMB Circular No. A-133, Audits of States, Local Governments, and Non-Profit Organizations (June 2007), also imposes additional monitoring oversight requirements. [22] In addition to PAGE reporting, under Section 1512 of the Recovery Act, recipients of Recovery Act funds must also report--via FederalReporting.gov--on key project metrics such as job creation and total funds spent. [23] GAO, Recovery Act: Opportunities to Improve Management and Strengthen Accountability over States' and Localities' Uses of Funds, [hyperlink, http://www.gao.gov/products/GAO-10-999] (Washington, D.C.: Sept. 20, 2010). [24] GAO, Geostationary Operational Environmental Satellites: Improvements Needed in Continuity Planning and Involvement of Key Users, [hyperlink, http://www.gao.gov/products/GAO-10-799] (Washington, D.C.: Sept. 1, 2010).
[25] Under the Single Audit Act, as amended, a nonfederal entity that expends $500,000 or more in federal awards is required to have either a single audit or a program-specific audit for the fiscal year and to report the results of the audit, among other things, to the federal clearinghouse designated by OMB.

[26] DOE guidance encourages recipients that have the resources to voluntarily capture actual energy-savings data once they become available and to conduct an evaluation, measurement, and verification effort to measure energy savings and usage.

[27] To establish reasonableness, DOE officials said, DOE expects project officers to use their own knowledge and expertise, as well as the knowledge and expertise of other program staff, to identify data outliers and spot inconsistencies. DOE is also developing online tools (in conjunction with project officers) to help make these determinations, according to DOE officials.

[28] DOE allows recipients to use the Environmental Protection Agency's ENERGY STAR "Portfolio Manager" tool to measure and track energy performance.

[29] DOE guidance states that outputs from DOE-supplied calculators and tools should be used for reporting only if site-specific estimates are not available. The instructions for the DOE "ARRA Benefit Reporting Calculator" state: "This tool is designed to provide high level estimates of energy savings and resulting emission reductions. This tool is not intended to replace contractor or engineering supplied estimates of your project savings."

[30] Under the continuous corrections period, recipients were allowed to modify submissions from February 2, 2011, to March 21, 2011. The final update of this round of recipient-reported data should occur on March 23, 2011.

[31] Prime recipients are nonfederal entities that receive Recovery Act funding as federal awards in the form of grants, loans, or cooperative agreements directly from the federal government.

[32] Under the Recovery Act, recipients are to file reports for any quarter in which they receive Recovery Act funds directly from the federal government. Reporting requirements apply to nonfederal recipients of funding, including entities such as state and local governments, educational institutions, nonprofits, and other private organizations. These requirements apply to recipients who receive funding through the Recovery Act's discretionary appropriations, not to recipients receiving funds through entitlement programs, such as Medicaid, or tax provisions. Certain other exceptions apply, such as for individuals. Recovery Act, div. A, § 1512, 123 Stat. at 287-288.

[33] For further discussion of FTE data limitations, see GAO, Recovery Act: Recipient Reported Jobs Data Provide Some Insight into Use of Recovery Act Funding, but Data Quality and Reporting Issues Need Attention, [hyperlink, http://www.gao.gov/products/GAO-10-223] (Washington, D.C.: Nov. 19, 2009), 6-9.

[34] [hyperlink, http://www.gao.gov/products/GAO-10-999].

[35] See further discussion of our scope and methodology in appendix I.

[36] FederalReporting.gov is the nationwide data collection system for Recovery Act recipient reporting; the data reported by recipients are available to the public for viewing and downloading on Recovery.gov.

[37] See OMB Memorandum M-10-34. This memorandum included updated guidance on when a recipient should mark a record as final and stated that changes to prior reports may not be initiated for the number of jobs field. Further, the memorandum noted that previous OMB memorandums (M-09-21 and M-10-08) require recipients to provide narrative descriptions that are sufficiently clear to facilitate understanding by the general public.

[38] We selected a mix of large, medium, and small grants, defined as grants greater than $2 million, grants between $250,000 and $2 million, and grants less than $250,000, respectively.

[39] In matching 2010 third quarter recipient reports against DOE-provided data, we could not match 2 percent of the recipient reports with the DOE data. For the recipient reports that we did match, 21 percent of the matched records were not congruent as to whether they reported any FTE value. For example, a DOE-provided record might show some FTE value while the matching recipient report showed zero or no FTE value at all. (A schematic sketch of this type of match-and-compare check follows these notes.)

[40] GAO, Recovery Act: As Initial Implementation Unfolds in States and Localities, Continued Attention to Accountability Issues Is Essential, [hyperlink, http://www.gao.gov/products/GAO-09-580] (Washington, D.C.: Apr. 23, 2009); Recovery Act: States' and Localities' Current and Planned Uses of Funds While Facing Fiscal Stresses, [hyperlink, http://www.gao.gov/products/GAO-09-829] (Washington, D.C.: July 8, 2009); Recovery Act: Funds Continue to Provide Fiscal Relief to States and Localities, While Accountability and Reporting Challenges Need to Be Fully Addressed, [hyperlink, http://www.gao.gov/products/GAO-09-1016] (Washington, D.C.: Sept. 23, 2009); Recovery Act: Recipient Reported Jobs Data Provide Some Insight into Use of Recovery Act Funding, but Data Quality and Reporting Issues Need Attention, [hyperlink, http://www.gao.gov/products/GAO-10-223] (Washington, D.C.: Nov. 19, 2009); Recovery Act: Status of States' and Localities' Use of Funds and Efforts to Ensure Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-231] (Washington, D.C.: Dec. 10, 2009); Recovery Act: One Year Later, States' and Localities' Uses of Funds and Opportunities to Strengthen Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-437] (Washington, D.C.: Mar. 3, 2010); Recovery Act: States' and Localities' Uses of Funds and Actions Needed to Address Implementation Challenges and Bolster Accountability, [hyperlink, http://www.gao.gov/products/GAO-10-604] (Washington, D.C.: May 26, 2010); Recovery Act: Opportunities to Improve Management and Strengthen Accountability over States' and Localities' Uses of Funds, [hyperlink, http://www.gao.gov/products/GAO-10-999] (Washington, D.C.: Sept. 20, 2010); and Recovery Act: Head Start Grantees Expand Services, but More Consistent Communication Could Improve Accountability and Decisions about Spending, [hyperlink, http://www.gao.gov/products/GAO-11-166] (Washington, D.C.: Dec. 15, 2010).

[41] [hyperlink, http://www.gao.gov/products/GAO-10-604], 245-246.

[42] [hyperlink, http://www.gao.gov/products/GAO-10-604], 245-246.

[43] [hyperlink, http://www.gao.gov/products/GAO-10-604], 246-247.

[44] [hyperlink, http://www.gao.gov/products/GAO-10-604], 184.

[45] [hyperlink, http://www.gao.gov/products/GAO-10-604], 184.

[46] [hyperlink, http://www.gao.gov/products/GAO-11-166], 39.

[47] [hyperlink, http://www.gao.gov/products/GAO-11-166], 39.

[48] [hyperlink, http://www.gao.gov/products/GAO-11-166], 39.

[49] [hyperlink, http://www.gao.gov/products/GAO-10-999], 189.

[50] [hyperlink, http://www.gao.gov/products/GAO-10-604], 244.

[51] [hyperlink, http://www.gao.gov/products/GAO-09-1016], 78.
[52] [hyperlink, http://www.gao.gov/products/GAO-09-1016], 78.

[53] [hyperlink, http://www.gao.gov/products/GAO-09-829], 127.

[54] [hyperlink, http://www.gao.gov/products/GAO-10-604], 248.

[55] [hyperlink, http://www.gao.gov/products/GAO-09-829], 127.

[56] [hyperlink, http://www.gao.gov/products/GAO-10-604], 247.

[57] [hyperlink, http://www.gao.gov/products/GAO-10-999], 194.

[58] [hyperlink, http://www.gao.gov/products/GAO-10-604], 247-248.

[59] [hyperlink, http://www.gao.gov/products/GAO-10-999], 194.

[60] Congress passed the Single Audit Act, as amended, 31 U.S.C. ch. 75, to promote, among other things, sound financial management, including effective internal controls, with respect to federal awards administered by nonfederal entities. The Single Audit Act requires states, local governments, and nonprofit organizations expending $500,000 or more in federal awards in a year to obtain an audit in accordance with the requirements set forth in the act. A Single Audit consists of (1) an audit and opinions on the fair presentation of the financial statements and the Schedule of Expenditures of Federal Awards; (2) gaining an understanding of and testing internal control over financial reporting and the entity's compliance with laws, regulations, and contract or grant provisions that have a direct and material effect on certain federal programs (i.e., the program requirements); and (3) an audit and an opinion on compliance with applicable program requirements for certain federal programs.

[61] OMB's second project is similar to its first Single Audit Internal Control project, which started in October 2009. Sixteen states participated in the first project. We assessed the results of that project and reported them in [hyperlink, http://www.gao.gov/products/GAO-10-999].

[62] Each award recipient expending more than $50 million is assigned a cognizant agency for audit. Generally, the cognizant agency for audit is the federal awarding agency that provides the predominant amount of direct funding to a recipient unless OMB assigns this responsibility to another agency. Some of the responsibilities of the cognizant agency include performing quality control reviews, considering auditee requests for extensions, and coordinating a management decision for audit findings that affect federal programs of more than one agency. For the states participating in the project, HHS is the cognizant agency for audit.

[63] The Compliance Supplement is updated annually. The 2010 Compliance Supplement was issued in July 2010 and is applicable to audits of fiscal years beginning after June 30, 2009.

[64] The project's guidelines called for the federal awarding agencies to (1) perform a risk assessment of the internal control deficiencies and identify those posing the greatest risk to Recovery Act funding and (2) identify corrective actions taken or planned by the auditee. OMB guidance requires this information to be included in a management decision that the federal agency was to have issued to the auditee's management, the auditor, and the cognizant agency for audit.

[65] [hyperlink, http://www.gao.gov/products/GAO-10-999], 187-188.

[66] [hyperlink, http://www.gao.gov/products/GAO-10-604], 241-242.

[67] [hyperlink, http://www.gao.gov/products/GAO-10-437], 29.

[68] [hyperlink, http://www.gao.gov/products/GAO-10-999], 194.

[69] [hyperlink, http://www.gao.gov/products/GAO-09-829], 128.

[70] [hyperlink, http://www.gao.gov/products/GAO-09-829], 128.

[71] [hyperlink, http://www.gao.gov/products/GAO-10-604], 251.
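
Note [18] above defines a full-time equivalent in prose. As a minimal restatement of that definition (the 1,040- and 520-hour figures below are hypothetical values chosen only to illustrate the arithmetic, not figures drawn from recipient reports):

\[
\mathrm{FTE} = \frac{\text{total hours worked}}{\text{hours in a full-time schedule}},
\qquad \text{for example,} \quad \frac{1{,}040 \text{ hours}}{520 \text{ hours}} = 2.0 \text{ FTE}.
\]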
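
Note [39] above describes matching recipient reports to DOE-provided records and flagging matched pairs that disagree on whether any FTE value was reported. The short Python sketch below illustrates one way such a match-and-compare check could be structured; the field names (award_id, fte) and data layout are assumptions made for illustration only and are not the methodology GAO or DOE actually used.

# Hypothetical sketch of a match-and-compare check like the one described in note [39].
# Field names (award_id, fte) and record structures are assumptions for illustration.

def compare_fte_records(recipient_reports, doe_records):
    """Match recipient reports to DOE-provided records by award ID and
    count matched pairs that disagree on whether any FTE value was reported."""
    doe_by_id = {rec["award_id"]: rec for rec in doe_records}

    unmatched = 0    # recipient reports with no corresponding DOE record
    incongruent = 0  # matched pairs where one side reports FTEs and the other does not
    matched = 0

    for report in recipient_reports:
        doe_rec = doe_by_id.get(report["award_id"])
        if doe_rec is None:
            unmatched += 1
            continue
        matched += 1
        report_has_fte = bool(report.get("fte"))  # zero or missing counts as "no FTE value"
        doe_has_fte = bool(doe_rec.get("fte"))
        if report_has_fte != doe_has_fte:
            incongruent += 1

    return {
        "percent_unmatched": 100 * unmatched / len(recipient_reports) if recipient_reports else 0,
        "percent_incongruent": 100 * incongruent / matched if matched else 0,
    }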
[End of section]

GAO's Mission:

The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates."

Order by Phone:

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537. Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations:

Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, D.C. 20548

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, D.C. 20548
