Juvenile Justice
A Time Frame for Enhancing Grant Monitoring Documentation and Verification of Data Quality Would Help Improve Accountability and Resource Allocation Decisions

GAO ID: GAO-09-850R
September 22, 2009

From fiscal years 2006 through 2008, the Office of Juvenile Justice and Delinquency Prevention (OJJDP) within the Department of Justice (DOJ) awarded $1.2 billion in funds through approximately 2,000 grants in support of its mission to help states and communities prevent and reduce juvenile delinquency and victimization and improve their juvenile justice systems. OJJDP awards grants to states, territories, localities, and organizations to address a variety of issues, such as reducing juvenile substance abuse, combating Internet crimes against children, preventing youth gang involvement, and providing youth mentoring services. The scope and administration of OJJDP grants also vary, ranging from private organizations that implement programs directly in a single community to states that administer grants by awarding the funds they receive to subgrantees to implement programs locally and statewide. Assessing the performance of these programs through grant monitoring is a key management tool to hold grantees accountable for implementing programs as agreed to in their awards, to verify they are making progress toward the objectives of their programs, and to ensure that grant funds are used in support of OJJDP's mission. DOJ's Office of Justice Programs (OJP) establishes grant monitoring policies for its components, including OJJDP. In 2008, the DOJ Office of the Inspector General identified grant management, including maintaining proper oversight of grantees to ensure grant funds are used as intended, as a critical issue and among the department's top management challenges.

In the past, we have identified concerns specific to OJJDP's grant monitoring activities. In October 2001, we reported that OJJDP was not consistently documenting its grant monitoring activities, such as required phone contacts between grant managers and grantees, and as a result could not determine the level of monitoring being performed by grant managers. We recommended that OJJDP take steps to determine why it was not consistently documenting its grant monitoring activities and develop and enforce clear expectations regarding monitoring requirements. Since that time, OJJDP has taken steps, partially in response to our recommendation, to address these concerns. For example, OJJDP conducted an assessment of additional policies and procedures needed for grant monitoring and developed a manual that outlines steps for completing specific monitoring activities, such as reviewing grantee documentation.

To help Congress ensure effective use of funds for juvenile justice grant programs, you asked us to assess OJJDP's efforts to monitor the implementation of its grant programs. This report addresses the following questions: (1) What processes does OJJDP have in place to monitor the performance of its juvenile justice grants, and to what extent does it record the results of its monitoring efforts to ensure transparency and accountability? (2) How, if at all, does OJJDP use performance measurement data to make programming and funding decisions, and to what extent does it verify the quality of these data?
In accordance with OJP requirements, OJJDP has processes in place to monitor the performance of its juvenile justice grants, including desk reviews, site visits, and postsite visit follow-up; however, the office does not have a process to record all follow-up steps taken to resolve grantee issues identified during site visits. Grant managers are to conduct grantee monitoring in three phases: (1) desk review, used to review grantee documentation to understand a grantee's level of accomplishment relative to its stated goals; (2) site visit, used to verify that grantees are implementing programs consistent with proposed plans; and (3) postsite visit, used to resolve issues identified during the visit. During our review of OJJDP monitoring documentation, we found that desk review and site visit activities are to be recorded in OJP's automated repository, the Grant Management System, in accordance with OJP requirements. In addition, OJJDP officials said that OJJDP requires grant managers to record postsite visit actions taken through OJP's Corrective Action Plan process, which OJJDP reserves for egregious circumstances, such as a failure to meet basic programming requirements. However, OJJDP does not require that issues resolved informally during the postsite visit phase, such as by e-mail, be recorded in the system.

While OJJDP has developed performance measures for its grant programs and collects performance measurement data from its grantees, the office is making limited use of these data because it is not verifying them to ensure their quality, which is inconsistent with leading management practices in performance measurement. As we have reported in the past, data verification--assessment of data completeness, accuracy, consistency, timeliness, and related quality control practices--helps ensure that users can have confidence in the reported performance information. According to OJJDP officials, they have not taken action to verify performance data because, since 2002, they have focused on collecting such data rather than on using them. Specifically, since 2002, OJJDP has developed performance measures for each of its grant programs and implemented requirements for all grantees to report on those measures at least once a year. Although these officials said that OJJDP has processes in place to assess whether the data are appropriate for the performance measure, they stated that OJJDP does not have data verification processes in place and is dependent on grantees to report complete, accurate, consistent, and timely data. These officials also stated that, because OJJDP does not know the quality of the data submitted, they do not use performance data to make resource allocation decisions.
Recommendations

Our recommendations from this work are listed below with a Contact for more information. Status will change from "In process" to "Open," "Closed - implemented," or "Closed - not implemented" based on our follow-up work.