Office of Justice Programs
Problems with Grant Monitoring and Concerns about Evaluation Studies
GAO ID: GAO-02-507T, March 7, 2002
The Office of Justice Programs (OJP) provides grants to state and local governments, universities, and private foundations to help prevent and control crime, administer justice, and assist crime victims. OJP bureaus and program offices award both formula and discretionary grants. The monitoring of grant activities is a key management tool to ensure that funds awarded to grantees are being properly spent. In recent years, GAO and others, including OJP, have identified various grant monitoring problems among OJP's bureaus and offices. OJP has begun to work with its bureaus and offices to address these problems, but it is too early to tell whether its efforts will resolve the issues identified.
This is the accessible text file for GAO report number GAO-02-507T
entitled 'Office Of Justice Programs: Problems with Grant Monitoring
and Concerns about Evaluation Studies' which was released on March 7,
2002.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the
printed version. The portable document format (PDF) file is an exact
electronic replica of the printed version. We welcome your feedback.
Please E-mail your comments regarding the contents or accessibility
features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States General Accounting Office:
GAO:
Testimony:
Before the Subcommittee on Crime, Committee on the Judiciary, House of
Representatives:
For Release on Delivery:
Expected at 10:00 a.m.
Thursday, March 7, 2002:
Office of Justice Programs:
Problems with Grant Monitoring and Concerns about Evaluation Studies:
Statement of Laurie E. Ekstrand, Director, Justice Issues:
GAO-02-507T:
Mr. Chairman and Members of the Subcommittee:
I am pleased to be here today to discuss our work on the Office of
Justice Programs (OJP). During the last 5 years, we have reported on a
number of programs run by OJP bureaus and offices. An overarching
theme of these reviews is a need for improvements in monitoring and
evaluating the myriad grant programs that OJP oversees. Our work has
shown longstanding problems with OJP grant monitoring and has begun to
raise questions about the methodological rigor of some of OJP's impact
evaluation studies. Monitoring and evaluation are the activities that
identify whether programs are operating as intended, whether they are
reaching those that should be served, and ultimately whether they are
making a difference in the fight against crime and delinquency. In
other words, these are major elements of assessing results. Our recent
work has focused mostly on discretionary grant programs administered
by the Bureau of Justice Assistance (BJA), the Violence Against Women
Office (VAWO), the Office of Juvenile Justice and Delinquency
Prevention (OJJDP), the Executive Office for Weed and Seed (EOWS), and
the Drug Courts Program Office (DCPO). We have also examined National
Institute of Justice (NIJ) impact evaluations of some of these
programs.
Background:
OJP, the grant making arm of the Department of Justice (DOJ), provides
grants to various organizations, including state and local
governments, universities, and private foundations, that are intended
to develop the nation's capacity to prevent and control crime,
administer justice, and assist crime victims. OJP's assistant attorney
general is responsible for overall management and oversight of OJP
through setting policy and ensuring that OJP policies and programs
reflect the priorities of the president, the attorney general, and the
Congress. The assistant attorney general promotes coordination among
OJP's five bureaus (including BJA, NIJ, and OJJDP) as well as its seven
program offices, including VAWO, the Executive Office for Weed and
Seed (EOWS), and the Drug Courts Program Office.[Footnote 1]
OJP bureaus and program offices award two types of grants: formula and
discretionary. Formula grants are awarded to state governments, which
then make subawards to state and local units of government.
Discretionary grants can be awarded on a competitive and non-
competitive basis directly to states, local units of government,
Indian tribes and tribal organizations, individuals, educational
institutions, private nonprofit organizations, and private commercial
organizations. Bureaus and program offices, such as BJA, VAWO, and
OJJDP, are, together with OJP's Office of the Comptroller, responsible
for monitoring OJP's discretionary grants to ensure that they are being
implemented as intended, responsive to grant goals and objectives, and
compliant with statutory and regulatory requirements and other policy
guidelines. NIJ is OJP's principal research and development agency and
awards grants for the research and evaluation of many of OJP's grant
programs. OJJDP
also funds research and evaluation efforts associated with the
juvenile justice system.
Between fiscal years 1990 and 2000, OJP's budget grew, in constant
fiscal year 2000 dollars, by 323 percent, from about $916 million in
fiscal year 1990 to nearly $3.9 billion in fiscal year 2000.[Footnote
2]
OJP's Efforts to Resolve Continuing Grant Monitoring Problems:
The monitoring of grant activities is a key management tool to help
ensure that funds awarded to grantees are being properly spent. Over
the last few years, we and others, including OJP, have identified
various grant monitoring problems among OJP's bureaus and offices. OJP
has begun to work with its bureaus and offices to address these
problems, but it is too early to tell whether its efforts will be
enough to resolve many of the issues identified.
Problems with OJP Grant Monitoring Are Not New:
Since 1996, we have testified and issued reports that document grant
monitoring problems among some of OJP's bureaus and offices. In
November 2001, in response to a request by Senators Sessions and
Grassley, we reported that grant files for discretionary grants
awarded by VAWO and under BJA's Byrne Program often lacked the
documentation necessary to ensure that required monitoring activities
occurred.[Footnote 3] In October 2001, at the request of Congressman
Schaffer, we cited similar problems with the monitoring of OJJDP
discretionary grants, problems consistent with the lack of OJJDP
monitoring documentation we reported in May 1996.[Footnote 4] For
example, our review of grant files for a representative sample of
OJJDP, BJA Byrne, and VAWO discretionary grants active in all of
fiscal year 1999 and/or fiscal year 2000 showed the following:
* BJA and VAWO grant files did not always contain requisite grant
monitoring plans, whereas OJJDP grant files generally did. When
monitoring plans were in the files, grant managers from the three
organizations did not consistently document their monitoring
activities, such as site visits, according to the monitoring plans
they developed.
* A substantial number of OJJDP, BJA Byrne, and VAWO grant files did
not contain progress and financial reports sufficient to cover the
entire grant period, contrary to OJP guidelines. Furthermore, Byrne
and VAWO grantee progress and financial reports were often submitted
late by grantees. These reports are an important management tool to
help managers and grant monitors determine if grantees are meeting
program objectives and financial commitments.
* BJA Byrne, VAWO, and OJJDP grant files did not always contain the
required closeout documents, key documents by which OJP ensures that,
among other things, the final accounting of federal funds has been
received.
We concluded that neither OJP, OJJDP, BJA, VAWO, nor GAO can determine
the level of monitoring performed by grant managers as required by OJP
and the comptroller general's internal control standards, which call
for documentation of all transactions and other significant events to
ensure that management directives are being carried out.[Footnote 5]
We recommended that OJJDP, BJA, and VAWO review why documentation
problems occurred and take steps to improve their documentation of
grant monitoring. We also recommended that OJP (1) study and recommend
ways to establish an approach to systematically test or review grant
files to ensure consistent documentation across OJP and (2) explore
ways to electronically compile and maintain documentation of monitoring
activities to facilitate more consistent documentation, more
accessible management oversight, and sound performance measurement.
Also, in 1999, our report on the management of the Weed and Seed
Program showed similar results.[Footnote 6] Among other things, EOWS
did not always ensure that local Weed and Seed sites met critical
requirements; almost one-half of the 177 sites funded in fiscal year 1998
had not submitted all of the required progress reports. Furthermore,
while EOWS was to conduct site visits at all Weed and Seed sites, EOWS
monitors did not always document the results of these visits. We
concluded that EOWS lacked adequate management controls over its grant
monitoring process and recommended that EOWS improve program
monitoring to ensure that sites meet grant requirements for submitting
progress reports and that EOWS document its site visits.
Our work has also shown that others, including OJP itself and the DOJ
Office of the Inspector General, have identified problems with grant
monitoring. Our November 2001 report noted that, in 1996, an OJP-
wide working group reported on various aspects of the grant process,
including grant administration and monitoring. Among other things, the
working group found that grant monitoring was not standardized in OJP;
given available resources, monitoring plans were overly ambitious,
resulting in failure to achieve the level of monitoring articulated;
and an OJP-wide tracking system was needed to facilitate control of
the monitoring process. The working group recommended that OJP
establish another working group to develop detailed operating
procedures, giving special attention to grant monitoring.
Almost 4 years later, in February 2000, an independent contractor
delivered a report to OJP containing similar findings.[Footnote 7] The
report stated that OJP lacked consistent procedures and practices for
performing grant management functions, including grant monitoring,
across the agency. For example, the contractor found that (1) no
formal guidance had been provided to grant managers about how stringent
or flexible they should be with grantees in enforcing deadlines, due
dates, and other grant requirements and (2) grant files were often not
complete or reliable. The contractor recommended that, among other
things, OJP develop an agencywide, coordinated and integrated
monitoring strategy; standardize procedures for conducting site visits
and other monitoring activities; and mandate the timely filing of
monitoring reports.
Finally, the DOJ Office of the Inspector General has identified and
reported on OJP-wide monitoring problems and has identified grant
management as one of the 10 major management challenges facing DOJ. In
December 2000, the inspector general stated that DOJ's multibillion-
dollar grant programs are a high risk for fraud, given the amount of
money involved and the tens-of-thousands of grantees. Among other
things, the inspector general said that past reviews determined that
many grantees did not submit the required progress and financial
reports and that program officials' on-site reviews did not
consistently address all grant conditions.
Too Early to Gauge Effectiveness of OJP's Efforts to Resolve Grant
Monitoring Problems:
We reported in November 2001 that OJP had begun to work with bureaus
and offices to resolve some of the problems it and others have
identified, but it was too early to tell how effective these efforts
will be in resolving these issues. In its Fiscal Year 2000 Performance
Report and Fiscal Year 2002 Performance Plan developed under the
Government Performance and Results Act of 1993, OJP established a goal
to achieve the effective management of grants. Among other things, DOJ
plans to achieve this goal by continued progress toward full
implementation of a new grant management system as a way of
standardizing and streamlining the grant process. According to the
performance report and performance plan, the grant management system
will assist OJP in setting priorities for program monitoring and
facilitate timely program and financial reports from grantees.
At the time of our review, the new system covered grants for some
organizations up to the award stage. Since then, OJP has created a
chief information officer position charged with planning and
implementing an agency-wide grant management system. According to the
assistant attorney general, the new system is envisioned to produce
reports in response to informational requests, provide information
pertaining to grantees and all resources provided by OJP, and maintain
information from the opening to the closing of a grant award. Although
the assistant attorney general said that OJP will consider the
comptroller general's internal control standards in taking these
steps, it is unclear whether the new system will include the full
range and scope of monitoring activities.
We also reported that OJP had been working on two other key efforts.
One of these initiatives, "Operation Closeout," was a pilot project
announced in February 2000 by OJP's Working Group on Grant
Administration that was to, among other things, accelerate the grant
closeout process through revising closeout guidelines and elevating
the importance of the closeout function as a required procedure in the
administration of grants. By November 2000, OJP reported that this
operation closed out 4,136 outstanding grants over a 6-month period,
resulting in over $30 million in deobligated funds. As of September
2001, OJP had plans to initiate another closeout operation based on
the success of the pilot.
Another OJP initiative involved the issuance of new OJP-wide guidance
for grant administration, including grant monitoring. In January 2001,
OJP released its Grant Management Policies and Procedures Manual to
update and codify OJP's policies and procedures regarding its business
practices.[Footnote 8] According to OJP officials, the new guidance
was developed to reengineer the grant management process based on the
best practices of bureaus and offices throughout OJP. At the time of
our review, OJP had trained over 300 grant managers and had plans to
train supervisors about the new guidance. OJP also planned to send
questionnaires to recently trained grant managers and supervisors to
identify any issues or problems with using the new manual and to
identify potential training interests and topics. However, there were
no plans to test or systematically monitor
compliance with the new guidelines to ensure that grant managers were
fulfilling their responsibilities.
OJP's bureaus and program offices have told us that they recognize
that they need to take some steps to respond to our recent
recommendations, but it is too early to tell if these actions will be
effective. For example, in response to our November 2001 report on the
monitoring of Byrne and VAWO discretionary grants, BJA said that it
had, among other things, (1) modified its internal grant tracking
system to include tracking of events such as site visits, phone
contacts involving staff and grantees, and grant closeouts and (2)
developed more specific guidance for grantees on completing progress
reports to ensure more specific performance data are obtained. VAWO
said it had begun to develop both an internal monitoring manual that
would include procedures for development of monitoring plans using a
risk-based assessment tool and a management information system that
will eventually track the submission of progress and financial reports.
Likewise, in response to our October 2001 report on OJJDP grant
monitoring, OJJDP officials said they conducted an internal assessment
of grant monitoring activities and established an OJJDP standard for
grant administration and monitoring; a protocol for adhering to the
standard; and a set of tools for grant administration and monitoring.
OJJDP said that it anticipates OJJDP-wide implementation during fiscal
year 2002. Finally, with respect to our 1999 Weed and Seed report,
EOWS said it recognizes the need to improve program monitoring,
citing a chronic problem of grantees not submitting programmatic
progress reports in a timely manner, and acknowledged the need to
document all monitoring visits. In a July 2000 letter, EOWS officials
said that EOWS had taken steps to improve program monitoring,
including documentation of site monitoring visits.
Concerns about Impact Evaluation Studies:
We have also issued reports questioning the methodological rigor of
impact evaluation studies of various OJP grant programs. Impact
evaluations are intended to assess the net effect of a program by
comparing program outcomes with an estimate of what would have
happened in the absence of the program.
Today, we are issuing a report on work undertaken at the request of
Senators Sessions and Grassley concerning the methodological rigor of
impact evaluations of a Byrne grant program and three VAWO
discretionary grant programs.[Footnote 9] During fiscal years 1995
through 2001, NIJ awarded about $6 million for 5 Byrne and 5 VAWO
discretionary grant program evaluations. Of the 10 program
evaluations, all 5 VAWO evaluations were intended to measure the
impact of the VAWO programs. One of the 5 Byrne evaluations was
designed as an impact evaluation. Our in-depth review of the 4 impact
evaluations that have progressed beyond the formative stage showed
that only 1 of these, the evaluation of the Byrne Children at Risk
(CAR) Program, was methodologically sound.
The other 3 evaluations, all of which examined VAWO programs, had
methodological problems that raise concerns about whether the
evaluations will produce definitive results. Although program
evaluation is an inherently difficult task, in all 3 VAWO evaluations,
the effort is particularly arduous because of variations across
grantee sites in how the programs are implemented. In addition, VAWO
sites participating in the impact evaluations have not been shown to
be representative of their programs, thereby limiting the evaluators'
ability to generalize results. Further, the lack of nonprogram
participant comparison groups hinders their ability to minimize the
effects of factors that are external to the program and isolate the
impact of the program alone. In some situations, other means (other
than comparison groups) can be effective in isolating the impact of a
program from other factors. However, in these evaluations, effective
alternative methods were not used. In addition, data collection and
analytical problems (e.g. related to statistical tests, assessment of
change) compromise the evaluators‘ ability to draw appropriate
conclusions from the results.
We made recommendations concerning the two VAWO impact evaluations in
the formative stage of development, and all future impact evaluations,
to ensure that potential methodological design and implementation
problems are mitigated. The assistant attorney general
commented that she agreed with the substance of our recommendations
and has begun or plans to take steps to address them. It is still too
early to tell whether these actions will be effective in preventing or
resolving the problems we identified, but they appear to be steps in
the right direction.
Our in-depth review of 10 of OJJDP's impact evaluations of its own
programs undertaken since 1995 also raised some concerns about whether
many of the evaluations would produce definitive results. We reported
these concerns in an October 2001 report, requested by Congressman
Schaffer, on OJJDP grantee reporting requirements and evaluation
studies.[Footnote 10] At the time of our review, all of the 10
evaluations were ongoing, with 5 in their formative stages and 5 well
into implementation. As in the report cited above, we noted that some
of the evaluations we reviewed were particularly difficult to design
because sites varied in how they implemented the same program. While
these variations were intended to allow communities to tailor programs
to meet their unique needs, they will make it difficult to interpret
evaluation results when the studies are completed. Two of the
evaluations that were in their later stages and 3 of those that were
in their formative stages at the time of our review lacked specific
plans for comparison groups, which would aid in isolating the impacts
of the program from the effects of other factors that may have
influenced change. Furthermore, 3 of the 5 evaluations that were well
into implementation at the time of our review had developed data
collection problems.
We recommended in our report that OJJDP assess the 5 impact
evaluations that were in their formative stages to address potential
comparison group and data collection problems and initiate any needed
interventions to help ensure the evaluations produce definitive
results. In commenting on a draft of our report, the assistant
attorney general said the report is an important tool that OJP would
use to improve the quality of its evaluations and to design programs
to achieve greater impact. She also said that OJP would assess the five
impact evaluations in their formative stages, as we recommended. Two
months after our report's issuance, OJP reported to us on the status
of these evaluations. OJP informed us that OJJDP had decided to
discontinue 1 evaluation that had planned to use a comparison group
because of delays and difficulties in identifying a comparison site.
In addition, OJJDP is considering scaling back and refocusing the
scope of another evaluation because the program being studied did not
lend itself to an impact evaluation with comparison groups.
We have also reported on problems with evaluation studies of federally
funded drug court programs.[Footnote 11] In our 1997 report to the
House and Senate Judiciary Committees,[Footnote 12] we found, among
other things, that differences and methodological limitations in
existing drug court evaluation studies did not permit firm conclusions
to be made on the overall impact or effectiveness of drug court
programs. We recommended that future drug court program impact
evaluations, funded by DOJ and others, be required to include post-
program data and comparison groups within their scope. The preliminary
results of our ongoing follow-up work on drug court programs for
Senators Grassley and Sessions indicate that various administrative
and research factors have hampered NIJ's efforts to complete a
national impact evaluation study, and that alternative plans for
addressing the impact of federally funded drug court programs, if
implemented, are not expected until 2007. As a result, DOJ will
continue to lack near term information that the Congress, the public,
and other program stakeholders may need to determine the overall
impact of federally funded drug court programs and to assess whether
these programs are an effective use of federal funds. We expect to
issue a report on this issue in April 2002.
In summary, OJP's grant programs have grown rapidly during the last
decade, increasing the importance of ensuring that they are achieving
intended results. Yet, repeated GAO reviews of grant monitoring and
impact evaluations across a variety of OJP entities have shown a need
for improvement. OJP itself and the DOJ Office of the Inspector
General have identified a need for improvements in grant management as
well.
Despite past commitments to shore up grant monitoring and better
assess program results, we have still found problems in very recent
reports. The recent reorganization plans and the anticipated
management information system have been cited as the foundation for
positive changes in grants management, including monitoring and
evaluation. But, reorganization and management information systems are
only tools and are only as good as the management that wields them.
Commitment to improvement and oversight are needed to ensure progress.
Chairman Smith has recently requested an assessment of NIJ‘s impact
evaluation studies. This work may lead to additional recommendations
related to OJP grant evaluations.
Mr. Chairman, this concludes my prepared statement. I would be pleased
to answer any questions that you or other members of the subcommittee
may have.
For further information regarding this testimony, please contact
Laurie E. Ekstrand or John F. Mortin at (202) 512-8777. Individuals
making key contributions to this testimony included James Blume, Dan
Harris, Charles Johnson, Weldon McPhail, Wendy Simkalo, Lori Weiss,
Jared Hermalin, Rochelle Burns, Jenna Battcher, and Kimberly Hutchens.
[End of section]
Appendix I: OJP Organization Chart:
[Refer to PDF for image: organization chart]
Top level:
Office of the Assistant Attorney General;
* American Indian & Alaska Native Affairs Desk;
* Equal Employment Opportunity Office.
Second level, reporting to the Office of the Assistant Attorney
General:
Violence Against Women Office;
Corrections Program Office;
Drug Courts Program Office;
Executive Office for Weed & Seed;
Office for Domestic Preparedness;
Office of Police Corps & Law Enforcement Education.
Third level, reporting to the Office of the Assistant Attorney General:
Office of Budget, & Management Services;
Office of Administration;
Office for Civil Rights;
Office of the Comptroller;
Office of General Counsel;
Office of Congressional & Public Affairs.
Fourth level, reporting to the Office of the Assistant Attorney
General:
Bureau of Justice Assistance;
Bureau of Justice Statistics;
National Institute of Justice;
Office of Juvenile Justice & Delinquency Prevention;
Office for Victims of Crime.
Note: The organization chart is current as of March 2002.
Source: Prepared by GAO based on OJP documentation.
[End of figure]
[End of section]
Appendix II: OJP's Budget:
Figure: Appropriated Resources for Office of Justice Programs FYs 1990-
2000 (in thousands):
[Refer to PDF for image: vertical bar graph]
Note: Annual totals include Crime Victims Fund collections and Public
Safety Officers' Benefits Program and exclude Management and
Administration.
Source: OJP Office of Budget and Management Services.
[End of figure]
[End of section]
Appendix III: OJP's FY 2000 Budget by Program Office and Bureau:
[Refer to PDF for image: pie-chart]
Bureau of Justice Assistance: 43.5%;
Juvenile Justice & Delinquency Prevention: 14.5%;
Corrections Programs Office: 14.2%;
Office for Victims of Crime: 12.9%;
Violence Against Women Office: 6.5%;
State & Local Preparedness Office: 3.1%;
National Institute of Justice: 2.6%;
Drug Courts: 1.0%;
Weed & Seed: 0.9%;
Bureau of Justice Statistics: 0.7%.
Source: OJP Office of Budget and Management Services.
[End of figure]
[End of section]
Related GAO Products:
Justice Impact Evaluations: One Byrne Evaluation Was Rigorous; All
Reviewed Violence Against Women Office Evaluations Were Problematic,
[hyperlink, http://www.gao.gov/products/GAO-02-309] (Washington, D.C.:
March 7, 2002).
Justice Discretionary Grants: Byrne Program and Violence Against Women
Office Grant Monitoring Should Be Better Documented, [hyperlink,
http://www.gao.gov/products/GAO-02-25] (Washington, D.C.: November 28,
2001).
Juvenile Justice: OJJDP Reporting Requirements for Discretionary and
Formula Grantees and Concerns About Evaluation Studies, [hyperlink,
http://www.gao.gov/products/GAO-02-23] (Washington, D.C.: October 30,
2001).
Juvenile Justice: Better Documentation of Discretionary Grant
Monitoring Is Needed, [hyperlink,
http://www.gao.gov/products/GAO-02-65] (Washington, D.C.: October 10,
2001).
Internal Control: Standards for Internal Control in the Federal
Government, [hyperlink, http://www.gao.gov/products/GAO/AIMD-00-
21.3.1] (Washington, D.C.: November 1999).
Federal Grants: More Can Be Done to Improve Weed and Seed Program
Management, [hyperlink, http://www.gao.gov/products/GAO/GGD-99-110]
(Washington, D.C.: July 16, 1999).
Drug Courts: Overview of Growth, Characteristics, and Results,
[hyperlink, http://www.gao.gov/products/GAO/GGD-97-106] (Washington,
D.C.: July 31, 1997).
Juvenile Justice: Status of Delinquency Prevention Program and
Description of Local Projects, [hyperlink,
http://www.gao.gov/products/GAO/GGD-96-147] (Washington, D.C.: August
13, 1996).
Juvenile Justice: Selected Issues Relating to OJJDP's Reauthorization,
[hyperlink, http://www.gao.gov/products/GAO/T-GGD-96-103] (Washington,
D.C.: May 8, 1996).
OJJDP Discretionary Grant Programs, [hyperlink,
http://www.gao.gov/products/GAO/GGD-96-111R] (Washington, D.C.: May 7,
1996).
Drug Courts: Information on a New Approach to Address Drug-Related
Crime, [hyperlink, http://www.gao.gov/products/GAO/GGD-95-159BR]
(Washington, D.C.: May 22, 1995).
Office of Justice Programs: Discretionary Grants Reauthorization,
[hyperlink, http://www.gao.gov/products/GAO/GGD-93-23] (Washington,
D.C.: November 20, 1992).
[End of section]
Footnotes:
[1] OJP's five bureaus are the Bureau of Justice Assistance, Bureau of
Justice Statistics, National Institute of Justice, Office of Juvenile
Justice and Delinquency Prevention, and Office for Victims of Crime.
OJP's seven program offices are the American Indian and Alaska Native
Affairs Desk, Violence Against Women Office, Executive Office for Weed
and Seed, Corrections Program Office, Drug Courts Program Office,
Office for Domestic Preparedness, and Office of Police Corps and Law
Enforcement Education. Appendix I shows OJP's current organizational
structure.
[2] Appendix II shows the growth of OJP's budget between fiscal years
1990 and 2000, and Appendix III shows OJP's fiscal year 2000 budget
broken out by bureau and program office.
[3] U.S. General Accounting Office, Justice Discretionary Grants:
Byrne Program and Violence Against Women Office Grant Monitoring
Should Be Better Documented, [hyperlink,
http://www.gao.gov/products/GAO-02-025] (Washington, D.C.: Nov. 28,
2001).
[4] U.S. General Accounting Office, Juvenile Justice: Better
Documentation of Discretionary Grant Monitoring Is Needed, [hyperlink,
http://www.gao.gov/products/GAO-02-65] (Washington, D.C.: Oct. 10,
2001) and Juvenile Justice: Selected Issues Relating to OJJDP's
Reauthorization, [hyperlink,
http://www.gao.gov/products/GAO/T-GGD-96-103] (Washington, D.C.: May
8, 1996).
[5] U.S. General Accounting Office, Internal Control: Standards for
Internal Control in the Federal Government, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-00-21.3.1] (Washington, D.C.:
Nov. 1999).
[6] U.S. General Accounting Office, Federal Grants: More Can Be Done
to Improve Weed and Seed Program Management, [hyperlink,
http://www.gao.gov/products/GAO/GGD-99-110] (Washington, D.C.: July
16, 1999).
[7] Dougherty and Associates, Final Report of Findings &
Recommendations for Improvement of the Grant Management Process
(Alexandria, VA, 2000).
[8] This document superseded OJP Handbook: Policies and Procedures for
the Administration of OJP Grants (Washington, D.C., 1992).
[9] U.S. General Accounting Office, Justice Impact Evaluations: One
Byrne Evaluation Was Rigorous; All Reviewed Violence Against Women
Office Evaluations Were Problematic, [hyperlink,
http://www.gao.gov/products/GAO-02-309] (Washington, D.C.: Mar. 7,
2002).
[10] U.S. General Accounting Office, Juvenile Justice: OJJDP Reporting
Requirements for Discretionary and Formula Grantees and Concerns About
Evaluation Studies, [hyperlink, http://www.gao.gov/products/GAO-02-23]
(Washington, D.C.: Oct. 30, 2001).
[11] The main purpose of a drug court program is to use the authority
of the court to reduce crime by changing defendants' substance abuse
or risk behavior. Under this concept, in exchange for the possibility
of dismissed charges or reduced sentences, defendants are diverted to
drug court programs in various ways and at various stages in the
judicial process. Judges preside over drug court proceedings; monitor
the progress of defendants; and prescribe sanctions and rewards as
appropriate in collaboration with prosecutors, defense attorneys,
treatment providers, and others.
[12] U.S. General Accounting Office, Drug Courts: Overview of Growth,
Characteristics, and Results, [hyperlink,
http://www.gao.gov/products/GAO/GGD-97-106] (Washington, D.C.: July
31, 1997).