Adult Drug Courts
Studies Show Courts Reduce Recidivism, but DOJ Could Enhance Future Performance Measure Revision Efforts
GAO ID: GAO-12-53 December 9, 2011
This is the accessible text file for GAO report number GAO-12-53
entitled 'Adult Drug Courts: Studies Show Courts Reduce Recidivism,
but DOJ Could Enhance Future Performance Measure Revision Efforts'
which was released on December 9, 2011.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
United States Government Accountability Office:
GAO:
Report to Congressional Committees:
December 2011:
Adult Drug Courts:
Studies Show Courts Reduce Recidivism, but DOJ Could Enhance Future
Performance Measure Revision Efforts:
GAO-12-53:
GAO Highlights:
Highlights of GAO-12-53, a report to congressional committees.
Why GAO Did This Study:
A drug court is a specialized court that targets criminal offenders
who have drug addiction and dependency problems. These programs
provide offenders with intensive court supervision, mandatory drug
testing, substance-abuse treatment, and other social services as an
alternative to adjudication or incarceration. As of June 2010, there
were over 2,500 drug courts operating nationwide, of which about 1,400
target adult offenders. The Department of Justice's (DOJ) Bureau of
Justice Assistance (BJA) administers the Adult Drug Court
Discretionary Grant Program, which provides financial and technical
assistance to develop and implement adult drug-court programs. DOJ
requires grantees that receive funding to provide data that measure
their performance. In response to the Fair Sentencing Act of 2010,
this report assesses (1) data DOJ collected on the performance of
federally funded adult drug courts and to what extent DOJ used these
data in making grant-related decisions, and (2) what is known about
the effectiveness of drug courts. GAO assessed performance data DOJ
collected in fiscal year 2010 and reviewed evaluations of 32 drug-
court programs and 11 cost-benefit studies issued from February 2004
through March 2011.
What GAO Found:
BJA collects an array of data on adult drug-court grantees, such as
drug-court completion rates, and during the course of GAO's review,
began expanding its use of this performance data to inform grant-
related decisions, such as allocating resources and setting program
priorities. For example, during September 2011, BJA assessed a sample
of adult drug-court grantees' performance across a range of variables,
using a new process it calls GrantStat. BJA developed recommendations
following this assessment and is determining their feasibility. In
addition, in October 2011, BJA finalized revisions to the performance
measures on which grantees report. BJA's process of revising its
performance measures generally adhered to key practices, such as
obtaining stakeholder involvement; however, BJA could improve upon two
practices as it continues to assess and revise measures in the future.
First, while BJA plans to assess the reliability of the new measures
after the first quarter of grantees' reporting, officials have not
documented, as suggested by best practices, how BJA will determine if
the measures were successful or whether changes would be needed.
Second, should future changes to the measures be warranted, BJA could
improve the way it documents its decisions and incorporates feedback
from stakeholders, including grantees, by recording key methods and
assumptions used to guide its revision efforts. By better adhering to
best practices identified by GAO and academic literature, BJA could
better ensure that its future revision efforts result in successful
and reliable metrics--and that the revision steps it has taken are
transparent.
In the evaluations that GAO reviewed, drug-court program participation
was generally associated with lower recidivism. GAO's analysis of
evaluations reporting recidivism data for 32 programs showed that drug-
court program participants were generally less likely to be re-
arrested than comparison group members drawn from criminal court, with
differences in likelihood reported to be statistically significant for
18 of the programs. Cost-benefit analyses showed mixed results. For
example:
* Across studies showing re-arrest differences, the percentages of
drug-court program participants re-arrested were lower than for
comparison group members by 6 to 26 percentage points. Drug court
participants who completed their program had re-arrest rates 12 to 58
percentage points below those of the comparison group.
* GAO's analysis of evaluations reporting relapse data for eight
programs showed that drug-court program participants were less likely
than comparison group members to use drugs, based on drug tests or
self-reported drug use, although the difference was not always
significant.
* Of the studies assessing drug-court costs and benefits, the net
benefit ranged from positive $47,852 to negative $7,108 per
participant.
What GAO Recommends:
GAO recommends that BJA document key methods used to guide future
revisions of its performance measures for the adult drug-court
program. DOJ concurred with GAO's recommendation.
View [hyperlink, http://www.gao.gov/products/GAO-12-53]. For more
information, contact David C. Maurer at (202) 512-9627 or
maurerd@gao.gov.
[End of section]
Contents:
Letter:
Background:
BJA Is Expanding Use of Grantee Performance Data but Could Enhance
Processes as It Continues to Refine Performance Measures:
Drug Courts Were Associated with Lower Recidivism and Relapse Rates
for Program Participants Than Criminal Courts:
Conclusions:
Recommendation for Executive Action:
Agency Comments:
Appendix I: DOJ Has Fully Implemented Most of Our 2002 Recommendations
and Plans to Address the Remaining One:
Appendix II: MADCE Is the Most Comprehensive Study of Drug Courts to
Date, but Generalizability of Findings May Be Limited:
Appendix III: Objectives, Scope, and Methodology:
Appendix IV: Overview of Drug Court Program Characteristics:
Appendix V: Ten Key Components of a Drug Court--Developed by BJA in
Collaboration with The National Association of Drug Court
Professionals:
Appendix VI: BJA Offers Solicitations in Four Broad Drug-Court Grant
Categories--Implementation, Enhancement, Statewide, and Joint:
Appendix VII: Key Management Activities Identified for Which
Performance Information Can Be Most Useful:
Appendix VIII: Comments from the Department of Justice, Bureau of
Justice Assistance:
Appendix IX: GAO Contacts and Staff Acknowledgments:
Bibliography:
Tables:
Table 1: List of the Seven Questions to Which Adult Drug Court
Grantees Must Submit Narrative Responses:
Table 2: Types of Information BJA Officials Reported Using or Planning
to Use When Performing Key Management Activities for the Adult Drug
Court Grant Program:
Table 3: Differences in Reported Rearrest Rates between Drug Court
Program Participants and Comparison Group Members:
Table 4: Drug Use Relapse Results of Evaluations GAO Reviewed:
Table 5: Cost Conclusions of the 11 Drug Court Program Evaluations in
Our Cost-Benefit Review:
Table 6: Status of DOJ's Efforts to Address Recommendations We Made in
2002 on DOJ's Collection of Performance Data to Measure the Impact of
Federally Funded Drug Court Programs:
Table 7: Methodological Quality Categories for Evaluations of a Drug
Court Program:
Table 8: Five Criteria for Assessing a Cost-Benefit Analysis of a Drug
Court Program:
Table 9: General Description of Drug Court Program Components:
Table 10: Ten Key Components of a Drug Court:
Table 11: Adult Drug-Court Discretionary Grant Program--Grant Type and
Description:
Table 12: Definitions: Key Management Activities Identified for which
Performance Information Can Be Most Useful:
Figures:
Figure 1: Number of Adult Drug Court Discretionary Grant Program
Awards Increased 588 Percent from Fiscal Year 2006 Through 2010:
Abbreviations:
BJA: Bureau of Justice Assistance:
DOJ: Department of Justice:
GPRA: Government Performance and Results Act:
GMS: Grants Management System:
MADCE: Multi-Site Adult Drug Court Evaluation:
NADCP: National Association of Drug Court Professionals:
NIJ: National Institute of Justice:
OJP: Office of Justice Programs:
PMT: Performance Measurement Tool:
SAMHSA: Substance Abuse and Mental Health Services Administration:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
December 9, 2011:
The Honorable Patrick Leahy:
Chairman:
The Honorable Chuck Grassley:
Ranking Member:
Committee on the Judiciary:
United States Senate:
The Honorable Lamar Smith:
Chairman:
The Honorable John Conyers, Jr.
Ranking Member:
Committee on the Judiciary:
House of Representatives:
Drug court programs were established beginning in the late 1980s as a
local response to increasing numbers of drug-related cases and
expanding jail and prison populations nationwide. A drug court is a
specialized court-based program that targets criminal offenders who have
alcohol and other drug addiction and dependency problems. Drug courts
have implemented deferred prosecution or post-adjudication case-
processing approaches, or have blended both in their organizational
structures. In drug courts using deferred prosecution, defendants waive
rights to a trial and enter a treatment program shortly after being
charged; those who subsequently fail to complete the treatment program
have their charges adjudicated, while those who complete the program
are not prosecuted further, or have their charges dismissed. In post-
adjudication case processing, defendants are tried and convicted, but
either have deferred sentences or suspensions of incarceration until
they complete or withdraw from the treatment program. The first
approach offers individuals the opportunity to obtain treatment and
avoid the possibility of a felony conviction, while the second
provides a rehabilitation incentive because treatment progress is
factored into the sentencing determination. As of June 2010, there
were over 2,500 drug courts operating throughout the United States, of
which about 1,400 target adult offenders.[Footnote 1] Drug
courts are generally based on a comprehensive model involving:
* offender assessment;
* judicial interaction;
* monitoring (e.g., drug testing) and supervision;
* graduated sanctions and incentives; and:
* treatment services.
The Department of Justice (DOJ), through its Office of Justice
Programs' (OJP) Bureau of Justice Assistance (BJA), administers the
Adult Drug Court Discretionary Grant Program, which provides financial
and technical assistance to states, state courts, local courts, units
of local government, and Indian tribal governments to develop and
implement drug treatment courts.[Footnote 2] The total amount BJA has
awarded in grants through the program increased from about $2 million
in fiscal year 2006 to $29 million in fiscal year 2010, and the number
of grants it has awarded during the same period increased 588 percent.
Pursuant to the Government Performance and Results Act (GPRA), DOJ
requires applicants that receive funding through the program to
provide data that measure the results of their work.[Footnote 3]
In April 2002, we reported that DOJ had not sufficiently managed its
efforts to collect performance measurement and outcome data from
federally funded drug courts.[Footnote 4] We recommended that DOJ take
actions to address these concerns, and DOJ agreed with our
recommendations and took actions in response. Appendix I provides
information on the status of these recommendations. In February 2005,
we studied drug courts again and reported that in most of the 27 drug-
court program evaluations we reviewed, adult drug-court programs led
to recidivism reductions--that is, reductions in new criminal
offenses--during periods of time that generally corresponded to the
length of the drug court program.[Footnote 5] We also reported that
the evidence about the effectiveness of drug court programs in
reducing participants' substance-use relapse was limited and mixed.
[Footnote 6]
This report responds to the Fair Sentencing Act of 2010, which
directed GAO to report on drug court programs.[Footnote 7] We briefed
your offices on our preliminary results on July 18, 2011. This report
includes our final results related to the following questions: (1)
What data does DOJ collect on the performance of federally funded
adult drug courts, and to what extent has it used these data in making
grant-related decisions? And (2) What is known about the effectiveness
of adult drug courts in reducing recidivism and substance-abuse
relapse rates, and what are the costs and benefits of adult drug
courts? In addition, appendix I of this report provides information on
the extent to which DOJ has addressed the recommendations that we made
in 2002 regarding drug court programs.
To address the first question, we analyzed: the reporting guidance and
requirements that BJA provided in fiscal years 2007 through 2011 to
grantees applying for Adult Drug Court Discretionary Grant Program
funds;[Footnote 8] BJA-generated grantee performance data reports from
October to December 2010; and BJA's guides for managing grants and
enforcing grantee compliance that were issued in fiscal year 2011. We
selected 2007 as the starting point for our review because BJA
implemented its Performance Measurement Tool (PMT)--an online
reporting tool that supports BJA grantees' ability to collect,
identify, and report performance-measurement data on activities funded
by the award--in fiscal year 2007. We also reviewed our prior reports and
internal control standards as well as other academic literature
regarding effective performance management practices.[Footnote 9]
Further, we interviewed cognizant BJA officials about the extent to
which they use grantees' performance data when engaging in these
management activities, any challenges faced with ensuring grantee
compliance, ongoing efforts to revise program performance metrics, and
the extent to which BJA's revisions incorporate best practices we
previously identified.[Footnote 10]
To address the second question, we conducted a systematic review of
evaluations of drug court program effectiveness issued from February
2004 through March 2011 to identify what is known about the effect of
drug court programs on the recidivism and relapse of drug-involved
individuals, as well as the costs and benefits of drug courts.[Footnote
11] We also reviewed DOJ's National Institute of Justice (NIJ)-funded
Multi-Site Adult Drug Court Evaluation (MADCE), a 5-year longitudinal
process, impact, and cost evaluation of adult drug courts that was
issued in June 2011, a summary of which we provide in appendix
II.[Footnote 12] We identified the universe of evaluations to include
in our review using a three-stage process. First, we identified
evaluations by searching databases and Web sites. Second, we selected
evaluations of adult drug court programs in the United States that
report recidivism, substance use relapse, and/or costs and benefits.
Third, we screened the selected studies to determine whether each met
criteria for methodological soundness based on generally accepted
social science principles or cost-benefit analysis criteria. From more
than 260 studies in our initial group, we assessed the findings of 44
studies that met our criteria and reported on the effectiveness of 32
drug court programs or sets of programs. See appendix III for
additional details on our scope and methodology.
We conducted this performance audit from November 2010 through
December 2011 in accordance with generally accepted government
auditing standards. Those standards require that we plan and perform
the audit to obtain sufficient, appropriate evidence to provide a
reasonable basis for our findings and conclusions based on our audit
objectives. We believe that the evidence obtained provides a
reasonable basis for our findings and conclusions based on our
objectives.
Background:
Drug court programs are designed to address the underlying cause of an
offender's behavior--alcohol, drug addiction, and dependency problems.
Drug court programs share several general characteristics but vary in
their specific policies and procedures because of, among other things,
differences in local jurisdictions and criminal justice system
practices. In general, judges preside over drug court proceedings,
which are called status hearings; monitor offenders' progress with
mandatory drug testing; and prescribe sanctions and incentives as
appropriate in collaboration with prosecutors, defense attorneys,
treatment providers, and others. Drug court programs vary in terms of
the substance-abuse treatment required. However, most programs offer a
range of treatment options and generally require a minimum of 1 year
of participation before an offender completes the program.
Practices for determining defendants' eligibility for drug court
participation vary across drug court programs, but typically involve
screening defendants for their criminal history, current case
information, whether they are on probation, and their substance use,
which can include the frequency and type of use, prior treatment
experiences, and motivation to seek treatment. In 2005, we reported
that, based on the literature we reviewed, eligible drug-court program
participants ranged from nonviolent offenders with substance
addictions charged with drug-related offenses to medium-risk
defendants with fairly extensive criminal histories who had failed in
prior substance-abuse treatment. Appendix IV
presents additional information about the general characteristics of
drug court programs. As shown in appendix V, BJA, in collaboration
with the National Association of Drug Court Professionals (NADCP),
identified The Key Components, which describes the basic elements that
define drug courts and offers performance benchmarks to guide
implementation.[Footnote 13]
BJA administers the Adult Drug Court Discretionary Grant Program to
provide financial and technical assistance to states, state courts,
local courts, units of local government, and Indian tribal governments
to develop and implement drug treatment courts.[Footnote 14] Through
the Adult Drug Court Discretionary Grant Program, BJA offers funding
in four broad drug-court grant categories. See appendix VI for a more
detailed discussion on each of the following grant categories.
* Implementation grants: Available to jurisdictions that have
completed a substantial amount of planning and are ready to implement
an adult drug court.
* Enhancement grants: Available to jurisdictions with a fully
operational (at least 1-year) adult drug court.
* Statewide grants: Available for two purposes: (1) To improve,
enhance, or expand drug court services statewide through activities
such as training and/or technical assistance programs for drug court
teams and (2) To financially support drug courts in local or regional
jurisdictions that do not currently operate with BJA Adult Drug Court
Discretionary Grant Program funding.
* Joint grants: In fiscal year 2010, BJA, in collaboration with the
Department of Health and Human Services, Substance Abuse and Mental
Health Services Administration (SAMHSA), offered a joint grant program
for the enhancement of adult drug court services, coordination, and
substance-abuse treatment capacity.[Footnote 15]
From fiscal years 2006 through 2010, Congress appropriated about $120
million for DOJ's administration of all drug court programs.[Footnote
16] Of this amount, $76 million was used for the Adult Drug Court
Discretionary Grant Program, which includes funding provided to
grantees through the previously mentioned grant categories. The grant
award totals for the Adult Drug Court Discretionary Grant Program
increased from $2 million in fiscal year 2006 to $29 million in fiscal
year 2010.[Footnote 17] Correspondingly, the number of Adult Drug
Court Discretionary Grant Program awards increased from 16 in fiscal
year 2006 to 110 in fiscal year 2010--an increase of 588 percent, as
shown in figure 1.[Footnote 18]
Figure 1: Number of Adult Drug Court Discretionary Grant Program
Awards Increased 588 Percent from Fiscal Year 2006 through 2010:
[Refer to PDF for image: vertical bar graph]
Fiscal year: 2006;
Grant awards: 16.
Fiscal year: 2007;
Grant awards: 19.
Fiscal year: 2008;
Grant awards: 42.
Fiscal year: 2009;
Grant awards: 95.
Fiscal year: 2010;
Grant awards: 110.
Source: GAO analysis of Bureau of Justice Assistance data.
[End of figure]
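The 588 percent figure follows directly from the award counts in figure 1: grants grew from 16 in fiscal year 2006 to 110 in fiscal year 2010. The arithmetic can be checked with a short illustrative sketch (not part of the GAO report):

```python
# Adult Drug Court Discretionary Grant Program awards by fiscal year,
# as listed in figure 1.
awards = {2006: 16, 2007: 19, 2008: 42, 2009: 95, 2010: 110}

first, last = awards[2006], awards[2010]
# Percent increase = (new - old) / old * 100
percent_increase = (last - first) / first * 100

print(percent_increase)         # 587.5
print(round(percent_increase))  # 588, the figure GAO reports
```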
Drug courts have been difficult to evaluate, however, because they
vary so widely, and the resources required to conduct a study that
would support conclusions about their effectiveness can be
substantial. In
particular, while drug courts generally adhere to certain key program
components, drug courts can differ in factors including admission
criteria, type and duration of drug treatment, degree of judicial
monitoring and intervention, and application of sanctions for
noncompliance. In February 2005, we studied drug courts and reported
that in most of the 27 drug-court program evaluations we reviewed,
adult drug court programs led to recidivism reductions during periods
of time that generally corresponded to the length of the drug court
program.[Footnote 19] Several syntheses of multiple drug court program
evaluations, conducted in 2005 and 2006, also concluded that drug
courts are associated with reduced recidivism rates, compared to
traditional correctional options. However, the studies included in
these syntheses often had methodological limitations, such as the lack
of equivalent comparison groups and the lack of appropriate
statistical controls.[Footnote 20]
BJA Is Expanding Use of Grantee Performance Data but Could Enhance
Processes as It Continues to Refine Performance Measures:
BJA's Ongoing Data Collection Efforts:
BJA collects an array of performance data from its adult drug court
grantees through its Performance Measurement Tool (PMT) and OJP's
Grants Management System (GMS).
[Side bar:
The Performance Measurement Tool:
(PMT) is an online reporting tool that supports BJA grantees' ability
to collect, identify, and report performance-measurement data on
activities funded by the award. OJP's Grants Management System (GMS)
is an online system designed to make the grant application process
easier and more efficient for grantees. GMS allows grantees to fill
out forms and submit application materials online.
Source: BJA. End of side bar]
Since fiscal year 2008, BJA has required grantees to submit
quantitative performance data on a quarterly basis and qualitative
performance information on a semi-annual basis. The quantitative
information grantees submit to BJA varies depending on the type of
grant awarded. For example, information that BJA can calculate based
on what Implementation grantees have been required to submit quarterly
includes "the percent of drug court participants who exhibit a
reduction in substance use during the reporting period," "the percent
of program participants who re-offended while in the drug court
program," and "the number and percent of drug court graduates."
Information that BJA can calculate based on what Enhancement grantees
have been required to submit includes "the increase in units of
substance-abuse treatment services" and "the percent increase in
services provided to participants." In addition to the quarterly
reporting of quantitative performance data, all adult drug court
grantees must submit progress reports semi-annually. As part of these
progress reports, grantees provide qualitative or narrative responses
to seven questions. Table 1 shows the seven questions to which
grantees must submit narrative responses when completing their semi-
annual reports.
Table 1: List of the Seven Questions to Which Adult Drug Court
Grantees Must Submit Narrative Responses:
1. What goals were accomplished, as they relate to your grant
application?
2. What problems/barriers did you encounter, if any, within the
reporting period that prevented you from reaching your goals or
milestones?
3. Is there any assistance that BJA can provide to address any
problems/barriers identified in question number two above?
4. Are you on track to fiscally and programmatically complete your
program as outlined in your grant application?
5. What major activities are planned for the next 6 months?
6. What were your accomplishments within this reporting period?
7. Based on your knowledge of the criminal justice field, are there
any innovative programs/accomplishments that you would like to share
with BJA?
Source: BJA.
[End of table]
Recent Steps to Improve Use of Performance Data:
BJA officials told us that grant managers regularly review individual
grantees' quarterly performance data and semi-annual progress reports
and use this information to determine whether additional training or
technical assistance could improve their performance. However,
according to BJA officials, resource constraints in the past had
prevented staff from fully analyzing the performance data BJA collects
from all adult drug court grantees--specifically the analysis of
grantees' answers to the seven narrative questions--to identify more
effective program approaches and processes to share with the drug
court community. In early fiscal year 2011, BJA officials initiated a
new process called GrantStat to maximize the use of performance
information by leveraging the resources of other BJA divisions,
[Footnote 21] BJA's training and technical assistance partners, its
contractor, and other key stakeholders.[Footnote 22] GrantStat
provides an analytical framework to assess grantee performance data
and other relevant information on a semi-annual basis to determine the
effectiveness of the grant programs in BJA's portfolio.
In September 2011, BJA officials applied GrantStat to a review of the
Adult Drug Court Discretionary Grant Program. As part of the process,
they collected, reviewed, and analyzed performance data and other
relevant information from a cohort of Implementation grantees to
determine the overall effectiveness of the adult drug court program
and to identify grantees that might need additional technical
assistance to improve their outcomes. BJA officials told us that as
part of the GrantStat review, they and their technical assistance
provider's staff reviewed selected Implementation grantees' responses
to the seven narrative questions and discussed common issues they each
identified. For example, BJA identified that a number of grantees had
lower-than-expected capacity because drug court stakeholders (e.g.,
district attorneys) were referring fewer drug-involved defendants to
these drug courts. BJA also reported reviewing and discussing other
qualitative information, such as the training and technical assistance
provider's site-visit reports, to determine grantees' fidelity to the
10 key components.[Footnote 23] BJA officials acknowledged that prior
to GrantStat, they had not leveraged the summary data that their
technical assistance providers had previously compiled from grantees'
narrative responses to these seven questions and indicated that future
iterations of GrantStat would continue to include both qualitative and
quantitative performance data reviews.
Our prior work has emphasized the importance of using performance data
to inform key decisions[Footnote 24] and underscored that performance
measures can be used to demonstrate the benefits of a program or
identify ways to improve it.[Footnote 25] In addition, we have
reported that effective performance measurement systems include steps
to use performance information to make decisions. In doing so, program
managers can improve their programs and results.[Footnote 26]
Recognizing that BJA is working through GrantStat to improve its use
of performance data in managing the drug court program, we identified
six management activities for which performance information can be
most useful to decision makers and benchmarked BJA's practices against
them.[Footnote 27] The six activities are: (1) setting program
priorities, (2) allocating resources, (3) adopting new program
approaches, (4) identifying and sharing with stakeholders more
effective program processes and approaches, (5) setting expectations
for grantees, and (6) monitoring grantee performance. See appendix VII
for the definitions of the six management activities. As illustrated in
table 2, BJA has current and planned efforts underway across all six
activities.
Table 2: Types of Information BJA Officials Reported Using or Planning
to Use When Performing Key Management Activities for the Adult Drug
Court Grant Program:
Key management activities GAO identified: 1. Setting program
priorities;
Current or prior use of grantee performance data: BJA officials
reported using a range of information when setting program priorities,
including NIJ-sponsored research, other drug court evaluations,
NADCP's annual problem-solving court census, input from state drug
court coordinators, and grantee quantitative performance data;
Planned or proposed use of grantee data: BJA officials stated that
they have finalized grantees' quantitative performance measures and
plan to use GrantStat to identify the most effective grantees and
their common characteristics. They told us that through GrantStat they
plan to systematically assess performance information to prioritize
which types of drug courts BJA should fund in future grant
solicitations.
Key management activities GAO identified: 2. Allocating resources;
Current or prior use of grantee performance data: BJA officials
reported regularly using grantees' quantitative performance data when
deciding the level of funding to be allocated toward technical
assistance annually and the types of technical assistance grantees
need to improve their performance;
Planned or proposed use of grantee data: BJA officials reported that the
revised quantitative performance measures will allow BJA to analyze
information across all grant categories (e.g., Enhancement,
Implementation, and Statewide) to determine how grantees are
performing relative to one another and then allocate funding and other
resources accordingly.
Key management activities GAO identified: 3. Adopting new program
approaches or changing work processes;
Current or prior use of grantee performance data: BJA officials
reported that they use grantees' quantitative data to revise training
courses for the program and in drafting the program's grant
solicitations;
Planned or proposed use of grantee data: BJA officials stated that the
revised quantitative measures will allow them to conduct more
sophisticated analyses through GrantStat. As a result, BJA officials
expect to be able to identify not only the grantees that are
underperforming, but also the reasons why, and then target the
appropriate technical assistance to those in the most need. For
example, BJA officials reported that the revised measures will help
determine the extent to which grantees have adopted evidence-based
practices, such as the seven design features highlighted in the MADCE
study.[A]
Key management activities GAO identified: 4. Identifying and sharing
with stakeholders more effective program processes and approaches;
Current or prior use of grantee performance data: According to BJA
officials, because of resource constraints, BJA had been unable to
conduct analyses across all grantees' responses to the seven narrative
questions in their semi-annual progress reports. As a result, the
officials had not used this qualitative data when carrying out this
activity. Instead, they reported using information gathered in site
visits, desk reviews, and technical assistance assessments, as well as
MADCE and other NIJ-sponsored research and drug court evaluations to
identify effective drug court processes and procedures. BJA officials
stated that information from NIJ-sponsored research and drug court
evaluations is disseminated to stakeholders through the BJA-NIJ
Research to Practice initiative[B];
Planned or proposed use of grantee data: BJA officials stated that
GrantStat will address BJA's difficulties with collectively analyzing
grantee performance data on a regular basis by leveraging internal and
external resources. They also stated that future GrantStat reviews
will allow BJA to identify high-performing grantees and share their
success stories with other grantees.
Key management activities GAO identified: 5. Setting expectations for
grantees;
Current or prior use of grantee performance data: BJA officials said
that they have been unable to use adult drug-court program
grantees' quantitative performance data to set grantees' expectations
because the measures lacked benchmarks against which to gauge grantee
performance;
Planned or proposed use of grantee data: BJA's revised quantitative
performance measures include benchmarks and other performance
indicators, allowing BJA to use grantees' data to establish the targets
and goals that grantees are expected to achieve. According to BJA
officials, some of these performance indicators were established as
part of GrantStat's first review of the program and will be
communicated in the 2012 Adult Drug Court Discretionary Grant Program
solicitation announcement. These indicators are based on grantee
cohort averages, individual grantees' 4-year data averages, and adult
drug court averages obtained from adult drug court research.
Key management activities GAO identified: 6. Monitoring grantee
performance;
Current or prior use of grantee performance data: According to BJA
officials, they have analyzed individual grantees' performance data on
a regular basis and provided training and technical assistance as
warranted. However, according to BJA officials, because of resource
constraints, BJA had been unable to conduct analyses across all
grantees' responses to the seven narrative questions in their semi-
annual progress reports. As a result, they had not used this
qualitative information when carrying out this activity. They also
acknowledged that prior to GrantStat, they had not leveraged the
summary data that their technical assistance providers prepared based on
grantees' responses, despite recognizing its utility;
Planned or proposed use of grantee data: BJA officials reported that the
revised quantitative measures will improve BJA's ability to compare
grantees' performance results with established targets and goals to
determine the extent to which grantees have met them and, if
necessary, to target program resources (e.g., technical assistance) to
improve underperforming grantees' performance. BJA officials also told
us that GrantStat's review included an assessment of the narrative
responses and would continue to include it in the future.
Source: GAO analysis of types of information BJA officials reported
using when performing management activities.
[A] The seven design features include: (1) screening and assessment,
(2) target population, (3) procedural and distributive justice
behavior, (4) judicial interaction, (5) monitoring, (6) treatment and
other services, and (7) relapse prevention and community integration.
According to DOJ officials, the seven principles were developed with
NIJ on the basis of MADCE, and other rigorous research studies. The
language used to describe the seven principles was determined in
consultation with BJA's drug court training and technical assistance
providers.
[B] The Adult Drug Court Research to Practice Initiative is a joint
partnership between the National Center for State Courts and the
Justice Programs Office of the School of Public Affairs at American
University, with the purpose of disseminating information to drug
court practitioners about current research relevant to the operations
and services of adult drug courts. The initiative was co-funded by BJA and
NIJ.
[End of table]
According to BJA officials, after the GrantStat review, they
identified trends and developed several potential findings and action
items for program design changes. However, BJA officials added that
since the action items originated from GrantStat's first review, they
are not implementing them immediately. Instead, BJA plans to evaluate
the action items over the next 6 months to ensure they are feasible
and effective alternatives for improving grantee outcomes. We are
encouraged by BJA's recent efforts to regularly analyze grantee
performance data to determine whether the program is meeting its
goals. We also are encouraged that BJA is using this information to
better inform its grant-related management activities, such as setting
program priorities, identifying and sharing effective processes and
approaches, and setting expectations for grantees.
BJA Recently Revised Its Drug Court Performance Measures:
During the course of our review, BJA revised its adult drug court
program performance measures to improve their reliability and
usefulness. BJA provided us with the revised measures on October 28,
2011. According to BJA officials, unclear definitions of some of the
previous measures confused grantees about what data elements they were
expected to collect. For example, officials told us that grantees may
have been confused about how to measure "the number of participants
admitted" and "the number of drug court participants." Specifically,
BJA officials added that their analysis of several years of data showed
that some grantees reported the same number for both measures, some
reported more participants than were admitted, a few reported fewer
participants than the number admitted, and some reported the two
measures in each of these three ways over multiple reporting periods.
According to BJA officials, such variability made these measures
unreliable and hindered BJA from comparing grantee performance data
across cohorts.
BJA's performance measure revisions resulted in the following:
* All grantees are required to report on "participant level" measures.
Examples of these measures include the demographic make-up of their
drug court participant populations, the amount of services provided to
their participants, and the geographic location of their drug courts;
* Enhancement, Joint, and Statewide grantees are required to report on
participant level outcomes, such as graduation rates, to ensure
consistency with measures BJA collects from Implementation grantees;
* Measures previously excluded from the PMT, such as retention rates
and outcomes of participants once they complete the drug court
program, are now included;
* BJA has established two sets of benchmarks as points of reference
against which to gauge grantees' performance. The first set of
benchmarks requires a comparison of grantees' performance against
averages of drug court performance derived from research. The second
set of benchmarks requires a comparison of grantees' performance to
historical performance data reported to BJA by adult drug court
grantees; and:
* BJA revised the descriptions and the definitions of the measures to
help ensure their clarity.
To revise the performance measures, BJA officials consulted with
technical assistance providers and a drug court researcher to discuss
possible improvements to the performance measures, reviewed drug court
literature, and reviewed and analyzed BJA grantees' clarification and
information requests to identify the most common problems adult drug
court grantees historically experienced in submitting performance
information to BJA.[Footnote 28] In addition, BJA obtained comments on
the proposed measures from BJA staff and other DOJ stakeholders, as
well as Enhancement, Implementation, Joint, and Statewide grantees.
[Footnote 29] BJA officials also invited all current grantees to
participate in four teleconferences to obtain their feedback on the
feasibility of collecting and reporting the new measures and their
suggestions to improve the clarity of the measures' definitions and
descriptions. BJA officials finalized the new measures in October 2011
and plan to closely monitor grantees' performance data submissions to
ensure the reliability and usefulness of the measures and then revise
them as necessary after the first reporting period. BJA officials also
stated that they expected to review the measures' overall reliability
and validity after the first reporting period--October 1, 2011,
through December 30, 2011.
BJA officials reported that the revised measures will strengthen the
reliability and improve the usefulness of grantee performance data in
making grant-related decisions. For example, BJA officials stated that
reliable and useful data would help them to identify the most
effective grantees and common characteristics these courts share to
inform the types of drug courts the officials choose to fund in future
grant solicitations. BJA officials also reported that as a result of
the revision, they expect to be able to conduct more sophisticated
analyses using GrantStat that are needed to inform grant-related
decisions. For example, BJA officials told us that implementing
benchmarks and participant level measures will enable the agency to
compare similar drug courts (e.g., large urban jurisdictions of
similar size, demographic make-up, and geographic context) to one
another and across jurisdictions, thereby improving BJA's
understanding of grantees' impact on the populations they serve.
BJA Could Enhance Two Key Practices as It Continues to Review and
Revise Its Adult Drug Court Performance Measures:
BJA's process to revise its performance measures generally adhered to
some of the key practices that we have identified as important to
ensuring that measures are relevant and useful to decision-making.
These key practices included obtaining stakeholder involvement
[Footnote 30] and ensuring that the measures have certain key
attributes, such as clarity.[Footnote 31] The key practices also
describe the value of testing the measures to ensure that they are
credible, reliable, and valid[Footnote 32] and documenting key steps
throughout the revision process.[Footnote 33] However, BJA could take
actions to improve its efforts in these latter two areas. For
instance, BJA officials told us that after grantees' first reporting
period concludes, they plan to assess the data that grantees submitted
to ensure that the measures produce reliable and useful data over at
least the first quarter of fiscal year 2012 and, if necessary, to then
further revise the measures. Nevertheless, BJA officials have not
documented how they will determine whether the measures were
successful or whether changes would be needed. In addition, BJA
officials did not record the key methods and
assumptions used to guide their revision efforts, such as the feedback
stakeholders provided and BJA's disposition of these comments. For
example, BJA officials provided a document generally showing the
original performance measure; whether it was removed, revised, or
replaced; and BJA's justification for the action, but this document
did not demonstrate how BJA had incorporated the stakeholder feedback
it considered when making its decisions. The document also did not
include a link to a new performance measure in instances where an
older one was being replaced. Further, BJA's justification did not
include the rationale for the changes it made to 22 of the 51
performance measures. According to BJA officials, they did not
document their decisions in this way because of the rapid nature of
the revision process and limited staff resources. They also told us
that maintaining such documentation and providing it to stakeholders
held little value.
Our previous work has shown the importance of documentation to the
successful development of effective performance measures.[Footnote 34]
In the past, we have reported that revising performance measures
involves a number of aspects that need to be carefully planned and
carried out and that, by documenting the steps undertaken in
developing and implementing the revised measures, agencies can be
better assured that their revisions result in effective performance
measures.[Footnote 35]
In addition, academic literature on the best practices for developing
effective performance measures states that agencies should develop
products to document and guide their revision efforts. These products,
among other things, can include plans for ensuring the quality and
integrity of the data for full-scale implementation of the measures.
[Footnote 36] Further, Standards for Internal Control in the Federal
Government call for clear documentation of significant events, which
can include assumptions and methods surrounding key decisions, and
this documentation should be readily available for examination.
[Footnote 37] As BJA moves forward in assessing the revised measures
and implementing any additional changes it deems necessary, it could
better ensure that its efforts result in successful and reliable
metrics, and are transparent, by documenting the key methods used to
guide its revision efforts and its assessment of the measures. This would also
help bolster the integrity of its decisions.
Drug Courts Were Associated with Lower Recidivism and Relapse Rates
for Program Participants Than Criminal Courts:
In the evaluations we reviewed, adult drug-court program participation
was generally associated with lower recidivism. Our analysis of
evaluations reporting recidivism data for 32 programs showed that drug
court program participants were generally less likely to be re-
arrested than comparison group members drawn from the criminal court
system, although the differences were reported to be statistically
significant in 18 of the 32 programs.[Footnote 38] Across the studies
showing re-arrest differences, the percentages of drug court program
participants re-arrested were 6 to 26 percentage points lower than
those of comparison group members. One program did not show a lower
re-arrest rate for all drug-court program participants relative to the
comparison group within 3 years of entry into the program, although
that study did show a lower re-arrest rate for drug court participants
who had completed the program than for members of the comparison
group. In general, the evaluations we reviewed found larger
differences in re-arrest rates between drug-court program completers
and members of the comparison group than between all drug-court
program participants and the comparison group members. The re-arrest
rates for program completers ranged from 12 to 58 percentage points
below those of the comparison group.[Footnote 39] The completion rates
reported in the evaluations we reviewed ranged from 15 percent to 89
percent.
Included among the evaluations we reviewed was the MADCE, a 5-year
longitudinal process, impact, and cost evaluation of adult drug
courts. The MADCE reported a re-arrest rate for drug court
participants that was 10 percentage points below that of the
comparison group; specifically, 52 percent of drug court participants
were re-arrested after the initiation of the drug court program, while
62 percent of the comparison group members were re-arrested.[Footnote
40] However, the 10-percentage-point difference between these re-arrest
rates for the samples of drug court participants and comparison group
members was not statistically significant. The MADCE study also
reported that drug court participants were significantly less likely
than the comparison group to self-report having committed crimes when
they were interviewed 18 months after the baseline (40 percent vs. 53
percent), and drug court participants who did report committing crimes
committed fewer of them than comparison group members did.
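The significance finding above reflects a standard comparison of two proportions. As a rough illustrative sketch only (the sample sizes below are hypothetical, and the simple pooled z-test shown is far less involved than MADCE's actual multisite analysis), a two-proportion z-test shows how a 10-percentage-point gap in re-arrest rates can fall short of statistical significance when samples are modest:

```python
from math import sqrt

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-statistic using the pooled proportion."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical group sizes of 150 each; MADCE's design and modeling
# differed, so this only illustrates the general mechanism.
z = two_proportion_z(0.62, 150, 0.52, 150)  # comparison group vs. participants
print(round(z, 2))  # z is about 1.75, below the 1.96 cutoff for p < .05
```

With these hypothetical sample sizes the z-statistic falls below 1.96, so the 10-percentage-point difference would not be deemed statistically significant at the conventional .05 level, even though the observed rates differ.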
We assigned a numerical rating to each evaluation to reflect the
quality of its design and the rigor of the analyses conducted. Our
methodology for rating the evaluation studies is detailed in appendix
III. After assigning the rating, we grouped the studies into two
tiers. Tier 1 studies were the most carefully designed and
incorporated substantial statistical rigor in their analyses. Tier 2
studies, while still meeting our basic criteria for methodological
soundness, were relatively less rigorous in their design and analyses.
Both tier 1 and tier 2 studies reported differences between drug court
participants and comparison group members, and both sets of studies
found that some but not all differences were statistically
significant.[Footnote 41]
Table 3 shows whether a difference in recidivism rates was reported
for each program--expressed as the difference in the rate of re-arrest
between all drug court program participants and the comparison group.
In some cases the difference in recidivism was reported as something
other than a difference in the re-arrest rate, such as a difference in
the number of arrests or the relative odds of an arrest. In those
cases, table 3 notes that a difference was reported, but does not
include the difference in re-arrest rates. For example, the evaluation
of the Queens Misdemeanor Treatment Court reported that the re-arrest
rate for program participants was 14 percentage points lower than the
re-arrest rate of comparison group members up to 2 years after
participants entered into the program, and 10 percentage points lower
at 3 or more years after entry. In contrast, the evaluation of the
Hillsborough County Adult Drug Court reported a statistically
significant difference in the relative odds of an arrest after drug
court program enrollment but did not report the difference in
re-arrest rates; therefore, table 3 indicates a statistically
significant reduction in re-arrest rates but does not show the
difference in rates.
Table 3: Differences in Reported Rearrest Rates between Drug Court
Program Participants and Comparison Group Members:
Tier 1 Evaluations:
Drug court program (state): Breaking the Cycle Program (Florida);
Reduction reported? Yes;
Reduction statistically significant?[A]: No;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: -10%[B];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Hillsborough County Adult Drug Court
(Florida);
Reduction reported? Yes[C];
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Baltimore City Drug Treatment Court
(Maryland);
Reduction reported? Yes;
Reduction statistically significant?[A]: No;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: -9%; -8%[D].
Drug court program (state): Queens Misdemeanor Treatment Court (New
York);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -14%*;
3 or more years after entry: -10%*.
Drug court program (state): Multnomah County Drug Court (Oregon);
Reduction reported? Yes[E];
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): MADCE (Multiple States);
Reduction reported? Yes;
Reduction statistically significant?[A]: No;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -10%;
3 or more years after entry: [Empty].
Drug court program (state): Breaking the Cycle Program (Washington);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: -10%*[F];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Tier 2 Evaluations:
Drug court program (state): Multiple Drug Courts (California);
Reduction reported? Yes;
Reduction statistically significant?[A]: Not reported;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: -12%.
Drug court program (state): Sacramento Drug Court (California);
Reduction reported? Yes;
Reduction statistically significant?[A]: Not reported;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -20%;
3 or more years after entry: [Empty].
Drug court program (state): Guam Adult Drug Court (Guam);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes[G];
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: -20%.
Drug court program (state): Ada County Drug Court (Idaho);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: -25%*[H].
Drug court program (state): Multiple Drug Courts (Idaho);
Reduction reported? Yes[I];
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Monroe County Drug Treatment Court
(Indiana);
Reduction reported? Yes;
Reduction statistically significant?[A]: Not reported;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -16%;
3 or more years after entry: [Empty].
Drug court program (state): St. Joseph County Drug Court (Indiana);
Reduction reported? Yes;
Reduction statistically significant?[A]: Not reported;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -16%;
3 or more years after entry: [Empty].
Drug court program (state): Vanderburgh County Day Reporting Drug
Court (Indiana);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes[J];
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -9%;
3 or more years after entry: [Empty].
Drug court program (state): Baltimore City Circuit Court Adult Drug
Treatment Court and Felony Diversion Initiative (Maryland);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes[J];
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Baltimore City District Court Adult Drug
Treatment Court (Maryland);
Reduction reported? Yes[K];
Reduction statistically significant?[A]: No;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Harford County District Court Adult Drug
Court (Maryland);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes[J];
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Howard County District Court Drug
Treatment Court (Maryland);
Reduction reported? Yes;
Reduction statistically significant?[A]: No;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: -13%;
Up to 2 years after entry: -10%;
3 or more years after entry: [Empty].
Drug court program (state): Montgomery County Adult Drug Court
(Maryland);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: -26%*;
Up to 2 years after entry: -19%*;
3 or more years after entry: [Empty].
Drug court program (state): Prince George's County Circuit Court Adult
Drug Court (Maryland);
Reduction reported? No;
Reduction statistically significant?[A]: No;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: 0%;
3 or more years after entry: +1%.
Drug court program (state): Wicomico County Circuit Court Adult Drug
Treatment Court (Maryland);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: -24%*;
Up to 2 years after entry: -18%;
3 or more years after entry: [Empty].
Drug court program (state): Suffolk County Drug Court (Massachusetts);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -6%*;
3 or more years after entry: [Empty].
Drug court program (state): Barry County Adult Drug Court (Michigan);
Reduction reported? Yes;
Reduction statistically significant?[A]: Not reported;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -24%;
3 or more years after entry: [Empty].
Drug court program (state): Kalamazoo County Adult Drug Treatment
Court (Michigan);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes[J];
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -14%;
3 or more years after entry: [Empty].
Drug court program (state): Unnamed Drug Court (Midwest);
Reduction reported? Yes[L];
Reduction statistically significant?[A]: No;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Kings County District Attorney's Office
Drug Treatment Alternative to Prison Program (New York);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: -22%*;
Up to 2 years after entry: -18%*;
3 or more years after entry: -22%* (3 yrs) -26%* (4 yrs).
Drug court program (state): Multiple Drug Courts (Ohio);
Reduction reported? Yes[M];
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -18%*;
3 or more years after entry: [Empty].
Drug court program (state): Multnomah County Clean Court (Oregon);
Reduction reported? Yes;
Reduction statistically significant?[A]: Not reported;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: -16%;
Up to 2 years after entry: [Empty];
3 or more years after entry: [Empty].
Drug court program (state): Marion County Adult Drug Court (Oregon);
Reduction reported? Yes;
Reduction statistically significant?[A]: Not reported;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: -14%;
3 or more years after entry: [Empty].
Drug court program (state): Multiple Drug Courts (Oregon);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: -12%*.
Drug court program (state): Rutland County Adult Drug Court (Vermont);
Reduction reported? Yes;
Reduction statistically significant?[A]: Yes;
Percentage point difference in rate of re-arrest where reported, time
frame covered:
Up to 1 year after entry: [Empty];
Up to 2 years after entry: [Empty];
3 or more years after entry: -23%*.
Source: GAO analysis of drug court program evaluations.
*The difference was reported to be statistically significant.
[A] Indicates whether a reduction in any measure of recidivism
reported by the study was statistically significant.
[B] Study reported a difference in recidivism of -9% for self-reported
criminal acts.
[C] Study reported a statistically significant difference in the
relative odds of an arrest between 12 and 18 months after drug court
program enrollment.
[D] Reflects findings from 2 studies of the same drug court by the
same author. See Gottfredson et al.
[E] Study reported a 17% reduction in re-arrest rates over 5 years,
but the percentage point difference in rates was not reported.
[F] Study reported a difference in recidivism of -15% for self-
reported criminal acts.
[G] Study reported a statistically significant difference in the
average number of arrests resulting in court filings after program
initiation between the treatment and comparison groups, but did not
report the significance of the difference in rates of arrest resulting
in court case filings.
[H] Study reported a statistically significant difference in the rate
of new court filings following program initiation. Statistical
controls, if any, were not presented.
[I] Study reported a statistically significant difference in the rate
of new court filings following program initiation for program
completers. Statistical controls, if any, were not presented.
[J] While the differences in re-arrest rates were not reported as
significant, or were not reported at all, the study did report a
statistically significant difference in the average number of re-
arrests between the treatment and comparison groups.
[K] Re-arrest rates for drug court participants were lower than for
comparison group members for some years over a 10-year period, but
none of the differences was statistically significant.
[L] Study reported a difference in re-arrest rates of -12% for program
completers vs. the comparison group, which was not statistically
significant.
[M] Findings reflect a comparison group that combines multiple courts.
[End of table]
The evaluations we reviewed showed that adult drug-court program
participation was also associated with reduced drug use. Our analysis
of evaluations reporting relapse data for eight programs showed that
drug court program participants were less likely than comparison group
members to use drugs, based on drug tests or self-reported drug use,
although the difference was not always statistically
significant.[Footnote 42] This
was true for both within-program and post-program measures, and
whether drug use was reported as the difference in the frequency of
drug use or the proportion of the treatment and comparison groups who
used drugs.
The MADCE concluded that drug courts produce significant reductions in drug
relapse. Specifically, MADCE reported that "drug court participants
were significantly less likely than the comparison group to report
using any drugs (56 vs. 76 percent) and also less likely to report
using 'serious' drugs (41 vs. 58 percent), which omit marijuana and
'light' alcohol use (fewer than four drinks per day for women or less
than five drinks per day for men). On the 18-month oral fluids drug
test, significantly fewer drug court participants tested positive for
illegal drugs (29 vs. 46 percent). Further, among those who tested
positive or self-reported using drugs, drug court participants used
drugs less frequently than the comparison group." Regarding relapse
after drug court programs end, the MADCE concluded that drug court
participation, less frequent drug use among offenders prior to arrest,
and the absence of mental health problems were the strongest
predictors of avoiding relapse. Table 4 summarizes the drug-use
relapse results reported in the evaluations we reviewed.
Table 4: Drug Use Relapse Results of Evaluations GAO Reviewed:
Drug tests:
Drug court program (state): Barry County Adult Drug Court (Michigan);
Results of drug use relapse: Reduction within-program: Drug court
participants generally had fewer positive drug test results than the
comparison group in the 12-month period following program initiation.
Drug court program (state): St. Joseph County Drug Court (Indiana);
Results of drug use relapse: Reduction within-program: At all times
during a 12-month period following program initiation, the drug court
participants had a lower percentage of positive urine drug screens
than the comparison group sample.
Drug court program (state): Vanderburgh County Day Reporting Drug
Court (Indiana);
Results of drug use relapse: Reduction within-program: At all times
during the 12-month period following program initiation, the drug
court participants had a lower percentage of positive urine drug
screens than the comparison group.
Drug court program (state): Methamphetamine Treatment Project
(California);
Results of drug use relapse: Reduction within-program: Drug court
participants provided a significantly higher proportion of clean urine
samples than comparison participants, 97.3 percent versus 90.5
percent, respectively.
Drug court program (state): Methamphetamine Treatment Project
(California);
Results of drug use relapse: Reduction post-program: There were
substantial reductions in methamphetamine use over time for drug court
participants compared to non-drug court comparison participants at
program completion, and at 6 and 12 months following program
completion.
Self-reported drug use:
Drug court program (state): Baltimore City Drug Treatment Court
(Maryland);
Results of drug use relapse: Reduction post-program: Participants in
the treatment group had about 27 fewer days of alcohol use, 19 fewer
days of cocaine use, and 28 fewer days of heroin use on average
compared to the control group. The differences were statistically
significant only for the effect on cocaine. Among participants
originating in the circuit court, the average number of days of
cocaine use was 50 days lower in the treatment group than the control
group.
Drug court program (state): Breaking the Cycle (Florida);
Results of drug use relapse: Reduction within-program: Approximately 9
months after the initial arrest, drug court participants were less
likely to report drug use in the past 30 days (24 percent) than the
comparison group (33 percent); however, this difference was not
significant at the 95 percent level.
Drug court program (state): Breaking the Cycle (Washington);
Results of drug use relapse: Reduction within-program: Approximately 9
months after the initial arrest, there was no significant difference
between the percentage of drug court participants (50 percent) and the
comparison group (51 percent) who self-reported drug use in the past 30
days.
Both drug tests and self-reported use:
Drug court program (state): MADCE (Multiple States);
Results of drug use relapse: Reduction 18 months after program
initiation: 56 percent of drug court participants reported using any
drugs compared with 76 percent among the comparison group;
41 percent of the drug court participants reported using "serious"
drugs (not marijuana or light alcohol use) compared with 58 percent
among the comparison group. 29 percent of drug court participants
tested positive for illegal drugs compared with 46 percent of the
comparison group. Among those who tested positive or self-reported
using drugs, drug court participants used drugs less frequently than
the comparison group.
Source: GAO analysis of adult drug court program evaluations.
[End of table]
Drug Court Programs Were Associated with Both Positive and Negative
Net Benefits:
Of the studies we reviewed, 11 included sufficient information to
report a net benefit figure. Across these studies, the net benefit
ranged from positive $47,852 to negative $7,108 per participant.
benefit is the monetary benefit of reduced recidivism accrued to
society from the drug court program through reduced future
victimization and justice system expenditures, less the net costs of
the drug court program--that is, the cost of the program less the cost
of processing a case in criminal court. A negative net benefit value
indicates that the costs of the drug court program outweigh its
estimated benefits and that the program was not found to be cost
beneficial. Eight of the studies reported positive net benefits--the
benefits estimated to accrue from the drug court program exceeded the
program's net costs. Three of the 11 studies reported negative net
benefits. We did not attempt to determine whether the differences in
the reported values were because of differences in study methodology
or the attributes of the drug courts themselves. The environment in
which the drug court operates may also be important. For example, the
largest net benefit reported was for Kings County, in which members of
the comparison group were incarcerated, in contrast to other programs
in which members of the comparison group were given probation, which
is less costly. The more costly the alternative, such as
incarceration, the more likely a drug court will have positive net
benefits. In this case, the study reported that society would accrue
$47,852 in benefits relative to conventional court processing.
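The net benefit arithmetic described above can be illustrated with a short sketch. All dollar figures below are hypothetical, chosen only to show why a more costly alternative such as incarceration makes a positive net benefit more likely; they are not taken from any of the studies reviewed.

```python
def net_benefit(victimization_savings, justice_system_savings,
                program_cost, conventional_processing_cost):
    """Net benefit per participant, as defined in the studies reviewed:
    benefits to society from reduced recidivism (reduced victimization
    plus justice system savings), less the drug court program's net
    cost (program cost minus the cost of processing the same case in
    conventional criminal court). All inputs are dollars per participant."""
    benefits = victimization_savings + justice_system_savings
    net_cost = program_cost - conventional_processing_cost
    return benefits - net_cost

# Hypothetical figures. When the comparison group receives probation,
# a relatively cheap alternative, the conventional-processing offset is small:
vs_probation = net_benefit(victimization_savings=4_000,
                           justice_system_savings=2_000,
                           program_cost=9_000,
                           conventional_processing_cost=3_000)
# When the comparison group is incarcerated, conventional processing is
# far more expensive, so the same program shows a much larger net benefit:
vs_prison = net_benefit(victimization_savings=4_000,
                        justice_system_savings=2_000,
                        program_cost=9_000,
                        conventional_processing_cost=25_000)
print(vs_probation)  # 0
print(vs_prison)     # 22000
```

With identical program costs and recidivism benefits, the scenario compared against incarceration yields a far larger net benefit, mirroring the Kings County result described above.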
Table 5 below shows whether, based on the available information, each
program was shown to be cost beneficial, along with the net benefits
per participant that each study reported. For example, MADCE found
that drug court participation led to a net benefit of $6,208 per
participant--within the range of the other studies.[Footnote 43] The
MADCE analysis of costs and benefits is discussed further in appendix
II.
Table 5: Cost Conclusions of the 11 Drug Court Program Evaluations in
Our Cost-Benefit Review:
Drug court program (state): Kings County District Attorney's Office
Drug Treatment Alternative to Prison Program (New York)[A];
Program shown to be cost beneficial? Yes;
Net benefits: $47,852.
Drug court program (state): Multiple Drug Courts (Maine);
Program shown to be cost beneficial? Yes;
Net benefits: $42,177.
Drug court program (state): Douglas County Drug Court (Nebraska);
Program shown to be cost beneficial? Yes;
Net benefits: $11,336.
Drug court program (state): Multnomah County Drug Court (Oregon);
Program shown to be cost beneficial? Yes;
Net benefits: $10,826.
Drug court program (state): MADCE (Multiple States);
Program shown to be cost beneficial? Yes[B];
Net benefits: $6,208.
Drug court program (state): Multiple Drug Courts (Kentucky);
Program shown to be cost beneficial? Yes;
Net benefits: $5,446.
Drug court program (state): St. Joseph County Drug Court (Indiana);
Program shown to be cost beneficial? Yes;
Net benefits: $3,148.
Drug court program (state): St. Louis City Adult Felony Drug Court
(Missouri);
Program shown to be cost beneficial? Yes;
Net benefits: $2,615.
Drug court program (state): Vanderburgh County Day Reporting Drug
Court (Indiana);
Program shown to be cost beneficial? No;
Net benefits: ($1,640).
Drug court program (state): Barry County Adult Drug Court (Michigan);
Program shown to be cost beneficial? No;
Net benefits: ($3,552).
Drug court program (state): Monroe County Drug Treatment Court
(Indiana);
Program shown to be cost beneficial? No;
Net benefits: ($7,108).
Source: GAO analysis of drug court program evaluations.
[A] Comparison was to prison population.
[B] Because of the variability in the estimate, the MADCE study could
not determine that the net benefits were statistically significant.
Most other studies did not report on whether differences in cost were
statistically significant.
[End of table]
Conclusions:
During the course of our review, BJA made strides in managing its
adult drug court program, including implementation of the GrantStat
process and recent revisions to the grantee performance measures.
Given that BJA has committed to testing its new measures during the
grantees' first reporting period, enhancements could be made to
facilitate this assessment. By documenting how it plans to assess the
measures and determine any changes that may be needed and providing
the rationale for future revisions, BJA could bolster the transparency
and integrity of its decisions. Doing so could also improve the
reliability of the data it collects, its usefulness to managers in
guiding the program, and the success of its measures.
Recommendation for Executive Action:
Recognizing that BJA has recently revised the adult drug-court
performance measures and has plans to assess their utility, we
recommend that BJA's Director take the following action to ensure that
its revision process is transparent and results in quality and
successful metrics to inform management's key decisions on program
operations:
* Document key methods used to guide future revisions of its adult
drug-court program performance measures. This documentation should
include both a plan for how BJA will assess the measures after
conclusion of the grantees' first reporting period and a rationale for
why each measure was refined, including a discussion of the scope and
nature of any relevant stakeholder comments.
Agency Comments:
We provided a draft of this report to DOJ for review and comment. On
December 1, 2011, we received written comments on the draft report
from DOJ, which are reproduced in full in appendix VIII. DOJ concurred
with our recommendation and described actions under way or planned to
address the recommendation. DOJ also provided technical comments,
which we incorporated as appropriate.
DOJ stated that BJA will continue to document grantee feedback and
will ensure that revisions to the measures are documented in
accordance with GAO's best practices standards. In particular, DOJ
stated that BJA will document (1) whether the name and definition of
the measure is consistent with the methodology used to calculate it;
(2) whether the measure is reasonably free from bias; (3) whether the
measure meets the expectation of the program; and (4) its rationale
for why each performance measure was refined, including the scope and
nature of any relevant stakeholder comments. We believe that such
actions would improve the reliability of the information collected,
its usefulness to managers in making key decisions on program
operations, and the success of its measures.
We are sending copies of this report to the Attorney General and
interested congressional committees. In addition, this report will be
available at no charge on the GAO Web site at [hyperlink,
http://www.gao.gov].
Should you or your staff have any questions concerning this report,
please contact me at (202) 512-8777 or by e-mail at maurerd@gao.gov.
Contact points for our Offices of Congressional Relations and Public
Affairs may be found on the last page of this report. Key contributors
to this report are listed in appendix IX.
Signed by:
David C. Maurer:
Director, Homeland Security and Justice Issues:
[End of section]
Appendix I: DOJ Has Fully Implemented Most of Our 2002 Recommendations
and Plans to Address the Remaining One:
The following provides the current status of the seven recommendations
we made in 2002--which have since been closed--on DOJ's collection of
performance data.[Footnote 44] Specifically, DOJ has fully implemented
six of them and partially implemented one.[Footnote 45] DOJ has plans
to fully address the remaining recommendation related to analyzing
performance and outcome data collected from grantees and reporting
annually on the results. Table 6 reflects this status.
Table 6: Status of DOJ's Efforts to Address Recommendations We Made in
2002 on DOJ's Collection of Performance Data to Measure the Impact of
Federally Funded Drug Court Programs:
Recommendation: 1. Develop and implement a management information
system that is able to track and readily identify the universe of drug
court programs DOJ funds;
Actions to address recommendation: BJA is currently using OJP's GMS to
track and identify the universe of BJA-funded drug court programs.[B]
In 2009, BJA also developed the Enterprise Reporting Tool, an internal
system, which allows BJA to query most, if not all, BJA databases,
including GMS. BJA officials said this tool allows them to run reports
on the universe of drug court programs funded by grant type, amount
awarded, status, year awarded, and jurisdiction;
Status[A]: Fully implemented.
Recommendation: 2. Reinstate the collection of post-program data,
selectively spot checking grantee responses to ensure accurate
reporting;
Actions to address recommendation: On October 28, 2011, BJA provided
us with the revised performance measures for the Adult Drug Court
Discretionary Grant Program--which includes the data requirement for
grantees to track drug-court program participants following their
program completion. According to BJA officials, data collection on the
revised measures will take effect with grant activities on October 1,
2011, through December 30, 2011. Data entry and reporting by grantees
in the Performance Management Tool (PMT) will begin on January 1,
2012.[C] BJA officials reported concerns regarding the usefulness and
reliability of post-program data, emphasizing that once the drug court
grants expire, drug courts are no longer required to track
participants or report their status to BJA. BJA reported that it will
test this new measure (post-program data), along with all other
revised measures, and monitor for reliability in the data it receives
from grantees. Having these new measures in place could help ensure
BJA has the program management data it needs to make informed grantee
decisions;
Status[A]: Fully implemented.
Recommendation: 3. Take immediate steps to accelerate the funding and
implementation of a methodologically sound national impact evaluation
and consider ways to reduce the time needed to provide information on
the overall impact of federally funded drug court programs;
Actions to address recommendation: In 2002, NIJ commissioned the
Multi-Site Adult Drug Court Evaluation (MADCE), which selected
multiple sites from across the country for review. In June 2011, the
MADCE report was issued; its main objectives were to: (1) test whether
drug courts reduce drug use,
crime, and multiple other problems associated with drug abuse, in
comparison with similar offenders not exposed to drug courts; (2)
address how drug courts work and for whom by isolating key individual
and program factors that make drug courts more or less effective in
achieving their desired outcomes; (3) explain how offender attitudes
and behaviors change when they are exposed to drug courts and how
these changes help explain the effectiveness of drug court programs;
and (4) examine whether drug courts generate cost savings. The
evaluation found that drug courts prevent crime and substance use and
work equally well for most participant subgroups. See appendix II for
a summary of the study;
Status[A]: Fully implemented.
Recommendation: 4. Take steps to ensure and sustain an adequate
grantee response rate by improving efforts to notify and remind
grantees of their reporting requirements;
Actions to address recommendation: In fiscal year 2007, BJA began
using GMS to send notifications to remind grantees of upcoming due
dates for progress reports. If a progress report is more than 15 days
late, GMS automatically freezes the grantee's available funding until
the report is submitted. Similarly, the grantee is unable to draw down
funds on a grant if a financial report is more than one day late. BJA
officials said that these procedures provide an additional tool to
assist grant managers in providing adequate oversight of grantees'
reporting activities to ensure compliance with reporting requirements;
Status[A]: Fully implemented.
Recommendation: 5. Take corrective action toward grantees that do not
comply with data collection reporting requirements;
Actions to address recommendation: In fiscal year 2007, OJP
implemented a policy whereby available grant funds are frozen for
noncompliant grantees that are delinquent in submitting semi-annual
progress reports or quarterly financial reports. In addition, BJA has
the ability to designate a grantee as high risk if the grantee
continues to be noncompliant in reporting requirements. Once grantees
are notified of their high-risk designation, all new awards to the
grantee include high-risk special conditions that provide for
additional oversight, as necessary, and restrict the grantee from
obligating, expending, or drawing down funds under the new awards from
DOJ;
Status[A]: Fully implemented.
Recommendation: 6. Analyze performance and outcome data collected from
grantees and report annually on the results;
Actions to address recommendation: Since 2007, using PMT, BJA has
collected quarterly quantitative performance data from federally
funded drug court grantees. Semi-annually, BJA also collects responses
to seven narrative questions that grantees provide using PMT. BJA
officials said they regularly analyze the numeric data and publish the
results of the performance measure reporting on BJA's Web site. BJA
does not fully analyze and report on the grantees' responses to the
narrative questions. As mentioned, on October 28, 2011, BJA provided
us with the revised adult drug-court program performance measures,
which include measures previously excluded from PMT, such as retention
rates and outcomes of participants once they complete the program. As
noted, BJA plans to reassess the reliability of the measures after the
initial grantee reporting period concludes. After this period, BJA
officials explained that they will make any necessary revisions or
changes to the measures--then analyze and report on the results. As
mentioned previously, BJA initiated a new process called GrantStat to
maximize the use of performance information--providing an analytical
framework to assess grantee performance data and other relevant
information on a semi-annual basis to determine the effectiveness of
the grant programs in BJA's portfolio;
Status[A]: Partially implemented; BJA plans to fully implement.
Recommendation: 7. Consolidate the multiple DOJ-funded drug-court-
program-related data collection efforts;
Actions to address recommendation: BJA has been using PMT to
consolidate data collection efforts since 2007. According to
officials, PMT allows for grantees' online performance measurement
data submission and enhanced capacity for BJA to (1) aggregate grantee
data across performance measures, (2) distill performance by the type
of adult drug court grant, and (3) more quickly "error check" the
reliability of grantees' data submissions. BJA officials said PMT
allows them to query results and assess performance outcomes, which
helps them make decisions when designing future grant solicitations.
According to BJA officials, using PMT to consolidate the federally
funded drug court program data collection efforts enables DOJ to
better manage the programs;
Status[A]: Fully implemented.
Source: BJA.
[A] The following explains the definitions we used in assessing DOJ
status in addressing the recommendations. Fully implemented--DOJ
provided evidence that satisfies the entire recommendation. Partially
implemented--DOJ provided evidence that satisfies about half of the
recommendation. Not implemented--DOJ provided no evidence that
satisfies any of the recommendation.
[B] GMS is an online system designed to make the grant application
process easier and more efficient for grantees. GMS allows grantees to
fill out forms and submit application materials online.
[C] PMT is an online reporting tool that supports BJA grantees'
ability to collect, identify, and report performance measurement data
on activities funded by their award.
[End of table]
[End of section]
Appendix II: MADCE Is the Most Comprehensive Study of Drug Courts to
Date, but Generalizability of Findings May Be Limited:
NIJ's MADCE was conducted by the Urban Institute, Center for Court
Innovation, and Research Triangle Institute.[Footnote 46] Data were
collected from 1,156 drug court participants in 23 different drug
courts in seven geographic clusters and from a comparison group of 625
drug-involved offenders in six different sites in four geographic
clusters. Data collected included: three waves of interviews; drug
tests; administrative records on treatment, arrests, and
incarceration; court observation and interviews with staff and other
stakeholders; and budget and other cost information. The evaluation
was designed to address the following four questions:
(1) Do drug courts reduce drug use, criminal behavior, and other
associated offender problems?
(2) Do drug courts generate cost savings for the criminal justice
system and other public institutions?
(3) Are drug courts especially effective or less effective for certain
categories of offenders or program characteristics?
(4) Which drug court policies and offender perceptions explain their
overall impact?
MADCE Found Reductions in Recidivism and Relapse, but Generalizability
May Be Limited:
The MADCE's major findings can be summarized as follows:
* Drug courts produce statistically significant reductions in self-
reported crime. While both the drug court participants and comparison
group participants reported large numbers of crimes in the year
preceding the 18-month follow-up, drug court participants reported
statistically significantly fewer than the comparison group members.
Drug court participants were less likely than members of the
comparison group to report committing any crimes (40 percent vs. 53
percent) and drug court participants reported committing fewer crimes
in the preceding 12 months than comparison group members (43 criminal
acts vs. 88 criminal acts). The difference between the two groups in
the probability of an official re-arrest over 24 months was not
statistically significant, though the percentage of individuals
rearrested was lower for the drug court group than the comparison
group (52 percent vs. 62 percent), as was the average number of re-
arrests (1.24 vs. 1.64).[Footnote 47]
* Drug courts produce statistically significant reductions in drug
use. Drug court participants were less likely than members of the
comparison group to report using any drugs (56 percent vs. 76 percent)
and any serious drugs (41 percent vs. 58 percent), and less likely to
test positive for drugs at the 18-month follow-up (29 percent vs. 46
percent). Furthermore, the large difference in self-reported relapse
rates was already evident at 6 months (40 percent vs. 59 percent),
indicating that the impact of drug courts on alcohol and other drug
use was sustained over time. The
interview data also indicate that among the drug court participants
and comparison group members that were using drugs, the drug court
participants, on average, were using them less frequently.
* Drug court participants reported some benefits, relative to
comparison group members, in other areas of their lives. At 18 months,
drug court participants were statistically significantly less likely
than comparison group members to report a need for employment,
educational, and financial services, and reported statistically
significantly less family conflict. However, there were modest, non-
significant differences in employment rates, income, and family
emotional support, and no differences found in experiencing
homelessness or depression.
* Regardless of background, most offenders who participated in drug
courts had better outcomes than offenders who were in the comparison
programs. However, the impact of drug courts was greater for
participants with more serious prior drug use and criminal histories,
and the impact was smaller for participants who were younger, male,
African-American, or who had mental health problems.
* While treatment and service costs were higher for drug court
participants than for the alternative "business-as-usual" comparison
programs, drug courts save
money through improved outcomes, according to the researchers,
primarily through savings to victims resulting from fewer crimes and
savings resulting from fewer re-arrests and incarcerations.
The authors of the study assert that their findings have strong
internal validity--that is, that the findings were actually produced
by the drug court programs--and external validity--that is, that the
findings can be generalized to the population of all drug court
participants and potential comparison group members. The claim to
strong internal validity is not without merit, given the high response
rates, low attrition, propensity score adjustments, and conservative
estimates produced by the hierarchical models used.[Footnote 48] The
claim of high internal validity is also supported by the sensitivity
analyses undertaken for several outcomes using other models and
methods of adjustments that produced little or no change in
conclusions. The claim to strong external validity, which relates to
the generalizability of the results beyond the sample of courts and
comparison sites and specific offenders considered, may be somewhat
overstated. The authors note that the 23 drug courts included in the
study represent "a broad mix of urban, suburban, and rural courts from
7 geographic clusters nationwide," but that does not ensure that,
collectively, the drug courts that were included resemble the hundreds
of drug courts that were not included, especially since they were not
chosen at random. It also seems unlikely that the six comparison sites
from four states are representative of all potential controls, or all
alternative programs in all states, and it is potentially problematic
that all of the selected sites, including drug court and comparison
sites, were alike in their willingness and interest in participating.
Those concerns notwithstanding, this is the broadest and most
ambitious study of drug courts to date; it is well done analytically;
and the results, as they relate to the impact of drug courts, are
transparent and well described.
MADCE's Cost-Benefit Analysis Focused on Individuals:
The MADCE cost-benefit analysis approach differed from that of most of
the other studies we reviewed. In most of the other studies, the
average cost and benefit of a drug court participant was compared to
the average cost and benefit of normal court processing. In contrast,
the MADCE obtained a separate net benefit figure for each individual.
The net benefit was obtained by tracking each individual's use of
resources, such as hearings or meetings with case managers, and
program outcomes, such as use of public assistance. The MADCE also
tracked each individual's rate of re-arrest, number of crimes, and
time incarcerated. The number of crimes was multiplied by the cost to
victims per crime to obtain the cost to society. The difference
between the net benefits of the drug court participants and those of
the comparison group was obtained using a hierarchical model similar
to the one used for program outcomes. After applying this method, the
MADCE found that drug court participation yielded a net benefit to
society of $6,208[Footnote 49] per participant, as compared to the
comparison group. However, because of the variability in the estimate,
the study did not find this net benefit to be statistically
significant.
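The individual-level accounting described above can be sketched as follows. This is a minimal illustration, not the MADCE's actual method or data: every unit cost, record field, and participant figure below is hypothetical, and the MADCE estimated the group difference with a hierarchical model rather than the simple difference in means used here.

```python
# Hypothetical sketch of individual-level net benefit accounting, in
# the spirit of the MADCE approach. All unit costs and per-person
# records are illustrative assumptions, not MADCE data.

COST_TO_VICTIMS_PER_CRIME = 2_500   # assumed average victimization cost ($)
COST_PER_HEARING = 150              # assumed cost of one status hearing ($)
COST_PER_CASE_MGR_MEETING = 75      # assumed case-manager meeting cost ($)
COST_PER_JAIL_DAY = 90              # assumed daily incarceration cost ($)

def net_benefit(person):
    # Net benefit to society for one individual: resource use and
    # crime-related costs both enter as negatives.
    resource_cost = (person["hearings"] * COST_PER_HEARING
                     + person["meetings"] * COST_PER_CASE_MGR_MEETING
                     + person["public_assistance"])
    crime_cost = (person["crimes"] * COST_TO_VICTIMS_PER_CRIME
                  + person["jail_days"] * COST_PER_JAIL_DAY)
    return -(resource_cost + crime_cost)

# Two hypothetical individuals per group; a real analysis tracks
# hundreds of participants.
drug_court = [
    {"hearings": 12, "meetings": 20, "public_assistance": 1_000,
     "crimes": 1, "jail_days": 5},
    {"hearings": 8, "meetings": 15, "public_assistance": 500,
     "crimes": 0, "jail_days": 0},
]
comparison = [
    {"hearings": 2, "meetings": 0, "public_assistance": 2_000,
     "crimes": 3, "jail_days": 30},
    {"hearings": 1, "meetings": 0, "public_assistance": 1_500,
     "crimes": 2, "jail_days": 10},
]

mean = lambda xs: sum(xs) / len(xs)
per_participant_gain = (mean([net_benefit(p) for p in drug_court])
                        - mean([net_benefit(p) for p in comparison]))
print(per_participant_gain)  # positive => drug court yields a net benefit
```

With these assumed numbers the drug court group consumes more program resources but imposes far lower crime and incarceration costs, so the per-participant difference is positive, mirroring the direction of the MADCE's $6,208 estimate.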
The lack of a statistically significant difference may result from
greater variability in the MADCE approach than in the approach used in
other studies. Specifically, the MADCE did not assume identical costs
for each participant. As a result, costs may be higher for individuals
who have lower rates of re-arrest, perhaps because those individuals
received more treatment. According to the study's authors, by assuming
identical costs for each participant, the standard approach
understates the variance in the computed net benefit figure by not
including the variability in cost. However, the MADCE authors assumed
that the prices of services were consistent across sites by using a
weighted average. In contrast, some studies generate site-specific
cost figures. In this way, the MADCE approach did exclude one source
of variation that is present in some other studies.
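The variance point can be illustrated with a small simulation. All numbers below are synthetic assumptions, not MADCE figures; the sketch only shows why subtracting one identical average cost from every participant produces net benefit figures with less spread than subtracting each person's own cost.

```python
# Illustrative simulation (synthetic data, not MADCE figures):
# assuming identical costs for every participant ignores
# person-to-person cost variation, understating the variance of the
# computed net benefit.
import random

random.seed(0)
n = 500
# Hypothetical per-person benefits and costs, in dollars.
benefits = [random.gauss(9_000, 4_000) for _ in range(n)]
costs = [random.gauss(3_000, 2_500) for _ in range(n)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# "Standard" approach: every person is assigned the average cost.
avg_cost = sum(costs) / n
nb_identical_cost = [b - avg_cost for b in benefits]

# Individual-level approach: each person's own cost is subtracted.
nb_individual_cost = [b - c for b, c in zip(benefits, costs)]

# With costs drawn independently of benefits, the individual-level net
# benefits show more spread -- wider confidence intervals, and thus a
# harder path to statistical significance.
print(variance(nb_identical_cost) < variance(nb_individual_cost))
```

Subtracting a constant leaves the variance of the benefits unchanged, while subtracting each person's own cost adds the cost variance (less any covariance), which is why the individual-level figures are noisier.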
In addition to tracking costs and benefits at the individual level,
the MADCE also included some effects of drug court participation that
some other studies omit. This is consistent with OMB guidance that
states that studies should be comprehensive in the benefits and costs
to society considered.[Footnote 50] One of the benefits considered by
the MADCE, sometimes omitted elsewhere, is the estimated earnings of
the drug court participant. However, it is unclear that the full value
of earnings should have been counted as a net benefit to society. To
be comprehensive, a study should also consider the cost to society of
providing that benefit: the net benefit would be the value of
production from this employment less the wages paid. In this case,
however, this distinction is unlikely to affect the result of the
analysis, because earnings were similar for drug court participants
and the comparison group.
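A hypothetical arithmetic example makes the distinction concrete; neither figure below comes from the MADCE or any study we reviewed.

```python
# Hypothetical figures illustrating the earnings point above: counting
# a participant's full earnings as a societal benefit versus counting
# only the surplus of production over wages.
annual_wages = 24_000         # assumed wages paid to a participant ($)
value_of_production = 26_000  # assumed value of the work produced ($)

# Counting the full value of earnings as a benefit to society:
benefit_full_earnings = annual_wages

# Comprehensive accounting nets out the cost to society of providing
# that benefit, leaving only the production surplus:
benefit_net_surplus = value_of_production - annual_wages

print(benefit_full_earnings, benefit_net_surplus)
```

Under these assumptions the two accounting choices differ by an order of magnitude per participant, which is why the treatment of earnings matters when benefits are aggregated, even though in the MADCE the similarity of earnings across groups made the choice immaterial to the bottom line.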
[End of section]
Appendix III: Objectives, Scope, and Methodology:
To determine what data DOJ collects on the performance of federally
funded adult drug courts and to what extent DOJ has used these data in
making grant-related decisions, we analyzed the reporting guidance and
requirements that BJA provided in fiscal years 2007 through 2011 to
grantees applying for Adult Drug Court Discretionary Grant Program
funds;[Footnote 51] BJA-generated grantee performance data reports
from October to December 2010; and BJA's guides for managing grants
and enforcing grantee compliance that were issued in fiscal year 2011.
We selected 2007 as the starting point for our review because BJA
implemented its Performance Measurement Tool (PMT)--an online
reporting tool that supports BJA grantees' ability to collect,
identify, and report performance-measurement data on activities funded
by the grantees' awards--in fiscal year 2007. We also reviewed our prior
reports and internal control standards as well as other academic
literature regarding effective performance-management practices.
[Footnote 52] We then used this information and BJA officials'
statements to identify and define six management activities for which
performance information can be most useful in making grant-related
decisions.[Footnote 53] Further, we interviewed cognizant BJA
officials about the extent to which they use grantees' performance
data when engaging in these management activities, any challenges
faced with ensuring grantee compliance, ongoing efforts to revise
program performance metrics, and the extent to which BJA's revisions
incorporate best practices we previously identified.[Footnote 54]
To determine what is known about the effectiveness of adult drug
courts in reducing recidivism and substance-abuse relapse rates and
what the costs and benefits of adult drug courts are, we conducted a
systematic review of evaluations of drug-court program effectiveness
issued from February 2004 through March 2011, focusing on the effect
of drug court programs on the recidivism and relapse of drug-involved
individuals as well as the costs and benefits of drug
courts.[Footnote 55] We also reviewed DOJ's NIJ-funded MADCE,
a 5-year longitudinal process, impact, and cost evaluation of adult
drug courts that was issued in June 2011. We identified the universe
of evaluations to include in our review using a three-stage process.
First, we (1) conducted key-word searches of criminal justice and
social science research databases;[Footnote 56] (2) searched drug
court program-related Web sites, such as those of BJA and NADCP; (3)
reviewed bibliographies, meta-analyses of drug court evaluations, and
our prior reports on drug court programs;[Footnote 57] and (4) asked
drug court researchers and DOJ officials to identify evaluations. Our
literature search identified 260 documents, which consisted of
published and unpublished outcome evaluations, process evaluations,
commentary about drug court programs, and summaries of multiple
program evaluations.[Footnote 58] Second, we reviewed the 260
documents our search yielded and identified 44 evaluations that
reported recidivism or substance use relapse rates using either an
experimental or quasi-experimental design, or analyzed program costs
and benefits.[Footnote 59] Third, we used generally accepted social
science and cost-benefit criteria to review the 44 evaluations.
To assess the methodological quality of evaluations that reported on
recidivism or relapse rates, we placed each evaluation into one of
five categories, with category 1 evaluations being the most rigorous
and category 5 the least, as outlined in table 7.
Table 7: Methodological Quality Categories for Evaluations of a Drug
Court Program:
Category: 1;
Required methodological elements: Random assignment to drug court or
control group, drawn from local offenders eligible for the program.
The sample: (a) is of sufficient size to ensure that randomization
balances all covariates (e.g., potential predictors of outcomes, such
as past criminal history); or (b) has constrained key covariates to
balance in small samples through stratification (e.g., sampling
subpopulations independently to improve representativeness) or through
other adjustments. Randomization occurs after eligible offense, not
after screening or other self-selection. Drug court is compulsory if
assigned, and rates of attrition from the program are low.
Category: 2;
Required methodological elements: Either;
Random assignment with several factors in group 1 missing, such as
small sample sizes or some amount of pre-screening of eligible
participants;
or:
Nonrandom assignment, but analysis models or controls for the specific
process used to assign participants to the drug court when
constructing a longitudinal comparison group. Alternatively,
assignment to the program was nonrandom but clearly exogenous to the
outcomes. The comparison group is similar to the treatment group on
any variables the program explicitly uses to screen participants, such
as "readiness for treatment" or "likely response to treatment."
Comparison group is matched on multiple years of pre-treatment data.
Category: 3;
Required methodological elements: Either;
Problematic random assignment: extremely small sample sizes, many
large differences between treatment and control groups, randomization
that occurs after all important forms of self-selection or screening;
or:
Nonrandom assignment, and the analysis controls for pre-treatment
outcomes and participant demographics without considering the specific
process used to assign participants to the program being evaluated.
Comparison group is used, but has limited pre-treatment covariate data
for construction. Comparison or treatment groups are constructed in
ways that could have a clear impact on the outcomes (e.g., truncating
the sample).
Category: 4;
Required methodological elements: Nonrandom assignment. Comparison
group constructed with few controls for pre-treatment outcomes or
shows covariate differences with the treatment group. Several
plausible sources of selection bias, such as preexisting differences
between the two groups in the degree of substance use.
Category: 5;
Required methodological elements: Nonrandom assignment. Cross-
sectional design with few controls, pre-post design with no comparison
group and few controls, or treatment group that includes only program
graduates.
Source: GAO.
[End of table]
We excluded studies that were placed in category 5 in the table above
or studies in which the comparison group was not drawn from a criminal
court. We were left with 33 studies, plus the MADCE, that reported on
the effectiveness of 32 drug court programs or sets of
programs.[Footnote 60] As noted in our report, we then grouped the 34
studies, including the MADCE, into two tiers according to their
quality category: Tier 1 studies were those that fell into categories
1 or 2; Tier 2 studies were those that fell into categories 3 or 4.
Observed differences in recidivism could arise from measured and
unmeasured sources of variation between drug court participants and
comparison group members. If comparison group members differed
systematically from drug court participants on variables that are also
associated with recidivism, such as the degree of their
substance-abuse addiction problem, and if these variables were not
accounted for by the design or analysis used in the evaluation, then
the study could suffer from selection bias, wherein observed
differences in recidivism could be attributable to these sources of
variation rather than to participation in the drug court program. As
indicated in table 7, our
evaluation of the methods used to deal with selection bias was
reflected in the quality categorization of each study.[Footnote 61]
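The selection-bias mechanism described above can be illustrated with a simple standardization exercise. Every rate and share below is hypothetical; the point is only that when the comparison group skews toward a covariate associated with recidivism (here, more severe prior substance abuse), a naive comparison overstates the drug court effect, while reweighting the comparison group to the participants' covariate mix reduces the bias.

```python
# Illustrative standardization sketch (synthetic numbers, not drawn
# from any study reviewed): adjusting a recidivism comparison for one
# pre-treatment covariate.

# Recidivism rates within covariate strata ("severe" prior substance
# abuse vs. not), for drug court participants and the comparison group.
rates = {
    ("drug_court", True): 0.40, ("drug_court", False): 0.20,
    ("comparison", True): 0.55, ("comparison", False): 0.35,
}
# Share of each group that is "severe": the comparison group skews
# toward more severe addiction -- a source of selection bias.
share_severe = {"drug_court": 0.30, "comparison": 0.70}

def group_rate(group, severe_share):
    # Overall recidivism rate for a group under a given covariate mix.
    return (severe_share * rates[(group, True)]
            + (1 - severe_share) * rates[(group, False)])

# Naive comparison uses each group's own covariate mix.
naive_diff = (group_rate("comparison", share_severe["comparison"])
              - group_rate("drug_court", share_severe["drug_court"]))

# Adjusted comparison reweights the comparison group to the drug court
# participants' covariate mix (a simple standardization).
adj_diff = (group_rate("comparison", share_severe["drug_court"])
            - group_rate("drug_court", share_severe["drug_court"]))

print(round(naive_diff, 3), round(adj_diff, 3))
```

In this sketch the naive difference in recidivism (0.23) shrinks to 0.15 after adjustment, because part of the apparent drug court advantage was the comparison group's heavier concentration of severe cases rather than the program itself.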
To assess the methodological quality of evaluations that reported on
drug court program costs and benefits, we assessed them according to
the five criteria we developed and outlined in table 8 below.[Footnote
62]
Table 8: Five Criteria for Assessing a Cost-Benefit Analysis of a Drug
Court Program:
Criterion: 1. States the program's purpose;
Description: In general, the purpose of a drug court program is to
reduce repeated criminal behavior--to reduce recidivism--by reducing
offenders' substance-using behavior.
Criterion: 2. Identifies the baseline;
Description: The baseline, or alternative, is what would happen to an
offender if the drug court program did not exist.
Criterion: 3. Assesses all relevant costs;
Description: The costs involved in a drug court program are those
associated with the program's operation and those associated with the
baseline.
Criterion: 4. Assesses all relevant benefits;
Description: Benefits usually attributed to drug court programs are
costs avoided because of reduced recidivism;
they accrue to the criminal justice system and potential victims of
crime. Other benefits an analysis could consider include reduced
medical costs and successful program participants' increased
productivity.
Criterion: 5. Assesses uncertainty in cost and benefit estimates;
Description: Most cost and benefit estimates entail uncertainty from
imprecision in the data underlying the analysis and the assumptions
built into the analysis. Assessing uncertainty enhances confidence in
the estimates used in evaluation.
Source: GAO-05-219.
[End of table]
We determined that an essential criterion for reporting a net benefit
of drug courts was that the costs of the drug court were assessed
against a baseline (i.e., "business-as-usual" or traditional court
processing). Eleven studies met this essential standard and were used
to report on program costs and benefits. We excluded other studies not
meeting this standard even though they may have met others.
To obtain information on our outcomes of interest--that is,
recidivism, substance use relapse, and costs and benefits--we used
data collection instruments to systematically collect information
about the methodological characteristics of each evaluation, the drug
court participants and comparison group members studied, and the
outcomes of the participants and other comparable groups reported.
Each evaluation was read and coded by a senior social scientist,
statistician, or economist with training and experience in evaluation
research methods. A second senior social scientist, statistician, or
economist then reviewed each completed data collection instrument to
verify the accuracy of the information included. Part of our
assessment also focused on the quality of the data used in the
evaluations as reported by the researchers and our observations of any
problems with missing data, any limitations of data sources for the
purposes for which they were used, and inconsistencies in reporting
data. We incorporated any data problems that we noted in our quality
assessments.
We selected the evaluations in our review based on their
methodological strength; therefore, our results cannot be generalized
to all drug court programs or their evaluations. Although the findings
of the evaluations we reviewed are not representative of all
evaluations of drug court programs, they are the evaluations we could
identify that used the strongest designs to assess drug-court program
effectiveness.
To identify the extent to which DOJ has addressed the recommendations
that we made in 2002 regarding drug court programs, we interviewed
cognizant DOJ officials and obtained and reviewed documentation (e.g.,
drug-court program grant solicitations and grantee-performance
reporting guidance) on the actions taken to address and implement each
of our prior recommendations. We conducted this performance audit from
November 2010 through December 2011 in accordance with generally
accepted government-auditing standards. Those standards require that
we plan and perform the audit to obtain sufficient, appropriate
evidence to provide a reasonable basis for our findings and
conclusions based on our audit objectives. We believe that the
evidence obtained provides a reasonable basis for our findings and
conclusions based on our objectives.
[End of section]
Appendix IV: Overview of Drug Court Program Characteristics:
This appendix provides a general description of drug court program
components (see table 9). Drug court programs rely on a combination of
judicial supervision and substance-abuse treatment to motivate
defendants' recovery.[Footnote 63] Judges preside over drug court
proceedings, which are called status hearings; monitor defendants'
progress with mandatory drug testing; and prescribe sanctions and
incentives, as appropriate in collaboration with prosecutors, defense
attorneys, treatment providers, and others. Drug court programs can
vary in terms of the substance-abuse treatment required. However, most
programs offer a range of treatment options and generally require a
minimum of about 1 year of participation before a defendant completes
the program.
Table 9: General Description of Drug Court Program Components:
Drug court elements: Drug court program approaches;
Description: Drug court programs generally have taken two approaches
to processing cases: (1) deferred prosecution (diversion);
and (2) post-adjudication. In the diversion model, the courts defer
prosecution contingent on the defendant's agreement to participate in
the drug court program. Deferred prosecution models do not require
the defendant to plead guilty. Instead, the defendant enters the drug
court before pleading to a charge. Defendants who complete the
treatment program are not prosecuted further and their charges are
dismissed. Failure to complete the program results in prosecution for
the original offense. This approach is intended to capitalize on the
trauma of arrest and offers defendants the opportunity to obtain
treatment and avoid the possibility of a felony conviction. In
contrast, defendants participating in a post-adjudication (post-plea)
drug court program plead guilty to the charge(s) and their sentences
are suspended or deferred. Upon successful completion of the program,
sentences are waived and in many cases records are expunged. This
approach provides an incentive for the defendant to rehabilitate
because progress toward rehabilitation is factored into the sentencing
determination. Both of these approaches provide the defendant with a
powerful incentive to complete the requirements of the drug court
program. Some drug court programs use both deferred prosecution and
post-adjudication approaches and assign defendants to an approach
depending on the severity of the charge. Drug court programs may also
combine aspects of these models into a hybrid or combined approach.
Drug court elements: Screening process and participant eligibility
criteria;
Description: Defendants reach the drug court program from different
sources and at varying points in case processing. Screening defendants
to determine eligibility for a drug court program generally includes
assessing their criminal history and current case information (e.g.,
charging offense, prior convictions, pending cases, and probation
status). Depending on the program, an assistant district or
prosecuting attorney, court clerk, or drug court coordinator typically
conducts the review. Drug courts generally accept defendants charged
with drug possession or other nonviolent offenses such as property
crimes. Some drug court programs allow defendants who have prior
convictions to participate, and others do not. Federal grants
administered by BJA may not be awarded to any drug court program that
allows either current or past violent offenders to participate in its
program.[A]
After defendants are determined to be legally eligible for the
program, treatment providers or case managers will typically determine
defendants' clinical eligibility. This can be determined through
structured assessment tests, interviews, or even preliminary drug test
results. While drug courts generally accept only defendants with
substance-abuse problems, they vary in the level of addiction, or the
type of drug addiction, they will accept. For example, some programs
do not accept defendants whose only addictions are to marijuana or
alcohol, while others do.
Clinical eligibility can also include factors such as medical or
mental health barriers and motivation or treatment readiness. In
several drug court programs in our review, the drug court judge's
satisfaction with or assessment of an offender's motivation and
ability to complete the program was a factor used to screen defendants.
Drug court elements: Program completion requirements;
Description: Drug court programs typically require defendants to
complete a 1-year treatment program in order to graduate from or
complete the program. Some programs impose other conditions that
participants must meet in addition to treatment. These conditions
could include remaining drug-free for a minimum amount of time, not
being arrested for a specified period of time, maintaining employment
or obtaining an educational degree or certification, or performing
community service.
Drug court elements: Judicial supervision and status hearings;
Description: The central element of all drug court programs is
attendance at the
regularly scheduled status hearings, at which the drug court judge
monitors the progress of participants. Monitoring is based on
treatment-provider reports on such matters as drug testing and
attendance at counseling sessions. The judge is to reinforce progress
and address noncompliance with program requirements. The primary
objectives of the status hearing are to keep the defendant in
treatment and to provide continuing court supervision. More broadly,
judicial supervision includes regular court appearances and direct in-
court interaction with the judge, as well as scheduled case manager
visits.
Drug court elements: Drug-testing requirements;
Description: Monitoring participants' substance use through mandatory
and frequent
testing is a core component of drug court programs. Programs vary in
the specific policies and procedures regarding the nature and
frequency of testing. For example, in some programs in our review,
participants were required to call to find out whether they were
required to be tested in a given period or on a randomly selected day
of the week. The frequency of testing generally varied depending on
the stage or phase of the program that participants were in.
Drug court elements: Treatment components;
Description: In most drug court programs, treatment is designed to
last at least 1 year and is generally administered on an outpatient
basis with limited inpatient treatment, as needed, to address special
detoxification or relapse situations. Many of the programs operate
with the philosophy that because drug addiction is a disease, relapses
can occur and that the court must respond with progressive sanctions
or enhanced treatment, rather than immediate termination.
Treatment services are generally divided into three phases.
Detoxification, stabilization, counseling, drug education, and therapy
are commonly provided during phases I and II, and in some instances,
throughout the program. Other services relating to personal and
educational development, job skills, and employment services are
provided during phases II and III, after participants have responded
to initial detoxification and stabilization. Housing, family, and
medical services are frequently available throughout the program. In
some instances, a fourth phase consisting primarily of aftercare-
related services is provided. The objectives of drug-court program
treatment are generally to (1) eliminate the program participants'
physical dependence on drugs through detoxification; (2) treat the
defendant's craving for drugs through stabilization (referred to as
rehabilitation stage) during which frequent group or individual
counseling sessions are generally employed; and (3) focus on helping
the defendant obtain education or job training, find a job, and remain
drug free.
Drug court programs can also either directly provide or refer
participants to a variety of other services and support, and they may
include medical or health care, mentoring, and educational or
vocational programs. The use of community-based treatment self-help
groups, such as Alcoholics Anonymous and Narcotics Anonymous, and
aftercare programs also varies across drug court programs.
Drug court elements: Sanctions for noncompliance;
Description: Judges generally prescribe sanctions and incentives as
appropriate in collaboration with prosecutors, defense attorneys,
treatment providers, and others. Typical sanctions for program
noncompliance include oral warnings from the judge; transfer to an
earlier stage of the program; attendance at more frequent status
hearings, treatment sessions, or drug tests; and serving jail time for
several days or weeks. The approach or philosophy for how a drug court
judge prescribes sanctions can vary. For example, some judges use a
graduated sanctions approach, where sanctions are applied in
increasing severity. Other judges may use discretion in prescribing
sanctions, assessing participants' noncompliance on a case-by-case
basis.
Drug court elements: Termination criteria;
Description: Drug court programs typically use various criteria for
ending a defendant's participation in the program before completion.
These criteria may include a new felony offense, multiple failures to
comply with program requirements such as not attending status hearings
or treatment sessions, and a pattern of positive drug tests.
Before terminating a defendant for continuing to use drugs, drug court
programs generally will use an array of treatment services and
available sanctions. There are no uniform standards among all programs
on the number of failed drug tests and failures to attend treatment
sessions that lead to a participant's termination. Drug court program
judges generally make decisions to terminate a program participant on
a case-by-case basis, taking into account the recommendations of
others, including the treatment provider, prosecutor, and defense
counsel. Relapses are expected, and the extent to which noncompliance
results in terminations varies from program to program. Once a
defendant is terminated, he or she is usually referred for
adjudication or sentencing.
Source: GAO-05-219.
[A] 42 U.S.C. § 3797u-1-u-2. Violent offenders generally include those
who have been charged with or convicted of an offense that is
punishable by a term of imprisonment of greater than one year, and the
offense involved a firearm or dangerous weapon; death or serious
bodily injury; or the use of force. Past violent offenders include
those who have one or more prior convictions for a felony crime of
violence involving the use or attempted use of force against a person
with the intent to cause death or serious bodily harm. § 3797u-2.
[End of table]
[End of section]
Appendix V: Ten Key Components of a Drug Court--Developed by BJA in
Collaboration with the National Association of Drug Court
Professionals:
Table 10: Ten Key Components of a Drug Court:
1. Integration of substance-abuse treatment with justice system case
processing.
2. Use of a non-adversarial approach, in which prosecution and defense
promote public safety while protecting the right of the participant to
due process.
3. Early identification and prompt placement of eligible participants.
4. Access to a continuum of treatment, rehabilitation, and related
services.
5. Frequent testing for alcohol and illicit drugs.
6. A coordinated strategy governs drug court responses to
participants' compliance.
7. Ongoing judicial interaction with each participant.
8. Monitoring and evaluation to measure achievement of program goals
and gauge effectiveness.
9. Continuing interdisciplinary education to promote effective
planning, implementation, and operation.
10. Forging partnerships among drug courts, public agencies, and
community-based organizations generates local support and enhances
drug court program effectiveness.
Source: BJA.
[End of table]
[End of section]
Appendix VI: BJA Offers Solicitations in Four Broad Drug-Court Grant
Categories--Implementation, Enhancement, Statewide, and Joint:
As mentioned, the Adult Drug Court Discretionary Grant Program
provides financial and technical assistance to states, state courts,
local courts, units of local government, and Indian tribal governments
to develop and implement drug treatment courts. There are four
different types of awards that BJA makes to adult drug-court grantees
through the program. Table 11 provides a description of the grant
types.
Table 11: Adult Drug-Court Discretionary Grant Program--Grant Type and
Description:
Grant type: Implementation grants;
Description: Available to jurisdictions that have completed planning
and are ready to implement an adult drug court. Grantees may use their
awards to fund various court operations and services including
offender supervision, management, and services;
provision and coordination of non-treatment recovery support services;
and childcare and other family supportive services.
Grant type: Enhancement grants;
Description: Available to jurisdictions with fully operational adult
drug courts. Applicants may use funding to expand their target
population, enhance court services, or enhance offender services.
Grant type: Statewide grants;
Description: Used to improve, enhance, or expand drug court services
statewide by encouraging adherence to the evidence-based design
features and through activities such as: training and/or technical
assistance programs for drug court teams geared to improve drug court
functioning and to increase drug court participation and participant
outcomes; tracking, compiling, coordinating, and disseminating state
drug court information and resources; increasing communication,
coordination, and information sharing among drug court programs;
conducting a statewide drug court evaluation; or establishing a
statewide automated drug-court data collection and/or performance
management system.
Grant type: Joint grants;
Description: BJA, in collaboration with the Department of Health and
Human Services, Substance Abuse and Mental Health Services
Administration (SAMHSA), offers a joint grant program for the
enhancement of adult drug court services, coordination, and substance-
abuse treatment capacity.[A] Successful applicants are awarded two
grants: an Enhancement grant from BJA and a Behavioral Health Court
Collaboration Grant from SAMHSA. This joint program offers grantees
the opportunity to design a comprehensive strategy for enhancing drug
court capacity while accessing both criminal justice and substance-
abuse treatment funds under a single grant application.
Source: BJA.
[A] SAMHSA is authorized under section 509 of the Public Health
Service Act, as amended (42 U.S.C. § 290bb-2) to provide Adult
Treatment Drug Court grants.
[End of table]
[End of section]
Appendix VII: Key Management Activities Identified for Which
Performance Information Can Be Most Useful:
Table 12: Definitions: Key Management Activities Identified for Which
Performance Information Can Be Most Useful:
Key management activities: a. Setting program priorities;
How performance information may be used to support the activity:
Performance information is used to set priorities in budgeting and to
target resources. Agencies can also use this information to identify
priorities on which to focus their efforts, for example, by targeting
grants to address "underserved" client groups.
Key management activities: b. Allocating resources;
How performance information may be used to support the activity:
Performance information is used to compare results of agencies'
programs with goals and to identify where program resources should be
targeted to improve performance and achieve goals. When faced with
reduced resources, such analyses can assist agencies' efforts to
minimize the impact on program results.
Key management activities: c. Adopting new program approaches or
changing work processes;
How performance information may be used to support the activity:
Performance information is used to assess the way a program is
conducted and the extent to which a program's practices and policies
have or have not led to improvements in outcomes. Such information is
used to identify problems and consider alternative approaches and
processes in areas where goals are not being met and to enhance the
use of program approaches and processes that are working well.
Key management activities: d. Identifying and sharing with
stakeholders more effective processes and approaches to program
implementation;
How performance information may be used to support the activity:
Performance information is used to identify and increase the use of
program approaches that are working well and share these effective
processes and approaches with stakeholders.
Key management activities: e. Setting expectations for grantees;
How performance information may be used to support the activity:
Performance information is used to establish the targets and goals
that grantees are expected to achieve. These targets and goals can be
used as the basis for corrective action (e.g., technical assistance,
freezing of funds) or to reward high performing grantees.
Key management activities: f. Monitoring grantee performance;
How performance information may be used to support the activity:
Performance information is used to compare grantees' performance
results with established targets and goals to determine the extent to
which grantees have met them and, if necessary, target program
resources (e.g., technical assistance) to improve grantees'
performance.
Source: GAO analyses.
Note: We identified the first four management activities above as
relevant from governmentwide surveys of federal managers conducted in
1997, 2000, and 2003. See GAO-05-927. The remaining two activities we
identified by reviewing performance management literature. In defining
the management activities, we reviewed the literature identified and
met with BJA officials to determine the extent to which they agreed
with our definitions. BJA staff confirmed each of these six to be
relevant to managing the drug court program.
[End of table]
[End of section]
Appendix VIII: Comments from the Department of Justice, Bureau of
Justice Assistance:
U.S. Department of Justice:
Office of Justice Programs:
Washington, D.C. 20531:
December 1, 2011:
Mr. David C. Maurer:
Director:
Homeland Security and Justice Issues:
Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Mr. Maurer:
Thank you for the opportunity to comment on the draft Government
Accountability Office (GAO) report entitled, "Adult Drug Courts:
Studies Show Courts Reduce Recidivism, But DOJ Could Enhance Future
Performance Measure Revision Efforts" (GAO-12-53). The draft GAO
report contains one Recommendation for Executive Action to the U.S.
Department of Justice (DOJ), which is restated in bold text below and
is followed by our response.
[Begin bold text]
Recognizing that BJA has recently revised the adult drug court
performance measures and has plans to assess their utility, we
recommend that BJA's Director take the following action to ensure that
its revision process is transparent and results in quality and
successful metrics to inform management's key decisions on program
operations:
* Document key methods used to guide future revisions of its adult
drug court program performance measures. This documentation should
include both a plan for how BJA will assess the measures after the
first grantee reporting period concludes and a rationale for why each
measure was refined, including the scope and nature of any relevant
stakeholder comments.
[End of bold text]
The Office of Justice Programs (OJP) agrees with the Recommendation
for Executive Action, and will continue to ensure that any revisions
to the Drug Court measures or the process to revise those measures is
transparent, and results in quality and successful metrics to inform
management's key decisions on program operations. As stated in the GAO
draft report, the Bureau of Justice Assistance (BJA) documented key
components of the revision process, including meeting minutes,
stakeholder call recordings, and email documentation; however,
BJA did not consolidate all information collected into a single
document. BJA will continue to document grantee feedback, and will
ensure that revisions to the measures are documented in accordance
with GAO Best Practices standards regarding: (1) whether the name and
definition of the measure is consistent with the methodology used to
calculate it; (2) whether the measure is reasonably free from bias;
(3) whether the measure meets the expectation of the program; and
(4) BJA's rationale for why each performance measure was refined,
including the scope and nature of any relevant stakeholder comments.
Beginning with the first reporting cycle, which ends on December 31,
2011, BJA will gauge the ability of Drug Court grantees to understand
and accurately report on the new performance measures. By July 15,
2012, BJA plans to analyze two quarters of performance data submitted
by Drug Court grantees for such inaccuracies, including, but not
limited to missing data, outliers, and duplicate counts. This will
enable BJA to identify performance measures that may potentially
produce unreliable results. BJA anticipates that the assessment of the
quality of the data and refinement of performance measures will be an
ongoing process.
If you have any questions regarding this response, you or your staff
may contact Maureen Henneberg, Director, Office of Audit, Assessment,
and Management, at (202) 616-3282.
Sincerely,
Signed by:
Laurie O. Robinson:
Assistant Attorney General:
cc: Mary Lou Leary:
Principal Deputy Assistant Attorney General:
James H. Burch, II:
Deputy Assistant Attorney General for Operations and Management:
Denise O'Donnell:
Director:
Bureau of Justice Assistance:
Leigh Benda:
Chief Financial Officer:
Maureen Henneberg:
Director:
Office of Audit, Assessment, and Management:
Louise Duhamel, Ph.D.:
Acting Director, Audit Liaison Group:
Internal Review and Evaluation Office:
Justice Management Division:
OJP Executive Secretariat:
Control Number 20111887:
[End of section]
Appendix IX: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
David C. Maurer, (202) 512-8777 or maurerd@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, Joy Booth, Assistant Director,
and Frederick Lyles, Jr., Analyst-in-Charge, managed this assignment.
Christoph Hoashi-Erhardt, Michael Lenington, and Jerry Seigler, Jr.,
made significant contributions to the work. David Alexander, Benjamin
Bolitzer, Michele Fejfar, and Doug Sloane assisted with design and
methodology. Pedro Almoguera, Carl Barden, Harold Brumm, Jr., Jean
McSween, Cynthia Saunders, Jeff Tessin, Susan B. Wallace, and Monique
Williams assisted with evaluation review. Janet Temko provided legal
support, and Katherine Davis provided assistance in report preparation.
Bibliography:
Bouffard, Jeffrey A., and Katie A. Richardson. "The Effectiveness of
Drug Court Programming for Specific Kinds of Offenders:
Methamphetamine and DWI Offenders Versus Other Drug-Involved
Offenders." Criminal Justice Policy Review, 18(3) (September 2007):
274-293.
Carey, Shannon M., and Michael W. Finigan. Indiana Drug Courts: St.
Joseph County Drug Court Program Process, Outcome and Cost Evaluation-
Final Report. Portland, OR: NPC Research, 2007.
Carey, Shannon M., and Michael W. Finigan. Indiana Drug Courts:
Vanderburgh County Day Reporting Drug Court Process, Outcome and Cost
Evaluation-Final Report. Portland, OR: NPC Research, 2007.
Carey, Shannon M., and Michael W. Finigan. "A Detailed Cost Analysis
in a Mature Drug Court Setting: A Cost-Benefit Evaluation of the
Multnomah County Drug Court." Journal of Contemporary Criminal
Justice, 20(3) (August 2004): 315-338.
Carey, Shannon M., Michael W. Finigan, et al. Indiana Drug Courts:
Monroe County Drug Treatment Court Process, Outcome and Cost
Evaluation-Final Report. Portland, OR: NPC Research, 2007.
Carey, Shannon M., Michael Finigan, Dave Crumpton, and Mark S. Waller.
"California Drug Courts: Outcomes, Costs and Promising Practices: An
Overview of Phase II in a Statewide Study." Journal of Psychoactive
Drugs (November 2006).
Carey, Shannon M., Lisa M. Lucas, Mark S. Waller, Callie H. Lambarth,
Robert Linhares, Judy M. Weller, and Michael W. Finigan. Vermont Drug
Courts: Rutland County Adult Drug Court Process, Outcome, and Cost
Evaluation-Final Report. Portland, OR: NPC Research, 2009.
Carey, Shannon, and Gwen Marchand. Marion County Adult Drug Court
Outcome Evaluation-Final Report. Portland, OR: NPC Research, 2005.
Carey, Shannon M., and Mark S. Waller. Oregon Drug Court Cost Study:
Statewide Costs and Promising Practices-Final Report. Portland, OR:
NPC Research, 2010.
Carey, Shannon M., and Mark Waller. California Drug Courts: Costs and
Benefits-Phase III. Portland, OR: NPC Research, 2008.
Carey, Shannon M., and Mark S. Waller. Guam Adult Drug Court Outcome
Evaluation-Final Report. Portland, OR: NPC Research, 2007.
Dandan, Doria Nour. Sex, Drug Courts, and Recidivism. University of
Nevada, Las Vegas: 2010.
Ferguson, Andrew, Birch McCole, and Jody Raio. A Process and Site-
Specific Outcome Evaluation of Maine's Adult Drug Treatment Court
Programs. Augusta, ME: University of Southern Maine, 2006.
Finigan, Michael W., Shannon M. Carey, and Anton Cox. Impact of a
Mature Drug Court Over 10 Years of Operation: Recidivism and Costs
(Final Report). Portland, OR: NPC Research, 2007.
Gottfredson, Denise C., Brook W. Kearley, Stacy S. Najaka, and Carlos
M. Rocha. "How Drug Treatment Courts Work: An Analysis of Mediators."
Journal of Research in Crime and Delinquency, 44(1) (February 2007): 3-
35.
Gottfredson, Denise C., Brook W. Kearley, Stacy S. Najaka, and Carlos
M. Rocha. "Long-term effects of participation in the Baltimore City
drug treatment court: Results from an experimental study." Journal of
Experimental Criminology, 2(1) (January 2006): 67-98.
Gottfredson, Denise C., Brook W. Kearley, Stacy S. Najaka, and Carlos
M. Rocha. "The Baltimore City Drug Treatment Court: 3-Year Self-Report
Outcome Study." Evaluation Review, 29(1) (February 2005): 42-64.
Krebs, C.P., C.H. Lindquist, W. Koetse, and P.K. Lattimore. "Assessing
the long-term impact of drug court participation on recidivism with
generalized estimating equations." Drug and Alcohol Dependence, 91(1)
(November 2007): 57-68.
Labriola, Melissa M. The Drug Court Model and Chronic Misdemeanants:
Impact Evaluation of the Queens Misdemeanor Treatment Court. New York,
NY: Center for Court Innovation, 2009.
Latimer, Jeff, Kelly Morton-Bourgon, and Jo-Anne Chrétien. A Meta-
Analytic Examination of Drug Treatment Courts: Do They Reduce
Recidivism? Ottawa, Ontario: Department of Justice Canada, 2006.
Listwan, Shelley Johnson, James Borowiak, and Edward J. Latessa. An
Examination of Idaho's Felony Drug Courts: Findings and
Recommendations-Final Report. Kent State University and University of
Cincinnati: 2008.
Logan, T. K., William H. Hoyt, Kathryn E. McCollister, Michael T.
French, Carl Leukefeld, and Lisa Minton. "Economic evaluation of drug
court: methodology, results, and policy implications." Evaluation and
Program Planning, 27 (2004): 381-396.
Loman, Anthony L. A Cost-Benefit Analysis of the St. Louis City Adult
Felony Drug Court. St. Louis, MO: Institute of Applied Research, 2004.
Lowenkamp, Christopher T., Alexander M. Holsinger, and Edward J. Latessa.
"Are Drug Courts Effective: A Meta-Analytic Review." Journal of
Community Corrections. (Fall 2005): 5-28.
Mackin, Juliette R., Shannon M. Carey, and Michael W. Finigan. Harford
County District Court Adult Drug Court: Outcome and Cost Evaluation.
Portland, OR: NPC Research, 2008.
Mackin, Juliette R., Shannon M. Carey, and Michael W. Finigan. Prince
George's County Circuit Court Adult Drug Court: Outcome and Cost
Evaluation. Portland, OR: NPC Research, 2008.
Mackin, Juliette R., Lisa M. Lucas, Callie H. Lambarth, Mark S.
Waller, Shannon M. Carey, and Michael W. Finigan. Baltimore City
Circuit Court Adult Drug Treatment Court and Felony Diversion
Initiative: Outcome and Cost Evaluation-Final Report. Portland, OR:
NPC Research, 2009.
Mackin, Juliette R., Lisa M. Lucas, Callie H. Lambarth, Mark S.
Waller, Theresa Allen Herrera, Shannon M. Carey, and Michael W.
Finigan. Howard County District Court Drug Treatment Court Program
Outcome and Cost Evaluation. Portland, OR: NPC Research, 2010.
Mackin, Juliette R., Lisa M. Lucas, Callie H. Lambarth, Mark S.
Waller, Theresa Allen Herrera, Shannon M. Carey, and Michael W.
Finigan. Montgomery County Adult Drug Court Program Outcome and Cost
Evaluation. Portland, OR: NPC Research, 2010.
Mackin, Juliette R., Lisa M. Lucas, Callie H. Lambarth, Mark S.
Waller, Theresa Allen Herrera, Shannon M. Carey, and Michael W.
Finigan. Wicomico County Circuit Court Adult Drug Treatment Court
Program Outcome and Cost Evaluation. Portland, OR: NPC Research, 2009.
Mackin, Juliette R., Lisa M. Lucas, Callie H. Lambarth, Mark S.
Waller, Judy M. Weller, Jennifer A. Aborn, Robert Linhares, Theresa L.
Allen, Shannon M. Carey, and Michael W. Finigan. Baltimore City
District Court Adult Drug Treatment Court: 10-Year Outcome and Cost
Evaluation. Portland, OR: NPC Research, 2009.
Marchand, Gwen, Mark Waller, and Shannon M. Carey. Barry County Adult
Drug Court Outcome and Cost Evaluation-Final Report. Portland, OR: NPC
Research, 2006.
Marchand, Gwen, Mark Waller, and Shannon M. Carey. Kalamazoo County
Adult Drug Treatment Court Outcome and Cost Evaluation-Final Report.
Portland, OR: NPC Research, 2006.
Marinelli-Casey, Patricia, Rachel Gonzales, Maureen Hillhouse, Alfonso
Ang, Joan Zweben, Judith Cohen, Peggy Fulton Hora, and Richard A.
Rawson. "Drug court treatment for methamphetamine dependence:
Treatment response and posttreatment outcomes." Journal of Substance
Abuse Treatment, 34(2) (March 2008): 242-248.
Mitchell, Ojmarrh, and Adele Harrell. "Evaluation of the Breaking the
Cycle Demonstration Project: Jacksonville, FL and Tacoma, WA." Journal
of Drug Issues, 36(1) (Winter 2006): 97-118.
Piper, R. K., and Cassia Spohn. Cost/Benefit Analysis of the Douglas
County Drug Court. Omaha, NE: University of Nebraska at Omaha, 2004.
Rhodes, William, Ryan Kling, and Michael Shively. Suffolk County Drug
Court Evaluation. Abt Associates, Inc., 2006.
Rhyne, Charlene. Clean Court Outcome Study. Portland, OR: Multnomah
County Department of Community Justice, 2004.
Rossman, S., M. Rempel, J. Roman, et al. The Multi-Site Adult Drug
Court Evaluation: The Impact of Drug Courts. Washington, D.C.: Urban
Institute, 2011.
Shaffer, Deborah K., Kristin Bechtel, and Edward J. Latessa.
Evaluation of Ohio's Drug Courts: A Cost Benefit Analysis. Cincinnati,
OH: Center for Criminal Justice Research, University of Cincinnati,
2005.
Wilson, David B., Ojmarrh Mitchell, and Doris L. MacKenzie. "A
systematic review of drug court effects on recidivism." Journal of
Experimental Criminology, 2(4) (2006): 459-487.
Zarkin, Gary A., Lara J. Dunlap, Steven Belenko, and Paul A. Dynia. "A
Benefit-Cost Analysis of the Kings County District Attorney's Office
Drug Treatment Alternative to Prison (DTAP) Program." Justice Research
and Policy, 7(1) (2005).
[End of section]
Footnotes:
[1] The types of drug courts include adult drug courts, juvenile drug
courts, family drug courts, tribal drug courts, designated Driving
Under the Influence (DUI) courts, campus drug courts, reentry drug
courts, federal reentry drug courts, veterans drug courts, and co-
occurring disorder courts--for offenders with mental health and
substance addiction issues.
[2] The Adult Drug Court Discretionary Grant Program was originally
authorized under Title V of the Violent Crime Control and Law
Enforcement Act of 1994, Pub. L. No. 103-322, 108 Stat. 1796, 1955-59,
and subsequently reauthorized by Title II of the 21st Century
Department of Justice Appropriations Authorization Act, Pub. L. No. 107-
273, § 2301, 116 Stat. 1758, 1794-99 (2002) (codified at 42 U.S.C. §§
3797u-u-8). Drug court programs have also received funding from other
federal sources, and state and local governments.
[3] BJA's grant solicitation states that to assist DOJ in fulfilling
its obligation under GPRA, grantees must provide certain requested
data. GPRA was intended to address several broad purposes, including,
among other things, improving federal program effectiveness,
accountability, and service delivery; and enhancing congressional
decision making by providing more objective information on program
performance.
[4] GAO, Drug Courts: Better DOJ Data Collection and Evaluation
Efforts Needed to Measure Impact of Drug Court Programs, [hyperlink,
http://www.gao.gov/products/GAO-02-434] (Washington, D.C.: Apr. 18,
2002).
[5] We use the term recidivism to refer generally to the act of
committing new criminal offenses after having been arrested or
convicted of a crime.
[6] GAO, Adult Drug Courts: Evidence Indicates Recidivism Reductions
and Mixed Results for Other Outcomes, [hyperlink,
http://www.gao.gov/products/GAO-05-219] (Washington, D.C.: Feb. 25,
2005).
[7] Fair Sentencing Act of 2010, Pub. L. No. 111-220, § 9, 124 Stat.
2372, 2374-75.
[8] Grantees are defined as states, state courts, local courts, units
of local government, and Indian tribal governments acting directly or
through an agreement with other public or private entities that
receive funding under the drug court program. 42 U.S.C. § 3797u(a).
[9] GAO, Standards for Internal Control in the Federal Government,
[hyperlink, http://www.gao.gov/products/GAO-AIMD-00-21.3.1]
(Washington, D.C.: November 1999).
[10] GAO, Tax Administration: IRS Needs to Further Refine Its Tax
Filing Season Performance Measures, [hyperlink,
http://www.gao.gov/products/GAO-03-143] (Washington, D.C.: November
2002); and GAO, Recovery Act: Department of Justice Could Better
Assess Justice Assistance Grant Program Impact, [hyperlink,
http://www.gao.gov/products/GAO-11-87] (Washington, D.C.: October
2010).
[11] In February 2005, we studied evaluations of drug court programs
that were published from May 1997 through January 2004.
[12] NIJ is the research, development, and evaluation agency of DOJ.
[13] NADCP is a national membership and advocacy organization of drug
court professionals that provides for the collection and dissemination
of information, technical assistance, and mutual support to
association members.
[14] 42 U.S.C. §§ 3797u-u-8. Drug courts funded by BJA are required to
involve mandatory periodic drug testing, graduated sanctions for
participants who fail drug tests, and continuing judicial supervision
over offenders, among other requirements. Id. Federal drug court
grants have a matching requirement. Drug court grants are not
permitted to cover more than 75 percent of the total costs of the
project being funded. Grant applicants are required to identify a
nonfederal source of 25 percent of the program's cost with cash or in-
kind services, or some combination of both. 42 U.S.C. § 3797u-5.
[15] This joint program offers grantees the opportunity to design a
comprehensive strategy for enhancing drug court capacity while
accessing both criminal justice and substance-abuse treatment funds
under a single grant application. These grants are authorized under
section 509 of the Public Health Service Act, as amended (42 U.S.C. §
290bb-2) to provide Adult Treatment Drug Court grants.
[16] The appropriation amounts include adult drug courts, juvenile
drug court programs, training and technical assistance, and other
related expenses, among other things.
[17] For fiscal year 2011, the number of Adult Drug Court
Discretionary Grant Program grantee awards and award amounts were not
available at the time of our review.
[18] The average Adult Drug Court Discretionary Grant Program award
amount ranged from $122,000 in fiscal year 2006 to $267,000 in fiscal
year 2010.
[19] [hyperlink, http://www.gao.gov/products/GAO-05-219].
[20] See Jeff Latimer, Kelly Morton-Bourgon, and Jo-Anne Chrétien. A
Meta-Analytic Examination of Drug Treatment Courts: Do
they Reduce Recidivism? (Ottawa, Ontario: Department of Justice
Canada, 2006), 12. Christopher T. Lowenkamp, Alexander M. Holsinger,
and Edward J. Latessa. "Are drug courts effective: A meta-analytic
review," Journal of Community Corrections. (Fall 2005), 8, 9, 28.
David B. Wilson, Ojmarrh Mitchell, and Doris L. MacKenzie. "A
systematic review of drug court effects on recidivism." Journal of
Experimental Criminology, 2(4) (2006), 468-469.
[21] BJA officials stated that its Policy, Programs, and Planning
Offices participate in the GrantStat reviews.
[22] According to BJA officials, the contractor provides a range of
data collection, technical assistance, analytical, and research
services to BJA and its grantees. This includes developing and
maintaining the PMT and providing a user support help desk and formal
training to grantees regarding their reporting requirements. In
addition, contractor analysts review, analyze, and report on BJA
grantees' performance data to BJA.
[23] See appendix VI for more information regarding the 10 key
components.
[24] GAO, Managing For Results: Enhancing Agency Use of Performance
Information for Management Decision Making, [hyperlink,
http://www.gao.gov/products/GAO-05-927] (Washington, D.C.: Sept. 9,
2005).
[25] GAO, Justice: A Time Frame for Enhancing Grant Monitoring
Documentation and Verification of Data Quality Would Help Improve
Accountability and Resource Allocation Decisions, [hyperlink,
http://www.gao.gov/products/GAO-09-850R] (Washington, DC: September
2009) and GAO, Performance Measurement and Evaluation: Definitions and
Relationships, [hyperlink, http://www.gao.gov/products/GAO-05-739SP]
(Washington, DC: May 2005).
[26] GAO, Drug Control: DOD Needs to Improve Its Performance
Measurement System to Better Manage and Oversee Its Counternarcotics
Activities, [hyperlink, http://www.gao.gov/products/GAO-10-835]
(Washington, D.C.: July 2010).
[27] We identified the first four management activities in table 1 as
relevant from governmentwide surveys of federal managers that GAO
conducted in 1997, 2000, and 2003. See [hyperlink,
http://www.gao.gov/products/GAO-05-927]. The remaining two activities
we identified, in part, by reviewing performance management
literature. BJA staff confirmed each of these six to be relevant to
managing the drug court program.
[28] The technical assistance providers included: American University,
Tribal Law and Policy Institute, Center for Court Innovation, the
National Association of Drug Court Professionals, the Office of
Management and Budget, and the National Center for State Courts.
[29] BJA reported that DOJ stakeholders consulted included staff from
NIJ, OJP, DOJ's Policy, Management, and Planning Branch, DOJ's Chief
Financial Officer, the Office of Juvenile Justice and Delinquency
Programs, and BJA's Director.
[30] GAO, Information Security: Concerted Effort Needed to Improve
Federal Performance Measures, [hyperlink,
http://www.gao.gov/products/GAO-09-617] (Washington, D.C.: September
2009); GAO, Results-Oriented Cultures: Creating a Clear Linkage
between Individual Performance and Organizational Success, [hyperlink,
http://www.gao.gov/products/GAO-03-488] (Washington, D.C.: March
2003); and GAO, Managing for Results: Measuring Program Results That
Are Under Limited Federal Control [hyperlink,
http://www.gao.gov/products/GAO/GGD-99-16] (Washington, D.C.:
December 1998).
[31] These attributes are clarity, reliability, linkage to strategic
goals, objectivity, and measurable targets. See GAO, Tax
Administration: IRS Needs to Further Refine Its Tax Filing Season
Performance Measures, [hyperlink,
http://www.gao.gov/products/GAO-03-143] (Washington, D.C.: November
2002); and GAO, Recovery Act: Department of Justice Could Better
Assess Justice Assistance Grant Program Impact, [hyperlink,
http://www.gao.gov/products/GAO-11-87] (Washington, D.C.: October
2010).
[32] GAO, Grants Management: Enhancing Performance Accountability
Provisions Could Lead to Better Results, [hyperlink,
http://www.gao.gov/products/GAO-06-1046] (Washington, D.C.: September
2006).
[33] [hyperlink, http://www.gao.gov/products/GAO-09-850R]; GAO,
Performance Plans: Selected Approaches for Verification and Validation
of Agency Performance Information, [hyperlink,
http://www.gao.gov/products/GAO/GGD-99-139] (Washington, D.C.: July
1999).
[34] [hyperlink, http://www.gao.gov/products/GAO-05-927].
[35] [hyperlink, http://www.gao.gov/products/GAO/GGD-99-139].
[36] Theodore H. Poister, Measuring Performance in Public and
Nonprofit Organizations. The Jossey-Bass Non-Profit and Public
Management Series (San Francisco: Jossey-Bass, 2003).
[37] [hyperlink, http://www.gao.gov/products/GAO-AIMD-00-21.3.1].
[38] We report findings to be statistically significant only if they
were significant at the 95-percent, or greater, level of statistical
significance, even though some studies reported findings to be
statistically significant at the 90-percent level. In general, the
evaluations we reviewed reported differences in overall rearrest
rates--that is, the percentage of a group rearrested for any new
offense in a given period of time--although some evaluations reported
differences in the number of rearrests or the relative odds of
rearrest. Of the 32 programs reviewed, 31 showed lower recidivism for
drug court program participants, and for 18 of these programs, the
differences were statistically significant. The findings for the
remaining 13 programs were either not statistically significant or the
significance of their findings was not reported.
[39] It is important to note that the studies we reviewed did not
include treatments other than drug court; for example, they did not
measure the relative effectiveness of drug treatment programs
administered outside of a drug court.
[40] These percentages were adjusted for differences in the baseline
characteristics of the individuals in the two groups compared as well
as differences in the baseline characteristics of the programs they
were in.
[41] The range of percentage differences for re-arrest rates was
narrower for higher quality studies as a group than for lower quality
studies, and the differences for higher quality studies did not range
as high.
[42] We are reporting on the eight programs for which drug-relapse
data from drug court participants were compared with a comparison
group. Evaluations of other programs included information on drug-
relapse only for drug court participants.
[43] The estimate of $6,208 reflects the hierarchical modeling used in
the MADCE study. However, according to NIJ officials, the estimated
net benefits could be as low as $5,680, under different assumptions.
[44] GAO tracks recommendations for implementation and has closed
these as either being fully or partially implemented.
[45] [hyperlink, http://www.gao.gov/products/GAO-02-434].
[46] The Urban Institute is a nonpartisan economic and social policy
research organization. The Center for Court Innovation functions as
the independent research and development arm for the New York State
court system and provides criminal justice consulting services to
jurisdictions outside New York. The Research Triangle Institute is an
independent, nonprofit institute that provides research, development,
and technical services to government and commercial clients.
[47] Numbers, percentages, and differences in the foregoing and
following bullets are adjusted (or estimated), as opposed to raw (or
observed) numbers, percentages or differences; that is, they were
obtained by the MADCE researchers from statistical models that
estimated them after adjusting for differences in the baseline
characteristics of the individuals in the two groups compared as well
as differences in the baseline characteristics of the programs they
were in.
[48] Propensity score adjustments are a statistical approach to
control for baseline differences between the drug court and comparison
groups and to correct for attrition and selection biases by
effectively giving greater weight to underrepresented categories of
offenders and lesser weight to overrepresented categories of
offenders. Hierarchical linear models are used to take account of the
nesting--or clustering--of participants within the different sites.
These statistical adjustments were necessary as a result of baseline
differences between specific drug court and comparison groups and the
specific individuals in them, and because of the attrition that
occurred in both the drug offender and comparison samples over time.
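The inverse-propensity weighting described above can be illustrated with a minimal sketch. All data and variable names below are hypothetical, and this two-line weighting scheme is only the basic idea; the MADCE researchers' actual models (including the hierarchical linear models for site clustering) were substantially more elaborate.

```python
# Minimal sketch of inverse-propensity weighting (hypothetical data).
# Each record: (in_drug_court, propensity_score, rearrested), where
# propensity_score is the estimated probability of drug court
# participation, typically produced by a logistic regression on
# baseline characteristics.
records = [
    (1, 0.8, 0), (1, 0.6, 0), (1, 0.4, 1), (1, 0.7, 0),
    (0, 0.8, 1), (0, 0.3, 1), (0, 0.4, 0), (0, 0.2, 1),
]

def ipw_mean(records, group):
    """Weighted mean outcome for one group. Offenders whose baseline
    characteristics are underrepresented in the group (low propensity
    for that group) receive larger weights, as the footnote describes."""
    num = den = 0.0
    for treated, p, outcome in records:
        if treated != group:
            continue
        # Inverse propensity: 1/p for participants, 1/(1-p) for comparisons.
        w = 1.0 / p if group == 1 else 1.0 / (1.0 - p)
        num += w * outcome
        den += w
    return num / den

# Adjusted difference in rearrest rates (participants minus comparisons).
effect = ipw_mean(records, 1) - ipw_mean(records, 0)
print(f"adjusted rearrest-rate difference: {effect:.3f}")
```

With these illustrative numbers the weighted rearrest rate is lower for the drug court group, so the adjusted difference is negative; the weighting, not the raw group means, drives the estimate.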
[49] The estimate of $6,208 reflects the hierarchical modeling used in
the MADCE study. However, according to NIJ officials, the estimated
net benefits could be as low as $5,680, under different assumptions.
[50] Office of Management and Budget, Circular A-94 Guidelines and
Discount Rates for Benefit-Cost Analysis of Federal Programs
(Washington, D.C.: 1992), p. 4.
[51] Grantees are defined as states, state courts, local courts, units
of local government, and Indian tribal governments acting directly or
through an agreement with other public or private entities that
receive funding under the drug court program. 42 U.S.C. § 3797u(a).
[52] GAO, Standards for Internal Control in the Federal Government,
[hyperlink, http://www.gao.gov/products/GAO-AIMD-00-21.3.1]
(Washington, D.C.: November 1999).
[53] The management activities include: (1) setting program
priorities; (2) allocating resources; (3) adopting new program
approaches or changing work processes; (4) identifying and sharing
with stakeholders more effective processes and approaches to program
implementation; (5) setting expectations for grantees; and (6)
monitoring grantee performance.
[54] GAO, Tax Administration: IRS Needs to Further Refine Its Tax
Filing Season Performance Measures, [hyperlink,
http://www.gao.gov/products/GAO-03-143] (Washington, D.C.: November
2002); and GAO, Recovery Act: Department of Justice Could Better
Assess Justice Assistance Grant Program Impact, [hyperlink,
http://www.gao.gov/products/GAO-11-87] (Washington, D.C.: October
2010).
[55] In February 2005, we studied evaluations of drug court programs
that were published from May 1997 through January 2004.
[56] We searched the ERIC, Biosis Previews, Social Scisearch, Gale
Group Magazine Database, Gale Group Health & Wellness Database, Gale
Group Legal Resource Index, Wilson Social Science Abstracts, and
Periodical Abstracts PlusText.
[57] Prior GAO reports included, GAO, Drug Courts: Information on a
New Approach to Address Drug-Related Crime, [hyperlink,
http://www.gao.gov/products/GAO/GGD-95-159BR] (Washington, D.C.: May
22, 1995); GAO, Drug Courts: Overview of Growth, Characteristics, and
Results, [hyperlink, http://www.gao.gov/products/GAO/GGD-97-106]
(Washington, D.C.: July 31, 1997); and GAO, Drug Courts: Better DOJ
Data Collection and Evaluation Efforts Needed to Measure Impact of
Drug Court Programs, [hyperlink,
http://www.gao.gov/products/GAO-02-434] (Washington, D.C.: Apr. 18,
2002); GAO, Adult Drug Courts: Evidence Indicates Recidivism
Reductions and Mixed Results for Other Outcomes, [hyperlink,
http://www.gao.gov/products/GAO-05-219] (Washington, D.C.: Feb. 28,
2005).
[58] A process evaluation assesses the extent to which a program is
operating as it was intended. It typically assesses program
activities' conformance to statutory and regulatory requirements,
program design, and professional standards or customer expectations.
[59] An experimental design is one in which eligible offenders were
randomly assigned to different programs. A quasi-experimental design
is one in which (1) all drug-court program participants were compared
with an appropriate group of comparable offenders who did not
participate in the drug court program, and (2) appropriate statistical
methods were used to adjust, or control, for group differences.
[60] Some studies reported results that were aggregated from multiple
drug court programs.
[61] For a summary of how drug court studies have addressed selection
bias in the past, see [hyperlink,
http://www.gao.gov/products/GAO-05-219], pp. 16-24.
[62] [hyperlink, http://www.gao.gov/products/GAO-05-219], p. 27.
[63] Drug courts funded by BJA are required to involve mandatory
periodic drug testing, graduated sanctions for participants who fail
drug tests, and continuing judicial supervision over offenders, among
other requirements. 42 U.S.C. §§ 3797u–3797u-8.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation, and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the
performance and accountability of the federal government for the
American people. GAO examines the use of public funds; evaluates
federal programs and policies; and provides analyses, recommendations,
and other assistance to help Congress make informed oversight, policy,
and funding decisions. GAO's commitment to good government is
reflected in its core values of accountability, integrity, and
reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's website [hyperlink, http://www.gao.gov]. Each
weekday afternoon, GAO posts on its website newly released reports,
testimony, and correspondence. To have GAO e-mail you a list of newly
posted products, go to [hyperlink, http://www.gao.gov] and select
"E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black
and white. Pricing and ordering information is posted on GAO's
website, [hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
Connect with GAO:
Connect with GAO on Facebook, Flickr, Twitter, and YouTube.
Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.
Visit GAO on the web at [hyperlink, http://www.gao.gov].
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Website: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm];
E-mail: fraudnet@gao.gov;
Automated answering system: (800) 424-5454 or (202) 512-7470.
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125
Washington, DC 20548.
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, DC 20548.