Coast Guard
Non-Homeland Security Performance Measures Are Generally Sound, but Opportunities for Improvement Exist
GAO ID: GAO-06-816, August 16, 2006
GAO-06-816, Coast Guard: Non-Homeland Security Performance Measures Are Generally Sound, but Opportunities for Improvement Exist
This is the accessible text file for GAO report number GAO-06-816
entitled 'Coast Guard: Non-Homeland Security Performance Measures Are
Generally Sound, but Opportunities for Improvement Exist' which was
released on September 18, 2006.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to the Subcommittee on Fisheries and Coast Guard, Committee on
Commerce, Science, and Transportation, U.S. Senate:
United States Government Accountability Office:
GAO:
August 2006:
Coast Guard:
Non-Homeland Security Performance Measures Are Generally Sound, but
Opportunities for Improvement Exist:
Coast Guard:
GAO-06-816:
GAO Highlights:
Highlights of GAO-06-816, a report to the Subcommittee on Fisheries and
Coast Guard, Committee on Commerce, Science, and Transportation, U.S.
Senate
Why GAO Did This Study:
Using performance measures, the Coast Guard explains how well its
programs are performing. To do so, it reports one "primary" measure for
each program (such as percent of mariners rescued) and maintains data
on other, "secondary" measures (such as percent of property saved).
Concerns have been raised about whether measures for non-homeland
security programs accurately reflect performance because reported
results did not rise or fall as resources were added or reduced. For
the six non-homeland security programs, GAO used established criteria
to assess the soundness of the primary measures--that is, whether
measures cover key activities; are clearly stated; and are objective,
measurable, and quantifiable--and the reliability of data used to
calculate them. GAO also used these criteria to assess the soundness of
23 selected secondary measures. Finally, through interviews and report
review, GAO assessed challenges in using measures to link resources to
results.
What GAO Found:
While some opportunities for improvement exist, the primary measures
for the Coast Guard's six non-homeland security programs are generally
sound, and the data used to calculate them are generally reliable. All
six measures cover key program activities and are objective,
measurable, and quantifiable, but three are not completely clear--that
is, they do not consistently provide clear and specific descriptions of
the data, events, or geographic areas they include. Also, the processes
used to enter and review the Coast Guard's own internal data are likely
to produce reliable data; however, neither the Department of Homeland
Security (DHS) nor the Coast Guard has policies or procedures for
reviewing or verifying data from external sources, such as other
federal agencies. Currently, the review processes vary from source to
source, and for the primary measure covering marine environmental
protection (which concerns oil and chemical spills), the processes are
insufficient.
Of the 23 secondary performance measures GAO assessed, 9 are generally
sound, with weaknesses existing in the remaining 14. These weaknesses
include (1) a lack of measurable performance targets, (2) a lack of
agencywide criteria or guidance to ensure objectivity, and (3) unclear
descriptions of the measures.
Two main challenges exist with using primary measures to link resources
to results. In one case, the challenge is comprehensiveness--that is,
although each primary measure captures a major segment of program
activity, no one measure captures all program activities and thereby
accounts for all program resources. The other challenge involves
external factors, some of which are outside the Coast Guard's control,
that affect performance. For example, weather conditions can affect the
amount of ice that must be cleared, the number of aids to navigation
that need repair, or the number of mariners that must be rescued. As a
result, linking resources and results is difficult, and although the
Coast Guard has a range of ongoing initiatives to do so, it is still too
early to assess the agency's ability to successfully provide this link.
Table: Soundness of Primary Measures and Reliability of Data Used to
Calculate the Primary Measure for the Coast Guard's Non-Homeland
Security Programs:
Program: Aids to navigation;
Is the Primary Measure sound?: Yes;
Are the data used to calculate the measure reliable?: Yes.
Program: Ice operations;
Is the Primary Measure sound?: Weaknesses identified; Are the data used
to calculate the measure reliable?: Yes.
Program: Living marine resources;
Is the Primary Measure sound?: Weaknesses identified; Are the data used
to calculate the measure reliable?: Yes.
Program: Marine environmental protection; Is the Primary Measure
sound?: Yes;
Are the data used to calculate the measure reliable?: Weaknesses
identified.
Program: Marine safety;
Is the Primary Measure sound?: Yes;
Are the data used to calculate the measure reliable?: Yes.
Program: Search and rescue;
Is the Primary Measure sound?: Weaknesses identified; Are the data used
to calculate the measure reliable?: Yes.
Source: GAO analysis of Coast Guard primary performance measures.
[End of Table]
What GAO Recommends:
GAO made recommendations to clarify, develop targets, establish
criteria, and review external data for certain performance measures and
improve the Coast Guard's overall reporting of results. DHS and the
Coast Guard generally agreed with the recommendations in this report.
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-816].
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Stephen L. Caldwell at
(202) 512-9610 or caldwells@gao.gov.
[End of Section]
Contents:
Letter:
Results in Brief:
Background:
Non-Homeland Security Primary Performance Measures Are Generally Sound
and Data Are Generally Reliable, but Weaknesses Exist:
More than a Third of the Secondary Performance Measures Assessed Are
Generally Sound, and the Remainder Have Weaknesses:
Challenges Exist in Using Measures to Link Resources to Results, but
the Coast Guard Is Working on Ways to Address Them:
Conclusions:
Recommendations for Executive Action:
Agency Comments:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Secondary Performance Measures:
Appendix III: Ongoing Coast Guard Initiatives to Link Resources Used to
Results Achieved:
Appendix IV: Comments from the Department of Homeland Security:
Appendix V: GAO Contact and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Coast Guard's Non-Homeland Security Programs:
Table 2: Soundness of Primary Measures and Reliability of Data Used to
Calculate the Primary Measures for the Coast Guard's Non-Homeland
Security Programs:
Table 3: Source of Data Used to Calculate Non-Homeland Security Primary
Performance Measures:
Table 4: Soundness of Selected Non-Homeland Security Secondary
Performance Measures:
Table 5: Soundness of Secondary Measures for Coast Guard's Non-Homeland
Security Programs:
Table 6: Coast Guard Non-Homeland Security Secondary Performance
Measures Not Assessed:
Table 7: Ongoing Coast Guard Initiatives to Link Resources Used to
Results Achieved:
Abbreviations:
Corps: United States Army Corps of Engineers:
BARD: Boating Accident Reporting Database:
DHS: Department of Homeland Security:
GPRA: Government Performance and Results Act of 1993:
OMB: Office of Management and Budget:
PART: Program Assessment Rating Tool:
United States Government Accountability Office:
Washington, DC 20548:
August 16, 2006:
The Honorable Olympia J. Snowe:
Chair:
The Honorable Maria Cantwell:
Ranking Minority Member:
Subcommittee on Fisheries and Coast Guard:
Committee on Commerce, Science, and Transportation:
United States Senate:
Recent years have seen a marked shift in the Coast Guard's
responsibilities. The events of September 11, 2001, shifted the Coast
Guard's priorities and focus toward homeland security responsibilities,
such as protecting the nation's network of ports and waterways. At the
same time, however, the agency's traditional non-homeland security
programs, such as rescuing people at sea and directing oil spill
cleanup efforts, remain an integral part of its operations. In all, the
Coast Guard has six non-homeland security programs (see table 1), and
collectively, the effort that goes into them constitutes 50 percent of
the Coast Guard's fiscal year 2006 enacted budget.[Footnote 1]
Table 1: Coast Guard's Non-Homeland Security Programs:
Program[A]: Aids to navigation;
Brief description: Managing U.S. waterways through maintaining
navigation aids and monitoring marine traffic.
Program[A]: Ice operations;
Brief description: Conducting domestic and polar icebreaking and
international ice monitoring.
Program[A]: Living marine resources;
Brief description: Ensuring fishermen's compliance with domestic living
marine resources laws and regulations within the U.S. Exclusive
Economic Zone through at-sea enforcement[B].
Program[A]: Marine environmental protection;
Brief description: Preventing and responding to oil and chemical
spills; preventing the introduction of invasive aquatic nuisance
species; and preventing illegal dumping of plastics and garbage in U.S.
waters.
Program[A]: Marine safety;
Brief description: Setting safety standards and inspecting commercial
and passenger vessels; partnering with states and organizations to
reduce recreational boating deaths.
Program[A]: Search and rescue;
Brief description: Conducting operations to find and assist mariners in
distress.
Source: GAO analysis of Coast Guard documents.
[A] Starting with the fiscal year 2007 budget, OMB has designated the
Coast Guard's drug interdiction and other law enforcement programs as
non-homeland security missions for budgetary purposes. However, at the
time of our review, Coast Guard officials told us that, in terms of
measuring performance, the agency still categorized these programs as
homeland security missions as delineated under section 888 of the
Homeland Security Act of 2002, 6 U.S.C. § 468.
[B] The U.S. Exclusive Economic Zone is defined as an area within 200
miles of U.S. shores in which U.S. citizens have primary harvesting
rights to fish stocks.
[End of table]
Since the changes that increased the Coast Guard's homeland security
responsibilities, Congress has paid renewed attention to the Coast
Guard's ability to carry out its non-homeland security programs. To
help gauge its performance in these areas, the Coast Guard collects
data on 45 performance measures, such as the percentage of mariners
successfully rescued from imminent danger and the number of oil spills
and chemical discharges. When reporting its performance, the Coast
Guard follows the instructions of its parent agency, the Department of
Homeland Security (DHS), and reports one measure for each program. For
example, for the ice operations program, the Coast Guard reports on the
annual number of days certain waterways are closed because of ice, and
for the aids to navigation program, the Coast Guard reports on the
number of collisions, allisions, and groundings.[Footnote 2] These
performance measures, which we call "primary measures" in this report,
are intended to communicate Coast Guard performance and provide
information for the budgeting process to Congress, other policymakers,
and taxpayers. Beyond the six primary performance measures, the Coast
Guard also uses a variety of other performance measures to manage its
programs, called "secondary measures" in this report. There are three
key publications that DHS and the Coast Guard use to report the Coast
Guard's non-homeland security primary performance measures--the DHS
Performance and Accountability Report, the DHS fiscal year budget
request, and the Coast Guard's fiscal year Budget-in-Brief.
Our recent analyses have raised concerns about whether the primary
measures accurately reflect what the Coast Guard is accomplishing with
the resources it expends. In April 2004, we testified that despite
substantial changes in the distribution of resources among programs,
performance results appeared largely unaffected, and the Coast Guard
had limited data and no systematic approach to explain the lack of a
clear relationship between resources expended and performance results
achieved.[Footnote 3] You asked us to consider whether shortcomings in
the primary measures might explain why there was no apparent connection
between resources expended and results achieved for the non-homeland
security programs. In response, we evaluated the primary measures for
the Coast Guard's six non-homeland security programs with regard to two
key characteristics: (1) their soundness--that is, whether the measures
cover the key activities of the program, are clearly stated and
described, and are objective, measurable, and quantifiable--including
having annual targets--and (2) the reliability of the data used to
calculate the measures--that is, whether controls are in place to
ensure the timeliness, completeness, accuracy, and consistency of the
data.[Footnote 4] You also asked us to provide information on some of
the secondary measures that are used in the Coast Guard's six non-
homeland security programs. Our report addresses three questions:
* Are the primary performance measures for the Coast Guard's six non-
homeland security programs sound, and are the data used to calculate
them reliable?
* Are selected secondary performance measures for four of the Coast
Guard's non-homeland security programs sound?
* What challenges, if any, are present in trying to use the primary
measures to link resources expended and results achieved?
To conduct our analysis of the soundness of the primary performance
measures, we relied primarily on a set of criteria that we had
previously developed.[Footnote 5] These criteria were developed based
on the Government Performance and Results Act of 1993 (GPRA) and Office
of Management and Budget (OMB) guidelines for agency performance
measures.[Footnote 6] We used our judgment to assess whether these
measures met our criteria. We also reviewed the fiscal years 2005 and
2006 DHS Performance and Accountability Report, the fiscal years 2006
and 2007 DHS budget requests, and the Coast Guard's fiscal years 2006
and 2007 Budget-in-Brief. To conduct our reliability analysis, we
relied primarily on comparisons of Coast Guard data collection methods
and internal control processes with GPRA and the Reports Consolidation
Act of 2000 requirements, as well as commonly accepted standards and
practices.[Footnote 7] Our reliability analysis assessed only the
specific data fields used to collect and report data for the six non-
homeland security primary performance measures, and not the relevant
databases as a whole. We reviewed and analyzed information collected
and assembled at Coast Guard headquarters as well as at four Coast
Guard field locations.[Footnote 8] To the extent possible, we also
reviewed secondary measures for four of the six non-homeland security
programs.[Footnote 9] To identify and assess the challenges in trying
to use the primary measures to link resources expended and results
achieved, we interviewed Coast Guard officials at agency headquarters to
discuss how measures are used in resource and budget allocation
decisions and reviewed previous GAO reports on performance measures,
performance reporting, and the link between the Coast Guard's resources
used and results achieved. We conducted our work from July 2005 to
August 2006 in accordance with generally accepted government auditing
standards. More details about the scope and methodology of our work are
presented in appendix I.
Results in Brief:
Although some opportunities for improvement exist, the Coast Guard's
primary performance measures for its six non-homeland security programs
are generally sound, and the data used to calculate them are generally
reliable. All six measures are generally sound in that they cover key
program activities and are objective, measurable, and quantifiable, but
three are not completely clear, that is, they do not consistently
provide clear and specific descriptions of the data, events, or
geographic areas they include. For example, the primary performance
measure for ice operations, "domestic icebreaking--annual number of
waterway closure days," actually only reflects closures for certain
waterways within the Great Lakes region. Although these waterways are
the main location for domestic icebreaking, icebreaking also takes
place on the East Coast. While this caveat is included in some
accompanying text, the description is inconsistent across department
and agency publications. For instance, the DHS fiscal year 2005
Performance and Accountability Report notes that the measure is made up
of nine critical waterways, but the DHS fiscal year 2007 budget request
reports that the measure consists of seven critical waterways, while
the Coast Guard's fiscal year 2007 Budget-in-Brief does not mention the
number of waterways included in the measure. With regard to data
reliability, although the processes the Coast Guard uses to enter and
review its own internal data are likely to produce reliable data for
the performance measures we reviewed, we identified weaknesses with
processes used to review the reliability of data gathered from external
sources. Specifically, we found that neither DHS nor the Coast Guard
has policies requiring review or consistent verification processes for
these data. Instead, the processes vary for different data sources. For
example, the Coast Guard tests the reliability of state-provided data
used for its marine safety program's primary measure, but does not test
the reliability of Army Corps of Engineers (Corps) data or review the
Corps' data reliability procedures for data used for its marine
environmental protection program's primary measure. While, according to
a Corps official, the Corps does have some controls in place, without,
at a minimum, familiarity with the internal controls used by the Corps
to ensure the reliability of these data, the Coast Guard cannot provide
assurance that the data are reliable.
For the four non-homeland security programs we assessed, more than a
third of the secondary performance measures are generally sound (9 of
the 23), while opportunities for improvement exist for the remainder
(14 of the 23). More specifically, for the 14 secondary measures, we
found (1) the Coast Guard does not have measurable targets to assess
whether program and agency goals and objectives are being achieved for
12 measures, (2) the Coast Guard does not have agencywide criteria or
guidance to accurately reflect program results and ensure objectivity
for 1 measure, and (3) the Coast Guard does not clearly state or
describe the data or events included in 1 measure. For example, a
secondary measure for the search and rescue program, "percent of lives
saved after Coast Guard notification," does not clearly state that it
excludes incidents in which 11 or more lives were saved or lost in a
single case. While including such large incidents in performance
measures would skew annual performance results, it is important for the
Coast Guard to identify these exclusions, either through a footnote or
accompanying text, to ensure that events such as the rescues during Hurricane
Katrina--when the agency rescued more than 33,500 people within a few
weeks--are recognized; otherwise, performance results could be
misinterpreted or misleading to users.
Although the primary performance measures are generally sound and data
used to calculate them are generally reliable, even sound performance
measures have limits to how much they can explain about the
relationship between resources expended and results achieved.
Specifically, we identified two challenges that stand in the way of
establishing a clear link between resources and results. One challenge
involves the difficulty of capturing an entire program such as ice
operations or marine environmental protection in a single performance
measure. The Coast Guard follows DHS guidance in reporting a single
measure per program, which is consistent with our prior work on
agencies that were successful in measuring performance and implementing
GPRA.[Footnote 10] However, reporting some secondary measures or
additional data in venues, such as the Coast Guard's annual Budget-in-
Brief or program-specific publications, could provide additional
context and help to more clearly articulate to stakeholders and
decision makers the relationship between resources expended and results
achieved. For instance, reporting data on the annual number of search
and rescue cases in the search and rescue program, in addition to its
primary measure, "the percent of mariners in imminent danger who are
rescued," can provide greater context for the program's activity level.
This is important because while the percentage of mariners saved may
remain consistent from year-to-year, the number of cases, number of
lives saved, and the resources used to achieve this result can vary.
The second challenge involves the Coast Guard's ability to account for
factors other than resources that can affect program results. Some of
these factors are external to the agency--and perhaps outside of its
ability to influence. Because of the potentially large number of
external factors, and their sometimes unpredictable or often unknown
effect on performance, it may be difficult to account for how they--and
not the resources expended on the program--affect results. For example,
a change in fishery regulations reduced the number of search and rescue
cases in Alaska because it provided greater flexibility for fishermen
to choose when they would fish for certain fish stocks--this
flexibility allowed them to choose different timeframes and therefore
safer weather conditions for their fishing activities. Developing a
system or model that could realistically take all such factors into
account may not be achievable, but the challenge is to develop enough
sophistication about each program's context so the Coast Guard can more
systematically consider these factors, and then explain their influence
on resource decisions and performance results. Recognizing these
limitations, and responding to recommendations we have made in past
reports, the Coast Guard has developed a range of initiatives that
agency officials believe will help explain the effects of these factors
and decide where resources are best spent.[Footnote 11] Some of these
initiatives have been ongoing for several years, and according to
agency officials, the extent and complexity of the effort, together
with challenges presented in integrating them into a data-driven and
comprehensive strategy, require additional time to complete.
Currently, the Coast Guard does not expect to fully implement many of
the initiatives until 2010, and thus it is not possible to assess their
likely impact in linking resources and results until they are further
developed and operational.
To improve the quality of program performance reporting and to more
efficiently and effectively assess progress toward achieving the goals
or objectives stated in agency plans, we are recommending that the
Secretary of the Department of Homeland Security direct the Commandant
of the Coast Guard to take steps to further improve the soundness of
the 3 primary measures and 14 secondary measures we found to have
weaknesses, develop and implement a policy to review the reliability of
all external data that are used in calculating performance measures, and
report additional information--besides the one primary performance
measure--in appropriate venues to better inform stakeholders and
decision makers about the relationship between resources expended and
results achieved. In commenting on this draft, DHS and Coast Guard
officials generally agreed with our findings and recommendations, and
provided technical comments that we incorporated.
Background:
The Coast Guard has responsibilities divided into 11 programs that fall
under two broad missions--homeland security and non-homeland
security--which are recognized in the Homeland Security Act. To accomplish its
wide range of responsibilities, the Coast Guard is organized into two
major commands that are responsible for overall mission execution--one
in the Pacific area and the other in the Atlantic area. These commands
are divided into nine districts, which in turn are organized into 35
sectors that unify command and control of field units and resources,
such as multimission stations and patrol boats. In fiscal year 2005,
the Coast Guard had over 46,000 full-time positions--about 39,000
military and 7,000 civilians. In addition, the agency had about 8,100
reservists who support the national military strategy or provide
additional operational support and surge capacity during times of
emergency, such as natural disasters. Furthermore, the Coast Guard had
about 31,000 volunteer auxiliary personnel who helped with a wide array
of activities, ranging from search and rescue to boating safety
education.
For each of its six non-homeland security programs, the Coast Guard has
developed a primary performance measure to communicate agency
performance and provide information for the budgeting process to
Congress, other policymakers, and taxpayers. The Coast Guard has also
developed 39 secondary measures that it uses to manage these six
programs. The Coast Guard selected and developed the six primary
measures based on a number of criteria, including GPRA, DHS, and OMB
guidance as well as legislative, department, and agency priorities.
When viewed as a suite of measures, the primary and secondary measures
combined are intended to provide Coast Guard officials with a more
comprehensive view of program performance than just the program's
primary measure. Some of these secondary measures are closely related
to the primary measures; for example, a secondary measure for the
marine environmental protection program, "annual number of oil spills
greater than 100 gallons and chemical discharges per 100 million tons
shipped," is closely related to the program's primary measure, "5-year
average annual number of oil spills greater than 100 gallons and
chemical discharges per 100 million tons shipped." However, other
secondary measures reflect activities and priorities that are not
reflected in the primary performance measures. For example, a secondary
measure in the search and rescue program, "percent of property saved,"
reflects activities not captured in the program's primary measure,
"percent of mariners in imminent danger who are rescued."
In 2004, we compared trends in performance results, as reported by the
Coast Guard's primary performance measures, with the agency's use of
resources and found that the relationship between results achieved and
resources used was not always what might be expected--that is,
resources expended and performance results achieved did not have a
consistent direction of movement and sometimes bore an opposite
relationship.[Footnote 12] We reported that disconnects between
resources expended and performance results achieved have important
implications for resource management and accountability, especially
given the Coast Guard's limited ability to explain them. In particular,
these disconnects prompted a question as to why, despite substantial
changes in a number of programs' resource hours used over the period we
examined, the corresponding performance results for these programs were
not necessarily affected in a similar manner--that is, they did not
rise or fall along with changes in resources.[Footnote 13] At that
time, the Coast Guard could not say with any assurance why this
occurred. For example, while resource hours for the search and rescue
program dropped by 22 percent in fiscal year 2003 when compared to the
program's pre-September 11, 2001 baseline, the performance results for
the program remained stable for the same period. These results suggest
that performance was likely affected by factors other than resource
hours. One set of factors cited by the Coast Guard as helping to keep
performance steady despite resource decreases involved strategies such
as the use of new technology, better operational tactics, improved
intelligence, and stronger partnering efforts. Coast Guard officials
also pointed to another set of factors, largely beyond the agency's
control (such as severe weather conditions), to explain performance
results that did not improve despite resource increases. At the time of
our 2004 report, the Coast Guard did not have a systematic approach to
effectively link resources to results. However, the Coast Guard had
begun some initiatives to better track resource usage and manage
program results, but many of these initiatives were still in early
stages of development and some did not have a time frame for
completion.
Like other federal agencies, DHS is subject to the performance-
reporting requirements of GPRA. GPRA requires agencies to publish a
performance report that includes performance measures and results.
These reports are intended to provide important information to agency
managers, policymakers, and the public on what each agency accomplished
with the resources it was given. The three key annual publications that
DHS and the Coast Guard use to report the Coast Guard's non-homeland
security primary performance measures are the DHS Performance and
Accountability Report, the DHS fiscal year budget request, and the
Coast Guard's fiscal year Budget-in-Brief. The DHS Performance and
Accountability Report provides financial and performance information to
the President, Congress and the public for assessing the effectiveness
of the department's mission performance and stewardship of resources.
The DHS annual budget request to Congress identifies the resources
needed for meeting the department's missions. The Coast Guard's annual
Budget-in-Brief reports performance information to assess the
effectiveness of the agency's performance as well as a summary of the
agency's most recent budget request. These documents report the primary
performance measures for each of the Coast Guard's non-homeland
security programs, as well as descriptions of the measures and
explanations of performance results. While these documents report
performance results from some secondary measures, DHS and the Coast
Guard do not report most of the Coast Guard's secondary measures in
these documents.
GPRA also requires agencies to establish goals and targets to define
the level of performance to be achieved by a program and express such
goals in an objective, quantifiable, and measurable form. In passing
GPRA, Congress emphasized that the usefulness of agency performance
information depends to a large degree on the reliability of performance
data. To be useful in reporting to Congress on the fulfillment of GPRA
requirements and in improving program results, the data must be
reliable--that is, they must be seen by potential users to be of
sufficient quality to be trustworthy. While no data are perfect,
agencies need to have sufficiently reliable performance data to provide
transparency of government operations so that Congress, program
managers, and other decision makers can use the information. In
establishing a system to set goals for federal program performance and
to measure results, GPRA requires that agencies describe the means to
be used to validate and verify measured values to improve congressional
decision making by providing objective, complete, accurate and
consistent information on achieving statutory objectives, and on the
relative effectiveness and efficiency of federal programs and
spending.[Footnote 14] In addition, to improve the quality of agency
performance management information, the Reports Consolidation Act of
2000 requires an assessment of the reliability of performance data used
in the agency's program performance report.[Footnote 15]
OMB's Program Assessment Rating Tool (PART) is designed to strengthen
and reinforce performance measurement under GPRA by encouraging careful
development of outcome-oriented performance measures.[Footnote 16]
Between 2002 and 2005, OMB reviewed each of the Coast Guard's six non-
homeland security programs.[Footnote 17] OMB found that four
programs--ice operations, living marine resources, marine environmental
protection, and marine safety--were performing adequately or better,
and two programs--aids to navigation and search and rescue--did not
demonstrate results. OMB recommended that for the aids to navigation
program, the Coast Guard develop and implement a better primary
performance measure that allows program managers to understand how
their actions produce results. Specifically, OMB recommended using an
outcome-based measure, the number of collisions, allisions, and
groundings, as a measure for the program, instead of the measure that
was being used--aid availability. For the search and rescue program,
OMB recommended that the Coast Guard develop achievable long-term goals
for the program. Since these reviews, the Coast Guard has implemented a
new primary performance measure for the aids to navigation program, "5-
year average annual number of distinct collisions, allisions, and
groundings," and developed new long-term goals for the search and
rescue program's primary performance measure--that is, rescuing between
85 and 88 percent of mariners in imminent danger each year from fiscal
year 2002 through 2010.
Non-Homeland Security Primary Performance Measures Are Generally Sound
and Data Are Generally Reliable, but Weaknesses Exist:
While the six non-homeland security primary performance measures are
generally sound, and the data used to calculate these measures are
generally reliable, we found weaknesses with the soundness of three
measures and the reliability of the data used in one measure (see table
2). All six measures cover key program activities and are objective,
measurable, and quantifiable, but three are not completely clear, that
is, they do not consistently provide clear and specific descriptions of
the data, events, or geographic areas they include. The Coast Guard's
processes for entering and reviewing its own internal data are likely
to produce reliable data. However, processes for reviewing or verifying
data gathered from external sources vary from source to source, and for
the marine environmental protection measure, the processes are
insufficient.
Table 2: Soundness of Primary Measures and Reliability of Data Used to
Calculate the Primary Measures for the Coast Guard's Non-Homeland
Security Programs:
Program: Aids to navigation;
Primary measure: 5-year average annual number of distinct collisions,
allisions, and groundings;
Is the measure sound?: Yes;
Are the data used to calculate the measure reliable?: Yes.
Program: Ice operations;
Primary measure: Domestic icebreaking--annual number of waterway
closure days;
Is the measure sound?: Weaknesses identified;
Are the data used to calculate the measure reliable?: Yes.
Program: Living marine resources;
Primary measure: Percent of fishermen in compliance with regulations;
Is the measure sound?: Weaknesses identified;
Are the data used to calculate the measure reliable?: Yes.
Program: Marine environmental protection;
Primary measure: 5-year average annual number of oil spills greater
than 100 gallons and chemical discharges per 100 million tons shipped;
Is the measure sound?: Yes;
Are the data used to calculate the measure reliable?: Weaknesses
identified.
Program: Marine safety;
Primary measure: 5-year average annual number of deaths and injuries of
recreational boaters, mariners, and passengers;
Is the measure sound?: Yes;
Are the data used to calculate the measure reliable?: Yes.
Program: Search and rescue;
Primary measure: Percent of mariners in imminent danger who are
rescued;
Is the measure sound?: Weaknesses identified;
Are the data used to calculate the measure reliable?: Yes.
Source: GAO analysis of Coast Guard primary performance measures.
[End of table]
Although the Six Primary Measures Are Generally Sound, Three Have
Weaknesses:
While the six primary performance measures are generally sound--in that
the measures cover key activities of the program, and are objective,
measurable, and quantifiable--three of the measures are not completely
clear. The primary performance measures for the ice operations, living
marine resources, and search and rescue programs do not consistently
provide clear and specific descriptions of the data, events, or
geographic areas they include. It is possible these weaknesses could
lead to decisions or judgments based on inaccurate, incomplete, or
misreported data. The three programs with primary measures that are not
completely clear are as follows:
* Ice operations. Further clarity and consistency in reporting the
geographic areas included in the ice operations primary performance
measure, "domestic ice breaking--annual number of waterway closure
days," would provide users additional context to discern the full scope
of the measure. Despite its broad title, the measure does not reflect
the annual number of closure days for all waterways across the United
States, but rather reflects only the annual number of closure days in
the Great Lakes region, although the Coast Guard breaks ice in many
East Coast ports and waterways. According to Coast Guard officials, the
measure focuses on the Great Lakes region because it is a large
commerce hub where the icebreaking season tends to be longer and where
ice has a greater impact on maritime transportation. While this
limitation is included in accompanying text in some documents, the
description of the limitation is inconsistent across department and
agency publications. The DHS fiscal year 2005 Performance and
Accountability Report notes that the measure is made up of nine
critical waterways within the region, but the DHS fiscal year 2007
budget request reports that it consists of seven critical waterways,
while the Coast Guard's fiscal year 2007 Budget-in-Brief does not
mention the number of waterways included in the measure. In addition,
Coast Guard program officials said that the measure only reflects
closures in one critical waterway--the St. Mary's River. Coast Guard
program officials at agency headquarters said that they are in the
early stages of developing a new primary performance measure that will
incorporate domestic icebreaking activities in areas beyond the Great
Lakes. However, until a better measure is developed, the description of
the current measure can confuse users and might cause them to think
performance was better or worse than it actually was.
* Search and rescue. While the primary performance measure for the
search and rescue program, "percent of mariners in imminent danger who
are rescued," reflects the program's priority of saving lives, it
excludes those incidents in which 11 or more lives were saved or lost.
According to Coast Guard officials, an agency analysis in fiscal year
2005 showed that 98 percent of search and rescue cases involved 10 or
fewer people that were saved or lost. Coast Guard officials added that
large cases involving 11 or more people are data anomalies and by
excluding these cases the agency is better able to assess the program's
performance on a year-to-year basis. While we understand the Coast
Guard's desire to assess program performance on a year-to-year basis,
and to not skew the data, in some instances this type of exclusion may
represent a significant level of activity that is not factored into the
measure. For example, during Hurricane Katrina, the Coast Guard rescued
more than 33,500 people. While including such large incidents in the
performance measure would skew annual performance results, it is
important for the Coast Guard to recognize these incidents, either
through a footnote or accompanying text in department and agency
publications. Not clearly defining the measure and recognizing such
incidents may cause internal managers and external stakeholders to
think performance was better or worse than it actually was.
* Living marine resources. Similar to the ice operations primary
measure, the living marine resources primary performance measure,
"percent of fishermen in compliance with regulations," is not
consistently and clearly defined in all department and agency
publications. Like many other law enforcement agencies, the Coast Guard
enforces federal regulations not by checking fishing vessels at random,
but by targeting those entities that are most likely to
be in violation of fishery regulations, such as vessels operating in
areas that are closed to fishing. Because the Coast Guard targets
vessels, the primary measure does not reflect the compliance rate of
all fishermen in those areas patrolled by the Coast Guard, as could be
inferred by the description, but rather is an observed compliance rate,
that is, the compliance rate of only those fishing vessels boarded by
Coast Guard personnel. The description of this performance measure is
inconsistent across department and agency publications. For example, in
the DHS fiscal year 2005 Performance and Accountability Report and the
Coast Guard's Budget-in-Brief, this measure is described as an observed
compliance rate, but the DHS fiscal year 2007 budget request does not
clarify that this measure represents an observed compliance rate rather
than the compliance rate of all fishermen in those areas patrolled by
the Coast Guard. A measure that is not consistently and clearly stated
may affect the validity of managers' and stakeholders' assessments of
program performance, possibly leading to a misinterpretation of
results.
Existing Procedures Help Ensure Reliable Internal Data, but Procedures
Do Not Exist to Check Reliability of All External Data:
While the Coast Guard has controls in place to ensure the timeliness,
completeness, accuracy, and consistency of internal data it creates--
that is, original data that Coast Guard personnel collect and enter
into its data systems--the agency does not have controls in place to
verify or review the completeness and accuracy of data obtained from
all external sources that it uses in calculating some of the primary
performance measures. The internal data used to calculate the six
primary performance measures are generally reliable--in that the Coast
Guard has processes in place to ensure the data's timeliness,
completeness, accuracy, and consistency. These controls include data
fields, such as pick lists and drop-down lists, that allow for
standardized data entry, mandatory data fields to ensure all required
data are entered, access controls that allow only authorized users to
enter and edit data, requirements for entering data in a timely manner,
and multiple levels of review across the agency. To ensure data
consistency across the Coast Guard, each of the six non-homeland
security programs has published definitions or criteria to define the
data used for the primary measures. However, the Coast Guard
acknowledges that in some instances these criteria may be open to
subjective interpretation, such as with the search and rescue program.
For example, when entering data to record the events of a search and
rescue incident, rescuers must identify the outcome of the event by
listing whether lives were "lost," "saved," or "assisted." While
program criteria define a life that is lost, saved, or assisted, there
is potential for subjective interpretation in some incidents.[Footnote
18] Through reviews at the sector, district, and headquarters levels,
the Coast Guard attempts to remedy any inconsistencies arising from
interpretations of these criteria.
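The kinds of entry controls described above--standardized pick lists,
mandatory data fields, and reviews that catch subjective or
inconsistent entries--can be illustrated with a minimal sketch. The
Python fragment below is a hypothetical illustration; the field names,
allowed outcome values, and validation logic are assumptions based on
the categories discussed in this report, not the Coast Guard's actual
data system:
    # Hypothetical illustration of entry controls like those described
    # above; not the Coast Guard's actual system. Names and values are assumed.
    ALLOWED_OUTCOMES = {"lost", "saved", "assisted"}  # pick-list style control
    MANDATORY_FIELDS = {"case_id", "date", "outcome", "persons_involved"}
    def validate_sar_record(record):
        """Return a list of problems found in a search and rescue data record."""
        problems = []
        missing = MANDATORY_FIELDS - record.keys()
        if missing:
            problems.append("missing mandatory fields: %s" % sorted(missing))
        outcome = record.get("outcome")
        if outcome is not None and outcome not in ALLOWED_OUTCOMES:
            problems.append("outcome '%s' is not an allowed value" % outcome)
        return problems
    # A record with a free-text outcome and a missing field would be flagged
    # for correction during sector, district, or headquarters review.
    record = {"case_id": "2006-0001", "date": "2006-01-15", "outcome": "recovered"}
    print(validate_sar_record(record))
[End of illustrative example]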
While the Coast Guard uses internal data for all six of its non-
homeland security primary performance measures, it also uses external
data to calculate the primary performance measures for two programs--
marine safety and marine environmental protection (see table 3). The
Coast Guard's procedures for reviewing external data are inconsistent
across these two programs. For example, while the Coast Guard has
developed better processes and controls for external data used in the
marine safety program's primary performance measure--such as using a
news clipping service that gathers media articles on recreational
boating accidents and fatalities and using a database that gathers
recreational boating injury data from hospitals--the agency does not
have processes to test the reliability of external data used in the
marine environmental protection program's primary performance measure.
The extent to which controls are used to verify external data for the
marine safety and marine environmental protection primary measures is
described below.
Table 3: Source of Data Used to Calculate Non-Homeland Security Primary
Performance Measures:
Program: Aids to navigation;
Internal Data Sources: Coast Guard Marine Information Safety and Law
Enforcement database[A]: check;
Internal Data Sources: Coast Guard District 9 icebreaking reports[B]:
[Empty];
External Data Sources: [Empty].
Program: Ice operations;
Internal Data Sources: Coast Guard Marine Information Safety and Law
Enforcement database[A]: [Empty];
Internal Data Sources: Coast Guard District 9 icebreaking reports[B]:
check;
External Data Sources: [Empty].
Program: Living marine resources;
Internal Data Sources: Coast Guard Marine Information Safety and Law
Enforcement database[A]: check;
Internal Data Sources: Coast Guard District 9 icebreaking reports[B]:
[Empty];
External Data Sources: [Empty].
Program: Marine environmental protection;
Internal Data Sources: Coast Guard Marine Information Safety and Law
Enforcement database[A]: check;
Internal Data Sources: Coast Guard District 9 icebreaking reports[B]:
[Empty];
External Data Sources: check[C].
Program: Marine safety;
Internal Data Sources: Coast Guard Marine Information Safety and Law
Enforcement database[A]: check;
Internal Data Sources: Coast Guard District 9 icebreaking reports[B]:
[Empty];
External Data Sources: check[D].
Program: Search and rescue;
Internal Data Sources: Coast Guard Marine Information Safety and Law
Enforcement database[A]: check;
Internal Data Sources: Coast Guard District 9 icebreaking reports[B]:
[Empty];
External Data Sources: [Empty].
Source: GAO analysis of Coast Guard data.
[A] The Marine Information Safety and Law Enforcement database is a Web-
based database used to track marine safety and law enforcement
activities involving commercial and recreational vessels. The system
provides query, reporting, and file-downloading capabilities to the
Coast Guard marine safety and law enforcement operating programs.
[B] Coast Guard District 9 (headquartered in Cleveland, Ohio) develops
weekly icebreaking reports by compiling information from icebreaking
cutters operating within the district. Information in these reports
includes data on the number of vessels beset in ice that were assisted,
the number of waterways closed because of ice, the duration of any
waterway closures, and the number of vessel transits through critical
waterways. These reports are sent directly from the cutters to the
district office and compiled into an annual report that is sent to
Coast Guard headquarters.
[C] To calculate the marine environmental protection primary
performance measure, the Coast Guard uses data from the Army Corps of
Engineers on the amount of oil and chemicals shipped in the United
States.
[D] To calculate the marine safety primary performance measure, the
Coast Guard uses state data on recreational boating deaths and
injuries.
[End of table]
* Marine safety. To calculate the marine safety program's primary
performance measure, "5-year average annual number of deaths and
injuries of recreational boaters, mariners, and passengers," the Coast
Guard uses internal data on deaths and injuries for mariners and
passengers, as well as external data on recreational boating deaths and
injuries from the Boating Accident Reporting Database (BARD)--a Coast
Guard managed database--that relies on data collected and entered by
the states. In 2000, the Department of Transportation Office of
Inspector General reported that recreational boating fatality data
collected from the states consistently understated the number of
fatalities, in part because a precise definition of a recreational
boating fatality did not exist.[Footnote 19] To improve the reliability
and consistency of the data, the Coast Guard created a more precise
definition and clarified reporting criteria by providing each state
with a data dictionary that describes the definitions for all required
data fields. In addition, to improve the timeliness of incident
reporting, the Coast Guard created a Web-based version of BARD for
electronic submission of recreational boating accident data. According
to Coast Guard officials, this system allows Coast Guard staff to
verify, validate, and corroborate data with each state for accuracy and
completeness prior to inclusion in the measure.
According to Coast Guard officials, a recent Coast Guard analysis
showed that these efforts have reduced the error rate from an average
of about 6 percent to about 1 percent annually. However, despite these
improvements, the Coast Guard acknowledges that some incidents may
still never be reported, some incidents may be inaccurately reported,
and some duplicate incidents may be included. Coast Guard officials
told us that the agency continues to work to reduce these errors by
developing additional steps to validate data. These recent steps
include using a news clipping service that gathers all media articles
concerning recreational boating accidents and fatalities and using a
database that gathers recreational boating injury data from hospitals.
* Marine environmental protection. In contrast, the Coast Guard does
not have processes to validate the reliability of external data used in
the marine environmental protection program's primary performance
measure, "5-year average annual number of oil spills greater than 100
gallons and chemical discharges per 100 million tons shipped." Each
year the Coast Guard uses internal data on oil spills and chemical
discharges, as well as external data from the Corps on the amount of
oil and chemicals shipped annually in the United States to calculate
this measure. However, the Coast Guard does not review the Corps' data
for completeness or accuracy, nor does it review the data reliability
procedures the Corps uses to test the data for completeness or
accuracy. Coast Guard officials said that they did not take these steps
because they had thought the Corps performed its own internal
assessments, but they were also unaware of what these assessments were
or whether the Corps actually performed them. While, according to a
Corps official, the Corps does have some controls in place, an official
at the Coast Guard agreed that the Coast Guard would benefit from
having, at a minimum, some familiarity with the internal controls used
by the Corps. (The sketch following this list illustrates one simple
check that such a review could include.)
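As one illustration of what a basic review of externally supplied data
might involve, the Python sketch below flags unusually large
year-over-year changes in reported tonnage so they can be followed up
with the data provider. It is a hypothetical example--the tonnage
figures and the 20 percent threshold are assumptions--and it does not
represent an existing Coast Guard or Corps procedure:
    # Hypothetical illustration of a simple review of externally supplied
    # data; not an existing Coast Guard or Corps procedure.
    def flag_large_changes(tons_by_year, threshold=0.20):
        """Flag years whose reported tonnage changed from the prior year
        by more than the given fraction."""
        flagged = []
        years = sorted(tons_by_year)
        for prev, curr in zip(years, years[1:]):
            change = abs(tons_by_year[curr] - tons_by_year[prev]) / tons_by_year[prev]
            if change > threshold:
                flagged.append((curr, round(change, 2)))
        return flagged
    # Hypothetical tons of oil and chemicals shipped per year.
    tonnage = {
        2001: 950_000_000,
        2002: 900_000_000,
        2003: 1_000_000_000,
        2004: 700_000_000,
        2005: 1_050_000_000,
    }
    print(flag_large_changes(tonnage))  # [(2004, 0.3), (2005, 0.5)]
[End of illustrative example]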
More than a Third of the Secondary Performance Measures Assessed Are
Generally Sound, and the Remainder Have Weaknesses:
More than a third (9 of the 23) of the secondary performance measures
assessed are generally sound--that is, they are clearly stated and
described; cover key activities of the program; and are objective,
measurable, and quantifiable (see table 4). However, as described
below, weaknesses exist for the other 14 of these 23 measures. More
specifically, for the 14 secondary measures, we found (1) the Coast
Guard does not have measurable targets to assess whether program and
agency goals and objectives are being achieved for 12 measures, (2) the
Coast Guard does not have agencywide criteria or guidance to accurately
reflect program results and ensure objectivity for 1 measure, and (3)
the Coast Guard does not clearly state or describe the data or events
included in 1 measure. Because of these weaknesses, the Coast Guard
cannot provide assurance that these performance measures will not lead
to decisions or judgments based on inaccurate, incomplete, or
misreported information. More detail on all of the secondary measures we assessed
is in appendix II.
Table 4: Soundness of Selected Non-Homeland Security Secondary
Performance Measures:
Program: Aids to navigation;
Number of measures that are sound: 3;
Number of measures with weaknesses: 0.
Program: Living marine resources;
Number of measures that are sound: 0;
Number of measures with weaknesses: 11.
Program: Marine environmental protection;
Number of measures that are sound: 6;
Number of measures with weaknesses: 1.
Program: Search and rescue;
Number of measures that are sound: 0;
Number of measures with weaknesses: 2.
Program: Total;
Number of measures that are sound: 9;
Number of measures with weaknesses: 14.
Source: GAO analysis of Coast Guard secondary performance measures.
[End of table]
* Measures without measurable targets. Twelve secondary measures--11
living marine resources measures and 1 marine environmental protection
measure--do not have annual targets to assess whether program and
agency goals and objectives are being achieved.[Footnote 20] According
to Coast Guard officials, these measures do not have targets because
the focus of the program is on the primary performance measures, and
not the inputs and outputs reflected in these secondary measures.
However, without any quantifiable, numeric targets, it is difficult for
the Coast Guard to know the extent to which program and agency goals
and objectives are being achieved.
* Measure without criteria or guidance to accurately reflect program
results and ensure objectivity. One of the search and rescue program's
secondary performance measures that we analyzed, "percent of property
saved," does not have criteria or guidance for agency personnel to
objectively and consistently determine the value of saved property.
Despite this lack of criteria on how to consistently and objectively
determine property values, data from this measure are reported in both
the Coast Guard's annual Budget-in-Brief and the DHS fiscal year
Performance and Accountability Report. Coast Guard officials said it
would be difficult to develop such criteria because of the large number
of boats and vessels and their varying values. Officials added that
Coast Guard personnel generally do not have access to, and do not
follow up to obtain, insurance or damage estimates for saved property.
In addition, we found that Coast Guard units do not consistently record
property values across the agency. For example, some units do not
record property values at all, other units record property values only
when the actual value can be determined, and other units estimate
property values using a $1,000-per-foot-of-vessel-length rule of thumb.
Without any criteria or guidance to determine property values, the
Coast Guard cannot provide assurance that agency personnel are making
these determinations consistently and objectively across the agency, or
that the measure accurately reflects program results.
* Measure not completely clear. Similar to the primary performance
measure for the search and rescue program, one of the search and rescue
program's secondary measures we analyzed, "percent of lives saved after
Coast Guard notification," reflects the program's priority of saving
lives, but excludes those incidents in which 11 or more lives were
saved or lost in a single case. As with the primary measure, including
such large incidents in performance measures would skew annual
performance results, and thus it may be appropriate to exclude them.
However, it is important for the Coast Guard to recognize, either
through a footnote or accompanying text, the exclusion of these
incidents--such as during Hurricane Katrina, in which the agency
rescued more than 33,500 people--because otherwise, performance results
could be misinterpreted or misleading to users.
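A minimal sketch of how such an exclusion rule could affect the computed
result is shown below. The case records, the reading of the threshold
(either lives saved or lives lost reaching 11 in a single case), and the
percent-saved formula (lives saved divided by lives saved plus lives
lost) are all illustrative assumptions rather than the Coast Guard's
actual method.

```python
# Hypothetical sketch of the "percent of lives saved after Coast Guard
# notification" secondary measure, showing how cases in which 11 or more
# lives were saved or lost are excluded. Case records are invented.

cases = [
    {"saved": 3, "lost": 0},
    {"saved": 1, "lost": 1},
    {"saved": 0, "lost": 2},
    {"saved": 33_500, "lost": 0},  # a mass-rescue event such as Hurricane Katrina
]

EXCLUSION_THRESHOLD = 11  # assumed reading: exclude a case if either count reaches 11

included = [c for c in cases
            if c["saved"] < EXCLUSION_THRESHOLD and c["lost"] < EXCLUSION_THRESHOLD]

saved = sum(c["saved"] for c in included)
lost = sum(c["lost"] for c in included)
percent_saved = 100.0 * saved / (saved + lost)

print(f"Percent of lives saved (excluding large cases): {percent_saved:.1f}%")
# Without a footnote noting the exclusion, a reader would not know that the
# mass-rescue case above is absent from the reported result.
```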
Challenges Exist in Using Measures to Link Resources to Results, but
the Coast Guard Is Working on Ways to Address Them:
While the primary measures for the Coast Guard's six non-homeland
security programs are generally sound and use reliable data, challenges
exist with using the primary measures to assess the link between
resources expended and results achieved. Ideally, a performance measure
not only tells decision makers what a program is accomplishing, but it
also gives them a way to affect these results through the decisions
they make about resources--for example, by providing additional
resources with a degree of confidence that doing so will translate into
better results. Even sound performance measures, however, may have
limits to how much they can explain about the relationship between
resources expended and results achieved. For the Coast Guard, these
limits involve (1) the difficulty of fully reflecting an entire program
such as ice operations or marine environmental protection in a single
performance measure and (2) the ability to account for the many
factors, other than resources, that can affect program results.
Recognizing these limitations, and responding to recommendations we
have made in past reports, Coast Guard officials have been working on a
wide range of initiatives they believe will help in understanding the
effects of these other factors and deciding where resources can best be
spent. According to Coast Guard officials, although the agency has been
working on some of these initiatives for several years, the extent and
complexity of the effort, together with the challenge of integrating a
multitude of initiatives into a data-driven and comprehensive strategy,
mean that additional time is needed to complete the work. At this
time, the Coast Guard does not expect many of the initiatives to be
implemented until 2010. Until these initiatives are developed and
operational, it is not possible to fully assess the overall success the
agency is likely to have in establishing clear explanations for how its
resources and results are linked.
Primary Performance Measures Cover a Key Activity, but Not Every
Activity Conducted under a Program:
Performance measures are one important tool to communicate what a
program has accomplished and provide information for budget decisions.
It is desirable for these measures to be as effective as possible in
helping to explain the relationship between resources expended and
results achieved, because agencies that understand this linkage are
better positioned to allocate and manage their resources effectively.
The Coast Guard follows DHS guidance in reporting a single measure per
program, and doing so is consistent with our prior work on agencies
that were successful in measuring performance and implementing
GPRA.[Footnote 21] Previously, we found that agencies successful in
measuring performance and meeting GPRA's goal-setting and performance
measurement requirements limited their measures to covering core
program activities essential for producing data for decision making and
not all program activities. Each of the Coast Guard's primary measures
for its six non-homeland security programs meets our criteria of
covering a key activity. None of them, however, is comprehensive enough
to capture all of the activities performed within the program that
could affect results. For example, the primary performance measure for
the marine environmental protection program relates to preventing oil
and chemical spills. This is a key program activity, but under this
program the Coast Guard also takes steps to prevent other marine debris
and pollutants (such as plastics and garbage), protect against the
introduction of invasive aquatic nuisance species, and respond to and
mitigate oil and chemical spills that actually do occur. As such,
resources applied to these other activities would not be reflected in
the program's primary measure, and thus, a clear and direct
relationship between total program resources and program results is
blurred.
In some cases, it may be possible to identify or develop a performance
measure that fully encapsulates all the activities within a program,
but in many cases the range of activities is too broad, resulting in a
measure that would be too nebulous to be of real use. Coast Guard
officials told us that developing primary measures that incorporate all
of the diverse activities within some programs, as well as reflect the
total resources used within the program, would be difficult, and that
such a measure would likely be too broad to provide any value for
assessing overall program performance. As such, officials added that
performance measures provide a better assessment of program performance
and resource use when all of a program's measures--both primary and
secondary--are viewed in conjunction as a suite of measures.
Performance Results Can Be Affected by Factors Other than Resources:
A second challenge in establishing a clearer relationship between
resources expended and results achieved is that many other factors can
affect performance and blur such a relationship. Some of these factors
can be external to an agency--and perhaps outside an agency's ability
to influence. At the time of our 2004 report, Coast Guard officials
also pointed to these external factors outside of the agency's control
to explain performance results that did not improve despite resource
increases. Because of the potentially large number of external factors,
and their sometimes unpredictable or often unknown effect on
performance, it may be difficult to account for how they--and not the
resources expended on the program--affect performance results.
Such factors are prevalent in the Coast Guard's non-homeland security
programs, according to Coast Guard officials. They cited such examples
as the following:
* Changes in fishing policies off the coast of Alaska had an effect on
performance results in the search and rescue program. For many years,
commercial sablefish and halibut fishermen were allowed to fish only
during a 2-week period each year. Given the limited window of
opportunity that this system provided, these fishermen had a strong
incentive to go out to sea regardless of weather conditions, thereby
affecting the number of the Coast Guard's search and rescue cases that
occurred. In 1994, these regulations were changed; in place of a 2-week
fishing season with no limits on the amount of fish any permitted
fisherman could harvest, the regulations set a longer season with
quotas. This change allowed fishermen more flexibility and more
opportunity to exercise caution about when they should fish rather than
driving them to go out in adverse weather conditions. Following the
change in regulations, Coast Guard statistics show that search and
rescue cases decreased in halibut and sablefish fisheries by more than
50 percent, from 33 in 1994 to 15 in 1995. However, Coast Guard
officials said that because of the large number of search and rescue
cases in the district during these two years--more than 1,000 annually-
-this policy change only had a minimal impact on the amount of
resources the district used for search and rescue cases.
* Vagaries of weather can also affect a number of non-homeland security
missions. Unusually severe weather, such as Hurricane Katrina, for
example, can affect the success rates for search and rescue or cause
navigational aids to be out of service. Even good weather on a holiday
weekend can increase the need for search and rescue operations--and
consequently affect performance results--because such weather tends to
encourage large numbers of recreational boaters to be out on the water.
Harsh winter weather can also affect performance results for the ice
operations program.
* Results for the marine environmental protection primary performance
measure, "the 5-year average annual number of oil spills greater than
100 gallons and chemical discharges per 100 million tons shipped" can
be affected by policies and activities that are not part of the marine
environmental protection program. For example, according to Coast Guard
officials, a foreign country's decision to institute a more aggressive
vessel inspection program could reduce spills caused by accidents in
U.S. waters, if the inspections uncovered mechanical problems that were
corrected before those vessels arrived in the United States. While not
captured in the primary performance measure, the Coast Guard tracks
such information through a secondary measure, "the Tokyo and Paris
memorandums of understanding port state control reports."[Footnote 22]
This small set of examples demonstrates that, in some situations, other
factors beyond resources expended may influence performance results.
Developing a system or model that could realistically take all of these
other factors into account is perhaps impossible, and it would be a
mistake to view this second challenge as a need to do so. Rather, the
challenge is to develop enough sophistication about each program's
context so that the Coast Guard can more systematically consider such
factors, and then explain the influence of these factors on resource
decisions and performance results.
Coast Guard Has Developed a Range of Initiatives to Forge Better Links
between Resources and Results:
The Coast Guard is actively seeking to address challenges such as those
discussed above through efforts, some of which have been under
way for several years. In 2004, we reported that several initiatives
had already begun, and we recommended that the Coast Guard ensure that
its strategic planning process and associated documents include a
strategy for identifying intervening factors that may affect
performance and systematically assess the relationship among these
factors, resources expended, and results achieved. Shortly thereafter
the Coast Guard chartered a working group to investigate its then more
than 50 ongoing initiatives to make recommendations on their value,
contribution, and practicality, and to influence agency decisions on
the integration, investment, and institutionalization of these
initiatives. The working group's product was a "road map" that clearly
defined executable segments, sequencing, and priorities. These results
were then documented in a January 2005 Coast Guard internal report that
summarized these priorities.[Footnote 23] Agency documents indicate
that the Coast Guard later reduced these 50 original initiatives to the
25 initiatives considered to be the most critical and immediate by
evaluating and categorizing all 50 initiatives based on their ability
to contribute to the agency's missions. These 25 initiatives, listed
along with their status in appendix III, involve a broad range of
activities that fall into seven main areas, as follows:
* Measurement. Five initiatives are intended to improve the agency's
data collection, including efforts to quantify input, output, and
performance to enhance analysis and fact-based decision making.
* Analysis. Eight initiatives are intended to transform data into
information and knowledge to answer questions and enhance decision
making on issues such as performance, program management, cause-and-
effect relationships, and costs.
* Knowledge management. Three ongoing initiatives are intended to
capture, evaluate, and share employee knowledge, experiences, ideas,
and skills.
* Alignment. Three initiatives are intended to improve the consistency
and alignment of agency planning, resource decisions, and analysis
across all Coast Guard programs.
* Access. Two initiatives relate to making data, information, and
knowledge transparent and available to employees.
* Policy and doctrine. Three initiatives are intended to develop new
and maintain current Coast Guard management policies.
* Communication and outreach. One initiative is intended to assist and
guide program managers and staff to understand and align all aspects of
the Coast Guard's overall management strategy.
We found that one of the initiatives that the working group deemed
important and included among the most critical and immediate
initiatives relates, in part, to the first challenge we discussed--
that is, developing new measures and improving the breadth of old
measures to better manage Coast Guard programs and achieve agency
goals. Coast Guard efforts have been ongoing in this regard, and our
current work has identified several performance measures that were
recently improved, and others that are currently under development. For
example, to provide a more comprehensive measure of search and rescue
program performance, the Coast Guard is improving its ability to track
lives-unaccounted-for--that is, those persons who at the end of a
search and rescue response remain missing. According to Coast Guard
officials, the agency is working on and anticipates being able to
eventually include data on lives-unaccounted-for in the primary
performance measure. Also, the Coast Guard began including data on the
number of recreational boating injuries, along with the data on mariner
and passenger deaths and injuries and recreational boater deaths, which
can help provide a more comprehensive primary measure for the marine
safety program. In addition, OMB guidance recently began requiring
efficiency measures as part of performance management, and in response,
the Coast Guard has started developing such measures. The
Coast Guard is also developing a variety of performance measures to
capture agency performance related to other activities, such as the
prevention of invasive aquatic nuisance species (marine environmental
protection), maritime mobility (aids to navigation), and domestic and
polar icebreaking (ice operations).
Many of the Coast Guard's other ongoing initiatives are aimed at the
second challenge--that is, developing a better understanding of the
various factors that affect the relationship between resources and
results. This is a substantial undertaking, and in 2005, upon the
recommendation of the working group, the Coast Guard created an office
to conduct and coordinate these efforts.[Footnote 24] This office has
taken the lead in developing, aligning, implementing, and managing all
of the initiatives. Together, the activities cover such steps as (1)
improving measurement, with comprehensive data on activities,
resources, and performance; (2) improving agency analysis and
understanding of cause-and-effect relationships, such as the
relationship between external factors and agency performance; and (3)
providing better planning and decision making across the agency. Coast
Guard officials expect that once these initiatives are completed, the
Coast Guard will have a more systematic approach to link resources to
results.
The Coast Guard has already been at this effort for several years but
does not anticipate implementation of many of these initiatives until
at least fiscal year 2010. The amount of time that has already elapsed
since our 2004 report may raise some concerns about whether progress is
being made. However, as described in the examples below, many of these
are complex data-driven initiatives that make up a larger comprehensive
strategy to better link resources to results, and as such, we think the
lengthy time frame reflects the complexity of the task. According to
Coast Guard officials, the agency is proceeding carefully and is still
learning about how these initiatives can best be developed and
implemented. Three key efforts help show the extent of, and
interrelationships among, the various components of the effort:
* Standardized reporting. The Coast Guard is currently developing an
activities dictionary to standardize the names and definitions for all
Coast Guard activities across the agency. According to Coast Guard
officials, this activities dictionary is a critical step in continuing
to develop, implement, and integrate these initiatives. Officials added
that standardizing the names and definitions of all Coast Guard
activities will create more consistent data collection throughout the
agency, which is important because these data will be used to support
many other initiatives.
* Measurement of readiness. Another initiative, the Readiness
Management System, is a tool being developed and implemented to track
the agency's readiness capabilities by providing up-to-date information
on resource levels at each Coast Guard unit as well as the
certification and skills of all Coast Guard uniformed personnel. This
information can directly affect outcomes and performance measures by
providing unit commanders with information to reconfigure resources for
a broad range of missions. Tracking this information, for example,
should allow a unit's commanding officer to determine what resources and
personnel skills are needed to accomplish the unit's key activities or to
take on new programs or activities. Coast Guard officials told us that the
Readiness Management System is in the early stages of being implemented
across the agency.
* Framework for analyzing risk, readiness, and performance. According
to Coast Guard officials, the information from the Readiness Management
System will be integrated with another initiative currently under
development, the Unified Performance Logic Model. This initiative is
intended to illustrate the causal relationships among risk, readiness
management, and agency performance. Coast Guard officials said that by
accounting for these many factors, the model will help decision makers
understand why events and outcomes occur, and how these events and
outcomes are related to resources. For example, the model will provide
the Coast Guard with an analysis tool to assist management with
decisions regarding the allocation of resources.
The Coast Guard currently anticipates that many of the 25 initiatives
will initially be implemented by fiscal year 2010 and expects further
refinements to extend beyond this time frame. While the Coast Guard
appears to be moving in the right direction and has neared completion
of some initiatives, until all of the agency's efforts are complete, it
remains too soon to determine how effective it will be at clearly
linking resources to performance results.
Conclusions:
It is important for the Coast Guard to have sound performance measures
that are clearly stated and described; cover key program activities;
are objective, measurable, and quantifiable--including having annual
targets; and use reliable data. This type of information would help
Coast Guard management and stakeholders, such as Congress, make
decisions about how to fund and improve program performance. We found
that the Coast Guard's non-homeland security performance measures
satisfy many of the criteria and use data that are generally reliable.
The weaknesses and limitations we did find do not mean that the
measures are not useful but rather represent opportunities for
improvement. However, if these weaknesses are not addressed--that is,
if measures are not clearly stated and well-defined, do not have
measurable performance targets, or do not have criteria to objectively
and consistently report data, or processes in place to ensure external
data are reliable--the information reported through these measures
could be misinterpreted, misleading, or inaccurate. For example,
without either processes in place to review the reliability of external
data used in performance measures, or a familiarity with the controls
used by external parties to verify and validate these data, the Coast
Guard cannot ensure the completeness or accuracy of all of its
performance results.
While the Coast Guard's measures are generally sound, even sound
performance measures have limits as to how much they can explain about
the relationship between resources expended and results achieved. The
Coast Guard continues to work to overcome these limitations by
developing a number of different initiatives, including but not limited
to developing and refining the agency's performance measures. Although
the agency appears to be moving in the right direction, until all of
the Coast Guard's efforts are complete, we will be unable to determine
how effective these initiatives are at linking resources to results. In
the interim, an additional step the Coast Guard can take to further
demonstrate the relationship between resources and results is to provide
additional information or measures--beyond the one primary measure used
in department publications--in some of its own annual publications. For
example, in venues such as the Coast Guard's annual Budget-in-Brief or
any program-specific publications, reporting some secondary measures or
additional data could provide more context or perspective on programs and
could help to more fully articulate to stakeholders and decision makers
the relationship between resources expended and results
achieved. Reporting supplemental information on such things as the
percentage of aids to navigation available and in need of maintenance,
the annual number of search and rescue cases, and icebreaking
activities beyond the Great Lakes region would provide additional
information on the annual levels of activity that constitute the aids
to navigation, search and rescue, and ice operations programs;
information that external decision makers, in particular, might find
helpful. Reporting these measures would also give Congress additional
information on activities that may require more or less funding while the
Coast Guard continues its many ongoing initiatives aimed at better
linking its performance results with resources expended.
Recommendations for Executive Action:
To improve the quality of program performance reporting and to more
efficiently and effectively assess progress toward achieving the goals
or objectives stated in agency plans, we recommend that the Secretary
of Homeland Security direct the Commandant of the Coast Guard to:
* Refine certain Coast Guard primary and secondary performance measures
by:
- further clarifying the ice operations primary measure by clearly and
consistently describing the geographic area and number of waterways
included in the measure; the living marine resources primary measure by
clearly and consistently reporting the scope of the measure; and the
search and rescue primary measure and the search and rescue "percent of
lives saved after Coast Guard notification" secondary measure by
reporting those incidents or data that are not included in the
measures;
- developing measurable performance targets to facilitate assessments
of whether program and agency goals and objectives are being achieved
for the 11 living marine resources secondary measures and the 1 marine
environmental protection secondary measure, "Tokyo and Paris
memorandums of understanding port state control reports," that lack
annual targets; and:
- establishing agencywide criteria or guidance to help ensure the
objectivity and consistency of the search and rescue program's "percent
of property saved" secondary performance measure.
* Develop and implement a policy for reviewing external data that are
provided by third parties and used in calculating performance measures so
that the Coast Guard is, at a minimum, familiar with the internal
controls these external parties use to determine the reliability of their
data.
* Report additional information--besides the one primary measure--in
appropriate agency publications or documents where doing so would help
provide greater context or perspective on the relationship between
resources expended and program results achieved.
Agency Comments:
We provided a draft of this report to the Department of Homeland
Security, including the Coast Guard, for their review and comment. The
Department of Homeland Security and the Coast Guard generally agreed
with the findings and recommendations of the draft and provided
technical comments, which we incorporated to ensure the accuracy of our
report. The Department of Homeland Security's written comments are
reprinted in appendix IV.
As agreed with your office, unless you publicly announce the contents
of this report earlier, we plan no further distribution of it until 30
days from the date of this letter. We will then send copies to the
Secretary of Homeland Security; the Commandant of the Coast Guard; the
Director, Office of Management and Budget; and make copies available to
other interested parties who request them. In addition, the report will
be available at no charge on GAO's Web site at [Hyperlink,
http://www.gao.gov].
If you have any questions about this report, please contact me at
CaldwellS@gao.gov or (202) 512-9610. Contact points for our Offices of
Congressional Relations and Public Affairs may be found on the last
page of this report. GAO staff who made major contributions to this
report are listed in appendix V.
Signed by:
Stephen L. Caldwell:
Acting Director, Homeland Security and Justice Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
For our first objective--whether the primary performance measures for
the Coast Guard's six non-homeland security programs are sound, and the
data used to calculate them are reliable--we used previously
established GAO criteria to determine the soundness of the primary
performance measures.[Footnote 25] Specifically, we used our judgment
to assess whether the measures are (1) clearly stated and described;
(2) cover a key program activity and represent mission goals and
priorities; (3) objective, that is, not open to bias or subjective
interpretation; (4) measurable, that is, represent
observable events; and (5) quantifiable, that is, are countable events
or outcomes. A measure should be clearly stated and described so that
it is consistent with the methodology used to calculate it and can be
understood by stakeholders both internally and externally. Measures
should also cover key program activities and represent program and
agency goals and priorities to help identify those activities that
contribute to the goals and priorities. To the greatest extent
possible, measures should be objective, that is, reasonably free of
bias or manipulation that would distort an accurate assessment of
performance. When appropriate, measures should be measurable and
quantifiable, including having annual targets, to facilitate future
assessments of whether goals or objectives were achieved, because
comparisons can be easily made between projected performance and actual
results.
In addition, to further assess the soundness of the primary performance
measures, we interviewed program officials from each non-homeland
security program and reviewed planning and performance documentation
from each program office at the headquarters, district, and sector
levels. Program officials we spoke with included headquarters officials
responsible for developing and implementing performance measures in
each program, as well as officials at the district and sector levels
responsible for collecting and entering performance data. We reviewed
documentation on Coast Guard policies and manuals for performance
measures, Coast Guard annual performance plans and reports, commandant
instructions, prior GAO reports, Office of Management and Budget
Program Assessment Rating Tool reviews for each program, and Department
of Homeland Security annual reports.
To determine the reliability of data used in the primary measures, we
assessed whether processes and controls were in place to ensure that
the data used in the measures are timely, complete, accurate, and
consistent, and appear reasonable. We reviewed legislative requirements
for data reliability in both the Government Performance and Results Act
of 1993 and the Reports Consolidation Act of 2000
and reviewed Coast Guard standards and procedures for collecting
performance data and calculating results. In addition, we interviewed
agency officials at Coast Guard headquarters, as well as at the
district and sector levels, regarding standardized agencywide data
collection, entry, verification, and reporting policies, and inquired
as to if and how these procedures differed across programs and at each
level of the organization. We observed data entry for the Marine
Information for Safety and Law Enforcement database at Coast Guard district
and sector offices in Boston, Massachusetts; Miami, Florida; and
Seattle, Washington; a district office in Cleveland, Ohio; as well as
at an air station in Miami, Florida; and a marine safety office in
Cleveland, Ohio, to check for inconsistencies and discrepancies in how
data are collected and maintained throughout the agency. We selected
these field locations because of the number and types of non-homeland
security programs that are performed at these locations. We also spoke
with information technology officials responsible for maintaining the
Marine Information for Safety and Law Enforcement database.
For our second objective--whether selected secondary performance
measures for four of the Coast Guard's non-homeland security programs
are sound--we selected measures in addition to the primary performance
measures for the aids to navigation, living marine resources, marine
environmental protection, and search and rescue programs. We selected
these programs because they had the largest budget increases between
the fiscal year 2005 budget and the Coast Guard's fiscal year 2006
budget request, and are programs of particular interest because of
events surrounding Hurricane Katrina. In addition, we did not assess
any of the secondary measures that were in development at the time of
our report. For these four programs, we assessed the soundness of only
those other performance measures that Coast Guard officials said were
high level, strategic measures used for performance budgeting, budget
projections, management decisions, and external reporting. The 23
secondary measures we assessed for these four programs represent more
than half of the 39 high-level, strategic secondary measures used to
manage the six non-homeland security programs. To assess the soundness
of the selected 23 secondary measures, we used the same GAO criteria
and followed the same steps that we used to determine the soundness of
the primary performance measures.
For our third objective--the challenges, if any, that are present in
trying to use these measures to link resources expended to results
achieved--we interviewed Coast Guard budget officials at agency
headquarters to discuss how performance measures are used in resource
and budget allocation decision making processes. We reviewed previous
GAO reports on performance measures, performance reporting, and the
link between the Coast Guard's resources expended and results achieved.
We also interviewed program officials at Coast Guard headquarters about
ongoing initiatives the agency is developing and implementing to link
resources expended to results achieved.
We conducted our work from July 2005 to August 2006 in accordance with
generally accepted government auditing standards.
[End of section]
Appendix II: Secondary Performance Measures:
Appendix II provides our findings for the soundness of the high-level,
strategic secondary measures we assessed (see table 5), as well as a
list of those high-level, strategic secondary measures we did not
assess (see table 6). Because of the large number of secondary measures
for the Coast Guard's six non-homeland security programs, we assessed
the soundness of secondary measures for the aids to navigation, living
marine resources, marine environmental protection, and search and
rescue programs, and we did not assess the soundness of secondary
measures for the ice operations and marine safety programs.
Table 5: Soundness of Secondary Measures for Coast Guard's Non-Homeland
Security Programs:
Program and measure: Aids to navigation: Annual number of distinct
collision, allision, and grounding events[A];
Is measure sound?: Yes.
Program and measure: Aids to navigation: Aid availability;
Is measure sound?: Yes.
Program and measure: Aids to navigation: Aids overdue for servicing;
Is measure sound?: Yes.
Program and measure: Living marine resources: Percent of Marine Affairs
graduates in Marine Affairs-coded billets;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Number of domestic
fisheries enforcement resource hours;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Number of active
commercial fishing vessels by major fishery;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Number of domestic
boardings by major fishery;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Boardings per active
commercial fishing vessels by major fishery;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Number of significant
violations by major fishery;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Number of significant
violations per domestic resource hours;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Status of fish stocks;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Number of Coast Guard
members trained at Regional Fishing Training Centers;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Cost per Coast Guard
member trained at Regional Fishing Training Centers;
Is measure sound?: Weaknesses identified.
Program and measure: Living marine resources: Number of Marine Affairs
graduates on active duty;
Is measure sound?: Weaknesses identified.
Program and measure: Marine environmental protection: Annual number of
oil spills greater than 100 gallons and chemical discharges per 100
million tons shipped;
Is measure sound?: Yes.
Program and measure: Marine environmental protection: Annual volume of
oil spilled;
Is measure sound?: Yes.
Program and measure: Marine environmental protection: 5-year average
annual volume of oil spilled;
Is measure sound?: Yes.
Program and measure: Marine environmental protection: 5-year average
annual number of distinct collision, allision, and grounding events[B];
Is measure sound?: Yes.
Program and measure: Marine environmental protection: Port state annual
detention ratio[A];
Is measure sound?: Yes.
Program and measure: Marine environmental protection: Port state 3-year
average detention ratio[A];
Is measure sound?: Yes.
Program and measure: Marine environmental protection: Tokyo and Paris
memorandums of understanding port state control reports[A];
Is measure sound?: Weaknesses identified.
Program and measure: Search and rescue: Percent of lives saved after
Coast Guard notification;
Is measure sound?: Weaknesses identified.
Program and measure: Search and rescue: Percent of property saved;
Is measure sound?: Weaknesses identified.
Source: GAO analysis of Coast Guard secondary performance measures.
[A] Four secondary measures--(1) port state annual detention ratio; (2)
port state 3-year average detention ratio; (3) Tokyo and Paris
memorandums of understanding port state control reports; and (4) annual
number of distinct collision, allision, and grounding events--are each
used by the aids to navigation, marine environmental protection, and
marine safety programs.
[B] The marine environmental protection program secondary measure, 5-
year average annual number of distinct collision, allision, and
grounding events, is also the primary performance measure for the aids
to navigation program.
[End of table]
Table 6: Coast Guard Non-Homeland Security Secondary Performance
Measures Not Assessed:
Program and measure: Ice operations: Ensure that ferry service to
isolated communities is not interrupted for more than 2 days annually.
Program and measure: Ice operations: Annually respond to all Army Corps
of Engineers requests to assist in relieving ice jams to prevent
potential flooding.
Program and measure: Ice operations: Annually during ice season ensure
that 95 percent of vessels transiting during light winters, 90 percent
of vessels transiting during normal winters, and 70 percent of vessels
transiting during severe winters are able to maintain an average track
speed of 3 knots.
Program and measure: Ice operations: With adequate advanced notice,
annually provide all necessary icebreaking services to allow product
delivery.
Program and measure: Marine safety: Annual observed wear rate of
personal flotation devices.
Program and measure: Marine safety: Annual number of voluntary Vessel
Safety Exams.
Program and measure: Marine safety: Annual number of boating operators
receiving boating education (by state).
Program and measure: Marine safety: Annual number of recreational
boating safety boardings by states.
Program and measure: Marine safety: Annual number of recreational
boating safety boardings by Coast Guard.
Program and measure: Marine safety: Annual number of citations issued
for improper carriage of safety equipment.
Program and measure: Marine safety: Annual number of boatings under the
influence (by state).
Program and measure: Marine safety: Annual number of commercial vessel
safety-related mariner deaths.
Program and measure: Marine safety: Annual number of commercial vessel
safety-related passenger deaths.
Program and measure: Marine safety: Annual number of commercial vessel
safety-related mariner injuries.
Program and measure: Marine safety: Annual number of commercial vessel
safety-related passenger injuries.
Program and measure: Marine safety: 5-year average number of passenger
and maritime worker casualties and recreational boating deaths divided
by the ratio of the current period to the prior period 5-year average
operating expense authority for marine safety.
Source: Coast Guard.
[End of table]
[End of section]
Appendix III: Ongoing Coast Guard Initiatives to Link Resources Used to
Results Achieved:
Appendix III provides a list of the Coast Guard's ongoing initiatives
to improve the agency's planning, resource management, and decision
support systems to more closely align performance with resources. (See
table 7.)
Table 7: Ongoing Coast Guard Initiatives to Link Resources Used to
Results Achieved:
Type of initiative: Measurement initiatives;
Purpose: Measurement initiatives are being developed to provide numerical
facts and data to quantify input, output, and performance dimensions of
processes, products, services, and overall organizational outcomes;
Initiatives and status:
* Readiness Management System; Status: Operational in fiscal year 2005.
* Risk-Based Decision Making; Status: Estimated to be completed in fiscal
year 2010.
* Operational Transactional Systems; Status: These systems are currently
operational.
* Logistics; Status: Estimated to be completed in fiscal year 2010.
* Performance Measures and Scorecards; Status: Measures and scorecards are
currently used, but efforts to improve are ongoing.
Type of initiative: Analysis initiatives;
Purpose: Analysis initiatives are being developed to examine and transform
numerical facts and data into information and knowledge for effective
decision making. Analyses are conducted to answer questions about
performance, program management, cause-and-effect relationships, costs,
strategy, and, in general, overall Coast Guard management;
Initiatives and status:
* Activity-Based Management; Status: Estimated to be completed in fiscal
year 2010.
* Mission Cost Model; Status: Operational in fiscal year 1999.
* Modeling and Simulation; Status: Estimated to be completed in fiscal
year 2010.
* Force/Asset Requirements; Status: Operational in fiscal year 2003 but
efforts to improve are ongoing.
* Risk Assessments and Profiles; Status: These assessments are currently
used, but efforts to improve are ongoing.
* Maritime Homeland Security Operations Planning System; Status: Began a
pilot project in fiscal year 2004.
* Competency Assessments; Status: Initially performed in fiscal year 2004,
but efforts continue to be ongoing.
* G-Organizational Assessments; Status: These assessments are performed
annually.
Type of initiative: Knowledge management initiatives;
Purpose: Knowledge management initiatives are being developed to
accumulate, evaluate, and share enterprise information assets--that is,
management strategies, methods, and knowledge possessed by employees in
the form of information, ideas, learning, understanding, memory, insights,
cognitive and technical skills, and capabilities;
Initiatives and status:
* Evergreen Strategic Renewal Process; Status: This strategic process is
conducted every 4 years.
* Risk-based Performance Management; Status: Currently undergoing testing
as a pilot project; estimated to be completed in fiscal year 2010.
* Capital Asset Management; Status: Estimated to be completed in fiscal
year 2010.
Type of initiative: Alignment initiatives;
Purpose: Alignment initiatives are being developed to improve consistency
of plans, processes, actions, information, resource decisions, results,
analyses, and learning to support key organizationwide goals;
Initiatives and status:
* Unified Performance Logic Model; Status: Estimated to be completed in
fiscal year 2010.
* Activities Dictionary, Product and Services Catalog, and Enterprise
Lexicon; Status: Partially completed; estimated to be completed in fiscal
year 2010.
* Enterprise Architecture; Status: Ongoing; began development in fiscal
year 2004.
Type of initiative: Access initiatives;
Purpose: Access initiatives are being developed to provide enterprisewide
right of entry to organizational information and knowledge to promote
visibility, transparency, and use of valid, reliable, and consistent data
and information to know, compare, benchmark, and improve organizational
performance;
Initiatives and status:
* Coast Guard Central; Status: Operational in fiscal year 2005, but
efforts to improve are ongoing.
* Enterprise Data Warehouse; Status: This is an ongoing effort to merge
Coast Guard data sources.
Type of initiative: Policy and doctrine initiatives;
Purpose: Policy and doctrine initiatives are being developed to maintain
current, and develop new, Coast Guard management policies;
Initiatives and status:
* Commandant's Performance Excellence Criteria; Status: Ongoing; performed
on annual and biennial basis.
* Innovation Process and Recognition Program; Status: Ongoing; performed
on annual basis.
* Measurement; Status: Ongoing, initially implemented in fiscal year 1995.
Type of initiative: Communication and outreach initiative;
Purpose: This communication and outreach initiative is being developed to
assist and guide commands and staffs in understanding and aligning with
all aspects of the Coast Guard;
Initiative: Organizational Performance Consultants Field Guide; Status:
Completed.
Source: Coast Guard.
[End of table]
[End of section]
Appendix IV: Comments from the Department of Homeland Security:
Homeland Security:
August 11, 2006:
Mr. Stephen L. Caldwell:
Acting Director:
Homeland Security and Justice Issues:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Mr. Caldwell:
RE: Draft Report GAO-06-816, Coast Guard: Non-Homeland Security
Performance Measures Are Generally Sound, but Opportunities for
Improvement Exist (GAO Job Code 440432):
The Department of Homeland Security appreciates the opportunity to
review and comment on the draft report. The Government Accountability
Office (GAO) recommends that the Coast Guard refine particular primary
and secondary performance measures, develop and implement a policy to
review external data provided by third parties used in calculating
Coast Guard performance measures, and report additional information in
appropriate publications or documents. We generally agree with the
recommendations which essentially recognize that performance measures
are in place and meet Government Performance and Results Act
requirements, but can be improved upon or refined.
The report acknowledges Coast Guard successful efforts to address prior
GAO and Office of Management and Budget recommendations designed to
improve performance measures. During this engagement, GAO found that
primary performance measures were generally sound and the data used to
calculate them reliable and correctly noted that even sound performance
measures have limits to how much they can explain the relationship
between resources expended and results achieved. Coast Guard has
developed a range of initiatives that we believe will help explain the
effects of external factors on program results other than resources
expended. Some of these initiatives, due to their complexity, will
require additional time to complete. The report overall reflects Coast
Guard's focus on continuous improvement.
Sincerely,
Signed by:
Steven J. Pecinovsky:
Director:
Departmental GAO/OIG Liaison Office:
[End of section]
Appendix V: GAO Contact and Staff Acknowledgments:
GAO Contact:
Stephen L. Caldwell, Acting Director, Homeland Security and Justice
Issues, (202) 512-9610, or CaldwellS@gao.gov:
Acknowledgments:
In addition to the individual named above, Billy Commons, Christine
Davis, Michele Fejfar, Dawn Hoff, Allen Lomax, Josh Margraf, Dominic
Nadarski, Jason Schwartz, and Stan Stenersen made key contributions to
this report.
[End of section]
Related GAO Products:
Coast Guard: Station Readiness Improving, but Resource Challenges and
Management Concerns Remain. GAO-05-161. Washington, D.C.: January 31,
2005.
Coast Guard: Relationship between Resources Used and Results Achieved
Needs to Be Clearer. GAO-04-432. Washington, D.C.: March 22, 2004.
Coast Guard: Comprehensive Blueprint Needed to Balance and Monitor
Resource Use and Measure Performance for All Missions. GAO-03-544T.
Washington, D.C.: March 12, 2003.
Performance Reporting: Few Agencies Reported on the Completeness and
Reliability of Performance Data. GAO-02-372. Washington, D.C.: April
26, 2002.
Coast Guard: Budget and Management Challenges for 2003 and Beyond. GAO-
02-538TU. Washington, D.C.: March 19, 2002.
Coast Guard: Update on Marine Information for Safety and Law
Enforcement System. GAO-02-11. Washington, D.C.: October 17, 2001.
Tax Administration: IRS Needs to Further Refine Its Tax Filing Season
Performance Measures. GAO-03-143. Washington, D.C.: November 22, 2002.
The Results Act: An Evaluator's Guide to Assessing Agency Performance
Plans. GAO/GGD-10-1.20. Washington, D.C.: April 1998.
Agencies' Annual Performance Plans under the Results Act: An Assessment
Guide to Facilitate Congressional Decision Making. GAO/GGD/AIMD-10.1.18.
Washington, D.C.: February 1998.
Executive Guide: Effectively Implementing the Government Performance
and Results Act. GAO/GGD-96-118. Washington, D.C.: June 1996.
FOOTNOTES
[1] The Coast Guard's six non-homeland security programs account for
about $4.2 billion of the Coast Guard's $8.4 billion fiscal year 2006
enacted budget. The remaining $4.2 billion is for its five homeland
security programs--ports, waterways, and coastal security; illegal drug
interdiction; defense readiness; undocumented migrant interdiction; and
other law enforcement activities, including U.S. Exclusive Economic
Zone enforcement.
[2] The Coast Guard defines an "allision" as a collision between a
vessel and a fixed object.
[3] GAO, Coast Guard: Key Management and Budget Challenges for Fiscal
Year 2005 and Beyond, GAO-04-636T (Washington, D.C.: Apr. 7, 2004); and
Coast Guard: Relationship between Resources Used and Results Achieved
Needs to Be Clearer, GAO-04-432 (Washington, D.C.: Mar. 22, 2004).
[4] The criteria for assessing soundness are not equal, and failure to
meet a particular criterion does not necessarily preclude that measure
from being useful; rather, it may indicate an opportunity for further
refinement.
[5] GAO, The Results Act: An Evaluator's Guide to Assessing Agency
Performance Plans, GAO/GGD-10.1.20 (Washington, D.C.: April 1998).
[6] GPRA, Pub. L. No. 103-62, 107 Stat. 285 (1993).
[7] The Reports Consolidation Act of 2000, Pub. L. No. 106-531, 114
Stat. 2537.
[8] The field locations we selected were District 1 (Boston,
Massachusetts); District 7 (Miami, Florida); District 9 (Cleveland,
Ohio); and District 13 (Seattle, Washington). We selected these field
locations because of the number and types of non-homeland security
programs that are performed at these locations. We reviewed activities
at multiple offices or units at each location.
[9] The four programs we selected were aids to navigation, living
marine resources, marine environmental protection, and search and
rescue. We selected these programs because they had the largest budget
increases of the six non-homeland security programs (as reflected in
the fiscal year 2005 budget and the Coast Guard's fiscal year 2006
budget request) and because they are of particular interest in light of
events surrounding Hurricane Katrina. Further, we selected
only those measures that Coast Guard officials said were high-level,
strategic measures used in performance budgeting, budget projections,
and management decisions. In addition, we did not assess any of the
secondary measures that were in development at the time of our report.
The 23 secondary measures we assessed for these four programs represent
more than half of the 39 high-level, strategic secondary measures used
to manage the six non-homeland security programs.
[10] GAO, Executive Guide: Effectively Implementing the Government
Performance and Results Act, GAO/GGD-96-118 (Washington, D.C.: June
1996); and Tax Administration: IRS Needs to Further Refine Its Tax
Filing Season Performance Measures, GAO-03-143 (Washington, D.C.:
November 2002).
[11] In 2004, we recommended that the Coast Guard identify the
intervening factors that may affect performance and systematically
assess the relationship among these factors, resources used, and
results achieved. GAO-04-432.
[12] GAO-04-432.
[13] The Coast Guard maintains information on how assets, such as
cutters, patrol boats, and aircraft, are used. Each hour that these
resources are used is called a resource hour. Resource hours do not
include such things as the time that the asset stands idle or the time
that is spent maintaining it.
[14] GPRA, Pub. L. No. 103-62, 107 Stat. 285 (1993).
[15] The Reports Consolidation Act of 2000, Pub. L. No. 106-531, § 5,
114 Stat. 2537, 2539-40.
[16] OMB's PART review is a systematic method that OMB uses to assess
the performance of program activities across the federal government.
The PART review consists of a series of questions, which agencies under
review must answer, that assess different aspects of program
performance; responses must be evidence-based. Agencies must clearly
explain their answers and include relevant supporting evidence, such as
agency performance information, independent evaluations, and financial
information. PART reviews assign each program an overall rating:
effective (the program is well managed), moderately effective (the
program is well managed but needs improvements), adequate (the program
needs to improve accountability), ineffective (the program is unable to
achieve results), or results not demonstrated (the program does not
have acceptable performance goals or targets).
[17] OMB reviewed the aids to navigation and search and rescue programs
in 2002, the living marine resources and marine environmental
protection programs in 2003, the ice operations program in 2004, and
the marine safety program in 2005.
[18] The U.S. Coast Guard Addendum to the United States National Search
and Rescue Supplement defines lives lost, saved, and assisted. A life
saved is a life that would have been lost had the rescue action not
been taken, such as when a person is pulled from a position of distress
or removed from a situation that would likely have resulted in death.
A life assisted is a person who received assistance that did not meet
the criteria for a life saved; however, persons merely onboard a vessel
that received assistance directed at the vessel itself (such as repairs
or fuel) are not necessarily counted as assisted. To count a life as
lost, there must be a body recovered; otherwise, it is considered a
life unaccounted for. Lives lost before notification are those that, to
the best of the reporting unit's knowledge, were lost before the Coast
Guard was notified of the incident; lives lost after notification are
those lost after the Coast Guard was notified.
[19] Department of Transportation Office of Inspector General, Audit of
the Performance Measure for the Recreational Boating System, MA-2000-
084 (Washington, D.C.: April 2000).
[20] The 11 living marine resources performance measures without
measurable targets are the (1) number of domestic fisheries enforcement
resource hours, (2) number of active commercial fishing vessels by
major fishery, (3) number of domestic boardings by major fishery, (4)
boardings per active commercial fishing vessels by major fishery, (5)
number of significant violations by major fishery, (6) number of
significant violations per domestic resource hour, (7) status of fish
stocks, (8) number of Coast Guard members trained at Regional Fishing
Training Centers, (9) cost per Coast Guard member trained at Regional
Fishing Training Centers, (10) number of Marine Affairs graduates on
active duty, and (11) percent of Marine Affairs graduates in Marine
Affairs-coded billets. The one marine environmental protection
performance measure is the Tokyo and Paris memorandum of understanding
port state control reports measure.
[21] GAO/GGD-96-118; and GAO-03-143.
[22] The Tokyo and Paris memorandums of understanding are agreements
between the U.S. and other countries to promote maritime safety and
environmental protection, and to eliminate substandard shipping through
port controls that include enforcing applicable treaties. These
treaties include various construction, design, equipment, operating,
and training requirements related to maritime safety, environmental
protection, and security. The Tokyo memorandum of understanding
includes 19 countries and the Paris memorandum of understanding
includes 22 countries.
[23] U.S. Coast Guard, Institutional Research Road Map (Washington,
D.C.: January 2005).
[24] The Office of Performance Management and Decision Support was
established by the Coast Guard Chief of Staff on August 11, 2005.
[25] GAO/GGD-10.1.20.
GAO's Mission:
The Government Accountability Office, the investigative arm of
Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548
To Order by Phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470
Public Affairs:
Jeff Nelligan, managing director,
NelliganJ@gao.gov
(202) 512-4800
U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548