This is the accessible text file for GAO report number GAO-05-267
entitled 'Information Technology: Customs Automated Commercial
Environment Program Progressing, but Need for Management Improvements
Continues' which was released on March 14, 2005.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Committees:
March 2005:
Information Technology:
Customs Automated Commercial Environment Program Progressing, but Need
for Management Improvements Continues:
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-267]:
GAO Highlights:
Highlights of GAO-05-267, a report to the Subcommittees on Homeland
Security, Senate and House Committees on Appropriations:
Why GAO Did This Study:
The Department of Homeland Security (DHS) is conducting a multiyear,
multibillion-dollar acquisition of a new trade processing system,
planned to support the movement of legitimate imports and exports and
strengthen border security. By congressional mandate, plans for
expenditure of appropriated funds on this system, the Automated
Commercial Environment (ACE), must meet certain conditions, including
GAO review. This study addresses whether the fiscal year 2005 plan
satisfies these conditions, describes the status of DHS's efforts to
implement prior GAO recommendations for improving ACE management, and
provides observations about the plan and DHS's management of the
program.
What GAO Found:
The fiscal year 2005 ACE expenditure plan, including related program
documentation and program officials' statements, largely satisfies the
legislative conditions imposed by the Congress. In addition, some of
the recommendations that GAO has previously made to strengthen ACE
management have been addressed, and DHS has committed to addressing
those that remain. However, much remains to be done before these
recommendations are fully implemented. For example, progress has been
slow on implementing the recommendation that the department proactively
manage the dependencies between ACE and related DHS border security
programs. Delays in managing the relationships among such programs will
increase the chances that later system rework will be needed to allow
the programs to interoperate.
Among GAO's observations about the ACE program and its management are
several regarding DHS's approach to addressing previously identified
cost and schedule overruns. DHS has taken actions intended to address
these overruns (such as revising its baselines for cost and schedule,
as GAO previously recommended); however, it is unlikely that these
actions will prevent future overruns, because DHS has relaxed system
quality standards, meaning that milestones are being passed despite
material system defects. Correcting such defects will require the
program to use resources (e.g., people and test environments) at the
expense of later system releases. Until the ACE program is held
accountable not only for cost and schedule but also for system
capabilities and benefits, the program is likely to continue to fall
short of expectations.
Finally, the usefulness of the fiscal year 2005 expenditure plan for
congressional oversight is limited. For example, it does not adequately
describe progress against commitments (e.g., ACE capabilities,
schedule, cost, and benefits) made in previous plans, which makes it
difficult to make well-informed judgments on the program's overall
progress. Also, in light of recent program changes, GAO questions the
expenditure plan's usefulness to the Congress as an accountability
mechanism. The expenditure plan is based largely on the ACE program
plan of July 8, 2004. However, recent program developments have altered
some key bases of the ACE program plan and thus the current expenditure
plan. In particular, the expenditure plan does not reflect additional
program releases that are now planned or recent changes to the roles
and responsibilities of the ACE development contractor and the program
office. Without complete information and an up-to-date plan, meaningful
congressional oversight of program progress and accountability is
impaired.
What GAO Recommends:
To help ensure the success of ACE, GAO recommends, among other things,
that DHS define and implement an ACE accountability framework that
provides for establishment of explicit program commitments for expected
system capabilities and benefits as well as cost and schedule, and
ensures that progress against these commitments is measured and
reported. DHS agreed with GAO's recommendations and described actions
that it plans to take to respond to them.
www.gao.gov/cgi-bin/getrpt?GAO-05-267.
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Randolph C. Hite at (202)
512-3439 or hiter@gao.gov.
[End of section]
Contents:
Letter:
Compliance with Legislative Conditions:
Status of Open Recommendations:
Observations on Management of ACE:
Conclusions:
Recommendations for Executive Action:
Agency Comments:
Appendixes:
Appendix I: Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations:
Appendix II: Comments from the U.S. Department of Homeland Security:
Appendix III: Contacts and Staff Acknowledgments:
GAO Contacts:
Staff Acknowledgments:
Abbreviations:
ACE: Automated Commercial Environment:
ACS: Automated Commercial System:
CBP: U.S. Customs and Border Protection:
CBPMO: Customs and Border Protection Modernization Office:
CIO: chief information officer:
CMU: Carnegie Mellon University:
EA: enterprise architecture:
eCP: e-Customs Partnership:
EVM: earned value management:
IDIQ: indefinite-delivery/indefinite-quantity:
IEEE: Institute of Electrical and Electronics Engineers:
IRB: Investment Review Board:
ITDS: International Trade Data System:
IV&V: independent verification and validation:
JAR: Java Archive:
OIG: Office of Inspector General:
OIT: Office of Information and Technology:
ORR: operational readiness review:
OTB: Over Target Baseline:
PRR: production readiness review:
PTR: program trouble report:
SA-CMM®: Software Acquisition Capability Maturity Model:
SAT: system acceptance test:
SDLC: systems development life cycle:
SEI: Software Engineering Institute:
SIT: system integration test:
SWIT: software integration test:
TRR: test readiness review:
UAT: user acceptance test:
US-VISIT: United States Visitor and Immigrant Status Indicator
Technology:
Letter March 14, 2005:
The Honorable Judd Gregg:
Chairman:
The Honorable Robert C. Byrd:
Ranking Minority Member:
Subcommittee on Homeland Security:
Committee on Appropriations:
United States Senate:
The Honorable Harold Rogers:
Chairman:
The Honorable Martin Olav Sabo:
Ranking Minority Member:
Subcommittee on Homeland Security:
Committee on Appropriations:
House of Representatives:
In November 2004, U.S. Customs and Border Protection (CBP), within the
Department of Homeland Security (DHS), submitted to the Congress its
fiscal year 2005 expenditure plan for the Automated Commercial
Environment (ACE) program. ACE is to be CBP's new import and export
processing system. The program's goals include facilitating the
movement of legitimate trade through more effective trade account
management and strengthening border security by identifying import and
export transactions that could pose a threat to the United States. DHS
currently plans to acquire and deploy ACE in 11 increments, referred to
as releases, over 9 years. The first 3 releases are deployed and
operating. The fourth release is in the final stages of testing. Later
releases are in various stages of definition and development. The risk-
adjusted ACE life-cycle cost estimate is about $3.3 billion,[Footnote
1] and through fiscal year 2004, about $1 billion in ACE-appropriated
funding has been provided.
As required by DHS's fiscal year 2005 appropriations,[Footnote 2] we
reviewed the ACE fiscal year 2005 expenditure plan. Our objectives were
to (1) determine whether the expenditure plan satisfies certain
legislative conditions, (2) determine the status of our open ACE
recommendations, and (3) provide any other observations about the
expenditure plan and DHS's management of the ACE program.
On December 20, 2004, we briefed your offices on the results of this
review. This report transmits the results of our work. The full
briefing, including our scope and methodology, can be found in appendix
I.
Compliance with Legislative Conditions:
The fiscal year 2005 expenditure plan satisfied or partially satisfied
the conditions specified in DHS's appropriations act. Specifically, the
plan, including related program documentation and program officials'
statements, satisfied or provided for satisfying all key aspects of (1)
meeting the capital planning and investment control review requirements
of the Office of Management and Budget (OMB) and (2) review and
approval by DHS and OMB. The plan partially satisfied the conditions
that specify (1) compliance with the DHS enterprise
architecture[Footnote 3] and (2) compliance with the acquisition rules,
requirements, guidelines, and systems acquisition management practices
of the federal government.
Status of Open Recommendations:
CBP is working toward addressing our open recommendations. Each
recommendation, along with the status of actions to address it, is
summarized below.
* Develop and implement a rigorous and analytically verifiable cost-
estimating program that embodies the tenets of effective estimating as
defined in the Software Engineering Institute's (SEI) institutional and
project-specific estimating models.[Footnote 4]
The CBP Modernization Office's (CBPMO) implementation of this
recommendation is in progress. CBPMO has (1) defined and documented
processes for estimating expenditure plan costs (including management
reserve costs); (2) hired a contractor to develop cost estimates,
including contract task orders, that are independent of the ACE
development contractor's estimates; and (3) tasked a support contractor
with evaluating the independent estimates and the development
contractor's estimates against SEI criteria. According to the summary-
level results of this evaluation, the independent estimates either
satisfied or partially satisfied the SEI criteria, and the development
contractor's estimates satisfied or partially satisfied all but two of
the seven SEI criteria.
* Ensure that future expenditure plans are based on cost estimates that
are reconciled with independent cost estimates.
CBPMO's implementation of this recommendation is complete with respect
to the fiscal year 2005 expenditure plan. In August 2004, CBP's support
contractor completed an analysis comparing the cost estimates in the
fiscal year 2005 expenditure plan (which are based on the ACE
development contractor's cost estimates) with the estimate prepared by
CBPMO's independent cost estimating contractor; this analysis concluded
that the two estimates are consistent.
* Immediately develop and implement a human capital management strategy
that provides both near- and long-term solutions to the program office's
human capital capacity limitations, and report quarterly to the
appropriations committees on the progress of efforts to do so.
CBPMO's implementation of this recommendation is in progress, and it
has reported on its actions to the Congress. Following our
recommendation, CBPMO provided reports dated March 31, 2004, and June
30, 2004, to the appropriations committees on its human capital
activities, including development of a staffing plan that identifies
the positions it needs to manage ACE. However, in December 2004, CBPMO
implemented a reorganization of the modernization office, which makes
the staffing plan out of date. As part of this reorganization, CBP
transferred government and contractor personnel who have responsibility
for the Automated Commercial System,[Footnote 5] the Automated
Targeting System,[Footnote 6] and ACE training from non-CBPMO
organizational units to CBPMO. According to CBPMO, this change is
expected to eliminate redundant ACE-related program management efforts.
* Have future ACE expenditure plans specifically address any proposals
or plans, whether tentative or approved, for extending and using ACE
infrastructure to support other homeland security applications,
including any impact on ACE of such proposals and plans.
CBP's implementation of this recommendation is in progress. In our
fiscal year 2004 expenditure plan review,[Footnote 7] we reported that
CBPMO had discussed collaboration opportunities with DHS's United
States Visitor and Immigrant Status Indicator Technology (US-VISIT)
program[Footnote 8] to address the potential for ACE infrastructure,
data, and applications to support US-VISIT. Since then, ACE and US-
VISIT managers have again met to identify potential areas for
collaboration between the two programs and to clarify how the programs
can best support the DHS mission. The US-VISIT and ACE programs have
formed collaboration teams that have drafted team charters, identified
specific collaboration opportunities, developed timelines and next
steps, and briefed ACE and US-VISIT program officials on the teams'
progress and activities.
* Establish an independent verification and validation (IV&V) function
to assist CBP in overseeing contractor efforts, such as testing, and
ensure the independence of the IV&V agent.
CBP has completed its implementation of this recommendation. To ensure
independence, CBPMO has selected an IV&V contractor that, according to
CBP officials, has had no prior involvement in the modernization
program. The IV&V contractor is to be responsible for reviewing ACE
products and management processes and is to report directly to the CBP
chief information officer.[Footnote 9]
* Define metrics, and collect and use associated measurements, for
determining whether prior and future program management improvements
are successful.
CBPMO's implementation of this recommendation is in progress. CBPMO has
implemented a program that generally focuses on measuring the ACE
development contractor's performance through the use of earned value
management,[Footnote 10] metrics for the timeliness and quality of
deliverables, and risk and issue disposition reporting. Additionally,
it is planning to broaden its program to encompass metrics and measures
for determining progress toward achieving desired business results and
acquisition process maturity. The plan for expanding the metrics
program is scheduled for approval in early 2005.
* Reconsider the ACE acquisition schedule and cost estimates in light
of early release problems, including these early releases' cascading
effects on future releases and their relatively small size compared to
later releases, and in light of the need to avoid the past levels of
concurrency among activities within and between releases.
CBP has completed its implementation of this recommendation. In
response to the cost overrun on Releases 3 and 4, CBPMO and the ACE
development contractor established a new cost baseline of $196 million
for these releases, extended the associated baseline schedule, and
began reporting schedule and cost performance relative to the new
baselines. Additionally, in July 2004, a new version of the ACE Program
Plan was developed that rebaselined the ACE program, extending delivery
of the last ACE release from fiscal year 2007 to fiscal year 2010,
adding a new screening and targeting release, and increasing the ACE
life-cycle cost estimate by about $1 billion to $3.1 billion. Last, the
new program schedule reflects less concurrency between future releases.
* Report quarterly to the House and Senate Appropriations Committees on
efforts to address open GAO recommendations.
CBP's implementation of this recommendation is in progress. CBP has
submitted reports to the committees on its efforts to address open GAO
recommendations for the quarters ending March 31, 2004, and June 30,
2004. CBPMO plans to submit a report for the quarter ending September
30, 2004, after it is approved by DHS and OMB.
Observations on Management of ACE:
We made observations related to ACE performance, use, testing,
development, cost and schedule performance, and expenditure planning.
An overview of the observations follows:
Initial ACE releases have largely met a key service level agreement.
According to a service level agreement between the ACE development
contractor and CBPMO, 99.9 percent of all ACE transactions are to be
executed successfully each day. The development contractor reports that
ACE has met this requirement on all but 11 days since February 1, 2004;
it attributed one problem, which caused the agreement to be missed on 5
successive days, to CBPMO's focus on meeting schedule commitments.
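The service level described above reduces to a simple daily success-rate check. A minimal sketch follows; the 99.9 percent threshold is from the agreement, but the transaction counts are invented for illustration, since the report does not publish them:

```python
# Hypothetical sketch of the daily ACE service-level check.
# The 99.9 percent threshold comes from the service level agreement;
# the transaction counts below are invented for illustration.

SLA_THRESHOLD = 0.999  # 99.9 percent of transactions must succeed each day


def meets_sla(successful: int, total: int) -> bool:
    """Return True if the day's success rate meets the agreement."""
    return total > 0 and successful / total >= SLA_THRESHOLD


# A day with a 0.9995 success rate meets the agreement; 0.9980 does not.
assert meets_sla(999_500, 1_000_000)
assert not meets_sla(998_000, 1_000_000)
```

At this threshold, even one failed transaction in a thousand puts a day out of compliance, which is why a handful of problem days stand out in the contractor's reporting.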
Progress toward establishing ACE user accounts has not met
expectations. CBPMO established a goal of activating 1,100 ACE importer
accounts by February 25, 2005, when Release 4 is to become operational.
Weekly targets were established to help measure CBPMO's progress toward
reaching the overall goal. However, CBPMO has not reached any of its
weekly targets, and the gap between the actual and targeted number of
activated accounts has continued to grow. To illustrate, as of November
26, 2004, the goal was 600 activated accounts and the actual number was
311.
Release 3 testing and pilot activities were delayed and have produced
system defect trends that raise questions about decisions to pass key
milestones and about the state of system maturity. Release 3 test
phases and pilot activities were delayed and revealed system defects,
some of which remained open at the time decisions were made to pass key
life-cycle milestones. In particular, we observed the following:
* Release 3 integration testing started later than planned, took longer
than expected, and was declared successful despite open defects that
prevented the system from performing as intended. For example, the test
readiness milestone was passed despite the presence of 90 severe
defects.
* Release 3 acceptance testing started later than planned, concluded
later than planned, and was declared successful despite having a
material inventory of open defects. For example, the production
readiness milestone was passed despite the presence of 18 severe
defects.
* Release 3 pilot activities, including user acceptance testing, were
declared successful, despite the presence of severe defects. For
example, the operational readiness milestone was passed despite the
presence of 6 severe defects.
* The current state of Release 3 maturity is unclear because defect
data reported since user acceptance testing are not reliable.
Release 4 test phases were delayed and overlapped, and they revealed
more numerous and more significant defects than expected, raising
questions about decisions to pass key milestones and about the state of
system maturity. In particular, we observed the following:
* Release 4 testing revealed a considerably higher than expected number
of material defects. Specifically, 3,059 material defects were
reported, compared with the 1,453 estimated, as of the November 23,
2004, production readiness milestone.
* Changes in the Release 4 integration and acceptance testing schedule
resulted in tests being conducted concurrently. As we previously
reported, concurrent test activities increase risk and have contributed
to past ACE cost and schedule problems.
* The defect profile for Release 4 shows improvements in resolving
defects, but critical and severe defects remain in the operational
system. Specifically, as of November 30, 2004, which was about 1.5
weeks from deployment of the Release 4 pilot period, 33 material
defects were present.
Performance against the revised cost and schedule estimates for
Releases 3 and 4 has been mixed. Since the cost and schedule for
Releases 3 and 4 were revised in April 2004, work has been completed
under the budgeted cost, but it is being completed behind schedule. In
order to improve the schedule performance, resources targeted for later
releases have been retained on Release 4 longer than planned. While
this has resulted in improved performance against the schedule, it has
adversely affected cost performance.
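Earned value management, the technique CBPMO uses to track contractor performance, expresses this mixed pattern (work completed under budgeted cost yet behind schedule) through two standard indices. The sketch below uses hypothetical dollar figures, since the report does not publish the underlying earned value data:

```python
# Standard earned value management (EVM) indices. EV is the budgeted cost
# of work actually performed (earned value), AC the actual cost of that
# work, and PV the budgeted cost of work scheduled to date (planned
# value). All figures below are hypothetical.


def evm_indices(ev: float, ac: float, pv: float) -> tuple[float, float]:
    """Return (CPI, SPI).

    CPI = EV / AC: greater than 1.0 means work is costing less than budgeted.
    SPI = EV / PV: less than 1.0 means work is being completed behind schedule.
    """
    return ev / ac, ev / pv


# The mixed pattern described above: under cost (CPI > 1) yet late (SPI < 1).
cpi, spi = evm_indices(ev=90.0, ac=85.0, pv=100.0)
assert cpi > 1.0 and spi < 1.0
```

The trade-off the report describes also follows from these definitions: retaining extra resources on Release 4 raises actual cost (pushing CPI down) while accelerating earned value (pushing SPI up).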
The fiscal year 2005 expenditure plan does not adequately describe
progress against commitments (e.g., ACE capabilities, schedule, cost,
and benefits) made in previous plans. In the fiscal year 2004
expenditure plan, CBPMO committed to, for example, acquiring
infrastructure for ACE releases and to defining and designing an ACE
release that was intended to provide additional account management
functionality. However, the current plan described neither the status
of infrastructure acquisition nor progress toward defining and
designing the planned account management functionality. Also, the
current plan included a schedule for developing ACE releases, but
neither reported progress relative to the schedule presented in the
fiscal year 2004 plan nor explained how the individual releases and
their respective schedules were affected by the rebaselining that
occurred after the fiscal year 2004 plan was submitted.
Some key bases for the commitments made in the fiscal year 2005
expenditure plan have changed, raising questions as to the plan's
currency and relevance. Neither the expenditure plan nor the program
plan reflected several program developments, including the following:
* A key Release 5 assumption made in the program and expenditure plans
regarding development, and thus cost and delivery, of the multimodal
manifest functionality is no longer valid.
* Additional releases, and thus cost and effort, are now planned that
were not reflected in the program and expenditure plans.
* The current organizational change management approach is not fully
reflected in program and expenditure plans, and key change management
actions are not to be implemented.
* Significant changes to the respective roles and responsibilities of
the ACE development contractor and CBPMO are not reflected in the
program and expenditure plans.
Conclusions:
DHS and OMB have largely satisfied four of the five conditions
associated with the fiscal year 2005 ACE expenditure plan that were
legislated by the Congress, and we have satisfied the fifth condition.
Further, CBPMO has continued to work toward implementing our prior
recommendations aimed at improving management of the ACE program and
thus the program's chances of success. Nevertheless, progress has been
slow in addressing some of our recommendations, such as the one
encouraging proactive management of the relationships between ACE and
other DHS border security programs, like US-VISIT. Given that these
programs have made and will continue to make decisions that determine
how they will operate, delays in managing their relationships will
increase the chances that later system rework will eventually be
required to allow the programs to interoperate.
Additionally, while DHS has taken important actions to help address ACE
release-by-release cost and schedule overruns that we previously
identified, it is unlikely that the effect of these actions will
prevent the past pattern of overruns from recurring. This is because
DHS has met its recently revised cost and schedule commitments in part
by relaxing system quality standards, so that milestones are being
passed despite material system defects, and because correcting such
defects will ultimately require the program to expend resources, such
as people and test environments, at the expense of later system
releases (some of which are now under way).
In the near term, cost and schedule overruns on recent releases are
being somewhat masked by the use of less stringent quality standards;
ultimately, efforts to fix these defects will likely affect the
delivery of later releases. Until accountability for ACE is redefined
and measured in terms of all types of program commitments--system
capabilities, benefits, costs, and schedules--the program will likely
experience more cost and schedule overruns.
During the last year, DHS's accountability for ACE has been largely
focused on meeting its cost and schedule baselines. This focus is
revealed by the absence of information in the latest expenditure plan
on progress against all commitments made in prior plans, particularly
with regard to measurement and reporting on such things as system
capabilities, use, and benefits. It is also shown by the program's
insufficient focus on system quality, as demonstrated by its
willingness to pass milestones despite material defects, and by the
absence of attention to the current defect profile for Release 3 (which
is already deployed).
Moreover, the commitments that DHS made in the fiscal year 2005
expenditure plan have been overcome by events, which limits the
currency and relevance of this plan and its utility to the Congress as
an accountability mechanism. As a result, the prospects of greater
accountability for delivering against the program's capability, benefit,
cost, and schedule commitments are limited.
that DHS define for itself and the Congress an accountability framework
for ACE, and that it manage and report in accordance with this
framework. If it does not, the effects of the recent rebaselining of
the program will be short lived, and the past pattern of ACE costing
more and taking longer than planned will continue.
Recommendations for Executive Action:
To strengthen accountability for the ACE program and better ensure that
future ACE releases deliver promised capabilities and benefits within
budget and on time, we recommend that the DHS Secretary, through the
Under Secretary for Border and Transportation Security, direct the
Commissioner, Customs and Border Protection, to define and implement an
ACE accountability framework that ensures:
* coverage of all program commitment areas, including key expected or
estimated system (1) capabilities, use, and quality; (2) benefits and
mission value; (3) costs; and (4) milestones and schedules;
* currency, relevance, and completeness of all such commitments made to
the Congress in expenditure plans;
* reliability of data relevant to measuring progress against
commitments;
* reporting in future expenditure plans of progress against commitments
contained in prior expenditure plans;
* use of criteria for exiting key readiness milestones that adequately
consider indicators of system maturity, such as severity of open
defects; and:
* clear and unambiguous delineation of the respective roles and
responsibilities of the government and the prime contractor.
Agency Comments:
In written comments on a draft of this report signed by the Acting
Director, Departmental GAO/OIG Liaison, DHS agreed with our findings
concerning progress in addressing our prior recommendations. In
addition, the department agreed with the new recommendations we are
making in this report and described actions that it plans to take to
enhance accountability for the program. These planned actions are
consistent with our recommendations. DHS's comments are reprinted in
appendix II.
We are sending copies of this report to the Chairmen and Ranking
Minority Members of other Senate and House committees and subcommittees
that have authorization and oversight responsibilities for homeland
security. We are also sending copies to the Secretary of Homeland
Security, the Under Secretary for Border and Transportation Security,
the CBP Commissioner, and the Director of OMB. In addition, the report
will be available at no charge on the GAO Web site at [Hyperlink,
http://www.gao.gov].
Should you or your offices have any questions on matters discussed in
this report, please contact me at (202) 512-3459 or at [Hyperlink,
hiter@gao.gov]. Other contacts and key contributors to this report are
listed in appendix III.
Signed by:
Randolph C. Hite:
Director, Information Technology Architecture and Systems Issues:
[End of section]
Appendixes:
Appendix I: Briefing to Subcommittees on Homeland Security, House and
Senate Committees on Appropriations:
Information Technology: Customs Automated Commercial Environment
Program Progressing, but Need for Management Improvements Continues:
Briefing to the Staffs of the Subcommittees on Homeland Security,
Senate and House Committees on Appropriations:
December 20, 2004:
Briefing Overview:
Introduction:
Objectives:
Results in Brief:
Background:
Results:
* Legislative Conditions:
* Status of Recommendations:
* Observations:
Conclusions:
Recommendations:
Agency Comments:
Attachment 1: Scope and Methodology:
Introduction:
The Department of Homeland Security's (DHS) Bureau of Customs and
Border Protection (CBP)[NOTE 1] is over 3 years into its second attempt
to introduce new trade processing capability, known as the Automated
Commercial Environment (ACE). The goals of ACE are to:
* facilitate the movement of legitimate trade through more effective
trade account management;
* strengthen border security by identifying import/export transactions
that have an elevated risk of posing a threat to the United States or
of violating a trade law or regulation; and:
* provide a single system interface between the trade community [NOTE
2] and the federal government, [NOTE 3] known as the International
Trade Data System (ITDS), and thereby reduce the data reporting burden
placed on the trade community while also providing federal agencies
with the data and various capabilities to support their respective
international trade and transportation missions.
NOTES:
[1] CBP was formed from the former U.S. Customs Service and other
entities with border protection responsibility.
[2] Members of the trade community include importers and exporters,
brokers and trade advisors, and carriers.
[3] Includes federal agencies responsible for managing international
trade and transportation processes.
The Department of Homeland Security Appropriations Act, 2005, [NOTE 1]
states that DHS may not obligate any funds for ACE until DHS submits
for approval to the House and Senate Committees on Appropriations a
plan for expenditure that:
1. meets the capital planning and investment control review
requirements established by the Office of Management and Budget (OMB),
including Circular A-11, part 7, [NOTE 2]
2. complies with DHS's enterprise architecture;
3. complies with the acquisition rules, requirements, guidelines, and
systems acquisition management practices of the federal government;
4. is reviewed and approved by the DHS Investment Review Board (IRB),
[NOTE 3] Secretary of Homeland Security, and OMB; and:
5. is reviewed by GAO.
NOTES:
[1] Pub. L. 108-334 (Oct. 18, 2004).
[2] OMB Circular A-11 establishes policy for planning, budgeting,
acquisition, and management of federal capital assets.
[3] The purpose of the Investment Review Board is to integrate capital
planning and investment control, budgeting, acquisition, and management
of investments. It is also to ensure that spending on investments
directly supports and furthers the mission and that this spending
provides optimal benefits and capabilities to stakeholders and
customers.
In the Department of Homeland Security Appropriations Act for fiscal
year 2005, the Congress appropriated approximately $321.7 million for
the ACE program. [NOTE 1]
DHS submitted its fiscal year 2005 expenditure plan for $321.7 million
on November 8, 2004, to its House and Senate Appropriations
Subcommittees on Homeland Security.
DHS currently plans to acquire and deploy ACE in 11 increments,
referred to as releases. The first three releases are deployed and
operational. The fourth release is in the final stages of testing.
Other releases are in various stages of definition and development.
NOTES:
[1] Pub. L. 108-334 (Oct. 18, 2004).
Objectives:
As agreed, our objectives were to:
* determine whether the ACE fiscal year 2005 expenditure plan satisfies
the legislative conditions,
* determine the status of our open recommendations on ACE, and:
* provide any other observations about the expenditure plan and DHS's
management of the ACE program.
We conducted our work at CBP headquarters and contractor facilities in
the Washington, D.C., metropolitan area from April 2004 through
December 2004, in accordance with generally accepted government
auditing standards. Details of our scope and methodology are provided
in attachment 1.
Results in Brief:
Objective 1: Satisfaction of legislative conditions:
Legislative conditions: 1. Meets the capital planning and investment
control review requirements established by OMB, including OMB Circular
A-11, part 7.
Status: Satisfied[A].
Legislative conditions: 2. Complies with DHS's enterprise architecture.
Status: Partially satisfied[B].
Legislative conditions: 3. Complies with the acquisition rules,
requirements, guidelines, and systems acquisition management practices
of the federal government.
Status: Partially satisfied.
Legislative conditions: 4. Is reviewed and approved by the DHS
Investment Review Board, Secretary of Homeland Security, and OMB.
Status: Satisfied.
Legislative conditions: 5. Is reviewed by GAO.
Status: Satisfied.
Source: GAO.
[A] Satisfied means that the plan, in combination with supporting
documentation, either satisfied or provides for satisfying every aspect
of the condition that we reviewed.
[B] Partially satisfied means that the plan, in combination with
supporting documentation, either satisfied or provides for satisfying
many, but not all, key aspects of the condition that we reviewed.
[End of table]
Objective 2: Status of actions to implement our open recommendations:
GAO recommendations: Develop and implement a rigorous and analytically
verifiable cost estimating program.
Status: In progress[A].
GAO recommendations: Ensure that future expenditure plans are based on
cost estimates that are reconciled with independent cost estimates.
Status: Complete[B,C].
GAO recommendations: Immediately develop and implement a human capital
management strategy that provides both near and long-term solutions;
develop and implement missing human capital practices.
Status: In progress.
GAO recommendations: Have future ACE expenditure plans specifically
address any proposals or plans for extending and using ACE
infrastructure to support other homeland security applications.
Status: In progress.
[A] In progress means that actions are under way to implement the
recommendation.
[B] Complete means that actions have been taken to fully implement the
recommendation.
[C] With respect to the fiscal year 2005 expenditure plan.
Objective 2: Status of actions to implement our open recommendations:
GAO recommendations: Establish an independent verification and
validation (IV&V) function to assist CBP in overseeing contractor
efforts, such as testing, and ensure the independence of the IV&V
agent. [NOTE 1]
Status: Complete.
GAO recommendations: Reconsider the ACE acquisition schedule and cost
estimates in light of early release problems and the need to avoid past
levels of concurrency among activities within and between releases.
Status: Complete.
GAO recommendations: Define metrics, and collect and use associated
measurements, for determining whether prior and future program
management improvements are successful.
Status: In progress.
GAO recommendations: Report quarterly to the House and Senate
Appropriations Committees on efforts to address open GAO
recommendations.
Status: In progress.
Source: GAO.
[End of section]
NOTES:
[1] The purpose of IV&V is to increase the chances of program success
by having independent reviews of program management processes and
products throughout the acquisition and deployment phase.
Objective 3: Observations:
* Initial ACE releases have largely met a key service level agreement.
* Progress toward establishing ACE user accounts has not met
expectations.
* Release 3 testing and pilot activities were delayed and have produced
system defect trends that raise questions about decisions to pass key
milestones and about the state of system maturity.
- Release 3 integration testing started later than planned, took longer
than expected, and was declared successful despite open defects that
prevented the system from performing as intended.
- Release 3 acceptance testing started later than planned, concluded
later than planned, and was declared successful despite material
inventory of open defects.
- Release 3 pilot activities, including user acceptance testing, were
declared successful despite severe defects remaining open.
- Current state of Release 3 maturity is unclear because defect data
since user acceptance testing are not reliable.
* Release 4 test phases were delayed and overlapped, and revealed a
higher than expected volume and significance of defects, raising
questions about decisions to pass key milestones and about the state of
system maturity.
- Release 4 testing revealed a considerably higher than expected number
of material defects.
- Release 4 integration and acceptance testing schedule changes
resulted in tests being conducted concurrently.
- Release 4 defect profile shows improvements in resolving defects, but
critical and severe defects remain in operational system.
* Performance against the revised cost and schedule estimates for
Releases 3 and 4 has been mixed.
* The fiscal year 2005 expenditure plan does not adequately describe
progress against commitments (e.g., ACE capabilities, schedule, cost,
and benefits) made in previous plans.
* Some key bases for the commitments made in the fiscal year 2005
expenditure plan have changed, raising questions as to the plan's
currency and relevance.
- A key Release 5 assumption underpinning program and expenditure plans
is no longer valid.
- Additional release(s) are now planned that were not reflected in the
program and expenditure plans.
- The current organizational change management approach is not fully
reflected in program and expenditure plans, and key change management
actions are not to be implemented.
- Recent changes to the respective roles and responsibilities of the
ACE development contractor and CBP's Modernization Office are not
reflected in the program and expenditure plans.
We are making recommendations to the DHS Secretary to strengthen
accountability for the ACE program and better ensure that future ACE
releases deliver expected capabilities and benefits within budget and
on time.
In their oral comments on a draft of this briefing, DHS and CBP
officials, including the DHS Chief Information Officer (CIO), the
Border and Transportation Security CIO, and the CBP Acting CIO,
generally agreed with our findings, conclusions, and recommendations
and stated that the briefing was fair and balanced. They also provided clarifying
information that we incorporated as appropriate in this briefing.
Background:
ACE-Related Business Functions:
ACE is to support eight major CBP business areas.
1. Release Processing: Processing of cargo for import or export;
tracking of conveyances, cargo and crew; and processing of in-bond,
warehouse, Foreign Trade Zone, and special import and export entries.
2. Entry Processing: Liquidation and closeout of entries and entry
summaries related to imports, and processing of protests and decisions.
3. Finance: Recording of revenue, performance of fund accounting, and
maintenance of the general ledger.
4. Account Relationships: Maintenance of trade accounts, their bonds
and CBP-issued licenses, and their activity.
5. Legal and Policy: Management of import and export legal, regulatory,
policies and procedures, and rulings issues.
6. Enforcement: Enforcement of laws, regulations, policies and
procedures, and rulings governing the import and export of cargo,
conveyances, and crew.
7. Business Intelligence: Gathering and reporting data, such as
references for import and export transactions, for use in making
admissibility and release decisions.
8. Risk: Decisionmaking about admissibility and compliance of cargo
using risk-based mitigation, selectivity, and targeting.
Background:
Description of ACE Technical Architecture:
The ACE technical architecture is to consist of layers or tiers of
computer technology:
* The Client Tier includes user workstations and external system
interfaces.
* The Presentation Tier provides the mechanisms for the user
workstations and external systems to access ACE.
* The Integration Services Tier provides the middleware for integrating
and routing information between ACE software applications and legacy
systems.
* The Applications Tier includes software applications comprising
commercial products (e.g., SAP [NOTE 1]) and custom-developed software
that provide the functionality supporting CBP business processes.
* The Data Tier provides the data management and warehousing services
for ACE, including database backup, restore, recovery, and space
management.
Security and data privacy are to be embedded in all five layers.
NOTE:
[1] SAP is a commercial enterprise resource planning software product
that has multiple modules, each performing separate but integrated
business functions. ACE will use SAP as the primary commercial, off-the-
shelf product supporting its business processes and functions. CBP's
Modernization Office is also using SAP as part of a joint project with
its Office of Finance to support financial management, procurement,
property management, cost accounting, and general ledger processes.
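As an illustration only, the tiered request flow described above can be sketched in code; every class and method name below is hypothetical and is not taken from ACE design documentation.

```python
# Illustrative only: a minimal sketch of a five-tier flow like the one
# described above. All names are hypothetical, not taken from ACE.
class DataTier:
    """Data management and warehousing (backup/restore not modeled)."""
    def __init__(self):
        self._store = {}
    def save(self, key, record):
        self._store[key] = record
    def load(self, key):
        return self._store.get(key)

class ApplicationsTier:
    """Business logic (commercial and custom-developed applications)."""
    def __init__(self, data):
        self.data = data
    def process_entry(self, entry_id, payload):
        self.data.save(entry_id, payload)
        return {"entry_id": entry_id, "status": "accepted"}

class IntegrationServicesTier:
    """Middleware routing between applications and legacy systems."""
    def __init__(self, apps):
        self.apps = apps
    def route(self, message):
        return self.apps.process_entry(message["entry_id"], message["payload"])

class PresentationTier:
    """Access point for user workstations and external systems."""
    def __init__(self, integration):
        self.integration = integration
    def submit(self, entry_id, payload):
        # Security and privacy controls would apply at every tier.
        return self.integration.route({"entry_id": entry_id, "payload": payload})

# The client tier (a workstation or external interface) calls the presentation tier.
presentation = PresentationTier(IntegrationServicesTier(ApplicationsTier(DataTier())))
print(presentation.submit("E-1001", {"duties": 125.50}))
```

The point of the layering is that each tier depends only on the tier beneath it, so, for example, a legacy system behind the integration services tier can be replaced without changing the presentation tier.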
Background:
ACE Technical Architecture:
Simplified View of ACE Technical Architecture:
[See PDF for image]
Source: GAO based on CBP data.
[End of figure]
Background:
Acquisition Strategy:
CBP's Modernization Office (CBPMO) is responsible for acquiring and
implementing ACE through a contract awarded on April 27, 2001, to IBM
Global Services. IBM and its subcontractors are collectively called the
e-Customs Partnership (eCP).
CBPMO's initial strategy provided for acquiring ACE in four increments
deployed over 4 years. In September 2002, the modernization office
modified this strategy to acquire and deploy the first three increments
in six releases; all four increments were to be deployed over 4 years.
In October 2003, CBPMO changed its plans, deciding to acquire and
deploy ACE in 10 releases over 6 years.
Subsequently, between January and July 2004, CBPMO and eCP conducted a
planning project called the Global Business Blueprint. It was intended
to define how ACE will use SAP and other technologies to perform CBP
business processes in Releases 5, 6, and 7; to define the functional
scope of these releases; and to develop updated program schedule and
cost estimates. Following the blueprint, CBP changed its acquisition
strategy again. It currently plans to acquire and deploy ACE in 11
releases over 9 years.
Background:
Summary of ACE Releases:
The functionality associated with, status of, and plans for the 11 ACE
releases are as follows.
Release 1 (ACE Foundation): Provide IT infrastructure-computer hardware
and system software-to support subsequent system releases. This release
was deployed in October 2003 and is operating.
Release 2 (Account Creation): Give initial group of CBP national
account managers [NOTE 1] and importers access to account information,
such as trade activity. This release was deployed in October 2003 and
is operating.
Release 3 (Periodic Payment): Provide additional account managers and
importers, as well as brokers and carriers, [NOTE 2] access to account
information; provide initial financial transaction processing and CBP
revenue collection capability, allowing importers and their brokers to
make monthly payments of duties and fees.
NOTES:
[1] CBP national account managers work with the largest importers.
[2] Brokers obtain licenses from CBP to conduct business on behalf of
the importers by filling out paperwork and obtaining a bond; carriers
are individuals or organizations engaged in transporting goods for
hire.
This release was deployed in July 2004 and is operating. As a result,
CBP reports that importers can now obtain a national view of their
transactions on a monthly statement and can pay duties and fees on a
monthly basis for the first time since CBP and its predecessor
organizations were established in 1789. Additionally, according to CBP,
Release 3 provides a national view of trade activity, thus greatly
enhancing its ability to accomplish its mission of providing border
security while facilitating legitimate trade and travel. CBP also
reports that as of December 6, 2004, it had processed 27,777 entries
and collected over $126.5 million using Release 3.
Release 4 (e-Manifest: Trucks): Provide truck manifest [NOTE 1]
processing and interfacing to legacy enforcement systems and databases.
This release is under development and scheduled for deployment
beginning in February 2005.
Screening S1 (Screening Foundation): Establish the foundation for
screening and targeting cargo and conveyances by centralizing criteria
and results into a single standard database; allow users to define and
maintain data sources and business rules. This release is scheduled for
deployment beginning in September 2005.
NOTE:
[1] Manifests are lists of passengers or invoices of cargo for a
vehicle, such as a truck, ship, or plane.
Background:
Summary of ACE Releases:
Screening S2 (Targeting Foundation): Establish the foundation for
advanced targeting capabilities by enabling CBP's National Targeting
Center to search multiple databases for relevant facts and actionable
intelligence. This release is scheduled for deployment beginning in
February 2006.
Release 5 (Account Revenue and Secure Trade Data): Leverage SAP
technologies to enhance and expand accounts management, financial
management, and postrelease functionality, as well as provide the
initial multimodal manifest [NOTE 1] capability. This release is
scheduled for deployment beginning in November 2006.
Screening S3 (Advanced Targeting): Provide enhanced screening for
reconciliation, intermodal manifest, Food and Drug Administration data,
and in-bond, warehouse, and Foreign Trade Zone authorized movements;
integrate additional data sources into targeting capability; provide
additional analytical tools for screening and targeting data. This
release is scheduled for deployment beginning in February 2007.
NOTES:
[1] The multimodal manifest involves the processing and tracking of
cargo as it transfers between different modes of transportation, such
as cargo that arrives by ship, is transferred to a truck, and then is
loaded onto an airplane.
Background:
Summary of ACE Releases:
Screening S4 (Full Screening and Targeting): Provide screening and
targeting functionality supporting all modes of transportation and all
transactions within the cargo management lifecycle, including enhanced
screening and targeting capability with additional technologies. This
release is scheduled for deployment beginning in February 2009.
Release 6 (e-Manifest: All Modes and Cargo Security): Provide enhanced
postrelease functionality by adding full entry processing; enable full
tracking of cargo, conveyance, and equipment; enhance the multimodal
manifest to include shipments transferring between transportation
modes. This release is scheduled for deployment beginning in February
2009.
Release 7 (Exports and Cargo Control): Implement the remaining ACE
functionality, including Foreign Trade
Zone warehouse; export, seized asset and case tracking system; import
activity summary statement; and mail, pipeline, hand carry, drawback,
protest, and document management. This release is scheduled for
deployment beginning in May 2010.
The graphic on the following slide illustrates the planned schedule for
ACE.
Background:
Current ACE Schedule:
[See PDF for image]
Source: GAO analysis of CBP data.
[End of figure]
Background:
ACE Satisfaction of Modernization Act Requirements:
ACE is intended to support CBP satisfaction of the provisions of Title
VI of the North American Free Trade Agreement Implementation Act,
commonly known as the Customs Modernization Act. Subtitle B of the
Modernization Act contains the
various automation provisions that were intended to enable the
government to modernize international trade processes and permit CBP to
adopt an informed compliance approach with industry. The following
table illustrates how each ACE release is to fulfill the requirements
of Subtitle B.
Background:
ACE Satisfaction of Modernization Act Requirements:
[See PDF for image]
Source: CBP.
[End of figure]
Background:
Contract Tasks:
Thus far, CBPMO has executed 21 contract task orders. The following
table describes and provides the status of the executed eCP task
orders.
Status and description of eCP task orders:
[See PDF for image]
[End of table]
Background:
Chronology of Six ACE Expenditure Plans:
Since March 2001, six ACE expenditure plans have been submitted. [NOTE
1] Collectively, the six plans have identified a total of $1,401.5
million in funding.
* On March 26, 2001, CBP submitted to its appropriations committees the
first expenditure plan seeking $45 million for the modernization
contract to sustain CBPMO operations, including contractor support. The
appropriations committees subsequently approved the use of $45 million,
bringing the total ACE funding to $50 million.
* On February 1, 2002, the second expenditure plan sought $206.9
million to sustain CBPMO operations; define, design, develop, and
deploy Increment 1, Release 1 (now Releases 1 and 2); and identify
requirements for Increment 2 (now part of Releases 5, 6, and 7 and
Screenings 1 and 2). The appropriations committees subsequently
approved the use of $188.6 million, bringing total ACE funding to
$238.6 million.
NOTE:
[1] In March 2001, appropriations committees approved the use of $5
million in stopgap funding to fund program management office
operations.
* On May 24, 2002, the third expenditure plan sought $190.2 million to
define, design, develop, and implement Increment 1, Release 2 (now
Releases 3 and 4). The appropriations committees subsequently approved
the use of $190.2 million, bringing the total ACE funding to $428.8
million.
* On November 22, 2002, the fourth expenditure plan sought $314 million
to operate and maintain Increment 1 (now Releases 1, 2, 3, and 4); to
design and develop Increment 2, Release 1 (now part of Releases 5, 6,
and 7 and Screening 1); and to define requirements and plan Increment 3
(now part of Releases 5, 6, and 7 and Screenings 2, 3, and 4). The
appropriations committees subsequently approved the use of $314
million, bringing total ACE funding to $742.8 million.
* On January 21, 2004, the fifth expenditure plan sought $318.7 million
to implement ACE infrastructure; to support, operate, and maintain ACE;
and to define and design Release 6 (now part of Releases 5, 6, and 7)
and Selectivity 2 (now Screenings 2 and 3). The appropriations
committees subsequently approved the use of $316.8 million, bringing
total ACE funding to $1,059.6 million.
* On November 8, 2004, CBP submitted its sixth expenditure plan,
seeking $321.7 million for detailed design and development of Release 5
and Screening 2, definition of Screening 3, Foundation Program
Management, Foundation Architecture and Engineering, and ACE Operations
and Maintenance.
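The running totals in the chronology above can be checked arithmetically: the cumulative funding after each plan equals the prior total plus the approved amount, and the $1,401.5 million identified across the six plans is the $5 million stopgap plus the amounts sought. A quick verification sketch:

```python
# Check of the funding chronology above (all figures in $ millions).
plans = [  # (sought, approved, reported cumulative total after approval)
    (45.0, 45.0, 50.0),
    (206.9, 188.6, 238.6),
    (190.2, 190.2, 428.8),
    (314.0, 314.0, 742.8),
    (318.7, 316.8, 1059.6),
]
STOPGAP = 5.0  # March 2001 stopgap funding (see note)

total = STOPGAP
for sought, approved, reported in plans:
    total += approved
    assert round(total, 1) == reported, (total, reported)

# The $1,401.5 million identified across all six plans is the stopgap
# plus the amounts sought (the sixth plan sought $321.7 million, not
# yet approved at the time of the briefing).
identified = STOPGAP + sum(s for s, _, _ in plans) + 321.7
assert round(identified, 1) == 1401.5
print(round(total, 1), round(identified, 1))  # prints: 1059.6 1401.5
```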
Background:
Summary of Expenditure Plan Funding:
Summary of the ACE fiscal year 2005 expenditure plan:
Plan activity: Manifest/Entry & Revenue, Design and Development;
Funding: $40.0.
Plan activity: e-Manifest: Trucks (Release 4) Deployment;
Funding: $10.3.
Plan activity: Screening and Targeting, Design and Development;
Funding: $27.0.
Plan activity: Implementation Infrastructure and Support;
Funding: $55.4.
Plan activity: Foundation Program Management;
Funding: $40.5.
Plan activity: Foundation Architecture And Engineering;
Funding: $20.5.
Plan activity: Workforce Transformation;
Funding: $5.5.
Plan activity: Operations and Maintenance;
Funding: $45.5.
Plan activity: CBPMO Costs;
Funding: $48.6.
Plan activity: ITDS;
Funding: $16.2.
Plan activity: Management Reserve;
Funding: $12.2.
Total Funding[A]: $321.7.
Source: CBP.
[A] Millions of dollars.
[End of table]
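The eleven plan activities in the table above sum exactly to the $321.7 million requested, which can be confirmed directly:

```python
# Check that the fiscal year 2005 plan activities above sum to the
# $321.7 million total (all figures in millions of dollars).
activities = {
    "Manifest/Entry & Revenue, Design and Development": 40.0,
    "e-Manifest: Trucks (Release 4) Deployment": 10.3,
    "Screening and Targeting, Design and Development": 27.0,
    "Implementation Infrastructure and Support": 55.4,
    "Foundation Program Management": 40.5,
    "Foundation Architecture and Engineering": 20.5,
    "Workforce Transformation": 5.5,
    "Operations and Maintenance": 45.5,
    "CBPMO Costs": 48.6,
    "ITDS": 16.2,
    "Management Reserve": 12.2,
}
total = round(sum(activities.values()), 1)
assert total == 321.7
print(total)  # prints: 321.7
```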
Objective 1 Results:
Legislative Conditions:
DHS and OMB satisfied or partially satisfied each of the legislative
conditions applicable to them; GAO satisfied its legislative condition.
Condition 1. The plan, in conjunction with related program
documentation and program officials' statements, satisfied the capital
planning and investment control review requirements established by OMB,
including Circular A-11, part 7, which establishes policy for planning,
budgeting, acquisition, and management of federal capital assets.
The table that follows provides examples of the results of our
analysis.
Examples of A-11 conditions: Provide justification and describe
acquisition strategy.
Results of our analysis: The plan provides a high-level justification
for ACE. Supporting documentation describes the acquisition strategy
for ACE releases, including Release 5 and Screening 2 activities that
are identified in the fiscal year 2005 expenditure plan.
Examples of A-11 conditions: Summarize life cycle costs and
cost/benefit analysis, including the return on investment.
Results of our analysis: CBPMO issued a cost/benefit analysis for ACE
on September 16, 2004. This analysis includes a life cycle cost
estimate of $3.1 billion and a benefit cost ratio of 2.7.
Examples of A-11 conditions: Provide performance goals and measures.
Results of our analysis: The plan and supporting documentation describe
some goals and measures. For example, CBPMO has established goals for
time and labor savings expected to result from using the early ACE
releases, and it has begun or plans to measure results relative to
these goals and measures. It has defined measures and is collecting
data for other goals, such as measures for determining its progress
toward defining the complete set of ACE functional requirements.
Examples of A-11 conditions: Address security and privacy.
Results of our analysis: The security of Release 3 was certified on May
28, 2004, and accredited on June 9, 2004. Release 4 was certified on
November 23, 2004, and accredited on December 2, 2004. CBP plans to
certify and accredit future releases. CBPMO reports that it is
currently preparing a privacy impact assessment for ACE.
Examples of A-11 conditions: Address Section 508 compliance.[A]
Results of our analysis: CBPMO deployed Release 3 and plans to deploy
Release 4 without Section 508 compliance because the requirement was
overlooked and not built into either release. CBPMO has finalized and
begun implementing a strategy that is expected to result in full
Section 508 compliance. For example, CBPMO has defined a set of Section
508 requirements to be used in developing later ACE releases.
Source: GAO.
[A] Section 508 of the Rehabilitation Act (29 U.S.C. 794d), as amended
by the Workforce Investment Act of 1998 (Pub. L. 105-220), August 7,
1998, requires federal agencies to develop, procure, maintain, and use
electronic information technology in a way that ensures that the
technology is accessible to people with disabilities.
[End of table]
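A benefit-cost ratio is total (discounted) benefits divided by total (discounted) costs, so the figures cited above imply a benefits estimate that the plan itself does not state. A sketch of the arithmetic, with the benefits figure inferred rather than reported:

```python
# Sketch of the benefit-cost ratio arithmetic implied by the September
# 2004 cost/benefit analysis. The benefits total below is inferred from
# the reported figures, not stated in the plan.
life_cycle_cost = 3.1   # $ billions, reported life cycle cost estimate
bcr = 2.7               # reported benefit-cost ratio
implied_benefits = bcr * life_cycle_cost
print(f"Implied life cycle benefits: ${implied_benefits:.1f} billion")
# prints: Implied life cycle benefits: $8.4 billion
```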
Objective 1 Results Legislative Conditions:
Condition 2. The plan, including related program documentation and
program officials' statements, partially satisfied this condition by
providing for future compliance with DHS's enterprise architecture
(EA).
DHS released version 1.0 of the architecture in September 2003. [NOTE
1] We reviewed the initial version of the architecture and found that
it was missing, either partially or completely, all the key elements
expected in a well-defined architecture, such as a description of
business processes, information flows among these processes, and
security rules associated with these information flows. [NOTE 2] Since
we reviewed version 1.0, DHS has drafted version 2.0 of its EA. We have
not reviewed this draft.
According to CBPMO officials, they have been working with the DHS EA
program office in developing version 2.0 to ensure that ACE is aligned
with DHS's evolving EA. They also said that CBP participates in both
the DHS EA Center of Excellence and the DHS Enterprise Architecture
Board. [NOTE 3]
NOTES:
[1] Department of Homeland Security Enterprise Architecture Compendium
Version 1.0 and Transitional Strategy.
[2] GAO, Homeland Security. Efforts Under Way to Develop Enterprise
Architecture, but Much Work Remains, GAO-04-777 (Washington, D.C.: Aug.
6, 2004).
[3] The Center of Excellence supports the Enterprise Architecture Board
in reviewing component documentation. The purpose of the Board is to
ensure that investments are aligned with the DHS EA.
In August 2004, the Center of Excellence approved CBPMO's analysis
intended to demonstrate ACE's architectural alignment, and the
Enterprise Architecture Board subsequently concurred with the center's
approval. However, DHS has not yet provided us with sufficient
documentation to allow us to understand DHS's architecture compliance
methodology and criteria (e.g., definition of alignment and compliance)
or with verifiable analysis justifying the approval.
Condition 3. The plan, in conjunction with related program
documentation, partially satisfied the condition of compliance with the
acquisition rules, requirements, guidelines, and systems acquisition
management practices of the federal government.
The Software Acquisition Capability Maturity Model (SA-CMM®), developed
by Carnegie Mellon University's Software Engineering Institute (SEI),
is consistent with the acquisition guidelines and systems acquisition
management practices of the federal government, and it provides a
management framework that defines processes for acquisition planning,
solicitation, requirements development and management, project
management, contract tracking and oversight, and evaluation.
In November 2003, SEI assessed ACE acquisition management against the
SA-CMM and assigned a level 2 rating, indicating that CBPMO has
instituted basic acquisition management processes and controls in the
following areas: acquisition planning, solicitation, requirements
development and management, project management, contract tracking and
oversight, and evaluation.
In June 2003, the Department of the Treasury's Office of Inspector
General (OIG) issued a report on the ACE program's contract, concluding
that the former Customs Service, now CBP, did not fully comply with
Federal Acquisition Regulation requirements in the solicitation and
award of its contract because the ACE contract is a multiyear contract
and not an indefinite-delivery/indefinite-quantity (IDIQ) contract.
Further, the Treasury OIG found that the ACE contract type, which it
determined to be a multiyear contract, is not compatible with the
program's stated needs for a contract that can be extended to a total
of 15 years, because multiyear contracts are limited to 5 years.
Additionally, the Treasury OIG found that Customs combined multiyear
contracting with IDIQ contracting practices. For example, it plans to
use contract options to extend the initial 5-year performance period.
CBP disagrees with the Treasury OIG conclusion.
To resolve the disagreement, DHS asked GAO to render a formal decision.
We are currently reviewing the matter.
Condition 4. DHS and OMB satisfied the condition that the plan be
reviewed and approved by the DHS IRB, the Secretary of Homeland
Security, and OMB.
On August 18, 2004, the DHS IRB reviewed the ACE program, including ACE
fiscal year 2005 cost, schedule, and performance plans. The DHS Deputy
Secretary, who chairs the IRB, delegated further review of the fiscal
year 2005 efforts, including review and approval of the fiscal year
2005 ACE expenditure plan, to the Under Secretary for Management, with
support from the Chief Financial Officer, Chief Information Officer,
and Chief Procurement Officer, all of whom are IRB members. The Under
Secretary for Management approved the expenditure plan on behalf of the
Secretary of Homeland Security on November 8, 2004.
OMB approved the plan on October 15, 2004.
Condition 5. GAO satisfied the condition that it review the plan. Our
review was completed on December 17, 2004.
Objective 2 Results:
Open Recommendations:
Open recommendation 1: Develop and implement a rigorous and
analytically verifiable cost estimating program that embodies the
tenets of effective estimating as defined in SEI's institutional and
project-specific estimating models. [NOTE 1]
Status: In progress:
CBPMO has taken several steps to strengthen its cost estimating
program. First, the program office has defined and documented processes
for estimating expenditure plan costs (including management reserve
costs). Second, it hired a contractor to develop cost estimates,
including contract task orders, that are independent of eCP's
estimates. Third, it tasked a support contractor with evaluating the
independent and eCP estimates against SEI criteria. According to the
summary-level results of this evaluation, the independent estimates
either satisfied or partially satisfied the SEI criteria, and eCP's
estimates satisfied or partially satisfied all but two of the seven SEI
criteria (these were the criteria for calibration of estimates using
actual experience and for adequately reflecting program risks in
estimates). CBPMO officials have not yet provided us with the detailed
results of this analysis because they have not yet been approved.
NOTES:
[1] For these models, see SEI's Checklists and Criteria for Evaluating
the Cost and Schedule Estimating Capabilities of Software Organizations
and A Manager's Checklist for Validating Software Cost and Schedule
Estimates.
Open recommendation 2: Ensure that future expenditure plans are based
on cost estimates that are reconciled with independent cost estimates.
Status: Complete[1]:
In August 2004, CBP's support contractor completed an analysis
comparing the cost estimates in the fiscal year 2005 expenditure plan,
which are based on the eCP's cost estimates, with the estimate prepared
by CBPMO's independent cost estimating contractor. This analysis, which
was completed 3 months before the fiscal year 2005 expenditure plan was
submitted to the Appropriations Committees, states that the two
estimates are consistent.
NOTES:
[1] With respect to the fiscal year 2005 expenditure plan.
Open recommendation 3: Immediately develop and implement a human
capital management strategy that provides both near- and long-term
solutions to program office human capital capacity limitations, and
report quarterly to the appropriations committees on the progress of
efforts to do so.
Status: In progress:
According to the expenditure plan, CBPMO has since developed a
modernization staffing plan that identifies the positions and staff it
needs to effectively manage ACE. However, CBPMO did not provide this
plan to us because it was not yet approved. Moreover, program officials
told us that the staffing plan is no longer operative because it was
developed before December 2004, when a modernization office
reorganization was implemented. As part of this reorganization, CBP
transferred to CBPMO the government and contractor personnel from
non-CBPMO organizational units who have responsibility for the
Automated Commercial System, [NOTE 1] the Automated Targeting System,
[NOTE 2] and ACE training. This change is expected to eliminate
redundant ACE-related program management efforts.
NOTES:
[1] The Automated Commercial System is CBP's system for tracking,
controlling, and processing imports to the United States.
[2] The Automated Targeting System is CBP's system for identifying
import shipments that warrant further attention.
Following our recommendation, CBPMO provided reports dated March 31,
2004, and June 30, 2004, to the appropriations committees on its human
capital activities, including development of the previously mentioned
staffing plan and related analysis to fully define CBPMO positions.
Additionally, it has reported on efforts to ensure that all
modernization office staff members complete a program management
training program.
Open recommendation 4: Have future ACE expenditure plans specifically
address any proposals or plans, whether tentative or approved, for
extending and using ACE infrastructure to support other homeland
security applications, including any impact on ACE of such proposals
and plans.
Status: In progress:
The ACE Program Plan states that ACE provides functions that are
directly related to the "passenger business process" underlying the
U.S. Visitor and Immigrant Status Indicator Technology (US-VISIT)
program, [NOTE 1] and integration of certain ACE and US-VISIT
components is anticipated. In recognition of this relationship, the
expenditure plan states that CBPMO and US-VISIT are working together to
identify lessons learned, best practices, and opportunities for
collaboration.
NOTES:
[1] US-VISIT is a governmentwide program to collect, maintain, and
share information on foreign nationals for enhancing national security
and facilitating legitimate trade and travel, while adhering to U.S.
privacy laws and policies.
Specifically:
* In February 2004, ACE and US-VISIT managers met to identify potential
areas for collaboration between the two programs and to clarify how the
programs can best support the DHS mission and provide officers with the
information and tools they need. During the meeting, US-VISIT and ACE
managers recognized that the system infrastructure built to support the
two programs is likely to become the infrastructure for future border
security processes and system applications. Further, they identified
four areas of collaboration: business cases; program management;
inventory; and people, processes, and technology. These areas were
later refined to be as follows:
* Program Management coordination, which includes such activities as
creating a high-level integrated master schedule for both programs and
sharing acquisition strategies, plans, and practices;
* Business Case coordination, including such business case activities
as OMB budget submissions and acquisition management baselines;
* Inventory, which includes identifying connections between legacy
systems and establishing a technical requirements and architecture team
to review, among other things, system interfaces, data formats, and
system architectures; and
* People, Processes, and Technology, which includes establishing teams
to review deployment schedules and establishing a team and process to
review and normalize business requirements.
According to CBPMO, scheduling and staffing constraints prevented any
collaboration activities from taking place between February and July
2004. In August 2004, the US-VISIT and ACE programs tasked their
respective contractors to form collaboration teams to address the four
areas identified at the February meeting. Nine teams were formed:
* DHS investment management;
* Organizational change management;
* Information and data;
* Privacy and security;
* Program management;
* Business;
* Facilities;
* Technology;
* Deployment, operations, and maintenance.
In September 2004, the teams met to develop team charters, identify
specific collaboration opportunities, and develop timelines and next
steps. In October 2004, CBPMO and US-VISIT program officials were
briefed on the progress and activities of the collaboration teams.
Open recommendation 5: Establish an independent verification and
validation (IV&V) function to assist CBP in overseeing contractor
efforts, such as testing, and ensure the independence of the IV&V
agent.
Status: Complete:
According to ACE officials, they have selected an IV&V contractor that
has had no prior involvement in the modernization program to ensure
independence. These officials stated that the IV&V contractor will be
responsible for reviewing ACE products and management processes, and
will report directly to the CBP CIO. Award of this contract is to occur
on December 30, 2004.
Open recommendation 6: Define metrics, and collect and use associated
measurements, for determining whether prior and future program
management improvements are successful.
Status: In progress:
CBPMO has implemented a metrics program that generally focuses on
measuring eCP's performance through the use of earned value management
(EVM), deliverable timeliness and quality metrics, and risk and issue
disposition reporting. Additionally, CBPMO is planning to broaden its
program to encompass metrics and measures for determining progress
toward achieving desired business results and acquisition process
maturity. The plan for expanding the metrics program is scheduled for
approval in early 2005.
One part of CBPMO's metrics program that it has implemented relates to
EVM for its contract with eCP. EVM is a widely accepted best practice
for measuring contractor progress toward meeting deliverables by
comparing the value of work accomplished during a given period with
that of the work expected in that period. Differences from expectations
are measured in the form of both cost and schedule variances.
* Cost variances compare the earned value of the completed work with
the actual cost of the work performed. For example, if a contractor
completed $5 million worth of work and the work actually cost $6.7
million, there would be a -$1.7 million cost variance. Positive cost
variances indicate that activities are costing less than planned,
while negative variances indicate that they are costing more.
* Schedule variances, like cost variances, are measured in dollars, but
they compare the earned value of the work completed to the value of
work that was expected to be completed. For example, if a contractor
completed $5 million worth of work at the end of the month, but was
budgeted to complete $10 million worth of work, there would be a -$5
million schedule variance. Positive schedule variances show that
activities are being completed sooner than planned. Negative variances
show activities are taking longer than planned.
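The two variance calculations above are simple subtractions of earned value against actual cost and planned value. The following sketch restates them using the report's own dollar examples; the function names are illustrative, not part of any particular EVM tool:

```python
# Illustrative sketch of the EVM variance arithmetic described above.
# Dollar figures (in millions) are the report's examples, not ACE data.

def cost_variance(earned_value: float, actual_cost: float) -> float:
    """CV = earned value of completed work minus its actual cost.
    Negative means the work cost more than planned."""
    return earned_value - actual_cost

def schedule_variance(earned_value: float, planned_value: float) -> float:
    """SV = earned value of completed work minus the value of work
    planned for the period. Negative means work is behind schedule."""
    return earned_value - planned_value

# Report's cost example: $5 million earned, $6.7 million actual cost.
cv = cost_variance(5.0, 6.7)       # -1.7 (million dollars)

# Report's schedule example: $5 million earned, $10 million budgeted.
sv = schedule_variance(5.0, 10.0)  # -5.0 (million dollars)

print(f"Cost variance: ${cv:.1f}M, schedule variance: ${sv:.1f}M")
```

Both variances are expressed in dollars, which is why a schedule slip shows up as "dollars of work not yet completed" rather than days.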
In accordance with EVM principles, eCP reports on its financial
performance monthly. These reports provide detailed information on cost
and schedule performance on work segments in each task order. Cost and
schedule variances that exceed a certain threshold are further examined
to determine the root cause of the variance, the impact on the program,
and mitigation strategies.
Open recommendation 7: Reconsider the ACE acquisition schedule and cost
estimates in light of early release problems, including these early
releases' cascading effects on future releases and their relatively
small size compared to later releases, and in light of the need to
avoid the past levels of concurrency among activities within and
between releases.
Status: Complete:
As we previously reported, the cost estimate for Releases 3 and 4 had
grown to $185.7 million, which was about $36.2 million over the
contract baseline, and the chances of further overruns were likely.
[NOTE 1] Subsequently, the Release 3 and 4 cost overrun grew to an
estimated $46 million, resulting in CBPMO and eCP establishing a new
cost baseline for Releases 3 and 4 of $196 million. eCP began reporting
performance against this new baseline in April 2004. Further, in July
2004, CBPMO and eCP changed the associated contract task order baseline
completion date from September 15, 2004, to May 30, 2005, revised the
associated interim task order milestones, and began reporting schedule
performance relative to the new baselines.
NOTE:
[1] GAO, Information Technology. Early Releases of Customs Trade System
Operating, but Pattern of Cost and Schedule Problems Needs to Be
Addressed, GAO-04-719 (Washington, D.C.: May 14, 2004).
In July 2004, eCP also rebaselined the ACE program, producing a new
version of the ACE Program Plan. The new baseline extends delivery of
the last ACE release from fiscal year 2007 to fiscal year 2010 and adds
a new screening and targeting release. The new program plan also
provides a new ACE life-cycle cost estimate of $3.1 billion, [NOTE 1]
which is a $1 billion increase over the previous life-cycle cost
estimate. According to the expenditure plan, the new schedule reflects
less concurrency between releases. The following figure compares
previous and current schedules for ACE releases and shows a reduction
in the level of concurrency between releases.
NOTES:
[1] CBP's ACE life-cycle cost estimate adjusted for risk is about $3.3
billion.
ACE Schedule as of October 2003 Compared with November 2004 Version:
[See PDF for image]
Source: GAO analysis of CBP data.
[End of figure]
Open recommendation 8: Report quarterly to the House and Senate
Appropriations Committees on efforts to address open GAO
recommendations.
Status: In progress:
CBPMO submitted reports to the Committees on its efforts to address
open GAO recommendations for the quarters ending March 31, 2004, and
June 30, 2004. CBPMO plans to submit a report for the quarter ending
September 30, 2004, after it is approved by DHS and OMB.
Objective 3 Results:
Observations:
Observation 1: Initial ACE releases have largely met a key service
level agreement.
According to a service level agreement between eCP and CBPMO, 99.9
percent of all ACE transactions are to be executed successfully each
day. eCP reports that ACE has met this requirement on all but 11 days
(shown below) since February 1, 2004.
Date: Percentage of daily transactions successful:
February 25, 2004: 89.86;
March 28, 2004: 90.83;
August 15, 2004: 99.70;
August 30, 2004: 98.06;
October 30, 2004: 99.86;
November 10, 2004: 99.50;
November 11, 2004: 87.17;
November 12, 2004: 87.17;
November 13, 2004: 91.44;
November 14, 2004: 96.83;
November 22, 2004: 95.49.
Source: eCP.
[End of table]
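The service level check described above amounts to a simple threshold comparison against the 99.9 percent requirement. The following sketch, using the 11 shortfall days reported by eCP, is illustrative only:

```python
# Sketch of the service level check described above: a day meets the
# SLA only if at least 99.9 percent of its transactions succeed.
# The dates and percentages are the 11 shortfall days reported by eCP.

SLA_THRESHOLD = 99.9  # percent of daily transactions that must succeed

shortfall_days = {
    "2004-02-25": 89.86,
    "2004-03-28": 90.83,
    "2004-08-15": 99.70,
    "2004-08-30": 98.06,
    "2004-10-30": 99.86,
    "2004-11-10": 99.50,
    "2004-11-11": 87.17,
    "2004-11-12": 87.17,
    "2004-11-13": 91.44,
    "2004-11-14": 96.83,
    "2004-11-22": 95.49,
}

def meets_sla(success_rate: float) -> bool:
    return success_rate >= SLA_THRESHOLD

# Every listed day falls below the threshold, including October 30,
# which missed by only 0.04 percentage points.
missed = [d for d, pct in shortfall_days.items() if not meets_sla(pct)]
print(f"{len(missed)} days missed the 99.9 percent SLA")
```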
For each day that the system did not meet the service level agreement,
eCP identified the root cause. For example, one of the incidents was
due to insufficient shutdown and startup procedures and another was
caused by an incorrectly configured Java Archive (JAR) file. [NOTE 1]
eCP also reported on actions taken to prevent a recurrence of the
problem. For example, eCP reported that it has amended the startup and
shutdown procedures, and made operators aware of the changes, and it
has implemented steps for correctly capturing changes to JAR file
configurations.
The November 10 to November 14 incidents were all attributed to a
single cause: a defect in a software update that allowed some trade
users to inappropriately view account information on other trade
accounts. According to the root cause analysis report, eCP corrected
the software error and then manually reviewed each account to ensure
that permissions had been set appropriately. However, this report also
raised questions as to whether system updates were being executed
without regard to risk mitigation in order to meet mandated schedules.
NOTES:
[1] Java (TM) Archive (JAR) files bundle multiple class files and
auxiliary resources associated with applets and applications into a
single archive file.
Observation 2: Progress toward establishing ACE user accounts has not
met expectations.
CBPMO established a goal of activating 1,100 ACE importer accounts by
February 25, 2005, which is when Release 4 is to become operational.
According to CBP, it is expected that the 1,100 accounts will represent
more than 50 percent of total import duty collected at ports.
To help measure progress toward reaching the overall goal of 1,100
accounts, CBPMO established weekly targets. One target was to have 600
accounts activated by November 26, 2004. However, CBPMO reported that
only 311 accounts had been activated as of that date, about 48 percent
below the interim target. In addition, since October 1, 2004, CBPMO
has not reached any of its weekly targets, and the gap between the
actual and targeted number of activated accounts has grown.
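The roughly 48 percent figure follows directly from the reported counts; a quick sketch of the arithmetic:

```python
# Sketch of the shortfall arithmetic above, using the report's
# figures: a 600-account interim target and 311 activated accounts.

target_accounts = 600
activated_accounts = 311

shortfall_pct = (target_accounts - activated_accounts) / target_accounts * 100
print(f"Activated accounts were {shortfall_pct:.0f} percent below the target")
```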
As of December 15, 2004, CBPMO reports that 347 accounts have been
activated. Further, CBPMO officials said that they expect rapid growth
in activated accounts as Release 4 is deployed. The following figure
shows the trend in target versus actual accounts activated.
Target Versus Actual Activated ACE Accounts:
[See PDF for image]
Source: CBP.
[End of figure]
CBPMO officials stated that they are currently analyzing the reasons
for the lower than expected number of user accounts. They also stated
that they have initiated more aggressive techniques to inform the trade
community about ACE benefits and to clarify the steps to participate.
Observation 3: Release 3 testing and pilot activities were delayed and
have produced system defect trends that raise questions about decisions
to pass key milestones and about the state of system maturity.
Development of each ACE release includes system integration and system
acceptance testing, followed by a pilot period that includes user
acceptance testing. Generally, the purpose of these tests is to
identify defects or problems either in meeting defined system
requirements or in satisfying system user needs. The purpose of the
associated readiness reviews is to ensure that the system satisfies
criteria for proceeding to the next stage of testing or operation.
Tests and their related milestones are described in the following
table.
[See PDF for image]
Source: eCP.
[A] Generally, the identified SDLC milestone review comes at the
conclusion of the related test.
[End of figure]
Defects identified during testing and operation of the system are
documented as program trouble reports (PTRs). Defects are classified
into one of four severity categories, as described below.
[See PDF for image]
Source: eCP.
[End of table]
Release 3 integration testing started later than planned, took longer
than expected, and was declared successful despite open defects that
prevented the system from performing as intended.
In September 2003, Release 3 system integration testing (SIT) was
scheduled to start on December 24, 2003, and last for 43 days. However,
the start of SIT testing was delayed until February 18, 2004, or about
2 months, and it lasted 56 days, or about 2 weeks longer than planned.
CBPMO officials attributed the delays in Release 3 testing to Release 2
testing delays, which caused the shared test environments to be
delivered late to Release 3, and to human capital being held on
Release 2 longer than planned. These officials also explained that the
additional 2 weeks for Release 3 integration testing were due to the
aforementioned late delivery of test environments, as well as to
last-minute design and development changes.
Release 3 SIT consisted of 85 test cases, all of which reportedly
either passed or passed with exceptions. Those tests passing with
exceptions generated defects, but because none of the test cases were
judged to have completely failed, SIT was declared to be successfully
executed. The test readiness review (TRR) milestone approval was
granted because the approval criteria did not stipulate that all
critical and severe defects had to be resolved, but rather that they
either had to be resolved or have approved work-off plans in place. As
a result, TRR approval occurred on April 26, 2004, even though CBPMO
reported that 2 critical and 90 severe defects were open at this time.
Of these 92 open defects, the two critical ones were reported closed 2
days after TRR, and 77 of the 90 severe defects were closed within the
next 2 weeks. The remaining severe defects were largely closed,
according to CBP, 4 weeks after TRR, with the final three closed on
June 21, 2004, 8 weeks after TRR.
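The difference between the relaxed exit criterion and the prior practice can be sketched as two predicates over the open defect list; the data structure and function names here are hypothetical illustrations, not CBPMO's actual tooling:

```python
# Hypothetical sketch of the readiness-review exit criteria described
# above. Under the relaxed criterion, a milestone can pass with open
# critical or severe defects as long as each has an approved work-off
# plan; under the prior practice, such defects had to be resolved.

from dataclasses import dataclass

@dataclass
class Defect:
    severity: str          # "critical", "severe", "moderate", or "minor"
    resolved: bool
    has_workoff_plan: bool

def passes_relaxed_review(defects: list[Defect]) -> bool:
    """Relaxed criterion: every critical/severe defect is resolved
    OR has an approved work-off plan."""
    return all(d.resolved or d.has_workoff_plan
               for d in defects if d.severity in ("critical", "severe"))

def passes_strict_review(defects: list[Defect]) -> bool:
    """Prior practice on earlier releases: critical/severe defects
    must actually be resolved before the milestone is passed."""
    return all(d.resolved
               for d in defects if d.severity in ("critical", "severe"))

# The Release 3 TRR situation: 2 critical and 90 severe defects open,
# all with work-off plans, passes the relaxed review but not the
# strict one.
open_defects = ([Defect("critical", False, True)] * 2 +
                [Defect("severe", False, True)] * 90)
print(passes_relaxed_review(open_defects))  # True
print(passes_strict_review(open_defects))   # False
```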
Given that critical defects by definition prevent the system from
performing mission-essential operations or jeopardize safety and
security, among other things, and severe defects prevent the system
from working as intended or produce errors that degrade system
performance, using criteria that permit one phase of testing to be
concluded and another phase to begin, despite having a large number of
such problems, introduces unnecessary risk.
Moreover, using such exit criteria represents a significant change from
the practice CBPMO followed on prior ACE releases, in which TRR could
not be passed if any critical defects were present, and Production
Readiness Review (PRR) could not be passed if any critical or severe
defects were present. In effect, this change in readiness review exit
criteria creates hidden overlap among test phases, as work to resolve
defects from a prior phase of testing occurs at the same time that work
is under way to execute a subsequent phase of testing. As we have
previously reported, such concurrency among test phases has contributed
to a recurring pattern of ACE release commitments not being met.
Release 3 acceptance testing started later than planned, concluded
later than planned, and was declared successful despite a material
inventory of open defects.
Release 3 system acceptance testing (SAT) was planned to start on March
5, 2004, and last for 38 days. Because of delays caused by changes to
the requirements baseline affecting the development of test cases, SAT
began on May 7, 2004, about 2 months later than planned, and before all
severe SIT defects were closed. In order to avoid further Release 3
schedule delays and maintain the PRR date of May 28, 2004, the SAT
period was shortened from 38 to 20 days, or approximately half of the
originally planned period. CBPMO officials noted that the program
completed SAT in the compressed schedule by investing the additional
resources needed to conduct tests 7 days a week, often for up to 12
hours each day.
Release 3 SAT consisted of 28 test cases, all of which reportedly
passed successfully. During the SAT test period from May 7 to May 27,
2004, 3 critical, 129 severe, and 19 moderate defects were found.
The exit criteria for Release 3 PRR also stipulated that all critical
and severe defects either be resolved or have work-off plans
identified. At the time of the PRR on May 28, 2004, CBP reported that
18 severe defects remained open. According to CBP, because these
defects were determined not to pose an unacceptable risk to the system,
their closure was intentionally delayed until after PRR. However, such
defects, according to CBPMO's own definition, preclude the system from
working as intended or produce errors that degrade system performance.
This is one reason why guidance on effective test practices generally
advocates closing such defects before concluding one phase of testing
and beginning the next.
Release 3 pilot activities, including user acceptance testing, were
declared successful, despite severe defects remaining open.
Two major activities conducted during the Release 3 Pilot Performance
Period were training for CBP and trade users and user acceptance
testing (UAT). This pilot period lasted from PRR on May 28, 2004,
until the operational readiness review (ORR) on August 25, 2004.
In training to prepare users to operate Release 3, business scenarios
were used that reflected daily job functions; training was conducted
over an 8- or 4-week period for CBP and trade users, respectively. This
training received an average user satisfaction score of about 4 on a 1
to 5 scale, which is defined as "very good."
Release 3 UAT consisted of CBP and trade users executing 19 and 23 test
cases, respectively, and rating the release in several areas, again
using a 1 to 5 scale (with 1 indicating "very dissatisfied" and 5
indicating "very satisfied"). The test areas were to address the major
functionality that is new or was significantly changed from Release 2.
UAT average user satisfaction scores were 4.0 or "satisfied" for
trade users and 3.5 or "somewhat satisfied" for CBP users. According to
CBPMO officials, the target score was 4.0. A reason cited for the lower
scores for CBP users was that testing included a large number of less
experienced users, who tended to be more critical of ACE than users who
had more experience with the system.
The pilot period also produced a total of 191 defects, including 5
critical, 74 severe, 48 moderate, and 64 minor defects. CBPMO reported
that 6 of the 74 severe defects remained open at ORR on August 25,
2004.
Similar to the TRR and PRR exit criteria, the criteria for passing
Release 3 ORR stipulated that all critical and severe defects either be
resolved or have work-off plans in place at the time of ORR. According
to CBPMO, all defects that were open at ORR either had an acceptable
work-around in place, or CBPMO expected that they would not adversely
affect the use of the system. However, by definition, severe defects
adversely affect system performance, and if an acceptable work-around
exists, they are categorized as moderate defects, not severe defects.
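The reclassification rule implied by CBPMO's own definitions, that a severe defect with an acceptable work-around should be recorded as moderate, can be expressed as a simple check. The function and argument names here are illustrative:

```python
# Sketch of the classification rule stated above: by CBPMO's own
# definitions, a defect with an acceptable work-around cannot remain
# "severe" and should be recategorized as "moderate".

def normalized_severity(severity: str, has_workaround: bool) -> str:
    """Apply the severe-with-workaround rule; other severities are
    left unchanged, since the report defines the rule only for
    severe defects."""
    if severity == "severe" and has_workaround:
        return "moderate"
    return severity

print(normalized_severity("severe", has_workaround=True))    # moderate
print(normalized_severity("severe", has_workaround=False))   # severe
print(normalized_severity("critical", has_workaround=True))  # critical
```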
Trends in Defects during the Release 3 Testing Period, Including the
Number of Open Defects by Severity Classification at the Time of the
Readiness Reviews:
[See PDF for image]
Source: GAO analysis of eCP data.
[End of figure]
Current state of Release 3 maturity is unclear because defect data
since user acceptance testing are not reliable.
Having current and accurate information on system defect density is
necessary to adequately understand system maturity and to make informed
decisions about allocation of limited resources in meeting competing
priorities. Since the Release 3 ORR, available data show that Release 3
is operating with longstanding defects and that new defects have not
been closed. For example, the defect data as of November 30, 2004, show
that 18 defects that were open at TRR were still open (11 moderate and
7 minor); 33 defects open at PRR were still open (16 moderate and 17
minor); and 92 defects open at ORR were still open (2 severe, 43
moderate, and 47 minor). In addition, the data show that 43 defects
opened since ORR (23 severe, 8 moderate, and 12 minor) were still open
as of November 30, 2004. However, CBPMO officials told us that these
data are not reliable because the focus has been on completing Release
4 testing and pilot activities, at the expense of keeping Release 3
defect data current and accurate. As a result, CBPMO does not currently
have a complete picture of the maturity of each of its releases with
which to make internal resource allocation decisions.
Observation 4: Release 4 test phases were delayed and overlapped, and
revealed a higher than expected volume and significance of defects,
raising questions about decisions to pass key milestones and about the
state of system maturity.
As previously discussed, each ACE release is subject to SIT and SAT,
which are conducted by eCP. Each release also undergoes UAT, which is
conducted by CBP. Generally, the purpose of these tests is to identify
defects or problems in either meeting defined system requirements or in
satisfying system user needs. Defects are documented as PTRs that are
classified by severity. The four severity levels are (1) critical, (2)
severe, (3) moderate, and (4) minor.
Release 4 testing revealed a considerably higher than expected number
of material defects.
Before initiating Release 4 testing, eCP forecasted and planned for
resolution of an expected number of defects: 2,018 total defects were
estimated to be found by the time of PRR, comprising 343 critical,
1,110 severe, 383 moderate, and 182 minor defects. However, at the
time of PRR on November 23, 2004, 3,757 total defects were reported,
which is about 86 percent more than expected. Moreover, the
significance of the defects was underestimated: 835 critical defects
were reported (143 percent more than expected), and 2,224 severe
defects were reported (100 percent more than expected).
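The percentage comparisons follow directly from the reported counts; a quick sketch of the arithmetic:

```python
# Sketch of the defect-overrun arithmetic above, using the report's
# Release 4 figures: expected versus actual defects at PRR.

expected = {"critical": 343, "severe": 1110, "moderate": 383, "minor": 182}
actual_total = 3757
actual = {"critical": 835, "severe": 2224}

def pct_over(actual_n: int, expected_n: int) -> float:
    """Percentage by which the actual count exceeds the expectation."""
    return (actual_n - expected_n) / expected_n * 100

total_over = pct_over(actual_total, sum(expected.values()))
print(f"Total defects: {total_over:.0f} percent more than expected")
print(f"Critical: {pct_over(actual['critical'], expected['critical']):.0f} percent more")
print(f"Severe: {pct_over(actual['severe'], expected['severe']):.0f} percent more")
```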
The following figure depicts the estimated and actual Release 4 defects
according to their severity level.
Release 4 Expected Versus Actual Defects by Severity:
[See PDF for image]
Source: GAO analysis of CBP data.
[End of figure]
eCP officials attributed the difference between estimated and actual
Release 4 defects to their underestimating the complexity of developing
the release, and thus underestimating the likely number of defects.
As a result of this significantly higher than expected number and
severity of defects, eCP drew resources from a later release and, as
discussed later, passed PRR with 5 critical and 37 severe defects.
The following figure depicts the total number of expected Release 4
defects in comparison to the actual number of defects identified.
Release 4 Expected Versus Actual Defects over Time:
[See PDF for image]
Source: GAO analysis of CBP data.
[End of figure]
Release 4 integration and acceptance testing schedule changes resulted
in tests being conducted concurrently.
According to the testing schedule, Release 4 SIT was scheduled to start
on May 12, 2004, and to finish on October 1, 2004. However, SIT was
started on June 28, 2004 (approximately 7 weeks later than planned) and
completed on November 23, 2004 (approximately 8 weeks later than
planned).
According to the same testing schedule, SAT was scheduled to start on
October 19, 2004, and to last 39 days. However, SAT was started on
November 1, 2004, and was completed on November 23, 2004, thus lasting
for 23 days. According to eCP's actual testing schedule, the SAT period
was shortened by 16 days, in order to reduce the impact of previous
schedule delays and conduct the planned PRR by November 23.
Further, the testing schedule planned to have no concurrency between
SIT and SAT. However, SIT and SAT were actually conducted concurrently,
which, as we previously reported, increases risk and contributed to past
ACE cost and schedule problems (see next slide). According to program
officials, rather than waiting for SIT to be fully completed before
starting SAT, they began SAT on Release 4 functionality that
successfully completed SIT.
Release 4 SIT and SAT Time Frames:
[See PDF for image]
Source: GAO analysis of CBP data.
[End of figure]
Release 4 defect profile shows improvements in resolving defects, but
critical and severe defects remain in operational system.
The number of open Release 4 defects peaked on October 8, 2004, when
there were 59 critical, 243 severe, and 59 moderate defects open. CBPMO
reports that since then, many of these defects have been closed.
CBPMO's criteria for successfully passing PRR require that all
critical and severe defects be resolved or have work-off plans. At the
time of PRR on November 23, 2004, CBPMO reported that most defects
were closed, with the exception of 5 critical and 37 severe defects
for which work-off plans had been or were to be established.
However, as of November 30, 2004, which was about 1.5 weeks from
deployment of the Release 4 pilot period, 3 critical defects and 30
severe defects remained open.
The following graph shows the number of defects open each week during
Release 4 testing.
Release 4 Defect Trend:
[See PDF for image]
Source: GAO analysis of CBP data.
[End of figure]
Observation 5: Performance against the revised cost and schedule
estimates for Releases 3 and 4 has been mixed.
Because the Release 3 and 4 contract was experiencing significant cost
and schedule overruns, CBPMO established a new baseline, referred to as
the Over Target Baseline (OTB), in April 2004. Program performance
against the OTB is measured using EVM cost variances and schedule
variances. Release 3 and 4 cost performance against the new baseline
has been positive, but schedule performance has not.
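The EVM variances reported against the OTB are simple differences among three standard earned value quantities: budgeted cost of work scheduled (BCWS, or planned value), budgeted cost of work performed (BCWP, or earned value), and actual cost of work performed (ACWP). A minimal sketch, using illustrative figures rather than actual program data:

```python
def evm_variances(bcws, bcwp, acwp):
    """Earned value variances: cost variance (CV) and schedule variance (SV).
    Positive CV means under budget; positive SV means ahead of schedule."""
    cv = bcwp - acwp   # value of work done minus what it actually cost
    sv = bcwp - bcws   # value of work done minus value of work planned
    return cv, sv

# Illustrative figures in millions (chosen to echo the magnitudes in the
# text; not actual program data):
cv, sv = evm_variances(bcws=101.5, bcwp=100.0, acwp=98.6)
# cv ≈ 1.4 (under budget), sv ≈ -1.5 (behind schedule)
```

A program can thus show a positive cost variance while simultaneously carrying a negative schedule variance, which is exactly the mixed performance picture described in this observation.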
The chart on the following slide illustrates the cumulative cost
variance on Release 3 and 4 since the OTB was established.
As shown below, the Release 3 and 4 contract was about $1.8 million
under budget in September 2004 and about $1.4 million under budget in
October 2004. eCP attributed the recent slip in cost performance to
additional resources being needed to complete Release 4 testing and to
resolve Release 4 defects.
Release 3 and 4 Cumulative Cost Variance, April to October 2004:
[See PDF for image]
Source: CBP.
[End of figure]
In contrast, Release 3 and 4 contract performance has continued to fall
short of the schedule OTB (see below).
[See PDF for image]
Source: CBP.
[End of figure]
As shown on the previous slide, eCP recovered about $1.4 million of the
schedule variance between August 2004 and October 2004 but still has
not completed $1.5 million worth of scheduled work. According to eCP,
the recent improvement in schedule performance reflects recent
completion of such work as Release 4 testing.
While cost performance on Release 3 and 4 has been positive since the
new baseline was established, schedule performance has not. In order to
meet Release 4 schedule commitments, resources have been held on
Release 4 longer than planned to complete testing and resolve defects.
While this has resulted in an improvement in schedule performance in
September and October 2004, it has also contributed to a slip in cost
performance in October 2004. Continuing to devote extra resources to
meet the Release 4 schedule could further impact the currently positive
cost variance.
Observation 6: The fiscal year 2005 expenditure plan does not
adequately describe progress against commitments (e.g., ACE
capabilities, schedule, cost, and benefits) made in previous plans.
ACE is intended to provide greater security at our nation's borders
while improving import and export processing, and its latest life-cycle
cost estimate is about $3.1 billion. Given ACE's immense importance and
sizable cost and complexity, the Congress has placed limitations on the
use of program funds until it is assured, through the submission of
periodic expenditure plans, that the program is being well managed.
As we have previously reported, to permit meaningful congressional
oversight, it is important that expenditure plans describe how well CBP
is progressing against the commitments made in prior expenditure plans.
[NOTE 1] However, the fiscal year 2005 expenditure plan did not
adequately describe such progress. In particular, in its fiscal year
2004 expenditure plan, CBPMO committed to, for example,
* acquiring infrastructure (e.g., system environments, facilities,
telecommunications, and licenses) for ACE releases and
* defining and designing the ACE release (designated Release 6 at the
time) that is intended to provide additional account management
functionality.
The fiscal year 2005 plan, however, did not address progress against
these commitments. For example, the plan did not describe the status of
infrastructure acquisition, nor did it discuss the expenditure of the
$106.6 million requested for this purpose. While the plan did discuss
the status of the initial ACE releases, it did not describe progress
toward defining and designing the functionality that was to be in the
former Release 6.
Also, the fiscal year 2005 expenditure plan included a schedule for
developing ACE releases, but neither reported progress relative to the
schedule presented in the fiscal year 2004 plan nor explained how the
individual releases and their respective schedules were affected by the
rebaselining that occurred after the fiscal year 2004 plan was
submitted.
Further, while the fiscal year 2005 expenditure plan contained high-
level descriptions of the functionality provided by Releases 1 and 2,
it did not describe progress toward achieving the benefits they are
expected to provide.
Without such information, meaningful congressional oversight of CBP
progress and accountability is impaired.
NOTES:
[1] GAO, Information Technology: Homeland Security Needs to Improve
Entry Exit System Expenditure Planning, GAO-03-563:
Observation 7: Some key bases for the commitments made in the fiscal
year 2005 expenditure plan have changed, raising questions as to the
plan's currency and relevance.
The ACE fiscal year 2005 expenditure plan is based largely on the July
8, 2004, ACE Program Plan. This July plan represents the program's
authoritative and operative guiding document or plan of action.
Briefly, it describes such things as the ACE release construct,
development methodology, deployment strategy, organizational change
approach, training approach, and role/responsibility assignments. It
also identifies key assumptions made in formulating the plan, provides
a schedule for accomplishing major program activities, and contains
estimates of costs for the total program and major activities.
Recent program developments and program changes have altered some key
bases (e.g., assumptions, release construct, organizational change
management approach, and roles and responsibilities) of the ACE program
plan, and thus the current expenditure plan. As a result, questions
arise as to the extent to which the expenditure plan's commitments
remain current and relevant.
A key Release 5 assumption underpinning the program and expenditure
plans is no longer valid.
Release 5 is to include the capability to receive a multimodal manifest
that can be screened for risk indicators. According to the ACE program
plan, delivery of this capability is to be accomplished using the SAP
software product, which the SAP vendor was expected to enhance because
its product does not currently contain the functionality to accommodate
multimodal manifests. This expectation for product enhancement, within
certain time and resource constraints, was an assumption in the ACE
program plan, and was to be accomplished under a contract between eCP
and the SAP vendor.
Following the program plan's approval, initial development of Release 5
began (e.g., planning for the release, negotiations to enhance the SAP
product, development of release initiation documents, conduct of
release functionality workshops). However, CBPMO has recently decided
not to use SAP to provide the multimodal manifest functionality, thus
rendering a key assumption in the program plan and the expenditure plan
invalid. CBPMO has since suspended all work to develop the multimodal
manifest functionality until a new approach to developing it is
established. According to ACE officials, this change is intended to
result in providing the multimodal manifest functionality faster and at
lower cost.
Additional release(s) now planned that were not reflected in the
program and expenditure plans.
CBPMO now plans to add at least one new ACE release. According to CBPMO
officials, various user groups expressed the need for additional
Release 4 functionality during the development of this release:
functionality that was not in the scope of Release 4 and includes, for
example, the capability for trade users to look up transactions and
for carriers to receive feedback on the release of vehicles. In
addition, the need for ACE to more easily accommodate new legislative
mandates was identified. Therefore, a Release 4 enhancement, referred
to as Release 4.1, has been added to the ACE release construct.
In October, CBPMO defined high-level functional requirements for
Release 4.1, and it is currently defining more detailed requirements.
However, this additional release, including its scope, costs, and
schedule, is not reflected in the current ACE program plan or the
fiscal year 2005 expenditure plan. According to program officials, any
enhancement releases will not be reflected in the program plan until
its next major update (August 2005), which is after CBPMO anticipates
having implemented Release 4.1, and the first expenditure plan that
could recognize it is the fiscal year 2006 plan.
ACE officials also stated that the costs of Release 4.1 and any
additional releases will be funded by operations and maintenance funds
provided for in the expenditure plan.
The current organizational change management approach is not fully
reflected in program and expenditure plans, and key change management
actions are not to be implemented.
As we have previously reported, best practices for acquiring and
implementing commercial component-based systems include ensuring that
the organizational impact of introducing functionality embedded in the
commercial software products, like SAP, is proactively managed. [NOTE
1] Accordingly, about 2 years ago we first discussed with ACE program
executives the need to proactively prepare users for role,
responsibility, and business process changes associated with ACE
implementation. To its credit, the ACE program plan describes the
organizational change approach that is to be pursued to position CBP
for these changes. Specifically, the plan discusses three primary
activities that are to be performed: communicating and reaching out to
stakeholders; providing training; and establishing a performance
measurement structure.
On August 10, 2004, a revised organizational change approach was
introduced that adds new change management activities. As of November
2004, some of these activities are being implemented or are planned
for implementation.
NOTES:
[1] GAO, Information Technology: DOD's Acquisition Policies and
Guidance Need to Incorporate Additional Best Practices and Controls,
GAO-04-722 (Washington, D.C.: July 2004).
These activities include conducting a communications campaign, mapping
employee roles with position descriptions, and providing learning aids
and help desk support.
However, because this revised organizational change approach was
finalized more than a month after the ACE Program Plan was completed,
neither the program plan nor the fiscal year 2005 expenditure plan
fully reflects the changes.
Moreover, because the ACE funding request for fiscal year 2005 did not
fully reflect the revised approach to managing organizational change,
key actions associated with the revised approach are not planned for
implementation in fiscal year 2005. For example, one key action was to
establish and communicate ACE usage targets, which would both encourage
ACE usage and permit performance to be measured. This is important,
according to eCP, because users may continue to rely on ACS, which
would preclude accrual of full ACE benefits. CBPMO officials stated
that each of the key actions that will not be implemented introduces
risks that must be mitigated. Formal program risks and associated
mitigation plans are currently under development. The following slide
summarizes change management actions in the revised approach that are
not planned for implementation and their associated risks.
Actions not planned for implementation: Establish and communicate
targets for ACE usage to encourage users to use ACE rather than ACS.
Risk statements: If ACS remains available to ACE users, they may
continue to use the legacy system, and as a result the full benefits of
ACE will not be realized.
Actions not planned for implementation: Before training, make users
aware of the major differences between ACS and ACE.
Risk statements: If ACE users do not understand the differences between
the legacy systems and ACE, then the users will not understand how best
to use ACE, which may result in resistance to the new system and
processes.
Actions not planned for implementation: Discuss the future needs of CBP
to establish new roles and responsibilities within the Office of
Information and Technology (OIT).
Risk statements: If future roles of the OIT are not established, then
OIT may not be prepared to provide technical support when ACE is
transferred from eCP to OIT.
Actions not planned for implementation: Send staff to visit ports to
build critical knowledge regarding organizational change objectives.
Risk statements: If staff do not have adequate access to
representatives of occupational groups at each port, then
communications, training, and deployment efforts cannot be customized
to each group's needs. This may delay or disrupt ACE adoption.
Source: CBP.
[End of table]
Recent changes to the respective roles and responsibilities of the ACE
development contractor and CBPMO are not reflected in the program and
expenditure plans.
As previously mentioned, on April 27, 2001, eCP was awarded a contract
to develop and deploy ACE. The strategy was for the government to play
the role of the system acquirer and to leverage the expertise of eCP,
which was to be the system developer. Accordingly, CBPMO has since been
responsible for performing system acquisition functions (e.g., contract
tracking and oversight, evaluation of acquired products and services,
and risk management), and eCP has been responsible for system
development functions (e.g., requirements development; design,
development, testing, and deployment of Releases 1, 2, 3, and 4; and
related services, including architecture and engineering). These
respective roles and responsibilities are reflected in the ACE program
plan, and thus the fiscal year 2005 expenditure plan.
According to CBPMO officials, these respective roles and
responsibilities are being realigned so that CBPMO and eCP will share
ACE development duties. That is, CBPMO will be responsible for certain
ACE development and deployment efforts as well as for oversight of the
development efforts for which eCP will retain responsibility. eCP will
also provide support to CBPMO's development efforts.
More detailed information on how this change in roles and
responsibilities will be operationalized was not yet available.
Moreover, this change in approach is not reflected in either the ACE
program plan or the fiscal year 2005 expenditure plan.
Nevertheless, this change in approach is significant, and thus it is
important that it be managed carefully. As we previously reported,
effective management of a large-scale systems modernization program,
like ACE, requires a clear allocation of the respective roles and
responsibilities of the government and the contractor, [NOTE 1]
particularly with regard to responsibility for integrating system
components developed by different parties. The extent to which these
are made explicit and unambiguous will go a long way in ensuring proper
accountability for performance.
NOTES:
[1] GAO, Tax Systems Modernization: Results of Review of IRS' Initial
Expenditure Plan, GAO/AIMD/GGD-99-206 (Washington, D.C.: June 1999).
Conclusions:
DHS and OMB have largely satisfied four of the five conditions
associated with the fiscal year 2005 ACE expenditure plan that were
legislated by the Congress, and we have satisfied the fifth condition.
Further, CBPMO has continued to work toward implementing our prior
recommendations aimed at improving management of the ACE program and
thus the program's chances of success. Nevertheless, progress has been
slow in addressing some of our recommendations, such as the one
encouraging proactive management of the relationships between ACE and
other DHS border security programs, like US-VISIT. Given that these
programs have made and will continue to make decisions that determine
how they will operate, delays in managing their relationships will
increase the chances that later system rework will eventually be
required to allow the programs to interoperate.
Additionally, while DHS has taken important actions to help address ACE
release-by-release cost and schedule overruns that we previously
identified, it is unlikely that the effect of these actions will
prevent the past pattern of overruns from recurring. This is because
DHS has met its recently revised cost and schedule commitments in part
by relaxing system quality standards, so that milestones are being
passed despite material system defects, and because correcting such
defects will ultimately require the program to expend resources, such
as people and test environments, at the expense of later system
releases (some of which are now under way).
In the near term, cost and schedule overruns on recent releases are
being somewhat masked by the use of less stringent quality standards;
ultimately, efforts to fix these defects will likely affect the
delivery of later releases. Until accountability for ACE is redefined
and measured in terms of all types of program commitments (system
capabilities, benefits, costs, and schedules), the program will likely
experience more cost and schedule overruns.
During the last year, DHS's accountability for ACE has been largely
focused on meeting its cost and schedule baselines. This focus is
revealed by the absence of information in the latest expenditure plan
on progress against all commitments made in prior plans, particularly
with regard to measurement and reporting on such things as system
capabilities, use, and benefits. It is also shown by the program's
insufficient focus on system quality, as demonstrated by its
willingness to pass milestones despite material defects, and by the
absence of attention to the current defect profile for Release 3 (which
is already deployed).
Moreover, the commitments that DHS made in the fiscal year 2005
expenditure plan have been overcome by events, which limits the
currency and relevance of this plan and its utility to the Congress as
an accountability mechanism. As a result, the prospects of greater
accountability in delivering against its capability, benefit, cost, and
schedule commitments are limited. Therefore, it is critically important
that DHS define for itself and the Congress an accountability framework
for ACE, and that it manage and report in accordance with this
framework. If it does not, the effects of the recent rebaselining of
the program will be short lived, and the past pattern of ACE costing
more and taking longer than planned will continue.
Recommendations:
To strengthen accountability for the ACE program and better ensure that
future ACE releases deliver promised capabilities and benefits within
budget and on time, we recommend that the DHS Secretary, through the
Under Secretary for Border and Transportation Security, direct the
Commissioner, Customs and Border Protection, to define and implement an
ACE accountability framework that ensures:
* coverage of all program commitment areas, including key expected or
estimated system (1) capabilities, use, and quality; (2) benefits and
mission value; (3) costs; and (4) milestones and schedules;
* currency, relevance, and completeness of all such commitments made to
the Congress in expenditure plans;
* reliability of data relevant to measuring progress against
commitments;
* reporting in future expenditure plans of progress against commitments
contained in prior expenditure plans;
* use of criteria for exiting key readiness milestones that adequately
consider indicators of system maturity, such as severity of open
defects; and:
* clear and unambiguous delineation of the respective roles and
responsibilities of the government and the prime contractor.
Agency Comments:
In their oral comments on a draft of this briefing, DHS and CBP
officials, including the DHS Chief Information Officer (CIO), the
Border and Transportation Security CIO, and the CBP Acting CIO,
generally agreed with our findings, conclusions, and recommendations
and stated that the briefing was fair and balanced. They also provided
clarifying information that we incorporated as appropriate in this
briefing.
Attachment 1:
Scope and Methodology:
To accomplish our objectives, we analyzed the ACE fiscal year 2005
expenditure plan and supporting documentation, comparing them to
relevant federal requirements and guidance, applicable best practices,
and our prior recommendations. We also interviewed DHS and CBP
officials and ACE program contractors. In particular, we reviewed:
* DHS and CBP investment management practices, using OMB A-11, part 7;
* DHS and CBP activities for ensuring ACE compliance with the DHS
enterprise architecture;
* DHS and CBP acquisition management efforts, using SEI's SA-CMM;
* CBP cost estimating program and cost estimates, using SEI's
institutional and project-specific estimating guidelines;[NOTE 1]
NOTES:
[1] SEI's institutional estimating guidelines are defined in Checklists
and Criteria for Evaluating the Cost and Schedule Estimating
Capabilities of Software Organizations, and SEI's project-specific
estimating guidelines are defined in A Manager's Checklist for
Validating Software Cost and Schedule Estimates.
* CBP actions to coordinate ACE with US-VISIT using program
documentation;
* ACE testing plans, activities, system defect data, and system
performance data using industry best practices;
* independent verification and validation (IV&V) activities using the
Institute of Electrical and Electronics Engineers Standard for Software
Verification and Validation; [NOTE 1]:
* CBP establishment and use of performance measures using the draft
Performance Metrics Plan and eCP's cost performance reports;
* ACE's performance using service level agreements;
NOTES:
[1] Institute of Electrical and Electronics Engineers (IEEE) Standard
for Software Verification and Validation, IEEE Std 1012-1998 (New York:
Mar. 9, 1998).
* CBP's progress toward increasing the number of ACE user accounts,
against established targets;
* ACE's quality, using eCP defect data and testing results for Releases
3 and 4; and:
* cost and schedule data and program commitments from program
management documentation.
For DHS-, CBP-, and contractor-provided data that our reporting
commitments did not permit us to substantiate, we have made appropriate
attribution indicating the data's source.
We conducted our work at CBP headquarters and contractor facilities in
the Washington, D.C., metropolitan area from April 2004 through
December 2004, in accordance with generally accepted government
auditing standards.
[End of section]
Appendix II: Comments from the U.S. Department of Homeland Security:
U.S. Department of Homeland Security:
Washington, DC 20528:
February 22, 2005:
Mr. Randolph C. Hite:
Director, Information Technology Architecture and Systems Issues:
U.S. Government Accountability Office:
Washington, DC 20548:
Re: Draft Report GAO-05-267SU, Information Technology: Customs
Automated Commercial Environment Program Progressing, but Need for
Management Improvements Continues:
Dear Mr. Hite:
Thank you for the opportunity to review and comment on the subject
draft report. We are providing general comments for your use in
preparing the final report and have submitted technical comments under
separate cover.
The Department of Homeland Security (DHS) agrees with the status of
open recommendations and recommendations for DHS executive action. The
GAO report indicates that earlier recommendations regarding independent
verification and validation, and the Automated Commercial Environment
(ACE) acquisition schedule, have been satisfied, and DHS concurs.
DHS's Customs and Border Protection Modernization Office (CBPMO)
continues to address the remaining open recommendations regarding: (1)
cost estimating; (2) human capital management; (3) use of ACE for other
DHS applications; (4) program management metrics and measurements; and
(5) quarterly reporting to Congress. The Department notes that because
of their recurring nature, aspects of recommendation 1, and
recommendations 3 and 5 above will likely remain open for the life of
the program. DHS program officials intend to coordinate further with
GAO representatives to ensure understanding and agreement on closure
criteria for all open recommendations.
In its report, the GAO emphasized one overarching recommendation for
the ACE program. This recommendation requires that DHS define and
implement an ACE accountability framework that better ensures future
ACE releases deliver promised capability and benefits, within budget
and on time.
Since program inception, ACE program managers anticipated that the
scale and technical complexity of the ACE program would result in
changes to the program. In September 2001, it became clear that world
events would also change the nature of the program to be more focused
on border security. With this backdrop, the Department has two key
objectives for the program - develop ACE capabilities sooner and at
less cost, and ensure those capabilities hit the mark when fielded. To
achieve both objectives, sound decision processes and clear quality
standards have been established.
DHS has in place a solid program management foundation of acquisition
processes, program analysis and reporting mechanisms, and management
systems to effectively manage the program. This is complemented by
strong stakeholder relationships that support the development of ACE
requirements, and provide feedback on ACE capabilities. Though the
addition of post-9/11 security requirements has resulted in a longer
ACE development schedule than originally planned, this existing
foundation has indeed helped DHS ensure program accountability,
including keeping program costs within 10 percent of the program
baseline, and managing the program within approved program funding.
Likewise, DHS has followed its established processes to balance
quality, cost, and schedule objectives. For example, specific criteria
are established for all ACE development milestone reviews. The process
requires verification that all problems have been resolved or have
viable resolution plans before the milestone is considered successfully
accomplished. Problem areas are prioritized and assessed to determine
whether deferring closure to post-milestone review resolution is an
acceptable risk. The resolution plans are implemented and tracked
closely until the problem is resolved. This process reflects careful
consideration and deliberate decisions by DHS officials as they seek to
balance program objectives.
The ACE program continues to make progress toward developing and
deploying those capabilities that will better detect and act on threats
to the United States and our fellow citizens, and ensure the efficient
flow of legitimate trade across our borders. ACE users have indicated
their enthusiasm for the account management capabilities that have
already been deployed, and DHS has implemented an automated truck
manifest pilot that is setting the stage for broad expansion of ACE
capabilities in the coming year.
Also, the CBPMO has been reorganized to enhance government oversight of
ACE support contractors. This reorganization will foster organizational
cohesion and integration among Office of Information and Technology
(OIT) staff agencies, and expand Modernization/ACE program ownership
and commitment within OIT. This reorganization does not change the
roles and responsibilities or relationship between the government and
the e-Customs Partnership (eCP), which continues its role as the ACE
systems integration contractor.
Acknowledging the six subordinate elements of the new GAO
recommendation, DHS will build on the existing program management
foundation and the aforementioned reorganization to further define and
enhance its accountability framework. As the accountable DHS official,
the CBPMO Executive Director is committed to taking the following
actions to improve the ACE program accountability framework:
* Establish a clear delineation of roles and responsibilities between
Customs and Border Protection and the prime contractor (eCP). This will
be accomplished as part of the ACE acquisition strategy. This effort
will also drive the continued development and refinement of individual
roles and responsibilities as part of the CBPMO Strategic Human Capital
Management Program, which is covered under a separate GAO
recommendation. The overall Human Capital Management effort will
continue to be grounded in the established Human Capital Management
Strategic Plan and the ten human capital principles emphasized by GAO
(January 2000 GAO report Human Capital: Key Principles from Nine
Private Sector Organizations).
* Establish a formal document that defines the ACE program
accountability framework, its key elements, and a description of how it
is being implemented. This document will further depict the decision-
making mechanisms for the ACE program.
* In conjunction with the GAO review of the Fiscal Year 2006 (FY06)
Expenditure Plan:
- Demonstrate coverage, currency, relevance, and completeness of all
program commitment areas, and the reliability of the data that
measures progress on these commitments, as outlined by GAO in its
March 2005 report. To satisfy this element of the GAO recommendation,
the CBPMO will include the status of FY05 Expenditure Plan commitments,
and show alignment with other key program documents.
- Demonstrate the application of milestone exit criteria that
adequately consider indicators of system maturity.
As stewards of the taxpayers' dollars, and mindful of the threat posed
by those who would harm our citizens and disrupt our American way of
life, the Department and the entire ACE team remain deeply committed to
the ACE program. The Department is working diligently to ensure the
program is managed within the targets established by the ACE program
plan, timely reporting of progress against that plan, and when
necessary, changes to the program baseline to deliver the capabilities
needed to ensure the safety and economic security of our Nation. The
ACE program team values the GAO role and the relationship it has with
its representatives, and looks forward to working together with them to
achieve the objectives embodied in this report.
We thank you again for the opportunity to provide comments on this
draft report and look forward to working with you on future homeland
security issues.
Sincerely,
Signed by:
Steven J. Pecinovsky:
Acting Director, Departmental GAO/OIG Liaison:
Office of the Chief Financial Officer:
[End of section]
Appendix III: Contacts and Staff Acknowledgments:
GAO Contacts:
Mark T. Bird, (202) 512-6260:
Staff Acknowledgments:
In addition to the person named above, Carol Cha, Barbara Collier,
William Cook, Neil Doherty, Nnaemeka Okonkwo, and Shannin O'Neill made
key contributions to this report.
(310297):
FOOTNOTES
[1] CBP's ACE life-cycle cost estimate not adjusted for risk is about
$3.1 billion.
[2] Pub. L. 108-334 (Oct. 18, 2004).
[3] An enterprise architecture is an institutional blueprint for
guiding and constraining investments in programs like ACE.
[4] SEI's institutional and project-specific estimating guidelines are
defined respectively in Robert E. Park, Checklists and Criteria for
Evaluating the Cost and Schedule Estimating Capabilities of Software
Organizations, CMU/SEI-95-SR-005, and A Manager's Checklist for
Validating Software Cost and Schedule Estimates, CMU/SEI-95-SR-004
(Pittsburgh, Pa.: Carnegie Mellon University Software Engineering
Institute, 1995).
[5] The Automated Commercial System is CBP's system for tracking,
controlling, and processing imports to the United States.
[6] The Automated Targeting System is CBP's system for identifying
import shipments that warrant further attention.
[7] GAO, Information Technology: Early Releases of Customs Trade System
Operating, but Pattern of Cost and Schedule Problems Needs to Be
Addressed, GAO-04-719 (Washington, D.C.: May 14, 2004).
[8] US-VISIT is a governmentwide program to collect, maintain, and
share information on foreign nationals in order to enhance national
security and facilitate legitimate trade and travel while adhering to
U.S. privacy laws.
[9] According to a CBP official, the IV&V contract was awarded on
December 30, 2004.
[10] Earned value management is a method of measuring contractor
progress toward meeting deliverables by comparing the value of work
accomplished during a given period with that of the work expected in
that period.
GAO's Mission:
The Government Accountability Office, the investigative arm of
Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site. The list contains links to the full-text document files. To
have GAO e-mail this list to you every afternoon, go to www.gao.gov and
select "Subscribe to e-mail alerts" under the "Order GAO Products"
heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and MasterCard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548
To Order by Phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470
Public Affairs:
Jeff Nelligan, Managing Director,
NelliganJ@gao.gov
(202) 512-4800
U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548