This is the accessible text file for GAO report number GAO-09-841
entitled 'DOD Business Systems Modernization: Navy Implementing a
Number of Key Management Controls on Enterprise Resource Planning
System, but Improvements Still Needed' which was released on
September 15, 2009.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
United States Government Accountability Office:
GAO:
September 2009:
DOD Business Systems Modernization:
Navy Implementing a Number of Key Management Controls on Enterprise
Resource Planning System, but Improvements Still Needed:
GAO-09-841:
GAO Highlights:
Highlights of GAO-09-841, a report to congressional requesters.
Why GAO Did This Study:
The Department of Defense (DOD) has long been challenged in effectively
implementing key acquisition management controls on its thousands of
business system investments. For this and other reasons, GAO has
designated DOD's business systems modernization efforts as high-risk
since 1995. One major business system investment is the Navy's
Enterprise Resource Planning (ERP) system. Initiated in 2003, it is to
standardize the Navy's business processes, such as acquisition and
financial management. It is being delivered in increments, the first of
which is to cost about $2.4 billion over its 20-year useful life and be
fully deployed by fiscal year 2013. To date, the program has
experienced about $570 million in cost overruns and a 2-year schedule
delay. GAO was asked to determine whether (1) system testing is being
effectively managed, (2) system changes are being effectively
controlled, and (3) independent verification and validation (IV&V)
activities are being effectively managed. To do this, GAO analyzed
relevant program documentation, traced random samples of test defects
and change requests, and interviewed cognizant officials.
What GAO Found:
The Navy has largely implemented effective controls on Navy ERP
associated with system testing and change control. For example, it has
established a well-defined structure for managing tests, including
providing for a logical sequence of test events, adequately planning
key test events, and documenting and reporting test results. In
addition, it has documented, and is largely following, its change
request review and approval process, which reflects key aspects of
relevant guidance, such as having defined roles and responsibilities
and a hierarchy of control boards. However, important aspects of test
management and change control have not been fully implemented.
Specifically, the program's tool for auditing defect management did not
always record key data about changes made to the status of identified
defects. To its credit, the program office recently took steps to
address this, thereby reducing the risk of defect status errors or
unauthorized changes. Also, while the program office's change review
and approval procedures include important steps, such as considering
the impact of a change, and program officials told GAO that cost and
schedule impacts of a change are discussed at control board meetings,
GAO's analysis of 60 randomly selected change requests showed no
evidence that cost and schedule impacts were in fact considered.
Without such key information, decision-making authorities lack an
adequate basis for making informed investment decisions, which could
result in cost overruns and schedule delays.
The Navy has not effectively managed its IV&V activities, which are
designed to obtain an unbiased position on whether product and process
standards are being met. In particular, the Navy has not ensured that
the IV&V contractor is independent of the products and processes that
it is reviewing. Specifically, the same contractor responsible for
performing IV&V of Navy ERP products (e.g., system releases) is also
responsible for ensuring that system releases are delivered within cost
and schedule constraints. Because performance of this system
development and management role makes the contractor potentially unable
to render impartial assistance to the government in performing the IV&V
function, there is an inherent conflict of interest. In addition, the
IV&V agent reports directly and solely to the program manager and not
to program oversight officials. As GAO has previously reported, the
IV&V agent should report the findings and associated risks to program
oversight officials, as well as program management, in order to better
ensure that the IV&V results are objective and that the officials
responsible for making program investment decisions are fully informed.
Furthermore, the contractor has largely not produced the range of IV&V
deliverables that were contractually required between 2006 and 2008. To
its credit, the program office recently began requiring the contractor
to provide assessment reports, as required under the contract, as well
as formal quarterly reports; the contractor delivered the results of
the first planned assessment in March 2009. Notwithstanding the recent
steps that the program office has taken, it nevertheless lacks an
independent perspective on the program's products and management
processes.
What GAO Recommends:
GAO is making recommendations to the Secretary of Defense aimed at
improving the program's system change request review and approval
process and its IV&V activities. DOD concurred with the recommendations
and identified actions that it plans to take.
View [hyperlink, http://www.gao.gov/products/GAO-09-841] or key
components. For more information, contact Randolph C. Hite at (202) 512-
3439 or hiter@gao.gov.
[End of section]
Contents:
Letter:
Background:
Key Aspects of Navy ERP Testing Have Been Effectively Managed:
System Changes Have Been Controlled, but Their Cost and Schedule
Impacts Were Not Sufficiently Considered:
Navy ERP IV&V Function Is Not Independent and Has Not Been Fully
Performed:
Conclusions:
Recommendations for Executive Action:
Agency Comments:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Comments from the Department of Defense:
Appendix III: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Navy Systems Commands and Their Responsibilities:
Table 2: Navy ERP Template 1 Releases:
Table 3: Organizations Responsible for Navy ERP Oversight and
Management:
Table 4: Navy ERP Program Contracts:
Table 5: Description of the Purpose of Navy ERP Tests:
Table 6: Navy ERP Testing-Related Organizations and Respective Roles
and Responsibilities:
Table 7: Roles and Responsibilities for Change Review and Approval:
Figures:
Figure 1: Navy ERP Timeline:
Figure 2: Navy ERP Life-Cycle Cost Estimates in Fiscal Years 2003,
2004, and 2007:
Figure 3: Navy ERP Deployment Schedule:
Figure 4: Release 1.0 and 1.1 Test Activity Schedule:
Abbreviations:
DOD: Department of Defense:
DON: Department of the Navy:
ERP: Enterprise Resource Planning:
FISC: Fleet and Industrial Supply Center:
FOC: full operational capability:
FOT&E: follow-on operational test and evaluation:
GCSS-MC: Global Combat Support System--Marine Corps:
GDIT: General Dynamics Information Technology:
IOC: initial operational capability:
IOT&E: initial operational test and evaluation:
IST: integrated system testing:
IT: information technology:
IV&V: independent verification and validation:
MDA: milestone decision authority:
NAVAIR: Naval Air Systems Command:
NAVSEA: Naval Sea Systems Command:
NAVSUP: Naval Supply Systems Command:
NTCSS: Naval Tactical Command Support System:
OT&E: operational test and evaluation:
SAP: Systems Applications and Products:
SPAWAR: Space and Naval Warfare Systems Command:
TEMP: Test and Evaluation Master Plan:
UAT: user acceptance testing:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
September 15, 2009:
The Honorable Evan Bayh:
Chairman:
The Honorable Richard Burr:
Ranking Member:
Subcommittee on Readiness and Management Support:
Committee on Armed Services:
United States Senate:
The Honorable John Ensign:
United States Senate:
For decades, the Department of Defense (DOD) has been challenged in
modernizing its timeworn business systems.[Footnote 1] In 1995, we
designated DOD's business systems modernization program as high-risk,
and continue to do so today.[Footnote 2] Our reasons include the
modernization's large size and complexity and its critical role in
addressing other high-risk areas, such as overall business
transformation and financial management. Moreover, we continue to
report on business system investments that fail to effectively employ
acquisition management controls and deliver promised benefits and
capabilities on time and within budget.[Footnote 3]
Nevertheless, DOD continues to invest billions of dollars in thousands
of these business systems, 11 of which account for about two-thirds of
the department's annual spending on business programs. The Navy
Enterprise Resource Planning (ERP) program is one such program.
Initiated in 2003, Navy ERP is to standardize the Navy's acquisition,
financial, program management, plant and wholesale supply, and
workforce management business processes across its dispersed
organizational environment. As envisioned, the program consists of a
series of major increments, the first of which includes three releases
and is expected to cost approximately $2.4 billion over its 20-year
life cycle and to be fully operational in fiscal year 2013. We recently
reported that Navy ERP program management weaknesses had contributed to
a 2-year schedule delay and about $570 million in cost overruns.
[Footnote 4]
As agreed, our objectives were to determine whether (1) system testing
is being effectively managed, (2) system changes are being effectively
controlled, and (3) independent verification and validation (IV&V)
activities are being effectively managed. To accomplish this, we
analyzed relevant program documentation, such as test management
documents, individual test plans and procedures and related test
results and defect reports; system change procedures and specific
change requests and decisions; change review board minutes; and
verification and validation plans and contract documents. We also
observed the use of tools for recording and tracking test defects and
change requests, including tracing a statistically valid sample of
transactions through these tools.
We conducted this performance audit from August 2008 to September 2009,
in accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for our findings and
conclusions based on our audit objectives. Additional details on our
objectives, scope, and methodology are in appendix I.
Background:
The Department of the Navy's (DON) primary mission is to organize,
train, maintain, and equip combat-ready naval forces capable of winning
wars, deterring aggression by would-be foes, preserving freedom of the
seas, and promoting peace and security. Its operating forces, known as
the fleet, are supported by four systems commands. Table 1 provides a
brief description of each command's responsibilities.
Table 1: Navy Systems Commands and Their Responsibilities:
Systems command: Naval Air Systems Command (NAVAIR);
Responsibilities: Developing, delivering, and supporting aircraft and
weapons used by sailors and marines.
Systems command: Naval Supply Systems Command (NAVSUP);
Responsibilities: Providing supply, fuel, transportation, and other
logistics programs.
Systems command: Space and Naval Warfare Systems Command (SPAWAR);
Responsibilities: Developing, delivering, and supporting specialized
command and control technologies, business information technology, and
space capabilities.
Systems command: Naval Sea Systems Command (NAVSEA);
Responsibilities: Acquiring and maintaining the department's ships and
submarines.
Source: GAO analysis of DON data.
[End of table]
To support the department's mission, these commands perform a variety
of interrelated and interdependent business functions (e.g.,
acquisition and financial management), relying heavily on business
systems to do so. In fiscal year 2009, DON's budget for business
systems and associated infrastructure was about $2.7 billion, of which
about $2.2 billion was allocated to operations and maintenance of
existing systems and about $500 million to systems in development and
modernization. Of the approximately 2,480 business systems that DOD
reports having, DON accounts for 569, or about 23 percent, of the
total. Navy ERP is one such system investment.
Navy ERP: A Brief Description:
In July 2003, the Assistant Secretary of the Navy for Research,
Development, and Acquisition established Navy ERP to converge the
functionality of four pilot systems that were under way at the four
commands into one system.[Footnote 5] According to DOD, Navy ERP is to
address the Navy's long-standing problems related to financial
transparency and asset visibility. Specifically, the program is
intended to standardize the Navy's acquisition, financial, program
management, plant and wholesale supply, and workforce management
business processes across its dispersed organizational components, and
support about 86,000 users when fully implemented.
Navy ERP is being developed in a series of increments using the Systems
Applications and Products (SAP) commercial software package, augmented
as needed by customized software. SAP consists of multiple, integrated
functional modules that perform a variety of business-related tasks,
such as finance and acquisition. The first increment, called Template
1, is currently the only funded portion of the program and consists of
three releases (1.0, 1.1, and 1.2).[Footnote 6] Release 1.0, Financial
and Acquisition, is the largest of the three releases in terms of
Template 1 functional requirements.[Footnote 7] See table 2 for a
description of these releases.
Table 2: Navy ERP Template 1 Releases:
Release: 1.0 Financial and Acquisition;
Functionality:
* General Fund and Navy Working Capital Fund finance applications, such
as billing, budgeting, and cost planning;
* Acquisition applications, such as activity-based costing, contract
awards, and budget exhibits;
* Workforce management applications, such as personnel administration
and training, as well as events management.
Release: 1.1 Wholesale and Retail Supply;
Functionality:
* Wholesale applications, such as supply and demand planning, order
fulfillment, and supply forecasting;
* Retail supply applications, such as inventory management, supply and
demand processing, and warehouse management.
Release: 1.2 Intermediate-Level Maintenance;
Functionality:
* Maintenance applications, such as maintenance management, quality
management, and calibration management.
Source: GAO analysis of DON data.
[End of table]
DON estimates the life-cycle cost for Template 1 to be about $2.4
billion, including about $1 billion for acquisition and $1.4 billion
for operations and maintenance. The program office reported that
approximately $600 million was spent from fiscal year 2004 through
fiscal year 2008. For fiscal year 2009, about $190 million is planned
to be spent.
Program Oversight, Management, and Contractor Roles and
Responsibilities:
To acquire and deploy Navy ERP, DON established a program management
office within the Program Executive Office for Executive Information
Systems. The program office manages the program's scope and funding and
is responsible for ensuring that the program meets its key objectives.
To accomplish this, the program office performs program management
functions, including testing, change control, and IV&V. In addition,
various DOD and DON organizations share program oversight and review
activities. A listing of key entities and their roles and
responsibilities is provided in table 3.
Table 3: Organizations Responsible for Navy ERP Oversight and
Management:
Entity: Under Secretary of Defense for Acquisition, Technology, and
Logistics;
Roles and responsibilities: Serves as the milestone decision authority
(MDA), which, according to DOD, has overall responsibility for the
program, to include approving the program to proceed through its
acquisition cycle on the basis of, for example, independent operational
test evaluation and certification.
Entity: Assistant Secretary of the Navy, Research, Development, and
Acquisition;
Roles and responsibilities: Serves as DON's oversight organization for
the program, to include enforcement of Under Secretary of Defense for
Acquisition, Technology, and Logistics policies and procedures.
Entity: DON, Program Executive Office for Executive Information
Systems;
Roles and responsibilities: Oversees a portfolio of large-scale
projects and programs designed to enable common business processes and
provide standard capabilities, to include reviewing and approving
overarching test plans and user acceptance test readiness.
Entity: Navy ERP Senior Integration Board;
Roles and responsibilities: Reviews progress in attaining acceptable
system performance at systems commands, including approving new system
capabilities. Chaired by the Principal Deputy Assistant Secretary of
the Navy.
Entity: Navy ERP Program Management Office;
Roles and responsibilities: Performs day-to-day program management and
serves as the single point of accountability for managing the program's
objectives through development, testing, deployment, and sustainment.
Source: GAO analysis of DOD data.
[End of table]
To deliver system and other program capabilities and to provide program
management support services, Navy ERP relies on multiple contractors,
as described in table 4.
Table 4: Navy ERP Program Contracts:
Contract: Release 1.0 System Integration;
Award date: September 2004;
Completion date: February 2008;
Contract value: $176 million;
Awarded to: BearingPoint;
Purpose: Design and development of release 1.0; training and deployment
at NAVAIR.
Contract: Release 1.1 & 1.2 System Integration;
Award date: June 2007;
Completion date: September 2011;
Contract value: $152.9 million;
Awarded to: IBM;
Purpose: Design and development of release 1.1 and 1.2.
Contract: Professional Support Service 1;
Award date: June 2006;
Completion date: September 2010;
Contract value: $163.7 million;
Awarded to: IBM;
Purpose: Business process analysis, training, organizational change
management, and deployment and sustainment support.
Contract: Professional Support Service 2;
Award date: June 2006;
Completion date: September 2010;
Contract value: $69 million;
Awarded to: General Dynamics Information Technology;
Purpose: Support to the government in its oversight of the system
integrators and other contractors, release management, and IV&V.
Source: GAO analysis of DON data.
[End of table]
Overview of Navy ERP's Status:
Template 1 of Navy ERP was originally planned to reach full operational
capability (FOC) in fiscal year 2011, and its original estimated life-
cycle cost was about $1.87 billion.[Footnote 8] The estimate was later
baselined[Footnote 9] in August 2004 at about $2.0 billion.[Footnote
10] In December 2006 and again in September 2007, the program was
rebaselined. FOC is now planned for fiscal year 2013, and the estimated
life-cycle cost is about $2.4 billion (a 31 percent increase over the
original estimate).[Footnote 11]
The program is currently in the production and deployment phase of the
defense acquisition system, having completed the system development and
demonstration phase in September 2007.[Footnote 12] This was 17 months
later than the program's original schedule set in August 2004, but on
time according to the revised schedule set in December 2006. Changes in
the program's acquisition phase timeline are depicted in figure 1, and
life-cycle cost estimates are depicted in figure 2.
Figure 1: Navy ERP Timeline:
[Refer to PDF for image: illustration]
Phase: Concept refinement and technology development;
Fiscal year 2003-2004: Program established (activity prior to the 2004
plan);
Phase: System development and demonstration;
Fiscal year 2004-2006: 2004 plan;
Fiscal year 2004-2008: 2007 plan;
Phase: Production and deployment;
Fiscal year 2006-2011: 2004 plan;
* 2006: Initial Operational Capability;
* 2011: Full Operational Capability.
Fiscal year 2007-2013: 2007 plan;
* 2008: Initial Operational Capability;
* 2013: Full Operational Capability.
Source: GAO analysis of DON data.
[End of figure]
Figure 2: Navy ERP Life-Cycle Cost Estimates in Fiscal Years 2003,
2004, and 2007:
[Refer to PDF for image: vertical bar graph]
Fiscal year: 2003;
Navy ERP Cost estimate: $1.87 billion.
Fiscal year: 2004;
Navy ERP Cost estimate: $1.99 billion.
Fiscal year: 2007;
Navy ERP Cost estimate: $2.44 billion.
Source: GAO analysis of DON data.
[End of figure]
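The cost figures cited above are internally consistent: the difference
between the original and current estimates is the roughly $570 million
overrun noted earlier, and the growth is about 31 percent. The
following minimal arithmetic sketch in Python, using the rounded
estimates shown in figure 2, illustrates the calculation; it is not
part of GAO's analysis, and GAO's reported percentage reflects
unrounded estimates:

original_estimate = 1.87e9   # fiscal year 2003 life-cycle cost estimate, in dollars
current_estimate = 2.44e9    # fiscal year 2007 rebaselined estimate, in dollars

overrun = current_estimate - original_estimate         # about $570 million
percent_increase = 100 * overrun / original_estimate   # about 30-31 percent

print(f"Cost overrun: about ${overrun / 1e6:.0f} million")
print(f"Increase over original estimate: about {percent_increase:.0f} percent")

[End of code example]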
Release 1.0 was deployed at NAVAIR in October 2007, after passing
developmental testing and evaluation. Initial operational capability
(IOC) was achieved in May 2008, 22 months later than the baseline
established in August 2004, and 4 months later than the new baseline
established in September 2007. According to program documentation,
these delays were due, in part, to challenges experienced at NAVAIR in
converting data from legacy systems to run on the new system and
implementing new business procedures associated with the system. In
light of the delays at NAVAIR in achieving IOC, the deployment
schedules for the other commands were revised in 2008. Release 1.0 was
deployed at NAVSUP in October 2008 as scheduled, but deployment at
SPAWAR was rescheduled for October 2009, 18 months later than planned,
and deployment at NAVSEA was rescheduled for October 2010 for the
General Fund and October 2011 for the Navy Working Capital Fund, each
12 months later than planned.
Release 1.1 is currently being developed and tested, and is planned to
be deployed at NAVSUP in February 2010, 7 months later than planned,
and at the Navy's Fleet and Industrial Supply Centers (FISC)[Footnote
13] starting in February 2011. Changes in the deployment schedule are
depicted in figure 3.
Figure 3: Navy ERP Deployment Schedule:
[Refer to PDF for image: illustration]
Release: 1.0 Financial and Acquisition;
NAVAIR: Late FY 2007 (2007 plan);
NAVSUP: Late FY 2008 (2007 and 2008 plan);
SPAWAR: Mid FY 2007 (2007 plan); Early FY 2010 (2008 plan);
NAVSEA for General Fund: Early FY 2010 (2007 plan); Early FY 2011 (2008
plan);
NAVSEA for Working Capital Fund: Early FY 2011 (2007 plan); Early FY
2012 (2008 plan).
Release: 1.1 Wholesale and Retail Supply;
NAVSUP: Late FY 2009 (2007 plan); Mid FY 2010 (2008 plan);
FISC (first of seven deployments): Mid FY 2011 (2008 plan).
Source: GAO analysis of DON data.
[End of figure]
Prior GAO Reviews of DOD Business System Investments Have Identified IT
Management Weaknesses:
We have previously reported that DOD has not effectively managed key
aspects of a number of business system investments,[Footnote 14]
including Navy ERP. Among other things, our reviews have identified
weaknesses in such areas as architectural alignment and informed
investment decision making, which are the focus of the Fiscal Year 2005
Defense Authorization Act business system provisions.[Footnote 15] Our
reviews have also identified weaknesses in other system acquisition and
investment management areas, such as earned value management,[Footnote
16] economic justification, risk management, requirements management,
test management, and IV&V practices.
In September 2008, we reported that DOD had implemented key information
technology (IT) management controls on Navy ERP to varying degrees of
effectiveness.[Footnote 17] For example, the control associated with
managing system requirements had been effectively implemented, and
important aspects of other controls had been at least partially
implemented, including those associated with economically justifying
investment in the program and proactively managing program risks.
However, other aspects of these controls, as well as the bulk of what
was needed to effectively implement earned value management, had not
been effectively implemented. As a result, the controls that were not
effectively implemented had, in part, contributed to sizable cost and
schedule shortfalls. Accordingly, we made recommendations aimed at
improving cost and schedule estimating, earned value management, and
risk management. DOD largely agreed with our recommendations.
In July 2008, we reported that DOD had not implemented key aspects of
its IT acquisition policies and related guidance on its Global Combat
Support System-Marine Corps (GCSS-MC) program.[Footnote 18] For
example, we reported that it had not economically justified its
investment in GCSS-MC on the basis of reliable estimates of both
benefits and costs and had not effectively implemented earned value
management. Moreover, the program office had not adequately managed all
program risks and had not used key system quality measures. We
concluded that by not effectively implementing these IT management
controls, the program was at risk of not delivering a system solution
that optimally supports corporate mission needs, maximizes capability
mission performance, and is delivered on time and within budget.
Accordingly, we made recommendations aimed at strengthening cost
estimating, schedule estimating, risk management, and system quality
measurement. The department largely agreed with our recommendations.
In July 2007, we reported that the Army's approach for investing about
$5 billion in three related programs--the General Fund Enterprise
Business System, Global Combat Support System-Army Field/Tactical, and
Logistics Modernization Program--did not include alignment with the
Army enterprise architecture or use of a portfolio-based business
system investment review process.[Footnote 19] Further, the Logistics
Modernization Program's testing was not adequate and had contributed to
the Army's inability to resolve operational problems. In addition, the
Army had not established an IV&V function for any of the three
programs. Accordingly, we recommended, among other things, use of an
independent test team and establishment of an IV&V function. DOD agreed
with the recommendations.
In December 2005, we reported that DON had not, among other things,
economically justified its ongoing and planned investment in the Naval
Tactical Command Support System (NTCSS) and had not adequately
conducted requirements management and testing activities.[Footnote 20]
Specifically, requirements were not traceable and developmental testing
had not identified problems that, subsequently, twice prevented the
system from passing operational testing. Moreover, DON had not
effectively performed key measurement, reporting, budgeting, and
oversight activities. We concluded that DON could not determine whether
NTCSS, as defined and as being developed, was the right solution to
meet its strategic business and technological needs. Accordingly, we
recommended developing the analytical basis necessary to know if
continued investment in NTCSS represented a prudent use of limited
resources, and strengthening program management, conditional upon a
decision to proceed with further investment in the program. The
department largely agreed with our recommendations.
In September 2005, we reported that while Navy ERP had the potential to
address some of DON's financial management weaknesses, it faced
significant challenges and risks, including developing and implementing
system interfaces with other systems and converting data from legacy
systems.[Footnote 21] Also, we reported that the program was not
capturing quantitative data to assess effectiveness, and had not
established an IV&V function. We made recommendations to address these
areas, including having the IV&V agent report directly to program
oversight bodies, as well as the program manager. DOD generally agreed
with our recommendations, including that an IV&V function should be
established. However, it stated that the IV&V team would report
directly to program management, which in turn would inform program
oversight officials of any significant IV&V results. In response, we
reiterated the need for the IV&V function to be independent of the
program and
stated that performing IV&V activities independently of the development
and management functions helps to ensure that the results are unbiased
and based on objective evidence. We also reiterated our support for the
recommendation that the IV&V reports be provided to the appropriate
oversight body so that it can determine whether any of the IV&V results
are significant. We noted that doing so would give added assurance that
the results were objective and that those responsible for authorizing
future investments in Navy ERP have the information needed to make
informed decisions.
Key Aspects of Navy ERP Testing Have Been Effectively Managed:
To be effectively managed, testing should be planned and conducted in a
structured and disciplined fashion. According to DOD and industry
guidance,[Footnote 22] system testing should be progressive, meaning
that it should consist of a series of test events that first focus on
the performance of individual system components, then on the
performance of integrated system components, followed by system-level
tests that focus on whether the entire system (or major system
increments) is acceptable, interoperable with related systems, and
operationally suitable to users. For this series of related test events
to be conducted effectively, all test events need to be, among other
things, governed by a well-defined test management structure and
adequately planned. Further, the results of each test event need to be
captured and used to ensure that problems discovered are disclosed and
corrected.
Key aspects of Navy ERP testing have been effectively managed.
Specifically, the program has established an effective test management
structure, key development events were based on well-defined plans, the
results of all executed test events were documented, and problems found
during testing (i.e., test defects) were captured in a test management
tool and subsequently analyzed, resolved, and disclosed to decision
makers. Further, while we identified instances in which the tool did
not contain key data about defects that are needed to ensure that
unauthorized changes to the status of defects do not occur, the number
of instances found is not sufficient to conclude that the controls
were not operating effectively. Notwithstanding the missing data, Navy
ERP testing has been performed in a manner that
increases the chances that the system will meet operational needs and
perform as intended.
A Well-defined Test Management Structure Has Been Established:
The program office has established a test management structure that
satisfies key elements of DOD and industry guidance.[Footnote 23] For
example, the program has developed a Test and Evaluation Master Plan
(TEMP) that defines the program's test strategy. As provided for in the
guidance, this strategy consists of a sequence of tests in a simulated
environment to verify first that individual system parts meet specified
requirements (i.e., development testing) and then verify that these
combined parts perform as intended in an operational environment (i.e.,
operational testing). As we have previously reported,[Footnote 24] such
a sequencing of test events is an effective approach because it permits
the source of defects to be isolated sooner, before they become more
difficult and expensive to address.
More specifically, the strategy includes a sequence of developmental
tests for each release consisting of three cycles of integrated system
testing (IST) followed by user acceptance testing (UAT). Following
development testing, the sequence of operational tests includes the
Navy's independent operational test agency conducting initial
operational test and evaluation (IOT&E) and then follow-on operational
test and evaluation (FOT&E), as needed, to validate the resolution of
deficiencies found during IOT&E. See table 5 for a brief description of
the purpose of each test activity, and figure 4 for the schedule of
Release 1.0 and 1.1 test activities.
Table 5: Description of the Purpose of Navy ERP Tests:
Test: Developmental testing: IST;
Purpose: To validate that the technical and functional components of
the system work properly together and operate as specified by the
requirements.
Test: Developmental testing: Cycle 1 (Scenario Testing);
Purpose: To validate chains of business process transactions using
small scenarios, such as a standard sales order, delivery, and
invoicing. Also, independent evaluators observe scenario testing in
preparation for operational test and evaluation.
Test: Developmental testing: Cycle 2 (Scenario Testing and Conversions
and Interfaces);
Purpose: To validate more complex sequences of transactions plus
customized software.
Test: Developmental testing: Cycle 3 (Final Integration Testing);
Purpose: To validate the entire system, including external components.
Test: Developmental testing: UAT;
Purpose: To allow the customer to ensure Navy ERP works properly and
operates as specified by the requirements.
Test: Operational testing: IOT&E;
Purpose: To evaluate the operational effectiveness and suitability of
the system.
Test: Operational testing: FOT&E;
Purpose: To verify the correction of deficiencies identified during
IOT&E.
Source: GAO analysis of DON data.
[End of table]
Figure 4: Release 1.0 and 1.1 Test Activity Schedule:
[Refer to PDF for image: illustration]
Release 1.0: Developmental testing:
IST Cycle 1: Duration, FY 2007, October-November;
IST Cycle 2: Duration FY 2007, December-January;
IST Cycle 3: Duration FY 2007, February-June;
UAT: Duration FY 2007, July-August;
Release 1.0: IOT&E:
at NAVAIR: Duration, FY 2008, November-March;
Release 1.0: FOT&E:
at NAVAIR and NAVSUP: Duration, FY 2009, January-April.
Release 1.1: Developmental testing:
IST Cycle 1: Duration, FY 2009, January-February;
IST Cycle 2: Duration, FY 2009, March-April;
IST Cycle 3: Duration, FY 2009-2010, May-October;
UAT: Duration, FY 2010, October-December;
Release 1.1: IOT&E:
at NAVSUP: Duration, FY 2010, May-August.
Source: GAO analysis of DON data.
[End of figure]
The TEMP also clearly identifies the roles and responsibilities of key
Navy ERP testing organizations, as provided for in DOD and industry
guidance. For example, it describes specific responsibilities of the
program manager, system integrator, quality assurance/test team lead,
and independent operational test and evaluation organizations. Table 6
summarizes the responsibilities of these various test organizations.
Table 6: Navy ERP Testing-Related Organizations and Respective Roles
and Responsibilities:
Testing-related organization: Program manager;
Responsibilities: Provides overall management and direction of Navy ERP
test and evaluation; Conducts test readiness reviews; Certifies that
the program is ready to proceed from developmental to operational
testing in a developmental test and evaluation report.
Testing-related organization: System integrator;
Responsibilities: Supports the execution of integration and user
acceptance testing, including training system testers and users;
Reports to the Navy ERP program manager.
Testing-related organization: Quality assurance/test team lead;
Responsibilities: Creates the test and evaluation strategy and
developmental test and evaluation plan; Assists in planning,
coordinating, and conducting developmental testing and evaluation, and
reporting the results to the program manager; Conducts integration
testing.
Testing-related organization: Operational Test and Evaluation Force;
Responsibilities: Plans and conducts Navy ERP operational test and
evaluation (OT&E); Reports results and recommendations to DOD's
Director, Operational Test and Evaluation; Performs follow-on OT&E to
verify that deficiencies found during initial OT&E have been resolved.
Testing-related organization: Joint Interoperability Test Command;
Responsibilities: Certifies to the Joint Chiefs of Staff that
interoperability requirements are met; Verifies readiness for
interoperability to the responsible operational test agency during or
prior to operational test readiness review.
Testing-related organization: Office of Director, Operational Test and
Evaluation;
Responsibilities: Reviews and approves IOT&E and FOT&E plans; Analyzes
OT&E results; Provides independent assessment to the MDA.
Source: GAO analysis of DOD data.
[End of table]
Well-defined Plans for Developmental Test Events Were Developed:
According to relevant guidance,[Footnote 25] test activities should be
governed by well-defined and approved plans. Among other things, such
plans are to include a defect triage process, metrics for measuring
progress in resolving defects, test entrance and exit criteria, and
test readiness reviews.
Each developmental test event for Release 1.0 (i.e., each cycle of
integrated systems testing and user acceptance testing) was based on a
well-defined test plan. For example, each plan provided for conducting
daily triage meetings to (1) assign new defects a criticality level
using documented criteria,[Footnote 26] (2) record new defects and
update the status of old defects in the test management tool, and (3)
address other defect and testing issues. Further, each plan included
defect metrics, such as the number of defects found and corrected and
their age. In addition, each plan specified that testing was not
complete until all major defects found during the cycle were resolved,
and all unresolved defects' impact on the next test event were
understood. Further, the plans provided for holding test readiness
reviews to review test results as a condition for proceeding to the
next event. By ensuring that plans for key development test activities
include these aspects of effective test planning, the risk of test
activities not being effectively and efficiently performed is reduced,
thus increasing the chances that the system will meet operational
requirements and perform as intended.
Test Results Were Documented and Reported, but Key Information about
Changes to the Status of Reported Defects Was Not Always Recorded:
According to industry guidance,[Footnote 27] effective system testing
includes capturing, analyzing, resolving, and disclosing to decision
makers the status of problems found during testing (i.e., test
defects). Further, this guidance states that these results should be
collected and stored according to defined procedures and placed under
appropriate levels of control to ensure that any changes to the results
are fully documented.
To the program's credit, the relevant testing organizations have
documented test defects in accordance with defined plans. For example,
daily triage meetings involving the test team lead, testers, and
functional experts were held to review each new defect, assign it a
criticality level, and designate someone responsible for resolving it
and for monitoring and updating its resolution in the test management
tool. Further, test readiness reviews were conducted at which entrance
and exit criteria for each key test event were evaluated before
proceeding to the next event. As part of these reviews, the program
office and oversight officials, command representatives, and test
officials reviewed the results of test events to ensure, among other
things, that significant defects were closed and that there were no
unresolved defects that could affect execution of the next test event.
However, the test management tool did not always contain key data for
all recorded defects that are needed to ensure that unauthorized
changes to the status of defects do not occur. According to information
systems auditing guidelines,[Footnote 28] audit tools should be in
place to monitor user access to systems to detect possible errors or
unauthorized changes. For Navy ERP, this was not always the case.
Specifically, while the tool has the capability to track changes to
test defects in a history log,[Footnote 29] our analysis of 80 randomly
selected defects in the tool disclosed two instances in which the tool
did not record when a change in the defect's status was made or who
made the change. In addition, our analysis of 12 additional defects
that were potential anomalies[Footnote 30] disclosed two additional
instances where the tool did not record when a change was made and who
made it. While our sample size and results do not support any
conclusions as to the overall effectiveness of the controls in place
for recording and tracking test defect status changes, they do show
that it is possible that changes can be made without a complete audit
trail surrounding those changes. After we shared our results with
program officials, they stated that they provided each instance to the
vendor responsible for the tracking tool for resolution. These
officials attributed these instances to vendor updates to the tool that
caused the history settings to default to "off." To address this
weakness, they added that they are now ensuring that the history logs
are set correctly after any update to the tool. This is a positive step
because without an effective information system access
audit tool, the probability of test defect status errors or
unauthorized changes is increased.
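The control at issue here is an audit trail that records who changed a
defect's status and when the change was made. The following generic
Python sketch illustrates such a history log; the class and field
names are hypothetical, and this is not the vendor tool the program
uses:

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StatusChange:
    new_status: str
    changed_by: str
    changed_at: datetime

@dataclass
class TrackedDefect:
    defect_id: str
    status: str = "open"
    history: list = field(default_factory=list)   # the history log (audit trail)

    def set_status(self, new_status, user):
        """Update the status and always append an audit record."""
        self.status = new_status
        self.history.append(StatusChange(new_status, user,
                                         datetime.now(timezone.utc)))

# Usage: the log answers "when was the change made, and who made it?"
defect = TrackedDefect("DEF-001")
defect.set_status("resolved", user="tester_a")
print(defect.history[-1].changed_by, defect.history[-1].changed_at)

[End of code example]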
System Changes Have Been Controlled, but Their Cost and Schedule
Impacts Were Not Sufficiently Considered:
Industry best practices and DOD guidance[Footnote 31] recognize the
importance of system change control when developing and maintaining a
system. Once the composition of a system is sufficiently defined, a
baseline configuration is normally established, and changes to that
baseline are placed under a disciplined change control process to
ensure that unjustified and unauthorized changes are not introduced.
Elements of disciplined change control include (1) formally documenting
a change control process, (2) rigorously adhering to the documented
process, and (3) adopting objective criteria for considering a proposed
change, including its estimated cost and schedule impact.
To its credit, the Navy ERP program has formally documented a change
control process. Specifically, it has a plan and related procedures
that include the purpose and scope of the process--to ensure that any
changes made to the system are properly identified, developed, and
implemented in a defined and controlled environment. It also is using
an automated tool to capture and track the disposition of each change
request. Further, it has defined roles and responsibilities and a
related decision-making structure for reviewing and approving system
changes. In this regard, the program has established a hierarchy of
review and approval boards, including a Configuration Control Board to
review all changes and a Configuration Management Board to further
review changes estimated to require more than 100 hours or $25,000 to
implement. Furthermore, a Navy ERP Senior Integration Board was
recently established to review and approve requests to add, delete, or
change the program's requirements. In addition, the change control
process states that decisions are to be based on, among other things,
the
system engineering and earned value management (i.e., cost and
schedule) impacts the change will introduce, such as the estimated
number of work hours that will be required to effect the change. Table
7 provides a brief description of the decision-making authorities and
boards and their respective roles and responsibilities.
Table 7: Roles and Responsibilities for Change Review and Approval:
Review and approval organizations: Navy ERP Senior Integration Board;
Roles and responsibilities: Reviews and approves Engineering Change
Proposals, which are proposed changes that would impact system scope,
configuration, cost, or schedule by adding, deleting, or changing
requirements. The board is chaired by the Principal Deputy Assistant
Secretary of the Navy, Research, Development, and Acquisition.
Review and approval organizations: Configuration Management Board;
Roles and responsibilities: Reviews and approves change requests
requiring more than 100 hours or $25,000 to implement. The board is
chaired by the program manager and includes representatives from the
earned value management team (i.e., cost and schedule).
Review and approval organizations: Configuration Control Board;
Roles and responsibilities: Reviews all change requests and approves
those requiring less than 100 hours or $25,000 to implement. The board
is chaired by the systems engineer and includes representatives from
the earned value management team (i.e., cost and schedule).
Review and approval organizations: Engineering Review Board;
Roles and responsibilities: Ensures change requests are ready to
proceed to the Configuration Control Board by reviewing and
recommending changes. This board is facilitated and chaired by the
systems engineer and the configuration manager to ensure the change
request documentation is complete.
Review and approval organizations: Technical Change Control Board;
Roles and responsibilities: Approves or defers transport change
requests, which are requests to release changes into the deployed
system. The board is chaired by the production manager.
Source: GAO analysis of DON documentation.
[End of table]
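The review thresholds described above and shown in table 7 amount to a
simple routing rule: the Configuration Control Board reviews every
change request, and requests estimated to require more than 100 hours
or $25,000 to implement are further reviewed by the Configuration
Management Board. The following Python sketch illustrates the rule;
the function and parameter names are assumptions for illustration, not
program documentation:

HOURS_THRESHOLD = 100
COST_THRESHOLD = 25_000   # dollars

def reviewing_boards(estimated_hours, estimated_cost):
    """Return the board(s) that review a change request of the given size."""
    boards = ["Configuration Control Board"]   # reviews all change requests
    if estimated_hours > HOURS_THRESHOLD or estimated_cost > COST_THRESHOLD:
        boards.append("Configuration Management Board")
    return boards

# Example: a 120-hour change is escalated even if its dollar estimate is small.
print(reviewing_boards(estimated_hours=120, estimated_cost=10_000))

[End of code example]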
Navy ERP is largely adhering to its documented change control process.
Specifically, our review of a random sample of 60 change requests and
minutes of related board meetings held between May 2006 and April 2009
showed that the change requests were captured and tracked using an
automated tool, and they were reviewed and approved by the designated
decision-making authorities and boards, in accordance with the
program's documented process.
However, the program has not sufficiently or consistently considered
the cost and schedule impacts of proposed changes. Our analysis of the
random sample of 60 change requests, including our review of related
board meeting minutes, showed no evidence that cost and schedule
impacts were identified or considered. According to program officials,
the cost and
schedule impacts of each change were discussed at control board
meetings. In addition, they provided two change requests to demonstrate
this. However, while these change requests did include schedule impact,
they did not include the anticipated cost impact of proposed changes.
Rather, these two, as well as those in our random sample, included the
estimated number of work hours required to implement the change.
Because the cost of any proposed change depends on other factors
besides work hours, such as labor rates, the estimated number of work
hours is not sufficient for considering the cost impact of a change. In
the absence of verifiable evidence that cost and schedule impacts were
consistently considered, approval authorities do not appear to have
been provided key information needed to fully inform their decisions on
whether or not to approve a change. System changes that are approved
without a full understanding of their cost and schedule impacts could
result in unwarranted cost increases and schedule delays.
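GAO's point that work hours alone do not establish a change's cost
impact can be made concrete with a simple worked example in Python.
The hour estimate, labor rates, and other costs below are invented for
illustration and are not program data:

def cost_impact(hours, blended_labor_rate, other_direct_costs=0.0):
    """Estimated cost impact = hours x blended labor rate + other direct costs."""
    return hours * blended_labor_rate + other_direct_costs

estimated_hours = 80   # a hypothetical hour estimate of the kind recorded on change requests

# The same 80 hours can imply very different dollar impacts depending on labor
# rates and other direct costs, which is why hours alone are not sufficient:
print(cost_impact(estimated_hours, blended_labor_rate=75.0))                              # 6,000
print(cost_impact(estimated_hours, blended_labor_rate=220.0))                             # 17,600
print(cost_impact(estimated_hours, blended_labor_rate=220.0, other_direct_costs=5000.0))  # 22,600

[End of code example]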
Navy ERP IV&V Function Is Not Independent and Has Not Been Fully
Performed:
The purpose of IV&V is to independently ensure that program processes
and products meet quality standards. The use of an IV&V function is
recognized as an effective practice for large and complex system
development and acquisition programs, like Navy ERP, as it provides
objective insight into the program's processes and associated work
products.[Footnote 32] To be effective, verification and validation
activities should be performed by an entity that is managerially
independent of the system development and management processes and
products that are being reviewed.[Footnote 33] Among other things, such
independence helps to ensure that the results are unbiased and based on
objective evidence.
The Navy has not effectively managed its IV&V function because it has
not ensured that the contractor performing this function is independent
of the products and processes that it reviews, or that the contractor
is meeting contractual
requirements. In June 2006, DON awarded a professional support services
contract to General Dynamics Information Technology (GDIT), to include
responsibilities for, among other things, IV&V, program management
support, and delivery of releases according to cost and schedule
constraints. According to the program manager, the contractor's IV&V
function is organizationally separate from, and thus independent of,
the contractor's Navy ERP system development function. However, the
subcontractor performing the IV&V function is also performing release
management. According to the GDIT contract, the release manager is
responsible for developing and deploying a system release that meets
operational requirements within the program's cost and schedule
constraints, but it also states that the IV&V function is responsible
for supporting the government in its review, approval, and acceptance
of Navy ERP products (e.g., releases). The contract also states that
GDIT is eligible for an optional award fee payment based on its
performance in meeting, among other things, these cost and schedule
constraints. Because performance of the system development and
management role makes the contractor potentially unable to render
impartial assistance to the government in performing the IV&V function,
the contractor has an inherent conflict of interest relative to meeting
cost and schedule commitments and disclosing the results of
verification and validation reviews that may affect its ability to do
so.
The IV&V function's lack of independence is amplified by the fact that
it reports directly and solely to the program manager. As we have
previously reported,[Footnote 34] the IV&V function should report the
issues or weaknesses that increase the risks associated with the
project to program oversight officials, as well as to program
management, to better ensure that the verification and validation
results are objective and that the officials responsible for making
program investment decisions are fully informed. Furthermore, these
officials, once informed, can ensure that the issues or weaknesses
reported are promptly addressed.
Without ensuring sufficient managerial independence, valuable
information may not reach decision makers, potentially leading to the
release of a system that does not adequately meet users' needs or
operate as intended.
Beyond the IV&V function's lack of independence, the program office has
not ensured that the subcontractor has produced the range of
deliverables that were contractually required and defined in the IV&V
plan. For example, the contract and plan call for weekly and monthly
reports identifying weaknesses in program processes and recommendations
for improvement, a work plan for accomplishing IV&V tasks, and
associated assessment reports that follow the System Engineering Plan
and program schedule. However, the IV&V contractor has largely not
delivered these products. Specifically, until recently, it did not
produce a work plan and delivered only monthly reports, and these
reports list only the meetings that the IV&V contractor attended and
the documents that it reviewed. They do not, for example, identify
program weaknesses or provide recommendations for improvement.
According to program officials, they have relied on oral reports from
the subcontractor at weekly meetings, and lessons learned from these
briefings have been incorporated into program guidance. According to the contractor, the
Navy has expended about $1.8 million between June 2006 and September
2008 for IV&V activities, with an additional $249,000 planned to be
spent in fiscal year 2009.
Following our inquiries about an IV&V work plan, the IV&V contractor
developed such a plan in October 2008, more than 2 years after the
contract was awarded. The plan lists program activities and processes
to be assessed, such as configuration management and testing. While it
does not include time frames for starting and completing these
assessments, meeting minutes show that the status of assessments has
been discussed with the program manager during IV&V review meetings.
The first planned assessment was delivered to the program in March 2009
and provides recommendations for improving the program's configuration
management process, such as using the automated tool to produce certain
reports and enhancing training on how the tool is used.
Further, program officials stated that they have recently begun
requiring the contractor to provide formal quarterly reports, the first
of which was delivered to the program manager in January 2009. Our
review of this quarterly report shows that it provides recommendations
for improving the program's risk management process and organizational
change management strategy.
Notwithstanding the recent steps that the program office has taken, it
nevertheless lacks an independent perspective on the program's products
and management processes.
Conclusions:
DOD's successes in delivering large-scale business systems, such as
Navy ERP, are in large part determined by the extent to which it
employs the kind of rigorous and disciplined IT management controls
that are reflected in department policies and related guidance. While
implementing these controls does not guarantee a successful program, it
does minimize a program's exposure to risk and thus the likelihood that
it will fall short of expectations. In the case of Navy ERP, living up
to expectations is important because the program is large, complex, and
critical to addressing the department's long-standing problems related
to financial transparency and asset visibility.
The Navy ERP program office has largely implemented a range of
effective controls associated with system testing and change control,
including acting quickly to address issues with the audit log for its
test management tool, but more can be done to ensure that the cost and
schedule impacts of proposed changes are explicitly documented and
considered when decisions are reached. Moreover, while the program
office has contracted for IV&V activities, it has not ensured that the
contractor is independent of the products and processes that it is to
review and has not held the contractor accountable for producing the
full range of IV&V deliverables required under the contract. In
addition, it has not ensured that its IV&V contractor is accountable to
a level of management above the program office, as we previously
recommended.
Notwithstanding the program office's largely effective management of
both system testing and change control, these weaknesses
increase the risk of investing in system changes that are not
economically justified and unnecessarily limit the value that an IV&V
agent can bring to a program like Navy ERP. By addressing these
weaknesses, the department can better ensure that taxpayer dollars are
wisely and prudently invested.
Recommendations for Executive Action:
To strengthen the management of Navy ERP's change control process, we
recommend that the Secretary of Defense direct the Secretary of the
Navy, through the appropriate chain of command, to (1) revise the Navy
ERP procedures for controlling system changes to explicitly require
that a proposed change's life-cycle cost impact be estimated and
considered in making change request decisions and (2) capture the cost
and schedule impacts of each proposed change in the Navy ERP automated
change control tracking tool.
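To make the intent of these two change control recommendations concrete, the following minimal sketch, written in Python, illustrates one way a change request record in an automated tracking tool could carry explicit life-cycle cost and schedule impact fields and flag requests that lack them before a control board decision. The report does not describe the data model of the Navy ERP tracking tool; every class, field, and value below is a hypothetical illustration, not the program's actual implementation.

# Hypothetical illustration only: the Navy ERP change control tracking tool's
# data model is not described in this report, and these names are invented.
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional


class Decision(Enum):
    PENDING = "pending"
    APPROVED = "approved"
    REJECTED = "rejected"


@dataclass
class ChangeRequest:
    request_id: str
    description: str
    submitted: date
    # Recommendation 1: estimate the life-cycle cost impact of the change.
    life_cycle_cost_impact_dollars: Optional[float] = None
    # Recommendation 2: capture the schedule impact in the tracking tool itself.
    schedule_impact_days: Optional[int] = None
    decision: Decision = Decision.PENDING

    def ready_for_decision(self) -> bool:
        """Both impact estimates should be recorded before a board decides."""
        return (self.life_cycle_cost_impact_dollars is not None
                and self.schedule_impact_days is not None)


# A request missing its schedule impact estimate would be flagged as not ready.
cr = ChangeRequest("CR-0042", "Add a financial report field", date(2009, 1, 15),
                   life_cycle_cost_impact_dollars=125_000.0)
assert not cr.ready_for_decision()

Recording these values as structured fields in the tool, rather than only discussing them at control board meetings, is what would create the documentary evidence of cost and schedule consideration that was absent from the change requests we reviewed.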
To increase the value of Navy ERP IV&V, we recommend that the Secretary
of Defense direct the Secretary of the Navy, through the appropriate
chain of command, to (1) stop performance of the IV&V function under
the existing contract and (2) engage the services of a new IV&V agent
that is independent of all Navy ERP management, development, testing,
and deployment activities that it may review. In addition, we reiterate
our prior recommendation relative to ensuring that the Navy ERP IV&V
agent report directly to program oversight officials, while
concurrently sharing IV&V results with the program office.
Agency Comments:
In written comments on a draft of this report, signed by the Assistant
Deputy Chief Management Officer and reprinted in appendix II, the
department concurred with our recommendations and stated that it would
take the appropriate corrective actions within the next 7 months.
We are sending copies of this report to interested congressional
committees; the Director, Office of Management and Budget; the
Congressional Budget Office; and the Secretary of Defense. The report
also is available at no charge on our Web site at [hyperlink,
http://www.gao.gov].
If you or your staffs have any questions on matters discussed in this
report, please contact me at (202) 512-3439 or hiter@gao.gov. Contact
points for our Offices of Congressional Relations and Public Affairs
may be found on the last page of this report. GAO staff who made major
contributions to this report are listed in appendix III.
Signed by:
Randolph C. Hite:
Director:
Information Technology Architecture and Systems Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
Our objectives were to determine whether (1) system testing is being
effectively managed, (2) system changes are being effectively
controlled, and (3) independent verification and validation (IV&V)
activities are being effectively managed for the Navy Enterprise
Resource Planning (ERP) program.
To determine if Navy ERP testing is being effectively managed, we
reviewed relevant documentation, such as the Test and Evaluation Master
Plan and test reports, and compared them with relevant federal and
related guidance. Further, we reviewed development test plans and
procedures for each test event and compared them with best practices to
determine whether well-defined plans were developed. We also examined
test results and reports, including test readiness review
documentation, and compared them against the test plans to determine
whether testing had been executed in accordance with those plans.
Moreover, to determine the
extent to which test defect data were being captured, analyzed, and
reported, we inspected 80 randomly selected defects from the population
of 2,258 defects in the program's test management system. In addition, we
reviewed the history logs associated with each of these 80 defects to
determine whether appropriate levels of control were in place to ensure
that any changes to the results were fully documented. This sample was
designed with a 5 percent tolerable error rate at the 95 percent level
of confidence, so that, if we found 0 problems in our sample, we could
conclude statistically that the error rate was less than 4 percent. In
addition, we interviewed cognizant officials, including the program's
test lead and the Navy's independent operational testers, about their
roles and responsibilities for test management.
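As a reading aid, the sample design described in the preceding paragraph can be checked with a short calculation: when zero errors are observed in a random sample of 80, the one-sided 95 percent upper confidence bound on the error rate under a simple binomial model is 1 - 0.05^(1/80), or roughly 3.7 percent, which is consistent with the "less than 4 percent" conclusion above. The Python sketch below is illustrative arithmetic only; it is not a tool or procedure used in the audit, and it ignores the finite-population correction for the 2,258 defects, which would make the bound slightly tighter.

# Minimal sketch (not part of the audit methodology): one-sided upper
# confidence bound for an attribute sample in which zero errors are observed.
import math


def zero_error_upper_bound(sample_size: int, confidence: float = 0.95) -> float:
    """Largest error rate p consistent with observing 0 errors in the sample,
    i.e., solve (1 - p) ** n = 1 - confidence for p under a binomial model."""
    alpha = 1.0 - confidence
    return 1.0 - alpha ** (1.0 / sample_size)


if __name__ == "__main__":
    bound = zero_error_upper_bound(80, 0.95)
    # Prints roughly 0.037, i.e., an error rate below 4 percent, consistent
    # with the conclusion stated for the 80-defect sample design above.
    print(f"95% upper bound on error rate: {bound:.3f}")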
To determine if Navy ERP changes are being effectively controlled, we
reviewed relevant program documentation, such as the change control
policies, plans, and procedures, and compared them with relevant
federal and industry guidance. Further, to determine the extent to
which the program is reviewing and approving change requests according
to its documented plans and procedures, we inspected 60 randomly
selected change requests in the program's configuration management
system. In addition, we reviewed the change request forms associated
with these 60 change requests and related control board meeting minutes
to determine whether objective criteria for considering a proposed
change, including estimated cost or schedule impacts, were applied. In
addition, we interviewed cognizant officials, including the program
manager and systems engineer, about their roles and responsibilities
for reviewing, approving, and tracking change requests.
To determine if IV&V activities are being effectively managed, we
reviewed Navy ERP's IV&V contract, strategy, and plans and compared
them with relevant industry guidance. We also analyzed the contractual
relationships relative to legal standards that govern organizational
conflict of interest. In addition, we examined IV&V monthly status
reports, work plans, an assessment report, and a quarterly report to
determine the extent to which contract requirements were met. We
interviewed contractor and program officials about their roles and
responsibilities for IV&V and to determine the extent to which the
program's IV&V function is independent.
We conducted this performance audit at Department of Defense offices in
the Washington, D.C., metropolitan area; Annapolis, Maryland; and
Norfolk, Virginia, from August 2008 to September 2009, in accordance
with generally accepted government auditing standards. Those standards
require that we plan and perform the audit to obtain sufficient,
appropriate evidence to provide a reasonable basis for our findings and
conclusions based on our audit objectives. We believe that the evidence
obtained provides a reasonable basis for our findings and conclusions
based on our audit objectives.
Appendix II: Comments from the Department of Defense:
Office Of The Deputy Chief Management Officer:
9010 Defense Pentagon:
Washington, DC 20301-9010:
August 21, 2009:
Mr. Randolph C. Hite:
Director, Information Technology Architecture and Systems Issues:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Mr. Hite:
This is the Department of Defense (DoD) response to the GAO draft
report 09-841, "DOD Business Systems Modernization: Navy Implementing A
Number Of Key Management Controls On Enterprise Resource Planning
System, but Improvements Still Needed," dated July 21, 2009 (GAO Code
310666).
The Department concurs with all four of GAO's recommendations. The Navy
Enterprise Resource Planning (ERP) program office will take the
appropriate corrective actions within the next seven months. Detailed
responses to each recommendation are attached.
We appreciate the support of GAO as the Department further advances in
its business transformation efforts, and look forward to continuing our
partnership in achieving our shared goals.
Signed by:
Elizabeth A. McGrath:
Assistant Deputy Chief Management Officer:
Attachment(s): As stated:
[End of letter]
GAO Draft Report Dated July 21, 2009:
GAO-09-841 (GAO Code 310666):
"DOD Business Systems Modernization: Navy Implementing A Number Of Key
Management Controls On Enterprise Resource Planning System, But
Improvements Still Needed"
Department Of Defense Comments To The GAO Recommendations:
Recommendation 1: The GAO recommends that the Secretary of Defense
direct the Secretary of the Navy, through the appropriate chain of
command, to revise the Navy Enterprise Resource Planning (ERP)
procedures for controlling system changes to explicitly require that a
proposed change's life cycle cost impact be estimated and considered in
making change request decisions. (Page 31/GAO Draft Report).
DOD Response: Concur. The Navy ERP program office will revise the
Enterprise Change Request Process and Procedures document to require
explicitly that a proposed change's life cycle cost impact be estimated
as part of the change control decision process. Corrective actions to
address GAO's recommendation will be taken by the beginning of Fiscal
Year (FY) 2010.
Recommendation 2: The GAO recommends that the Secretary of Defense
direct the Secretary of the Navy, through the appropriate chain of
command, to capture the cost and schedule impacts of each proposed
change in the Navy ERP automated change control tracking tool. (Page
31/GAO Draft Report).
DOD Response: Concur. The Navy ERP program office will update its
automated change control tracking tool to capture "dollarized" cost
impacts and better identify schedule impacts of future proposed
changes. Corrective actions to address GAO's recommendation will be
taken by the beginning of FY 2010.
Recommendation 3: The GAO recommends that the Secretary of Defense
direct the Secretary of the Navy, through the appropriate chain of
command, to stop performance of the Independent Verification and
Validation (IV&V) function under the existing contract. (Page 31/GAO
Draft Report).
DOD Response: Concur. The Navy ERP program office plans to stop IV&V
functions under the existing contract at the end of the current fiscal
year (2009).
Recommendation 4: The GAO recommends that the Secretary of Defense
direct the Secretary of the Navy, through the appropriate chain of
command, to engage the services of a new IV&V agent that is independent
of all Navy ERP management, development, testing, and deployment
activities that it may review. (Page 31/GAO Draft Report).
DOD Response: Concur. The Navy ERP program plans to execute future IV&V
functions using contract support that is not associated with any of the
other Navy ERP program activities to ensure there is no conflict of
interest, real or perceived. Corrective actions to address GAO's
recommendation will be taken no later than March 2010.
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact:
Randolph C. Hite, (202) 512-3439, or hiter@gao.gov:
Staff Acknowledgments:
In addition to the individual named above, key contributors to this
report were Neelaxi Lakhmani, Assistant Director; Monica Anatalio; Carl
Barden; Neil Doherty; Cheryl Dottermusch; Lee McCracken; Karl Seifert;
Adam Vodraska; Shaunyce Wallace; and Jeffrey Woodward.
[End of section]
Footnotes:
[1] Business systems are information systems, including financial and
nonfinancial systems that support DOD business operations, such as
civilian personnel, finance, health, logistics, military personnel,
procurement, and transportation.
[2] GAO, High-Risk Series: An Update, [hyperlink,
http://www.gao.gov/products/GAO-09-271] (Washington, D.C.: January
2009).
[3] See, for example, GAO, DOD Business Systems Modernization:
Important Management Controls Being Implemented on Major Navy Program,
but Improvements Needed in Key Areas, [hyperlink,
http://www.gao.gov/products/GAO-08-896] (Washington, D.C.: Sept. 8,
2008); DOD Business Systems Modernization: Key Marine Corps System
Acquisition Needs to Be Better Justified, Defined, and Managed,
[hyperlink, http://www.gao.gov/products/GAO-08-822] (Washington, D.C.:
July 28, 2008); and DOD Business Transformation: Lack of an Integrated
Strategy Puts the Army's Asset Visibility System Investments at Risk,
[hyperlink, http://www.gao.gov/products/GAO-07-860] (Washington, D.C.:
July 27, 2007).
[4] [hyperlink, http://www.gao.gov/products/GAO-08-896].
[5] The four pilots are SIGMA, CABRILLO, NEMAIS, and SMART.
[6] The Navy is considering deleting the third release, Release 1.2,
from Template 1.
[7] Release 1.0 accounts for about 56 percent of the requirements;
Release 1.1, about 33 percent; and Release 1.2, about 10 percent.
[8] This 2003 estimate, which was prepared to assist in budget
development and support the Milestone A/B approval, was for
development, deployment, and sustainment costs in fiscal years 2003
through 2021.
[9] According to DOD's acquisition guidebook, an Acquisition Program
Baseline is a program manager's estimated cost, schedule, and
performance goals. Goals consist of objective values, which represent
what the user desires and expects, and threshold values, which
represent acceptable limits. When the program manager determines that a
current cost, schedule, or performance threshold value will not be
achieved, the MDA must be notified, and a new baseline developed,
reviewed by decision makers and, if the program is to continue,
approved by the MDA.
[10] According to the August 2004 Acquisition Program Baseline, this
estimate is for acquisition, operations, and support for fiscal years
2004 through 2021.
[11] According to the September 2007 Acquisition Program Baseline, this
estimate is for acquisition, operations, and support for fiscal years
2004 through 2023.
[12] The defense acquisition system is a framework-based approach that
is intended to translate mission needs and requirements into stable,
affordable, and well-managed acquisition programs. It was updated in
December 2008 and consists of five key program life-cycle phases and
three related milestone decision points--(1) Materiel Solution Analysis
(previously Concept Refinement), followed by Milestone A; (2)
Technology Development, followed by Milestone B; (3) Engineering and
Manufacturing Development (previously System Development and
Demonstration), followed by Milestone C; (4) Production and Deployment;
and (5) Operations and Support.
[13] Fleet and Industrial Supply Centers are located in San Diego,
California; Norfolk, Virginia; Jacksonville, Florida; Puget Sound,
Washington; Pearl Harbor, Hawaii; Yokosuka, Japan; and Sigonella,
Italy; and provide worldwide logistics services for the Navy.
[14] See, for example, [hyperlink,
http://www.gao.gov/products/GAO-08-896]; GAO, DOD Business Systems
Modernization: Planned Investment in Navy Program to Create Cashless
Shipboard Environment Needs to be Justified and Better Managed,
[hyperlink, http://www.gao.gov/products/GAO-08-922] (Washington, D.C.:
Sept. 8, 2008); [hyperlink, http://www.gao.gov/products/GAO-08-822];
[hyperlink, http://www.gao.gov/products/GAO-07-860]; Information
Technology: DOD Needs to Ensure that Navy Marine Corps Intranet Program
Is Meeting Goals and Satisfying Customers, [hyperlink,
http://www.gao.gov/products/GAO-07-51] (Washington, D.C.: Dec. 8,
2006); DOD Systems Modernization: Planned Investment in the Navy
Tactical Command Support System Needs to be Reassessed, [hyperlink,
http://www.gao.gov/products/GAO-06-215] (Washington, D.C.: Dec. 5,
2005); and DOD Business Systems Modernization: Navy ERP Adherence to
Best Business Practices Critical to Avoid Past Failures, [hyperlink,
http://www.gao.gov/products/GAO-05-858] (Washington, D.C.: Sept. 29,
2005).
[15] Ronald W. Reagan National Defense Authorization Act for Fiscal
Year 2005, Pub. L. No. 108-375, Sec. 332 (2004) (codified at 10 U.S.C.
Sections 186 and 2222).
[16] Earned value management is a means for measuring actual program
progress against cost and schedule estimates.
[17] [hyperlink, http://www.gao.gov/products/GAO-08-896].
[18] [hyperlink, http://www.gao.gov/products/GAO-08-822].
[19] [hyperlink, http://www.gao.gov/products/GAO-07-860].
[20] [hyperlink, http://www.gao.gov/products/GAO-06-215].
[21] [hyperlink, http://www.gao.gov/products/GAO-05-858].
[22] See, for example, Office of the Under Secretary of Defense for
Acquisition, Technology, and Logistics, Department of Defense
Instruction 5000.02 (Arlington, VA: Dec. 2, 2008); Defense Acquisition
University, Test and Evaluation Management Guide, 5th ed. (Fort
Belvoir, VA: January 2005); Institute of Electrical and Electronics
Engineers, Inc., Standard for Software Verification and Validation,
IEEE Std 1012-2004 (New York, NY: June 8, 2005); Software Engineering
Institute, Capability Maturity Model Integration for Acquisition,
version 1.2 (Pittsburgh, PA: May 2008); and GAO, Year 2000 Computing
Crisis: A Testing Guide, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-10.1.21] (Washington, D.C.:
November 1998).
[23] Office of the Under Secretary of Defense for Acquisition,
Technology, and Logistics, Department of Defense Instruction 5000.02
(Arlington, VA: Dec. 2, 2008); Defense Acquisition University, Test and
Evaluation Management Guide, 5th ed. (Fort Belvoir, VA: January 2005);
and Institute of Electrical and Electronics Engineers, Inc., Standard
for Software and System Test Documentation, IEEE Std 829-2008 (New
York, NY: 2008).
[24] GAO, Secure Border Initiative: DHS Needs to Address Significant
Risks in Delivering Key Technology Investment, [hyperlink,
http://www.gao.gov/products/GAO-08-1086] (Washington, D.C.: Sept. 22,
2008).
[25] See, for example, [hyperlink,
http://www.gao.gov/products/GAO/AIMD-10.1.21].
[26] According to program documentation, criticality levels range from
1 to 5, as follows: 1 is a problem that prevents accomplishment of an
operational or mission critical capability; 2 is a major technical
problem with no work-around solution; 3 is a major technical problem
with a work-around solution; 4 is a minor technical problem; and 5 is
any other defect, such as a cosmetic problem.
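As a compact restatement of the five-level scale in this footnote, the following sketch expresses the criticality levels as a simple enumeration. It is a hypothetical reading aid only; the report does not describe how the program's test management tool actually encodes these levels.

# Hypothetical summary of the criticality scale described in footnote 26;
# the program's test management tool is not documented at this level of detail.
from enum import IntEnum


class DefectCriticality(IntEnum):
    PREVENTS_MISSION_CAPABILITY = 1  # prevents an operational or mission critical capability
    MAJOR_NO_WORKAROUND = 2          # major technical problem with no work-around solution
    MAJOR_WITH_WORKAROUND = 3        # major technical problem with a work-around solution
    MINOR = 4                        # minor technical problem
    OTHER = 5                        # any other defect, such as a cosmetic problem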
[27] Institute of Electrical and Electronics Engineers, Inc., Standard
for Information Technology--Software Life Cycle Processes--
Implementation Considerations, IEEE/EIA Std 12207.2-1997 (New York, NY:
April 1998) and Software Engineering Institute, Capability Maturity
Model Integration for Acquisition, version 1.2 (Pittsburgh, PA: May
2008).
[28] Information Systems Audit and Control Association, Inc., IS
Standards, Guidelines and Procedures for Auditing and Control
Professionals (Rolling Meadows, IL: Jan. 15, 2009).
[29] According to program documentation, the date a defect is entered
into the system and the date the status of the defect is changed to
"closed" are automatically populated. Further, changes to a defect's
status, including from "new" to "open" and from "open" to "closed";
changes to the criticality level; and the user who makes the changes
are tracked in a defect's history log.
[30] These anomalies are defects that we found that (1) were attributed
to integrated system test events, but were not detected until after the
system was deployed; (2) had a criticality level that was different
from the level that was reported at a test readiness review; (3) were
deferred to a later test event or to post-deployment to be verified as
resolved; or (4) had no criticality level.
[31] See, for example, Electronics Industries Alliance, National
Consensus Standard for Configuration Management, ANSI/EIA-649-1998
(Arlington, VA: August 1998) and Department of Defense, Military
Handbook: Configuration Management Guidance, MIL-HDBK-61A(SE)
(Washington, D.C.: Feb. 7, 2001).
[32] GAO, Homeland Security: U.S. Visitor and Immigration Status
Indicator Technology Program Planning and Execution Improvements
Needed, [hyperlink, http://www.gao.gov/products/GAO-09-96] (Washington,
D.C.: Dec. 12, 2008); Homeland Security: Recommendations to Improve
Management of Key Border Security Program Need to Be Implemented,
[hyperlink, http://www.gao.gov/products/GAO-06-296] (Washington, D.C.:
Feb. 14, 2006); and [hyperlink,
http://www.gao.gov/products/GAO-05-858].
[33] Institute of Electrical and Electronics Engineers, Inc., Standard
for Software Verification and Validation, IEEE Std 1012-2004 (New York,
NY: June 8, 2005).
[34] See [hyperlink, http://www.gao.gov/products/GAO-07-860] and
[hyperlink, http://www.gao.gov/products/GAO-05-858].
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: