This is the accessible text file for GAO report number GAO-10-2
entitled 'Information Technology: Agencies Need to Improve the
Implementation and Use of Earned Value Techniques to Help Manage Major
System Acquisitions' which was released on November 9, 2009.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to the Chairman, Subcommittee on Federal Financial Management,
Government Information, Federal Services, and International Security,
Committee on Homeland Security and Governmental Affairs, U.S. Senate:
United States Government Accountability Office: GAO:
October 2009:
Information Technology:
Agencies Need to Improve the Implementation and Use of Earned Value
Techniques to Help Manage Major System Acquisitions:
GAO-10-2:
GAO Highlights:
Highlights of GAO-10-2, a report to the Chairman, Subcommittee on
Federal Financial Management, Government Information, Federal Services,
and International Security, Committee on Homeland Security and
Governmental Affairs, U.S. Senate.
Why GAO Did This Study:
In fiscal year 2009, the federal government planned to spend about $71
billion on information technology (IT) investments. To more effectively
manage such investments, in 2005 the Office of Management and Budget
(OMB) directed agencies to implement earned value management (EVM). EVM
is a project management approach that, if implemented appropriately,
provides objective reports of project status, produces early warning
signs of impending schedule delays and cost overruns, and provides
unbiased estimates of anticipated costs at completion.
GAO was asked to assess selected agencies' EVM policies, determine
whether they are adequately using earned value techniques to manage key
system acquisitions, and evaluate selected investments' earned value
data to determine their cost and schedule performances. To do so, GAO
compared agency policies with best practices, performed case studies,
and reviewed documentation from eight agencies and 16 major investments
with the highest levels of IT development-related spending in fiscal
year 2009.
What GAO Found:
While all eight agencies have established policies requiring the use of
EVM on major IT investments, these policies are not fully consistent
with best practices. In particular, most lack training requirements for
all relevant personnel responsible for investment oversight. Most
policies also do not have adequately defined criteria for revising
program cost and schedule baselines. Until agencies expand and enforce
their EVM policies, it will be difficult for them to gain the full
benefits of EVM.
GAO's analysis of 16 investments shows that agencies are using EVM to
manage their system acquisitions; however, the extent of implementation
varies. Specifically, for 13 of the 16 investments, key practices
necessary for sound EVM execution had not been implemented. For
example, the project schedules for these investments contained issues--
such as the improper sequencing of key activities--that undermine the
quality of their performance baselines. This inconsistent application
of EVM exists in part because of the weaknesses contained in agencies'
policies, combined with a lack of enforcement of policies already in
place. Until key EVM practices are fully implemented, these investments
face an increased risk that managers cannot effectively optimize EVM as
a management tool.
Furthermore, earned value data trends of these investments indicate
that most are currently experiencing shortfalls against cost and
schedule targets. The total life-cycle costs of these programs have
increased by about $2 billion. Based on GAO's analysis of current
performance trends, 11 programs will likely incur cost overruns that
will total about $1 billion at contract completion--in particular, 2 of
these programs account for about 80 percent of this projection. As
such, GAO estimates the total cost overrun to be about $3 billion at
program completion (see figure). However, with timely and effective
management action, it is possible to reverse negative trends so that
the projected cost overruns may be reduced.
Figure: Cost Overruns Incurred and Projected Overruns of 16 Programs:
$2.0 billion: Life-cycle cost overruns already incurred; $1.0 billion:
GAO-estimated most likely cost overruns; $3.0 billion: GAO-estimated
total cost overrun at completion.
Source: GAO analysis of program data.
[End of figure]
What GAO Recommends:
GAO is recommending that the selected agencies modify EVM policies to
be consistent with best practices, implement EVM practices that address
identified weaknesses, and manage negative earned value trends. Seven
agencies that commented on a draft of this report generally agreed with
GAO's results and recommendations.
View [hyperlink, http://www.gao.gov/products/GAO-10-2] or key
components. For more information, contact David A. Powner at (202) 512-
9286 or pownerd@gao.gov.
[End of section]
Contents:
Letter:
Background:
Agencies' EVM Policies Are Not Comprehensive:
Agencies' Key Acquisition Programs Are Using EVM, but Are Not
Consistently Implementing Key Practices:
Earned Value Data Show Trends of Cost Overruns and Schedule Slippages
on Most Programs:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Case Studies of Selected Programs' Implementation of
Earned Value Management:
Appendix III: Comments from the Department of Commerce:
Appendix IV: Comments from the Department of Defense:
Appendix V: Comments from the Department of Justice:
Appendix VI: Comments from the National Aeronautics and Space
Administration:
Appendix VII: Comments from the Department of Veterans Affairs:
Appendix VIII: GAO Contact and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Key Components of an Effective EVM Policy:
Table 2: Assessment of Key Agencies' EVM Policies:
Table 3: Eleven Key EVM Practices for System Acquisition Programs:
Table 4: Assessment of EVM Practices for Case Study Programs:
Table 5: Program Life-cycle Cost Estimate Changes:
Table 6: Contractor Cumulative Cost and Schedule Performances:
Table 7: Sixteen Case Study Programs:
Table 8: GAO EVM Practice Assessment of Agriculture's MIDAS Program:
Table 9: GAO EVM Practice Assessment of Commerce's DRIS Program:
Table 10: GAO EVM Practice Assessment of Commerce's FDCA Program:
Table 11: GAO EVM Practice Assessment of Defense's AOC Program:
Table 12: GAO EVM Practice Assessment of Defense's JTRS-HMS Program:
Table 13: GAO EVM Practice Assessment of Defense's WIN-T Program:
Table 14: GAO EVM Practice Assessment of Homeland Security's ACE
Program:
Table 15: GAO EVM Practice Assessment of Homeland Security's Deepwater
COP Program:
Table 16: GAO EVM Practice Assessment of Homeland Security's WHTI
Program:
Table 17: GAO EVM Practice Assessment of Justice's NGI Program:
Table 18: GAO EVM Practice Assessment of NASA's JWST Project:
Table 19: GAO EVM Practice Assessment of NASA's Juno Project:
Table 20: GAO EVM Practice Assessment of NASA's MSL Project:
Table 21: GAO EVM Practice Assessment of Transportation's ERAM Program:
Table 22: GAO EVM Practice Assessment of Transportation's SBS Program:
Table 23: GAO EVM Practice Assessment of Veterans Affairs' VistA-FM
Program:
Figures:
Figure 1: GAO EV Data Analysis of Agriculture's MIDAS Program:
Figure 2: GAO EV Data Analysis of Commerce's DRIS Program:
Figure 3: GAO EV Data Analysis of Commerce's FDCA Program:
Figure 4: GAO EV Data Analysis of Defense's AOC Program:
Figure 5: GAO EV Data Analysis of Defense's JTRS-HMS Program:
Figure 6: GAO EV Data Analysis of Defense's WIN-T Program:
Figure 7: GAO EV Data Analysis of Homeland Security's ACE Program:
Figure 8: GAO EV Data Analysis of Homeland Security's Deepwater COP
Program:
Figure 9: GAO EV Data Analysis of Homeland Security's WHTI Program:
Figure 10: GAO EV Data Analysis of Justice's NGI Program:
Figure 11: GAO EV Data Analysis of NASA's JWST Project:
Figure 12: GAO EV Data Analysis of NASA's Juno Project:
Figure 13: GAO EV Data Analysis of NASA's MSL Project:
Figure 14: GAO EV Data Analysis of Transportation's ERAM Program:
Figure 15: GAO EV Data Analysis of Transportation's SBS Program:
Figure 16: GAO EV Data Analysis of Veterans Affairs' VistA-FM Program:
Abbreviations:
ACE: Automated Commercial Environment:
ANSI: American National Standards Institute:
AOC: Air and Space Operations Center--Weapon System:
COP: Integrated Deepwater System--Common Operational Picture:
DOD: Department of Defense:
DRIS: Decennial Response Integration System:
EIA: Electronic Industries Alliance:
ERAM: En Route Automation Modernization:
EV: earned value:
EVM: earned value management:
FDCA: Field Data Collection Automation:
IT: information technology:
JTRS-HMS: Joint Tactical Radio System--Handheld, Manpack, Small Form
Fit:
JWST: James Webb Space Telescope:
MIDAS: Farm Program Modernization:
MSL: Mars Science Laboratory:
NASA: National Aeronautics and Space Administration:
NGI: Next Generation Identification:
OMB: Office of Management and Budget:
SBS: Surveillance and Broadcast System:
VistA-FM: Veterans Health Information Systems and Technology
Architecture--Foundations Modernization:
WHTI: Western Hemisphere Travel Initiative:
WIN-T: Warfighter Information Network--Tactical:
[End of section]
United States Government Accountability Office: Washington, DC 20548:
October 8, 2009:
The Honorable Thomas R. Carper:
Chairman:
Subcommittee on Federal Financial Management, Government Information,
Federal Services, and International Security: Committee on Homeland
Security and Governmental Affairs: United States Senate:
Dear Mr. Chairman:
In fiscal year 2009, the federal government planned to spend over $70
billion on information technology (IT) investments, many of which
involve systems and technologies to modernize legacy systems, increase
communication and networking capabilities, and transition to new
systems designed to significantly improve the government's ability to
carry out critical mission functions into the 21st century. To more
effectively manage such investments, the Office of Management and
Budget (OMB) has a number of key initiatives under way--one of which
was established in 2005 and directs agencies to implement earned value
management (EVM).[Footnote 1] EVM is a project management approach
that, if implemented appropriately, provides objective reports of
project status, produces early warning signs of impending schedule
slippages and cost overruns, and provides unbiased estimates of
anticipated costs at completion.
This report responds to your request that we review the federal
government's use of EVM. Specifically, our objectives were to (1)
assess whether key departments and agencies have appropriately
established EVM policies, (2) determine whether these agencies are
adequately using earned value techniques to manage key system
acquisitions, and (3) evaluate the earned value data of these selected
investments to determine their cost and schedule performances.
To address our objectives, we reviewed agency EVM policies and
individual programs' EVM-related documentation, including cost
performance reports and project schedules, from eight agencies and 16
major investments from those agencies, respectively.[Footnote 2] The
eight agencies account for about 75 percent of the planned IT spending
for fiscal year 2009. The 16 programs selected for case study represent
investments with about $3.5 billion in total planned spending for
system development work in fiscal year 2009. We compared the agencies'
policies and practices with federal standards and best practices of
leading organizations to determine the effectiveness of their use of
earned value data in managing IT investments. We also analyzed the
earned value data from the programs to determine whether they are
projected to finish within planned cost and schedule targets. In
addition, we interviewed relevant agency officials, including key
personnel on programs that we selected for case study and officials
responsible for implementing EVM.
We conducted this performance audit from February to October 2009, in
accordance with generally accepted government auditing standards. Those
standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for our findings and
conclusions based on our audit objectives. Appendix I contains further
details about our objectives, scope, and methodology. See also the page
of related products at the end of this report for previous work that we
have done on certain programs in our case studies.
Background:
Each year, OMB and federal agencies work together to determine how much
the government plans to spend on IT projects and how these funds are to
be allocated. Planned federal IT spending in fiscal year 2009 totaled
about $71 billion--of which $22 billion was planned for IT system
development work, and the remainder was planned for operations and
maintenance of existing systems. OMB plays a key role in overseeing
federal agencies' IT investments and how they are managed, stemming
from its functions of assisting the President in overseeing the
preparation of the federal budget and supervising budget preparation in
executive branch agencies. In helping to formulate the President's
spending plans, OMB is responsible for evaluating the effectiveness of
agency programs, policies, and procedures; assessing competing funding
demands among agencies; and setting funding priorities. To carry out
these responsibilities, OMB depends on agencies to collect and report
accurate and complete information; these activities depend, in turn, on
agencies having effective IT management practices.
To drive improvement in the implementation and management of IT
projects, Congress enacted the Clinger-Cohen Act in 1996, expanding the
responsibilities delegated to OMB and agencies under the Paperwork
Reduction Act.[Footnote 3] The Clinger-Cohen Act requires agencies to
engage in performance- and results-based management, and to implement
and enforce IT management policies and guidelines. The act also
requires OMB to establish processes to analyze, track, and evaluate the
risks and results of major capital investments in information systems
made by executive agencies.
Over the past several years, we have reported and testified on OMB's
initiatives to highlight troubled projects,[Footnote 4] justify IT
investments,[Footnote 5] and use project management tools.[Footnote 6]
We have made multiple recommendations to OMB and federal agencies to
improve these initiatives to further enhance the oversight and
transparency of federal IT projects. As a result, OMB recently used
this body of work to develop and implement improved processes to
oversee and increase transparency of IT investments. Specifically, in
June 2009, OMB publicly deployed a Web site that displays dashboards of
all major federal IT investments to provide OMB and others with the
ability to track the progress of these investments over time.
EVM Provides Insight on Program Cost and Schedule:
Given the size and significance of the government's investment in IT,
it is important that projects be managed effectively to ensure that
public resources are wisely invested. Effectively managing projects
entails, among other things, pulling together essential cost, schedule,
and technical information in a meaningful, coherent fashion so that
managers have an accurate view of the program's development status.
Without meaningful and coherent cost and schedule information, program
managers can have a distorted view of a program's status and risks. To
address this issue, in the 1960s, the Department of Defense (DOD)
developed the EVM technique, which goes beyond simply comparing
budgeted costs with actual costs. This technique measures the value of
work accomplished in a given period and compares it with the planned
value of work scheduled for that period and with the actual cost of
work accomplished.
Differences in these values are measured in both cost and schedule
variances. Cost variances compare the value of the completed work
(i.e., the earned value) with the actual cost of the work performed.
For example, if a contractor completed $5 million worth of work and the
work actually cost $6.7 million, there would be a negative $1.7 million
cost variance. Schedule variances are also measured in dollars, but
they compare the earned value of the completed work with the value of
the work that was expected to be completed. For example, if a
contractor completed $5 million worth of work at the end of the month
but was budgeted to complete $10 million worth of work, there would be
a negative $5 million schedule variance. Positive variances indicate
that activities are costing less or are completed ahead of schedule.
Negative variances indicate activities are costing more or are falling
behind schedule. These cost and schedule variances can then be used in
estimating the cost and time needed to complete the program.
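To make these calculations concrete, the following sketch (illustrative
only, and not drawn from this report) computes both variances in Python
using the dollar figures from the examples above; the variable names
follow common earned value conventions rather than terms defined
elsewhere in this report.

# Illustrative sketch: the two variance calculations described above,
# using the example figures from the text (in millions of dollars).

def cost_variance(earned_value, actual_cost):
    # Negative when completed work cost more than its budgeted value.
    return earned_value - actual_cost

def schedule_variance(earned_value, planned_value):
    # Negative when less work was completed than was scheduled.
    return earned_value - planned_value

cv = cost_variance(5.0, 6.7)        # -1.7 ($ million)
sv = schedule_variance(5.0, 10.0)   # -5.0 ($ million)
print(f"Cost variance: {cv:+.1f}M; schedule variance: {sv:+.1f}M")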
Without knowing the planned cost of completed work and work in progress
(i.e., the earned value), it is difficult to determine a program's true
status. Earned value allows for this key information, which provides an
objective view of program status and is necessary for understanding the
health of a program. As a result, EVM can alert program managers to
potential problems sooner than using expenditures alone, thereby
reducing the chance and magnitude of cost overruns and schedule
slippages. Moreover, EVM directly supports the institutionalization of
key processes for acquiring and developing systems and the ability to
effectively manage investments--areas that are often found to be
inadequate on the basis of our assessments of major IT investments.
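The following hypothetical sketch extends the example above to show how
spending that exactly matches the plan can mask a problem that earned
value reveals. All figures are invented for illustration, and the
performance indexes and estimate-at-completion formula shown are
standard EVM projections rather than calculations taken from this
report.

# Hypothetical sketch: expenditures alone versus earned value.
# All figures are invented for illustration (millions of dollars).

planned_value = 5.0   # budgeted cost of work scheduled to date
actual_cost = 5.0     # actual spending to date (matches the plan)
earned_value = 3.0    # budgeted cost of work actually performed
budget_at_completion = 20.0

# Expenditure-only view: spending equals the plan, so no problem shows.
print(f"Spending versus plan: {actual_cost - planned_value:+.1f}M")

# Earned value view: only $3M of the planned $5M of work was completed.
cpi = earned_value / actual_cost     # cost performance index = 0.60
spi = earned_value / planned_value   # schedule performance index = 0.60
# A standard independent estimate at completion assumes the cost
# efficiency to date (CPI) continues for the remaining work.
eac = actual_cost + (budget_at_completion - earned_value) / cpi
print(f"CPI={cpi:.2f}, SPI={spi:.2f}, estimate at completion={eac:.1f}M")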
Federal Guidance Calls for Using EVM to Improve IT Management:
In August 2005, OMB issued guidance outlining steps that agencies must
take for all major and high-risk development projects to better ensure
improved execution and performance and to promote more effective
oversight through the implementation of EVM.[Footnote 7] Specifically,
this guidance directs agencies to (1) develop comprehensive policies to
ensure that their major IT investments are using EVM to plan and manage
development; (2) include a provision and clause in major acquisition
contracts or agency in-house project charters directing the use of an
EVM system that is compliant with the American National Standards
Institute (ANSI) standard;[Footnote 8] (3) provide documentation
demonstrating that the contractor's or agency's in-house EVM system
complies with the national standard; (4) conduct periodic surveillance
reviews; and (5) conduct integrated baseline reviews[Footnote 9] on
individual programs to finalize their cost, schedule, and performance
goals.
Building on OMB's requirements, in March 2009, we issued a guide on
best practices for estimating and managing program costs.[Footnote 10]
This guide highlights the policies and practices adopted by leading
organizations to implement an effective EVM program. Specifically, in
the guide, we identify the need for organizational policies that
establish clear criteria for which programs are required to use EVM,
specify compliance with the ANSI standard, require a standard product-
oriented structure for defining work products, require integrated
baseline reviews, provide for specialized training, establish criteria
and conditions for rebaselining programs, and require an ongoing
surveillance function. In addition, we identify key practices that
individual programs can use to ensure that they establish a sound EVM
system, that the earned value data are reliable, and that the data are
used to support decision making.
Prior Reviews on Agency Use of EVM to Acquire and Manage IT Systems
Have Identified Weaknesses:
We have previously reported on the weaknesses associated with the
implementation of sound EVM programs at various agencies, as well as on
the lack of aggressive management action to correct poor cost and
schedule performance trends based on earned value data for major system
acquisition programs:
* In July 2008, we reported that the Federal Aviation Administration's
EVM policy was not fully consistent with best practices.[Footnote 11]
For example, the agency required its program managers to obtain EVM
training, but did not enforce completion of this training or require
other relevant personnel to obtain this training. In addition, although
the agency was using EVM to manage IT acquisitions, not all programs
were ensuring that their earned value data were reliable. Specifically,
of the three programs collecting EVM data, only one program adequately
ensured that its earned value data were reliable. As a result, the
agency faced an increased risk that managers were not getting the
information they needed to effectively manage the programs. In response
to our findings and recommendations, the Federal Aviation
Administration reported that it had initiatives under way to improve
its EVM oversight processes.
* In September 2008, we reported that the Department of the Treasury's
EVM policy was not fully consistent with best practices.[Footnote 12]
For example, while the department's policy addressed some practices,
such as establishing clear criteria for which programs are to use EVM,
it did not address others, such as requiring and enforcing EVM
training. In addition, six programs at Treasury and its bureaus were
not consistently implementing practices needed for establishing a
comprehensive EVM system. For example, when executing work plans and
recording actual costs, a key practice for ensuring that the data
resulting from the EVM system are reliable, only two of the six
investments that we reviewed incorporated government costs with
contractor costs. As a result, we reported that Treasury may not be
able to effectively manage its critical programs. In response to our
findings and recommendations, Treasury reported that it would release a
revised EVM policy and further noted that initiatives to improve EVM-
related training were under way.
* In a series of reports and testimonies from September 2004 to June
2009, we reported that the National Oceanic and Atmospheric
Administration's National Polar-orbiting Operational Environmental
Satellite System program was likely to overrun its contract at
completion on the basis of our analysis of contractor EVM data.
[Footnote 13] Specifically, the program had delayed key milestones and
experienced technical issues in the development of key sensors, which
we stated would affect cost and schedule estimates. As predicted, in
June 2006 the program was restructured, decreasing its complexity,
delaying the availability of the first satellite by 3 to 5 years, and
increasing its cost estimate from $6.9 billion to $12.5 billion.
However, the program has continued to face significant technical and
management issues. As of June 2009, launch of the first satellite was
delayed by 14 months, and our current projected total cost estimate is
approximately $15 billion. We made multiple recommendations to improve
this program, including establishing a realistic time frame for
revising the cost and schedule baselines, developing plans to mitigate
the risk of gaps in satellite continuity, and tracking the program
executive committee's action items from inception to closure.
Agencies' EVM Policies Are Not Comprehensive:
While the eight agencies we reviewed have established policies
requiring the use of EVM on their major IT investments, none of these
policies are fully consistent with best practices, such as
standardizing the way work products are defined. We recently reported
[Footnote 14] that leading organizations establish EVM policies that:
* establish clear criteria for which programs are to use EVM;
* require programs to comply with the ANSI standard;
* require programs to use a product-oriented structure for defining
work products;
* require programs to conduct detailed reviews of expected costs,
schedules, and deliverables (called an integrated baseline review);
* require and enforce EVM training;
* define when programs may revise cost and schedule baselines (called
rebaselining); and:
* require system surveillance--that is, routine validation checks to
ensure that major acquisitions are continuing to comply with agency
policies and standards.
Table 1 describes the key components of an effective EVM policy.
Table 1: Key Components of an Effective EVM Policy:
Component: Clear criteria for implementing EVM on all major IT
investments; Description: OMB requires agencies to implement EVM on all
major IT investments and ensure that the corresponding contracts
include provisions for using EVM systems. However, each agency is
responsible for establishing its own definition of a "major" IT
investment. As a result, agencies should clearly define the conditions
under which a new or ongoing acquisition program is required to
implement EVM.
Component: Compliance with the ANSI standard; Description: OMB requires
agencies to use EVM systems that are compliant with a national standard
developed by ANSI and EIA (ANSI/EIA-748-B). This standard consists of
32 guidelines that an organization can use to establish a sound EVM
system, ensure that the data resulting from the EVM system are
reliable, and use earned value data for decision-making purposes.
Component: Standard structure for defining the work products;
Description: The work breakdown structure defines the work necessary to
accomplish a program's objectives. It is the first criterion stated in
the ANSI standard and the basis for planning the program baseline and
assigning responsibility for the work. It is a best practice to
establish a product-oriented work breakdown structure because it allows
a program to track cost and schedule by defined deliverables, such as a
hardware or software component. This allows a program manager to more
precisely identify which components are causing cost or schedule
overruns and to more effectively mitigate the root cause of the
overruns. Standardizing the work breakdown structure is also considered
a best practice because it enables an organization to collect and share
data among programs.
Component: Integrated baseline review; Description: An integrated
baseline review is an evaluation of the performance measurement
baseline--the foundation for an EVM system--to determine whether all
program requirements have been addressed, risks have been identified,
mitigation plans are in place, and available and planned resources are
sufficient to complete the work. The main goal of an integrated
baseline review is to identify potential program risks, including risks
associated with costs, management processes, resources, schedules, and
technical issues.
Component: Training requirements; Description: EVM training should be
provided and enforced for all personnel with investment oversight and
program management responsibilities. Executive personnel with oversight
responsibilities need to understand EVM terms and analysis products to
make sound investment decisions. Program managers and staff need to be
able to interpret and validate earned value data to effectively manage
deliverables, costs, and schedules.
Component: Rebaselining criteria; Description: At times, management may
conclude that the remaining budget and schedule targets for completing
a program (including the contract) are significantly insufficient, and
that the current baseline is no longer valid for realistic performance
measurement. Management may decide that a revised baseline for the
program is needed to restore its control of the remaining work effort.
An agency's rebaselining criteria should define acceptable reasons for
rebaselining and require programs to (1) explain why the current plan
is no longer feasible and what measures will be implemented to prevent
recurrence and (2) develop a realistic cost and schedule estimate for
remaining work that has been validated and spread over time to the new
plan.
Component: System surveillance;
Description: Surveillance is the process of reviewing a program's
(including contractors') EVM system as it is applied to one or more
programs. The purpose of surveillance is to focus on how well a program
is using its EVM system to manage cost, schedule, and technical
performances. The following two goals are associated with EVM system
surveillance: (1) ensure that the program is following corporate
processes and procedures and (2) confirm that the program's processes
and procedures continue to satisfy ANSI guidelines.
Source: GAO-09-3SP.
[End of table]
The eight agencies we reviewed do not have comprehensive EVM policies.
Specifically, none of the agencies' policies are fully consistent with
all seven key components of an effective EVM policy. Table 2 provides a
detailed assessment, by agency, and a discussion of the agencies'
policies follows the table.
Table 2: Assessment of Key Agencies' EVM Policies:
Agency: Agriculture;
Clear criteria for implementing EVM on all major IT investments: The
agency addressed all EVM practices in this policy area; Compliance with
the ANSI standard: The agency addressed all EVM practices in this
policy area; Standard structure for defining the work products: The
agency did not address any EVM practices in this policy area;
Integrated baseline review: The agency addressed all EVM practices in
this policy area; Training requirements: The agency addressed some EVM
practices in this policy area; Rebaselining criteria: The agency
addressed some EVM practices in this policy area; System surveillance:
The agency addressed all EVM practices in this policy area.
Agency: Commerce;
Clear criteria for implementing EVM on all major IT investments: The
agency addressed all EVM practices in this policy area; Compliance with
the ANSI standard: The agency addressed all EVM practices in this
policy area; Standard structure for defining the work products: The
agency did not address any EVM practices in this policy area;
Integrated baseline review: The agency addressed all EVM practices in
this policy area; Training requirements: The agency addressed all EVM
practices in this policy area; Rebaselining criteria: The agency
addressed all EVM practices in this policy area; System surveillance:
The agency addressed all EVM practices in this policy area.
Agency: Defense;
Clear criteria for implementing EVM on all major IT investments: The
agency addressed all EVM practices in this policy area; Compliance with
the ANSI standard: The agency addressed all EVM practices in this
policy area; Standard structure for defining the work products: The
agency addressed all EVM practices in this policy area; Integrated
baseline review: The agency addressed all EVM practices in this policy
area; Training requirements: The agency addressed some EVM practices in
this policy area; Rebaselining criteria: The agency addressed all EVM
practices in this policy area; System surveillance: The agency
addressed all EVM practices in this policy area.
Agency: Homeland Security;
Clear criteria for implementing EVM on all major IT investments: The
agency addressed all EVM practices in this policy area; Compliance with
the ANSI standard: The agency addressed all EVM practices in this
policy area; Standard structure for defining the work products: The
agency addressed some EVM practices in this policy area; Integrated
baseline review: The agency addressed all EVM practices in this policy
area; Training requirements: The agency addressed some EVM practices in
this policy area; Rebaselining criteria: The agency addressed some EVM
practices in this policy area; System surveillance: The agency
addressed all EVM practices in this policy area.
Agency: Justice;
Clear criteria for implementing EVM on all major IT investments: The
agency addressed all EVM practices in this policy area; Compliance with
the ANSI standard: The agency addressed all EVM practices in this
policy area; Standard structure for defining the work products: The
agency addressed some EVM practices in this policy area; Integrated
baseline review: The agency addressed all EVM practices in this policy
area; Training requirements: The agency addressed some EVM practices in
this policy area; Rebaselining criteria: The agency addressed all EVM
practices in this policy area; System surveillance: The agency
addressed all EVM practices in this policy area.
Agency: National Aeronautics and Space Administration; Clear criteria
for implementing EVM on all major IT investments: The agency addressed
all EVM practices in this policy area; Compliance with the ANSI
standard: The agency addressed all EVM practices in this policy area;
Standard structure for defining the work products: The agency addressed
some EVM practices in this policy area; Integrated baseline review: The
agency addressed all EVM practices in this policy area; Training
requirements: The agency addressed some EVM practices in this policy
area; Rebaselining criteria: The agency addressed some EVM practices in
this policy area; System surveillance: The agency addressed all EVM
practices in this policy area.
Agency: Transportation;
Clear criteria for implementing EVM on all major IT investments: The
agency addressed all EVM practices in this policy area; Compliance with
the ANSI standard: The agency addressed some EVM practices in this
policy area; Standard structure for defining the work products: The
agency did not address any EVM practices in this policy area;
Integrated baseline review: The agency addressed all EVM practices in
this policy area; Training requirements: The agency addressed some EVM
practices in this policy area; Rebaselining criteria: The agency
addressed some EVM practices in this policy area; System surveillance:
The agency addressed all EVM practices in this policy area.
Agency: Veterans Affairs;
Clear criteria for implementing EVM on all major IT investments: The
agency addressed some EVM practices in this policy area; Compliance
with the ANSI standard: The agency addressed all EVM practices in this
policy area; Standard structure for defining the work products: The
agency did not address any EVM practices in this policy area;
Integrated baseline review: The agency addressed all EVM practices in
this policy area; Training requirements: The agency addressed some EVM
practices in this policy area; Rebaselining criteria: The agency
addressed some EVM practices in this policy area; System surveillance:
The agency addressed all EVM practices in this policy area.
Source: GAO analysis of agency data.
[End of table]
* Criteria for implementing EVM on all major IT investments: Seven of
the eight agencies fully defined criteria for implementing EVM on major
IT investments. The agencies with sound policies typically defined
"major" investments as those exceeding a certain cost threshold, and,
in some cases, agencies defined lower tiers of investments requiring
reduced levels of EVM compliance. Veterans Affairs only partially met
this key practice because its policy did not clearly state whether
programs or major subcomponents of programs (projects and subprojects)
had to comply with EVM requirements. According to agency officials,
this lack of clarity may cause EVM to be inconsistently applied across
the investments. Without an established policy that clearly defines the
conditions under which new or ongoing acquisition programs are required
to implement EVM, these agencies cannot ensure that EVM is being
appropriately applied on their major investments.
* Compliance with the ANSI standard: Seven of the eight agencies
required that all work activities performed on major investments be
managed by an EVM system that complies with industry standards. One
agency, Transportation, partially met this key practice because its
policy contained inconsistent criteria for when investments must comply
with standards. Specifically, in one section, the policy requires a
certain class of investments to adhere to a subset of the ANSI
standard; however, in another section, the policy merely states that
the investments must comply with general EVM principles. This latter
section is vague and could be interpreted in multiple ways, either more
broadly or narrowly than the specified subset of the ANSI standard.
Without consistent criteria on investment compliance, Transportation
may be unable to ensure that the work activities for some of its major
investments are establishing sound EVM systems that produce reliable
earned value data and provide the basis for informed decision making.
* Standard structure for defining the work products: DOD was the only
agency to fully meet this key practice by developing and requiring the
use of standard product-oriented work breakdown structures. Four
agencies did not meet this key practice, while the other three only
partially complied. Among those, National Aeronautics and Space
Administration (NASA) policy requires mission (or space flight)
projects to use a standardized product-oriented work
breakdown structure; however, IT projects do not have such a
requirement. NASA officials reported that they are working to develop a
standard structure for their IT projects; however, they were unable to
provide a time frame for completion. Homeland Security and Justice have
yet to standardize their product structures.
Among the agencies that did not implement this key practice, reasons
included, among other things, the difficulty in establishing a standard
structure for component agencies that conduct different types of work
with varying complexity. While this presents a challenge, agencies
could adopt an approach similar to DOD's and develop various standard
work structures based on the kinds of work being performed by the
various component agencies (e.g., automated information system, IT
infrastructure, and IT services). Without fully implementing a standard
product-oriented structure (or structures), agencies will be unable to
collect and share data among programs and may not have the information
they need to make decisions on specific program components.
* Integrated baseline review: All eight agencies required major IT
investments to conduct an integrated baseline review to ensure that
program baselines fully reflect the scope of work to be performed, key
risks, and available resources. For example, DOD required that these
reviews occur within 6 months of contract award and after major
modifications have taken place, among other things.
* Training requirements: Commerce was the only agency to fully meet
this key practice by requiring and enforcing EVM training for all
personnel with investment oversight and program management
responsibilities. Several of the partially compliant agencies required
EVM training for project managers--but did not extend this requirement
to other program management personnel or executives with investment
oversight responsibilities. Many agencies told us that it would be a
significant challenge to require and enforce EVM training for all
relevant personnel, especially at the executive level. Instead, most
agencies have made voluntary EVM training courses available agencywide.
However, without comprehensive EVM training requirements and
enforcement, agencies cannot effectively ensure that programs have the
appropriate skills to validate and interpret EVM data, and that their
executives will be able to make fully informed decisions based on the
EVM analysis.
* Rebaselining criteria: Three of the eight agencies fully met this key
practice. For example, the Justice policy outlines acceptable reasons
for rebaselining, such as when the baseline no longer reflects the
current scope of work being performed, and requires investments to
explain why their current plans are no longer feasible and to develop
realistic cost and schedule estimates for remaining work. Among the
five partially compliant agencies, Agriculture and Veterans Affairs
provided policies, but in draft form; NASA was in the process of
updating its policy to include more detailed criteria for rebaselining;
and Homeland Security did not define acceptable reasons but did require
an explanation of the root causes for cost and schedule variances and
the development of new cost and schedule estimates. In several cases,
agencies were unaware of the detailed rebaselining criteria to be
included in their EVM policies. Until their policies fully meet this
key practice, agencies face an increased risk that their executive
managers will make decisions about programs with incomplete
information, and that these programs will continue to overrun costs and
schedules because their underlying problems have not been identified or
addressed.
* System surveillance: All eight agencies required ongoing EVM system
surveillance of all programs (and contracts with EVM requirements) to
ensure their continued compliance with industry standards. For example,
Agriculture required its surveillance teams to submit reports--to the
programs and the Chief Information Officer--with documented findings
and recommendations regarding compliance. Furthermore, the agency also
established a schedule to show when EVM surveillance is expected to
take place on each of its programs.
Agencies' Key Acquisition Programs Are Using EVM, but Are Not
Consistently Implementing Key Practices:
Our studies of 16 major system acquisition programs showed that all
agencies are using EVM; however, the extent of that implementation
varies among the programs. Our work on best practices in EVM identified
11 key practices that are implemented on acquisition programs of
leading organizations. These practices can be organized into three
management areas: establishing a sound EVM system, ensuring reliable
data, and using earned value data to make decisions. Table 3 lists
these 11 key EVM practices by management area.
Table 3: Eleven Key EVM Practices for System Acquisition Programs:
Program management area of responsibility: Establish a comprehensive
EVM system; EVM practice:
* Define the scope of effort using a work breakdown structure.
* Identify who in the organization will perform the work.
* Schedule the work.
* Estimate the labor and material required to perform the work and
authorize the budgets, including management reserve.
* Determine objective measure of earned value.
* Develop the performance measurement baseline.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable; EVM practice:
* Execute the work plan and record all costs.
* Analyze EVM performance data and record variances from the
performance measurement baseline plan.
* Forecast estimates at completion.
Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes; EVM practice:
* Take management action to mitigate risks.
* Update the performance measurement baseline as changes occur.
Source: GAO-09-3SP.
[End of table]
Of the 16 case study programs, 3 demonstrated a full level of maturity
in all three management areas; 3 had full maturity in two areas; and 4
had reached full maturity in one area. The remaining 6 programs did not
demonstrate full levels of maturity in any of the management areas;
however, in all but 1 case, they were able to demonstrate partial
capabilities in each of the three areas. Table 4 identifies the 16 case
study programs and summarizes our results for these programs. Following
the table is a summary of the programs' implementation of each key area
of EVM program management responsibility. Additional details on the 16
case studies are provided in appendix II.
Table 4: Assessment of EVM Practices for Case Study Programs:
Agency: Agriculture;
Program: Farm Program Modernization; Establishing a comprehensive EVM
system: The program partially implemented the EVM practices in this
program management area; Ensuring that data resulting from the EVM
system are reliable: The program fully implemented all EVM practices in
this program management area; Ensuring that the program management team
is using earned value data for decision-making purposes: The program
fully implemented all EVM practices in this program management area.
Agency: Commerce;
Program: Decennial Response Integration System; Establishing a
comprehensive EVM system: The program fully implemented all EVM
practices in this program management area; Ensuring that data resulting
from the EVM system are reliable: The program fully implemented all EVM
practices in this program management area; Ensuring that the program
management team is using earned value data for decision-making
purposes: The program fully implemented all EVM practices in this
program management area.
Agency: Commerce;
Program: Field Data Collection Automation; Establishing a comprehensive
EVM system: The program partially implemented the EVM practices in this
program management area; Ensuring that data resulting from the EVM
system are reliable: The program partially implemented the EVM
practices in this program management area; Ensuring that the program
management team is using earned value data for decision-making
purposes: The program partially implemented the EVM practices in this
program management area.
Agency: Defense;
Program: Air and Space Operations Center--Weapon System; Establishing a
comprehensive EVM system: The program partially implemented the EVM
practices in this program management area; Ensuring that data resulting
from the EVM system are reliable: The program partially implemented the
EVM practices in this program management area; Ensuring that the
program management team is using earned value data for decision-making
purposes: The program fully implemented all EVM practices in this
program management area.
Agency: Defense;
Program: Joint Tactical Radio System--Handheld, Manpack, Small Form
Fit; Establishing a comprehensive EVM system: The program partially
implemented the EVM practices in this program management area; Ensuring
that data resulting from the EVM system are reliable: The program fully
implemented all EVM practices in this program management area; Ensuring
that the program management team is using earned value data for
decision-making purposes: The program fully implemented all EVM
practices in this program management area.
Agency: Defense;
Program: Warfighter Information Network--Tactical; Establishing a
comprehensive EVM system: The program partially implemented the EVM
practices in this program management area; Ensuring that data resulting
from the EVM system are reliable: The program fully implemented all EVM
practices in this program management area; Ensuring that the program
management team is using earned value data for decision-making
purposes: The program partially implemented the EVM practices in this
program management area.
Agency: Homeland Security;
Program: Automated Commercial Environment; Establishing a comprehensive
EVM system: The program partially implemented the EVM practices in this
program management area; Ensuring that data resulting from the EVM
system are reliable: The program partially implemented the EVM
practices in this program management area; Ensuring that the program
management team is using earned value data for decision-making
purposes: The program fully implemented all EVM practices in this
program management area.
Agency: Homeland Security;
Program: Integrated Deepwater System--Common Operational
Picture; Establishing a comprehensive EVM system: The program partially
implemented the EVM practices in this program management area; Ensuring
that data resulting from the EVM system are reliable: The program
partially implemented the EVM practices in this program management
area; Ensuring that the program management team is using earned value
data for decision-making purposes: The program partially implemented
the EVM practices in this program management area.
Agency: Homeland Security;
Program: Western Hemisphere Travel Initiative; Establishing a
comprehensive EVM system: The program partially implemented the EVM
practices in this program management area; Ensuring that data resulting
from the EVM system are reliable: The program partially implemented the
EVM practices in this program management area; Ensuring that the
program management team is using earned value data for decision-making
purposes: The program partially implemented the EVM practices in this
program management area.
Agency: Justice;
Program: Next Generation Identification; Establishing a comprehensive
EVM system: The program fully implemented all EVM practices in this
program management area; Ensuring that data resulting from the EVM
system are reliable: The program fully implemented all EVM practices in
this program management area; Ensuring that the program management team
is using earned value data for decision-making purposes: The program
fully implemented all EVM practices in this program management area.
Agency: National Aeronautics and Space Administration; Program: James
Webb Space Telescope; Establishing a comprehensive EVM system: The
program partially implemented the EVM practices in this program
management area; Ensuring that data resulting from the EVM system are
reliable: The program partially implemented the EVM practices in this
program management area; Ensuring that the program management team is
using earned value data for decision-making purposes: The program
partially implemented the EVM practices in this program management
area.
Agency: National Aeronautics and Space Administration; Program: Juno;
Establishing a comprehensive EVM system: The program partially
implemented the EVM practices in this program management area; Ensuring
that data resulting from the EVM system are reliable: The program fully
implemented all EVM practices in this program management area; Ensuring
that the program management team is using earned value data for
decision-making purposes: The program fully implemented all EVM
practices in this program management area.
Agency: National Aeronautics and Space Administration; Program: Mars
Science Laboratory; Establishing a comprehensive EVM system: The
program partially implemented the EVM practices in this program
management area; Ensuring that data resulting from the EVM system are
reliable: The program partially implemented the EVM practices in this
program management area; Ensuring that the program management team is
using earned value data for decision-making purposes: The program
partially implemented the EVM practices in this program management
area.
Agency: Transportation;
Program: En Route Automation Modernization; Establishing a
comprehensive EVM system: The program partially implemented the EVM
practices in this program management area; Ensuring that data resulting
from the EVM system are reliable: The program partially implemented the
EVM practices in this program management area; Ensuring that the
program management team is using earned value data for decision-making
purposes: The program fully implemented all EVM practices in this
program management area.
Agency: Transportation;
Program: Surveillance and Broadcast System; Establishing a
comprehensive EVM system: The program fully implemented all EVM
practices in this program management area; Ensuring that data resulting
from the EVM system are reliable: The program fully implemented all EVM
practices in this program management area; Ensuring that the program
management team is using earned value data for decision-making
purposes: The program fully implemented all EVM practices in this
program management area.
Agency: Veterans Affairs;
Program: Veterans Health Information Systems and Technology
Architecture--Foundations Modernization; Establishing a comprehensive
EVM system: The program partially implemented the EVM practices in this
program management area; Ensuring that data resulting from the EVM
system are reliable: The program partially implemented the EVM
practices in this program management area; Ensuring that the program
management team is using earned value data for decision-making
purposes: The program did not implement the EVM practices in this
program management area.
Source: GAO analysis of program data.
[End of table]
Most Programs Did Not Fully Establish Comprehensive EVM Systems:
Most programs did not fully implement the key practices needed to
establish comprehensive EVM systems. Of the 16 programs, 3 fully
implemented the practices in this program management area, and 13
partially implemented the practices. The Decennial Response Integration
System, Next Generation Identification, and Surveillance and Broadcast
System programs demonstrated that they had fully implemented the six
practices in this area. For example, our analysis of the Decennial
Response Integration System program schedule showed that activities
were properly sequenced, realistic durations were established, and
labor and material resources were assigned. The Surveillance and
Broadcast System program conducted a detailed integrated baseline
review to validate its performance baseline. It was also the only
program to fully institutionalize EVM at the program level--meaning
that it collects performance data on both contractor and government
work efforts--to gain a complete view of program status.
Thirteen programs demonstrated that they partially implemented the six
key practices in this area. In most cases, programs had work breakdown
structures that defined work products to an appropriate level of detail
and had identified the personnel responsible for delivering these work
products. However, for all 13 programs, the project schedules contained
issues that undermined the quality of their performance baselines.
Weaknesses in these schedules included improperly sequenced
activities, such as incomplete or missing linkages between tasks;
activities with no assigned resources; invalid critical paths (the
critical path is the sequence of dependent activities that determines
the project's planned completion date, so a delay to any activity on
it delays the entire project); and the excessive or unjustified use of
constraints, which impairs a program's ability to forecast how ongoing
delays will affect future planned work. These weaknesses are of
concern because the schedule serves as the performance baseline
against which earned value is measured; poor schedules therefore
undermine the overall quality of a program's EVM system.
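To make the critical-path concept concrete, the following minimal sketch computes a schedule's total duration and critical path from task durations and predecessor links. The tasks, durations, and links are hypothetical, and real program schedules are maintained in dedicated scheduling tools; the point is that an omitted predecessor link silently changes the computed path, which is how the sequencing weaknesses described above corrupt a performance baseline.

```python
# Minimal critical-path sketch (hypothetical tasks, durations, and links).
# If a predecessor link is omitted, the affected task "floats" and the
# computed critical path--and thus the baseline--no longer reflects reality.

durations = {"design": 20, "build": 45, "integrate": 15, "test": 30}  # workdays
predecessors = {
    "design": [],
    "build": ["design"],
    "integrate": ["build"],
    "test": ["integrate"],
}

def critical_path(durations, predecessors):
    """Return (total duration, tasks on the longest dependency chain)."""
    finish, prior_task = {}, {}

    def earliest_finish(task):
        if task not in finish:
            preds = predecessors[task]
            start = max((earliest_finish(p) for p in preds), default=0)
            finish[task] = start + durations[task]
            prior_task[task] = max(preds, key=lambda p: finish[p]) if preds else None
        return finish[task]

    end = max(durations, key=earliest_finish)
    chain = []
    while end is not None:
        chain.append(end)
        end = prior_task[end]
    return finish[chain[0]], list(reversed(chain))

total, chain = critical_path(durations, predecessors)
print(total, chain)  # 110 ['design', 'build', 'integrate', 'test']
```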
Other key weaknesses included the following examples:
* Nine programs did not adequately determine an objective measure of
earned value or develop the performance baseline; these key practices
are most appropriately addressed through a comprehensive integrated
baseline review, which none of the nine fully performed. For
example, the Air and Space Operations Center--Weapon System program
conducted an integrated baseline review in May 2007 to validate one
segment of work contained in the baseline; however, the program had not
conducted subsequent reviews for the remaining work because doing so
would preclude staff from completing their normal work activities.
Other reasons cited by the programs for not performing these reviews
included the lack of a fully defined scope of work and management's
decision to rely on ongoing EVM surveillance to satisfy these
practices. Without a comprehensive integrated baseline review,
programs cannot be sure that their baseline plans are valid, that all
significant risks contained in those plans have been identified and
mitigated, and that the metrics used to measure progress on planned
work elements are appropriate.
* Four programs did not define the scope of effort using a work
breakdown structure. For example, the Veterans Health Information
Systems and Technology Architecture--Foundations Modernization program
provided a list of its subprograms; however, it did not define the
scope of the detailed work elements that make up each subprogram.
Without a work breakdown structure, programs lack a basis for planning
the performance baseline and assigning responsibility for that work,
both of which are necessary to accomplish a program's objectives (a
minimal illustration of such a structure follows this list).
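As a minimal illustration of the structure these four programs lacked, the sketch below represents a work breakdown structure as a tree whose leaf elements, the work packages, carry budgets and named owners, and rolls the budgets up to the program level. All element names, amounts, and owners shown are hypothetical.

```python
# Hypothetical work breakdown structure (WBS). Leaf elements (work
# packages) carry a budget and a responsible owner; parent elements are
# the sum of their children. Rolling budgets up the tree gives the basis
# for a performance baseline; without the tree there is nothing to roll up.

wbs = {
    "1.1 System design": {"budget": 4.0, "owner": "Design lead"},
    "1.2 Development": {
        "1.2.1 Software build": {"budget": 9.5, "owner": "Development lead"},
        "1.2.2 Integration": {"budget": 3.0, "owner": "Integration lead"},
    },
    "1.3 Test and evaluation": {"budget": 2.5, "owner": "Test lead"},
}

def rollup(element):
    """Sum leaf budgets beneath a WBS element (dollars in millions)."""
    if "budget" in element:
        return element["budget"]
    return sum(rollup(child) for child in element.values())

print(rollup(wbs))  # 19.0
```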
Many Programs Did Not Fully Implement Practices to Ensure Data
Reliability:
Many programs did not fully ensure that their EVM data were reliable.
Of the 16 programs, 7 fully implemented the practices for ensuring the
reliability of the prime contractor and government performance data,
and 9 partially implemented the practices. All 7 programs that
demonstrated full implementation conduct monthly reviews of earned
value data with technical engineering staff and other key personnel to
ensure that the data are consistent with actual performance; perform
detailed performance trend analyses to track program progress, cost,
and schedule drivers; and make estimates of cost at completion. Four
programs that we had previously identified as having schedule
weaknesses (Farm Program Modernization; Joint Tactical Radio System--
Handheld, Manpack, Small Form Fit; Juno; and Warfighter Information
Network--Tactical) were aware of these issues and had sufficient
controls in place to mitigate them, helping to ensure that their
earned value data are reliable.
Nine programs partially implemented the three practices for ensuring
that earned value data are reliable. In each case, the program had
processes in place to review earned value data (from monthly contractor
EVM reports in all but one case), identify and record cost and schedule
variances, and forecast estimates at completion. However, 5 of these
programs did not adequately analyze EVM performance data and properly
record variances from the performance baseline. For example, 2 programs
did not adequately document justifications for cost and schedule
variances, including root causes, potential impacts, and corrective
actions. Other weaknesses in this area included anomalies in monthly
performance reports, such as negative dollars reported as spent for
work performed, which undermine the validity of the performance data.
In addition, 7 of these programs did not demonstrate that they could
adequately execute the work plan and record costs; among other things,
they were unaware of the schedule weaknesses we identified and lacked
the internal controls needed to address those weaknesses and improve
the reliability of their earned value data. Lastly, 2 of these
programs could not adequately forecast estimates at completion due, in
part, to anomalies in the prime contractors' EVM reports, combined
with the weaknesses contained in their project schedules.
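The review practices discussed in this section rest on a handful of standard earned value calculations. The sketch below applies them to one month of hypothetical cumulative data and flags the negative-actual-cost anomaly noted above; the figures are illustrative only, and programs use several estimate-at-completion formulas, of which one common variant is shown.

```python
# Standard EVM metrics from hypothetical cumulative data (dollars in
# millions). PV: planned value; EV: earned value; AC: actual cost;
# BAC: budget at completion.

def evm_metrics(pv, ev, ac, bac):
    cv = ev - ac       # cost variance (negative means overrun)
    sv = ev - pv       # schedule variance (negative means behind plan)
    cpi = ev / ac      # cost performance index
    spi = ev / pv      # schedule performance index
    eac = bac / cpi    # one common estimate at completion
    return {"CV": cv, "SV": sv, "CPI": round(cpi, 2),
            "SPI": round(spi, 2), "EAC": round(eac, 1)}

def anomalous_months(monthly_actuals):
    """Flag months reporting negative actual costs, an anomaly that
    undermines the validity of the performance data."""
    return [month for month, ac in monthly_actuals.items() if ac < 0]

print(evm_metrics(pv=120.0, ev=110.0, ac=125.0, bac=450.0))
# {'CV': -15.0, 'SV': -10.0, 'CPI': 0.88, 'SPI': 0.92, 'EAC': 511.4}
print(anomalous_months({"Jan": 10.2, "Feb": -1.4, "Mar": 9.8}))  # ['Feb']
```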
Most Programs Used Earned Value Data for Decision-making Purposes:
Programs varied in the extent to which they used earned value data to
make decisions. Of the 16 programs, 9 fully implemented the practices
for using earned value data for decision making, 6 partially
implemented them, and 1 did not implement them. Among the 9 programs
that fully implemented these practices, the Automated Commercial
Environment and Juno programs both integrated their EVM and risk
management processes to support the
program manager in making better decisions. The Automated Commercial
Environment program actively recorded risks associated with major
variances from the EVM reports in the program's risk register. Juno
further used the earned value data to analyze threats against remaining
management reserve and to estimate the cost impact of these threats.
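A minimal sketch of this kind of integration follows, with all thresholds, risk entries, and reserve figures hypothetical: significant cost variances from the EVM report are recorded in the risk register, and the expected cost impact of open risks is compared against remaining management reserve.

```python
# Hypothetical linkage between EVM variances and a risk register, with a
# management-reserve threat check of the kind described for Juno.
# Dollars in millions.

VARIANCE_THRESHOLD = -2.0  # record a risk when a cost variance falls below this

risk_register = [
    {"risk": "Instrument rework", "probability": 0.6, "impact": 8.0},
    {"risk": "Late component delivery", "probability": 0.3, "impact": 5.0},
]

def record_variance_risk(register, wbs_element, cv):
    """Add a significant cost variance from the EVM report to the register."""
    if cv < VARIANCE_THRESHOLD:
        register.append({"risk": f"Cost variance on {wbs_element}",
                         "probability": 1.0, "impact": -cv})

def reserve_exposure(register, management_reserve):
    """Compare the expected cost impact of open risks with remaining reserve."""
    expected = sum(r["probability"] * r["impact"] for r in register)
    return expected, management_reserve - expected

record_variance_risk(risk_register, "1.2.1 Software build", cv=-3.5)
expected, headroom = reserve_exposure(risk_register, management_reserve=12.0)
print(f"Expected threat: ${expected:.1f}M; reserve headroom: ${headroom:.1f}M")
# Expected threat: $9.8M; reserve headroom: $2.2M
```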
Six programs demonstrated limited capabilities in using earned value
data for making decisions. In most cases, these programs included
earned value performance trend data in monthly program management
review briefings. However, most of these programs' processes for
taking management action to address the cost and schedule drivers
behind poor trends were ad hoc and separate from their risk management
processes; in most cases, the risks and issues identified in the EVM
reports did not correspond to the risks contained in the program risk
registers. In addition, 4 of these programs were not able to
adequately update the performance baseline as changes occurred
because, in many cases, the original baseline was not appropriately
validated. For example, the Mars Science Laboratory program recently
updated its performance baseline as part of a replan effort. However,
without validating the original and current baselines with a project-
level integrated baseline review, it is unclear whether the changes to
the baseline were reasonable, and whether the risks assumed in the
baseline have been identified and appropriately mitigated.
One program (Veterans Health Information Systems and Technology
Architecture--Foundations Modernization) was not using earned value
data for decision making. Specifically, the program did not actively
manage earned value performance trends, nor were these data
incorporated into programwide management reviews.
Inconsistent Implementation Is Due in Part to Weaknesses in Policy and
Lack of Enforcement:
The inconsistent application of EVM across the investments exists in
part because of the weaknesses we previously identified in the eight
agencies' policies, as well as a lack of enforcement of the EVM policy
components already in place. For example, deficiencies in all three
management areas can be attributed, in part, to a lack of
comprehensive EVM training requirements, a policy component that most
agencies did not fully address. The only 3 programs that had fully
implemented all key EVM practices either had comprehensive training
requirements in their agency EVM policy or enforced rigorous training
requirements beyond what the policy called for. Most of the remaining
programs met only the minimum requirements of their agencies'
policies. However, all programs that attained full maturity in two of
the three management areas had also implemented more stringent
training requirements, although none matched the efforts of the 3
programs noted above. Without making this training a comprehensive
requirement,
these agencies are at risk that their major system acquisition programs
will continue to have management and technical staff who lack the
skills to fully implement key EVM practices.
Our case study analysis also highlighted multiple areas in which
programs were not complying with their agencies' established EVM
policies, an indication that agencies are not adequately enforcing
compliance. These policy areas include requiring EVM
compliance at the start of the program, validating the baseline with an
integrated baseline review, and conducting ongoing EVM surveillance.
Until key EVM practices are fully implemented, selected programs face
an increased risk that program managers cannot effectively optimize EVM
as a management tool to mitigate and reverse poor cost and schedule
performance trends.
Earned Value Data Show Trends of Cost Overruns and Schedule Slippages
on Most Programs:
Earned value data trends of the 16 case study programs indicate that
most are currently experiencing cost overruns and schedule slippages,
and, based on our analysis, it is likely that when these programs are
completed, the total cost overrun will be about $3 billion. To date,
these programs have collectively overrun their original life-cycle
cost estimates by almost $2 billion (see table 5).
Table 5: Program Life-cycle Cost Estimate Changes (Dollars in
millions):
Agency: Agriculture;
Program: Farm Program Modernization; Original life-cycle cost estimate:
$451.0; Current life-cycle cost estimate: $451.0; Cost overruns in
excess of original cost estimate: $0.0.
Agency: Commerce;
Program: Decennial Response Integration System; Original life-cycle
cost estimate: $574.0[A]; Current life-cycle cost estimate: $946.0[A];
Cost overruns in excess of original cost estimate: $372.0.
Agency: Commerce;
Program: Field Data Collection Automation; Original life-cycle cost
estimate: $595.7; Current life-cycle cost estimate: $801.1; Cost
overruns in excess of original cost estimate: $205.4.
Agency: Defense;
Program: Air and Space Operations Center--Weapon System; Original life-
cycle cost estimate: $4,425.0; Current life-cycle cost estimate:
$4,425.0; Cost overruns in excess of original cost estimate: $0.0.
Agency: Defense;
Program: Joint Tactical Radio System--Handheld, Manpack, Small Form
Fit; Original life-cycle cost estimate: $19,214.0; Current life-cycle
cost estimate: $11,599.0; Cost overruns in excess of original cost
estimate: n/a[B].
Agency: Defense;
Program: Warfighter Information Network--Tactical; Original life-cycle
cost estimate: $38,157.1; Current life-cycle cost estimate: $38,157.1;
Cost overruns in excess of original cost estimate: $0.0.
Agency: Homeland Security;
Program: Automated Commercial Environment; Original life-cycle cost
estimate: $1,500.0[C]; Current life-cycle cost estimate: $2,241.0[C];
Cost overruns in excess of original cost estimate: $741.0.
Agency: Homeland Security;
Program: Integrated Deepwater System--Common Operational Picture;
Original life-cycle cost estimate: $1,353.0[C]; Current life-cycle cost
estimate: $1,353.0[C]; Cost overruns in excess of original cost
estimate: $0.0.
Agency: Homeland Security;
Program: Western Hemisphere Travel Initiative; Original life-cycle cost
estimate: $1,228.0; Current life-cycle cost estimate: $1,228.0; Cost
overruns in excess of original cost estimate: $0.0.
Agency: Justice;
Program: Next Generation Identification; Original life-cycle cost
estimate: $1,075.9; Current life-cycle cost estimate: $1,075.9; Cost
overruns in excess of original cost estimate: $0.0.
Agency: National Aeronautics and Space Administration; Program: James
Webb Space Telescope; Original life-cycle cost estimate: $4,964.0;
Current life-cycle cost estimate: $4,964.0; Cost overruns in excess of
original cost estimate: $0.0.
Agency: National Aeronautics and Space Administration; Program: Juno;
Original life-cycle cost estimate: $1,050.0; Current life-cycle cost
estimate: $1,050.0; Cost overruns in excess of original cost estimate:
$0.0.
Agency: National Aeronautics and Space Administration; Program: Mars
Science Laboratory; Original life-cycle cost estimate: $1,634.0;
Current life-cycle cost estimate: $2,286.0; Cost overruns in excess of
original cost estimate: $652.0.
Agency: Transportation;
Program: En Route Automation Modernization; Original life-cycle cost
estimate: $3,649.4; Current life-cycle cost estimate: $3,649.4; Cost
overruns in excess of original cost estimate: $0.0.
Agency: Transportation;
Program: Surveillance and Broadcast System; Original life-cycle cost
estimate: $4,313.0; Current life-cycle cost estimate: $4,328.9; Cost
overruns in excess of original cost estimate: $15.9.
Agency: Veterans Affairs;
Program: Veterans Health Information Systems and Technology
Architecture--Foundations Modernization; Original life-cycle cost
estimate: $1,897.4; Current life-cycle cost estimate: $1,897.4; Cost
overruns in excess of original cost estimate: $0.0.
Agency: Total;
Cost overruns in excess of original cost estimate: $1,986.3.
Source: GAO analysis of program and contractor data.
[A] We removed $37 million from the original estimate, which
represented costs associated with the closeout of the program. We did
this because the current estimate does not include costs for these
activities. An estimate for these activities is currently being
revised. In addition, the cost increase associated with the current
estimate is due, in part, to an agency-directed expansion of program
scope (related to the system's ability to process a higher volume of
paper forms) in April 2008.
[B] It is not appropriate to compare the original and current life-
cycle cost estimates for this program because the scope has
significantly changed since inception (such as newly imposed security
requirements). In addition, due to a change in the agency's migration
strategy for replacing legacy radios with new tactical radios, the
planned quantity of radios procured was decreased from 328,514 to
95,551. As a result, the life-cycle cost estimate was reduced and no
longer represents the original scope of the program.
[C] The original and current life-cycle costs do not include operations
and maintenance costs.
[End of table]
Taking current earned value performance[Footnote 15] into account, our
analysis of the 16 case study programs indicates that most are
experiencing shortfalls against their currently planned cost and
schedule targets. Specifically, earned value performance data over a 12-
month period showed that the 16 programs combined have exceeded their
cost targets by $275 million. During that period, they also experienced
schedule variances and were unable to accomplish almost $93 million
worth of planned work. In most cases, the negative cost and schedule
performance trends were attributed to ongoing technical issues in the
development or testing of system components.
Furthermore, our projections of estimated costs at completion, based
on our analysis of current contractor performance trends, indicate
that these programs will most likely continue to experience cost
overruns through completion, totaling almost $1 billion. In contrast, the
programs' contractors estimate the cost overruns at completion will be
approximately $469.7 million. These estimates are based on the
contractors' assumption that their efficiency in completing the
remaining work will significantly improve over their performance to
date. Notably, in 4 cases the contractor-estimated overrun is smaller
than the cost variance the contractor has already accumulated, an
indication that these estimates are aggressively optimistic.[Footnote 16]
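The optimism test described above can be made mechanical. In the hypothetical sketch below, an independent estimate at completion assumes the cost efficiency achieved to date continues, and a contractor estimate is flagged when it projects a smaller final overrun than the cost variance already accumulated, which would imply recovering money already lost.

```python
# Hypothetical check for aggressively optimistic contractor estimates at
# completion (EAC). Dollars in millions.

def independent_eac(bac, ev, ac):
    """EAC assuming the cost efficiency (CPI) achieved to date continues."""
    cpi = ev / ac
    return ac + (bac - ev) / cpi

def is_optimistic(bac, ev, ac, contractor_eac):
    """True when the contractor projects a smaller final overrun than the
    cost variance already accumulated."""
    cv_to_date = ev - ac                      # negative when overrun
    projected_overrun = bac - contractor_eac  # negative when overrun
    return projected_overrun > cv_to_date

bac, ev, ac = 450.0, 110.0, 125.0
print(round(independent_eac(bac, ev, ac), 1))            # 511.4
print(is_optimistic(bac, ev, ac, contractor_eac=460.0))  # True: projects a
# $10 million final overrun despite $15 million already accumulated
```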
With the inclusion of the overruns incurred to date, the total
increase in life-cycle costs will be about $3 billion. Our analysis is
presented in table 6. Additional details on the 16 case studies are
provided in appendix II.
Table 6: Contractor Cumulative Cost and Schedule Performances (Dollars
in millions):
Agency: Agriculture;
Program: Farm Program Modernization[A,B]; Contractor budget at
completion: $7.0; Percentage complete: 94%;
Cumulative cost variance: $