This is the accessible text file for GAO report number GAO-08-756
entitled 'Air Traffic Control: FAA Uses Earned Value Techniques to Help
Manage Information Technology Acquisitions, but Needs to Clarify Policy
and Strengthen Oversight' which was released on July 18, 2008.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
United States Government Accountability Office:
GAO:
July 2008:
Air Traffic Control:
FAA Uses Earned Value Techniques to Help Manage Information Technology
Acquisitions, but Needs to Clarify Policy and Strengthen Oversight:
GAO-08-756:
GAO Highlights:
Highlights of GAO-08-756, a report to congressional requesters.
Why GAO Did This Study:
In fiscal year 2008, the Federal Aviation Administration (FAA) plans to
spend over $2 billion on information technology (IT) investments--many
of which support FAA's air traffic control modernization. To more
effectively manage such investments, in 2005 the Office of Management
and Budget required agencies to use earned value management (EVM). EVM
is a project management approach that, if implemented appropriately,
provides objective reports of project status, produces early warning
signs of impending schedule delays and cost overruns, and provides
unbiased estimates of a program's total costs.
Among other objectives, GAO was asked to assess FAA's policies for
implementing EVM on its IT investments, evaluate whether the agency is
adequately using these techniques to manage key IT acquisitions, and
assess the agency's efforts to oversee EVM compliance. To do so, GAO
compared agency policies with best practices, performed four case
studies, and interviewed key FAA officials.
What GAO Found:
FAA has established a policy requiring the use of EVM on its major IT
acquisition programs, but key components of this policy are not fully
consistent with best practices of leading organizations. Specifically,
FAA fully met four and partially met three components of an effective
EVM policy (see table). For example, FAA requires its program managers
to obtain EVM training, but it does not enforce completion of this
training or require other relevant personnel to obtain this training.
Until FAA expands and enforces its policy, it will be difficult for the
agency to gain the full benefits of EVM.
FAA is using EVM to manage IT acquisition programs, but not all
programs are ensuring that their earned value data are reliable. Case
studies of four programs demonstrated that all are using or planning to
use EVM systems. However, of the three programs currently collecting
EVM data, only one program is adequately ensuring that its earned value
data are reliable. Another program is limited in its ability to ensure
data reliability because it was initiated before earned value was
required. The third program did not adequately validate contractor
performance data. For example, GAO found anomalies in which the
contractor reported spending funds without accomplishing work and
others in which the contractor reported accomplishing work while
crediting funds to the government. Until programs undertake a rigorous
validation of their EVM data, FAA faces an increased risk that managers
may not be getting the information they need to effectively manage the
programs.
FAA has taken important steps to oversee program compliance with EVM
policies, but its oversight process lacks sufficient rigor. Through its
recurring assessments, FAA has reported that most programs have
improved their earned value capabilities over time, and that 74 percent
of the programs were fully compliant with national standards. However,
FAA's assessments are not thorough enough to identify anomalies in
contractor data, and its progress reports do not distinguish between
systems that collect comprehensive data and those that do not. As a
result, FAA executives do not always receive an accurate view of the
quality of a program's EVM data when making investment decisions on
that program.
Table: Seven Key Components of an Effective EVM Policy:
Policy component:
Assessment of FAA policy:
Policy component: Establish clear criteria for which programs are to
use EVM; Assessment of FAA policy: Fully met.
Policy component: Require programs to comply with national standards;
Assessment of FAA policy: Fully met.
Policy component: Require programs to use a standard structure for
defining the work products that enables managers to track cost and
schedule by defined deliverables (e.g., hardware or software
component); Assessment of FAA policy: Partially met.
Policy component: Require programs to conduct detailed reviews of
expected costs, schedules, and deliverables (called an integrated
baseline review); Assessment of FAA policy: Fully met.
Policy component: Require and enforce EVM training; Assessment of FAA
policy: Partially met.
Policy component: Define when programs may revise cost and schedule
baselines (called rebaselining); Assessment of FAA policy: Partially met.
Policy component: Require system surveillance--routine validation checks
to ensure that major acquisitions continue to comply with agency
policies and standards; Assessment of FAA policy: Fully met.
Sources: GAO Cost Guide, Exposure Draft (GAO-07-1134SP) and analysis of
FAA data.
[End of table]
What GAO Recommends:
GAO is making recommendations to the Secretary of Transportation to
improve FAA's acquisition policies governing EVM, contractor data
reliability on a key system, and the process for overseeing major
systems. The Department of Transportation generally agreed with the
recommendations and provided technical comments, which GAO incorporated
as appropriate.
To view the full product, including the scope and methodology, click on
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-08-756]. For more
information, contact David A. Powner, (202) 512-9286, pownerd@gao.gov.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
FAA Has Established an EVM Policy for Major IT Investments, but Key
Components Are Not Fully Consistent with Best Practices:
Key FAA Systems Are Using EVM, but Are Not Consistently Implementing
Key Practices:
FAA Has Taken Steps to Oversee EVM Compliance, but Its Oversight
Process Lacks Sufficient Rigor:
FAA Has Incorporated Important EVM Performance Data into Its IT
Investment Management Process:
Conclusions:
Recommendations for Executive Action:
Agency Comments:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Overview of Industry Guidelines That Support Sound EVM:
Appendix III: Case Studies of FAA's Implementation of EVM:
Appendix IV: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Key Components of an Effective EVM Policy:
Table 2: Assessment of FAA's EVM Policies, as of April 2008:
Table 3: Eleven Key EVM Practices for System Acquisition Programs:
Table 4: Management Functions Addressed by ANSI Guidance on Earned
Value Management Systems:
Table 5: Funding Data for ASR-11:
Table 6: Assessment of ASR-11's EVM Practices, as of April 2008:
Table 7: Funding Data for ERAM:
Table 8: Assessment of ERAM's EVM Practices, as of April 2008:
Table 9: SBS Funding Data:
Table 10: Assessment of SBS's EVM Practices, as of April 2008:
Table 11: Financial Funding Data for SWIM:
Table 12: Assessment of SWIM's EVM Practices, as of April 2008:
Figures:
Figure 1: Assessment of EVM Practices for Key FAA Systems, as of April
2008:
Figure 2: Cumulative Cost and Schedule Variances for the ASR-11 Program
in Calendar Year 2007:
Figure 3: Cumulative Cost and Schedule Variances of the ERAM Prime
Contract in Calendar Year 2007:
Figure 4: Cumulative Cost and Schedule Variances for the SBS Program:
Abbreviations:
ANSI: American National Standards Institute:
ASR-11: Airport Surveillance Radar:
ATC: air traffic control:
EIA: Electronic Industries Alliance:
ERAM: En Route Automation Modernization:
EVM: earned value management:
FAA: Federal Aviation Administration:
IT: information technology:
NextGen: Next Generation Air Transportation System:
OMB: Office of Management and Budget:
SBS: Surveillance and Broadcast Service:
SWIM: System Wide Information Management:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
July 18, 2008:
Congressional Requesters:
In fiscal year 2008, the Federal Aviation Administration (FAA) plans to
spend approximately $2 billion on information technology (IT)
investments, many of which involve systems and technologies to
modernize the air traffic control (ATC) system or to transition to a
Next Generation Air Transportation System (NextGen). Over the past 13
years, we have identified FAA's ATC modernization as a high-risk
initiative due to the cost, size, and complexity of this program as
well as the cost overruns, schedule delays, and performance shortfalls
that have plagued the system acquisitions that make up this effort.
[Footnote 1] To more effectively manage such investments, in 2005 the
Office of Management and Budget (OMB) required agencies to implement
earned value management (EVM).[Footnote 2] EVM is a project management
approach that, if implemented appropriately, provides objective reports
of project status, produces early warning signs of impending schedule
delays and cost overruns, and provides unbiased estimates of
anticipated costs at completion.
This report responds to your request that we review FAA's use of EVM.
Specifically, our objectives were to (1) assess FAA's policies for
implementing EVM on its IT investments, (2) evaluate whether the agency
is adequately using these techniques to manage key IT acquisitions, (3)
assess the agency's efforts to oversee compliance with its EVM
policies, and (4) evaluate whether the agency is using earned value
data as part of its investment management process.
To address our objectives, we reviewed agency documentation, including
FAA-wide policies and plans governing the use of EVM on IT
acquisitions, selected programs' documented EVM practices and
performance reports, internal EVM assessment criteria and reports, and
executive management briefings. We conducted case studies of four
programs that we selected for their large development and life-cycle
costs, representation of FAA's major modernization initiatives, and
different stages of life-cycle maturity. We compared the agency's
policies and practices with federal standards and best practices of
leading organizations to determine the effectiveness of FAA's use of
earned value data in managing its IT investments. We also interviewed
relevant agency officials, including key personnel on programs selected
for case study and the official responsible for implementing EVM, and
we observed working group meetings on EVM. This report builds on a body
of work we have performed on FAA's ATC modernization efforts.[Footnote
3]
We conducted this performance audit from November 2007 to July 2008 in
accordance with generally accepted government auditing standards. Those
standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for our findings and
conclusions based on our audit objectives. Further details on our
objectives, scope, and methodology are provided in appendix I.
Results in Brief:
FAA has established a policy requiring the use of EVM on its major IT
acquisition programs, but key components of this policy are not fully
consistent with the best practices of leading organizations. We
recently reported that leading organizations establish EVM policies
with seven key components.[Footnote 4] These organizations:
* establish clear criteria for which programs are to use EVM;
* require programs to comply with a national standard[Footnote 5] on
EVM systems;
* require programs to use a product-oriented structure[Footnote 6] for
defining work products;
* require programs to conduct detailed reviews of expected costs,
schedules, and deliverables (called an integrated baseline review);
* require and enforce EVM training;
* define when programs may revise cost and schedule baselines (called
rebaselining); and:
* require system surveillance--routine validation checks to ensure that
major acquisitions are continuing to comply with agency policies and
standards.
FAA has fully addressed four of these components. Specifically, FAA has
established a policy that requires the use of EVM on all major IT
acquisition programs, compliance with the national standard, completion
of rigorous integrated baseline reviews, and routine validation checks.
However, the agency has only partially addressed the remaining three
components. Specifically, FAA requires that acquisition programs use a
common structure for defining work products, but does not require a
product-oriented work structure. Furthermore, FAA requires that its
program managers obtain EVM training, but does not require that other
relevant personnel obtain this training or that the completion of this
training be monitored and enforced. In addition, FAA requires that
programs obtain approval to revise their cost and schedule baselines,
but does not require programs to identify and mitigate the root cause
of any cost or schedule overruns. Until FAA provides more clarification
on its policy, it will be difficult for the agency to optimize the
effectiveness of EVM as a management tool.
FAA is using EVM to manage IT acquisition programs, but not all
programs are ensuring that their earned value data are reliable. Case
studies of four programs demonstrated that all are using or planning to
use EVM; three programs are currently collecting earned value data and
using these data to make program management decisions; and the fourth
program is not yet far enough along in its development to collect data.
However, of the three programs collecting data, only one is adequately
ensuring that its earned value data are reliable. Another program is
limited in its ability to ensure data reliability because it was
initiated before the use of EVM was required by FAA. The third
program--called the En Route Automation Modernization--did not adequately
validate contractor performance data. For example, we found anomalies
in which the contractor reported spending funds without accomplishing
work and others in which the contractor reported accomplishing work
while crediting funds to the government. Program officials were unable
to explain these anomalies. Until programs undertake a rigorous
validation of their EVM data, FAA faces an increased risk that managers
may not be receiving the information they need to effectively manage
the programs.
FAA has taken important steps to oversee program compliance with EVM
policies, but its oversight process lacks sufficient rigor. In 2005,
FAA established an EVM oversight office that is responsible for
assessing the major systems using defined evaluation criteria and
providing executives with a summary of its results. Through its
recurring assessments, FAA has reported that most programs have
improved their earned value capabilities over time, and that 74 percent
of its 23 major programs were fully compliant with the national EVM
standard as of February 2008. However, the oversight office's
assessments are not thorough enough to identify anomalies in contractor
data, and its agencywide progress reports do not distinguish between
systems that collect comprehensive data and those that do not. As a
result, FAA executives do not always receive an accurate view of the
quality of a program's EVM data when making investment decisions on
that program.
FAA has incorporated EVM performance data into multiple levels of its
senior executive investment reviews to provide better insight into
system acquisition programs. The level of detail of EVM data reporting
varies depending on the level of executive review. For example, senior
FAA executives responsible for investment decisions review earned value
cost and schedule efficiency data, while program executives responsible
for a portfolio of systems review data on cumulative cost and schedule
variance trends over an extended period of time, estimated costs at
program completion, and management reserves. FAA also has work under
way to improve the information provided to its executive decision
makers.
We are making recommendations to the Secretary of Transportation to
direct the Acting FAA Administrator to modify IT acquisition policies
governing EVM to better define requirements for describing work
products, training requirements, and rebaselining criteria. We are also
recommending that program officials responsible for the En Route
Automation Modernization system investigate anomalies in contractor
data to ensure the reliability of these data. Furthermore, we are
recommending that FAA's oversight office strengthen its oversight
process to include an assessment of contractor data and clarify its
reporting to distinguish between systems that collect comprehensive
data and those that do not, in order to provide insight on the quality
of its EVM data to decision makers. The Department of Transportation's
Director of Audit Relations provided comments on a draft of this report
via e-mail. In those comments, he said that the department generally
agreed with the draft's findings and recommendations. The department
also provided technical comments, which we incorporated as appropriate.
Background:
The mission of FAA, an agency within the Department of Transportation,
is to promote the safe, orderly, and expeditious flow of air traffic in
the U.S. airspace system, commonly referred to as the National Airspace
System. To maintain its ability to effectively carry out this mission,
address an aging infrastructure, and meet an increasing demand for air
transportation, in 1981, FAA embarked on a multibillion-dollar effort
to modernize its aging ATC system. Under this modernization program,
FAA has acquired and deployed new technologies and systems--and
continues to do so today. Looking to the future, FAA is now beginning
to fund components of NextGen, a transformation to a new system that is
expected to use satellite-based technologies and state-of-the-art
procedures to handle increasing air traffic volume through 2025, while
further improving safety and security.
FAA Relies on IT to Carry out Its Mission:
FAA relies extensively on IT to carry out its mission--both in terms of
its operational air traffic responsibilities and its administrative
activities. The agency depends on the adequacy and reliability of the
nation's ATC system, which includes a vast network of radars,
navigation and communications equipment, and information processing
systems located at air traffic facilities across the country.[Footnote
7] Through its ATC system, FAA provides services such as controlling
takeoffs and landings, and managing the flow of traffic between
airports. For example, the Integrated Terminal Weather System
integrates local weather data to allow the maximum use of airport
runways. The Wide Area Augmentation System is used to provide
vertically guided system approaches via Global Positioning System
satellites and its own satellites to aircraft at thousands of airports
and airstrips where there is currently no vertically guided landing
capability, thereby improving safety and reducing pilot workload. FAA
also relies on IT to carry out its mission-support and administrative
operations. For example, FAA uses IT to support accident and incident
investigations, security inspections, and personnel and payroll
functions.
With an IT budget of $2.1 billion for fiscal year 2008, FAA accounts
for about 83 percent of the Department of Transportation's IT budget.
For fiscal years 2007 through 2011, FAA plans to acquire more than $14
billion in new systems to continue operating the nation's current ATC
system, while simultaneously transitioning to NextGen. This transition
involves acquiring numerous systems to support precision satellite
navigation; digital, networked communications; integrated weather
information; layered, adaptive security; and more. A cost-effective and
timely transition to NextGen depends in large part on FAA's ability to
keep these acquisitions within budget and on schedule. Historically,
however, FAA has had chronic difficulties in meeting budget, schedule,
and performance targets for acquisitions aimed at modernizing the
National Airspace System.[Footnote 8] For example, in June 2005, we
reported that 13 of 16 selected major ATC system acquisitions
experienced cost, schedule, or performance shortfalls when assessed
against their original milestones. These 13 system acquisitions
experienced cost increases ranging from $1.1 million to about $1.5
billion; schedule extensions ranging from 1 to 13 years; and
performance shortfalls, including safety problems.
GAO Designated FAA's ATC Modernization as High Risk; FAA Has Taken
Steps to Address Weaknesses:
In 1995, we designated FAA's modernization of its ATC system as a high-
risk initiative because of the size, cost, and complexity of the
program as well as difficulties in meeting cost, schedule, and
performance goals on the individual projects that make up the
modernization.[Footnote 9] Since then, in our High-Risk Series updates,
we have reported on FAA's efforts to address the underlying weaknesses
that put it on the high-risk list.[Footnote 10] These include FAA's
efforts to:
* institutionalize key processes for acquiring and developing software
systems;
* develop and enforce its enterprise architecture;
* improve its cost accounting and estimating practices;
* improve its ability to effectively manage IT investments, and:
* develop an organizational culture that supports sound acquisitions.
To the agency's credit, FAA has taken a number of steps over the years
to better manage its ATC modernization program. Because of FAA's
contention that its modernization efforts were hindered by federal
acquisition regulations, in November 1995 Congress enacted legislation
that exempted the agency from most federal acquisition laws and
regulations.[Footnote 11] The legislation directed FAA to develop and
implement a new acquisition management system that would address the
unique needs of the agency. In April 1996, FAA implemented an
acquisition management system that provided acquisition policy and
guidance for selecting and controlling FAA's investments through all
phases of the acquisition life cycle. This guidance was intended to
reduce the time and cost needed for fielding new products and services
by introducing (1) a new investment management system that spans the
entire life cycle of an acquisition, (2) a new procurement system that
provides flexibility in selecting and managing contractors, and (3)
organizational and human capital reforms that support the new
investment and procurement systems.
More recently, in February 2004, FAA created the performance-based Air
Traffic Organization to control and improve FAA's investments and
operations and to better provide safe, secure, and cost-effective air
traffic services now and into the future. This change combined the
groups responsible for developing and acquiring systems with those that
operate them into a single organization. The Air Traffic Organization
is led by FAA's Chief Operating Officer.
EVM Provides Insight on Program Cost and Schedule:
Pulling together essential cost, schedule, and technical information in
a meaningful, coherent fashion is a challenge for most programs.
Without meaningful and coherent cost and schedule information, program
managers can have a distorted view of a program's status and risks. To
address this issue, in the 1960s, the Department of Defense developed
the EVM technique, which goes beyond simply comparing budgeted costs
with actual costs. This technique measures the value of work
accomplished in a given period and compares it with the planned value
of work scheduled for that period and with the actual cost of work
accomplished.
Differences in these values are measured in both cost and schedule
variances. Cost variances compare the earned value of the completed
work with the actual cost of the work performed. For example, if a
contractor completed $5 million worth of work and the work actually
cost $6.7 million, there would be a -$1.7 million cost variance.
Schedule variances are also measured in dollars, but they compare the
earned value of the work completed with the value of work that was
expected to be completed. For example, if a contractor completed $5
million worth of work at the end of the month but was budgeted to
complete $10 million worth of work, there would be a -$5 million
schedule variance. Positive variances indicate that activities are
costing less or are completed ahead of schedule. Negative variances
indicate activities are costing more or are falling behind schedule.
These cost and schedule variances can then be used in estimating the
cost and time needed to complete the program.
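The variance arithmetic described above is simple enough to show directly. The following sketch (ours, not the report's) reproduces the two examples in Python and adds one common forecasting formula, an estimate at completion based on the cost performance index; the $100 million total budget in the final line is hypothetical.
# Illustrative sketch of the earned value calculations described above,
# using the report's example figures (all amounts in millions of dollars).
def cost_variance(earned_value, actual_cost):
    # CV = earned value - actual cost; negative means work cost more than planned.
    return earned_value - actual_cost
def schedule_variance(earned_value, planned_value):
    # SV = earned value - planned value; negative means less work was done than scheduled.
    return earned_value - planned_value
def estimate_at_completion(total_budget, earned_value, actual_cost):
    # One common forecast: scale the total budget by the cost performance
    # index (CPI = earned value / actual cost).
    return total_budget / (earned_value / actual_cost)
print(round(cost_variance(5.0, 6.7), 1))        # -1.7: the report's cost variance example
print(round(schedule_variance(5.0, 10.0), 1))   # -5.0: the report's schedule variance example
print(round(estimate_at_completion(100.0, 5.0, 6.7), 1))  # 134.0 for a hypothetical $100 million budget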
Without knowing the planned cost of completed work and work in progress
(i.e., the earned value), it is difficult to determine a program's true
status. Earned value provides information that is necessary for
understanding the health of a program; it provides an objective view of
program status. As a result, EVM can alert program managers to
potential problems sooner than expenditures alone can, thereby reducing
the chance and magnitude of cost overruns and schedule delays.
Moreover, EVM directly supports the institutionalization of key
processes for acquiring and developing systems and the ability to
effectively manage investments--areas that are often found to be
inadequate on the basis of our assessments of major IT investments.
Because of the importance of ensuring quality earned value data, in May
1998, the American National Standards Institute (ANSI) and the
Electronic Industries Alliance (EIA) jointly established a national
standard for EVM systems.[Footnote 12] This standard, commonly called
the ANSI standard, consists of 32 guidelines to instruct programs on
how to establish a sound EVM system, ensure that the data coming from
the system are reliable, and use the earned value data to manage the
program. See appendix II for an overview of this standard.
Federal Guidance Calls for Using EVM to Improve IT Management:
In August 2005, OMB issued guidance outlining steps that agencies must
take for all major and high-risk development projects to better ensure
improved execution and performance and to promote more effective
oversight through the implementation of EVM.[Footnote 13] Specifically,
this guidance directs agencies to (1) develop comprehensive policies to
ensure that agencies are using EVM to plan and manage development
activities for major IT investments; (2) include a provision and clause
in major acquisition contracts or agency in-house project charters
directing the use of an EVM system compliant with the ANSI standard;
(3) provide documentation demonstrating that the contractor's or
agency's in-house EVM system complies with the national standard; (4)
conduct periodic surveillance reviews; and (5) conduct integrated
baseline reviews[Footnote 14] on individual programs to finalize the
cost, schedule, and performance goals.
Building on OMB's requirements, in July 2007, we issued a draft guide
on best practices for estimating and managing program costs.[Footnote
15] This guide highlights the policies and practices adopted by leading
organizations to implement an effective EVM program. Specifically, in
the guide, we identify the need for organizational policies that
establish clear criteria for which programs are required to use EVM,
compliance with the ANSI standard, a standard product-oriented
structure for defining work products, integrated baseline reviews,
specialized training, criteria and conditions for rebaselining
programs, and an ongoing surveillance function. In addition, we
identify key practices that individual programs can use to ensure that
they establish a sound EVM system, that the earned value data are
reliable, and that the data are used to support decision making. OMB
refers to this guide as a key reference manual for agencies in its 2006
Capital Programming Guide.[Footnote 16]
Two FAA Executives Are Responsible for EVM Implementation:
Two FAA executives--the Acquisition Executive and the Chief Information
Officer--are jointly responsible for implementing EVM and ensuring its
consistent application across the agency's IT acquisitions. The
Acquisition Executive's responsibilities include developing EVM policy
and guidance, certifying contractors' conformance with the ANSI
standard, advising and assisting programs with integrated baseline
reviews, approving programs' plans for continued surveillance of
contractors' EVM systems, and managing the EVM training program and
curriculum. The Acquisition Executive established the position of EVM
Focal Point to lead these efforts.
The Chief Information Officer's responsibilities include assisting in
the development of EVM policy and guidance, certifying programwide
conformance with the ANSI standard, performing ongoing programwide EVM
system surveillance, and managing the preparation of information
reported on programs' annual business cases--which includes verifying
the accuracy of the program baseline, schedule and cost performance,
and corrective action plans. The Chief Information Officer established
a Value Management Office to perform these functions.
FAA Has Established an EVM Policy for Major IT Investments, but Key
Components Are Not Fully Consistent with Best Practices:
In 2005, FAA established a policy requiring the use of EVM on its major
IT investments; however, key components of this policy are not fully
consistent with best practices. We recently reported[Footnote 17] that
leading organizations establish EVM policies that:
* establish clear criteria for which programs are to use EVM;
* require programs to comply with the ANSI standard;
* require programs to use a product-oriented structure for defining
work products;
* require programs to conduct detailed reviews of expected costs,
schedules, and deliverables (called an integrated baseline review);
* require and enforce EVM training;
* define when programs may revise cost and schedule baselines (called
rebaselining); and:
* require system surveillance--routine validation checks to ensure that
major acquisitions are continuing to comply with agency policies and
standards.
Table 1 describes the key components of an effective EVM policy.
Table 1: Key Components of an Effective EVM Policy:
Component: Clear criteria for implementing EVM on all major IT
investments;
Description: OMB requires agencies to implement EVM on all major IT
investments and ensure that the corresponding contracts include
provisions for using EVM systems. However, each agency is responsible
for establishing its own definition of a "major" IT investment. As a
result, agencies should clearly define the conditions under which a new
or ongoing acquisition program is required to implement EVM.
Component: Compliance with the ANSI standard;
Description: OMB requires agencies to use EVM systems that are
compliant with a national standard developed by ANSI and EIA (ANSI/EIA-
748-B). This standard consists of 32 guidelines that an organization
can use to establish a sound EVM system, ensure that the data resulting
from the EVM system are reliable, and use earned value data for
decision-making purposes (see app. II).
Component: Standard structure for defining the work products;
Description: The work breakdown structure defines the work necessary to
accomplish a program's objectives. It is the first criterion stated in
the ANSI standard and the basis for planning the program baseline and
assigning responsibility for the work. It is a best practice to
establish a product-oriented work breakdown structure because it allows
a program to track cost and schedule by defined deliverables, such as a
hardware or software component. This allows a program manager to more
precisely identify which components are causing cost or schedule
overruns and to more effectively mitigate the root cause of the
overruns. Standardizing the work breakdown structure is also considered
a best practice because it enables an organization to collect and share
data among programs.
Component: Integrated baseline review;
Description: An integrated baseline review is an evaluation of the
performance measurement baseline--the foundation for an EVM system--to
determine whether all program requirements have been addressed, risks
have been identified, mitigation plans are in place, and available and
planned resources are sufficient to complete the work. The main goal of
an integrated baseline review is to identify potential program risks,
including risks associated with costs, management processes, resources,
schedules, and technical issues.
Component: Training requirements;
Description: EVM training should be provided and enforced for all
personnel with investment oversight and program management
responsibilities. Executive personnel with oversight responsibilities
need to understand EVM terms and analysis products to make sound
investment decisions. Program managers and staff need to be able to
interpret and validate earned value data to effectively manage
deliverables, costs, and schedules.
Component: Rebaselining criteria;
Description: At times, management may conclude that the remaining
budget and schedule targets for completing a program (including the
contract) are significantly insufficient, and that the current baseline
is no longer valid for realistic performance measurement. Management
may decide that a revised baseline for the program is needed to restore
its control of the remaining work effort. An agency's rebaselining
criteria should define acceptable reasons for rebaselining and require
programs to (1) explain why the current plan is no longer feasible and
what measures will be implemented to prevent recurrence and (2) develop
a realistic cost and schedule estimate for remaining work that has been
validated and spread over time to the new plan.
Component: System surveillance;
Description: Surveillance is the process of reviewing a program's
(including contractor's) EVM system as it is applied to one or more
programs. The purpose of surveillance is to focus on how well a program
is using its EVM system to manage cost, schedule, and technical
performance. The following two goals are associated with EVM system
surveillance: (1) ensure that the program is following corporate
processes and procedures and (2) confirm that the program's processes
and procedures continue to satisfy ANSI guidelines.
Source: GAO, Cost Assessment Guide: Best Practices for Estimating and
Managing Program Costs, Exposure Draft, GAO-07-1134SP (Washington,
D.C.: July 2007).
[End of table]
FAA began developing EVM-related policies for its IT acquisition
programs in 2005. The agency currently has a policy in place that fully
addresses four of the seven areas and partially addresses the remaining
three areas (see table 2).
Table 2: Assessment of FAA's EVM Policies, as of April 2008:
Policy component: Clear criteria for implementing EVM on all major IT
investments;
Assessment of FAA policy: Fully met.
Policy component: Compliance with the ANSI standard;
Assessment of FAA policy: Fully met.
Policy component: Standard structure for defining the work products;
Assessment of FAA policy: Partially met.
Policy component: Integrated baseline review;
Assessment of FAA policy: Fully met.
Policy component: Training requirements;
Assessment of FAA policy: Partially met.
Policy component: Rebaselining criteria;
Assessment of FAA policy: Partially met.
Policy component: System surveillance;
Assessment of FAA policy: Fully met.
Source: GAO analysis of FAA data.
[End of table]
Specifically, FAA has policies and guidance in its Acquisition
Management System[Footnote 18] that fully address EVM implementation on
all major IT investments, compliance with the ANSI standard, integrated
baseline reviews, and system surveillance. These policies are discussed
below.
* Criteria for implementing EVM on all major IT investments: FAA
requires all of its major development, modernization, and enhancement
programs to use EVM. Specifically, these are all programs with a
requirement to provide a business case to OMB.[Footnote 19] In
addition, FAA requires that all contracts and subcontracts that are
expected to exceed a cost of $10 million for development,
modernization, and enhancement work must be managed using an EVM
system. Projects lasting less than 1 year are not required to use EVM.
* Compliance with the ANSI standard: FAA requires that all work
activities performed on major programs by government personnel, major
contractors, and support contractors be managed using an EVM system
that complies with industry standards. FAA's EVM Focal Point is
responsible for certifying that contractors with contracts over $10
million conform with the standard. FAA's Value Management Office is
responsible for certifying that each program conforms with the
standard.
* Integrated baseline reviews: FAA requires each program manager to
conduct a comprehensive review of a program baseline for major programs
and contracts within 90 to 180 days of contract award or program
baseline establishment. Furthermore, an updated integrated baseline
review must be performed after a program exercises significant contract
options or executes modifications. The agency's guidance calls for the
involvement of program management teams, prime contractor management,
and independent subject matter experts who validate the program
baselines and performance measurement processes.
* System surveillance: FAA requires ongoing surveillance of all
programs and contracts that are required to use EVM systems to ensure
their continued compliance with industry standards. The Value
Management Office is responsible for providing surveillance at the
program level through annual assessments of each major program.
Individual program managers and contracting officers are responsible
for conducting surveillance on their contractors' EVM in accordance
with a surveillance plan approved by the EVM Focal Point.
However, FAA's policy and guidance are not consistent with best
practices in three areas: defining a product-oriented structure for
defining work products, requiring EVM training, and establishing
rebaselining criteria. These areas are discussed below.
* Standard structure for defining work products: FAA requires its
programs to establish a standard work breakdown structure. However, FAA
calls for a function-oriented structure, rather than a product-oriented
one. This means that work is delineated based on functional activities,
such as design engineering, requirements analysis, and quality control.
In contrast, a product-oriented work breakdown structure reflects cost,
schedule, and technical performance on specific deliverables. Without
the level of detail provided by a product-oriented approach, program
managers may not have the information they need to make decisions on
specific program components. For example, cost overruns associated with
a specific radar component could be quickly identified and addressed
using a product-oriented structure. If a function-oriented structure
were used, these costs could be spread out over design, engineering,
and quality control.
FAA program managers can choose to use a product-oriented work
breakdown structure to manage their programs and contracts, but then
they need to transfer their data to FAA's required function-oriented
work breakdown structure when reporting to management. EVM experts
agree that such mapping efforts are time-consuming and subject to
error. Furthermore, programs do not always map items in the same way,
and, as a result, costs may not be captured consistently across
programs.
FAA officials stated that they use the functional format because it is
aligned with the agency's cost accounting system. While this presents a
challenge, it is not insurmountable. For example, in the near term, the
agency could develop a standard mapping function to translate product-
oriented program data into the function-oriented cost accounting system
(a simple sketch of this idea follows this list). While this approach
would not resolve the time-consuming nature
of mapping (since programs would still be expected to complete this
activity), it does at least allow costs to be captured consistently
across programs. As a longer-term solution, we have repeatedly urged
government agencies to adopt cost accounting systems that provide
meaningful links among budget, accounting, and performance. Such
systems are consistent with product-oriented work breakdown structures.
[Footnote 20]
Until FAA establishes a standard product-oriented work breakdown
structure, program officials who use the function-oriented approach to
manage their contracts may not be obtaining the information they need.
Furthermore, program officials who choose to manage using a product-
oriented structure will continue to spend valuable time and effort
mapping their product-oriented structures to the FAA standard, and the
agency will continue to risk that data are captured inaccurately or
inconsistently during this mapping exercise.
* EVM training requirements: FAA has developed EVM training and
requires program managers to complete a minimum of 24 hours of EVM and
cost estimating training. However, the agency does not specify EVM
training requirements for program team members or senior executives
with program oversight responsibilities. In addition, the agency does
not enforce EVM training to ensure that all relevant staff have
completed the required training. Instead, individual program offices
are responsible for ensuring that their teams obtain sufficient EVM
training. Some programs ensure that all key program staff have
completed the appropriate level of training they need to understand
their roles and responsibilities, while other programs do not. Until
FAA establishes EVM training requirements for all relevant personnel
(including executives with oversight responsibilities and program staff
responsible for contract management) and verifies the completion of
this training, it cannot effectively ensure that its program staff have
the appropriate skills to validate and interpret EVM data, and that its
executives fully understand the data they are given in order to ask the
right questions and make informed decisions.
* Rebaselining criteria: FAA requires that programs seeking a new cost
and schedule baseline gain approval from a board of executives, called
the Joint Resources Council, which is responsible for investment
decisions. However, the agency does not define acceptable reasons for
rebaselining or require programs to identify and address the reasons
for the need to rebaseline. Until FAA addresses these elements, it will
face an increased risk that its executive managers will make decisions
about programs with incomplete information, and that these programs
will continue to overrun costs and schedules because their underlying
problems have not been identified or addressed.
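To make the near-term suggestion in the work structure discussion above concrete, the sketch below translates product-oriented cost data into a function-oriented accounting view through a single shared mapping table. This is our illustration, not FAA's system: the work breakdown structure elements, functional categories, and allocation shares are all hypothetical.
# Hypothetical sketch of a standard mapping function; the WBS elements,
# functional categories, and allocation shares are invented for this example.
from collections import defaultdict
# Product-oriented data: cost charged to specific deliverables (in millions).
product_costs = {
    "radar.antenna": 4.2,
    "radar.signal_processor": 2.8,
    "display.console_software": 3.1,
}
# One agencywide table spreads each deliverable's cost across the
# functional categories used by the cost accounting system; sharing a
# single table (rather than per-program judgment calls) is what keeps
# costs comparable across programs.
allocation = {
    "radar.antenna":            {"design_engineering": 0.6, "quality_control": 0.4},
    "radar.signal_processor":   {"design_engineering": 0.5, "requirements_analysis": 0.5},
    "display.console_software": {"design_engineering": 0.7, "requirements_analysis": 0.3},
}
def to_functional(product_costs, allocation):
    # Translate product-oriented costs into function-oriented totals.
    functional = defaultdict(float)
    for element, cost in product_costs.items():
        for function, share in allocation[element].items():
            functional[function] += cost * share
    return dict(functional)
print(to_functional(product_costs, allocation))
# e.g., {'design_engineering': 6.09, 'quality_control': 1.68, 'requirements_analysis': 2.33}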
Key FAA Systems Are Using EVM, but Are Not Consistently Implementing
Key Practices:
FAA is using EVM to manage system acquisition programs, but the extent
of implementation varies among programs. Case studies of four programs
demonstrated that all are using or planning to use EVM. However, the
four programs are not consistently performing EVM on the full scope of
the program (as opposed to only the scope of the contract) or ensuring
that the earned value data are reliable. Until these areas are fully
addressed, FAA faces an increased risk that program managers are not
adequately using earned value to manage their programs.
Our work on best practices in EVM identified 11 key practices that are
implemented on acquisition programs of leading organizations. These
practices can be organized into three management areas: establishing a
sound EVM system, ensuring reliable data, and using earned value data
to manage. Table 3 lists these 11 key practices.
Table 3: Eleven Key EVM Practices for System Acquisition Programs:
Program management area: Establish a comprehensive EVM system;
EVM practice: Define the scope of effort using a work breakdown
structure.
EVM practice: Identify who in the organization will perform the work.
EVM practice: Schedule the work.
EVM practice: Estimate the labor and material required to perform the
work and authorize the budgets, including management reserve.
EVM practice: Determine objective measure of earned value.
EVM practice: Develop the performance measurement baseline.
Program management area: Ensure that the data resulting from the EVM
system are reliable;
EVM practice: Execute the work plan and record all costs.
EVM practice: Analyze EVM performance data and record variances from
the performance measurement baseline plan.
EVM practice: Forecast estimates at completion.
Program management area: Ensure that the program management team is
using earned value data for decision-making purposes;
EVM practice: Take management action to mitigate risks.
EVM practice: Update the performance measurement baseline as changes
occur.
Source: GAO, Cost Assessment Guide: Best Practices for Estimating and
Managing Program Costs, Exposure Draft, GAO-07-1134SP (Washington,
D.C.: July 2007).
[End of table]
We performed case studies of four FAA system acquisitions: the Airport
Surveillance Radar (ASR-11), En Route Automation Modernization (ERAM),
Surveillance and Broadcast Services (SBS), and System Wide Information
Management (SWIM). All of the four key FAA system programs demonstrated
at least a partial level of EVM implementation. Figure 1 summarizes our
results on these selected programs. Following the figure, we provide a
summary of each key area of program management responsibility in EVM.
In addition, more details on the four case studies are provided in
appendix III.
Figure 1: Assessment of EVM Practices for Key FAA Systems, as of April
2008:
[See PDF for image]
This figure is a table depicting the following data:
Assessment of EVM Practices for Key FAA Systems, as of April 2008:
Program management area: Establish a comprehensive EVM system;
FAA system program, ASR-11: Partially implemented/with justification:
The program partially addressed the EVM practices in this program
management area; however, external factors prevented the program from
fully implementing these practices;
FAA system program, ERAM: Partially implemented/with justification: The
program partially addressed the EVM practices in this program
management area; however, external factors prevented the program from
fully implementing these practices;
FAA system program, SBS: Fully implemented: The program fully
implemented all EVM practices in this program management area;
FAA system program, SWIM: Work in progress: The program is early in its
life cycle and is working to address the EVM practices in this program
management area.
Program management area: Ensure that the data resulting from the EVM
system are reliable;
FAA system program, ASR-11: Partially implemented/with justification:
The program partially addressed the EVM practices in this program
management area; however, external factors prevented the program from
fully implementing these practices;
FAA system program, ERAM: Partially implemented: The program partially
implemented the EVM practices in this program management area;
FAA system program, SBS: Fully implemented: The program fully
implemented all EVM practices in this program management area;
FAA system program, SWIM: Not applicable: The program is not yet at a
stage of development that would implement the EVM practices in this
program management area.
Program management area: Ensure that the program management team is
using earned value data for decision-making purposes;
FAA system program, ASR-11: Fully implemented: The program fully
implemented all EVM practices in this program management area;
FAA system program, ERAM: Fully implemented: The program fully
implemented all EVM practices in this program management area;
FAA system program, SBS: Fully implemented: The program fully
implemented all EVM practices in this program management area;
FAA system program, SWIM: Not applicable: The program is not yet at a
stage of development that would implement the EVM practices in this
program management area.
Source: GAO analysis of FAA data.
[End of figure]
Programs Did Not Consistently Establish Comprehensive EVM Systems, but
Had Justification for These Shortfalls:
The four programs did not consistently establish comprehensive EVM
systems, but were able to justify these shortfalls. Of the four
programs, only SBS demonstrated that it had fully implemented the six
practices in this area. For example, the program established an
integrated performance baseline that captures the full scope of work on
the program and links directly to the integrated master schedule.
Two programs--ASR-11 and ERAM--demonstrated that they partially
implemented each of the six key practices in this area. Both had a
reasonable justification for their partial EVM implementation: the
systems were initiated before FAA required projects to obtain EVM data
and have implemented work-arounds to allow them to meet FAA's current
earned value reporting requirements. Specifically, the ASR-11 team does
not receive any EVM data, so the team established a performance
measurement baseline to estimate the work remaining on both the
contractor and government portions of the program.[Footnote 21]
In contrast, ERAM has implemented EVM to govern the contract
deliverables, but not the government's portion of the program. Instead,
the program estimates government costs.
The fourth program, SWIM, has initiated EVM practices, but these
efforts are still under way because the system is in an early stage in
its acquisition life cycle. At the time of our review, SWIM had fully
met two of the six key practices. For example, SWIM has a work
breakdown structure and has identified who will perform the work. In
addition, the program is currently developing its integrated master
schedule and plans to complete all key EVM process steps prior to
beginning development work (which is expected to begin in fiscal year
2009). SWIM is not currently collecting EVM data.
Programs Did Not Consistently Ensure That EVM Data Were Reliable:
The three programs that currently collect or estimate monthly EVM data
(ASR-11, ERAM, and SBS) did not consistently ensure that their EVM data
were reliable. Of the three programs, one fully implemented the
practices for ensuring the reliability of the prime contractor and
government performance data, one partially implemented the practices
but had justification for its shortfalls, and one partially implemented
the practices.
SBS demonstrated that it fully implemented the three practices. The
program requires its technical managers to validate the earned value
data they are responsible for collecting on a monthly basis. It also
established mechanisms to alert the team if the contractor's
deliverables may not meet system requirements. In addition, program EVM
analysts are expected to analyze cost and schedule performance trends
and estimates of the cost to complete the remaining work, and to report
them to the program manager and an internal management review board.
ASR-11 partially implemented each of the three practices for ensuring
that earned value data are reliable, but had a justification for this
shortfall. As we have previously noted, ASR-11 measures government and
contractor effort; however, it is constrained in its oversight
capabilities since the prime contractor is not required to report
earned value information or cost data to FAA. As a result, the program
is unable to collect or validate actual costs expended on the
contractor's scope of work. Instead, ASR-11 relies on schedule status
to determine when planned work on a contract deliverable--such as work
to dismantle a legacy facility site--has been authorized to begin and
when it has been completed. The program depends on the receipt of Air
Force invoices to determine the actual costs for that planned effort,
and relies on its on-site FAA teams for qualitative assessments of the
cost and schedule drivers affecting performance. Despite the
external constraints, ASR-11 has a skilled team in place to assess the
EVM data, perform the appropriate analyses of performance trends, and
make projections of estimated costs at program completion.
ERAM also partially implemented each of the three practices for
ensuring that earned value data are reliable. The ERAM program team
analyzes the prime contractor's monthly EVM data and variance reports
and then uses that information to make projections of estimated costs
at program completion. However, we identified several anomalies in the
contractor's reports over an 11-month period that suggest the
contractor may not be reliably reporting its work activities. For
example:
* There were multiple cases in which the contractor reported that no
work was planned or accomplished, yet funds were spent; in other cases,
the contractor reported that work was planned and accomplished, but
funds were credited to the government. There were also cases in which
the contractor reported that work was planned and dollars spent, but a
negative amount of work was performed (i.e., work that was previously
reported as completed was now reported as not completed). The
contractor did not provide an explanation for these issues in its
reports to the ERAM program office.
* In September 2007, the contractor planned to complete $102 million
worth of work--a significant spike in planned work, given that the
average amount of work planned and accomplished in a single month is
about $25 million. Furthermore, the contractor reported that it
accomplished $100 million worth of that work and spent only $31 million
to complete it. The contractor did not provide a justification for this
steep spike in work planned and accomplished, or for the sizable gap
between the work accomplished and the cost of this work. The ERAM
program office was also unable to explain why this occurred.
These reporting anomalies raise questions about the reliability of the
contractor data and the quality of the program's efforts to verify and
validate these data; a simple automated screen of the monthly reports,
such as the illustrative sketch below, could surface cases like these
for follow-up. Until ERAM improves its ability to assess contract data
and resolve anomalies, it risks using inaccurate data to manage the
contractor, potentially resulting in cost overruns, schedule delays,
and performance shortfalls.
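The anomalies described above lend themselves to simple automated
screening of monthly contractor reports. The following minimal sketch
illustrates one such screen; the record layout, field names,
thresholds, and data are our assumptions for illustration, not FAA's
or the contractor's actual reporting format or criteria:

    import statistics
    from dataclasses import dataclass

    @dataclass
    class MonthlyReport:
        month: str
        bcws: float  # budgeted cost of work scheduled (planned), $ millions
        bcwp: float  # budgeted cost of work performed (earned), $ millions
        acwp: float  # actual cost of work performed, $ millions

    def flag_anomalies(reports, spike_factor=3.0):
        """Return (month, issue) pairs resembling the cases described above."""
        flags = []
        mean_planned = statistics.mean(r.bcws for r in reports)
        for r in reports:
            if r.acwp > 0 and r.bcws == 0 and r.bcwp == 0:
                flags.append((r.month, "funds spent with no work planned or performed"))
            if r.bcwp > 0 and r.acwp < 0:
                flags.append((r.month, "work performed but funds credited to the government"))
            if r.bcwp < 0:
                flags.append((r.month, "previously reported work backed out"))
            if mean_planned > 0 and r.bcws > spike_factor * mean_planned:
                flags.append((r.month, "unexplained spike in planned work"))
        return flags

    # Hypothetical data shaped like the pattern described above: steady
    # months of about $25 million, then a spike and a month of cost
    # without work.
    reports = [MonthlyReport(f"2007-{m:02d}", 25.0, 25.0, 25.0) for m in range(1, 9)]
    reports += [MonthlyReport("2007-09", 102.0, 100.0, 31.0),
                MonthlyReport("2007-10", 0.0, 0.0, 4.2)]
    for month, issue in flag_anomalies(reports):
        print(month, issue)

A screen of this kind does not resolve anomalies, but it would give
program offices a consistent basis for demanding explanations from the
contractor before the data are used for management decisions.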
Program Management Teams Consistently Used Earned Value Data to Make
Decisions:
All three programs that currently collect monthly EVM data were able to
demonstrate that they use these data to manage their programs. The SBS
program manager conducts rigorous reviews with its internal performance
management review board to discuss the program's earned value
performance against planned cost and schedule targets and take
appropriate actions to reverse negative trends. The ASR-11 program
manager uses the cost and schedule variances accruing on site
construction work to project the overall cost to complete this work and
to create risk mitigation plans that address the cost and schedule
drivers. The ERAM program manager uses the earned
value data to identify areas of concern and make recommendations to the
contractor on items that should be watched, mitigated, and tracked to
closure. Currently, the program manager is monitoring the contractor's
use of management reserve as well as fluctuating cost variances
associated with the design and engineering supporting ERAM's initial
capability.
FAA Has Taken Steps to Oversee EVM Compliance, but Its Oversight
Process Lacks Sufficient Rigor:
FAA has taken important steps to oversee compliance with EVM policies
by establishing an oversight office, assessing major systems using
defined evaluation criteria, and demonstrating improved capabilities on
most programs. However, the oversight office's assessments are not
thorough enough to identify anomalies in contractor data, and its
agencywide progress reports can be misleading, in that the agency's
evaluation process does not distinguish between systems that collect
comprehensive data and those that do not. As a result, FAA executives
do not always receive an accurate view of the quality of a program's
EVM data when making investment decisions on that program.
FAA Established an Oversight Program to Ensure Compliance with EVM
Requirements:
According to best practices in program oversight, an organization
should assign responsibility for providing oversight, establish and
implement a plan for conducting oversight that is sufficiently detailed
to identify problems, and report on its progress over time. FAA
established an oversight program to ensure EVM compliance assessments
on its major programs.
In August 2005, FAA established the Value Management Office, an
organization responsible for assessing the EVM compliance of all major
IT acquisition programs. This office developed an EVM system assessment
plan to evaluate each major system program. This plan defines the
evidence needed to obtain a weak, moderate, or strong score for each of
the 32 guidelines in the ANSI standard. The group assesses each major
program's earned value capabilities on an annual basis. In addition,
this office provides its senior executives and OMB with a summary of
the EVM compliance status of all major programs. FAA reports that its
IT systems have made major improvements in their earned value
capabilities over the last few years. For example, in August 2005, FAA
reported that 6 of its 19 major IT acquisition programs (or 32 percent)
had fully complied with the standard. As of February 2008, FAA reported
that 17 of its 23 major IT programs (or 74 percent) had achieved full
compliance with the ANSI standard.
FAA's Oversight Process Lacks Sufficient Rigor:
While FAA's oversight has accomplished much since it was established,
the process used to assess and report on programs lacks the rigor
needed to be a reliable gauge of agency progress. Best practices call
for program EVM oversight to include an assessment of both government
and contractor performance data to identify issues that may undermine
the validity of these data. In addition, to be transparent and
reliable, reports on the status of programs' EVM implementation should
clearly identify situations in which programs are unable to fully
comply with FAA policies.
In assessing programs' EVM compliance, FAA's oversight office obtains
and reviews earned value data for the program as a whole. It does not
analyze the contractor's performance data. For example, FAA's oversight
office did not review ERAM's contractor data and, therefore, did not
identify anomalies in which funds were spent although no work was
performed, and work was performed although funds were credited to the
government. As a result, it rated the program highly on
factors associated with data reliability.
In addition, in reporting agencywide progress in implementing EVM, the
agency's oversight process does not distinguish between programs that
collect earned value data only on the contract level, and those that
collect integrated data on the program as a whole. For example, both
ERAM and ASR-11 use approximations to reflect their earned value data.
As we have previously noted, ASR-11 uses approximations for the entire
program because another agency administers the contract. ERAM uses
approximations only for the government portions of the program.
Nonetheless, FAA gave both of these programs their highest ratings.
This is misleading in that it portrays the performance data on these
programs as having the same level of precision as programs that have an
integrated approach to EVM. Since these programs were initiated before
the EVM requirement, it is likely that other older acquisition programs
have also implemented work-arounds. Of the 23 major programs assessed
by FAA, 16 were initiated before the EVM policy was established. Until
these issues are resolved, FAA will be unable to effectively ensure
that EVM implementation is consistent across the agency, and FAA
executives will risk obtaining an inaccurate view of the quality of an
individual program's EVM data when making investment decisions.
FAA Has Incorporated Important EVM Performance Data into Its IT
Investment Management Process:
To obtain better insight into the progress made on its system
acquisition programs, FAA incorporated EVM performance data into its
process for reviewing IT investments. Our work in IT investment
management highlights the importance of executive decision makers
having sufficient insight into program status so that they can identify
and mitigate risks, and ensure that programs are on track against
established cost and schedule expectations. The performance data from
program EVM systems are critical for helping managers achieve
sufficient insight on program status.
FAA executives are reviewing EVM data as part of their investment
review process. The level of detail in EVM data reporting is dependent
on the level of executive review. For example, executives responsible
for a portfolio of projects conduct project reviews on a quarterly
basis. They obtain project data that include cumulative cost and
schedule variance reporting over an extended period. For example, ASR-
11 has reported cumulative trends over an 11-month period. Other key
reported performance metrics include estimated costs at program
completion, cost and schedule efficiency indexes (which describe the
dollar value of work being accomplished for every dollar spent), and
management reserve. At a more senior level, FAA's Joint Resource
Council receives project data on a monthly basis, is briefed on
projects that are breaching cost and schedule variances by more than 10
percent on a quarterly basis, and obtains detailed briefings on
projects twice a year. At this time, these briefings contain a program
dashboard matrix, which shows the earned value cost and schedule
efficiency indexes taken over a 6-month period. FAA's Value Management
Office also has a joint initiative under way with the Joint Resource
Council to refine the dashboard matrix in order to determine the most
appropriate data, as well as level of detail, that will enable decision
makers to prevent, detect, and respond to issues in a timely manner.
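The efficiency indexes and variance thresholds described above are
simple arithmetic on the three core EVM quantities. The sketch below
shows one way such calculations might be implemented; the field names
and the convention for expressing variances as percentages are our
assumptions for illustration, and FAA's exact calculations may differ:

    from dataclasses import dataclass

    @dataclass
    class Snapshot:
        bcws: float  # planned value of work scheduled to date, $ millions
        bcwp: float  # earned value of work performed to date, $ millions
        acwp: float  # actual cost of work performed to date, $ millions

    def efficiency_indexes(s: Snapshot):
        """Cost and schedule efficiency indexes: dollars of work earned
        per dollar spent, and per dollar of work planned."""
        return s.bcwp / s.acwp, s.bcwp / s.bcws

    def breaches_10_percent(s: Snapshot, threshold=0.10):
        """True if the cost or schedule variance exceeds the threshold,
        here expressed as a share of earned and planned value,
        respectively (an assumed convention)."""
        cv_share = (s.bcwp - s.acwp) / s.bcwp
        sv_share = (s.bcwp - s.bcws) / s.bcws
        return abs(cv_share) > threshold or abs(sv_share) > threshold

    # Hypothetical snapshot: $95 million earned against $100 million
    # planned and $98 million spent.
    snap = Snapshot(bcws=100.0, bcwp=95.0, acwp=98.0)
    print(efficiency_indexes(snap))   # about (0.97, 0.95)
    print(breaches_10_percent(snap))  # False: both variances under 10 percent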
Conclusions:
FAA has taken a number of important steps to improve the management of
its IT investments through the implementation of EVM. The agency has
established policies that require the use of EVM; system acquisition
programs are using earned value data to manage their programs; an
oversight office monitors system acquisition programs' compliance with
policy and standards; and earned value performance data are being used
by multiple levels of management as they review and manage IT
investments.
However, the agency does not fully ensure the accuracy and usefulness
of earned value data as a management tool. Specifically, FAA policies
lack sufficient guidance on the type of work structure needed to most
effectively use EVM data; training requirements do not extend to all
relevant personnel, and completion of the required training is neither
monitored nor enforced; and programs are not required to identify or
mitigate the root causes of cost and schedule overruns when they
request a revised cost and schedule baseline. In addition, FAA programs
are not
consistently ensuring that the data coming from contractors are
reliable. Of the three programs we reviewed that currently collect
earned value data, one program, ERAM, had no explanation for anomalies
in its contractor data in which funds were spent but no work was done
and, in other cases, work was accomplished but funds were credited to
the government. This is of concern because both program managers and
agency executives could be making programmatic and investment decisions
on the basis of inaccurate and misleading data. Furthermore, FAA's
Value Management Office--an internal EVM oversight group--does not
evaluate the validity of contractor data or distinguish between
programs that have comprehensive earned value systems and ones that
have implemented work-arounds. As a result, FAA executives are, in
selected cases, receiving an inaccurate view of the quality of a
program's EVM data, which could impede sound investment decisions.
Until these issues are resolved, it will be difficult for FAA to
effectively implement EVM or optimize its investment in this critical
management tool.
Recommendations for Executive Action:
To improve FAA's ability to effectively implement EVM on its IT
acquisition programs, we recommend that the Secretary of Transportation
direct the Acting FAA Administrator to take the following seven
actions:
Modify acquisition policies governing EVM to:
* require the use of a product-oriented standard work breakdown
structure,
* enforce existing EVM training requirements and expand these
requirements to include senior executives responsible for investment
oversight and program staff responsible for program oversight, and:
* define acceptable reasons for rebaselining and require programs
seeking to rebaseline to (1) perform a root cause analysis to determine
why significant cost and schedule variances occurred and (2) establish
mitigation plans to address the root cause.
Direct the ERAM program office to work with FAA's Value Management
Office to:
* determine the root causes for the anomalies found in the contractor's
EVM reports and:
* develop a corrective action plan to resolve these problems.
Direct the Value Management Office to improve its oversight processes
by:
* including an evaluation of contractors' performance data as part of
its program assessment criteria, when FAA has the authority to do so,
and:
* distinguishing between programs that collect earned value data on
fully integrated programs and those that do not in its agencywide
progress reports to provide transparency to decision makers.
Agency Comments:
The Department of Transportation's Director of Audit Relations provided
comments on a draft of this report via e-mail. In those comments, he
said that the department generally agreed with the findings and
recommendations contained in the draft. The department also provided
technical comments, which we have incorporated in this report as
appropriate.
We will be sending copies of this report to interested congressional
committees, the Secretary of Transportation, the Acting FAA
Administrator, and other interested parties. We will also make copies
available to others upon request. In addition, the report will be
available at no charge on our Web site at [hyperlink,
http://www.gao.gov].
If you or your staffs have any questions on the matters discussed in
this report, please contact me at (202) 512-9286 or by e-mail at
pownerd@gao.gov. Contact points for our Offices of Congressional
Relations and Public Affairs may be found on the last page of this
report. GAO staff who made major contributions to this report are
listed in appendix IV.
Signed by:
David A. Powner:
Director, Information Technology Management Issues:
List of Requesters:
The Honorable Bart Gordon:
Chairman:
The Honorable Ralph Hall:
Ranking Member:
Committee on Science and Technology:
House of Representatives:
The Honorable Jerry Costello:
Chairman:
The Honorable Thomas Petri:
Ranking Member:
Subcommittee on Aviation:
Committee on Transportation and Infrastructure:
House of Representatives:
The Honorable John D. Rockefeller IV:
Chairman:
The Honorable Kay Bailey Hutchison:
Ranking Member:
Subcommittee on Aviation Operations, Safety, and Security:
Committee on Commerce, Science, and Transportation:
United States Senate:
The Honorable John Mica:
House of Representatives:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
Our objectives were to (1) assess the Federal Aviation Administration's
(FAA) policies for implementing earned value management (EVM) on its
information technology (IT) investments, (2) evaluate whether the
agency is adequately using EVM techniques to manage key system
acquisitions, (3) assess the agency's efforts to oversee compliance
with its EVM policies, and (4) evaluate whether the agency is using EVM
data as part of its IT investment management.
To assess whether FAA has policies in place to effectively implement
EVM, we analyzed FAA's policies and guidance that support EVM
implementation agencywide as well as on system acquisition programs.
Specifically, we compared these policies and guidance documents with
both the Office of Management and Budget's (OMB) requirements and key
best practices recognized within the federal government and industry
for the implementation of EVM. These best practices are contained in an
exposure draft version of our cost guide.[Footnote 22] We also
interviewed key agency officials and observed FAA EVM working group
meetings to obtain information on the agency's ongoing and future EVM
plans.
To determine whether key FAA system programs are adequately using EVM
techniques, we performed case studies on 4 of FAA's 23 system
acquisition programs currently required to use EVM: the Airport
Surveillance Radar (ASR-11), En Route Automation Modernization (ERAM),
Surveillance and Broadcast Services (SBS), and System Wide Information
Management (SWIM). In consultation with FAA officials, we selected
programs with high development and life-cycle costs, which represented
FAA's two major modernization initiatives--the Air Traffic Control
Modernization and the Next Generation Air Transportation System
(NextGen)--and reflected different stages of life-cycle maturity. These
studies were not intended to be generalizable, but instead to
illustrate the status of a variety of programs. To determine the extent
of each program's implementation of sound EVM, we compared program
documentation with the fundamental EVM practices implemented on
acquisition programs of leading organizations, as identified in our
cost guide.[Footnote 23] We determined whether the program fully
implemented, partially implemented, or did not implement each of the 11
practices.
We further analyzed the EVM data obtained from the programs to assess
the program performance against planned cost and schedule targets.
Finally, we interviewed program officials to obtain clarification on
how EVM practices are implemented and how the data are validated and
used for decision-making purposes. Regarding the reliability of cost
data, we did not test the adequacy of agency or contractor cost-
accounting systems. Our evaluation of these cost data was based on what
the agency told us and the information it could provide.
To determine whether FAA is effectively overseeing compliance with its
EVM policies, we reviewed the quality and completeness of the agency's
surveillance efforts on its system acquisition programs. Specifically,
we reviewed the agency's EVM assessment reports for programs, FAA-
developed EVM assessment criteria, and other relevant documents. We
further compared the results of FAA's EVM assessment for each of the
selected case study programs with the results of our case evaluation to
ascertain the extent to which the results were in agreement. We also
interviewed key agency officials and observed FAA EVM working group
meetings to obtain information on the agency's ongoing surveillance
efforts and issues regarding these efforts.
To evaluate whether FAA is using EVM data as part of its IT investment
management process, we analyzed senior executive management briefings,
OMB business cases (exhibit 300), and other key management reports on
program status. Specifically, we analyzed briefings and status reports
to determine the types of EVM metrics used in describing program status
for senior-level decision-making purposes. We also compared this
analysis with the key best practices recognized within the federal
government and industry for the implementation of EVM, as well as for
the execution of sound IT investment management. We also interviewed
key agency officials to obtain information on the extent of executive-
level EVM awareness and clarification on how EVM is used in FAA's
capital planning process.
We conducted this performance audit from November 2007 to July 2008 at
FAA offices in Washington, D.C., in accordance with generally accepted
government auditing standards. Those standards require that we plan and
perform the audit to obtain sufficient, appropriate evidence to provide
a reasonable basis for our findings and conclusions based on our audit
objectives. We believe that the evidence obtained provides a reasonable
basis for our findings and conclusions based on our audit objectives.
[End of section]
Appendix II: Overview of Industry Guidelines That Support Sound EVM:
Organizations must be able to evaluate the quality of an EVM system to
determine the extent to which the cost, schedule, and technical
performance data can be relied on for program management purposes. In
recognition of this, the American National Standards Institute (ANSI)
and the Electronic Industries Alliance (EIA) jointly established a
national standard for EVM systems--ANSI/EIA-748-B (commonly referred to
as the ANSI standard). This standard consists of 32 guidelines
addressing organizational structure; planning, scheduling, and
budgeting; accounting considerations; analysis and management reports;
and revisions and data maintenance. The guidelines are grouped into
three fundamental management functions for effectively using EVM:
establishing a sound EVM system, ensuring that the EVM data are
reliable, and using earned value data for decision-making purposes.
Table 4 lists the management functions and the ANSI guidelines.
Table 4: Management Functions Addressed by ANSI Guidance on Earned
Value Management Systems:
Management function: Establish a comprehensive EVM system;
ANSI guidelines:
(1) Define the authorized work elements for the program. A work
breakdown structure, tailored for effective internal management
control, is commonly used in this process.
(2) Identify the program organizational structure, including the major
subcontractors responsible for accomplishing the authorized work, and
define the organizational elements in which work will be planned and
controlled.
(3) Provide for the integration of the planning, scheduling, budgeting,
work authorization, and cost accumulation processes with each other
and, as appropriate, with the program work breakdown structure and the
program organizational structure.
(4) Identify the organization or function responsible for controlling
overhead (indirect costs).
(5) Provide for integration of the program work breakdown structure and
the program organizational structure in a manner that permits cost and
schedule performance measurement by elements of either or both
structures as needed.
(6) Schedule the authorized work in a manner that describes the
sequence of work and identifies significant task interdependencies
required to meet the requirements of the program.
(7) Identify physical products, milestones, technical performance
goals, or other indicators that will be used to measure progress.
(8) Establish and maintain a time-phased budget baseline, at the
control account level, against which program performance can be
measured. Initial budgets established for performance measurement will
be based on either internal management goals or the external customer
negotiated target cost, including estimates for authorized but
undefinitized work. Budget for far-term efforts may be held in higher
level accounts until an appropriate time for allocation at the control
account level. If an over-target baseline is used for performance
measurement reporting, prior notification must be provided to the
customer.
(9) Establish budgets for authorized work with identification of
significant cost elements (e.g., labor and material) as needed for
internal management and for control of subcontractors.
(10) To the extent that it is practicable to identify the authorized
work in discrete work packages, establish budgets for this work in
terms of dollars, hours, or other measurable units. Where the entire
control account is not subdivided into work packages, identify the far-
term effort in larger planning packages for budget and scheduling
purposes.
(11) Provide that the sum of all work package budgets plus planning
package budgets within a control account equals the control account
budget.
(12) Identify and control "level-of-effort" activities by time-phased
budgets established for this purpose. Only efforts that are
unmeasurable or for which measurement is impractical may be classified
as level-of-effort activities.
(13) Establish overhead budgets for each significant organizational
component of the company for expenses that will become indirect costs.
Reflect in the program budgets, at the appropriate level, the amounts
in overhead pools that are planned to be allocated to the program as
indirect costs.
(14) Identify management reserves and undistributed budget.
(15) Provide that the program target cost goal is reconciled with the
sum of all internal program budgets and management reserves.
Management function: Ensure that the data resulting from the EVM system
are reliable;
ANSI guidelines:
(16) Record direct costs in a manner consistent with the budgets in a
formal system controlled by the general books of account.
(17) When a work breakdown structure is used, summarize direct costs
from control accounts into the work breakdown structure without
allocation of a single control account to two or more work breakdown
structure elements.
(18) Summarize direct costs from the control accounts into the
contractor's organizational elements without allocation of a single
control account to two or more organizational elements.
(19) Record all indirect costs that will be allocated to the program
consistent with the overhead budgets.
(20) Identify unit costs, equivalent unit costs, or lot costs when
needed.
(21) For the EVM system, the material accounting system will provide
for (1) accurate cost accumulation and assignment of costs to control
accounts in a manner consistent with the budgets using recognized,
acceptable, costing techniques; (2) cost recorded for accomplishing
work performed in the same period that earned value is measured and at
the point most suitable for the category of material involved, but no
earlier than the actual receipt of material; and (3) full
accountability of all material purchased for the program, including
the residual inventory.
(22) At least on a monthly basis, generate the following information at
the control account and other levels as necessary for management
control using actual cost data from, or reconcilable with, the
accounting system: (1) comparison of the amount of planned budget and
the amount of budget earned for work accomplished (this comparison
provides the schedule variance) and (2) comparison of the amount of the
budget earned and the actual (applied where appropriate) direct costs
for the same work (this comparison provides the cost variance).
(23) Identify, at least monthly, the significant differences between
both planned and actual schedule performance and planned and actual
cost performance, and provide the reasons for the variances in the
detail needed by program management.
(24) Identify budgeted and applied (or actual) indirect costs at the
level and frequency needed by management for effective control, along
with the reasons for any significant variances.
(25) Summarize the data elements and associated variances through the
program organization and work breakdown structure to support management
needs and any customer reporting specified in the contract.
Management function: Ensure that the program management team is using
earned value data for decision-making purposes;
ANSI guidelines:
(26) Implement managerial actions taken as a result of the earned value
information.
(27) Develop revised estimates of cost at completion on the basis of
performance to date, commitment values for material, and estimates of
future conditions. Compare this information with the performance
measurement baseline to identify variances at completion that are
important to company management and any applicable customer reporting
requirements, including statements of funding requirements.
(28) Incorporate authorized changes in a timely manner, recording the
effects of such changes in budgets and schedules. In the directed
effort before negotiation of a change, base such revisions on the
amount estimated and budgeted to the program organizations.
(29) Reconcile current budgets to prior budgets in terms of changes to
the authorized work and internal replanning in the detail needed by
management for effective control.
(30) Control retroactive changes to records pertaining to work
performed that would change previously reported amounts for actual
costs, earned value, or budgets. Adjustments should be made only for
correction of errors, routine accounting adjustments, the effects of
customer- or management-directed changes, or improvements to the
baseline integrity and accuracy of performance measurement data.
(31) Prevent revisions to the program budget, except for authorized
changes.
(32) Document changes to the performance measurement baseline.
Source: ©2007, Information Technology Association of America. Excerpts
from "Earned Value Management Systems" (ANSI/EIA-748-B). All Rights
Reserved. Reprinted by permission.
[End of table]
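Several of these guidelines (notably 11, 14, and 15) are, in effect,
reconciliation checks on the budget structure. The sketch below
illustrates those checks on a hypothetical control account structure;
the structures, names, and figures are our assumptions for illustration
and do not represent any FAA program's actual accounts:

    from dataclasses import dataclass, field

    @dataclass
    class ControlAccount:
        name: str
        budget: float  # control account budget, $ millions
        work_packages: list = field(default_factory=list)      # near-term budgets
        planning_packages: list = field(default_factory=list)  # far-term budgets

        def reconciles(self, tol=1e-6):
            # Guideline 11: work package plus planning package budgets
            # must equal the control account budget.
            return abs(sum(self.work_packages) + sum(self.planning_packages)
                       - self.budget) <= tol

    def program_reconciles(accounts, management_reserve,
                           undistributed_budget, target_cost, tol=1e-6):
        # Guideline 15 (with guideline 14's reserves identified
        # separately): internal budgets plus reserves must reconcile to
        # the program target cost.
        total = (sum(a.budget for a in accounts)
                 + management_reserve + undistributed_budget)
        return abs(total - target_cost) <= tol

    # Hypothetical structure for illustration.
    accounts = [ControlAccount("radar software", 40.0, [10.0, 15.0], [15.0]),
                ControlAccount("site installation", 25.0, [25.0])]
    print(all(a.reconciles() for a in accounts))         # True
    print(program_reconciles(accounts, 5.0, 2.0, 72.0))  # True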
[End of section]
Appendix III: Case Studies of FAA's Implementation of EVM:
We conducted case studies of four major system acquisition programs:
ASR-11, ERAM, SBS, and SWIM. For each of these programs, the following
sections provide a brief description of the system; an assessment of
the system's implementation of the 11 key EVM practices; and, where
applicable, an analysis of the system's recent earned value data and
trends. These data and trends are often described in terms of cost and
schedule variances. Cost variances compare the earned value of the
completed work with the actual cost of the work performed. Schedule
variances are also measured in dollars, but they compare the earned
value of the work completed with the value of work that was expected to
be completed. Positive variances are good--they indicate that
activities are costing less than expected or are completed ahead of
schedule. Negative variances are bad--they indicate activities are
costing more than expected or are falling behind schedule.
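Expressed as formulas, with BCWS the budgeted cost of work scheduled
(planned value), BCWP the budgeted cost of work performed (earned
value), and ACWP the actual cost of work performed, cost variance is
BCWP minus ACWP and schedule variance is BCWP minus BCWS. A minimal
sketch using hypothetical figures:

    def cost_variance(bcwp, acwp):
        # Earned value of completed work minus actual cost of that work.
        return bcwp - acwp

    def schedule_variance(bcwp, bcws):
        # Earned value of completed work minus value of work planned to date.
        return bcwp - bcws

    # Hypothetical month: $95 million of work earned, $98 million spent,
    # $100 million of work planned.
    print(cost_variance(95.0, 98.0))       # -3.0: over cost
    print(schedule_variance(95.0, 100.0))  # -5.0: behind schedule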
Airport Surveillance Radar:
ASR-11 is a joint program sponsored by both FAA and the U.S. Air Force
to replace outdated primary radar systems at selected airports with an
integrated digital primary and secondary radar system. This investment
is also to replace the deteriorating infrastructure supporting current
radar systems with new radar facilities, including advanced grounding
and lightning protection systems, digital or fiber-optic
telecommunications, emergency backup power supplies, and enhanced
physical security. The contract was awarded in 1996 and is managed by
the Air Force. The total program cost is currently estimated at $1.15
billion, with $437.2 million remaining to be spent (see table 5). ASR-
11 is currently being deployed across the country. As of April 2008, 44
of the 66 planned systems were operational. FAA plans to complete
deployment of these systems by March 2010.
Table 5: Funding Data for ASR-11 (Dollars in millions):
Cost type: Life cycle;
Fiscal year 2007: $55.2;
Fiscal year 2008: $34.6;
To complete: $437.2;
Total: $1,148.3.
Cost type: Development;
Fiscal year 2007: $43.6;
Fiscal year 2008: $19.6;
To complete: $19.6;
Total: $696.5.
Source: OMB FY2008 Exhibit 300.
[End of table]
ASR-11 fully met 2 of the 11 key practices and partially met 9 others
(with justification for not being able to fully meet these). For
example, ASR-11 fully met the practices involving using earned value
information to mitigate risks and updating baselines as changes occur.
ASR-11 partially met the other practices because, while the program
implemented many key components of an effective EVM system, ASR-11 is
limited in what it can measure and validate. There are two reasons for
these limitations: (1) the contract was awarded in the mid-1990s,
before FAA implemented its EVM requirements, and (2) FAA does not have
the authority to obtain data on actual costs expended by the contractor
or the Air Force because the Air Force is the contracting agency. To
work
around these constraints, FAA's ASR-11 program management team
developed a system that allows them to approximate EVM reporting and
tracking at the program level on the basis of estimated (not actual)
costs. Specifically, ASR-11 established a program-level work breakdown
structure, developed a work schedule, and identified who will perform
the work. ASR-11 has also implemented EVM using estimated data and
analyzes its estimated EVM results against its performance measurement
baseline. While valuable, this approximation does not fully meet the
key practices needed to establish a sound EVM system and ensure data
reliability, because FAA remains limited in what it can measure and in
how it can validate the work accomplished and the dollars spent. Table
6 shows
the detailed assessment results for ASR-11.
Table 6: Assessment of ASR-11's EVM Practices, as of April 2008:
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Define the scope of effort using a work breakdown
structure;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Identify who in the organization will perform the work;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Schedule the work;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Estimate the labor and material required to perform the
work and authorize the budgets, including management reserve;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Determine objective measure of earned value;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Develop the performance measurement baseline;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Execute the work plan and record all costs;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Analyze EVM performance data and record variances from
the performance measurement baseline plan;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Forecast estimates at complete;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes;
Key practice: Take management action to mitigate risks;
GAO assessment: Fully implemented.
Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes;
Key practice: Update the performance measurement baseline as changes
occur;
GAO assessment: Fully implemented.
Source: GAO analysis of FAA data.
[End of table]
Earned Value Data Show Cost Overruns and Schedule Delays:
ASR-11 experienced negative cost variances between January 2007 and
December 2007 (see fig. 2). In this period, the program exceeded cost
targets by $19.2 million--which is 3.3 percent of the program budget
for that time. Similarly, the ASR-11 program was unable to complete
$20.6 million (3.4 percent) of the work planned in this period. The
main factors contributing to the cost and schedule variances were high
construction costs, due mainly to the effects of Hurricane Katrina, and
an unusually long real estate acquisition for the Green Bay, Wisconsin,
ASR-11 site. Program officials are preparing a request to rebaseline
the program because of these high variances.
On the basis of program performance trends, we estimate that the
program will overrun its budget by between $7.6 million and $53.3
million, with a most likely overrun of about $9.8 million.
In comparison, the ASR-11 program office estimates about a $6.2 million
overrun at program completion.
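The range and most likely values above come from our analysis of the
program's performance trends. One common family of index-based formulas
for bounding an estimate at completion is sketched below on
hypothetical figures; it is illustrative only and is not necessarily
the calculation underlying the estimates above:

    def eac_cpi(bac, bcwp, acwp):
        """Estimate at completion assuming remaining work is performed
        at the cumulative cost efficiency achieved to date."""
        cpi = bcwp / acwp
        return acwp + (bac - bcwp) / cpi

    def eac_cpi_spi(bac, bcwp, acwp, bcws):
        """A more pessimistic estimate assuming schedule pressure
        further degrades cost efficiency."""
        cpi, spi = bcwp / acwp, bcwp / bcws
        return acwp + (bac - bcwp) / (cpi * spi)

    # Hypothetical program, $ millions: budget at completion of $600,
    # $560 of work earned, $579.2 spent, $580.6 planned to date.
    bac, bcwp, acwp, bcws = 600.0, 560.0, 579.2, 580.6
    print(eac_cpi(bac, bcwp, acwp) - bac)            # optimistic overrun
    print(eac_cpi_spi(bac, bcwp, acwp, bcws) - bac)  # pessimistic overrun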
Figure 2: Cumulative Cost and Schedule Variances for the ASR-11 Program
in Calendar Year 2007 (Dollars in millions):
[See PDF for image]
This figure is a multiple line graph depicting the following data:
Date: January, 2007;
Cumulative cost variance: $1.0;
Cumulative schedule variance: -$1.1.
Date: February, 2007;
Cumulative cost variance: $1.0;
Cumulative schedule variance: -$1.4.
Date: March, 2007;
Cumulative cost variance: $0;
Cumulative schedule variance: $1.1.
Date: April, 2007;
Cumulative cost variance: $0;
Cumulative schedule variance: $5.6.
Date: May, 2007;
Cumulative cost variance: $2.0;
Cumulative schedule variance: $4.3.
Date: June, 2007;
Cumulative cost variance: -$1.4;
Cumulative schedule variance: $4.4.
Date: July, 2007;
Cumulative cost variance: -$1.4;
Cumulative schedule variance: $4.8.
Date: August, 2007;
Cumulative cost variance: -$1.4;
Cumulative schedule variance: $5.6.
Date: September, 2007;
Cumulative cost variance: -$1.4;
Cumulative schedule variance: $5.2.
Date: October, 2007;
Cumulative cost variance: -$2.9;
Cumulative schedule variance: -$14.3.
Date: November, 2007;
Cumulative cost variance: -$12.1;
Cumulative schedule variance: -$18.6.
Date: December, 2007;
Cumulative cost variance: -$19.2;
Cumulative schedule variance: -$20.6.
As of December 2007, ASR-11 incurred a cost overrun of $19.2 million.
In addition, it was unable to complete $20.6 million worth of planned
work.
Source: GAO analysis of FAA data.
[End of figure]
En Route Automation Modernization:
ERAM is to replace existing software and hardware in the air traffic
control automation computer system and its backup system, the Direct
Access Radar Channel, and other associated interfaces, communications,
and support infrastructure at en route centers across the country. It
is a critical effort because it is expected to upgrade hardware and
software for facilities that control high altitude air traffic. The
contract was awarded in 2002. The ERAM prime contract requires EVM to
be accomplished by the contractor in accordance with the ANSI standard.
The total program cost is estimated at $2.93 billion, with $1.2 billion
still to be spent (see table 7). ERAM consists of two major components.
One component has been fully deployed and is currently in operation at
facilities across the country. The other component is scheduled for
deployment through fiscal year 2009.
Table 7: Funding Data for ERAM (Dollars in millions):
Cost type: Life cycle;
Fiscal year 2007: $375.0;
Fiscal year 2008: $377.4;
To complete: $1,283.8;
Total: $2,930.4.
Cost type: Development;
Fiscal year 2007: $375.0;
Fiscal year 2008: $368.0;
To complete: $517.4;
Total: $2,153.2.
Source: OMB FY2008 Exhibit 300.
[End of table]
ERAM fully met 2 of the 11 key practices for implementing EVM and
partially met 9 others (with justification for 6 of these). ERAM fully
met the practices involving using EVM data to mitigate risks and
updating performance baselines as changes occur. ERAM partially met 6
other practices, with justification, because of limitations in the
earned value data for the government portions of the program.
Specifically, ERAM manages its contractor using an EVM system that
includes a work breakdown structure, master schedule, and performance
baseline. However, ERAM did not implement a comprehensive EVM system
that integrates government and contractor data because this was not a
requirement when the program was initiated in 2002. Program officials
reported that they implemented a work-around to approximate the
government portion of the program. The ERAM program partially
implemented the 3 remaining practices associated with data reliability.
Anomalies in the prime contractor's EVM reports affect the program's
ability to execute the work plan, analyze variances, and estimate the
cost of the program at completion. Table 8 shows the detailed
assessment results for ERAM.
Table 8: Assessment of ERAM's EVM Practices, as of April 2008:
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Define the scope of effort using a work breakdown
structure;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Identify who in the organization will perform the work;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Schedule the work;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Estimate the labor and material required to perform the
work and authorize the budgets, including management reserve;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Determine objective measure of earned value;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Develop the performance measurement baseline;
GAO assessment: Partially implemented--with justification.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Execute the work plan and record all costs;
GAO assessment: Partially implemented.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Analyze EVM performance data and record variances from
the performance measurement baseline plan;
GAO assessment: Partially implemented.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Forecast estimates at complete;
GAO assessment: Partially implemented.
Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes;
Key practice: Take management action to mitigate risks;
GAO assessment: Fully implemented.
Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes;
Key practice: Update the performance measurement baseline as changes
occur;
GAO assessment: Fully implemented.
Source: GAO analysis of FAA data.
[End of table]
Earned Value Data Show ERAM Is Ahead of Schedule and Under Budget, but
Data May Not Be Reliable:
Our analysis of contractor-provided data indicates that the ERAM
program experienced positive cost and schedule performance in 2007 (see
fig. 3). Specifically, from January 2007 to December 2007, the
contractor was able to outperform its planned targets by finishing
under budget by $11.3 million (1 percent of the work for this period)
and by completing $25.5 million, or 3 percent, worth of work beyond
what was planned. Factors that contributed to the positive cost and
schedule variances include less labor needed than planned, savings in
materials purchased, and higher productivity and efficiency. For
example, the program contractor reported a positive schedule variance
in 2007 due to technology refresh activities at the William J. Hughes
Technical Center[Footnote 24] being accomplished earlier than planned.
However, as we have previously noted, our analysis of ERAM's contractor
performance reports uncovered a number of anomalies that raise
questions regarding the reliability of these data. Furthermore, the
contractor did not provide justification for these anomalies, and the
program office was unable to explain the occurrences.
Figure 3: Cumulative Cost and Schedule Variances of the ERAM Prime
Contract in Calendar Year 2007 (Dollars in millions):
[See PDF for image]
This figure is a multiple line graph depicting the following data:
Date: January, 2007;
Cumulative cost variance: $2.8;
Cumulative schedule variance: $13.7.
Date: February, 2007;
Cumulative cost variance: $4.3;
Cumulative schedule variance: $14.6.
Date: March, 2007;
Cumulative cost variance: $7.1;
Cumulative schedule variance: $15.5.
Date: April, 2007;
Cumulative cost variance: $8.2;
Cumulative schedule variance: $6.0.
Date: May, 2007;
Cumulative cost variance: $10.8;
Cumulative schedule variance: $4.1.
Date: June, 2007;
Cumulative cost variance: $7.6;
Cumulative schedule variance: $8.5.
Date: July, 2007;
Cumulative cost variance: $10.2;
Cumulative schedule variance: $8.7.
Date: August, 2007;
Cumulative cost variance: $17.1;
Cumulative schedule variance: $13.9.
Date: September, 2007;
Cumulative cost variance: $14.1;
Cumulative schedule variance: $10.9.
Date: October, 2007;
Cumulative cost variance: $13.5;
Cumulative schedule variance: $8.7.
Date: November, 2007;
Cumulative cost variance: $9.1;
Cumulative schedule variance: $10.3.
Date: December, 2007;
Cumulative cost variance: $11.3;
Cumulative schedule variance: $25.6.
In December 2007, the ERAM contractor reported that it outperformed its
planned cost and schedule goals. Specifically, it reported coming in
under budget by $11.3 million and completing $25.5 million worth of
work beyond what was planned for that period.
Source: GAO analysis of FAA data.
Note: As we indicated in the previous text, we question the reliability
of these data on the basis of the anomalies found in the contractor
reports.
[End of figure]
Surveillance and Broadcast Services:
SBS is to provide new surveillance solutions that use avionics and
ground stations to improve accuracy and update rates and to provide
shared situational awareness (including visual updates of traffic,
weather, and flight notices) between pilots and air traffic control.
These technologies are considered critical to
achieving the FAA strategic goals of decreasing the rate of accidents
and incursions, improving the efficiency of air traffic, and reducing
congestion. The total program cost is currently estimated at $4.31
billion, with $4.11 billion remaining to be spent (see table 9).
Table 9: SBS Funding Data (Dollars in millions):
Cost type: Life cycle;
Fiscal year 2007: $91.6;
Fiscal year 2008: $101.9;
To complete: $4,109.6;
Total: $4,313.0.
Cost type: Development;
Fiscal year 2007: $90.0;
Fiscal year 2008: $100.0;
To complete: $3,771.1;
Total: $3,961.1.
Source: OMB FY2008 Exhibit 300.
[End of table]
The program reported that achievement of cost, schedule, and
performance goals is to be tracked and monitored through FAA best
practices and established FAA EVM processes. Monthly program reviews,
detailed schedule updates, and EVM reporting are to be applied in
accordance with FAA EVM policy. Future contracts are expected to
include all EVM requirements subsequently established by FAA and to be
consistent with industry standards and OMB Circular A-11 guidance.
SBS implemented all 11 of the key practices necessary to ensure that
the program was planned in accordance with industry standards, that the
resulting EVM data were appropriately verified and validated for
reliability, and that the SBS management team was using these data for
decision-making purposes. Table 10 shows the detailed assessment
results for SBS.
Table 10: Assessment of SBS's EVM Practices, as of April 2008:
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Define the scope of effort using a work breakdown
structure;
GAO assessment: Fully implemented.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Identify who in the organization will perform the work;
GAO assessment: Fully implemented.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Schedule the work;
GAO assessment: Fully implemented.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Estimate the labor and material required to perform the
work and authorize the budgets, including management reserve;
GAO assessment: Fully implemented.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Determine objective measure of earned value;
GAO assessment: Fully implemented.
Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Develop the performance measurement baseline;
GAO assessment: Fully implemented.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Execute the work plan and record all costs;
GAO assessment: Fully implemented.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Analyze EVM performance data and record variances from
the performance measurement baseline plan;
GAO assessment: Fully implemented.
Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Forecast estimates at complete;
GAO assessment: Fully implemented.
Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes;
Key practice: Take management action to mitigate risks;
GAO assessment: Fully implemented.
Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes;
Key practice: Update the performance measurement baseline as changes
occur;
GAO assessment: Fully implemented.
Source: GAO analysis of FAA data.
[End of table]
Earned Value Data Show SBS Is under Cost Targets but Behind Schedule:
From December 2007 to February 2008, SBS performance was mixed against
its planned cost and schedule targets (see fig. 4). The program
outperformed its cost targets by $3.0 million.
However, the SBS program was unable to complete $4.5 million, or 6
percent of the value of planned work. The program indicated that the
positive program cost variances were associated with key activities
(including the preliminary design review) taking less effort than
expected to complete. The negative schedule variances were primarily
due to scheduling errors and system-level testing issues; in
particular, system-level testing was delayed because the test
environment, test documentation, and equipment were not ready.
Figure 4: Cumulative Cost and Schedule Variances for the SBS Program
(Dollars in millions):
[See PDF for image]
This figure is a multiple line graph depicting the following data:
Date: December, 2007;
Cumulative cost variance: $1.8;
Cumulative schedule variance: -$0.6.
Date: January, 2008;
Cumulative cost variance: $1.0;
Cumulative schedule variance: -$5.4.
Date: February, 2008;
Cumulative cost variance: $3.1;
Cumulative schedule variance: -$4.5.
As of February 2008, SBS outperformed its planned cost target and
finished under budget by $3 million. However, during this time, it was
unable to complete $4.5 million worth of planned work.
Source: GAO analysis of FAA data.
[End of figure]
System Wide Information Management:
As the key information management and data sharing system for NextGen,
SWIM is expected to provide policies and standards to support data
management, along with the core services needed to publish data to the
network, retrieve the data, secure the data's integrity, and control
access and use of the data. SWIM is also expected to reduce the number
and types of interfaces and systems, reduce unnecessary redundancy of
information, facilitate information sharing, improve predictability and
operational decision making, and reduce the cost of service. FAA's
Joint Resource Council established a baseline for
the first 2 years of the first segment of this program on June 20,
2007. The estimated life-cycle cost for the total SWIM program is
$546.1 million, with $501.3 million still to be spent (see table 11).
Table 11: Funding Data for SWIM (Dollars in millions):
Cost type: Life cycle;
Fiscal year 2007: $24.0;
Fiscal year 2008: $20.8;
To complete: $501.3;
Total: $546.1.
Cost type: Development;
Fiscal year 2007: $0.0;
Fiscal year 2008: $0.0;
To complete: $234.5;
Total: $234.5.
Source: OMB FY2008 Exhibit 300.
[End of table]
SWIM is in the planning phase of its life cycle, which entails setting
up the program's EVM system of internal controls and the resulting
performance measurement baseline. EVM data will not be available until
development work begins in fiscal year 2009.
Our assessment of SWIM's EVM process maturity indicated that the
program is on track in its implementation of EVM. Specifically, it has
fully met two of the six key process steps for ensuring that the
program is planned in accordance with industry standards. SWIM also has
work under way to address the other four steps. We did not assess SWIM
in the five key process steps related to EVM data reliability and use
in program decision making because the program has not begun
development work at this time. Table 12 shows the detailed assessment
results for SWIM.
Table 12: Assessment of SWIM's EVM Practices, as of April 2008:

Program management area of responsibility: Establish a comprehensive
EVM system;
Key practice: Define the scope of effort using a work breakdown
structure; GAO assessment: Fully implemented.
Key practice: Identify who in the organization will perform the work;
GAO assessment: Fully implemented.
Key practice: Schedule the work; GAO assessment: Work in progress.
Key practice: Estimate the labor and material required to perform the
work and authorize the budgets, including management reserve; GAO
assessment: Work in progress.
Key practice: Determine objective measure of earned value; GAO
assessment: Work in progress.
Key practice: Develop the performance measurement baseline; GAO
assessment: Work in progress.

Program management area of responsibility: Ensure that the data
resulting from the EVM system are reliable;
Key practice: Execute the work plan and record all costs; GAO
assessment: N/A.
Key practice: Analyze EVM performance data and record variances from
the performance measurement baseline plan; GAO assessment: N/A.
Key practice: Forecast estimates at completion; GAO assessment: N/A.

Program management area of responsibility: Ensure that the program
management team is using earned value data for decision-making
purposes;
Key practice: Take management action to mitigate risks; GAO
assessment: N/A.
Key practice: Update the performance measurement baseline as changes
occur; GAO assessment: N/A.

Source: GAO analysis of FAA data.

[End of table]
[End of section]
Appendix IV: GAO Contact and Staff Acknowledgments:
GAO Contact:
David A. Powner, (202) 512-9286 or pownerd@gao.gov:
Staff Acknowledgments:
In addition to the contact named above, Colleen Phillips (Assistant
Director), Kate Agatone, Carol Cha, Neil Doherty, Nancy Glover, and
Teresa Smith made key contributions to this report.
[End of section]
Footnotes:
[1] GAO, High-Risk Series: An Update, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-07-310] (Washington, D.C.:
January 2007); High-Risk Series: An Update, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-05-207] (Washington, D.C.:
January 2005); High-Risk Series: An Update, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-03-119] (Washington, D.C.:
January 2003); High-Risk Series: An Update, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-01-263] (Washington, D.C.:
January 2001); High-Risk Series: An Update, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO/HR-99-1] (Washington, D.C.:
January 1999); High-Risk Areas: Update on Progress and Remaining
Challenges, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/T-HR-97-22]
(Washington, D.C.: Feb. 13, 1997); and High-Risk Series: An
Overview, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/HR-95-1]
(Washington, D.C.: February 1995).
[2] OMB Memorandum, M-05-23 (Aug. 4, 2005).
[3] GAO, Air Traffic Control: FAA Reports Progress in System
Acquisitions, but Changes in Performance Measurement Could Improve
Usefulness of Information, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-08-42] (Washington, D.C.: Dec. 18, 2007); National
Airspace System: FAA Has Made Progress but Continues to Face Challenges
in Acquiring Major Air Traffic Control Systems, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-05-331] (Washington, D.C.: June
10, 2005); and Air Traffic Control: FAA's Acquisition Management Has
Improved, but Policies and Oversight Need Strengthening to Help Ensure
Results, [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-23]
(Washington, D.C.: Nov. 12, 2004).
[4] GAO, Cost Assessment Guide: Best Practices for Estimating and
Managing Program Costs, Exposure Draft, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-07-1134SP] (Washington, D.C.:
July 2007).
[5] American National Standards Institute/Electronic Industries
Alliance Standard, Earned Value Management Systems, ANSI/EIA-748-B,
approved July 2007.
[6] A product-oriented work breakdown structure allows a program to
track cost and schedule by defined deliverables, such as a hardware or
software component. This allows a program manager to more precisely
identify which components are causing cost or schedule overruns and to
more effectively mitigate the root cause of the overruns.
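As a hypothetical illustration of the point in this note, the sketch
below rolls invented budget and actual figures up a small
product-oriented work breakdown structure; the deliverables and numbers
are our own and show how an overrun can be traced to a single
component.

    # Invented product-oriented WBS: rolling up budget and actual cost
    # by deliverable shows which component drives an overrun. All
    # figures are hypothetical.
    wbs = {
        "antenna assembly": {"budget": 10.0, "actual": 9.8},
        "signal processor": {"budget": 15.0, "actual": 19.2},
        "display software": {"budget": 5.0, "actual": 5.1},
    }
    total_budget = sum(c["budget"] for c in wbs.values())
    total_actual = sum(c["actual"] for c in wbs.values())
    print("system: budget %.1f, actual %.1f" % (total_budget, total_actual))
    for name, c in wbs.items():
        flag = "  <-- overrun" if c["actual"] > c["budget"] else ""
        print("  %s: budget %.1f, actual %.1f%s"
              % (name, c["budget"], c["actual"], flag))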
[7] FAA uses airport towers, terminal radar approach control
facilities, and air route traffic control centers (also called en route
centers) located throughout the country to control air traffic. In
addition, FAA's ATC System Command Center manages the flow of traffic
across the country.
[8] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-331] and
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-23].
[9] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO/HR-95-1].
[10] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-310],
[hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-05-207], [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-03-119], [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-01-263], [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO/HR-99-1], and [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO/T-HR-97-22].
[11] 49 U.S.C. § 40110.
[12] ANSI/EIA Standard, Earned Value Management Systems,
ANSI/EIA-748-A-1998. This document was updated in July 2007 and is
referred to as ANSI/EIA-748-B.
[13] OMB Memorandum, M-05-23 (Aug. 4, 2005).
[14] An integrated baseline review is an evaluation of a program's
baseline plan to determine whether all program requirements have been
addressed, risks have been identified, mitigation plans are in place,
and available and planned resources are sufficient to complete the
work.
[15] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-1134SP].
[16] OMB, Capital Programming Guide, Supplement to Circular A-11, Part
7, version 2.0 (June 2006), 9 and 91-94 (app. 9).
[17] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-1134SP].
[18] The Acquisition Management System defines all acquisition
management and procurement policy and guidance within FAA.
[19] OMB requires agencies to submit justification packages for major
IT investments on an annual basis. This justification package is called
the exhibit 300.
[20] GAO, Financial Management: Improvements Under Way but Serious
Financial Systems Problems Persist, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-06-970] (Washington, D.C.: Sept.
26, 2006); Financial Management Systems: Additional Efforts Needed to
Address Key Causes of Modernization Failures, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-06-184] (Washington, D.C.: Mar.
15, 2006); Managerial Cost Accounting Practices: Departments of
Education, Transportation, and the Treasury, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-06-301R] (Washington, D.C.: Dec.
19, 2005); and Financial Management: Achieving FFMIA Compliance
Continues to Challenge Agencies, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-05-881] (Washington, D.C.: Sept.
20, 2005).
[21] ASR-11 is a joint program sponsored by both FAA and the U.S. Air
Force. FAA does not have the authority to obtain data on actual costs
expended by the contractor or the Air Force because the Air Force is
the sole acquisition authority on this contract.
[22] GAO, Cost Assessment Guide: Best Practices for Estimating and
Managing Program Costs, Exposure Draft, [hyperlink,
http://www.gao.gov/cgi-bin/getrpt?GAO-07-1134SP] (Washington, D.C.:
July 2007).
[23] [hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-1134SP].
[24] This is the center where metrics are being developed to test the
accuracy of ERAM.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office:
441 G Street NW, Room LM:
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: