Space Acquisitions
DOD Needs to Take More Action to Address Unrealistic Initial Cost Estimates of Space Systems
GAO ID: GAO-07-96, November 17, 2006
Estimated costs for the Department of Defense's (DOD) major space acquisition programs have increased by about $12.2 billion from initial estimates for fiscal years 2006 through 2011. Cost growth for ongoing Air Force programs above initial estimates accounts for a substantial portion of this 44 percent increase. In light of the role that optimistic estimating is believed to have played in exacerbating space acquisition cost growth, you requested that we examine (1) in what areas space system acquisition cost estimates have been unrealistic and (2) what incentives and pressures have contributed to the quality and usefulness of cost estimates for space system acquisitions.
Costs for DOD space acquisitions over the past several decades have consistently been underestimated--sometimes by billions of dollars. For example, Space Based Infrared System High program costs were originally estimated at $4 billion, but the program is now estimated to cost over $10 billion. Estimated costs for the National Polar-orbiting Operational Environmental Satellite System program have grown from almost $6 billion at program start to over $11 billion. For the most part, cost growth has not been caused by poor cost estimating, but rather the tendency to start programs before knowing whether requirements can be achieved within available resources--largely because of pressures to secure funding. At the same time, however, unrealistic program office cost estimates have exacerbated space acquisition problems. Specifically, with budgets originally set at unrealistic amounts, DOD has had to resort to continually shifting funds to and from programs, and such shifts have had costly, reverberating effects. Our analyses of six ongoing space programs found that original cost estimates were particularly unrealistic about the promise of savings from increased contractor program management responsibilities, the constancy and availability of the industrial base, savings that could be accrued from heritage systems, the amount of weight growth that would occur during a program, the availability of mature technology, the stability of funding, the stability of requirements, and the achievability of planned schedules. At times, estimates that were more realistic in these areas were available to the Air Force, but they were not used. Cost-estimating and program officials we spoke with identified a number of factors that have contributed to this condition, in addition to larger pressures to produce low estimates that are more likely to win support for funding.
Although the National Security Space Acquisition policy requires that independent cost estimates be prepared by bodies outside the acquisition chain of command, it does not require that they be relied upon to develop program budgets. While the policy requires that cost estimates be updated at major acquisition milestones, significant events, such as changes in the industrial base or funding, have occurred between milestones. Within space system acquisitions, cost-estimating officials believe that their roles and responsibilities are not clear and the cost-estimating function is fragmented. Cost-estimating resources have atrophied over the years because of previous downsizing of the workforce, making resources such as staff and data inadequate and the Air Force more dependent on support contractors for the estimating function.
GAO-07-96, Space Acquisitions: DOD Needs to Take More Action to Address Unrealistic Initial Cost Estimates of Space Systems
This is the accessible text file for GAO report number GAO-07-96
entitled 'Space Acquisitions: DOD Needs to Take More Action to Address
Unrealistic Initial Cost Estimates of Space Systems' which was released
on November 17, 2006.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as part
of a longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to the Subcommittee on Strategic Forces, Committee on Armed
Services, House of Representatives:
United States Government Accountability Office:
GAO:
November 2006:
Space Acquisitions:
DOD Needs to Take More Action to Address Unrealistic Initial Cost
Estimates of Space Systems:
GAO-07-96:
GAO Highlights:
Highlights of GAO-07-96, a report to the Subcommittee on Strategic
Forces, Committee on Armed Services, House of Representatives
Why GAO Did This Study:
Estimated costs for the Department of Defense's (DOD) major space
acquisition programs have increased by about $12.2 billion from initial
estimates for fiscal years 2006 through 2011. Cost growth for ongoing
Air Force programs above initial estimates accounts for a substantial
portion of this 44 percent increase. In light of the role that
optimistic estimating is believed to have played in exacerbating space
acquisition cost growth, you requested that we examine (1) in what
areas space system acquisition cost estimates have been unrealistic and
(2) what incentives and pressures have contributed to the quality and
usefulness of cost estimates for space system acquisitions.
What GAO Found:
Costs for DOD space acquisitions over the past several decades have
consistently been underestimated--sometimes by billions of dollars. For
example, Space Based Infrared System High program costs were originally
estimated at $4 billion, but the program is now estimated to cost over
$10 billion. Estimated costs for the National Polar-orbiting
Operational Environmental Satellite System program have grown from
almost $6 billion at program start to over $11 billion.
For the most part, cost growth has not been caused by poor cost
estimating, but rather the tendency to start programs before knowing
whether requirements can be achieved within available resources--largely
because of pressures to secure funding. At the same time, however,
unrealistic program office cost estimates have exacerbated space
acquisition problems. Specifically, with budgets originally set at
unrealistic amounts, DOD has had to resort to continually shifting
funds to and from programs, and such shifts have had costly,
reverberating effects.
Our analyses of six ongoing space programs found that original cost
estimates were particularly unrealistic about the promise of savings
from increased contractor program management responsibilities, the
constancy and availability of the industrial base, savings that could
be accrued from heritage systems, the amount of weight growth that
would occur during a program, the availability of mature technology,
the stability of funding, the stability of requirements, and the
achievability of planned schedules. At times, estimates that were more
realistic in these areas were available to the Air Force, but they were
not used.
Cost-estimating and program officials we spoke with identified a number
of factors that have contributed to this condition, in addition to
larger pressures to produce low estimates that are more likely to win
support for funding.
* Although the National Security Space Acquisition policy requires that
independent cost estimates be prepared by bodies outside the
acquisition chain of command, it does not require that they be relied
upon to develop program budgets.
* While the policy requires that cost estimates be updated at major
acquisition milestones, significant events, such as changes in the
industrial base or funding, have occurred between milestones.
* Within space system acquisitions, cost-estimating officials believe
that their roles and responsibilities are not clear and the cost-
estimating function is fragmented.
* Cost-estimating resources have atrophied over the years because of
previous downsizing of the workforce, making resources such as staff
and data inadequate and the Air Force more dependent on support
contractors for the estimating function.
What GAO Recommends:
GAO recommends that DOD take a number of actions to increase the
likelihood that independent, more realistic cost estimates will be
developed and utilized.
DOD concurred with the overall findings of this report and provided
information on the specific actions it was already taking to improve
the Air Force's cost-estimating capability.
[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-07-96].
To view the full product, including the scope and methodology, click on
the link above. For more information, contact Cristina T. Chaplain at
(202) 512-4841 or chaplainc@gao.gov.
[End of Section]
Contents:
Letter:
Results in Brief:
Background:
Program Office Cost Estimates on Space Programs Not Realistic:
Various Incentives and Pressures within DOD Have Contributed to Cost-
Estimating Weaknesses:
Successful Organizational Approaches That Better Support Cost Estimating:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Scope and Methodology:
Appendix II: DOD Acquisition Categories for Major Defense Acquisition
Programs:
Appendix III: Examples of Where Program Officials Were Too Optimistic
in Their Assumptions:
Appendix IV: Examples Where Independent Cost Estimates Were Not Relied
Upon:
Appendix V: Comments from the Department of Defense:
Appendix VI: GAO Contacts and Staff Acknowledgments:
Related GAO Products:
Tables:
Table 1: Areas Where Program Officials Were Too Optimistic in Their
Assumptions:
Table 2: Comparison of 2004 AEHF Program Office and Independent Cost
Estimates:
Table 3: Comparison of 2003 NPOESS Program Office and Independent Cost
Estimates:
Table 4: Comparison of 1996 SBIRS High Program Office Cost Estimate and
Independent Cost Estimate:
Table 5: DOD Acquisition Categories and Decision Authorities:
Table 6: Examples of Optimistic Assumptions:
Table 7: Comparison of 2004 AEHF Program Office and Independent Cost
Estimates:
Table 8: Historical AEHF Weight Growth:
Table 9: Comparison of 2003 NPOESS Program Office and Independent Cost
Estimates:
Table 10: Program Office Integration Estimates for NPOESS:
Table 11: SBIRS High GEO 3-5 Procurement Funding Analysis:
Figure:
Figure 1: Key Events and Funding Shifts That Occurred between Estimates
for SBIRS High:
Abbreviations:
ACAT: Acquisition Category:
AEHF: Advanced Extremely High Frequency:
AFCAA: Air Force Cost Analysis Agency:
APB: Acquisition Program Baseline:
CAIG: Cost Analysis Improvement Group:
DOD: Department of Defense:
DMSP: Defense Meteorological Satellite Program:
EELV: Evolved Expendable Launch Vehicle:
GEO: geosynchronous earth orbit:
GPS: Global Positioning System:
HEO: highly elliptical orbit:
KDP: key decision point:
NPOESS: National Polar-orbiting Operational Environmental Satellite
System:
NRO: National Reconnaissance Office:
NSA: National Security Agency:
SBIRS: Space Based Infrared System:
SMC: Space and Missile Systems Center:
TRL: Technology Readiness Level:
TSAT: Transformational Satellite Communications System:
TSPR: Total System Performance Responsibility:
WGS: Wideband Gapfiller Satellites:
United States Government Accountability Office:
Washington, DC 20548:
November 17, 2006:
The Honorable Terry Everett:
Chairman:
The Honorable Silvestre Reyes:
Ranking Minority Member:
Subcommittee on Strategic Forces:
Committee on Armed Services:
House of Representatives:
Estimated costs for the Department of Defense's (DOD) major space
acquisition programs have increased a total of about $12.2 billion--or
nearly 44 percent--above initial estimates for fiscal years 2006
through 2011. In some cases, current estimates of costs are more than
double the original estimates. For example, the Space Based Infrared
System (SBIRS) High program was originally estimated to cost about $4
billion, but is now estimated to cost over $10 billion. The National
Polar-orbiting Operational Environmental Satellite System (NPOESS)
program was originally estimated to cost almost $6 billion but is now
over $11
billion. Such growth has had a dramatic impact on DOD's overall space
portfolio. To cover the added costs of poorly performing programs, DOD
has shifted scarce resources away from other programs, creating a
cascade of cost and schedule inefficiencies.
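The growth figures above can be checked with simple arithmetic. The sketch below is illustrative only and is not part of the report; it recomputes the program-level growth percentages from the dollar amounts cited, and derives the portfolio baseline implied by describing a $12.2 billion increase as roughly 44 percent.

```python
# Back-of-the-envelope check of the cost-growth figures cited in the
# report. Program names and dollar amounts come from the report text;
# the arithmetic here is a hedged illustration, not GAO's methodology.

def growth_pct(initial, current):
    """Percentage growth of `current` over `initial`."""
    return (current - initial) / initial * 100

# SBIRS High: ~$4 billion initial -> over $10 billion current.
sbirs = growth_pct(4.0, 10.0)      # 150 percent -- more than double

# NPOESS: ~$5.9 billion at program start -> ~$11.4 billion currently.
npoess = growth_pct(5.9, 11.4)     # roughly 93 percent

# Portfolio-wide: a $12.2 billion increase described as ~44 percent
# implies an initial FY2006-2011 baseline of about $12.2B / 0.44.
implied_baseline = 12.2 / 0.44     # roughly $28 billion

print(round(sbirs), round(npoess), round(implied_baseline, 1))
```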
Our work has identified a variety of reasons for this cost growth. Most
notably, weapon programs are incentivized to produce and use optimistic
cost and schedule estimates in order to compete successfully for
funding, and DOD starts its space programs too early, that is, before
it has assurance that the capabilities it is pursuing can be achieved
within available resources and time constraints. At the same
time, however, this cost growth was partly due to the fact that DOD
used low cost estimates to establish programs' budgets and later found
it was necessary to make funding shifts that had costly, reverberating
effects. In 2003, a DOD study of space acquisition problems found that
the space acquisition system is strongly biased to produce
unrealistically low cost estimates throughout the process. The study
found that most programs at the time of contract initiation had a
predictable cost growth of 50 to 100 percent. The study also found that
the unrealistically low projections of program cost and lack of
provisions for management reserve seriously distorted management
decisions and program content, increased risks to mission success, and
virtually guaranteed program delays. We have found that most of these
conditions exist in many DOD programs.
Given concerns about the role optimistic cost estimating has played in
exacerbating space acquisition problems, you requested that we examine
(1) in what areas space system acquisitions cost estimates have been
unrealistic and (2) what incentives and pressures have contributed to
the quality and usefulness of cost estimates for space system
acquisitions.
In conducting our work, we developed case studies of six ongoing major
space acquisition programs that included analysis of cost and other
program documentation. These include the Advanced Extremely High
Frequency (AEHF) satellite program (communications satellites), the
Evolved Expendable Launch Vehicle (EELV) (satellite launch systems),
the Global Positioning System (GPS) IIF (navigational satellites), the
National Polar-orbiting Operational Environmental Satellite System
(weather and environmental monitoring satellites), the Space Based
Infrared System High (missile detection satellites), and the Wideband
Gapfiller Satellites (WGS) (communications satellites). We also spoke
with officials from DOD, the Air Force, and contractor offices and
analyzed DOD and Air Force acquisition and cost-estimating policies. In
addition, we obtained input on our findings from a panel of cost-
estimating experts who work within the Office of the Secretary of
Defense as well as the Air Force. Additional information on our scope
and methodology is in appendix I. We conducted our work from August
2005 to October 2006 in accordance with generally accepted government
auditing standards.
Results in Brief:
Our analyses of six ongoing space programs found that original cost
estimates were unrealistic in a number of areas, specifically, savings
from increased contractor program management responsibilities, the
constancy and availability of the industrial base, savings that could
be accrued from heritage systems, the amount of weight growth that
would occur during a program, the availability of mature technology,
the stability of funding, the stability of requirements, and the
achievability of planned schedules. At times, estimates that were more
realistic in these areas were available to the Air Force, but they were
not used so that programs could sustain support amid competition for
funding.
Cost-estimating and program officials we spoke with identified a number
of factors that have contributed to low estimates in addition to the
larger pressures to win support for funding. For example, although the
National Security Space Acquisition policy requires independent cost
estimates that are prepared by bodies outside the acquisition chain of
command, such estimates have not always been relied upon for program
decisions or to develop program budgets. In addition, while the policy
requires that independent cost estimates be prepared or updated at
major acquisition milestones, significant events, such as changes in
the industrial base or funding, have occurred between milestones.
Moreover, within space system acquisitions, cost-estimating officials
believe that their roles and responsibilities are not clear, and the
cost-estimating function is fragmented. Finally, according to Air Force
officials, cost-estimating resources have atrophied over the years
because of the previous downsizing of the workforce, making resources
such as staff and data inadequate and the Air Force more dependent on
support contractors for the estimating function.
While the Air Force has taken steps recently to emphasize the use of
independent cost estimates, it has not made additional changes needed
to enhance the quality of cost estimates. We are making recommendations
aimed at instituting these actions. DOD agreed with most of our
recommendations, and is taking a number of actions to improve the Air
Force's cost-estimating capability for space programs. DOD expressed
concern that requiring officials involved in milestone decisions to
document and justify their choice of cost estimates would reduce the
milestone decision authority's future decision-making flexibility.
While we recognize the importance of decision-making flexibility, we
believe that more transparency in DOD's decision making is needed given
the poor foundation of choices made in the past on space programs.
Background:
Estimates of the total cost of a program are critical components in the
acquisition process because they help decision makers decide among
competing options and evaluate resource requirements at key decision
points. All military services prepare life-cycle cost estimates in
support of their acquisition programs that attempt to identify all
costs of an acquisition program, from initiation through development,
production, and disposal of the resulting system at the end of its
useful life. These estimates serve two primary purposes. First, they
are used at acquisition program milestone and decision reviews to
assess whether the acquisition is affordable or consistent with the
military services' and DOD's overall long-range funding, investment,
and force structure plans. Second, they form the basis for budget
requests to Congress. A realistic estimate of projected costs makes for
effective resource allocation, and it increases the probability of a
project's success.
The requirements and guidance for cost estimating are specified in
statute and in DOD policies. By law, the milestone decision authority
must consider an independent life-cycle cost estimate before approving
system development and demonstration, or production and deployment, of
a major defense
acquisition program.[Footnote 1] The statute requires DOD to prescribe
regulations governing the content and submission of such estimates and
that the estimate be prepared by (1) an office or other entity that is
not under the supervision, direction, or control of the military
department, DOD agency, or other DOD component directly responsible for
carrying out the development or acquisition of the program, or (2) by
an office or other entity that is not directly responsible for carrying
out the development or acquisition of the program if the decision
authority for the program has been delegated to an official of a
military department, DOD agency, or other DOD component.[Footnote 2]
The statute specifies that the independent estimate is to include all
costs of development, procurement, military construction, and
operations and support, without regard to funding source or management
control.[Footnote 3] DOD policy assigns specific responsibility for
fulfilling the requirement of an independent cost estimate to the
Office of the Secretary of Defense Cost Analysis Improvement Group
(CAIG) for any major defense acquisition program and major system that
are subject to review by the Defense Acquisition Board or the Defense
Space Acquisition Board.[Footnote 4] These board reviews address major
defense acquisition programs (including space programs) that are
designated as acquisition category (ACAT) ID, pre-major defense
acquisition programs, or ACAT IC programs (see app. II for a
description of acquisition categories ID and IC). The CAIG independent
cost estimate is prepared for milestones B (program start, or
preliminary design for space programs) and C (low-rate initial
production, or build approval for space programs); in space programs,
milestones are known as key decision points. In addition, the milestone
decision authority may
request the CAIG to prepare other independent cost estimates, or
conduct other ad hoc cost assessments for programs subject to its
review and oversight. The CAIG serves as the principal advisory body to
the milestone decision authority on all matters concerning an
acquisition program's life-cycle cost, and is given general
responsibilities for establishing DOD policy guidance on a number of
matters relating to cost estimating.
Since 2003, cost estimating for major space system acquisitions has
been governed by the National Security Space Acquisition
Policy.[Footnote 5] Under this policy, the CAIG is responsible for and
leads the development of independent cost analyses of major space
acquisition programs.[Footnote 6] The CAIG fulfills the requirement
that an independent cost estimate be developed by an organization
independent of the program office and the acquisition chain of command,
and it does so in support of a distinct Defense Space Acquisition
Board, with the Under Secretary of the Air Force as the milestone
decision authority.[Footnote 7] The CAIG is to prepare independent cost
analyses
for space acquisition programs by augmenting its own staff with an
independent team of qualified personnel from across the space
community, including the Air Force Cost Analysis Agency (AFCAA) and the
cost estimating organizations of the Air Force Space Command and the
Air Force Space and Missile Systems Center. In addition to the
independent cost estimates, individual program offices also prepare
cost estimates for their acquisition programs. The independent CAIG
cost estimate is designed to assess the program office estimate and
ensure realistic cost estimates are considered. In addition, although
not required in the space acquisition policy, in some cases a cost
analysis is prepared by an Air Force service organization, such as the
Air Force Cost Analysis Agency.
Past GAO Findings on Space Cost Growth:
For fiscal years 2006 through 2011, estimated costs for DOD's major
space acquisition programs have increased a total of about $12.2
billion above initial estimates. For example, the cost estimate for the
SBIRS High program rose from about $4 billion at the start of
development in October 1996 to over $10 billion in September 2005, and
costs are expected to rise further. In addition, the cost estimate for
the NPOESS program grew from about $5.9 billion at program start in
2002 to nearly $11.4 billion currently, according to the CAIG's latest
estimate.
Our past work has identified a number of causes behind the cost growth
and related problems, but several consistently stand out. First, on a
broad scale, DOD starts more weapon programs than it can afford,
creating a competition for funding that encourages low cost estimating,
optimistic scheduling, overpromising, suppressing of bad news, and, for
space programs, forsaking the opportunity to identify and assess
potentially better alternatives. Programs focus on advocacy at the
expense of realism and sound management. Invariably, with too many
programs in its portfolio, DOD is forced to continually shift funds to
and from programs--particularly as programs experience problems that
require more time and money to address. Such shifts, in turn, have had
costly, reverberating effects.
Second, as we have previously testified and reported, DOD starts its
space programs too early, that is, before it has the assurance that the
capabilities it is pursuing can be achieved within available resources
and time constraints. This tendency is caused largely by the funding
process, since acquisition programs attract more dollars than efforts
concentrating solely on proving technologies. Nevertheless, when DOD
chooses to extend technology invention into acquisition, programs
experience technical problems that require large amounts of time and
money to fix. Moreover, when this approach is followed, cost estimators
are not well positioned to develop accurate cost estimates because
there are too many unknowns. Put more simply, there is no way to
estimate how long it would take to design, develop, and build a
satellite system when critical technologies planned for that system are
still in relatively early stages of discovery and invention.
A companion problem for space systems is that programs have
historically attempted to satisfy all requirements in a single step,
regardless of the design challenge or the maturity of the technologies
necessary to achieve the full capability. Increasingly, DOD has
preferred to build fewer but larger, heavier, and more complex
satellites that perform a multitude of missions, rather than larger
constellations of smaller, less complex satellites that gradually
increase in
sophistication. This has stretched technology challenges beyond current
capabilities in some cases and vastly increased the complexities
related to software--a problem that affected SBIRS High and AEHF, for
example.
In addition, several of the space programs included in our case
studies began in the late 1990s, when DOD structured contracts in a
way that reduced oversight and shifted key decision-making
responsibility onto contractors. This approach--known as Total System
Performance Responsibility, or TSPR--was intended to facilitate
acquisition reform and enable DOD to streamline a cumbersome
acquisition process and leverage innovation and management expertise
from the private sector. However, DOD later found that this approach
magnified problems related to requirements creep and poor contractor
performance. In addition, under TSPR, the government decided not to
obtain certain cost data, a decision that resulted in the government
having even less oversight of the programs and limited information from
which to manage the programs. Further, the reduction in government
oversight and involvement led to major reductions in various government
capabilities, including cost-estimating and systems-engineering staff.
The loss of cost-estimating and systems-engineering staff in turn led
to a lack of technical data needed to develop sound cost estimates.
Our reviews have identified additional factors that have contributed to
space cost growth, though less directly. These include consolidations
within the defense supplier base for space programs, the diverse array
of officials and organizations involved with space programs, short
tenures for top leadership and program managers, as well as capacity
shortfalls that have constrained DOD's ability to optimize and oversee
its space programs. A section at the end of this report lists prior
relevant GAO reports.
Program Office Cost Estimates on Space Programs Not Realistic:
Our case study analyses found that program office cost estimates--and
more specifically, the assumptions upon which those estimates were
based--have been unrealistic in eight areas, many of which are
interrelated. In some cases, such as assumptions regarding weight
growth and the ability to gain leverage from heritage, or legacy,
systems, past experiences or contrary data were ignored. In other
cases, such as when contractors were given more program management
responsibility, as with TSPR, or when growth in the commercial market
was predicted, estimators assumed that promises of reduced cost and
schedule would be borne out and did not have the benefit of experience
to factor into their work. We also identified flawed assumptions that
reflected deeper flaws in acquisition strategies or development
approaches. For example, five of six programs we reviewed assumed
technology would be sufficiently mature when needed, even though the
programs began without a complete understanding of how long it would
take or how much it would cost to ensure technologies could work as
intended. In four programs, estimators assumed there would be few
delays, even though programs were adopting highly aggressive schedules
while simultaneously attempting to make ambitious leaps in capability.
In four programs, estimators assumed funding would stay constant, even
though space and weapon programs frequently experience funding shifts
and the Air Force was in the midst of starting a number of costly new
space programs to replenish older constellations.
Table 1 highlights major areas where program officials were too
optimistic in their assumptions for the six space system acquisitions
we examined or where additional evidence showed the estimate was
unrealistic. In some cases, programs may have experienced problems
related to one of the categories, but we did not have evidence to show
the original assumptions were optimistic.
Table 1: Areas Where Program Officials Were Too Optimistic in Their
Assumptions:

Optimistic assumption                                    AEHF  EELV  GPS IIF  NPOESS  SBIRS High  WGS
Industrial base would remain constant and available       -     X      X        X         X        X
Technology would be mature enough when needed             X     -      X        X         X        X
TSPR would reduce costs and schedule                      -     X      X        X         X        -
Savings would occur from experience on heritage systems   X     -      -        X         X        X
No weight growth would occur                              X     -      -        X         X        X
Funding stream would be stable                            X     -      X        X         X        -
Aggressive schedule could be achieved                     X     -      -        X         X        X
No growth in requirements                                 X     -      X        -         X        -

Source: This table is based on conversations with program and
contracting officials and analysis of data they provided. In some
cases, we made our own designations based on our prior findings.

[End of table]
* Assumptions about the space industrial base: Five programs
experienced challenges due to assumptions that were made about the
availability and constancy of the industrial base. When cost estimates
for some of these programs were developed, cost estimators assumed the
programs would gain leverage from the commercial satellite market,
which, at the time the programs were initiated, was widely expected to
continue to grow. In the EELV program, for instance, the original
contracting concept was for the Air Force to piggyback on the
anticipated launch demand of the commercial sector. Furthermore, the
Air Force assumed that it would benefit financially from competition
among commercial vendors. However, the commercial demand never
materialized, the government was forced to bear the cost burden of
sustaining the industrial base in order to preserve launch capability,
and the assumed savings from competition were never realized. In other
cases, programs experienced unanticipated problems resulting from
consolidations in the supplier base. For example, contractors took cost-
cutting measures that reduced the quality of parts. Contractors also
lost key technical personnel as they consolidated development and
manufacturing facilities.
* Assumptions about technology maturity: In five of the six space
system acquisition programs, when cost estimates were developed,
program officials and cost estimators assumed that technologies
critical to the programs would be mature and available--even though the
programs began without a complete understanding of how long it would
take, or how much it would cost, to ensure technologies could work as
intended.
Invariably, after the programs began and as their development
continued, the technology issues ended up being more complex than
initially believed. For example, on the NPOESS program, DOD and the
Department of Commerce committed funds for the development and
production of satellites before the technology was mature--only 1 of 14
critical technologies was mature at program initiation and 1 technology
was determined to be less mature after the contractor conducted more
verification testing. The program has since been beset by significant
cost increases and schedule delays due in part to technical problems,
such as the development of key sensors. On the GPS IIF program, the
cost estimate was built on the assumption that the military code signal
being developed would fit on a single microchip. However, once
development started, interface issues arose and the subcontractor had
to move to a two-microchip design, which took 8 months to resolve and
increased cost to the program.
* Assumptions about TSPR savings: Four programs we examined assumed
that there would be significant savings associated with adopting the
TSPR policy. For example, while TSPR was supposed to relieve
contractors of unnecessary oversight, the government assumed that the
contractors would still maintain sufficient systems engineering and
program management levels by following standard practices to provide
oversight of their subcontractors and vendors. However, for a variety
of reasons, the savings never materialized. For instance, it was
believed that by giving more program management responsibility to
contractors and increasing use of commercial equipment, the government
could reduce the number of in-house systems engineers--who normally
help the government define its requirements by analyzing the
differences between customer needs and technical possibilities, and
who analyze progress in development. Ultimately, the reduction in systems engineering staff
resulted in cost growth as the programs experienced technical and
quality problems that the government was no longer in a position to
detect and prevent. Programs also came to realize that commercial parts
being relied on were not always suitable for their efforts, and had to
resort to costly measures to address this problem. In addition, in
implementing TSPR, the government initially entered into contracts that
did not allow it to obtain certain cost data from the contractors
(e.g., contractor cost data reports and contractor performance
reports), even though such data are critical for cost estimators to
develop sound cost estimates and important for the government to
maintain adequate insight. This was the case for EELV and GPS IIF--both
of which have either been restructured or are now planning to issue
follow-on contracts that will require cost and pricing data and earned
value management data. It should be noted that the Air Force has since
recognized problems related to its implementation of TSPR and rejected
it as a recommended approach.
* Assumptions about savings from heritage systems: Four programs
assumed that they would be able to gain leverage from legacy satellite
systems and save costs, but as the programs continued and more
knowledge was gained about the requirements and the technologies needed
to meet the requirements, DOD discovered that the legacy systems could
not be relied on, as initially believed, and the savings were not
realized. In addition, SBIRS High and WGS, for example, had both
planned to gain leverage from commercial satellite development efforts,
because the government expected to apply lessons already learned from
portions of these commercial satellites to obtain design savings.
However, when hardware and software development advances slowed as a
result of the Internet sector economic downturn, the government had to
carry more design and development costs than anticipated.
* Assumptions about weight growth: Four case study programs assumed
that no weight growth--among the largest drivers of cost growth for
space systems--would occur, despite the ambitious technology leaps
being attempted and the experience of past programs. For example, the SBIRS High program
assumed little to no weight growth, but the weight of the satellite
spacecraft eventually grew by more than 59 percent, while payload
aboard the spacecraft grew by 44 percent. Moreover, with such
considerable weight growth, the program could no longer rely on the
commercial bus it had originally selected for this acquisition, and
instead had to develop a custom satellite bus--a more expensive
endeavor.
* Assumptions about funding: Space programs frequently experienced
funding shifts. Moreover, at the time the Air Force undertook the
programs included in our case studies, it was attempting to replenish
several older satellite constellations, which put further stress on its
total investment in space. Despite this condition, when making
estimates on four programs we reviewed, cost estimators assumed that
program budgets would remain stable. As the programs progressed through
the acquisition cycle, they experienced changes to their funding
stream, which created program instability and cost growth due to the
stopping and starting of activities. Cost estimators and program
officials we interviewed generally agreed that space programs are not
often fully funded and that their programs have experienced shifts in
funding. However, they could not isolate the ultimate effects of
funding shifts, since the programs were concurrently experiencing other
problems, such as technical or design problems, that were also adding
costs. Moreover, funding cuts led to other decisions that had
reverberating consequences. For example, in some cases, programs
abandoned their original plans to purchase satellites in one
procurement in favor of individual orders in an effort to address a
funding cut. While this decision enabled the programs to continue in
the short term, it had significant long-term consequences on program
costs since the price of each satellite substantially increased with
the change to individual orders. In previous testimony and reports, we
have stressed that DOD could avoid the need to make costly funding
shifts by developing an overall investment strategy that would
prioritize systems in its space portfolio with an eye toward balancing
investments between legacy systems and new programs as well as between
science and technology programs and acquisition investments. Such
prioritizing would also reduce incentives to produce low estimates.
* Assumptions about schedules: Four case study programs assumed that
compressed schedules being proposed could be achieved--even though the
programs were pursuing ambitious leaps in capability or attempting new
approaches, such as using commercial equipment for military purposes.
Moreover, in some cases, DOD had data available demonstrating such
schedules were not realistic. In one case study program, WGS, the
request for proposals specified that the budget available was $750
million for three satellites plus ground control with a schedule
constraint of 36 months. On the basis of these requirements, competing
contractors were asked to offer maximum capacity, coverage, and
connectivity via a contract that would make use of existing commercial
practices and technologies. This aggressive schedule was never
achieved. Instead, problems due to higher design complexity and
supplier quality issues have caused the WGS schedule to stretch to 78
months for the first expected launch. Historically, the Air Force has
required between 55 and 79 months to build satellites similar to WGS,
so while the schedule slip is within the expected range, the original
36-month schedule was optimistic and not based on realistic data. For
AEHF, the program accelerated its schedule in response to a potential
gap in satellite coverage due to the launch failure of the third
Milstar satellite. However, when the funding needed to achieve the
acceleration was not delivered, the program experienced cost growth and
schedule delays. Again, because these assumptions were made before
enough information about the development was available, the assumptions
did not hold up, and the programs experienced cost and schedule growth
as a result.
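The kind of sanity check described above--comparing a proposed schedule against what comparable programs have historically achieved--can be sketched in a few lines. This is an illustration only, not a tool the Air Force uses; the function name and structure are hypothetical, while the 36-month WGS proposal and the 55- to 79-month historical range for similar satellites are figures cited in this report.

```python
# Hypothetical sketch: flag a proposed schedule that falls outside the
# range actually achieved by comparable past programs.

def schedule_is_realistic(proposed_months, historical_range):
    """Return True if the proposed schedule length (in months) falls
    within the range achieved by comparable historical programs."""
    low, high = historical_range
    return low <= proposed_months <= high

# Historically, satellites similar to WGS took 55 to 79 months to build.
WGS_HISTORICAL_RANGE = (55, 79)

print(schedule_is_realistic(36, WGS_HISTORICAL_RANGE))  # original WGS plan -> False
print(schedule_is_realistic(78, WGS_HISTORICAL_RANGE))  # eventual schedule -> True
```

Even a check this simple would have flagged the original 36-month WGS schedule as falling well below anything previously achieved.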
* Assumptions about requirements growth: Three programs--AEHF, GPS IIF,
and SBIRS High--did not assume any requirements growth, even though
there was a risk of growth because of the variety of stakeholders
involved. High-level requirements for the SBIRS High program--which is
being developed to improve missile warning, missile defense, technical
intelligence, and battle space characterization--have remained stable
since the program began, but prior DOD studies have found that lower-
level requirements were in flux and mismanaged until the program was
restructured in 1999. According to DOD studies, this was partially due
to the TSPR approach, which placed too much responsibility on
contractors to negotiate these requirements; the broad customer base
for SBIRS; and the ambitious nature of the program to begin with. To
illustrate, the SBIRS High program has 19 key performance parameters to
satisfy--nearly five times more than the typical DOD space program. In
addition, there are over 12,600 requirements that the program must
address, and to date, requirements for external users have not been
fully defined. DOD has since realized that responsibility for setting
lower-level requirements should rest with the government and has taken
actions to add more discipline to the requirements-setting process. In
another example, GPS IIF was intended to follow on to the GPS II
program, yet shortly after the contract was awarded, the government
added the requirement for an additional auxiliary payload. This
requirement caused the satellite design to be larger than originally
planned, and this, in turn, required a larger launch vehicle.
Requirements for more robust jamming capability to secure satellite
transmissions were also added. A change from a two-panel to a three-
panel solar array design, along with flexible power provisions, was
necessary to meet the increased power and thermal capability
requirements.
Appendix III contains additional detailed examples of instances where
program officials were too optimistic in their assumptions for the six
space system acquisitions we examined.
Various Incentives and Pressures within DOD Have Contributed to Cost-
Estimating Weaknesses:
Various incentives and pressures within DOD have contributed to
optimistic program office cost estimates for space system acquisitions.
As noted earlier, our prior work has found that programs are
incentivized to produce optimistic estimates in order to gain approval
for funding. At present, DOD does not have a long-term investment
strategy that would prioritize its investments and, in turn, reduce
pressures associated with competition for funding. A 2003 DOD study on
crosscutting problems affecting space acquisitions, known as the Young
Panel report, also found that the space acquisition system, in
particular, is strongly biased to produce unrealistically low cost
estimates throughout the process: advocacy tends to dominate, a strong
motivation exists to minimize program cost estimates, and proposals
from competing contractors typically reflect the minimum program
content and a price to win. In responding to the Young Panel
report as well as our prior reports, DOD officials have not disputed
the need for long-term investment planning or that programs are
incentivized to produce low estimates.
In conducting this review, we asked cost estimators, program managers,
industry officials, and higher-level oversight officials what
additional impediments there were to sound cost estimating for space.
Their responses included that (1) there is little accountability for
producing realistic program office estimates--among both program
managers and estimators; (2) estimates produced within program offices
are more often used to set budgets than estimates produced by
independent estimators; (3) even though space programs experience
frequent changes, independent cost estimates are not updated for years
at a time; (4) cost-estimator roles and responsibilities are not clear
and the cost-estimating function is fragmented;
and (5) there are not enough in-house government cost estimators or
sufficient data to support their work.
Accountability Is Lacking:
It is difficult for cost estimators to be held accountable for the
estimates they develop because program decision makers are rarely held
accountable for the estimates they use to establish program budgets.
This, coupled with the pressure to compete for funding, invites program
officials to accept optimistic assumptions and ignore risk and reality
when developing cost estimates.
This view was also expressed by many DOD program managers we
interviewed for a 2005 review on program management best
practices.[Footnote 8] While many program managers told us that they
personally held themselves accountable, many also commented that it is
difficult to be accountable when so much is outside their control.
During our focus groups, program managers cited sporadic instances when
program managers were removed from their positions or forced to retire
if programs came in over cost or schedule, but they also cited
instances when a program manager was promoted even though the program
was experiencing difficulties.
Independent Estimates Not Always Relied Upon:
We found examples from our closer examinations of the AEHF, NPOESS, and
SBIRS High programs where independent cost estimates were not relied
upon by program decision makers. Independent estimates for these space
system acquisitions forecasted considerably higher costs and lengthier
schedules than program office or service cost estimates. Yet the
milestone decision authorities used program office estimates or even
lower estimates instead of the independent estimates to establish
budgets for their programs. DOD's current space acquisition policy
requires that independent cost estimates be prepared by bodies outside
the acquisition chain of command, and be considered by program and DOD
decision makers. However, the policy does not require that the
independent estimates be relied upon to set budgets, only that they be
considered at key acquisition decision points.
* AEHF: In 2004, to support the production decision for the AEHF
program, decision makers relied upon the program office cost estimate
rather than the independent estimate developed by the CAIG--which was
more than $2 billion higher. At that time, the AEHF program office
estimated the system would cost $6 billion. This was based on the
assumption that AEHF would have 10 times more capacity than the
predecessor satellite--Milstar--but at half the cost and weight. The
CAIG believed that this assumption was overly optimistic given that the
AEHF weight had more than doubled since the program began in 1999 to
obtain the desired increase in data rate. The latest program office
estimate for AEHF is $6.1 billion.
Table 2: Comparison of 2004 AEHF Program Office and Independent Cost
Estimates:

Program office estimate:            $6 billion
Independent cost estimate (CAIG):   $8.7 billion
Difference:                         44%
Latest program office estimate:     $6.1 billion

Source: CAIG and GAO analysis.

Note: Estimates are in fiscal year 2006 dollars. AFCAA worked jointly
with the CAIG to develop the independent estimate.

[End of table]
* NPOESS: In 2003, to support the NPOESS development decision,
government decision makers relied on the program office's $7.2 billion
cost estimate rather than the $8.8 billion independent cost estimate
presented by the Air Force Cost Analysis Agency. AFCAA based its
estimate on an analysis of historical data from satellite systems,
independent software and hardware models, and a risk simulation model
using input from 30 independent engineers. The program office relied
largely on the contractor's proposal as well as on an unrealistic
estimate of what it would cost to integrate the payloads onto the
satellite bus. The program has encountered many problems as a result of
these optimistic assumptions, and costs have risen to $11.4 billion,
based on the latest program office cost estimate.
Table 3: Comparison of 2003 NPOESS Program Office and Independent Cost
Estimates:

Program office estimate:            $7.2 billion (based on planned
                                    purchase of six satellites)
Independent cost estimate (AFCAA):  $8.8 billion
Difference:                         23%
Latest program office estimate:     $11.4 billion (based on planned
                                    purchase of four satellites)

Source: CAIG and GAO analysis.

Note: Estimates are in fiscal year 2006 dollars. The CAIG was not
involved in preparing the 2003 independent cost estimate.

[End of table]
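As a minimal illustration of the arithmetic behind the "Difference" rows in tables 2 and 3, the sketch below computes the independent estimate's excess over the program office estimate as a percentage of the program office estimate. The function is hypothetical. Note that the report's published percentages (44 and 23 percent) were evidently computed from unrounded figures, so recomputing from the rounded table values yields slightly different results (45 and 22 percent).

```python
# Hypothetical helper illustrating the "Difference" arithmetic: how much
# higher the independent estimate is than the program office estimate,
# expressed as a percentage of the program office estimate.

def percent_difference(program_office_billions, independent_billions):
    """Excess of the independent estimate over the program office
    estimate, as a percentage of the program office estimate."""
    return (independent_billions - program_office_billions) / program_office_billions * 100

# AEHF (table 2): $6 billion program office vs. $8.7 billion CAIG estimate.
print(round(percent_difference(6.0, 8.7)))  # -> 45 (report shows 44%, from unrounded inputs)

# NPOESS (table 3): $7.2 billion program office vs. $8.8 billion AFCAA estimate.
print(round(percent_difference(7.2, 8.8)))  # -> 22 (report shows 23%, from unrounded inputs)
```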
* SBIRS High: On the SBIRS High program, the program office and AFCAA
predicted cost growth as early as 1996, when the program was initiated.
While both estimates at that time were close, approximately $5.6
billion, both were much higher than the contractor's estimated costs.
The program was subsequently estimated to cost $3.6 billion by the
program office, almost $2 billion less than the original AFCAA or
program office estimate. The program office and contractor ultimately
assumed savings under TSPR that did not materialize. For instance, with
this approach, the SBIRS High contractor used far fewer systems
engineers than historical data show have been used for similar
programs. To achieve savings, the contractor dropped important systems
engineering tasks such as verification and cycling of requirements. The
lack of systems engineering resulted in latent design flaws that
required more integration and testing when components failed initial
testing.
Table 4: Comparison of 1996 SBIRS High Program Office Cost Estimate and
Independent Cost Estimate:

Program office estimate:            $5.7 billion (based on a planned
                                    purchase of five satellites)
AFCAA independent cost estimate:    $5.6 billion
Total program funding:              $3.6 billion
Latest program office estimate:     $10.2 billion (based on a planned
                                    purchase of three satellites)

Source: AFCAA and Air Force documentation and GAO analysis.

Note: Estimates are in fiscal year 2006 dollars.

[End of table]
CAIG officials informed us that independent cost estimates are rarely
used by the services to develop budgets for acquisition programs.
Because CAIG estimates are seldom used and the program offices know
this, officials we spoke with believe that there is no incentive on the
part of program offices to change their approach to cost estimating.
According to a senior CAIG official, program managers often promise to
meet the maximum number of requirements for the least cost. These
program officials would rather rely on optimistic cost estimates from
the contractors because these estimates most likely align with program
objectives.
Appendix IV contains detailed examples of where program and cost-
estimating officials disagreed on estimates.
Independent Cost Estimates Not Updated Frequently Enough to Account for
Significant Events and Changes:
Space programs can continue for years--as many as 4--without an update
of the independent cost estimate, even while undergoing changes that
have a substantial impact on cost: changes in requirements and planned
quantities, funding instability, design changes, quality variances
resulting from rework, manufacturing or engineering changes, changes in
supply chain and logistics management and support, and technology-
related problems, among others. At times, the only mechanism that
forced an updated
estimate was DOD policy that the CAIG support the Nunn-McCurdy
certification process for programs breaching a certain unit cost
threshold.[Footnote 9] Under this policy,[Footnote 10] the CAIG
provides the Under Secretary with a recommendation concerning the
reasonableness of the most recent unit cost estimates by the program.
Because space programs tend to experience such changes after program
start, some officials we spoke with in the DOD space cost-estimating
community believe that independent cost estimates should be updated
more frequently. Opinions differ as to the frequency and phasing of
these non-milestone estimates, assessments, or reviews. A CAIG official
suggested updating cost estimates about every 18 to 24 months, while
AFCAA officials suggested annually to correspond with the annual
budgeting cycle. The current space acquisition policy requires only one
independent cost estimate after critical design review, but CAIG
officials noted that years can go by between critical design review and
program completion, during which time programs have historically
experienced substantial changes.
Figure 1 illustrates significant changes that took place on the SBIRS
High program both before and after critical design review.
Figure 1: Key Events and Funding Shifts That Occurred between Estimates
for SBIRS High:
[See PDF for image]
Source: GAO analysis of SBIRS High program data.
[End of figure]
Cost-Estimating Roles and Responsibilities Are Unclear:
Air Force cost-estimating officials believe that their roles and
responsibilities are not clear and that the cost-estimating function is
too fragmented. Some also asserted that the cost-estimating function
within the space community would be stronger if estimators themselves
were centralized outside the acquisition chain of command so that they
would not be biased or pressured by program office leadership to
produce optimistic estimates.
In an attempt to make the most efficient use of the limited cost
estimate expertise for DOD space system acquisitions, the space
acquisition policy called on the CAIG to augment its own staff with
cost-estimating personnel drawn from across the community to serve as
team members when it developed independent estimates. Members were to
include the intelligence community's cost analysis improvement group,
the Air Force Cost Analysis Agency, the National Reconnaissance Office
(NRO) Cost Group, the Office of the Deputy Assistant Secretary of the
Army for Cost and Economics, the Naval Center for Cost Analysis, the
cost-estimating organizations of the Air Force Space Command, Air Force
Space and Missile Systems Center, and the Space and Naval Warfare
Systems Command.
At this time, however, there are still significant disconnects in views
about roles and responsibilities. Officials who reside in the
acquisition chain of command--the Air Force Space and Missile Systems
Center--believe that because the program executive officer and the
program managers are responsible for executing the programs, they are
also solely responsible for the cost estimates for the program. On the
other hand, Air Force cost estimators outside the acquisition chain of
command--the Air Force Cost Analysis Agency--believe they also hold
some responsibility to ensure the quality and consistency of cost
estimates and to produce independent cost estimates for consideration
by Air Force decision makers. However, according to officials within
the Space and Missile Systems Center's (SMC) cost-estimating group and
AFCAA, the SMC cost-estimating group sees no role for AFCAA in
developing program or Air Force cost estimates and has rejected
assistance from AFCAA. According to Air Force officials, until a
clearer distinction of roles and responsibilities is defined by Air
Force leadership, issues of conflicting policy interpretation and
implementation will remain. It is also possible that these disconnects
have been exacerbated by the perception that these two communities are
competing for responsibility.
In addition, according to a senior CAIG official, the collaborative
process for developing independent estimates has not been achieved as
envisioned--principally because those who should be involved have not
seen their involvement as a priority, and those who have been involved
have required a lot of extra training to be able to make valuable
contributions. Moreover, because the various cost-estimating
organizations each have different customers, agendas, and approaches to
developing cost estimates, these differences have made it difficult for
them to work as a cohesive team.
Cost-Estimating Resources Are Considered Inadequate:
Air Force space cost-estimating organizations and program offices
believe that cost-estimating resources are inadequate for accurately
predicting costs. They believe that these resources--both staff and
data--have atrophied over the years because of previous downsizing of
the workforce.
As noted earlier, there was a belief within the government that cost
savings could be achieved under acquisition reform initiatives by
reducing technical staff, including cost estimators, since the
government would be relying more on commercial-based solutions to
achieve desired capabilities. According to one Air Force cost-
estimating official we spoke with, this led to a decline in the number
of Air Force cost estimators from 680 to 280. High-grade positions and
specialty cost-estimating job codes were eliminated, abolishing an
official cost-estimating career path and relegating cost estimating to
an additional duty. In the process, according to this same Air Force
official, many military and civilian cost-estimating personnel left the
cost-estimating field, and the Air Force lost some of its best and
brightest cost estimators.
Information we obtained from space program offices and cost-estimating
organizations is consistent with the assertion of a lack of requisite
resources. Eight of 13 cost-estimating organizations and program
offices we informally surveyed believe the number of cost estimators is
inadequate. Furthermore, some of these same organizations believe that
cost estimation is not a respected career field within the Air Force,
and more specifically, that Air Force cost estimators receive little
encouragement and have few opportunities for promotion or
advancement. Regarding the recognition and career paths for cost
estimators, our data showed that only 3 of 12 organizations agreed that
previous cost estimators had moved on to positions of equal or higher
responsibility. Further, only 4 of 12 agreed that people ask to become
cost estimators.
The belief that cost-estimating skills have been depleted has been
echoed in other DOD and GAO studies. According to the Young Panel
report, government capabilities to lead and manage the acquisition
process have seriously eroded, in part because of actions taken in the
acquisition reform environment of the 1990s. This has extended to cost
estimating. During our 2005 review of program management, we surveyed
DOD's major weapon system program managers and interviewed program
executive officers who similarly pointed to critical skill shortages
for staff that support them, including cost estimators. Other skill
gaps identified included systems engineering, program management, and
software development. We continue to observe these deficiencies in our
more recent reviews of the space acquisition workforce.[Footnote 11]
Because of the decline in in-house cost-estimating resources, space
program offices and Air Force cost-estimating organizations are now
more dependent on support contractors. Ten of 13 cost-estimating
organizations and program offices have more contractor personnel
preparing cost estimates than government personnel. At 11 space program
offices, contractors account for 64 percent of cost-estimating
personnel. Support contractor personnel generally prepare cost
estimates, while government personnel provide oversight, guidance, and
review of the cost-estimating work. By contrast, the CAIG has
determined that cost estimating is too important a function to place in
the hands of support contractors, and it assigns only government
personnel to develop cost estimates.
Reliance on support contractors raises questions from the cost-
estimating community about whether the number and qualifications of
government personnel are sufficient to provide oversight of and insight
into contractor cost estimates. A senior CAIG official involved with
estimating for space acquisition programs, for example, suggested that
reliance on support contractors is a problem if the government cannot
evaluate how good a cost estimate is or lacks the ability to track it.
Two studies have also raised the concern that relying on support
contractors makes it more difficult to retain institutional knowledge
and instill accountability. Further, in the most recent defense
authorization act, Congress is requiring DOD to make it a goal that
within 5 years certain critical acquisition functions, including cost
estimating, be performed by properly qualified DOD employees, and that
in developing a comprehensive strategy for supporting the program
manager role, DOD address improved resources and support such as cost-
estimating expertise.[Footnote 12]
A second resource gap hampering cost estimating is the lack of reliable
technical source data. Officials we spoke with believe that the data
and databases on which cost estimates are based are incomplete,
insufficient, and outdated. They cite a lack of reliable historical and
current cost, technical, and programmatic data, and they express
concern that the available cost, schedule, technical, and risk data are
not similar to the systems for which they are developing estimates. In
addition, some expressed concern that relevant classified and
proprietary commercial data may exist but are not usually available to
the cost-estimating community working on unclassified programs. Some
believe that Air Force cost estimators need to be able to use all
relevant data, including those contained in NRO cost databases, since
the agency builds highly complex, classified satellites in comparable
time and at comparable costs per pound.
Successful Organization Approaches That Better Support Cost Estimating:
Over the past decade, GAO has examined successful organizations in the
commercial sector to identify best practices that can be applied to
weapon system acquisitions. This work has identified a number of
practices that support cost estimating better than DOD's own practices do. For
instance, unlike most space programs we have reviewed, the successful
organizations we have studied extensively researched and defined
requirements before program start to ensure that they are achievable,
given available resources. They do not define requirements after
starting programs. They also ensure technologies are mature, that is,
proven to work as intended, and assign more ambitious efforts to
corporate research departments until they are ready to be added to
future increments. In addition, these organizations use systems
engineering to close gaps between resources and requirements before
launching the development process. Taken together, these practices help
ensure that there is little guessing in how long or how many dollars it
will take to achieve an intended capability. Moreover, within the
organizations we studied, decisions to start programs are made through
long-term strategic planning and prioritizing. As a result, competition
for funding is minimized, and programs themselves do not have
incentives to present low estimates.
The successful organizations we have studied have also taken steps,
which DOD has not, to ensure that cost estimates are complete and
accurate. For instance, they hold program managers accountable for their
estimates and require program managers to stay with a project to its
end. At the same time, they develop common templates and tools to
support data gathering and analysis and maintain databases of
historical cost, schedule, quality, test, and performance data. Cost
estimates themselves are continually monitored and regularly updated
through numerous gates or milestone decisions that demand that
programs assess readiness and remaining risk within key sectors of the
program as well as overall cost and schedule issues.
Senior leaders within these organizations also actively encourage
program managers to share bad news about their programs and spend a
great deal of time breaking down stovepipes and other barriers to
sharing information. More important, they commit to fully funding
programs and adhere to those commitments. Commonly, the organizations
we studied have centralized cost estimators and other technical and
business experts so that there is more effective deployment of
technical and business skills while at the same time ensuring some
measure of independence. Within DOD, the CAIG is a good example of
this. Its cost estimates are produced by civilian government personnel
(the sole military space cost-estimating position will convert to a
civilian position later this year when the military cost estimator
retires) to ensure long-term institutional knowledge and limit the
effects of the staff turnover that commonly occurs with military personnel.
Although the CAIG uses support contractors for conducting studies, it
does not allow cost estimates to be developed by contractors. The CAIG
takes this approach because it considers cost estimating to be a core
function and therefore too important to contract out. The Naval Air
Systems Command's Cost Analysis Division is also considered a model by
some in the cost-estimating community because of its organizational
structure and leadership support. It is a centralized cost department
that provides support to multiple program offices. The department is
headed by a senior executive-level manager, and various branches within
the department are headed by GS-15-level managers. Analysts are
somewhat independent of the program offices, as their supervisors are
within the engineering department. This cost department has strong
support from its leadership, and this support has helped it hire the
analysts and obtain the resources it needs. However, another
official pointed out that this cost department is not completely
independent from the acquisition chain of command, since it receives
funding from the program offices to conduct the cost estimates.
GAO has made recommendations to DOD to adopt best practices we have
identified that would strengthen program management DOD-wide. Congress
also recently directed DOD to develop a strategy to enhance program
manager empowerment and accountability, agreeing with GAO's assessment
that DOD has consistently failed to give program managers the authority
that they need to successfully execute acquisition programs and, as a
result, is unable to hold them accountable.[Footnote 13] GAO has also
made recommendations to the Air Force to better position its space
programs for success. In response, the Air Force has restructured its
Transformational Satellite Communications System (TSAT) to ensure that
the program incorporates technologies that have been proven to work as
intended, and it has deferred more ambitious efforts to the science and
technology community. It has committed to do the same on other
programs. If effectively implemented, such actions would, in turn,
significantly enhance the ability of independent estimators to forecast
costs. However, we have testified that DOD faces a number of challenges
and impediments in its effort to instill this approach. It needs
significant shifts in thinking about how space systems should be
developed, changes in incentives and perceptions, and further policy
and process changes. Moreover, such changes will need to be made within a
larger acquisition environment that still encourages a competition for
funding and consequently pressures programs to view success as the
ability to secure the next installment rather than the end goal of
delivering the capabilities when and as promised.
The Air Force has also been taking actions to make specific
improvements to cost estimating for space programs. In the case of
TSAT, program officials said they are updating the program's planning
cost estimate on an annual basis. Furthermore, according to one CAIG
official, some program offices have recently been using the CAIG's
independent cost estimates. Both the SBIRS High and NPOESS program
offices are developing their budgets based on the CAIG independent
estimates that support the certification process for the programs' most
recent Nunn-McCurdy breaches. Further, DOD and Air Force cost
estimators we spoke to recognize that amendments made to the Nunn-
McCurdy law by the 2006 Defense Authorization Act may increase realism
in establishing initial cost estimates. As part of the revisions, DOD
is barred from changing its original baseline cost estimate for a
program until after it has breached certain Nunn-McCurdy thresholds
that require a certification and assessment of the program, and DOD
must report the baseline changes to Congress.
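At its core, the revised Nunn-McCurdy mechanism is a threshold check on
unit-cost growth against two baselines. The following is a hedged
sketch of that logic only; the 15/25 percent (current baseline) and
30/50 percent (original baseline) thresholds are those commonly cited
for the post-2006 statute, and the dollar figures in the example are
hypothetical. The statute itself (10 U.S.C. 2433) is the authoritative
source for the actual criteria.

```python
# Illustrative Nunn-McCurdy breach check. Thresholds are the commonly
# cited post-2006 values and should be verified against 10 U.S.C. 2433;
# all program figures below are hypothetical.

def nunn_mccurdy_status(current_estimate, current_baseline, original_baseline):
    """Classify unit-cost growth against the current and original baselines."""
    growth_current = (current_estimate - current_baseline) / current_baseline
    growth_original = (current_estimate - original_baseline) / original_baseline
    if growth_current >= 0.25 or growth_original >= 0.50:
        return "critical breach"
    if growth_current >= 0.15 or growth_original >= 0.30:
        return "significant breach"
    return "no breach"

# Example: unit cost grew from an original $4.0B baseline to $6.5B,
# against a current (rebaselined) figure of $5.0B.
print(nunn_mccurdy_status(6.5, 5.0, 4.0))  # critical breach (30% / 62.5%)
```

Barring rebaselining before a breach, as the amended law requires, keeps
both growth percentages anchored to the figures originally presented to
decision makers.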
The Air Force has also committed to strengthening its cost-estimating
capabilities in terms of people, methodologies, and tools. For
instance, 50 new cost estimators have recently been authorized for the
AFCAA, some of whom may be detailed to the Space and Missile Systems
Center. Finally, key players within the DOD space cost-estimating
community are meeting on a regular basis to discuss issues, review
recent findings from GAO and other groups, and explore lessons learned
and potential ideas for improvement.
Conclusions:
Costs for DOD space acquisitions over the past several decades have
consistently been underestimated--sometimes by billions of dollars. For
the most part, this has not been caused by poor cost estimating itself,
but rather the tendency to start programs before knowing whether
requirements can be achieved within available resources. In fact, with
so many unknowns about what could be achieved, how, and when, even the
most rigorous independent cost estimate could have been off by a
significant margin. Nevertheless, in the past, the Air Force has
exacerbated acquisition problems by not relying on independent cost
estimates and failing to encourage more realism in program planning and
budgeting. Moreover, even after the Air Force embraced independent cost
estimating in its acquisition policy for space, it did not facilitate
better estimating by according the cost-estimating community the
organizational clout, support, and guidance the Air Force believes are
needed to ensure the community's analyses are used. On a positive note,
the Air Force has committed to addressing some of the root causes
behind cost growth, principally by accumulating more knowledge about
technologies before starting new programs. Though adopting this
approach will be challenging without larger DOD acquisition, funding,
and requirement-setting reforms, the Air Force can facilitate better
planning and funding approaches by aligning resources and policy to
support improved cost-estimating capability and by following through on
its commitment to use independent estimates.
Recommendations for Executive Action:
We recommend that the Secretary of Defense direct the Under Secretary
of Defense for Acquisition, Technology and Logistics or the Secretary
of the Air Force, as appropriate, to take the following actions:
1. To increase accountability and transparency of decisions in space
programs where an independent estimate produced by the CAIG or AFCAA is
not chosen, require officials involved in milestone decisions to
document and justify the reasons for their choice and the differences
between the program cost estimate and the independent cost estimate.
2. To better ensure investment decisions for space programs are
knowledge-based, instill processes and tools necessary to ensure
lessons learned are incorporated into future estimates. This could
include:
* conducting postmortem reviews of past space program cost estimates
(program office and independent cost estimates) to measure cost-
estimating effectiveness and to track and record cost-estimating
mistakes;
* developing a centralized cost-estimating database that provides
realistic and credible data to cost estimators;
* establishing protocols by which cost estimators working with the
National Reconnaissance Office can share data with the DOD space cost-
estimating community while still maintaining appropriate security over
classified data; and
* ensuring estimates are updated as major events occur within a program
that could have a material impact on cost, such as budget reductions,
integration problems, hardware/software quality problems, and so forth.
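Two of the items above, postmortem reviews and a centralized database,
imply a common record format linking estimates to actuals. The sketch
below is purely illustrative of that idea; the field names, work
breakdown structure labels, and dollar figures are hypothetical, not
drawn from the report.

```python
# Hypothetical record for a centralized cost-estimating database.
# All field names and figures are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CostRecord:
    program: str                          # e.g., a satellite program name
    wbs_element: str                      # common work breakdown structure element
    fiscal_year: int
    estimated_cost: float                 # millions of dollars
    actual_cost: Optional[float] = None   # filled in later for postmortem reviews

    def estimating_error(self) -> Optional[float]:
        """Fractional over/underrun, the quantity a postmortem review would track."""
        if self.actual_cost is None:
            return None
        return (self.actual_cost - self.estimated_cost) / self.estimated_cost

rec = CostRecord("Example Program", "1.2 Payload", 2004, 250.0, 400.0)
print(f"{rec.estimating_error():.0%}")  # 60%
```

Accumulating such records across programs would give estimators the
kind of realistic, credible historical data the recommendation calls for.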
3. To optimize analysis and collaboration within the space cost-
estimating community, clearly articulate the roles and responsibilities
of the various Air Force cost-estimating organizations, and ensure that
space system cost estimators are organized so that the Air Force can
gain the most from their knowledge and expertise. In taking these
actions for programs for which no independent estimate is developed by
the CAIG, consider assigning AFCAA the responsibility for the
development of independent cost estimates for space system
acquisitions, since it is outside of the acquisition chain of command
and therefore likely to be unbiased and not pressured to produce
optimistic estimates.
Agency Comments and Our Evaluation:
DOD provided us with written comments on a draft of this report. DOD
concurred with the overall findings in our report and provided
technical comments, which have been incorporated where appropriate. DOD
also concurred with two of our recommendations and partially concurred
with one.
DOD concurred with our recommendation to instill processes and tools
necessary to ensure lessons learned are incorporated into future
estimates. DOD stated it was already taking actions to address our
recommendations. For example, the CAIG has established a process
whereby key members of the national security space cost analysis
community meet to discuss and evaluate outcomes following ACAT I space
program milestone reviews or key decision point Defense Acquisition
Board-level reviews, to provide visibility to other members of the
community on how the CAIG approaches independent cost estimate
development and to give the community an opportunity to provide
feedback to the CAIG on how to improve its processes. DOD stated that
the CAIG will work in the future to incorporate peer reviews of the
program office estimates within this existing framework. DOD also
concurred with our recommendation to develop a centralized cost-
estimating database, and stated that several groups within the space
cost-estimating community have been working to develop a database of
historical space program costs available to the community as a whole,
and have also reestablished a common space program work breakdown
structure that supports the various estimating methodologies employed
by the space cost community. Through the common database development
process, the community is working to make historical program cost data
as widely available as possible. DOD also agreed with our
recommendation to update cost estimates as major events occur within a
program, as long as they are program and program phase dependent.
Finally, DOD concurred with our recommendation to clearly articulate
the roles and responsibilities of the various cost-estimating
organizations. DOD stated that the Air Force is currently updating its
policy directive to further clarify the roles and responsibilities of
the space cost analysis organizations to optimize analysis and
collaborations, thus making the best use of the limited number of
qualified and experienced space program cost analysts. We agree that
these actions are steps in the right direction and that they will
strengthen cost-estimating capabilities and improve space program cost
estimates.
DOD partially concurred with our recommendation to require officials
involved in milestone decisions to document and justify the reasons for
their cost estimate choice and the differences between the program cost
estimate and the independent cost estimate. In commenting on this
recommendation, DOD stated that the complex decision to determine which
cost figure to use as the basis for funding and to evaluate future program
performance must weigh many competing factors that are often
qualitative in nature. It further stated that the decision is the
milestone decision authority's alone, and that documenting the explicit
justification will reduce the milestone decision authority's future
decision-making flexibility. We do not see how documenting the explicit
justification will significantly reduce the milestone decision
authority's future decision-making flexibility. While we recognize the
value of decision-making flexibility and the role that judgment must
play in such decisions, we also believe that the basis for the
decisions should withstand review, particularly after the person who
made the decision has left office. We also believe that the greater
transparency of cost-estimating decisions that a documented
justification provides is needed, particularly in light of the poor
foundation of choices made in the past on space programs.
We are sending copies of this report to interested congressional
committees and the Secretaries of Defense and the Air Force. We will
also provide copies to others on request. In addition, this report will
be available at no charge on the GAO Web site at http://www.gao.gov.
If you have any questions about this report or need additional
information, please call me at (202) 512-4841 or e-mail me at chaplainc@gao.gov.
Contact points for our Offices of Congressional Relations and Public
Affairs may be found on the last page of this report. GAO staff who
made major contributions to this report are listed in appendix VI.
Signed by:
Cristina T. Chaplain:
Acting Director, Acquisition and Sourcing Management:
[End of section]
Appendix I: Scope and Methodology:
The Chairman and the Ranking Member, Subcommittee on Strategic Forces,
House Committee on Armed Services, requested that we examine (1) in
what areas space system acquisitions cost estimates have been
unrealistic and (2) what incentives and pressures have contributed to
the quality and usefulness of cost estimates for space system
acquisitions.
To determine whether cost estimates for space system acquisitions have
been realistic, we used a case study methodology. We selected six
ongoing Air Force space system acquisitions. We selected these
acquisitions because they were far enough along in their acquisition
cycles for us to be able to observe changes in the programs since their
initial cost estimates were developed. The six space system
acquisitions are the Advanced Extremely High Frequency Satellites, the
Evolved Expendable Launch Vehicle, the Global Positioning System IIF,
the National Polar-orbiting Operational Environmental Satellite System,
the Space Based Infrared System High, and the Wideband Gapfiller
Satellites. For each of the case studies, we met with the program
office representatives at the Air Force's Space and Missile Systems
Center and at the program's prime contractors. We also obtained program
cost and other program documentation to determine how the cost
estimates were formulated and on what basis they were formulated.
To determine what incentives and pressures contributed to the quality
and usefulness of cost estimates for space system acquisitions, we
examined Department of Defense (DOD) and Air Force policies for
developing and updating cost estimates for space programs. We also used
a data collection instrument to obtain information on cost-estimating
practices and resources within the Air Force Cost Analysis Agency, at
the Space and Missile Systems Center, and at the space program offices.
We conducted interviews with the Office of the Secretary of Defense's
Cost Analysis Improvement Group, the Air Force Cost Analysis Agency,
and the Air Force Space and Missile Systems Center's Cost Center. On
the basis of the results of the data collection instruments and
interviews, we obtained information on the organizational alignment of
cost-estimating organizations, including roles and responsibilities, as
well as concerns over the current cost-estimating policies and
practices.
We also relied on our previous best practice studies, which have
examined pressures and incentives affecting space system acquisition
programs, the optimal levels of knowledge needed to successfully
execute programs, and complementary management practices and processes
that have helped commercial and DOD programs to reduce costs and cycle
time. Moreover, we reviewed studies from the Defense Science Board, the
DOD Inspector General, IBM, and others on space system acquisition and
cost-estimating issues.
Finally, we discussed the results of our work and our observations with
an expert panel made up of representatives from the DOD space cost-
estimating community.
We conducted our review between August 2005 and October 2006 in
accordance with generally accepted government auditing standards.
[End of section]
Appendix II: DOD Acquisition Categories for Major Defense Acquisition
Programs:
An acquisition program is categorized based on its dollar value and on
any special interest designated by the milestone decision authority.
Table 5 contains the
description and decision authority for acquisition categories ID and
IC.
Table 5: DOD Acquisition Categories and Decision Authorities:
Acquisition category (ACAT): ACAT ID; For designated major defense
acquisition programs (special interest based on technological
complexity, congressional interest, large commitment of resources,
critical role in achieving a capability, or a joint program);
Dollar value: Research, development, test, and evaluation > $365
million; Procurement > $2.19 billion;
Milestone decision authority: Under Secretary of Defense for
Acquisition, Technology and Logistics.
Acquisition category (ACAT): ACAT IC; For major defense acquisition
programs not designated as ACAT ID;
Dollar value: Research, development, test, and evaluation > $365
million; Procurement > $2.19 billion;
Milestone decision authority: Head of DOD component or, if delegated,
DOD component or service acquisition executive.
Source: DOD Instruction 5000.2, Enclosure 2, which also lists other
acquisition categories.
Note: Dollar values are fiscal year 2000 constant dollars.
[End of table]
[End of section]
Appendix III: Examples of Where Program Officials Were Too Optimistic
in Their Assumptions:
Table 6 highlights major areas where program officials were too
optimistic in their assumptions for the six space system acquisitions
we examined--the Advanced Extremely High Frequency (AEHF) Satellites,
the Evolved Expendable Launch Vehicle (EELV), the Global Positioning
System (GPS) IIF, the National Polar-orbiting Operational Environmental
Satellite System (NPOESS), the Space Based Infrared System (SBIRS)
High, and the Wideband Gapfiller Satellites (WGS).
Table 6: Examples of Optimistic Assumptions:
Space program affected: Assumed industrial base would remain constant
and available: EELV;
Examples: The original contracting concept was for the Air Force to
piggyback on the launch demand anticipated to be generated by the
commercial sector. However, the commercial demand never materialized,
and the government had to take on an additional cost burden. In
addition, the cost for launch services increased because fixed
infrastructure costs are being spread over 15 launches a year instead
of the original expectation of 75 launches a year.
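The EELV cost effect described above is simple arithmetic: annual fixed
infrastructure costs spread over far fewer launches. A minimal sketch
follows; only the 75-versus-15 launch rates come from the report, and
the dollar figures are hypothetical.

```python
# Illustrative arithmetic for spreading fixed launch infrastructure
# costs. Dollar amounts are hypothetical; only the launch rates
# (75 planned vs. 15 actual per year) come from the report.

def cost_per_launch(fixed_infrastructure_cost, variable_cost_per_launch, launches_per_year):
    """Average cost of one launch when annual fixed costs are spread evenly."""
    return fixed_infrastructure_cost / launches_per_year + variable_cost_per_launch

FIXED = 750e6     # hypothetical annual fixed infrastructure cost ($)
VARIABLE = 60e6   # hypothetical marginal cost per launch ($)

planned = cost_per_launch(FIXED, VARIABLE, 75)  # launch rate assumed at estimate time
actual = cost_per_launch(FIXED, VARIABLE, 15)   # launch rate that materialized

print(f"planned: ${planned / 1e6:.0f}M per launch")  # $70M
print(f"actual:  ${actual / 1e6:.0f}M per launch")   # $110M
```

Because the fixed term dominates at low launch rates, a fivefold drop
in demand sharply raises the per-launch cost even though nothing about
the rocket itself changed.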
Space program affected: Assumed industrial base would remain constant
and available: GPS IIF;
Examples: A deteriorating manufacturing base of contractors and
subcontractors caused the prime contractor to move the design team from
Seal Beach, California, to Anaheim, California, in 2001. Additional
moves occurred as the prime contractor consolidated development
facilities to remain competitive. For each move, the prime contractor
lost valuable workers, causing inefficiencies in the program. In
addition, the contractor took additional cost-cutting measures that
reduced quality.
Space program affected: Assumed industrial base would remain constant
and available: NPOESS;
Examples: A long production phase on this program increases the
probability of parts obsolescence. Over 70 percent of the value added
to the program is from the supply base, and some critical parts that
are unique to the program are produced by relatively small companies.
In addition, workers with these specialized skills must be United
States citizens in order to obtain security clearances, and the skills
must be developed within the existing labor pool because degree
programs currently do not produce them.
Space program affected: Assumed industrial base would remain constant
and available: SBIRS High;
Examples: Consolidation within the supplier base has adversely affected
the program. When suppliers merged, costs increased for supplier
technical assistance, product rework, and hardware qualifications. In
addition, unforeseen costs resulted when production processes and
materials were changed and facilities and personnel were relocated.
Space program affected: Assumed industrial base would remain constant
and available: WGS;
Examples: At the time of contract award, the satellite industry was
flourishing with commercial satellite orders, and the contractor
anticipated a large market. However, when the installation of optical
fiber communication lines became widespread, many of the commercial
initiatives involving proposed space systems did not materialize. The
government had planned to gain leverage from the design work of
commercial contractors but ended up having to pay for design efforts.
In addition, because of the reduction of the number of contracts
awarded, small subcontractors started to consolidate. Specialized parts
became obsolete, and the Air Force was no longer considered a high-
priority customer.
Space program affected: Assumed technology would be mature enough when
needed: AEHF;
Examples: AEHF faced several technology maturity problems including
developing a digital processing system that would support 10 times the
capacity of Milstar medium data rate without self-interference, and
using phased array antennas at extremely high frequencies, which had
never been done before. In addition, the change from a physical to an
electronic process for crypto re-keys was not expected at the start of
the AEHF program. AEHF's predecessor, Milstar, required approximately
2,400 crypto re-keys per month, which could be done physically. Under
AEHF's proposed capabilities, the number of crypto re-keys is
approximately 100,000, which is too large for a physical process and
must be done electronically. Changing the way the re-keys were done
called for a revolutionary change in the process and led to unexpected
cost and schedule growth.
Space program affected: Assumed technology would be mature enough when
needed: GPS IIF;
Examples: The cost estimate was built on the assumption that the
military code being developed in the program would fit on one chip.
However, once development started, there were interface issues, and the
subcontractor had to move to a two-chip design, which added cost growth
to the program. In addition, the problem took 8 months to solve.
Space program affected: Assumed technology would be mature enough when
needed: NPOESS;
Examples: DOD and the Department of Commerce committed funds for the
development and production of the satellites before the design was
proven and before the technology was mature. At program initiation,
only 1 of 14 critical technologies was mature, and some technology
levels have been assessed downward. For example, the 1394 Bus
Technology Readiness Level (TRL) was changed from 5 to 4 after the
contractor added more verification testing.
Space program affected: Assumed technology would be mature enough when
needed: SBIRS High;
Examples: In 2003, GAO reported that three critical technologies--the
infrared sensor, thermal management, and onboard processor--had by then
matured. When the program began in 1996, none of its critical
technologies was mature.
Space program affected: Assumed technology would be mature enough when
needed: WGS;
Examples: The X-band phased array antennas and the array power chips
were the most difficult technologies to mature, because these state-of-
the-art elements generated too much heat, which is very difficult to
remove in outer space, so they had to be redesigned.
Space program affected: Assumed Total System Performance Responsibility
(TSPR) would reduce costs and schedule: EELV;
Examples: The EELV program office entered into a TSPR contract that
does not require the contractor to deliver cost or earned value
management data. The program office stated that TSPR gave too many
responsibilities to the contractor and not enough to the government.
Space program affected: Assumed Total System Performance Responsibility
(TSPR) would reduce costs and schedule: GPS IIF;
Examples: The contract that was awarded during acquisition reform
efforts of the late 1990s adopted the TSPR approach. Under TSPR, there
was limited oversight of the contractor, and this contributed to
relaxed specifications and inspections on commercial practices, loss of
quality in the manufacturing process, and poor-quality parts that
caused test failures, unexpected redesigns, and the late delivery of
parts.
Space program affected: Assumed Total System Performance Responsibility
(TSPR) would reduce costs and schedule: NPOESS;
Examples: The NPOESS prime contractor has Shared System Performance
Responsibility (SSPR), an outgrowth of TSPR. The SSPR arrangement
relegates the government to the role of a participant in contractor
Integrated Product Team meetings. In addition, the program is managed
by officials from three separate government agencies: DOD and the
Department of Commerce share the cost of funding the development of
NPOESS, while NASA provides funding for specific technologies and
studies. Difficulties have arisen with this tri-agency approach to
managing NPOESS: the program must follow DOD's acquisition process, but
Commerce, which has control over the program, has no authority over
that process; each agency is driven by different program objectives
(i.e., military, civilian, science); and NASA shares equally in
managing the program even though it provides no funding for the
development.
Space program affected: Assumed Total System Performance Responsibility
(TSPR) would reduce costs and schedule: SBIRS High;
Examples: When the original contract was awarded, acquisition reform
efforts were being implemented and called for the use of commercial
practices instead of government standards. In order to achieve cost
savings, the SBIRS program office reduced critical up-front systems
engineering design practices and follow-on quality assurance
inspections based on the expectation that the contractor would perform
these activities with no government oversight. The prime contractor
also held the same requirements for its subcontractors as a way to keep
costs down. This lack of oversight resulted in difficulties in
determining the root causes when components began to fail during
testing. For example, there have been latent defects that required
extensive corrective action and associated cost growth with the
software redesign, single board computer halts, payload reference bench
rework, payload electromagnetic interference, software configuration
issues, propulsion solder issues, and telescope foreign object damage.
In addition, the contractor had responsibility to coordinate different
agency needs, a responsibility that proved to be difficult when trying
to resolve hardware interface issues.
Space program affected: Assumed savings from heritage systems: AEHF;
Examples: The program office cost estimators relied on data from
heritage systems to estimate AEHF nonrecurring costs. The Cost Analysis
Improvement Group (CAIG) believed the estimates based on heritage data
were subjectively derived and therefore susceptible to bias. For
example, AEHF program officials assumed that the nulling antennas would
have the same performance as those on Milstar, requiring little if any
development. In fact, because of parts obsolescence, personnel
turnover, and other issues, the entire antenna had to be redesigned at
nearly the same cost as the first one. There were similar beliefs that
legacy processing technology could be used, which turned out not to be
possible. Further, almost all of the payload software had to be
rewritten to support the new hardware. As a result, there was much less
technology transfer from Milstar II to AEHF, even though the contractor
was the same.
Space program affected: Assumed savings from heritage systems: NPOESS;
Examples: NPOESS payload development proposals relied heavily on
leveraging heritage satellite instrument technology development. The
prime contractor and the program office agreed there was too much
optimism regarding heritage sensor reuse. For example, the Visible
Infrared Imager Radiometer Suite (VIIRS) is more powerful and complex
and will weigh 20 percent more than the heritage sensor on which the
estimate was based. In addition, the Conical Microwave Imager Sounder
(CMIS) is much more complex than its heritage sensor, which itself
took more than 8 years to develop; yet the program office estimated a
4-year development schedule for CMIS. The latest cost estimate for
CMIS is approximately five times the initial estimate.
Space program affected: Assumed savings from heritage systems: SBIRS
High;
Examples: The original estimate for nonrecurring engineering was
significantly understated relative to actual experience in legacy
sensor development, and it assumed software reuse that did not occur.
According to historical data and independent cost estimators,
nonrecurring costs should have been estimated at two to three times
the original amount.
Space program affected: Assumed savings from heritage systems: WGS;
Examples: Originally, the contractor planned to gain leverage from a
commercial satellite development effort--using the same bus and phased
array antenna. The commercial satellite development effort did not
materialize, leaving DOD to pay for infrastructure and hardware design
costs. This caused WGS costs to increase and the schedule to slip.
Space program affected: Assumed no weight growth would occur: AEHF;
Examples: When the cost estimate was initially developed, the program
office assumed that satellite payload weight would remain constant. When
updating its independent cost estimate in 2004, the CAIG found that the
payload weight more than doubled between the start of development and
critical design review. Weight increased because of the addition of
phased array antennas, an antenna modification, and other requirements.
Space program affected: Assumed no weight growth would occur: NPOESS;
Examples: The CMIS sensor weight has almost doubled since the
preliminary design review. As a result, engineering change proposals
were issued to modify the spacecraft to accept the higher payload
weight.
Space program affected: Assumed no weight growth would occur: SBIRS
High;
Examples: Weight growth has occurred in the spacecraft and payload. The
spacecraft has experienced weight growth of about 59 percent because of
the need to lengthen and stiffen the structure, add a solar shield to
block sunlight from the payload, and add missing wire and harnessing.
The geosynchronous earth orbit (GEO) payload has experienced nearly 44
percent weight growth because of integration hardware and the pointing
and control assembly.
Space program affected: Assumed no weight growth would occur: WGS;
Examples: Problems with solar panel concentrators overheating caused a
solar panel redesign that led to additional weight growth in the
spacecraft bus.
Space program affected: Assumed funding stream would be sufficient and
remain stable: AEHF;
Examples: The AEHF program sustained a $100 million fiscal year 2002
funding cut. The program office reported that the funding cut would
result in a 6-month launch delay to the first three satellites and a
delay in meeting initial operational capability. The program had
rapidly increased staffing to support a warfighter need. The funding cut
resulted in contractor program reductions to fit within the revised
fiscal year 2002 budget. In addition, DOD made a decision to shift the
acquisition strategy from buying five satellites at one time to buying
three satellites as individual buys, which also caused costs to rise.
Space program affected: Assumed funding stream would be sufficient and
remain stable: GPS IIF;
Examples: The Operational Control Segment portion of the GPS IIF
program received a $37.7 million funding cut in fiscal year 2005.
Because of the funding cut, the program delayed some of the software
efforts and reduced some software requirements.
Space program affected: Assumed funding stream would be sufficient and
remain stable: NPOESS;
Examples: Between fiscal years 2004 and 2005, DOD reduced funding for
the program by about $65 million. However, the total reduction to the
program was about $130 million, because the Department of Commerce
contributes no more to the program than DOD does and therefore cut its
funding by the same amount. The program office determined that the
funding cut resulted in satellite launch delays ranging from 5 to 26
months and a cost increase of $391.2 million.
Space program affected: Assumed funding stream would be sufficient and
remain stable: SBIRS High;
Examples: A funding cut in 1998-1999 because of higher budget
priorities caused a reduction in the systems engineering staff and
contributed to a 2-year delay of the geosynchronous earth orbit
satellites. This cut caused work activities to continually stop and
restart and drove the need for interim solutions that resulted in
program instability and cost growth. It also led to a breach of the
acquisition program baseline in 2001, resulting in a change in the
procurement strategy from a single buy of five satellites to two
separate buys--one for two satellites and the other for three
satellites. Independent cost estimators calculated that costs would
double as a result of the change in procurement strategy.
Space program affected: Assumed an aggressive schedule: AEHF;
Examples: The first launch was originally scheduled for June 2006, but
in response to a potential gap in satellite coverage due to the launch
failure of the third Milstar satellite, DOD accelerated the schedule by
18 months, aiming for a first launch in December 2004. An unsolicited
contractor proposal stated that the contractor could meet the
accelerated date, even though the requirements for AEHF had not been
fully determined. The program office knew that the proposed schedule
was overly optimistic, but the decision was made at high levels in DOD
to award the contract. However, DOD did not commit the funding to support the
activities and manpower needed to design and build the satellites more
quickly. Funding issues further hampered development efforts,
increased schedule delays, and contributed to cost increases.
Space program affected: Assumed an aggressive schedule: NPOESS;
Examples: When the estimate was developed, NPOESS was expected to be
heavier, require more power, and have over twice as many sensors as
heritage satellites. Yet the program office estimated that the
satellites would be developed, integrated, and tested in less time than
heritage satellites. Independent cost estimators highlighted to the
NPOESS program office that the proposed integration schedule was
unrealistic when compared to historical satellite programs. Later, the
CAIG cautioned the program office that not only was the system
integration assembly and test schedule unrealistic, but the assumptions
used to develop the estimate were not credible.
Space program affected: Assumed an aggressive schedule: SBIRS High;
Examples: The schedule proposed in 1996 did not allow sufficient time
for geosynchronous earth orbit system integration and did not
anticipate program design and workmanship flaws, which eventually cost
the program considerable delays. In addition, the schedule was
optimistic in regard to ground software productivity, and time needed
to calibrate and assess the health of the satellite. There has been
almost a 3-year delay in the delivery of the highly elliptical orbit
(HEO) sensors and a 6-year delay in the launch of the first GEO
satellite.
Space program affected: Assumed an aggressive schedule: WGS;
Examples: The request for proposals specified that the budget available
was $750 million for three satellites and the ground control system to
be delivered within 36 months. On the basis of these requirements,
competing contractors were asked to offer maximum capacity, coverage,
and connectivity through a contract that would make use of existing
commercial practices and technologies. However, higher design
complexity and supplier quality issues caused the WGS schedule to
stretch to 78 months for the first expected launch. Historically, DOD
has taken between 55 and 79 months to develop satellites similar to
WGS; the 78-month result thus falls within that range, but the
original 36-month schedule was unrealistic.
Space program affected: Assumed no growth in requirements: AEHF;
Examples: DOD awarded the contract for AEHF before the requirements
were fully established to fill the gap left by the Milstar launch
failure. As a result, DOD frequently and substantially altered
requirements in the early phases of the program and changed the system
design. For example, a new requirement increased the need for anti-
jamming protection, which led to a cost increase of $100 million. In
addition, new requirements related to training, support, and
maintainability led to a cost increase of $90 million.
Space program affected: Assumed no growth in requirements: GPS IIF;
Examples: GPS IIF was intended as a follow-on to the GPS II program, yet
shortly after the contract was awarded, the government added the
requirement for an additional auxiliary payload. This requirement
caused the satellite design to be larger than originally planned and,
in turn, required a larger launch vehicle. Requirements for more robust
jamming capability to secure satellite transmissions were also added.
Changes from a two-panel to a three-panel solar array design and
flexible power were necessary to allow for greater power and thermal
capability requirements.
Space program affected: Assumed no growth in requirements: SBIRS High;
Examples: DOD is developing SBIRS High to improve missile warning,
missile defense, technical intelligence, and battle-space
characterization. As such, SBIRS has many customers, including the Air
Force, Army, missile defense, and other agencies, each of which has its
own requirements. This has complicated SBIRS development: the program
must satisfy 19 key performance parameters, about five times as many
as a typical DOD program.
In addition, there are over 12,600 requirements for the program to
address, and to date, requirements from external users have not been
fully defined. Under the TSPR arrangement, the contractor was
responsible for coordinating these requirements. This effort was
challenging and, according to a DOD official, one better suited to the
government, since all the agencies had to agree on the requirements.
The SBIRS contractor encountered numerous problems when trying to
resolve the interface issues among the various agencies. Moreover, the
development of interface control documents involved different
certification requirements for each agency, and the SBIRS contractor
had too few systems engineers to handle the workload. This shortage
resulted in many requirements not flowing down, which led to problems
later on.
Source: This table is based on conversations with program and
contracting officials and analysis of data they provided. In some
cases, we made our own designations based on our prior findings.
[End of table]
[End of section]
Appendix IV: Examples Where Independent Cost Estimates Were Not Relied
Upon:
We found examples from our close examinations of the AEHF, NPOESS, and
SBIRS High programs where independent cost estimates were not relied
upon by program decision makers. Independent estimates for these space
system acquisitions forecasted higher costs and lengthier schedules
than program office or service cost estimates. This appendix provides
detailed information on the differences between the program office cost
estimates and the independent cost estimates for the AEHF, NPOESS, and
SBIRS High programs.
AEHF:
In 2004, AEHF program decision makers relied upon the program office
cost estimate rather than the independent estimate developed by the
CAIG to support the production decision for the AEHF program. At that
time, the AEHF program office estimated the system would cost about $6
billion. This was based on the assumption that AEHF would have 10 times
more capacity than the predecessor satellite--Milstar--but at half the
cost and weight. However, the CAIG concluded that the program could not
deliver more data capacity at half of the weight given the state of
technology at that time. In fact, the CAIG believed that in order to
get the desired increase in data rate, the weight would have to
increase proportionally. As a result, the CAIG estimated that AEHF
would cost $8.7 billion, and predicted a $2.7 billion cost overrun for
the AEHF program. Table 7 displays the differences between the program
office and CAIG cost estimates.
Table 7: Comparison of 2004 AEHF Program Office and Independent Cost
Estimates:
Millions of fiscal year 2006 dollars.
Program office estimate: $6,015;
Independent cost estimate: AFCAA: [A];
Independent cost estimate: CAIG: $8,688;
Difference: 44%;
Latest Program office estimate: $6,132.
Source: CAIG and GAO analysis.
[A] AFCAA worked jointly with the CAIG to develop the independent
estimate.
[End of table]
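The percentage shown in table 7 is simply the independent estimate's excess over the program office estimate. The arithmetic can be checked with a minimal sketch (Python is used here purely for illustration; the figures are those in the table):

```python
def percent_difference(program_office, independent):
    """Excess of the independent estimate over the program office
    estimate, as a percentage of the program office figure."""
    return (independent - program_office) / program_office * 100

# Table 7 figures, in millions of fiscal year 2006 dollars
delta = percent_difference(6015, 8688)
print(round(delta))  # 44, matching the table's "Difference: 44%"
```

The same computation reproduces the 23 percent difference in table 9 and the 43 percent delta in table 11.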
The CAIG relied on weight data from historical satellites to estimate
the cost of AEHF because it considers weight to be the single best cost
predictor for military satellite communications. The historical data
from the AEHF contractor showed that the weight had more than doubled
since the program began and the majority of the weight growth was in
the payload. The Air Force also used weight as a cost predictor, but
attributed the weight growth to structural components rather than the
more costly payload portion of the satellite. When the CAIG briefed the
Air Force on its estimate, the program office disagreed with the CAIG
results, saying it did not see much payload weight growth in the data
it analyzed. The CAIG reported that it used AEHF contractor cost
reports to determine the amount of weight growth for the payload, and
that these data were corroborated by AEHF monthly earned value
management data, which showed cost overruns for the payload effort. As
table 8 shows, the payload weight for the AEHF satellite increased
about 116 percent.
Table 8: Historical AEHF Weight Growth:
Milestone: Milestone I (A);
Date: January 1999;
Payload weight (lbs): 1,694;
Percent growth: n/a.
Milestone: Milestone II (B);
Date: May 2001;
Payload weight (lbs): 2,631;
Percent growth: 55.
Milestone: Preliminary design review;
Date: August 2001;
Payload weight (lbs): 3,437;
Percent growth: 103.
Milestone: Critical design review;
Date: April 2004;
Payload weight (lbs): 3,659;
Percent growth: 116.
Source: CAIG.
[End of table]
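Each percent-growth figure in table 8 measures that milestone's payload weight against the Milestone I baseline. A small illustrative check, using the table's own numbers:

```python
def percent_growth(baseline, current):
    """Growth over the Milestone I baseline weight, in percent."""
    return (current - baseline) / baseline * 100

baseline = 1694  # Milestone I payload weight in lbs (January 1999)
for milestone, weight in [("Milestone II", 2631),
                          ("Preliminary design review", 3437),
                          ("Critical design review", 3659)]:
    print(milestone, round(percent_growth(baseline, weight)))
# prints 55, 103, and 116, matching the table
```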
The Air Force attributed AEHF cost growth to problems with the
cryptographic portion of the program, which is being developed by the
National Security Agency (NSA). AEHF program officials stated that
weight growth was consistent with that of other space programs.
However, the CAIG stated that major cost growth was inevitable from the
start of the AEHF program because historical data showed that it was
possible to achieve a weight reduction or an increase in data capacity,
but not both at the same time.
In addition, the CAIG stated that the Air Force was optimistic in
developing the AEHF schedule estimate. During the production decision
review in 2004, the CAIG projected a first satellite launch date 28
months later than the program office's, and judged the program office
schedule to have no more than a 1 percent chance of success. The CAIG
also stated that because of problems with cryptographic development and
reliability concerns with other technical aspects of the program, such
as the phased array antenna and digital signal processing, the
ambitious AEHF schedule was in jeopardy, and the program would not
likely be implemented as planned.
In February 2005, the CAIG reviewed the proposed revision to the AEHF
Acquisition Program Baseline (APB). In a memorandum sent to the
Assistant Secretary of Defense for Network and Information Integration,
the CAIG chairman did not concur with the AEHF draft APB. The CAIG
chairman explained that while the Air Force estimate included a 24
percent increase to the average procurement unit cost, which was 1
percent below the threshold for a Nunn-McCurdy certification, the
CAIG's estimate prepared in December 2004 projected an increase of over
100 percent. Further, because of the vast differences between the Air
Force and CAIG cost estimates, the CAIG chairman expressed concern that
Congress would perceive the revised APB as an attempt to avoid a Nunn-
McCurdy certification.
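The disagreement above turns on a simple comparison of unit-cost growth against a statutory threshold. The following sketch is a simplification for illustration only: the Nunn-McCurdy statute defines several thresholds and baselines, and the 25 percent figure here reflects only the certification threshold discussed in the text.

```python
# Simplified: 25 percent is the certification threshold discussed in the text;
# the statute itself defines multiple thresholds and baselines.
CERTIFICATION_THRESHOLD = 25.0  # percent growth in average procurement unit cost

def apuc_growth_percent(baseline_apuc, current_apuc):
    """Growth in average procurement unit cost over the baseline, in percent."""
    return (current_apuc - baseline_apuc) / baseline_apuc * 100

def requires_certification(growth_percent):
    return growth_percent >= CERTIFICATION_THRESHOLD

# The Air Force's 24 percent increase fell 1 point below the threshold;
# the CAIG's December 2004 estimate projected an increase of over 100 percent.
print(requires_certification(24))   # False
print(requires_certification(101))  # True
```

On these figures, the Air Force estimate avoids certification by a single percentage point while the CAIG estimate far exceeds the threshold, which is exactly the concern the CAIG chairman raised.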
There is still risk for the AEHF program costs to grow. As a result of
delays, AEHF satellites have not yet been through thermal vacuum
testing. Spacecraft must endure a wide range of temperatures associated
with liftoff and ascent through the atmosphere and exposure to the
extreme temperatures of space. The thermal environment is generally
considered the most stressful operating environment for hardware, and
electronic parts are especially sensitive to thermal conditions.
Problems such as cracks, bond defects, discoloration, performance
drift, coating damage, and solder-joint failure have typically
occurred. Thermal vacuum testing is used to screen out components with
physical flaws and demonstrate that a device can activate and operate
in extreme and changing temperatures. Because thermal vacuum testing
provides the most realistic simulation of flight conditions, problems
typically surface during this testing. If they do on AEHF, more delays
and cost overruns are likely.
NPOESS:
NPOESS provides another example of where there were large differences
between program office and independent cost estimates. In 2003,
government decision makers relied on the program office's $7.2 billion
cost estimate rather than the $8.8 billion independent cost estimate
presented by the AFCAA to support the NPOESS development contract
award. Program officials and decision makers preferred the more
optimistic assumptions and costs of the program office estimate,
viewing the independent estimate as too high. The $1.65 billion
difference between the estimates is shown in table 9.
Table 9: Comparison of 2003 NPOESS Program Office and Independent Cost
Estimates:
Millions of fiscal year 2006 dollars.
Program office estimate: $7,219;
Independent cost estimate: AFCAA: $8,869;
Independent cost estimate: CAIG: [A];
Difference: 23%;
Latest Program office estimate: $11,400.
Source: Air Force Cost Analysis Improvement Group briefing, April 2003.
Note: The program office and the AFCAA cost estimates were based on a
purchase of six satellites, and the latest estimate is based on a
purchase of four satellites, with less capability and a renewed
reliance on a European contribution.
[A] The CAIG was not involved in preparing the 2003 independent cost
estimate.
[End of table]
AFCAA based its estimate on an analysis of historical data from
satellite systems (i.e., NASA's Aqua and Aura and DOD's Defense
Meteorological Satellite Program [DMSP])[Footnote 14],
independent software and hardware models, and a risk simulation model
using input from 30 independent engineers. The differences between the
two estimates revolved around three major areas:
* The first included a discrepancy of almost $270 million in the cost
for ground software development. The program office estimated the cost
at $90 million based on the contractor's proposal for scaling the
software and productivity rates that were highly optimistic. AFCAA
based its estimate on a commercial software cost model using DMSP and
SBIRS High historical software lines of code growth and actual
productivity rates from the Global Positioning System program.
* The second difference was in the assembly and integration and testing
estimates. Compared to actual integration efforts on historical
satellites used by the AFCAA, the program office estimate to integrate
the payloads onto the satellite bus was nearly $132 million less than
AFCAA's estimate.
* The third area involved the systems engineering and program
management costs for space segment development and production. AFCAA
used actual data from the Aqua and Aura satellites, while the program
office relied on the contractor's proposal--resulting in a difference
of more than $130 million. The program office's estimate was lower
based on an assumption that the costs for systems engineering and
program management would be reduced by almost 50 percent between
development and production. AFCAA noted, however, that Aqua, Aura, and
DMSP did not show a significant decrease in these costs over time.
Because that assumption was not supported by historical data, AFCAA
concluded that the program office's cost and schedule estimates
suffered from a lack of realism. Nevertheless, program officials and
decision makers did not use the results of AFCAA's independent cost
estimate.
In May 2004, the Under Secretary of the Air Force asked the CAIG to
prepare an independent cost estimate for the NPOESS program. The
estimate was completed in January 2005, following completion of the
contractor's re-evaluation of the program baseline in November 2004.
The cost estimate focused primarily on the proposed integration
schedule of the NPOESS satellites. This estimate, like AFCAA's estimate
before it, was based on historical cost data from analogous satellites
and concluded that the program office's proposed integration schedule
for the program was unrealistic. For example, the program office
proposed an integration schedule for the first NPOESS satellite that
was about half the time needed for an analogous satellite that had
almost the same number of sensors. In other words, the NPOESS program
estimated that it would integrate close to the same number of sensors
in half the time. Table 10 illustrates how the program office developed
its integration estimate for NPOESS, which was based on data from Aqua
satellites.
Table 10: Program Office Integration Estimates for NPOESS:
Program: Aqua;
Number of sensors: 6;
Months to integrate based on historical data: 31;
Months to integrate sensor: 5.2;
Deletion of months due to unforeseen problems: -17;
Months to integrate without problems: 14;
Months to integrate sensor without problems: 2.3.
Program: NPOESS (first satellite integration );
Number of sensors: 5;
Months to integrate based on historical data: 26;
Months to integrate sensor: 5.2;
Deletion of months due to unforeseen problems: N/A;
Months to integrate without problems: 14;
Months to integrate sensor without problems: 2.8.
Source: NPOESS Executive Committee briefing, January 2005.
[End of table]
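The per-sensor figures in table 10 follow directly from dividing integration months by sensor count, and the contested 26-month figure comes from applying Aqua's historical per-sensor rate to NPOESS's five sensors. A small check of the table's arithmetic:

```python
# Table 10 figures
aqua_sensors, aqua_months = 6, 31   # Aqua actuals, including problem time
problem_months = 17                 # months attributed to unforeseen problems
clean_months = aqua_months - problem_months   # 14 months "without problems"

npoess_sensors = 5
per_sensor_rate = aqua_months / aqua_sensors             # ~5.2 months per sensor
historical_estimate = per_sensor_rate * npoess_sensors   # ~26 months
program_office_rate = clean_months / npoess_sensors      # 2.8 months per sensor

print(round(historical_estimate))     # 26
print(round(program_office_rate, 1))  # 2.8
```

The contrast between the 26-month historical figure and the 14-month "no problems" figure is the gap the CAIG objected to.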
As the basis for estimating the time needed to integrate NPOESS
sensors on the first satellite, the program office relied on Aqua
actuals with the time attributed to unforeseen problems removed,
rather than the unadjusted historical data, which would have yielded
an estimate of 26 months to integrate five sensors. The program office
and the contractor contended
that a novel approach was being taken to satellite integration on the
NPOESS program. The CAIG disagreed with this contention, stating that
the proposed integration approach was not really novel because the use
of a test bed model is a common tool used by satellite programs and
would not yield the significant savings asserted by the program office.
The CAIG, instead, estimated 25 months for integrating five sensors
based on Aqua, Aura, and DMSP historical data. As a result, the CAIG's
estimate was almost double the program office's. The CAIG also
expressed concern to program officials that the integration schedule
was severely underestimated and that the difference between the program
office estimate and the CAIG's estimate was more than 6 years.
The program office's 2003 estimate of $7.2 billion has been shown to be
highly unrealistic, with significant cost overruns and schedule
delays--thus far--for sensor development only. Overall satellite integration
efforts have been delayed due to the problems experienced in
development of the sensors. In June 2006, the Office of the Secretary
of Defense completed the Nunn-McCurdy process and certified a
restructured program that reduced the number of satellites to be
developed--from six to four, with the first launch being delayed to
2013 from 2009. Cost has grown from the original estimate of nearly
$7.2 billion to over $11.4 billion--approximately a 60 percent
increase.
SBIRS High:
On the SBIRS High program, the program office and AFCAA predicted cost
growth as early as 1996, when the program was initiated. While both
estimates at that time were close ($5.7 billion in 2006 dollars by the
program office and $5.6 billion in 2006 dollars by AFCAA), both were
much more than the contractor's estimated costs. Nevertheless, the
program was subsequently budgeted at $3.6 billion, almost $2 billion
less than either the AFCAA or the program office estimate. The CAIG
stated that the SBIRS program assumed savings under
TSPR that simply did not materialize. SBIRS program officials also
planned on savings from simply rehosting existing legacy software, but
those savings were not realized because all of the software eventually
had to be rewritten; completing the first increment of software took 2
years longer than planned.
Savings were also assumed by the contractor in the area of systems
engineering. The SBIRS High contractor initially planned to use fewer
systems engineers, even though historical data showed that programs
similar to SBIRS High relied on three to almost four times as many
systems engineers. Some of the tasks dropped from the systems engineering
effort included verification and cycling of requirements because the
government assumed that the contractor would perform these activities
with little or no oversight. The contractor also held the same
requirements for its subcontractors, resulting in a program with
limited systems engineering. The lack of systems engineers has led to
latent design flaws and substantially more integration and testing than
planned because no one knew what had gone wrong when components began
to fail during testing. This large amount of rework and troubleshooting
has led to substantial cost and schedule increases.
In 2005, the CAIG reviewed the SBIRS High production program,
including estimating the cost to develop geosynchronous earth orbit
(GEO) satellites 3-5 as clones of GEOs 1 and 2, in order to determine
the cost growth incurred by the production program since 2002. The CAIG's
analysis projected a 25 percent Nunn-McCurdy breach in average
procurement unit cost as a result of contractor cost and schedule
performance being markedly worse than those experienced on historical
satellite programs. In addition, the CAIG found that government actions
to date have been ineffective in controlling cost and schedule growth.
The program office, on the other hand, showed a much lower cost
estimate for the production cost of GEO satellites 3-5, as seen in
table 11.
Table 11: SBIRS High GEO 3-5 Procurement Funding Analysis:
Millions of then-year dollars.
Three individual satellite procurements;
CAIG estimate: $2,892;
Program office: $2,027;
Delta: $865;
Delta %: 43%.
Source: CAIG and GAO analysis.
[End of table]
The CAIG based its estimate on contractor data for prime contractor
systems engineering and program management, and payload integration
assembly and test, which showed substantial increases in the period of
performance, staffing levels, and hourly rates over initial estimates.
In addition, the CAIG's estimate reflected a contractual change from a
shared fee pool to a traditional prime contractor/subcontractor
relationship.
The CAIG expressed concern that despite restructuring and rebaselining
the program, SBIRS High has struggled unabated since contract award.
The CAIG also cautioned that rebaselining would only allow the program
to hide problems in the short term. For example, the CAIG reported that
earned value management data showed GEO costs were following the same
downward trend as the HEO portion of the program, which meant that
additional cost and schedule delays were possible.
[End of section]
Appendix V: Comments from the Department of Defense:
Office Of The Assistant Secretary Of Defense:
6000 Defense Pentagon:
Washington, DC 20301-6000:
Networks and Information Integration:
NOV 09 2006:
Ms. Cristina T. Chaplain:
Director, Acquisition and Sourcing Management:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Ms. Chaplain:
Thank you for the opportunity to comment on the GAO Draft Report GAO-
07-96, entitled "Space Acquisitions: DoD Needs to Take More Action to
Address Unrealistic Initial Cost Estimates of Space Systems," dated
October 13, 2006 (GAO Code 120554). I concur with the overall findings
of the report and have enclosed comments on your specific
recommendations.
Again, thank you for this opportunity to comment on your report.
Sincerely,
Signed by:
John R. Landon:
Deputy Assistant Secretary of Defense:
(C3ISR & IT Acquisition):
Enclosures:
As stated:
GAO Draft Report Dated October 13, 2006 GAO-07-96 (GAO Code 120554):
"Space Acquisitions: DOD Needs To Take More Action To Address
Unrealistic Initial Cost Estimates Of Space Systems"
Department Of Defense Comments To The GAO Recommendations:
Recommendation 1: The GAO recommended that the Secretary of Defense
direct the Under Secretary of Defense for Acquisition, Technology and
Logistics or the Secretary of the Air Force, as appropriate, to
increase accountability and transparency of decisions in space programs:
where an independent estimate produced by the Cost Analysis Improvement
Group (CAIG) or Air Force Cost Analysis Agency (AFCAA) is not chosen,
require officials involved in milestone decisions to document and
justify the reasons for their choice and the differences between the
program cost estimate and the independent cost estimate. (p. 21/GAO
Draft Report):
DOD Response: Partially Concur. At both the development and the
production Milestones for all Acquisition Category (ACAT) I programs,
U.S. Code Title 10 (Armed Forces) requires the Milestone Decision
Authority (MDA) [USD(AT&L)] to be informed by an independently
developed life cycle cost estimate before making a decision on how to
proceed. The OSD CAIG is charged with developing this estimate and, in
practice, presents their findings to the Defense Acquisition Board
(DAB) when they meet to advise the MDA. Additionally, the OSD CAIG
formally documents its Independent Cost Estimate (ICE) in a report to
the MDA. The complex decision to determine the cost figure used as a
basis for funding and to evaluate future program performance must weigh
many competing factors that are often qualitative in nature. As with
all other acquisition-related decisions, the decision is the MDA's
alone; although the rationale is thoroughly discussed with the MDA's
advisors during the DAB meeting and clearly documented in the
Acquisition Decision Memorandum (ADM), documenting explicit
justification would reduce the MDA's future decision-making
flexibility.
Recommendation 2: The GAO recommended that the Secretary of Defense
direct the Under Secretary of Defense for Acquisition, Technology and
Logistics or the Secretary of the Air Force, as appropriate, to better
ensure investment decisions for space programs are knowledge-based:
instill processes and tools necessary to ensure lessons learned are
incorporated into future estimates. (p. 21/GAO Draft Report):
DOD Response: Concur with this recommendation. DoD also concurs with
the following recommendations in the report:
* Conducting post-mortem reviews of past space program cost estimates
(program office and independent cost estimates) to measure cost
estimating effectiveness and to track and record cost estimating
mistakes.
DoD Response: Concur. The OSD CAIG has an established process whereby
they meet with the key members of the National Security space cost
analysis community to discuss and evaluate the outcomes following ACAT
I space program Milestone or Key Decision Point (KDP) DAB-level
reviews. The purpose of this meeting is to provide visibility to the
other members of the National Security space cost analysis community on
how the CAIG approaches ICE development and to give the community an
opportunity to provide feedback to the CAIG on how to improve their
processes. The OSD CAIG will work in the future to incorporate, within
this existing framework, peer reviews of the associated program office
estimate. The OSD CAIG, as required by the DoD's space acquisition
regulations, tracks and documents their ICES against current program
office estimates as each ACAT I program proceeds through its
development and production phases.
* Developing a centralized cost estimating database that provides
realistic and credible data to cost estimators.
DoD Response: Concur. Several groups, including the OSD CAIG sponsored
National Security Space Cost Analysis Symposium, Consortium on Space
Technology Estimating Research (CoSTER), and the National
Reconnaissance Office (NRO) led cost integrated process teams (IPTs),
have been working to develop a database of historical space program
costs available to the community as a whole for model development and
estimate preparation. Additionally, the OSD CAIG, in conjunction with
USD(AT&L) and with the approval of the National Security space cost
community, has reestablished a common space program work breakdown structure
(WBS), incorporated in the latest version of Military Handbook 881,
that supports the various estimating methodologies employed by the
space cost community. This recently adopted standard will be used as a
basis for all future space program cost data collection.
* Establishing protocols by which cost estimators working with the NRO
can share data with the DOD space cost estimating community while still
maintaining appropriate security over classified data.
DoD Response: Concur. Through the common database development process
described above, the community, within security constraints, is working
to make historical program cost data as widely available as possible.
* Ensuring estimates are updated as major events occur within a program
that could have a material impact on cost, such as budget reductions,
integration problems, hardware/software quality problems, etc.
DoD Response: Concur. Updating estimates, including independently
developed cost estimates, more frequently than only at designated KDPs
is clearly helpful to inform budgets and support program resource
adjustment decisions. However, it is important not to mandate such
updates because, by their nature, they should be program- and
program-phase-dependent.
Recommendation 3: The GAO recommended that the Secretary of Defense
direct the Under Secretary of Defense for Acquisition, Technology and
Logistics or the Secretary of the Air Force, as appropriate, to
optimize analysis and collaboration within the space cost estimating
community, clearly articulate the roles and responsibilities of the
various Air Force cost estimating organizations, and ensure that space
system cost estimators are organized so that the Air Force can gain the
most from their knowledge and expertise. In taking these actions for
programs for which no independent estimate is developed by the DoD
CAIG, consider assigning the Air Force Cost Analysis Agency (AFCAA)
responsibility for developing independent cost estimates for space
system acquisitions, since it is outside the acquisition chain of
command and therefore more likely to be unbiased and not pressured
to produce optimistic estimates. (p. 21/GAO Draft Report):
DOD Response: Concur. The Air Force is currently updating its Cost
and Economics Policy Directive (AFPD 65-5) and associated Instructions
to further clarify the roles and responsibilities of its space cost
analysis organizations. This policy also addresses Air Force goals to
optimize analysis and collaboration, thus making the best use of the
limited number of qualified and experienced space program cost
analysts.
[End of section]
Appendix VI: GAO Contacts and Staff Acknowledgments:
GAO Contact:
Cristina T. Chaplain (202) 512-4859 or chaplainc@gao.gov
Staff Acknowledgments:
In addition to the contact named above, Brian Bothwell, Greg Campbell,
Joanna Chan, Jennifer Echard, Art Gallegos, Barbara Haynes, Anne
Hobson, Jason Lee, Sigrid McGinty, Karen Richey, Suzanne Sterling, Adam
Vodraska, and Peter Zwanzig made key contributions to this report.
[End of section]
Related GAO Products:
Defense Space Activities: Management Actions Are Needed to Better
Identify, Track, and Train Air Force Space Personnel. GAO-06-908.
Washington, D.C.: September 21, 2006.
Defense Acquisitions: DOD Needs to Establish an Implementing Directive
to Publish Information and Take Actions to Improve DOD Information on
Critical Acquisition Positions. GAO-06-987R. Washington, D.C.:
September 8, 2006.
Defense Acquisitions: Space System Acquisition Risks and Keys to
Addressing Them. GAO-06-776R. Washington, D.C.: June 1, 2006.
Space Acquisitions: Improvements Needed in Space Systems Acquisitions
and Keys to Achieving Them. GAO-06-626T. Washington, D.C.: April 6,
2006.
Best Practices: Better Support of Weapon System Program Managers Needed
to Improve Outcomes. GAO-06-110. Washington, D.C.: November 30, 2005.
Defense Acquisitions: Incentives and Pressures That Drive Problems
Affecting Satellite and Related Acquisitions. GAO-05-570R. Washington,
D.C.: June 23, 2005.
Defense Acquisitions: Improved Management Practices Could Help Minimize
Cost Growth in Navy Shipbuilding Programs. GAO-05-183. Washington,
D.C.: February 28, 2005.
NASA: Lack of Disciplined Cost-Estimating Processes Hinders Effective
Program Management. GAO-04-642. Washington, D.C.: May 28, 2004.
Defense Acquisitions: Despite Restructuring, SBIRS High Program Remains
at Risk of Cost and Schedule Overruns. GAO-04-48. Washington, D.C.:
October 31, 2003.
Defense Acquisitions: Improvements Needed in Space Systems Acquisition
Management Policy. GAO-03-1073. Washington, D.C.: September 15, 2003.
Military Space Operations: Common Problems and Their Effects on
Satellite and Related Acquisitions. GAO-03-825R. Washington, D.C.: June
2, 2003.
FOOTNOTES
[1] 10 U.S.C. § 2434 (2000).
[2] 10 U.S.C. § 2434(b)(1)(A).
[3] 10 U.S.C. § 2434(b)(1)(B).
[4] DOD Directive 5000.04, Cost Analysis Improvement Group at ¶ 2
(Aug. 2006); DOD Instruction 5000.2, Enclosure 6, Resource Estimation
(May 2003).
[5] National Security Space Acquisition Policy 03-01 (revised December
2004).
[6] National Security Space Acquisition Policy at Appendix 3.2.
[7] Recently, the Under Secretary of Defense for Acquisition,
Technology and Logistics withdrew its delegation of milestone decision
authority from the Air Force. As a result, although some acquisition
authority was returned to the Air Force, the Under Secretary of Defense
for Acquisition, Technology and Logistics is the current milestone
decision authority for major space system acquisitions. It is not known
when or if this role will be placed back within the Air Force.
[8] GAO, Best Practices: Better Support of Weapon System Program
Managers Needed to Improve Outcomes, GAO-06-110 (Washington, D.C.: Nov.
30, 2005).
[9] 10 U.S.C. § 2433. This oversight mechanism originated with the
Department of Defense Authorization Act, 1982. It was made permanent in
the following year's authorization act and has been amended several
times. Generally, the law requires DOD to review programs and report
(and in some cases submit a certification) to Congress whenever cost
growth reaches specified thresholds. The statute is commonly known as
Nunn-McCurdy, based on the names of the sponsors of the original
legislation.
[10] DOD Directive 5000.04 at ¶ 4.8.
[11] GAO, Defense Space Activities: Management Actions Are Needed to
Better Identify, Track, and Train Air Force Space Personnel, GAO-06-908
(Washington, D.C.: Sept. 21, 2006), and Defense Acquisitions: DOD Needs
to Establish an Implementing Directive to Publish Information and Take
Actions to Improve DOD Information on Critical Acquisition Positions,
GAO-06-987R (Washington, D.C.: Sept. 8, 2006).
[12] John Warner National Defense Authorization Act for Fiscal Year
2007, Pub. L. No. 109-364 §§ 820, 853 (2006).
[13] John Warner National Defense Authorization Act for Fiscal Year
2007, Pub. L. No. 109-364, § 853 (2006), and accompanying conference
report, H.R. Rep. No. 109-702, pages 784-785.
[14] Aqua collects information on evaporation from the oceans, water
vapor from the atmosphere, radiative energy fluxes, land vegetation
cover, and land, air, and water temperatures, among other things.
Aura's mission is to study the Earth's ozone, air quality, and climate,
focusing exclusively on the composition, chemistry, and dynamics of the
Earth's upper and lower atmospheres. The Defense Meteorological
Satellite Program collects weather data for military operations.
GAO's Mission:
The Government Accountability Office, the investigative arm of
Congress, exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds;
evaluates federal programs and policies;
and provides analyses, recommendations, and other assistance to help
Congress make informed oversight, policy, and funding decisions. GAO's
commitment to good government is reflected in its core values of
accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548
To order by phone:
Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470
Public Affairs:
Jeff Nelligan, managing director,
NelliganJ@gao.gov
(202) 512-4800
U.S. Government Accountability Office,
441 G Street NW, Room 7149
Washington, D.C. 20548