Defense Acquisitions
Improvements Needed in Space Systems Acquisition Management Policy
GAO ID: GAO-03-1073 September 15, 2003
This is the accessible text file for GAO report number GAO-03-1073
entitled 'Defense Acquisitions: Improvements Needed in Space Systems
Acquisition Management Policy' which was released on September 15,
2003.
This text file was formatted by the U.S. General Accounting Office
(GAO) to be accessible to users with visual impairments, as part of a
longer term project to improve GAO products' accessibility. Every
attempt has been made to maintain the structural and data integrity of
the original printed product. Accessibility features, such as text
descriptions of tables, consecutively numbered footnotes placed at the
end of the file, and the text of agency comment letters, are provided
but may not exactly duplicate the presentation or format of the printed
version. The portable document format (PDF) file is an exact electronic
replica of the printed version. We welcome your feedback. Please E-mail
your comments regarding the contents or accessibility features of this
document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to the Chairman, Subcommittee on Defense, Committee on
Appropriations, House of Representatives:
United States General Accounting Office:
GAO:
September 2003:
Defense Acquisitions:
Improvements Needed in Space Systems Acquisition Management Policy:
GAO-03-1073:
GAO Highlights:
Highlights of GAO-03-1073, a report to the Chairman, Subcommittee on
Defense, Committee on Appropriations, House of Representatives
Why GAO Did This Study:
The Department of Defense is spending nearly $18 billion annually to
develop, acquire, and operate satellites and other space-related
systems. The majority of satellite programs that GAO has reviewed over
the past 2 decades experienced increased costs and delayed schedules.
DOD has recently implemented a new acquisition management policy,
which sets the stage for decision making on individual space programs.
GAO was asked to assess the new policy.
What GAO Found:
DOD's new space acquisition policy may help provide more consistent
and robust information on technologies, requirements, and costs. For
example, the policy employs a new independent cost estimating process,
independent program reviews performed by space experts not connected
with the program, and more rigorous analyses of alternatives,
requirements, and system interdependencies. This information may help
decision-makers assess whether gaps exist between expectations and
what the program can deliver.
However, the benefits that can be derived from these tools will be
limited since the new policy does not alter DOD's practice of
committing major investments before knowing what resources will be
required to deliver promised capability. Instead, the policy
encourages development of leading edge technology within product
development, that is, at the same time the program manager is
designing the system and undertaking other product development
activities. As our work has repeatedly shown, such concurrency
increases the risk that significant problems will be discovered as the
system is integrated and built, when it is more costly and time-
consuming to fix them. Moreover, when even one technology does not
mature as expected, the entire program can be thrown off course since
time and cost for invention cannot be reliably estimated. DOD's new
acquisition policy for its other weapon systems recognizes these risks
and consequently requires technology and product development to be
done separately.
What GAO Recommends:
GAO is recommending that DOD modify its policy to separate technology
development from product development and ensure decisions to start
programs are based on sound criteria. DOD disagreed with our
recommendations principally because it believes that implementing them
will slow down acquisitions, increase risks, and prevent DOD from
taking advantage of cutting edge technology. Our past reviews of best
practices, however, have shown that risk and time are reduced and
capability is increased when programs begin with knowledge that
technologies can work as intended. DOD's policy for other weapon
systems incorporates this view.
[End of section]
Contents:
Letter:
Results in Brief:
Background:
Gap between Resources and Requirements Has Undermined
Space Acquisitions:
Space Policy May Help Increase Insight into Gaps between Requirements
and Resources:
New Space Policy Does Not Call for a Match between Resources and
Requirements at Program Start:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: The Department of Defense's Current and Planned Satellite
Systems:
Appendix II: Technology Readiness Levels and Their Definitions:
Appendix III: Comments From the Department of Defense:
Related GAO Products:
Table:
Table 1: Decision-Making Characteristics:
Figures:
Figure 1: Overview of Key Decision Points:
Figure 2: DOD Will Be Making Commitments before Obtaining Critical
Knowledge for Space Systems:
Abbreviations:
AEHF: Advanced Extremely High Frequency:
CAIG: Cost Analysis Improvement Group:
DAB: Defense Acquisition Board:
DOD: Department of Defense:
DSAB: Defense Space Acquisition Board:
EELV: Evolved Expendable Launch Vehicle:
GPS: Global Positioning System:
IPA: Independent Program Assessment:
IPT: Integrated Product Team:
KDP: key decision point:
MUOS: Mobile User Objective System:
NPOESS: National Polar-orbiting Operational Environmental Satellite
System:
NRO: National Reconnaissance Office:
SBIRS: Space-Based Infrared System:
SBR: Space-Based Radar:
STSS: Space Tracking and Surveillance System:
TRL: Technology Readiness Level:
TSAT: Transformational Communications Satellite:
United States General Accounting Office:
Washington, DC 20548:
September 15, 2003:
The Honorable Jerry Lewis
Chairman,
Subcommittee on Defense
Committee on Appropriations
House of Representatives:
Dear Mr. Chairman:
The Department of Defense (DOD) is spending more than $18 billion
annually to develop, acquire, and operate satellites and other space-
related systems. Moreover, DOD is on the threshold of investing in
several new major satellite acquisition programs. These programs are
intended to help transform how information is collected on capabilities
and intentions of potential adversaries as well as how military forces
communicate, navigate, and attack targets. We reported to you in June
2003 that the majority of satellite programs we have reviewed over the
past 2 decades experienced problems during acquisition that
significantly increased costs and delayed schedules, often to the point
where programs needed to be restructured by DOD.
DOD has recently implemented a new acquisition management policy
for space systems, which sets the stage for making decisions on
individual space programs. As you requested, we assessed the new
policy--specifically whether it will enable DOD to match requirements
(that is, what the system needs to do and how well it needs to perform)
to resources (time, money, and technical knowledge) at the onset of
product development. Our work shows that achieving this match is the
most critical determinant for successful outcomes of acquisitions.
Results in Brief:
DOD's new space acquisition policy may help provide more consistent
and robust information on technologies, requirements, and costs.
For example, the policy employs a new independent cost estimating
process, independent program reviews performed by space experts not
connected with the program, and more rigorous analyses of alternatives,
requirements, and system interdependencies. This information may help
decision-makers assess whether gaps exist between expectations and what
the program can deliver.
However, the benefits that can be derived from these tools will be
limited since the new policy does not alter DOD's practice of
committing major investments before knowing what resources will be
required to deliver promised capability. Instead, the policy encourages
development of leading edge technology within product development, that
is, at the same time the program manager is designing the system and
undertaking other product development activities. As our work has
repeatedly shown, such concurrency increases the risk that significant
problems will be discovered as the system is integrated and built, when
it is more costly and time-consuming to fix them. Moreover, when even
one technology does not mature as expected, the entire program can be
thrown off course since time and cost for invention cannot be reliably
estimated. DOD's new acquisition policy for its other weapon systems
recognizes these risks and consequently requires technology and product
development to be done separately.
We are making recommendations to DOD to modify its policy to separate
technology development from product development and ensure decisions to
start programs are based on sound criteria. DOD disagreed with our
recommendations principally because it believes that implementing them
will slow down acquisitions, increase risks, and prevent DOD from
taking advantage of cutting edge technology. Our past reviews of best
practices, however, have shown that risk and time are reduced and
capability is increased when programs begin with knowledge that
technologies can work as intended. DOD's policy for other weapon
systems incorporates this view.
Background:
DOD's current space network is comprised of constellations of
satellites, ground-based systems, and associated terminals and
receivers. Among other things, these assets are used to perform
intelligence, surveillance, and reconnaissance functions; perform
missile warning; provide communication services to DOD and other
government users; provide weather and environmental data; and provide
positioning and precise timing data to U.S. forces as well as national
security, civil, and commercial users.
DOD is now implementing a new acquisition management policy tailored to
its space systems.[Footnote 1] It expects to finalize the policy this
fiscal year. The policy is similar to the one used by the National
Reconnaissance Office (NRO). The policy is different from a new
acquisition management policy DOD is implementing for most other
weapons-related acquisitions in several respects.
* Key decisions, including the decision to start product development
and to start building and testing a satellite, will be made earlier in
the development process. According to DOD, this is because satellites
incur most of their costs during the early phases of development.
* The decision to build and produce a satellite will be made at the
same time instead of sequentially. According to DOD, this is because
satellites are produced in very small numbers as compared to other
acquisitions.
Figure 1 provides an overview of differences in key decision points.
Figure 1: Overview of Key Decision Points:
[See PDF for image]
Note: According to DOD officials, while technology development is
expected to ramp down during phase B, in some instances technology
development could even continue after key decision point C or critical
design review. Thus, technology development is depicted in a lighter
shade after decision point C.
[End of figure]
The new space acquisition policy also differs from DOD's policy
for other weapon systems in terms of decision-making support. For
example, the new policy has created an advisory board distinct from
DOD's Defense Acquisition Board (DAB). The Defense Space Acquisition
Board (DSAB), comprised of senior-level DOD officials and mission
partners, will advise the Under Secretary of the Air Force, as the
milestone decision authority, on whether significant investments should
move forward in the development process. Also, temporary Independent
Program Assessment teams (IPA) will be used to conduct an intensive
review before key decisions are made. Under DOD's process for other
weapon systems, standing Integrated Product Teams (IPT) are used to
help programs conduct key analyses as well as to advise the DAB. Table
1 provides more details on these differences.
Table 1: Decision-Making Characteristics:
Milestone Decision Authority:
DOD Weapons Acquisitions: Under Secretary of Defense for Acquisition,
Technology and Logistics (USD AT&L) makes the decision on whether a
program should proceed into the next phase;
Space Acquisitions: Under Secretary of the Air Force makes the decision
on whether a program should proceed into the next phase.
Advisory Board:
DOD Weapons Acquisitions: Defense Acquisition Board (DAB), composed of
the Vice Chairman, Joint Chiefs of Staff (co-chairman of the DAB); Under
Secretary of Defense-Comptroller; Under Secretary of Defense-Policy;
Under Secretary of Defense-Personnel and Readiness; Assistant Secretary
of Defense for Networks and Information Integration; service
secretaries; Director of Operational Test and Evaluation; and additional
advisors as invited;
Space Acquisitions: Defense Space Acquisition Board (DSAB), composed of
the Vice Chairman, Joint Chiefs of Staff (co-chairman of the DSAB);
Under Secretary of the Air Force staff; Executive Service offices;
mission partners (National Reconnaissance Office, National Aeronautics
and Space Administration, U.S. Strategic Command, Department of
Transportation); stakeholders (Office of the Secretary of Defense, Joint
Chiefs of Staff, Office of Management and Budget); users (e.g.,
combatant commands, military services, and intelligence community);
Director of Operational Test and Evaluation; and additional advisors as
invited.
Review Teams:
DOD Weapons Acquisitions: Integrated Product Teams (IPT) help programs
prepare for DAB review and provide decision-making support. Two teams
(overarching and working level) are permanently assigned to certain
weapon systems and are comprised of different functional experts, e.g.,
engineering, manufacturing, purchasing, and finance. Because the teams
review various types of weapon systems, they will not necessarily
include space experts. Teams meet with programs once every few months
and, because they are dedicated to several programs, cannot do intensive
drill downs; helping programs prepare for review may take as long as 18
months;
Space Acquisitions: Independent Program Assessment (IPA) teams perform
"drill down" reviews of programs before decisions are made on whether to
move programs forward. Each team is temporary and comprised of space
experts, and its review is done in 8 weeks (or more, if required)
on-site, working full-time with program officials.
Source: GAO.
[End of table]
DOD is already applying this new process to major satellite programs,
including the Space-Based Infrared System (High) (SBIRS-High), the
Transformational Communications Satellite (TSAT), the Advanced
Extremely High Frequency (AEHF) system, the Mobile User Objective
System (MUOS), the Global Positioning System (GPS), the National
Polar-orbiting Operational Environmental Satellite System (NPOESS),
and the Space-Based Radar (SBR) system. (See app. I for a further
description of DOD's current and planned systems.) SBR is the first
system to receive approval for the first key decision point--key
decision point (KDP) A--which begins a study phase. Other systems will
come in at a later decision point--KDP B, which starts the acquisition
program, or KDP C, which starts the process of building, testing, and
launching the satellite. Some space-related systems, such as user
equipment, are produced in large quantities. They will be overseen under a
process that is more similar to the DOD-wide acquisition process.
Gap between Resources and Requirements Has Undermined
Space Acquisitions:
The majority of satellite programs we have reviewed over the past
2 decades experienced problems during acquisition that drove up costs
and schedules and increased technical risks. Several programs were
restructured by DOD in the face of delays and cost growth. We have
found that these problems, which are common among many weapon systems,
are largely rooted in a failure to match the customer's needs with the
developer's resources--technical knowledge, timing, and funding--when
starting product development. In other words, commitments were made to
satellite launch dates and achieving certain capabilities without
knowing whether technologies being pursued could really work as
intended. Time and costs were consistently underestimated.
Achieving a Match between Resources and Requirements Is Essential
to Success:
Leading commercial firms expect that their program managers will
deliver high quality products on time and within budgets. Doing
otherwise could result in losing a customer in the short term and
losing the company in the long term. Thus, these firms have adopted
practices that put their individual program managers in a good position
to succeed in meeting these expectations on individual products.
Collectively, these practices ensure that a high level of knowledge
exists about critical facets of the product at key junctures during its
development and is used to deliver capability as promised. While DOD is
different from the commercial world in terms of its need to push for
cutting edge technology to maintain military superiority, its policies
for major weapon systems recognize that maturing technology outside of
product development allows needed stability in executing budgets and
allows capability to be delivered to the warfighter sooner.
Our reviews have shown that there are three critical junctures at
which firms must have knowledge to make large investment decisions.
First, before product development is started, a match must be made
between the customer's needs and the available resources--technical
and engineering knowledge, time, and funding. Second, a product's
design must demonstrate its ability to meet performance requirements
and be stable about midway through development. Third, the
developer must show that the product can be manufactured within cost,
schedule, and quality targets and is demonstrated to be reliable before
production begins.
The process is building-block in nature, as the attainment of each
successive knowledge point builds on the preceding one. While the
knowledge itself builds continuously without clear lines of
demarcation, the attainment of knowledge points is sequential. In other
words, production maturity cannot be attained if the design is not
mature, and design maturity cannot be attained if the key technologies
are not mature.
In applying the knowledge-based approach, the most leveraged decision
point of the three junctures is matching the customer's needs with the
developer's resources. This initial decision sets the stage for the
eventual outcome--desirable or problematic. The match is ultimately
achieved in every development program, but in successful programs it
occurs before product development is started: negotiations and
trade-offs are made up front to ensure that a match exists between
customer expectations and
developer resources. Technologies that are not mature continue to be
developed in the technology base (for example, a research laboratory).
With achievable requirements and commitment of sufficient investment to
complete the development, programs are better able to deliver products
at cost and on schedule.[Footnote 2]
In DOD, Match between Resources and Requirements Is Seldom Achieved at
Start of Product Development:
Our past work has shown that space programs have not typically achieved
a match between requirements and resources before starting product
development. Product development was often started based on a rigid
set of requirements that proved to be unachievable within a reasonable
development time frame. At times, even more requirements were added
after the program began. When problems arose, adding resources in terms
of time and money became the primary option for solving problems, since
customer expectations about the product's performance had already
become hardened. For example:
* After starting its AEHF satellite program, DOD substantially and
frequently changed requirements. In addition, after the failure of one
of DOD's legacy communications satellites, DOD decided to accelerate
its plans to build AEHF satellites. The contractors proposed, and DOD
accepted, a high risk schedule that turned out to be overly optimistic
and highly compressed, leaving little room for error and depending on a
chain of events taking place at certain times. Moreover, at the time
DOD decided to accelerate the program, it did not have the funding
needed to support the activities and manpower required to design and
build the satellites more quickly. The effects of DOD's inability to match
requirements to resources were significant. Cost estimates produced by
the Air Force reflected an increase from $4.4 billion in January 1999
to $5.6 billion in June 2001--a difference of 26 percent. Although
considered necessary, many changes to requirements were substantial,
leading to cost increases of hundreds of millions of dollars because
they required major design modifications. Also, schedule delays
occurred when some events did not occur on time, and additional delays
occurred when the program faced funding gaps. These scheduling delays
eventually culminated in a 2-year delay in the launch of the first
satellite. We also reported that there are still technical and
production risks that need to be overcome in the AEHF program, such as
a less-than-mature satellite antenna system and complications
associated with the production of the system's information security
system.
* The SBIRS-High[Footnote 3] contract for engineering, manufacturing
and development amounted to $2.4 billion. In the fall of 2001, DOD
identified cost growth of $2 billion or more, triggering a mandatory
program review and recertification under 10 U.S.C. section 2433.
Currently, SBIRS-High is under contract for $4.4 billion. We reported
that when DOD's SBIRS-High satellite program began in 1994, none of its
critical technologies were mature. Moreover, according to a DOD-
chartered independent review team, the complexity, schedule, and
resources required to develop SBIRS-High, in hindsight, were
misunderstood when the program began. This led to an immature
understanding of how requirements translated into detailed engineering
solutions. Even though the program was restructured by DOD, the
independent review team noted that SBIRS-High still faced significant
risks.
* DOD has initiated several programs and spent several billion dollars
over the past 2 decades to develop low-orbiting satellites that can
track ballistic missiles throughout their flight. However, it has not
launched a single satellite to provide this capability. We have
reported[Footnote 4] that a primary problem affecting these programs
was that DOD and the Air Force did not relax rigid requirements to more
closely match technical capabilities that were achievable. Program
baselines were based on artificial time and/or money constraints. Over
time, it became apparent that the lack of knowledge of program
challenges had led to overly optimistic schedules and budgets that were
funded at less than what was needed. Attempts to stay on schedule by
approving critical milestones without meeting program criteria resulted
in higher costs and more slips in technology development efforts. For
example, our 1997 and 2001 reviews of DOD's $1.7 billion SBIRS-Low
program showed that the program would enter into the product
development phase with critical technologies that were immature and
with optimistic deployment schedules. Some of these technologies were
so critical that SBIRS-Low would not be able to perform its mission if
they were not available when needed. DOD eventually restructured the
SBIRS-Low program because of the cost and scheduling problems, and it
put the equipment it had partially built into storage. In view of the
program's mismatch between expectations and what it could achieve, the
Congress directed DOD to restructure the program (now known as the
Space Tracking and Surveillance System or STSS) as a research and
development effort.
We recently reported[Footnote 5] on crosscutting factors that make it
more difficult for DOD to achieve a match between resources and
requirements for space acquisitions. In particular, space programs
often involve a diverse array of organizations with competing interests
in overall satellite development--from the individual military services
to testing organizations, contractors, civilian agencies, and, in some
cases, even international partners and industry.
This creates challenges in making tough tradeoff decisions.
In addition, like other weapon programs, space acquisition programs
have historically attempted to satisfy all requirements in a single
step, regardless of the design challenge or the maturity of
technologies to achieve the full capability. This approach has made it
more difficult to match requirements to available resources.
Space Policy May Help Increase Insight into Gaps between Requirements
and Resources:
DOD's new space acquisition oversight process may help increase insight
into gaps between requirements and resources. In particular, tools
being adopted, such as technology readiness assessments, alternatives
analyses, and independent cost estimates, may help provide more
consistent and robust information on technologies, requirements, and
costs. However, the value of these tools depends largely on whether or
not the knowledge is used to make decisions. According to DOD
officials, similar tools are also being adopted by other weapon system
programs.
First, DOD is requiring that all space programs conduct technology
maturity assessments before key oversight decisions are made. One tool
used by many weapon systems is
known as Technology Readiness Levels (TRL). The tool associates
different TRLs with different levels of demonstrated performance,
ranging from paper studies to proven performance on the intended
product. The value of using a tool based on demonstrated performance is
that it can presage the likely consequences of incorporating a
technology at a given level of maturity into a product development,
enabling decision-makers to make informed choices. The tool is even
more valuable if it is commonly used. Our previous reviews have found
the use of TRLs to be a best practice.[Footnote 6] (App. II describes
each TRL.):
Second, DOD is requiring space programs to more rigorously assess
alternatives, consider how their systems will operate in the context of
larger families of systems, and think through operational, technical,
and system requirements before programs are started. For example,
programs will be required to develop an architecture that specifies the
structure of system components, their relationships, and the principles
and guidelines governing their design and evolution over time.
It is important for DOD to increase attention to requirements earlier
in the acquisition process and to think through whether there are more
cost-effective alternatives to pursue. A recent DOD study[Footnote 7]
found that understanding of requirements often occurs too late to change
the system affordably. More specifically, the study found that space
programs do not always understand how their systems fit in with other
systems with which they need to interact and that a mutual understanding
of requirements is often lacking between the government and
contractors. The SBIRS independent review team also found a need across
space programs for more rigorous up front development of requirements.
In addition, in previous reviews, we found that space programs often do
not examine potentially more cost-effective approaches. In 2001, for
example, we reported[Footnote 8] that DOD's SBIRS-Low program was not
adequately analyzing alternatives to SBIRS-Low that could satisfy
critical missile defense requirements, such as Navy ship-based radar
capability. At the time, other studies supported the possibility that
other types of sensors could be used to track missiles in the midcourse
of their flight and to cue interceptors.
Third, the new policy seeks to improve the accuracy of cost estimates
by establishing an independent cost estimating process in partnership
with DOD's Cost Analysis Improvement Group (CAIG) and by adopting
methodologies and tools used by the NRO. To ensure timely cost
analyses, the CAIG will augment its own staff with cost estimating
personnel drawn from across the entire national security space cost
estimating community, including cost estimating teams belonging to the
intelligence community, the Air Force, the NRO, the Army, and the Navy.
The policy also calls on programs to produce performance metrics that
compare estimated to actual costs. The policy allows programs to
request assistance from the CAIG for purposes other than DSAB reviews.
However, there is no point in the process that requires DOD to commit
to fully fund a space program.
Improving reliability of cost estimates is critical. Several of our
studies--such as ones on GPS, Evolved Expendable Launch Vehicle (EELV),
and AEHF--have called attention to problems with estimating system
costs, such as errors, omissions, and conflicting assumptions. For
example, in 1980 we reported that the cost to acquire and maintain GPS
satellites through 2000 increased from $1.7 billion to $8.6 billion due
largely to estimates not previously included for replenishment of
satellites, launches, and user equipment. Moreover, recent DOD studies
found initial cost estimates for the AEHF program as well as SBIRS-High
did not accurately capture program content and risk and were based on
optimistic assumptions. We also reported that costs would be better
estimated if DOD required more knowledge before starting a program.
Without knowing that technologies can work as intended, for example,
programs cannot reliably estimate costs and schedules.
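To illustrate the kind of estimated-versus-actual cost metric the new
policy calls for, the short sketch below is offered purely for
illustration; it is not part of DOD's policy or any GAO methodology, and
the program names and dollar figures are hypothetical. It simply
computes percentage cost growth against a baseline estimate:
# Purely illustrative sketch of an estimated-versus-actual cost metric.
# Program names and dollar figures are hypothetical, not drawn from any
# DOD program.

def cost_growth_percent(estimated, actual):
    """Percentage by which actual cost exceeds the baseline estimate."""
    return (actual - estimated) / estimated * 100.0

# Hypothetical baseline estimates and current costs, in billions of dollars.
programs = {
    "Satellite program A": (4.0, 5.1),
    "Satellite program B": (2.5, 2.6),
}

for name, (estimated, actual) in programs.items():
    growth = cost_growth_percent(estimated, actual)
    print(f"{name}: estimated ${estimated:.1f} billion, "
          f"actual ${actual:.1f} billion, growth {growth:.0f} percent")
[End of code example]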
Another tool that could be useful in gaining insight into whether
programs are positioned for success is the IPA team. IPA teams are to
be drawn from experts who are not directly affiliated with the program.
They are to spend about 8 weeks on-site working full-time with program
officials to study the program, particularly by assessing the
acquisition strategy, contracting information, cost analyses, system
engineering, and requirements. After this study, they are to conclude
their work with recommendations to the DSAB on whether or not to allow
the program to proceed, typically using the traditional "red,"
"yellow," and "green" assessment colors to indicate whether the program
has satisfied key criteria in areas such as requirements setting, cost
estimates, and risk reduction. The Under Secretary of the Air Force,
however, makes the decision on whether to allow the program to proceed.
IPA team studies already performed have called attention to risks faced
by the GPS III, NPOESS, and SBR programs. The NPOESS study, for
example, noted that risk mitigation plans needed to be strengthened and
that independent cost estimates needed to include the winning
contractor's negotiated contract. The SBR study found that the program
needed to better define how the system would operate in the context of
DOD's transformational communications architecture and work with key
intelligence systems, such as the planned Distributed Common Ground
Station. Both reviews recommended that the programs move forward
(NPOESS into the build phase and SBR into the study phase) on the
condition that these programs address areas of concern.
An IPA team studying GPS III found the program was too optimistic
in estimating resources that would be needed. For example, the study
noted that the program budget fell several hundred million dollars short
of what was needed to support the program plan. The team also pointed
out that the system's architecture and acquisition strategy were not
sufficiently defined.
New Space Policy Does Not Call for a Match between Resources and
Requirements at Program Start:
DOD's new acquisition management policy for space systems does not
alter DOD's practice of committing major investments before knowing
what resources will be required to deliver promised capability.
Instead, the policy allows programs to continue to mature technologies
while they are designing the system and undertaking other product
development activities. While space systems are different from other
weapon systems in terms of how they are developed and tested, it is
still necessary to mature technology before starting product
development and match resources to requirements in order to prevent
cost increases and schedule delays.
We previously recommended that DOD should not allow technologies to
enter into a weapon system's product development until they are
assessed at a TRL 7, meaning that a prototype has been demonstrated in
an operational environment.[Footnote 9] According to DOD officials, the
new space acquisition policy does not set TRL criteria for deciding
what the threshold for being mature should be. However, DOD officials
stated that technologies may well enter into product development at a
TRL 5, meaning basic components have only been tested in a laboratory,
or an even lower level of maturity. This means that programs will
design the system and conduct other program activities at the same time
they build representative models of key technologies and test them in
an environment that simulates the conditions of space. In essence, DOD
will be concurrently building knowledge about technology and design--an
approach with a problematic history.
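To make concrete how a maturity threshold could function as an entrance
criterion for product development, the short sketch below is offered
purely for illustration; it is not DOD's or the IPA teams' actual
assessment tool, the technology names and assessed levels are
hypothetical, and the TRL 7 threshold reflects the level recommended in
this report. It flags critical technologies that fall below the
threshold:
# Purely illustrative sketch of a technology-maturity entrance check.
# Technology names and assessed levels are hypothetical; TRL 7 (prototype
# demonstrated in an operational environment) is used as the threshold.

TRL_THRESHOLD = 7

def immature_technologies(assessed_trls, threshold=TRL_THRESHOLD):
    """Return the critical technologies assessed below the maturity threshold."""
    return {tech: trl for tech, trl in assessed_trls.items() if trl < threshold}

# Hypothetical assessment of a program's critical technologies.
assessment = {
    "infrared sensor assembly": 6,
    "onboard signal processor": 7,
    "crosslink antenna": 5,
}

gaps = immature_technologies(assessment)
if gaps:
    print("Not ready to start product development; immature technologies:")
    for tech, trl in sorted(gaps.items()):
        print(f"  {tech}: TRL {trl} (threshold is TRL {TRL_THRESHOLD})")
else:
    print("All critical technologies meet the maturity threshold.")
[End of code example]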
As shown in figure 2, the knowledge building approach for space stands
in sharp contrast to that followed by successful programs and the
approach recommended by DOD's new acquisition policy for weapon
systems. Successful programs will not commit to undertaking product
development unless they have high confidence that they have achieved a
match between what the customer wants and what the program can deliver.
Technologies that are not mature continue to be developed in an
environment that is focused solely on technology development. This puts
programs in a better position to succeed because they can focus on
design, system integration, and manufacturing.
By contrast, allowing technology development to carry over into
product development increases the risk that significant problems will
be discovered late in development. Addressing such problems may require
more time, money, and effort to fix because they may require more
extensive retrofitting and redesign as well as retesting. The approach
also makes it more difficult for programs to demonstrate the same level
of design stability since technology and design activities will be done
concurrently. Further, the consequences of problems experienced during
development will be much greater for space programs since the design
review occurs at the same time as the commitment to build and deliver
the first product to a customer.
Figure 2: DOD Will Be Making Commitments before Obtaining Critical
Knowledge for Space Systems:
[See PDF for image]
[End of figure]
Space acquisition officials we spoke with acknowledged the added
risks that come when programs concurrently develop technologies
and design the system. However, they maintain that concurrent
technology and product development is necessary for space acquisitions
for several reasons.
* First, while some testing on satellites can be done on the ground in
thermal vacuum (thermovac) or other environmental simulation chambers and some systems
can also be tested via aircraft, the only way to test satellites in a
true operational space environment is to build one or more demonstrator
satellites and launch them into orbit. Launching demonstrators is
costly and time consuming.
Our prior reports have recognized that space systems are uniquely
difficult to test in a true operational environment. However, DOD has
found ways to test sensors and other critical technologies on
experimental satellites and it has built and launched technology
demonstrator satellites.
* Second, in view of the length of time it takes to develop space
systems, DOD asserts that it will not be able to ensure that
satellites, when launched, will have the most advanced technologies,
unless program managers are continually developing technologies. DOD
officials have stated that they would reduce the added risks of their
approach by not allowing programs to start if too many technologies
were deemed to be immature or by deferring certain capabilities if it
turned out that technologies did not test well.
We agree that continuing to develop leading edge technology is
important for all system capabilities, not just space systems. However,
history has shown and we have repeatedly reported that conducting
technology development within a product environment consistently delays
the delivery of capability to the user, robs other programs of
necessary funds through unanticipated cost overruns, and consequently,
can result in money wasted and fewer units produced than originally
stated as necessary. A technology development environment is more
forgiving and less costly than a delivery-oriented acquisition program
environment. Events such as test "failures," new discoveries, and time
spent in attaining knowledge are considered normal in this environment.
Further, judgments of technology maturity have proven to be
insufficient as the basis for accurate estimates of program risks
relative to cost, schedule, and capability.
* Finally, because operation and support costs make up a smaller
portion of total costs for satellites than other weapon programs, DOD
asserts that earlier insight and decisions are needed on space
programs.
We agree that early insight into programs is important, as we have
reported that over 80 percent of the cost of a weapon system program
is determined by requirements set at the beginning. However, moving
decisions to an earlier point in the product development process
without additional knowledge may actually increase the risk of
promising more than can be delivered and at higher costs.
Conclusions:
The growing importance of space systems to military and civil
operations requires DOD to develop cutting edge technologies and
achieve timely delivery of capability. DOD's new space acquisition
policy does not position space programs to do either. By allowing major
investment commitments to continue to be made with unknowns about
technology readiness, requirements, and funding, programs will likely
continue to experience problems that require more time and money to
address than anticipated. Over the long run, the extra investment
required to address these problems may well prevent DOD from pursuing
more advanced capabilities. By contrast, DOD is taking steps to better
position other weapon systems for success. By separating technology
development and product development, the policy will help to align
customer expectations with resources, and therefore minimize problems
that could hurt the program in its design and production phases.
Recommendations for Executive Action:
In finalizing DOD's new space acquisition management policy, we
recommend that the Secretary of the Air Force, who is DOD's executive
agent for space, modify the policy to ensure that customer expectations
can be matched to resources before starting product development
(phase B). Specifically, we recommend that the Secretary separate
technology development from product development. To ensure that this is
done, we also recommend that the Secretary set a minimum threshold of
maturity for allowing technologies into a program. As noted in our
report, we previously recommended that DOD should not allow
technologies to enter into a weapon system's product development until
they are assessed at a TRL 7, meaning that a prototype has been
demonstrated in an operational environment.[Footnote 10]
Agency Comments and Our Evaluation:
In commenting on a draft of this report, the Assistant Secretary of
Defense for Networks and Information Integration disagreed with our
finding that the new space policy perpetuates risks for space programs
since it does not separate technology development from product
development. DOD disagreed with our recommendations as well, citing its
need to keep up with the fast-paced development of advanced
technologies for space systems and a requirement in its draft policy
for technology readiness assessments to be conducted at appropriate
milestones.
In fact, it is DOD's long-standing and continuous inability to bring
the benefits of technology to the warfighter in a timely manner that
underlies the report's findings and recommendations. In our reviews of
numerous DOD programs, including many satellite developments, it has
been clear that committing to major investments in design, engineering,
and manufacturing capacity without knowing a technology is mature and
what resources are needed to ensure that the technology can be
incorporated into a weapon system has consistently resulted in more
money, time, and talent spent than was promised, planned for, or
necessary. The impact of such mistakes in individual programs has also
had a damaging effect on military capability as other programs are
taxed to meet unplanned cost increases and production units are often
cut because unit costs increase and funds run out.
Although each DOD program differs in its characteristics, GAO's work
with successful product developers in DOD and the commercial sector has
found that the process of developing leading edge technology and
products that have more capability than their predecessors does not
differ. In fact, successful product developments are marked by
adherence to a disciplined process that collects metrics and
establishes and uses common and consistent criteria for decision-
making. We have found that companies that adopt these best practices
often do so out of necessity, when their existence is threatened. While
the Air Force has taken some promising steps in drafting the policy to
address DOD's poor record of developing satellites within cost and
schedule targets and with promised performance, it will miss an
opportunity to dramatically improve outcomes if it does not adopt
similar practices. Therefore, we have not changed our recommendation.
DOD's detailed comments and our responses are provided in appendix III.
In conducting our review, we analyzed DOD's new interim acquisition
management policy for space. Because of the limited time of our review,
we focused on the question of whether the policy will enable DOD to
match requirements to resources at the onset of product development,
which our work has shown to be the most critical determinant for
successful outcomes of acquisitions. We compared the new space policy
to DOD's new acquisition policy for other weapon systems as well as our
past reviews of the best practices of commercial and military
acquisitions. In addition, we discussed this policy with Air Force
space acquisition officials. We analyzed IPA studies performed under
the new policy on DOD's NPOESS, GPS III, and SBR programs. We also
analyzed our past reviews of space programs as well as DOD studies on
the SBIRS-High program and on space systems development growth. See
Related GAO Products at the end of this report for a list of past GAO
reports we relied on. We conducted our review from June 2003 through
August 2003 in accordance with generally accepted government auditing
standards.
We are sending copies of this report to the Secretaries of Defense and
the Air Force and interested congressional committees. We will also
make copies available to others upon request. In addition, the report
will be available at no charge on the GAO Web site at
http://www.gao.gov.
If you or your staff have any questions concerning this report, please
contact me at (202) 512-4841. Key contributors to this report were
Cristina Chaplain, Jean Harker, Natalie Britton, and Bradley Terry.
Sincerely yours,
Katherine V. Schinasi
Director, Acquisition and Sourcing Management:
Signed by Katherine V. Schinasi:
[End of section]
Appendix I: The Department of Defense's Current and Planned Satellite
Systems:
Function: Missile warning and tracking; Current Systems: * Defense
Support Program; Planned Systems: * Space-Based Infrared System (High);
* Space Tracking and Surveillance System.
Function: Intelligence, Surveillance and Reconnaissance; Current
Systems: * National Reconnaissance Office (NRO) satellites; Planned
Systems: * NRO satellites; * DOD's Space-based Radar.
Function: Communications:
Function: Wideband/high capacity systems; Current Systems: * Defense
Satellite Communications System; * Global Broadcasting Service; Planned
Systems: * Wideband Gapfiller Satellite; * Advanced Wideband System.
Function: Protected systems (antijam, survivable); Current Systems: *
Milstar; Planned Systems: * Advanced Extremely High Frequency; *
Advanced Polar System.
Function: Narrowband systems; Current Systems: * Ultra High Frequency
Follow-On satellite communications system; Planned Systems: * Mobile
User Objective System.
Function: Navigation, Positioning, Timing; Current Systems: * Global
Positioning System (GPS); Planned Systems: * Next Generation GPS.
Function: Weather/ Environmental; Current Systems: * Defense
Meteorological Satellite Program; Planned Systems: * National Polar-
orbiting Operational Environmental Satellite System.
Source: GAO.
[End of table]
[End of section]
Appendix II: Technology Readiness Levels and Their Definitions:
Technology readiness level: 1. Basic principles observed and reported;
Description: Lowest level of technology readiness. Scientific research
begins to be translated into applied research and development. Examples
might include paper studies of a technology's basic properties.
Technology readiness level: 2. Technology concept and/or application
formulated; Description: Invention begins. Once basic principles are
observed, practical applications can be invented. The application is
speculative and there is no proof or detailed analysis to support the
assumption. Examples are still limited to paper studies.
Technology readiness level: 3. Analytical and experimental critical
function and/or characteristic proof of concept; Description: Active
research and development is initiated. This includes analytical studies
and laboratory studies to physically validate analytical predictions of
separate elements of the technology. Examples include components that
are not yet integrated or representative.
Technology readiness level: 4. Component and/or breadboard validation
in laboratory environment; Description: Basic technological
components are integrated to establish that the pieces will work
together. This is relatively "low fidelity" compared to the eventual
system. Examples include integration of "ad hoc" hardware in a
laboratory.
Technology readiness level: 5. Component and/or breadboard validation
in relevant environment; Description: Fidelity of breadboard
technology increases significantly. The basic technological components
are integrated with reasonably realistic supporting elements so that
the technology can be tested in a simulated environment. Examples
include "high fidelity" laboratory integration of components.
Technology readiness level: 6. System/subsystem model or prototype
demonstration in a relevant environment; Description: Representative
model or prototype system, which is well beyond the breadboard tested
for technology readiness level (TRL) 5, is tested in a relevant
environment. Represents a major step up in a technology's demonstrated
readiness. Examples include testing a prototype in a high fidelity
laboratory environment or in simulated operational environment.
Technology readiness level: 7. System prototype demonstration in an
operational environment; Description: Prototype near or at planned
operational system. Represents a major step up from TRL 6, requiring
the demonstration of an actual system prototype in an operational
environment, such as in an aircraft, vehicle or space. Examples include
testing the prototype in a test bed aircraft.
Technology readiness level: 8. Actual system completed and "flight
qualified" through test and demonstration; Description: Technology has
been proven to work in its final form and under expected conditions. In
almost all cases, this TRL represents the end of true system
development. Examples include developmental test and evaluation of the
system in its intended weapon system to determine if it meets design
specifications.
Technology readiness level: 9. Actual system "flight proven" through
successful mission operations; Description: Actual application of the
technology in its final form and under mission conditions, such as
those encountered in operational test and evaluation. In almost all
cases, this is the end of the last "bug fixing" aspects of true system
development. Examples include using the system under operational
mission conditions.
Source: GAO.
[End of table]
[End of section]
Appendix III: Comments from the Department of Defense:
Note: GAO comments supplementing those in the report text appear at the
end of this appendix.
ASSISTANT SECRETARY OF DEFENSE 6000 DEFENSE PENTAGON WASHINGTON, DC
20301-6000:
SEP 5 2003:
NETWORKS AND INFORMATION INTEGRATION:
Ms. Katherine Schinasi:
Director, Acquisition and Sourcing Management U.S. General Accounting
Office:
Washington, D.C. 20548:
Dear Ms. Schinasi:
This is the Department of Defense (DOD) response to the GAO Report,
GAO-03-1073R, "Defense Acquisition: Improvements Needed in Space
Systems Acquisition Management Policy," dated August 8, 2003 (GAO Code
120266).
We have received the final report and DoD non-concurs with the GAO
findings as outlined in the enclosure. Please note, the National
Security Space Acquisition Policy 03-01, which guides the process for
space acquisition programs, was released as interim policy on March 4,
2003 and is currently being reviewed within the Department. The
Department's dependence on technology development, the pace at which
this technology is increasing, and its vital role in the national
defense of our nation warrant a robust technology development program.
Therefore, the acquisition process for space and other defense systems
acknowledge this essential characteristic by establishing mandates for
technology readiness assessments at appropriate milestones. We welcome
the opportunity to further work with you and your staff to ensure the
final report has a clear understanding of the Defense Acquisition
Process.
Signed for:
John P. Stenbit:
Enclosure: As stated:
GAO DRAFT REPORT DATED AUGUST 8, 2003 GAO-03-1073R (GAO CODE 120266):
"DEFENSE ACQUISITION: IMPROVEMENTS NEEDED IN SPACE SYSTEMS ACQUISITION
MANAGEMENT POLICY":
DEPARTMENT OF DEFENSE COMMENTS TO THE GAO RECOMMENDATIONS:
RECOMMENDATION 1: The GAO recommended that the Secretary of the Air
Force modify the policy to ensure that customer expectations can be
matched to resources before starting product development (phase B). (p.
15/GAO Draft Report):
Specifically, the GAO recommended that the Secretary:
A. separate technology development from product development;
B. set a minimum threshold of maturity for allowing technologies into a
program.
DOD RESPONSE:
The DoD concurs with the findings that the new space acquisition policy
will help provide consistent and robust information on technologies,
requirements and costs. We also concur with the finding that matching
customers' needs with the resources of technical knowledge, schedule,
and funding is critical and that in the past space programs have
suffered from gaps between resources and requirements.
The DoD does not concur with the recommendation to solve this gap by
separating technology development and product development and mandating
a standard, prescribed technology readiness level for all programs.
The Air Force recognizes that there have been serious problems in the
past with space program acquisition. Indeed, it is because we recognize
this issue that we took up the challenge of creating a new acquisition
process, one that we feel will result in more informed, better
decisions and more successful programs. We believe that the new
National Security Space (NSS) Acquisition Policy 03-01 will address
many of the findings raised in this study and that it is premature to
recommend changes. We also believe that it is difficult to accurately
compare NSS Acquisition Policy 03-01 to the new DoDI 5000.2, since it
is too early to judge what effect the new 5000 series will have on
traditional acquisition challenges. However, it is our view that NSS
Acquisition Policy 03-01 and DoDI 5000.2 are consistent in their
intent.
The new NSS Acquisition Policy 03-01 and the new Chairman of the Joint
Chiefs of Staff Instruction (CJCSI) 3170.01 C have both been crafted to
address the need to match customers' needs with the resources
available. In concert, these documents enhance the interaction between
the requirements and acquisition community throughout the process as
lead users, operating commands, and affected agencies and departments
participate in the Defense Space Acquisition Board (DSAB) process and
the various acquisition documents' coordination process. In addition
to the CJCSI directed concept of operation materials, prior to KDP-A
study phase activities, the NSS Acquisition Policy 03-01 process
requires the development of a system-level
concept of operations (CONOPS). The NSS Acquisition Policy requirements
for an Independent Cost Assessment Team (ICAT) and Independent Program
Assessment Team (IPAT) processes are additional avenues to identify any
potential risks or gaps between requirements and resources.
Because there is so much in the acquisition process that we cannot
control (e.g. Congressional cuts, OSD priorities, industrial base
issues), NSS Acquisition Policy 03-01 was conceived as a method of
controlling that which we could. It is modeled on an effective process
and written to utilize best practices. It is designed to allow
oversight by the proper parties, but is primarily a management tool,
written to allow the necessary flexibility for good program management.
Our research into the acquisition process found that acquisition-by-
committee adds time and cost that national security space programs can
ill afford. Therefore, NSS Acquisition Policy 03-01 invests the
Milestone Decision Authority (MDA) with the authority and
responsibility of deciding how or if a program proceeds, based on an
in-depth review of all elements of a program by a group of independent
experts. The review studies, among other things, technology readiness
and risk. If the MDA, based on the findings and recommendations of the
ICAT, IPAT, and the larger DSAB process, is not convinced the
technology is mature, he will provide the appropriate direction to the
program to ensure maturation occurs, including possibly delaying entry
into the next phase until the program is ready.
While NSS Acquisition Policy 03-01 does not specify a Technology
Readiness Level
(TRL) that a program must meet in order to proceed, it does state in
Section E5.9 that: "At each KDP, the program office should identify the
key technology components of the system and provide their assessment of
the maturity ofeach key component using the Technology Readiness Level
(TRL) method identified in the Interim DoD Acquisition Guidebook. The
IPAT will review the program office assessment and determine if, in
their view, all key technology components of the program have been
identified. The IPAT will also provide its own independent assessment of
the maturity of the key components using the TRL method. The intent is
not to require a specific TRL for each key component in order to
proceed into the next acquisition phase, but to instead allow for the
DoD Space MDA to be made knowledgeable of the state of key component
maturity so appropriate direction can be given in the ADM for
additional technology maturation/risk reduction activities.":
Forcing every program to meet the same TRL ignores the fact that not
only do space programs differ from typical DoD weapons systems, but
they also differ from each other. Forcing a program to meet a prescribed
TRL before proceeding also ignores evolutionary acquisition,
which space programs routinely engage in and which the new DoDD 5000.1
emphasizes is the "preferred approach to satisfying operational
needs.":
Separating technology development from product development also works
against evolutionary acquisition. In addition, it is not feasible for
space programs to separate out technology development; if done, the
acquisition time for NSS programs would significantly increase and the
technologies used in the systems would often be more than a decade
outdated when the product is placed into service. Further, according to
this study, all component-level testing would need to be done in an
operational environment for technology development to be considered
complete. Satellite programs cannot meet this requirement. Launch costs
alone preclude launching full-up test versions of every satellite
program, and fully operational satellite systems often average only 6
satellites, making it unrealistic to build a number of test articles
before sending up the actual system. While the Air Force does launch
demonstration satellites, and sometimes includes untested sensors on
operational satellites for demonstration, these tests are designed for
programs that are well in the future. These sensors or demo satellites,
although tested in an operational environment, are not capable of
meeting operational requirements nor are they truly representative of
operational components.
DoD recognizes that some programs are different from the typical DoD
weapons system; for example, per DoDI 5000.2, Sec 3.6.3:
"Shipbuilding programs may be initiated at the beginning of Technology
Development. The information required in the tables at enclosure 3
shall support program initiation. A cost assessment shall be prepared
in lieu of an independent cost estimate (ICE), and a preliminary
assessment of the maturity of key technologies shall be provided.":
Sec 3.6.7 also requires that technologies be demonstrated in a
"relevant environment," as opposed to an operational environment, for
exit from technology development, acknowledging that some programs have
characteristics that require flexibility in program management (e.g.,
state-of-the-art technology, long product development times, and low
quantities required to perform the mission for a variety of users). The
new DoDI 5000.2 does not mandate a specific TRL.
The report details three "knowledge points" based on best practices
drawn from industry study. However, to our knowledge, none of the case
studies included came from a commercial satellite manufacturer, or even
a producer of state-of-the-art, low-density, high-demand products that
are quite different from previous products provided by that producer.
The case studies mentioned come from producers of mass-produced, earth-
bound products that, even if technologically advanced, provide only a
partial analogy. We feel that these knowledge points do not accurately
reflect the unique characteristics of space programs. For example,
knowledge point three is where "decision-makers know production
processes are under control." This point assumes that enough of the
product has been produced to prove consistent quality, with flaws
worked out across initial low-quantity batches, before larger
quantities of that same product are built. Satellite programs produce
from 1 to 25 satellites, with 6 being average. National security space
programs cannot, by definition, reach knowledge point three.
The report also states that moving decision points to an earlier point
in the program will increase risk. We believe the exact opposite is
true. We have placed Key Decision Points at the appropriate points
within a space program to make sure there is senior level involvement,
including in-depth independent review of all elements of the program
before major funding decisions are made. Since all programs will have
challenges, finding these potential problems earlier can only help us
overcome them without damaging the program's stability. NSS Acquisition
Policy 03-01 decision points were designed to involve the MDA, and
independent assessments, earlier in the program, allowing the MDA to
review the program and make an informed decision on how, or whether,
the program should proceed. The in-depth review at each decision point,
including cost and
technology maturation studies, allows the MDA to make an informed
decision while maintaining the flexibility required by good program
management. This early involvement is important since the majority of a
space program's budget is in the early part of the program.
We also believe the report does not accurately reflect the timeline for
NSS Acquisition Policy 03-01. Page 5 of the report shows a graphic
overview of the DoD and Space acquisition policies. Technology
development in the space acquisition policy is shown as going almost
all the way to KDP C. While NSS Acquisition Policy 03-01 does state
that technology development
is part of Phase B, this phase is primarily focused on risk reduction;
component-level technology development is finished by preliminary
design review (PDR). These risk reduction activities are similar to the
DoDI 5000.2 Phase B activities
to reduce integration and manufacturing risks. In addition, technology
maturity is assessed at each KDP. If a program does not demonstrate the
necessary level of technological maturity, the MDA may stop the
program, delay it from moving into the next phase, or direct the
program to conduct additional reviews to assess maturity within a given
timeframe.
The following are GAO's comments on the Department of Defense's letter
dated September 5, 2003.
GAO Comments:
We agree that there are consistencies between the two policies in terms
of how they enhance the development of requirements. However, the
policies are very different in terms of their views on technology
development. DOD's policy for weapon systems clearly requires
technologies to be mature (demonstrated in a relevant, preferably
operational environment) before beginning product development. The
space policy does not. In fact, DOD officials stated that, under the
space policy, technologies may well enter product development without
being demonstrated in a relevant environment. This might not occur
until DOD is close to making its production decision. In our view, this
difference will be a detriment to the future success of space programs.
DOD contended that our recommendation to set a minimum threshold of
maturity for allowing technologies into a program ignores differences
among programs and ignores evolutionary acquisition. We disagree with
these points. Technology maturity is fundamental to the success of all
programs and cannot be ignored as part of a satellite's business case.
While it is possible to take a gamble on a key technology and have it
work out in the end, DOD's experiences show that this is an unlikely
result. Moreover, this is not an approach that successful product
developers emulate. In addition, technology maturity is essential to
successful evolutionary acquisitions. The principle of evolutionary
development is reaching full capability in more doable steps. Technical
maturity essentially defines what is doable for each increment or
block.
DOD asserted that it is not feasible for space programs to separate
technology development from product development because it would delay
delivery of the product and make its technologies obsolete. We
disagree. Separation of technology development from product development
has been found to be essential to reducing overall development cycle
times and delivering new products within estimated resources. The DOD
policy for other weapons acquisitions is quite clear on this as well.
In successful programs, the technologies are matured, hybrid
organizations and agreements between the technologists and the product
developers are established, and preliminary designs are done, thus
providing the basis for a match between the user's needs and the
developer's resources--all before the commitment to product development
is made. By maturing technologies before committing significant time
and money to product development and following an evolutionary
approach, the product development cycle time is reduced, while
opportunities for inserting new technologies are more frequent.
DOD asserted that satellite programs cannot be demonstrated in an
operational environment (TRL 7). We disagree. NASA, the creator of
TRLs, tests some technologies to a TRL 7 if they are mission critical.
Moreover, while we recognize the difficulties in attaining this level
of maturity for space systems, the space policy does not even encourage
programs to demonstrate technologies in a relevant environment before
committing to a program. In fact, according to DOD officials, under the
space policy, technologies could enter product development with a TRL 5
or even lower. The policy is silent on what the minimum threshold for
maturity should be, leaving that decision to the milestone decision
authority.
DOD stated that none of our prior best practices case studies included
a commercial satellite producer, making the knowledge points irrelevant
to space systems. This assertion is wrong. In the report that first
promulgated the knowledge points (GAO/NSIAD-98-56), one of the key case
studies was Hughes Space and Communications and its experience with the
HS-702 satellite. We deliberately included Hughes because it was a low-
volume, high-technology producer. Hughes insisted on having process
control for all key processes and proved them either through use on
other satellite production or through statistical process control
techniques. Hughes was also included as part of our best practice study
on technology development (GAO/NSIAD-99-162).
DOD asserted that moving decision points to an earlier point in the
program reduces risks, rather than increases them as our report states.
We disagree. The space policy proposes to make commitments to product
development (including point estimates on cost, schedule, and
performance) before sufficient knowledge has been achieved. It requires
decision makers to commit first to product development without having
technology in hand, and second to production of the first two products
without production knowledge in hand. This is the traditional DOD
approach, which has consistently resulted in capability being delivered
much later and much more expensively than planned. The commitment to
product development (and the requisite estimates) can be done more
confidently and the product development cycle time can be much shorter
only if decisions are knowledge-based.
While officials have told us that the intent of the policy is to
complete technology development during phase B, they acknowledged that
the policy does not identify an end point for technology development
and that, in some cases, it could continue until the point the program
is ready to begin building the first satellite.
[End of section]
Related GAO Products:
Space Reports:
Military Space Operations: Common Problems and Their Effects on
Satellite and Related Acquisitions. GAO-03-825R. Washington, D.C.:
June 2, 2003.
Polar-Orbiting Environmental Satellites: Project Risks Could Affect
Weather Data Needed by Civilian and Military Users. GAO-03-987T.
Washington, D.C.: July 15, 2003.
Missile Defense: Alternate Approaches to Space Tracking
and Surveillance System Need to Be Considered. GAO-03-597.
Washington, D.C.: May 23, 2003.
Military Space Operations: Planning, Funding, and Acquisition
Challenges Facing Efforts to Strengthen Space Control. GAO-02-738.
Washington, D.C.: September 23, 2002.
Polar-Orbiting Environmental Satellites: Status, Plans, and Future Data
Management Challenges. GAO-02-684T. Washington, D.C.: July 24, 2002.
Defense Acquisitions: Space-Based Infrared System-Low at Risk of
Missing Initial Deployment Date. GAO-01-6. Washington, D.C.: February
28, 2001.
Best Practice Reports:
Defense Acquisitions: Assessments of Major Weapon Programs. GAO-03-476.
Washington, D.C.: May 15, 2003.
Defense Acquisitions: Matching Resources With Requirements Is Key to
the Unmanned Combat Air Vehicle Program's Success. GAO-03-598.
Washington, D.C.: June 30, 2003.
Best Practices: Better Acquisition Outcomes Are Possible If DOD Can
Apply Lessons from F/A-22 Program. GAO-03-645T. Washington, D.C.: April
11, 2003.
Best Practices: Setting Requirements Differently Could Reduce Weapon
Systems' Total Ownership Costs. GAO-03-57. Washington, D.C.: February
11, 2003.
Best Practices: Capturing Design and Manufacturing Knowledge
Early Improves Acquisition Outcomes. GAO-02-701. Washington, D.C.:
July 15, 2002.
Defense Acquisitions: DOD Faces Challenges in Implementing Best
Practices. GAO-02-469T. Washington, D.C.: February 27, 2002.
Best Practices: DOD Teaming Practices Not Achieving Potential Results.
GAO-01-510. Washington, D.C.: April 10, 2001.
Best Practices: Better Matching of Needs and Resources Will Lead
to Better Weapon System Outcomes. GAO-01-288. Washington, D.C.:
March 8, 2001.
Best Practices: A More Constructive Test Approach Is Key to
Better Weapon System Outcomes. GAO/NSIAD-00-199. Washington, D.C.:
July 31, 2000.
Defense Acquisitions: Employing Best Practices Can Shape Better Weapon
System Decisions. GAO/T-NSIAD-00-137. Washington, D.C.: April 26, 2000.
Best Practices: Better Management of Technology Development Can Improve
Weapon System Outcomes. GAO/NSIAD-99-162. Washington, D.C.: July 30,
1999.
Best Practices: Successful Application to Weapons Acquisitions Requires
Changes in DOD's Environment. GAO/NSIAD-98-56. Washington, D.C.:
February 24, 1998.
FOOTNOTES
[1] Other DOD weapons-related acquisitions (e.g., aircraft, ships, and
tanks) fall under DOD's new 5000 Series. Missile defense systems, such
as the Space Tracking and Surveillance System, fall under a process
designed and managed by the Missile Defense Agency.
[2] Our best practice reviews are identified in the Related GAO
Products at the end of this report.
[3] In the mid-1990s, SBIRS was established as a "system of systems"
approach with two components, SBIRS-High and SBIRS-Low, that were
managed by the Air Force. In 2000, SBIRS-Low was shifted back from the
Air Force to the Ballistic Missile Defense Organization, which is now
the Missile Defense Agency. In 2002, SBIRS-Low was renamed Space
Tracking and Surveillance System (STSS). While STSS is focused
primarily on supporting the missile defense mission, SBIRS-High is
focused on missile warning, missile defense, technical intelligence,
and battlespace characterization and is managed by the Air Force.
[4] U.S. General Accounting Office, Missile Defense: Alternate
Approaches to Space Tracking and Surveillance System Need to Be
Considered, GAO-03-597 (Washington, D.C.: May 23, 2003).
[5] U.S. General Accounting Office, Military Space Operations: Common
Problems and Their Effects on Satellite and Related Acquisitions, GAO-
03-825R (Washington, D.C.: June 2, 2003).
[6] U.S. General Accounting Office, Best Practices: Better Management
of Technology Development Can Improve Weapon System Outcomes, GAO/
NSIAD-99-162 (Washington, D.C.: July 30, 1999).
[7] Booz Allen Hamilton, "Space Systems Development Growth Analysis,"
Los Angeles, CA, August 2, 2002.
[8] U.S. General Accounting Office, Defense Acquisitions: Space-Based
Infrared System-Low At Risk of Missing Initial Deployment Date, GAO-01-
6 (Washington, D.C.: February 28, 2001).
[9] U.S. General Accounting Office, Best Practices: Better Management
of Technology Development Can Improve Weapon System Outcomes, GAO/
NSIAD-99-162 (Washington, D.C.: July 30, 1999).
[10] U.S. General Accounting Office, Best Practices: Better Management
of Technology Development Can Improve Weapon System Outcomes, GAO/
NSIAD-99-162 (Washington, D.C.: July 30, 1999).
GAO's Mission:
The General Accounting Office, the investigative arm of Congress,
exists to support Congress in meeting its constitutional
responsibilities and to help improve the performance and accountability
of the federal government for the American people. GAO examines the use
of public funds; evaluates federal programs and policies; and provides
analyses, recommendations, and other assistance to help Congress make
informed oversight, policy, and funding decisions. GAO's commitment to
good government is reflected in its core values of accountability,
integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through the Internet. GAO's Web site (www.gao.gov) contains
abstracts and full-text files of current reports and testimony and an
expanding archive of older products. The Web site features a search
engine to help you locate documents using key words and phrases. You
can print these documents in their entirety, including charts and other
graphics.
Each day, GAO issues a list of newly released reports, testimony, and
correspondence. GAO posts this list, known as "Today's Reports," on its
Web site daily. The list contains links to the full-text document
files. To have GAO e-mail this list to you every afternoon, go to
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order
GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2
each. A check or money order should be made out to the Superintendent
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or
more copies mailed to a single address are discounted 25 percent.
Orders should be sent to:
U.S. General Accounting Office
441 G Street NW, Room LM
Washington, D.C. 20548:
To order by Phone:
Voice: (202) 512-6000:
TDD: (202) 512-2537:
Fax: (202) 512-6061:
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470:
Public Affairs:
Jeff Nelligan, managing director, NelliganJ@gao.gov, (202) 512-4800
U.S. General Accounting Office, 441 G Street NW, Room 7149
Washington, D.C. 20548: