Defense Acquisitions

The Expeditionary Fighting Vehicle Encountered Difficulties in Design Demonstration and Faces Future Risks. GAO ID: GAO-06-349. May 1, 2006.

The Marine Corps' Expeditionary Fighting Vehicle (EFV) is the Corps' number-one priority ground system acquisition program and accounts for 25.5 percent of the Corps' total acquisition budget for fiscal years 2006 through 2011. It will replace the current amphibious assault craft and is intended to provide significant increases in mobility, lethality, and reliability. We reviewed the program under the Comptroller General's authority to examine (1) the cost, schedule, and performance of the EFV program during system development and demonstration; (2) factors that have contributed to this performance; and (3) future risks the program faces as it approaches production.

Although the EFV program had followed a knowledge-based approach early in development, its buying power has eroded during System Development and Demonstration (SDD). Since beginning this final phase of development in December 2000, cost has increased 45 percent. Unit costs have increased from $8.5 million to $12.3 million. The program schedule has grown by 35 percent, or 4 years, and its reliability requirement has been reduced from 70 hours of continuous operation to 43.5 hours. Program difficulties occurred in part because not enough time was allowed to demonstrate maturity of the EFV design during SDD. The SDD schedule of about 3 years proved too short to conduct all necessary planning and to incorporate the results of tests into design changes, resulting in schedule slippages. In addition, several significant technical problems surfaced, including problems with the hull electronic unit, the bow flap, and the hydraulics. Reliability also remains a challenge. Three areas of significant risk remain in demonstrating design and production maturity, each with potentially significant cost and schedule consequences. First, the program plans to enter low-rate initial production without requiring the contractor to demonstrate that the EFV's manufacturing processes are under control. Second, the EFV program will begin low-rate initial production without the knowledge that software development capabilities are sufficiently mature. Third, two key performance parameters--reliability and interoperability--are not scheduled to be demonstrated until initial operational test and evaluation in fiscal year 2010, about 4 years after low-rate initial production has begun.
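The growth percentages above can be checked against the dollar figures reported in this product (total cost grew by $3.9 billion to $12.6 billion; unit cost grew from $8.5 million to $12.3 million). A minimal arithmetic sketch in Python; the $8.7 billion SDD-start total is inferred from those reported figures, and the schedule baseline is not recomputed here:

    # Rough check of the growth percentages cited in this report.
    current_total_b = 12.6                            # $ billions, current estimate
    growth_b = 3.9                                    # $ billions, growth since SDD start
    baseline_total_b = current_total_b - growth_b     # about $8.7 billion at SDD start
    total_growth_pct = 100 * growth_b / baseline_total_b        # about 45 percent
    unit_growth_pct = 100 * (12.3 - 8.5) / 8.5                   # about 45 percent
    print(f"Total acquisition cost growth: {total_growth_pct:.0f} percent")
    print(f"Average unit cost growth: {unit_growth_pct:.0f} percent")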

Recommendations

Our recommendations from this work are listed below with a contact for more information. GAO recommends that the Secretary of Defense ensure that EFV design, production, and mature software development capabilities are demonstrated before Milestone C; that adequate resources are available to cover such demonstration and provide for risks; and that the EFV business case, after including the above, still warrants continued investment. GAO also recommends that the Secretary draw lessons learned from the EFV experience and apply them to the Defense Acquisition University's curriculum for instructing program executives, managers, and their staffs. DOD concurred with the recommendations.

Director: Paul L. Francis
Team: Acquisition and Sourcing Management
Phone: (202) 512-4841


United States Government Accountability Office
Report to Congressional Committees
May 2006

DEFENSE ACQUISITIONS: The Expeditionary Fighting Vehicle Encountered Difficulties in Design Demonstration and Faces Future Risks

GAO-06-349
Contents:

Letter
Results in Brief
Background
Cost, Schedule, and Other Problems Have Reduced EFV Buying Power
Difficulty of Demonstrating Design Maturity Was Underestimated
Risks Remain for Demonstrating Design and Production Maturity
Conclusions
Recommendations for Executive Actions
Agency Comments and Our Evaluation
Appendix I: Scope and Methodology
Appendix II: Comments from the Department of Defense
Appendix III: GAO Contact and Staff Acknowledgment

Tables:

Table 1: Program Office Rationales for Rebaselining the EFV Program Since Entering SDD
Table 2: Comparison of Key Events Timing

Figures:

Figure 1: Current EFV under Development
Figure 2: Comparison of EFV Acquisition Cost to the Marine Corps' Total Acquisition Cost for Fiscal Years 2006-2011 (Then-year dollars)
Figure 3: EFV Acquisition Cost Growth Since the Start of System Development and Demonstration
Figure 4: Best Practices for Demonstrating Design Maturity
Figure 5: EFV Hull Electronics Unit
Figure 6: EFV Bow Flap
Figure 7: Original Reliability Growth Plan
Figure 8: Current Reliability Growth Plan

Abbreviations:

DOD: Department of Defense
DOT&E: Director, Operational Test and Evaluation
EFV: Expeditionary Fighting Vehicle
GDAMS: General Dynamics Amphibious Systems
GDLS: General Dynamics Land Systems
HEU: Hull Electronic Unit
SDD: System Development and Demonstration

United States Government Accountability Office
Washington, DC 20548

May 1, 2006

The Honorable John Warner
Chairman
The Honorable Carl Levin
Ranking Minority Member
Committee on Armed Services
United States Senate

The Honorable Duncan L. Hunter
Chairman
The Honorable Ike Skelton
Ranking Minority Member
Committee on Armed Services
House of Representatives

Congress continues to express concerns over both the costs and the cost growth of the Department of Defense's (DOD) major acquisition programs[Footnote 1] and over DOD not following its own acquisition policies.
In a November 2005 hearing on DOD acquisition reform, the House Armed Services Committee noted that DOD's acquisition costs and capabilities were increasing so much for individual systems that the nation will not be able to afford enough of them to support its missions; it also observed that the symptoms of this problem include increasing costs and programs ignoring internal regulations and processes. We have reported on widespread and persistent cost, schedule, and performance problems with major weapon system developments and DOD's inability to resolve them. Over the last 9 years, we have benchmarked successful commercial and defense development programs and identified the key characteristics for getting better outcomes as being knowledge-based. Successful programs insist on having key product knowledge demonstrated at key points in a new development. We have found that a sound business case at the beginning of the system development and demonstration (SDD) phase is essential for the successful completion of a weapon system program.[Footnote 2] Demonstrated knowledge at key junctures is at the core of the business case. The basic elements of a sound business case at the start of SDD include:

* A match must be made between the customer's needs and mature technology. We refer to this as knowledge point 1.

* The acquisition strategy for SDD should provide for demonstrating:

- Design stability at the time of the critical design review (knowledge point 2).

- The design meets performance requirements, is reliable, and can be produced within cost, schedule, and quality targets before production begins (knowledge point 3).

* A realistic cost estimate is made to support the acquisition strategy.

* Sufficient funds are available to cover realistic program costs.

In sum, successful programs insist on having key product knowledge demonstrated at key points in a new development. Starting in October 2000, DOD incorporated a knowledge-based approach in its policy that guides major acquisitions and expanded this approach in its May 2003 policy.[Footnote 3] The way to implement this policy is through decisions on individual programs. As we have reported, most individual programs do not follow a knowledge-based approach, preferring instead to proceed without adequate knowledge and to accept the consequences of lost buying power that attend subsequent cost increases.[Footnote 4] The Marine Corps' Expeditionary Fighting Vehicle (EFV) is a major acquisition program that did show indications of following a knowledge-based approach and other best practices. For example, the program adopted best practices early in its implementation of Integrated Product Teams and trained its program office staff on this acquisition improvement initiative. In addition, as we have reported, the earlier EFV program was a leader both in the use of Integrated Product Teams and in the use of Cost as an Independent Variable.[Footnote 5] The EFV program has since been used by the Defense Acquisition University as a lessons-learned case study for training acquisition program managers. We reviewed the EFV program under the Comptroller General's authority to determine how it is performing against its business case. Specifically, this report addresses:

* the cost, schedule, and performance of the EFV program during SDD;

* factors that have contributed to this performance; and

* future risks the program faces as it approaches production.

In conducting our review, we used knowledge-based acquisition strategy principles as a framework.
Appendix I contains details of our approach. We conducted our work from May 2005 to May 2006 in accordance with generally accepted government auditing standards. Results in Brief: Since the EFV program began the System Development and Demonstration (SDD) phase, its return on investment has eroded as costs have increased, deliveries have been delayed, and expected reliability has been lowered. Since December 2000, the EFV's total cost has grown by about $3.9 billion or 45 percent, to $12.6 billion. Cost per vehicle has increased from $8.5 million to $12.3 million. Deliveries of vehicles to the warfighter have been delayed, as planned production quantities have been reduced by about 55 percent over fiscal years 2006- 2011, and the development schedule has grown by about 4 years, or 35 percent. Furthermore, a key requirement has been lowered. EFV reliability--a key performance parameter--has been reduced from 70 hours of continuous operation to 43.5 hours. Program difficulties occurred in part because not enough time was allowed to demonstrate maturity of the EFV design during SDD. Best practices (and current DOD acquisition policy) call for system integration work to be conducted before the critical design review is held. This review represents the commitment to building full-scale SDD prototypes that are representative of the production vehicle. In the case of the EFV, however, the SDD critical design review was held before the system integration work had been fully completed. While testing of early prototypes began 1 year before SDD critical design review, it continued for 3 more years after the decision to begin building the SDD prototypes. The SDD schedule of about 3 years proved too short to conduct all necessary planning and to incorporate the results into design changes, resulting in schedule delays and cost increases. Lessons learned from testing the early prototypes necessitated design changes in the SDD prototypes, which delayed their delivery and testing. The schedule was delayed further to allow more time to demonstrate the reliability of the EFV using the SDD prototypes. Even with the delays, it is clear that the actual test hours accumulated are significantly less than planned. While the original plan called for conducting 12,000 hours of testing by September 2005, the current plan will not achieve this level until after 2008. Also, several significant problems have surfaced in testing the SDD prototypes, including problems with the hull electronic unit (HEU), the bow flap, and the hydraulics. Three areas of risk remain for demonstrating design and production maturity, which have potential cost and schedule consequences--risks for the EFV's business case. First, while the EFV program has taken steps and made plans to reduce risk in the production phase, production risk remains in the program. Current plans are to enter low-rate initial production without requiring the contractor to ensure that all key EFV manufacturing processes are under control. Second, the EFV program will transition to low-rate initial production without the knowledge that software development capabilities are mature. Third, two key performance parameters--reliability and interoperability--are not scheduled to be demonstrated until the initial test and evaluation phase in fiscal year 2010, about 4 years after low-rate initial production has begun. The program office has developed plans to resolve performance challenges, and believes they will succeed. 
However, until the plans are actually implemented successfully, the EFV's design and production maturity will not be demonstrated, and the potential for additional cost and schedule increases remains while production units are being made. We are making recommendations in this report that the Secretary of Defense (1) delay the EFV program's Milestone C until design maturity and other conditions are achieved and (2) draw lessons from the EFV experience that can be applied to other acquisition programs. After a review of a draft of this report, DOD concurred with our recommendations and provided some technical comments that were incorporated, as appropriate.

Background:

The EFV is the Corps' number-one priority ground system acquisition program and is the successor to the Marine Corps' existing amphibious assault vehicle. It is designed to transport troops from ships offshore to their inland destinations at higher speeds and from farther distances, and to be more mobile, lethal, reliable, and effective in all weather conditions. It will have two variants--a troop carrier for 17 combat-equipped Marines and a crew of 3, and a command vehicle to manage combat operations in the field. The Marine Corps' total EFV program requirement is for 1,025 vehicles. Figure 1 depicts the EFV system.

Figure 1: Current EFV under Development: [See PDF for image] Source: General Dynamics Land Systems. [End of figure]

The EFV's total acquisition cost is currently estimated to be about $12.6 billion. In addition, the EFV accounts for a substantial portion of the Marine Corps' total acquisition budget for fiscal years 2006 through 2011, as figure 2 shows.

Figure 2: Comparison of EFV Acquisition Cost to the Marine Corps' Total Acquisition Cost for Fiscal Years 2006-2011 (Then-year dollars): [See PDF for image] Source: GAO analysis of EFV program office data. [End of figure]

The EFV program began its program definition and risk reduction phase in 1995 and was originally referred to as the Advanced Assault Amphibious Vehicle. The Marine Corps' existing assault amphibious vehicle was originally fielded in 1972 and will be over 30 years old when the EFV is fielded. Several Marine Corps studies identified deficiencies in the existing vehicle, including the lack of necessary lethality to defeat projected emerging threats. Despite efforts to extend the service life of the existing vehicle, Marine Corps officials stated that serious warfighting deficiencies remained. The studies concluded that the existing vehicle was unable to perform the type of combat missions envisioned by the Marine Corps' emerging combat doctrine and that a new vehicle was needed.[Footnote 6] In September 2003, DOD officially changed the name of the new vehicle to the EFV, in keeping with the Marine Corps' cultural shift from a 20th century force defined by amphibious operations to a 21st century force focused on a broadened range of employment concepts and possibilities across a spectrum of conflict. The new vehicle is a self-deploying, high water-speed, amphibious, armored, tracked vehicle, and is to provide essential command, control, communications, computers, and intelligence functions for embarked personnel and EFV units. These functions are to be interoperable with other Marine Corps systems as well as with Army, Air Force, Navy, and NATO systems. The EFV transitioned to SDD in December 2000. The use of a knowledge-based acquisition approach was evident at the onset of the EFV program.
Early in the program, at the start of program definition and risk reduction, the Marine Corps ensured that four of the five critical program technologies were mature. Although the fifth technology (the moving map navigation technology, which provides situational awareness) was not mature at that time, it was sufficiently matured after the program transitioned to SDD. Furthermore, the EFV design showed evidence of being stable by the completion and release of design drawings. At critical design review, 84 percent of the drawings were completed and released. The program now has 100 percent of the EFV drawings completed. Program officials expect that only about 12 percent of the design drawings are likely to be changed in the future as a result of planned reliability testing.

Cost, Schedule, and Other Problems Have Reduced EFV Buying Power:

Since entering SDD in December 2000, the EFV program's total cost has grown by about $3.9 billion, or 45 percent.[Footnote 7] Production quantities have been reduced by about 55 percent over fiscal years 2006-2011, thereby reducing the capabilities provided to the warfighter during this period. Cost per vehicle has increased from $8.5 million to $12.3 million; however, total quantities remain unchanged. During the same period, the EFV's development schedule has grown by about 4 years, or 35 percent. Furthermore, a key requirement has been lowered. EFV reliability--a key performance parameter--has been reduced from 70 hours of continuous operation to 43.5 hours. Thus, overall EFV buying power has been reduced, for it will now take substantially more money than was estimated at the start of SDD to acquire the same number of vehicles later and more slowly, and with a reduced operational reliability requirement.

EFV Costs and Schedule Have Grown Significantly Since Entering SDD:

Since entering SDD in December 2000 and holding the SDD critical design review in January 2001, the EFV program's total acquisition cost has grown by about $3.9 billion, or 45 percent, to $12.6 billion. Figure 3 shows how costs have grown over time.

Figure 3: EFV Acquisition Cost Growth Since the Start of System Development and Demonstration: [See PDF for image] Source: GAO analysis of program office data. [End of figure]

While total quantities have not changed, production quantities over fiscal years 2006-2011 were reduced by about 55 percent, from 461 vehicles to 208. This means that the warfighter will get the capability the EFV provides more slowly. The EFV program has been rebaselined three times since SDD began, as shown in table 1.[Footnote 8]

Table 1: Program Office Rationales for Rebaselining the EFV Program Since Entering SDD:

Date of rebaseline: November 2002; Rationale: Prototypes were not delivered as anticipated, and additional time was needed for reliability testing prior to the Milestone C decision; Impact on program schedule: 12-month increase.

Date of rebaseline: March 2003; Rationale: DOD's Director, Operational Test and Evaluation directed that more time be added for more robust operational testing prior to Milestone C; Impact on program schedule: 12-month increase.

Date of rebaseline: March 2005; Rationale: The rebaseline was implemented to incorporate the program changes resulting from DOD's Program Budget Decision 753; Impact on program schedule: 24-month increase.

Source: GAO analysis of EFV program office data.

[End of table]

Because the rebaselines have occurred incrementally over time, the EFV program has not previously been required to submit a unit cost increase report to Congress. Congress enacted the unit cost reporting statute in 1982; now codified at 10 USC 2433, it is commonly referred to as Nunn-McCurdy, after the congressional leaders responsible for the requirement. The statute requires the Secretary of Defense to certify a program to Congress when unit cost growth in constant dollars reaches 25 percent above the most recent rebaselined cost estimate and to report to Congress when it reaches 15 percent. The National Defense Authorization Act for fiscal year 2006[Footnote 9] made changes to Nunn-McCurdy. The primary change that affects the EFV program is the additional requirement to report unit cost growth of 30 percent or more above the original baseline estimate approved at SDD. The EFV program recently reported that its program average unit cost has increased by at least 30 percent above its original baseline estimate at SDD. Although the EFV program acquisition unit cost has increased by at least 30 percent since SDD began, no single increase between rebaselines has reached the 15 percent reporting threshold.
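To make the interaction between incremental rebaselining and these thresholds concrete, a minimal sketch follows (Python). It encodes only the three thresholds as characterized in this report; the function name is illustrative, the $8.5 million and $12.3 million values are the unit costs reported above, the $11.0 million interim baseline is hypothetical, and real Nunn-McCurdy determinations involve constant-dollar conversions and statutory criteria not modeled here.

    def unit_cost_breach_flags(current, latest_baseline, original_baseline):
        # Thresholds as described in this report: report at 15 percent and
        # certify at 25 percent above the most recent rebaselined estimate,
        # and (after the fiscal year 2006 NDAA) report at 30 percent above
        # the original baseline approved at SDD.
        vs_latest = (current - latest_baseline) / latest_baseline
        vs_original = (current - original_baseline) / original_baseline
        return {
            "report, 15 percent over latest baseline": vs_latest >= 0.15,
            "certify, 25 percent over latest baseline": vs_latest >= 0.25,
            "report, 30 percent over original baseline": vs_original >= 0.30,
        }

    # Unit costs in millions of dollars; the interim baseline is hypothetical.
    # Growth against the latest baseline stays under 15 percent even though
    # growth against the original SDD baseline exceeds 30 percent.
    print(unit_cost_breach_flags(current=12.3,
                                 latest_baseline=11.0,
                                 original_baseline=8.5))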
Overall, the program schedule has grown by 48 months, or 35 percent, from December 2000 at the start of SDD to the most recent rebaselining in March 2005. This schedule growth has delayed the occurrence of key events. For example, the EFV program was originally scheduled to provide the Marine Corps with its initial operational capability vehicles in September 2006, but is now scheduled to provide this capability in September 2010. Details of key event schedule changes are shown in table 2.

Table 2: Comparison of Key Events Timing:

Key event: Milestone B (System Development and Demonstration); Baseline schedule (12/2000): December 2000; Current schedule (3/2005): December 2000.

Key event: Critical Design Review; Baseline schedule (12/2000): January 2001; Current schedule (3/2005): January 2001.

Key event: Milestone C (Low-Rate Initial Production); Baseline schedule (12/2000): October 2003[A]; Current schedule (3/2005): December 2006.

Key event: Initial Operational Test and Evaluation; Baseline schedule (12/2000): start August 2007, end April 2008; Current schedule (3/2005): start May 2009, end January 2010.

Key event: Full-Rate Production; Baseline schedule (12/2000): August 2008; Current schedule (3/2005): August 2010.

Key event: Deliveries start; Baseline schedule (12/2000): May 2010; Current schedule (3/2005): May 2012.

Key event: Initial Operational Capability; Baseline schedule (12/2000): September 2006; Current schedule (3/2005): September 2010.

Source: GAO analysis of EFV program office data.

[A] In 1999, the program office accelerated Milestone C from July 2005 to October 2003.

[End of table]

Reliability Requirement Reduced:

In 2005, the Marine Corps received approval to lower the EFV's reliability requirement from 70 hours of operation before maintenance is needed to 43.5 hours.[Footnote 10] This decision was based on a revised analysis of the EFV's mission profile and the vehicle's demonstrated reliability. At the start of SDD, the EFV's operational reliability requirement was 70 hours of operation before maintenance is needed. Program officials told us this 70-hour requirement was based on the EFV's mission profile at the time, which called for a "do-all" mission over one 24.3-hour period of operation.
The original reliability growth plan anticipated that this requirement would be met after initial operational test and evaluation, which was then planned for August 2007. In 2002, the Marine Corps' Combat Development Command performed an independent analysis of the original 70-hour reliability requirement and determined that it would likely be very difficult to achieve. Additionally, the analysis determined that this requirement was excessively high when compared with similar types of vehicles. In fiscal year 2004, DOD's Director, Operational Test and Evaluation (DOT&E) reported that overall EFV reliability remained a significant challenge because of the system's comparative complexity and harsh operating environment. In 2004, the Marine Corps' Combat Development Command reviewed the 70-hour requirement and recommended that it be reduced to 43.5 hours. According to program officials, the primary reason for the reduction to 43.5 hours was to more accurately depict the Marine Corps' current mission profile for the EFV, which calls for a 12.5-hour mission day. The Joint Requirements Oversight Council approved the reliability reduction to 43.5 hours in January 2005.

Difficulty of Demonstrating Design Maturity Was Underestimated:

The program's development schedule did not allow enough time to demonstrate maturity of the EFV design during SDD. The critical design review was held almost immediately after SDD began. Testing of early prototypes continued for 3 years after the decision to begin building the SDD prototypes. Test schedules for demonstrating design maturity in the integrated, full-system SDD prototypes proved optimistic and success-oriented, and were extended twice. After the schedules were extended, major problems were discovered in testing the prototypes.

Best Practices for Demonstrating Design Maturity:

Conceptually, as figure 4 illustrates, SDD has two phases: a system integration phase to stabilize the product's design and a system demonstration phase to demonstrate that the product can be manufactured affordably and work reliably.[Footnote 11]

Figure 4: Best Practices for Demonstrating Design Maturity: [See PDF for image] Source: DOD and GAO. [End of figure]

The system integration phase is used to stabilize the overall system design by integrating components and subsystems into a product and by showing that the design can meet product requirements. When this knowledge is captured, knowledge point 2 has been achieved. Leading commercial companies use several criteria to determine that this point has been achieved, including completion of 90 percent of engineering drawings and prototype or variant testing to demonstrate that the design meets the requirements. When knowledge point 2 is reached, a decision review--or critical design review--is conducted to ensure that the program is ready to move into system demonstration. This review represents the commitment to building full-scale SDD prototypes that are representative of the production vehicle. The system demonstration phase is then used to demonstrate that the product will work as required and can be manufactured within targets. When this knowledge is captured, knowledge point 3 has been achieved. DOD uses this conceptualization of SDD for its acquisition policy and guidance.[Footnote 12] The EFV program met most of the criteria for SDD critical design review, which it held in January 2001, about 1 month after entering SDD.
In particular, it had 84 percent of drawings completed and had conducted early prototype testing during the last year of program definition and risk reduction. However, this early prototype testing had not been fully completed prior to critical design review. Testing of the early prototypes continued for 3 years into SDD, well after the SDD critical design review decision to begin building the SDD prototypes.

Initial SDD Test Schedules Were Optimistic and Success-Oriented:

The program did not allow enough time to demonstrate maturity of the EFV design during SDD. The original SDD schedule of about 3 years proved too short to conduct all necessary planning and to incorporate the results of tests into design changes. Specifically, the original schedule did not allow adequate time for testing, evaluating the results, fixing the problems, and retesting to make certain that problems are fixed before moving forward. Testing is the main process used to gauge the progress being made when an idea or concept is translated into an actual product.[Footnote 13] Evaluation is the process of analyzing and learning from a test. The ultimate goal of testing and evaluation is to make sure the product works as intended before it is provided to the customer. Consequently, it is essential to build sufficient testing and evaluation time into program development to minimize or avoid schedule slippages and cost increases.

Prior to entering SDD, during both the concept evaluation and the program definition and risk reduction phases, the EFV program conducted a variety of component and subsystem tests. This testing included an engineering-model and prototype-testing program, as well as modeling and simulation test programs. Early EFV testing also included early operational assessment tests on the initial prototype developed during program definition and risk reduction. During this phase, the EFV program demonstrated key aspects of performance, including the technological maturity to achieve the high water speed and land mobility needed for the EFV mission. In addition, a number of subsystem tests were conducted on key components of the EFV, including the main engine; water jets; propulsion drive train components; weapons; nuclear, biological, and chemical filters; track and suspension units; and nearly all of the vehicle electronics. Nevertheless, the SDD schedule was extended twice to ensure adequate system-level testing time. In November 2002, the program office extended the test schedule by 12 months for additional testing prior to low-rate initial production. According to program officials, this extension was necessary for several reasons. Lessons learned from testing the early prototypes necessitated design changes in the SDD prototypes, which delayed delivery and testing of the SDD prototypes. In addition, testing was taking longer than anticipated, additional time was needed for reliability testing, and more training was required to qualify crews prior to certain events. For example, the results of the early EFV firepower, water operations, and amphibious ship testing revealed the need for more testing. The schedule was delayed further to allow more time to demonstrate the reliability of the EFV using the SDD prototypes.
In March 2003, DOT&E directed that the EFV test schedule be extended for yet another 12 months so that more developmental testing and more robust operational testing could occur before initial production.

EFV Program Encountered Design Maturity Problems:

After the two schedule adjustments, testing of SDD prototypes revealed major problems in maturing the system's design. Specifically, the program experienced problems with the HEU, bow flap, system hydraulics, and reliability.

Hull Electronic Unit:

The HEU provides the computer processing for the EFV's mobility, power, and auxiliary computer software configuration and for the command and control software application. Figure 5 shows the HEU.

Figure 5: EFV Hull Electronics Unit: [See PDF for image] Source: EFV Program Office. [End of figure]

In November 2004, during integrated system-level testing on the SDD prototypes, there were major problems with the HEU. For example, the water-mode steering froze, causing the vehicle to be non-responsive to the driver's steering inputs, and both the HEU and the crew's display panel shut down during EFV operation. Consequently, testing ceased until the causes of the problems could be identified and corrections made. The program office conducted a root-cause analysis and traced the problems to both hardware and software sources. The program office made design changes and modifications to correct the problems, and testing resumed in January 2005, after about a 2-month delay. According to program officials, these changes and modifications were installed by May 2005 in the vehicles that will be used to conduct the operational assessment tests. Again according to program officials, these problems have not recurred. However, the HEU has experienced some new problems in testing since then. For example, in June 2005, some status indicators on the crew's display panel shut down during land operations and had to be rebooted. Program officials commented that corrective actions for HEU problems have been initiated and tested to ensure that the actions resolved the problems. We did not independently verify program officials' statements about initiation and testing of corrective actions.

Bow Flap:

The bow flap is a folding appendage on the front of the EFV that is hydraulically extended forward during EFV water operations. The bow flap provides additional surface area that is used to generate additional hydrodynamic lift as the vehicle moves through the water. Figure 6 shows the bow flap.

Figure 6: EFV Bow Flap: [See PDF for image] Source: EFV Program Office. [End of figure]

Prior to entering SDD, major problems occurred with an earlier version of the bow flap in testing using early prototypes. Root-cause analysis traced these problems to bow flap overloading. Consequently, the bow flap was redesigned but was not retested on the early prototypes before the new design was installed on the SDD prototypes. Problems with the new bow flaps occurred during subsequent SDD prototype testing. For example, in September and October 2004, two bow flaps failed--one bent and one cracked. Again, the program office conducted a root-cause analysis, which determined that loading--while no longer excessive--was inappropriately distributed on the bow flaps. Following corrective action, tests were conducted in Hawaii during July and August 2005 to validate the load capacity of the new bow flap.
These tests revealed that the design of the new bow flap needed some refinements in order to meet the operational requirement that the EFV be capable of operating in 3-foot significant wave heights.[Footnote 14] A program official indicated that the test results will be used to refine the design of the new bow flap. However, the refined bow flap design will not be tested in the operationally required 3-foot significant wave heights until initial operational testing and evaluation, well after the program enters low-rate initial production. Hydraulics: Hydraulic systems are key components in the EFV. For example, they control raising and lowering the bow flap, engine cooling systems, marine steering, and troop ramps. Hydraulic system failures are one of the top reliability drivers in the EFV program. If the reliability requirement is to be achieved, the myriad hydraulic problems must be resolved. The EFV has encountered hydraulic system problems on both early and SDD prototypes. The top four hydraulic system problems are: * Leaks from all sources, particularly leaks due to the loosening of fittings and connectors because of vibration during EFV operations. * Various component mechanical failures experienced during EFV testing. * Hydraulic fluid pressure spikes, particularly in the EFV's transmission and pumps. * Hydraulic fluid contamination by air, water, and particulates. Program officials said that the program office has instituted a design/test/redesign process to identify deficiencies and implement corrections to increase vehicle reliability. According to program officials, this process brings together the program office, contractor, various subcontractor vendors of hydraulic components, and experts from industry and academia to address and correct hydraulic problems as they occur. Corrective actions thus far include: * Leaks--better sealing of connections; installation of specialized, self-locking devices at connections most susceptible to vibration leaks; and replacement of rigid tubing with flexible hoses to absorb vibration. * Component mechanical failures--redesigning, strengthening, and upgrading various parts. * Hydraulic fluid pressure spikes--reducing gear shifting during EFV operations and installing devices to control pressure. * Hydraulic fluid contamination--flushing hydraulic systems and instituting a variety of monitoring, maintenance, and inspection plans to maintain hydraulic fluid and component cleanliness requirements. Program officials noted that corrective actions thus far have been tested to ensure that they resolved the problems, and have been installed on the SDD prototype vehicles. We did not independently verify this. System Reliability: Based on lower demonstrated reliability and problems with early program testing, the EFV's reliability has not grown as planned. Expectations for reliability are now lower, as reflected in the recent reduction to the reliability requirement. When SDD began, the EFV was expected to demonstrate 48 hours between failures by September 2005. Actual growth demonstrated 28 hours between failures in August 2005. At the time of the low-rate initial production decision now planned for December 2006, demonstrated reliability is projected to be 38 hours between failures. The original and current reliability growth curves for the EFV are shown in figures 7 and 8, respectively. Figure 7: Original Reliability Growth Plan: [See PDF for image] Source: GAO analysis of EFV program office data. 
[End of figure] Figure 8: Current Reliability Growth Plan: [See PDF for image] Source: GAO analysis of EFV program office data. [End of figure] In comparing the planned and actual reliability growth curves, it is clear that the actual test hours accumulated have been significantly less than planned. In fact, the original plan called for conducting 12,000 hours of testing by the original September 2005 production decision; according to the current plan, test hours will not reach this level until early 2008. The reduction in test hours is due, in part, to the other problems that occurred in testing. The accumulation of test hours is significant for reliability. In general, reliability growth is the result of an iterative design, build, test, analyze, and fix process. Initial prototypes for a complex product with major technological advances have inherent deficiencies. As the prototypes are tested, failures occur and, in fact, are desired so that the product's design can be made more reliable. Reliability improves over time with design changes or manufacturing process improvements. The program office acknowledges that even with the changes in mission profile and reduction in the operational requirement, reliability for the EFV remains challenging. In addition, the most recent DOT&E annual report found that the EFV system's reliability is the area of highest risk in the program.[Footnote 15] DOT&E has reviewed the EFV's current reliability growth plan and believes that it is realistic but can only be validated during initial operational testing and evaluation in 2010. According to the program manager, an additional 15 months would have been needed for more robust reliability testing, production qualification testing, and training, after the program entered low-rate initial production in September 2005, as originally planned. The March 25, 2005, rebaselining extended the schedule by 24 months and postponed low-rate initial production until September 2006, which has now been extended to December 2006. While DOD's December 2004, Program Budget Decision 753 served as the catalyst for this rebaselining, the program manager stated that he probably would have asked for a schedule extension of 15 months after entering low-rate initial production in September 2005, even if the budget decision had not occurred. DOD and Marine Corps officials verified that, although the program manager did not officially request this 15-month extension, he had been discussing an extension with them before the budget decision was issued. However, to the extent that the extra 9 months resulting from the budget decision prove unneeded for program management reasons, they will be an added cause for schedule and cost growth. Risks Remain for Demonstrating Design and Production Maturity: Three areas of risk remain for demonstrating design and production maturity, which have potential cost and schedule consequences--risks to the EFV business case. First, while the EFV program has taken steps and made plans to reduce risk in the production phase, production risk remains in the program. Current plans are to enter low-rate initial production without requiring the contractor to ensure that all key EFV manufacturing processes are under control. Second, the EFV program will transition to initial production without the knowledge that software capabilities are mature. 
Third, two key performance parameters--reliability and interoperability--are not scheduled to be demonstrated until the initial operational test and evaluation phase in fiscal year 2010, about 4 years after low-rate initial production has begun. The program office has developed plans to resolve performance challenges and believes it will succeed. However, until the plans are actually implemented successfully, the EFV's design and production maturity will not be demonstrated and the potential for additional cost and schedule increases remains.

Manufacturing Process Maturity Problems:

While the EFV program has taken steps and made plans to reduce risk in the production phase, production maturity risk remains in the program. Current EFV program plans are to enter low-rate initial production without requiring the contractor to ensure that all key EFV manufacturing processes are under control, that is, repeatable, sustainable, and capable of consistently producing parts within the product's tolerances and standards. Establishing such control is critical to ensuring that the EFV can be produced reliably and without unexpected production problems. In addition, DOD's system acquisition policy provides that there be no significant manufacturing risks prior to entering low-rate initial production and that manufacturing processes be under statistical process control prior to starting full-rate production.[Footnote 16] Leading commercial firms rely on statistical process control to ensure that all key manufacturing processes are under control before they enter production.[Footnote 17] Statistical process control is a technique that focuses on reducing variations in manufactured parts, which in turn reduces the risk of entering production with unknown production capability problems. Reducing and controlling variability lowers the incidence of defective parts and thereby defective products, which may have degraded performance and lower reliability. Defects can also delay delivery and increase support and production costs by requiring reworking or scrapping. Consequently, prior to entering production, leading commercial firms collect and analyze statistical process control data. Leading commercial firms also use a measure of process control called the process capability index to measure both the consistency and the quality of the output of a process.
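As background on the process capability index mentioned above, a minimal sketch follows (Python). The calculation is the standard Cpk definition; the function name, specification limits, and measurements are hypothetical, since this report does not publish EFV manufacturing process data.

    import statistics

    def process_capability_index(samples, lower_spec, upper_spec):
        # Cpk: how many 3-sigma half-widths fit between the process mean
        # and the nearer specification limit; higher values mean the
        # process output sits more comfortably inside its tolerance band.
        mean = statistics.mean(samples)
        sigma = statistics.stdev(samples)
        return min((upper_spec - mean) / (3 * sigma),
                   (mean - lower_spec) / (3 * sigma))

    # Hypothetical machined-part dimension, in millimeters.
    measurements = [25.02, 24.98, 25.01, 24.99, 25.00, 25.03, 24.97, 25.01]
    print(round(process_capability_index(measurements, 24.90, 25.10), 2))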
DOD's acquisition policy applies a lower standard. It provides that there be no significant manufacturing risks prior to entering low-rate initial production and that manufacturing processes be under statistical process control prior to starting full-rate production.[Footnote 18] The EFV program is working toward the DOD standard. EFV program officials said that statistical process control will not be used to ensure that all key EFV manufacturing processes are under control prior to entering low-rate initial production. They stated that they have taken actions to enhance EFV production readiness. For example, they noted that one of the most important risk-mitigating actions taken was ensuring that SDD prototypes were built using production-representative tooling and processes. Program officials also believe that production process maturity will be demonstrated by achieving repetitive schedule and quality performance during low-rate initial production. In addition, the program plans to collect statistical process control data during low-rate initial production to track equipment and machine performance and detect statistical shifts. The program believes that using statistical process control data in this manner will result in earlier detection of machine malfunctions. Program officials told us that once sufficient quantities of the EFV are produced and baseline statistical process control data are collected, the results of this data will be applied to any production measurements that demonstrate process stability. The program office believes that this approach will allow for the use of statistical process control to implement stable manufacturing processes during low-rate initial production. However, the program office does not plan to set and achieve a process capability index for the EFV production effort. The actions taken by the program may help to mitigate some production risk. In fact, the EFV's plan to collect and use statistical process control data goes further than what we have found on most DOD weapon system programs. However, these actions do not provide the same level of confidence as having the manufacturing processes under statistical process control before production. The EFV program's approach of forgoing such control increases the risk of unexpected production problems during manufacturing. This risk is compounded by the fact that plans call for reliability and interoperability, along with resolution of other technical problems, to be operationally tested and demonstrated during low-rate initial production, not before.

Software Development Capability Maturity Problems:

Under current plans, the EFV program is at risk of entering low-rate initial production before software development capabilities are mature. Again, leading commercial firms ensure that software development capabilities are mature before entering production in order to prevent or minimize additional cost growth and schedule delays during this phase.[Footnote 19] Furthermore, DOD's weapon system acquisition policy calls for weapon systems to have mature software development capabilities before they enter low-rate initial production.[Footnote 20] In assessing software capability maturity, commercial firms, DOD, and GAO consider the software capability maturity model developed by Carnegie Mellon University's Software Engineering Institute to be an industry standard.[Footnote 21] This model focuses on improving, standardizing, and certifying software development processes, including key process areas that must be established in the software developer's organization. The model is essentially an evolutionary path organized into five maturity levels:

* Level 1, Initial--The software process is ad hoc and occasionally chaotic. Few processes are defined, and success depends on individual effort.

* Level 2, Repeatable--Basic project management processes are established to track cost, schedule, and functionality. The necessary process discipline is in place to repeat earlier successes on projects with similar applications.

* Level 3, Defined--The software process for both management and engineering activities is documented, standardized, and integrated into a standard process for the organization. All projects use an approved, tailored version of the organization's standard process for developing and maintaining software.

* Level 4, Managed--Detailed measures of the software process and product quality are collected. Both the software development process and products are quantitatively understood and controlled.
* Level 5, Optimizing--Continuous process improvement is enabled by quantitative feedback from the process and from piloting innovative ideas and technologies.

The EFV program has had problems with maturing its software development capabilities. The EFV's prime contractor, General Dynamics Land Systems (GDLS), which at the time had a level 3 maturity software capability, developed all software for the early EFV program.[Footnote 22] According to the program office, when the program entered SDD, responsibility for EFV's software development was transferred to GDLS' amphibious development division, General Dynamics Amphibious Systems (GDAMS). GDAMS has a level 1 maturity software capability. Consequently, the SDD contract required GDLS to achieve software development capability maturity level 3 for all EFV software contractors and subcontractors within 1 year of the contract award date, July 2001. In January 2002, the program extended this requirement by 1 year, until July 2003. Nevertheless, while GDAMS twice attempted to achieve level 3 software development capability maturity, it did not succeed. Program officials considered GDAMS's inability to achieve an acceptable level of software development capability maturity a risk to the program. To mitigate this risk, in January 2004, the program manager began developing a risk mitigation plan. As part of this plan, representatives from the EFV program office, GDAMS, and Ogden Air Logistics Center's 309th Software Maintenance Group--a certified level 5 maturity software development organization--formed a Software Partnership Working Group to address software development capability maturity issues. As of February 2006, EFV program officials were in the process of negotiating a memorandum of agreement with the 309th Software Maintenance Group to develop the EFV's low-rate initial production software. The 309th will work in partnership with GDAMS as specified by the terms of the memorandum of agreement. Its involvement is to ensure that the EFV's software development capability will be at the desired maturity level. However, the 309th Software Maintenance Group will not complete the software development for the EFV's low-rate initial production version until September 2006. Furthermore, GDAMS does not plan to insert this software into the EFV vehicles until fiscal year 2008, well after low-rate initial production has begun. This means that the low-rate initial production decision will be made without the integration of mature software. Furthermore, the software itself will not be demonstrated in the vehicle until well into low-rate initial production. While the program office believes that the level of software risk is acceptable, we have found that technology--including software--is mature when it has been demonstrated in its intended environment.[Footnote 23] While involving the 309th Software Maintenance Group helps to mitigate the risk of immature software development capability in the EFV program, it increases certain other risks. The memorandum of agreement distributes the responsibility for software development among the three participants. However, much of the responsibility for developing a working software package in an acceptably mature environment shifts from the prime contractor to the Marine Corps. The software will now become government-furnished equipment or information. In essence, the Marine Corps has now assumed much of the risk in the software development effort.
If the software does not work according to the requirements, it will be incumbent upon the Marine Corps--not the prime contractor, GDLS--to correct the problems. Furthermore, if the integration of the government-furnished software into the vehicles creates additional problems, the Marine Corps could be responsible for corrections. Both of these situations could lead to cost and schedule growth, and thus increase risks to the program. Performance Challenges Not Yet Fully Resolved: Several EFV performance challenges are not yet fully resolved. Specifically, a key performance parameter--interoperability--cannot be properly demonstrated until initial operational testing and evaluation in fiscal year 2010, well after low-rate initial production has begun. Interoperability means that the EFV communication system must provide essential command, control, communications, and intelligence functions for embarked personnel and EFV units. In addition, the EFV communication system must be compatible--able to communicate--with other Marine Corps systems as well as with Army, Navy, Air Force, and North Atlantic Treaty Organization systems. In order to demonstrate interoperability, the EFV must participate in operational tests that involve these joint forces. Another key performance parameter-- reliability--has been problematic and still presents a significant challenge.[Footnote 24] It also is not scheduled to be demonstrated until initial operational testing and evaluation. Furthermore, the bow flap has been problematic and, while improved, still requires some design refinement and has not yet been successfully tested at its operational performance level. Program officials commented that they have developed plans to resolve remaining EFV performance challenges and are optimistic that these plans will be implemented effectively and testing successfully completed. However, there are no guarantees that this will actually happen. Consequently, the performance challenges remain risks to the program until they are fully resolved with effective solutions actually demonstrated. Conclusions: The EFV has encountered risks to its business case because of problems encountered in full-system testing, coupled with an SDD schedule that did not allow enough time for conducting the testing and learning from it. Using the lens of a knowledge-based business case, the start of SDD was sound on requirements and technology maturity (knowledge point 1). While design stability was judged to be attained at the critical design review (knowledge point 2) immediately after entering SDD, it appears that holding critical design review so soon was premature. The acquisition strategy did not provide the resources (time and money) necessary to demonstrate design maturity and production maturity (knowledge point 3). However, we do note that the EFV program is planning to do more with statistical process control than most other programs we have reviewed. In retrospect, the EFV program would have been more executable had the SDD phase allowed for completion of early prototype testing before holding the SDD critical design review and committing to building the SDD prototypes. Another lesson learned is that while it is necessary to demonstrate one knowledge point before a subsequent one can be demonstrated, this alone is not sufficient. Attaining one knowledge point does not guarantee the attainment of the next one. 
Rather, the acquisition strategy for any program must adequately provide for the attainment of each knowledge point, even in programs, such as the EFV, that were in a favorable position at the start of SDD. The EFV program has put into place a number of corrective actions and plans to overcome and mitigate weaknesses in its acquisition strategy. Nevertheless, design, production, and software development capability maturity have not yet been fully demonstrated, nor have technical problems been fully corrected. It is important that the business case for the EFV remain valid in light of these changes and that the remainder of SDD adequately provide for the demonstration of design, production, and software development capability maturity before the program commits to production. While these problems must be acknowledged and addressed, the fact that the EFV program has had a number of sound features should not be overlooked. In this vein, the program can still be the source of lessons that DOD can apply to other programs. In particular, it is important that all of the elements of a sound business case be present at the start of SDD. While it is generally recognized that missing an early knowledge point will jeopardize the remaining ones, it must also be recognized that later knowledge points are not guaranteed even if early ones are achieved. If the acquisition strategy does not adequately provide for the attainment of all knowledge points, the estimates for cost and schedule will not have a sound basis.
Recommendations for Executive Actions:
We are recommending that the Secretary of Defense ensure that:
* EFV design, production, and mature software development capabilities are demonstrated before Milestone C;
* adequate resources are available to cover such demonstration and provide for risks; and
* the business case for EFV (including cost and expected capability), after including the above, still warrants continued investment.
We also recommend that the Secretary of Defense draw lessons learned from EFV and apply them to the Defense Acquisition University's curriculum for instructing program executives, managers, and their staffs. Such lessons might include understanding that attaining one knowledge point does not guarantee the attainment of the next one; the importance of having a sound business case for each phase of development; the right time to hold a critical design review; and the importance of allowing sufficient time to learn from testing.
Agency Comments and Our Evaluation:
In commenting on a draft of our report, DOD's Acting Director for Defense Systems concurred with our recommendations. In doing so, DOD stated that the Department currently plans to assess the readiness of the EFV program for a low-rate initial production decision within a year. This assessment will review the maturity of the EFV design, including software, its production readiness for low-rate initial production, and its demonstrated capability, as well as program costs and risks. Continued investment in EFV will be based on that information. The full text of the department's response is in appendix II. The Department notes that our best practices construct for production readiness is difficult to reconcile with its current acquisition production decision points.
World-class companies we have visited do, in fact, often have a limited production run that they use to manufacture a small number of production-representative assets; however, they do not make a decision to invest in the tooling necessary to ramp up to full production until after those assets have been tested by the customer and their critical manufacturing processes are in control. DOD's low-rate initial production decision reflects the decision to invest in all of the resources needed to achieve full-rate production. We believe this is too soon and that DOD would benefit from this lesson by focusing low-rate initial production on demonstrating the product and process, and by waiting until the full-rate production decision has been made to invest in additional resources, such as tooling, to ramp up.
We are sending copies of this report to the Secretary of Defense, Secretary of the Navy, and other interested parties. We will also provide copies to others on request. In addition, the report will be available at no charge on the GAO Web site at [Hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-4841. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix III.
Signed By: Paul L. Francis: Director: Acquisition and Sourcing Management.
[End of section]
Appendix I: Scope and Methodology:
To assess the current status of the EFV (particularly the status of the production decision), the factors that contributed to the current status, and future risks in the program, we interviewed key officials from DOD's Director, Operational Test and Evaluation; the Office of the Secretary of Defense's Program Analysis and Evaluation office; the U.S. Marine Corps; Isothermal Systems Research, Inc., in Washington, D.C.; and the 309TH Software Maintenance Group, in Ogden, Utah. We also interviewed the Direct Reporting Program Manager for the EFV and the prime contractor, General Dynamics Land Systems, in Woodbridge, Virginia. We examined and analyzed pertinent program documentation, including the Selected Acquisition Reports; Test and Evaluation Master Plan; Developmental Testing Schedule; Budget Justification documents; Program Management Plan; Acquisition Strategy Plan; DOD Operational Test and Evaluation reports; Operational Requirements Documents; and the Software Development Plan. We relied on previous GAO work as a framework for knowledge-based acquisition.
[End of section]
Appendix II: Comments from the Department of Defense:
Office of the Under Secretary of Defense: 3000 Defense Pentagon: Washington, DC 20301-3000: Acquisition, Technology and Logistics:
Mr. Paul L. Francis: Director: Acquisition and Sourcing Management: U.S. Government Accountability Office: Washington, D.C. 20548:
Dear Mr. Francis:
This is the Department of Defense (DoD) response to the GAO draft report GAO-06-349, "Defense Acquisitions: The Expeditionary Fighting Vehicle Encountered Difficulties in Design Demonstration and Faces Future Risks," dated March 28, 2006 (GAO Code 120447). The report recommends the Secretary of Defense ensure the Expeditionary Fighting Vehicle program business case is fully evaluated for demonstrated performance, expected capability, vehicle cost, and resourcing prior to Milestone C. It further recommends the Secretary of Defense draw lessons learned from the EFV program for use in Defense Acquisition University's curriculum. The Department concurs with both GAO recommendations.
The Department would like to note that the GAO best practices construct for production readiness is difficult to reconcile with the current Department acquisition production decision points. The uniqueness and magnitude of investment associated with defense acquisitions has resulted in both statutory and regulatory policies which require reviews of program acquisitions prior to major investment decisions--Low Rate Initial Production (LRIP) and Beyond LRIP (or full rate production). The LRIP decision authorizes production of assets for Initial Operational Test and Evaluation (IOT&E) and to support a ramp-up of production capability. The full rate production decision authorizes the procurement of the vast majority of the acquisition objective (90% or greater) and is based on the demonstrated capability of production assets in IOT&E and the demonstrated production capabilities. The GAO does not differentiate between these decision points and appears to evaluate acquisitions against full rate production readiness measures at LRIP--well prior to a time the Department is prepared to commit to the investments necessary to support a full rate production decision. Detailed comments on the report are enclosed.
Sincerely,
Signed By: Mark D. Schaeffer: Acting Director: Defense Systems:
Enclosure. As stated:
GAO DRAFT REPORT DATED MARCH 28, 2006 GAO-06-349 (GAO CODE 120447): "Defense Acquisitions: The Expeditionary Fighting Vehicle Encountered Difficulties In Design Demonstration And Faces Future Risks"
Department of Defense Comments to the GAO Recommendations:
Recommendation 1: The GAO recommended that the Secretary of Defense ensure that: (1) Expeditionary Fighting Vehicle (EFV) design, production, and software maturity are demonstrated before Milestone C; (2) adequate resources are available to cover such demonstration and provide for risks; and (3) the business case for EFV (including cost and expected capability), after including the above, still warrants continued investment. (p. 28/GAO Draft Report):
DOD Response: Concur. The Department currently plans to assess the readiness of the EFV program for a Low-Rate Initial Production (LRIP) decision within a year. This assessment will review the maturity of the EFV design, including software, its production readiness for LRIP, its demonstrated capability, as well as program costs and risks. Continued investment in EFV will be based on that information.
Recommendation 2: The GAO recommended that the Secretary of Defense draw lessons learned from EFV and apply them to the Defense Acquisition University's curriculum for instructing program executives, managers, and their staffs. (p. 28/GAO Draft Report):
DOD Response: Concur.
[End of section]
Appendix III: GAO Contact and Staff Acknowledgments:
GAO Contact: Paul Francis (202) 512-4841:
Acknowledgments: In addition to the contact named above, D. Catherine Baltzell, Assistant Director; Leon S. Gill; Danny Owens; Steven Stern; Martin G. Campbell; and John Krump made key contributions to this report.
FOOTNOTES
[1] Major defense acquisition programs are defined by DOD as those estimated as requiring an eventual total expenditure for research, development, test, and evaluation of more than $365 million or for procurement of more than $2.190 billion in fiscal year 2000 constant dollars.
[2] GAO, Tactical Aircraft: F/A-22 and JSF Acquisition Plans and Implications for Tactical Aircraft Modernization, GAO-05-519T (Washington, D.C.: April 6, 2005).
[3] Department of Defense Instruction 5000.2, Subject: Operation of the Defense Acquisition System (May 12, 2003).
[4] GAO, Defense Acquisitions: Assessments of Selected Major Weapon Programs, GAO-05-301 (Washington, D.C.: March 2005).
[5] GAO, Best Practices: DOD Training Can Do More to Help Weapon System Programs Implement Best Practices, GAO/NSIAD-99-206 (Washington, D.C.: March 1999).
[6] In 2003, GAO also reported that the existing amphibious assault vehicle needed attention due to aged equipment that needed upgrading. Military Readiness: DOD Needs to Reassess Program Strategy, Funding Priorities, and Risks for Selected Equipment, GAO-04-112 (Washington, D.C.: December 2003).
[7] In constant 2006 dollars, the December 2000 cost is $9.6 billion, for an increase of $3.1 billion, or 32 percent.
[8] A program's baseline is derived from its performance and schedule needs and the estimates of total program cost consistent with projected funding, and reflects the program's estimated total acquisition cost and schedule at the time the baseline is derived. Under certain circumstances, DOD will "rebaseline" a program--i.e., change its estimated cost and schedule so that goals more realistically reflect the program's current status. Rebaselining is useful and appropriate in many situations.
[9] Public Law 109-163.
[10] As measured by mean time (hours) between operational mission failures.
[11] GAO, Best Practices: Capturing Design and Manufacturing Knowledge Early Improves Acquisition Outcomes, GAO-02-701 (Washington, D.C.: July 15, 2002).
[12] Department of Defense Instruction 5000.2, Subject: Operation of the Defense Acquisition System (May 12, 2003).
[13] GAO, Best Practices: A More Constructive Test Approach Is Key to Better Weapon System Outcomes, GAO/NSIAD-00-199 (Washington, D.C.: July 31, 2000).
[14] Significant wave height is defined as the distance from the crest to the trough of the biggest one-third of the waves.
[15] Director of Operational Test and Evaluation's Fiscal Year 2005 Annual Report, December 2005.
[16] Department of Defense Instruction 5000.2, Subject: Operation of the Defense Acquisition System (May 12, 2003).
[17] GAO, DOD Acquisition Outcomes: A Case for Change, GAO-06-257T (Washington, D.C.: November 15, 2005).
[18] Department of Defense Instruction 5000.2, Subject: Operation of the Defense Acquisition System (May 12, 2003).
[19] GAO, Best Practices: A More Constructive Test Approach Is Key to Better Weapon System Outcomes, GAO/NSIAD-00-199 (Washington, D.C.: July 31, 2000).
[20] Department of Defense Instruction 5000.2, Subject: Operation of the Defense Acquisition System (May 12, 2003).
[21] GAO, Defense Acquisitions: Stronger Management Practices Are Needed to Improve DOD's Software-Intensive Weapon Acquisitions, GAO-04-393 (Washington, D.C.: March 1, 2004).
[22] GDLS now has level 5 certification.
[23] GAO, Missile Defense: Knowledge-Based Practices Are Being Adopted, but Risks Remain, GAO-03-441 (Washington, D.C.: April 30, 2003).
[24] Director, Operational Test and Evaluation's FY 2005 Annual Report, December 2005.
GAO's Mission:
The Government Accountability Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics. Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to e-mail alerts" under the "Order GAO Products" heading.
Order by Mail or Phone:
The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to: U.S. Government Accountability Office, 441 G Street NW, Room LM, Washington, D.C. 20548.
To order by Phone: Voice: (202) 512-6000: TDD: (202) 512-2537: Fax: (202) 512-6061.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact: Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov Automated answering system: (800) 424-5454 or (202) 512-7470.
Public Affairs:
Jeff Nelligan, managing director, NelliganJ@gao.gov (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, D.C. 20548.
