Air Traffic Control

System Management Capabilities Improved, but More Can Be Done to Institutionalize Improvements. GAO ID: GAO-04-901, August 20, 2004

Since 1981, the Federal Aviation Administration (FAA) has been working to modernize its aging air traffic control (ATC) system. Individual projects have suffered cost increases, schedule delays, and performance shortfalls of large proportions, leading GAO to designate the program a high-risk information technology initiative in 1995. Because the program remains a high-risk initiative, GAO was requested to assess FAA's progress in several information technology management areas. This report, one in a series responding to that request, has two objectives: (1) to evaluate FAA's capabilities for developing and acquiring software and systems on its ATC modernization program and (2) to assess the actions FAA has under way to improve these capabilities.

FAA has made progress in improving its capabilities for acquiring software-intensive systems, but some areas still need improvement. GAO had previously reported in 1997 that FAA's processes for acquiring software were ad hoc and sometimes chaotic. Focusing on four mission critical air traffic projects, GAO's current review assessed system and software management practices in numerous key areas such as project planning, risk management, and requirements development. GAO found that these projects were generally performing most of the desired practices: of the 900 individual practices evaluated, 83 percent were largely or fully implemented. The projects were generally strong in several areas such as project planning, requirements management, and identifying technical solutions. However, there were recurring weaknesses in the areas of measurement and analysis, quality assurance, and verification. These weaknesses hinder FAA from consistently and effectively managing its mission critical systems and increase the risk of cost overruns, schedule delays, and performance shortfalls. To improve its software and system management capabilities, FAA has undertaken a rigorous process improvement initiative. In response to earlier GAO recommendations, in 1999, FAA established a centralized process improvement office, which has worked to help FAA organizations and projects to improve processes through the use of a standard model, the integrated Capability Maturity Model. This model, which is a broad model that integrates multiple maturity models, is used to assess the maturity of FAA's software and systems capabilities. The projects that have adopted the model have demonstrated growth in the maturity of their processes, and more and more projects have adopted the model. However, the agency does not require the use of this process improvement method. 
To date, less than half of FAA's major ATC projects have used this method, and the recurring weaknesses we identified in our project-specific evaluations are due in part to the choices these projects were given in deciding whether and how to adopt this process improvement initiative. Further, as a result of reorganizing its ATC organizations into a performance-based organization, FAA is reconsidering prior policies, and it is not yet clear that process improvement will continue to be a priority. Without a strong senior-level commitment to process improvement and a consistent, institutionalized approach to implementing and evaluating it, FAA cannot ensure that key projects will continue to improve systems acquisition and development capabilities. As a result, FAA will continue to risk the project management problems--including cost overruns, schedule delays, and performance shortfalls--that have plagued past acquisitions.

GAO-04-901, Air Traffic Control: System Management Capabilities Improved, but More Can Be Done to Institutionalize Improvements:

This is the accessible text file for GAO report number GAO-04-901, entitled 'Air Traffic Control: System Management Capabilities Improved, but More Can Be Done to Institutionalize Improvements,' which was released on September 20, 2004.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

Report to Congressional Committees:

August 2004:

AIR TRAFFIC CONTROL: System Management Capabilities Improved, but More Can Be Done to Institutionalize Improvements:

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-04-901]:

GAO Highlights:

Highlights of GAO-04-901, a report to congressional committees:

Why GAO Did This Study:

Since 1981, the Federal Aviation Administration (FAA) has been working to modernize its aging air traffic control (ATC) system.
Individual projects have suffered cost increases, schedule delays, and performance shortfalls of large proportions, leading GAO to designate the program a high-risk information technology initiative in 1995. Because the program remains a high-risk initiative, GAO was requested to assess FAA's progress in several information technology management areas. This report, one in a series responding to that request, has two objectives: (1) to evaluate FAA's capabilities for developing and acquiring software and systems on its ATC modernization program and (2) to assess the actions FAA has under way to improve these capabilities.

What GAO Found:

FAA has made progress in improving its capabilities for acquiring software-intensive systems, but some areas still need improvement. GAO had previously reported in 1997 that FAA's processes for acquiring software were ad hoc and sometimes chaotic. Focusing on four mission critical air traffic projects, GAO's current review assessed system and software management practices in numerous key areas such as project planning, risk management, and requirements development. GAO found that these projects were generally performing most of the desired practices: of the 900 individual practices evaluated, 83 percent were largely or fully implemented. The projects were generally strong in several areas such as project planning, requirements management, and identifying technical solutions. However, there were recurring weaknesses in the areas of measurement and analysis, quality assurance, and verification. These weaknesses hinder FAA from consistently and effectively managing its mission critical systems and increase the risk of cost overruns, schedule delays, and performance shortfalls.

To improve its software and system management capabilities, FAA has undertaken a rigorous process improvement initiative. In response to earlier GAO recommendations, in 1999, FAA established a centralized process improvement office, which has worked to help FAA organizations and projects to improve processes through the use of a standard model, the integrated Capability Maturity Model. This model, which is a broad model that integrates multiple maturity models, is used to assess the maturity of FAA's software and systems capabilities. The projects that have adopted the model have demonstrated growth in the maturity of their processes, and more and more projects have adopted the model. However, the agency does not require the use of this process improvement method. To date, less than half of FAA's major ATC projects have used this method, and the recurring weaknesses we identified in our project-specific evaluations are due in part to the choices these projects were given in deciding whether and how to adopt this process improvement initiative. Further, as a result of reorganizing its ATC organizations into a performance-based organization, FAA is reconsidering prior policies, and it is not yet clear that process improvement will continue to be a priority. Without a strong senior-level commitment to process improvement and a consistent, institutionalized approach to implementing and evaluating it, FAA cannot ensure that key projects will continue to improve systems acquisition and development capabilities. As a result, FAA will continue to risk the project management problems--including cost overruns, schedule delays, and performance shortfalls--that have plagued past acquisitions.

What GAO Recommends:

GAO is making recommendations to the Secretary of Transportation to address specific weaknesses and to institutionalize FAA's process improvement initiatives by establishing a policy and plans for implementing and overseeing process improvement initiatives. In commenting on a draft of this report, agency officials generally agreed with GAO's recommendations.
They also provided technical corrections, which were incorporated as appropriate.

www.gao.gov/cgi-bin/getrpt?GAO-04-901.

To view the full product, including the scope and methodology, click on the link above. For more information, contact Dave Powner at (202) 512-9286 or pownerd@gao.gov.

[End of section]

Contents:

Letter:

Executive Summary:
Purpose:
Background:
Results in Brief:
Principal Findings:
Recommendations for Executive Action:
Agency Comments:

Chapter 1: Introduction:
Overview of ATC:
FAA's ATC Modernization Is a High-Risk Initiative:
Prior Report Noted Weaknesses in FAA's Software Acquisition Capabilities:
FAA Is Changing Its Approach to ATC Management:
The Capability Maturity Model Integration Provides a Means of Assessing an Organization's Ability to Manage Software and System Acquisition and Development:
Objectives, Scope, and Methodology:

Chapter 2: FAA Is Performing Most Project Planning Practices, but It Is Not Yet Fully Managing the Process:

Chapter 3: FAA Is Performing Most Project Monitoring and Control Practices, but It Is Not Yet Fully Managing the Process:

Chapter 4: FAA Is Performing Most Risk Management Practices, but It Is Not Yet Fully Managing the Process:

Chapter 5: FAA Is Performing Requirements Development Practices, but It Is Not Yet Fully Managing the Process:

Chapter 6: FAA Is Performing Requirements Management Practices, but It Is Not Yet Fully Managing the Process:

Chapter 7: FAA Is Performing Most Technical Solution Practices, but It Is Not Yet Fully Managing the Process:

Chapter 8: FAA Is Performing Product Integration Practices, but It Is Not Yet Fully Managing the Process:

Chapter 9: FAA Is Not Performing Key Verification Practices or Fully Managing the Process:

Chapter 10: FAA Is Performing Validation Practices, but It Is Not Yet Fully Managing the Process:

Chapter 11: FAA Is Performing Most Configuration Management Practices, but It Is Not Yet Fully Managing the Process:

Chapter 12: FAA Is Not Performing Key Process and Product Quality Assurance Practices or Managing the Process:

Chapter 13: FAA Is Not Performing Most Measurement and Analysis Practices or Managing the Process:

Chapter 14: FAA Is Performing Supplier Agreement Management Practices, but It Is Not Yet Fully Managing the Process:

Chapter 15: FAA Is Performing Deployment, Transition, and Disposal Practices, but It Is Not Yet Fully Managing the Process:

Chapter 16: FAA's Process Improvement Initiative Has Matured, but It Is Not Yet Institutionalized:
FAA's Process Improvement Initiative Has Matured:
FAA Has Not Yet Institutionalized Process Improvement in Its ATC Organizations:

Chapter 17: Conclusions and Recommendations:
Conclusions:
Recommendations for Executive Action:
Agency Comments:

Appendix:

Appendix I: GAO Contacts and Staff Acknowledgments:
GAO Contacts:
Staff Acknowledgments:

Tables:

Table 1: CMMI Process Areas:
Table 2: FAA Projects That GAO Assessed Using CMMI:
Table 3: Process Areas Assessed on Four ATC Projects:
Table 4: Four Projects' Appraisal Results in Project Planning:
Table 5: VSCS Project Planning: Detailed Findings on Level 1 Goals and Practices:
Table 6: VSCS Project Planning: Detailed Findings on Level 2 Goals and Practices:
Table 7: ERAM Project Planning: Detailed Findings on Level 1 Goals and Practices:
Table 8: ERAM Project Planning: Detailed Findings on Level 2 Goals and Practices:
Table 9: ITWS Project Planning: Detailed Findings on Level 1 Goals and Practices:
Table 10: ITWS Project Planning: Detailed Findings on Level 2 Goals and Practices:
Table 11: ASDE-X Project Planning: Detailed Findings on Level 1 Goals and Practices:
Table 12: ASDE-X Project Planning: Detailed Findings on Level 2 Goals and Practices:
Table 13: Four Projects' Appraisal Results in Project Monitoring and Control:
Table 14: VSCS Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices:
Table 15: VSCS Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices:
Table 16: ERAM Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices:
Table 17: ERAM Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices:
Table 18: ITWS Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices:
Table 19: ITWS Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices:
Table 20: ASDE-X Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices:
Table 21: ASDE-X Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices:
Table 22: Four Projects' Appraisal Results in Risk Management:
Table 23: VSCS Risk Management: Detailed Findings on Level 1 Goals and Practices:
Table 24: VSCS Risk Management: Detailed Findings on Level 2 Goals and Practices:
Table 25: ERAM Risk Management: Detailed Findings on Level 1 Goals and Practices:
Table 26: ERAM Risk Management: Detailed Findings on Level 2 Goals and Practices:
Table 27: ITWS Risk Management: Detailed Findings on Level 1 Goals and Practices:
Table 28: ITWS Risk Management: Detailed Findings on Level 2 Goals and Practices:
Table 29: ASDE-X Risk Management: Detailed Findings on Level 1 Goals and Practices:
Table 30: ASDE-X Risk Management: Detailed Findings on Level 2 Goals and Practices:
Table 31: Four Projects' Appraisal Results in Requirements Development:
Table 32: VSCS Requirements Development: Detailed Findings on Level 1 Goals and Practices:
Table 33: VSCS Requirements Development: Detailed Findings on Level 2 Goals and Practices:
Table 34: ERAM Requirements Development: Detailed Findings on Level 1 Goals and Practices:
Table 35: ERAM Requirements Development: Detailed Findings on Level 2 Goals and Practices:
Table 36: ITWS Requirements Development: Detailed Findings on Level 1 Goals and Practices:
Table 37: ITWS Requirements Development: Detailed Findings on Level 2 Goals and Practices:
Table 38: ASDE-X Requirements Development: Detailed Findings on Level 1 Goals and Practices:
Table 39: ASDE-X Requirements Development: Detailed Findings on Level 2 Goals and Practices:
Table 40: Four Projects' Appraisal Results in Requirements Management:
Table 41: VSCS Requirements Management: Detailed Findings on Level 1 Goals and Practices:
Table 42: VSCS Requirements Management: Detailed Findings on Level 2 Goals and Practices:
Table 43: ERAM Requirements Management: Detailed Findings on Level 1 Goals and Practices:
Table 44: ERAM Requirements Management: Detailed Findings on Level 2 Goals and Practices:
Table 45: ITWS Requirements Management: Detailed Findings on Level 1 Goals and Practices:
Table 46: ITWS Requirements Management: Detailed Findings on Level 2 Goals and Practices:
Table 47: ASDE-X Requirements Management: Detailed Findings on Level 1 Goals and Practices:
Table 48: ASDE-X Requirements Management: Detailed Findings on Level 2 Goals and Practices:
Table 49: Four Projects' Appraisal Results in Technical Solution:
Table 50: VSCS Technical Solution: Detailed Findings on Level 1 Goals and Practices:
Table 51: VSCS Technical Solution: Detailed Findings on Level 2 Goals and Practices:
Table 52: ERAM Technical Solution: Detailed Findings on Level 1 Goals and Practices:
Table 53: ERAM Technical Solution: Detailed Findings on Level 2 Goals and Practices:
Table 54: ITWS Technical Solution: Detailed Findings on Level 1 Goals and Practices:
Table 55: ITWS Technical Solution: Detailed Findings on Level 2 Goals and Practices:
Table 56: ASDE-X Technical Solution: Detailed Findings on Level 1 Goals and Practices:
Table 57: ASDE-X Technical Solution: Detailed Findings on Level 2 Goals and Practices:
Table 58: Three Projects' Appraisal Results in Product Integration:
Table 59: VSCS Product Integration: Detailed Findings on Level 1 Goals and Practices:
Table 60: VSCS Product Integration: Detailed Findings on Level 2 Goals and Practices:
Table 61: ITWS Product Integration: Detailed Findings on Level 1 Goals and Practices:
Table 62: ITWS Product Integration: Detailed Findings on Level 2 Goals and Practices:
Table 63: ASDE-X Product Integration: Detailed Findings on Level 1 Goals and Practices:
Table 64: ASDE-X Product Integration: Detailed Findings on Level 2 Goals and Practices:
Table 65: Four Projects' Appraisal Results in Verification:
Table 66: VSCS Verification: Detailed Findings on Level 1 Goals and Practices:
Table 67: VSCS Verification: Detailed Findings on Level 2 Goals and Practices:
Table 68: ERAM Verification: Detailed Findings on Level 1 Goals and Practices:
Table 69: ERAM Verification: Detailed Findings on Level 2 Goals and Practices:
Table 70: ITWS Verification: Detailed Findings on Level 1 Goals and Practices:
Table 71: ITWS Verification: Detailed Findings on Level 2 Goals and Practices:
Table 72: ASDE-X Verification: Detailed Findings on Level 1 Goals and Practices:
Table 73: ASDE-X Verification: Detailed Findings on Level 2 Goals and Practices:
Table 74: Four Projects' Appraisal Results in Validation:
Table 75: VSCS Validation: Detailed Findings on Level 1 Goals and Practices:
Table 76: VSCS Validation: Detailed Findings on Level 2 Goals and Practices:
Table 77: ERAM Validation: Detailed Findings on Level 1 Goals and Practices:
Table 78: ERAM Validation: Detailed Findings on Level 2 Goals and Practices:
Table 79: ITWS Validation: Detailed Findings on Level 1 Goals and Practices:
Table 80: ITWS Validation: Detailed Findings on Level 2 Goals and Practices:
Table 81: ASDE-X Validation: Detailed Findings on Level 1 Goals and Practices:
Table 82: ASDE-X Validation: Detailed Findings on Level 2 Goals and Practices:
Table 83: Four Projects' Appraisal Results in Configuration Management:
Table 84: VSCS Configuration Management: Detailed Findings on Level 1 Goals and Practices:
Table 85: VSCS Configuration Management: Detailed Findings on Level 2 Goals and Practices:
Table 86: ERAM Configuration Management: Detailed Findings on Level 1 Goals and Practices:
Table 87: ERAM Configuration Management: Detailed Findings on Level 2 Goals and Practices:
Table 88: ITWS Configuration Management: Detailed Findings on Level 1 Goals and Practices:
Table 89: ITWS Configuration Management: Detailed Findings on Level 2 Goals and Practices:
Table 90: ASDE-X Configuration Management: Detailed Findings on Level 1 Goals and Practices:
Table 91: ASDE-X Configuration Management: Detailed Findings on Level 2 Goals and Practices:
Table 92: Four Projects' Appraisal Results in Process and Product Quality Assurance:
Table 93: VSCS Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices:
Table 94: VSCS Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices:
Table 95: ERAM Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices:
Table 96: ERAM Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices:
Table 97: ITWS Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices:
Table 98: ITWS Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices:
Table 99: ASDE-X Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices:
Table 100: ASDE-X Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices:
Table 101: Four Projects' Appraisal Results in Measurement and Analysis:
Table 102: VSCS Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices:
Table 103: VSCS Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices:
Table 104: ERAM Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices:
Table 105: ERAM Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices:
Table 106: ITWS Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices:
Table 107: ITWS Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices:
Table 108: ASDE-X Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices:
Table 109: ASDE-X Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices:
Table 110: Two Projects' Appraisal Results in Supplier Agreement Management:
Table 111: ERAM Supplier Agreement Management: Detailed Findings on Level 1 Goals and Practices:
Table 112: ERAM Supplier Agreement Management: Detailed Findings on Level 2 Goals and Practices:
Table 113: ASDE-X Supplier Agreement Management: Detailed Findings on Level 1 Goals and Practices:
Table 114: ASDE-X Supplier Agreement Management: Detailed Findings on Level 2 Goals and Practices:
Table 115: Two Projects' Appraisal Results in Deployment, Transition, and Disposal:
Table 116: ITWS Deployment, Transition, and Disposal: Detailed Findings on Level 1 Goals and Practices:
Table 117: ITWS Deployment, Transition, and Disposal: Detailed Findings on Level 2 Goals and Practices:
Table 118: ASDE-X Deployment, Transition, and Disposal: Detailed Findings on Level 1 Goals and Practices:
Table 119: ASDE-X Deployment, Transition, and Disposal: Detailed Findings on Level 2 Goals and Practices:

Figures:

Figure 1: Percentage of Fully, Largely, Partially and Not Implemented Practices for All Four Projects:
Figure 2: Summary of the ATC over the Continental United States and Oceans:
Figure 3: CMMI Capability Levels:
Figure 4: Four Projects' Capability Levels in Project Planning:
Figure 5: Four Projects' Capability Levels in Project Monitoring and Control:
Figure 6: Four Projects' Capability Levels in Risk Management:
Figure 7: Four Projects' Capability Levels in Requirements Development:
Figure 8: Four Projects' Capability Levels in Requirements Management:
Figure 9: Four Projects' Capability Levels in Technical Solution:
Figure 10: Three Projects' Capability Levels in Product Integration:
Figure 11: Four Projects' Capability Levels in Verification:
Figure 12: Four Projects' Capability Levels in Validation:
Figure 13: Four Projects' Capability Levels in Configuration Management:
Figure 14: Four Projects' Capability Levels in Process and Product Quality Assurance:
Figure 15: Four Projects' Capability Levels in Measurement and Analysis:
Figure 16: Two Projects' Capability Levels in Supplier Agreement Management:
Figure 17: Two Projects' Capability Levels in Deployment, Transition, and Disposal:

Abbreviations:

ASDE-X: Airport Surface Detection Equipment-Model X:
ATC: air traffic control:
CMMI: Capability Maturity Model Integration:
ERAM: En Route Automation Modernization:
FAA: Federal Aviation Administration:
iCMM: integrated Capability Maturity Model:
ITWS: Integrated Terminal Weather System:
SEI: Software Engineering Institute:
VSCS: Voice Switching and Control System:

Letter:

August 20, 2004:

The Honorable Tom Davis:
Chairman, Committee on Government Reform:
House of Representatives:

The Honorable Adam Putnam:
Chairman, Subcommittee on Technology, Information Policy, Intergovernmental Relations and Census:
Committee on Government Reform:
House of Representatives:

The accompanying report is one in a series of reports responding to your request that we review the information technology management processes that the Federal Aviation Administration (FAA) uses to manage its air traffic control modernization. This report discusses our assessment of the department's capabilities to develop and acquire software and systems, as well as actions FAA has under way to improve these capabilities. We are making recommendations to the Secretary of Transportation to assist the agency in improving these capabilities.

As agreed with your staffs, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days from the date on the report. At that time, we will send copies of the report to the Secretary of Transportation, the Administrator of the Federal Aviation Administration, and other congressional committees.
Copies will also be made available at our Web site at [Hyperlink, http://www.gao.gov].

If you have any questions or wish to discuss the issues in this report, please contact David Powner at (202) 512-9286 or by e-mail at [Hyperlink, PownerD@gao.gov], or Keith Rhodes at (202) 512-6412 or by e-mail at [Hyperlink, RhodesK@gao.gov]. Additional GAO contact and staff acknowledgements are listed in appendix I.

Signed by:

David A. Powner:
Director, Information Technology Management Issues:

Signed by:

Keith Rhodes:
Chief Technologist:

[End of section]

Executive Summary:

Purpose:

In 1981, the Federal Aviation Administration (FAA) initiated an ambitious effort to modernize its aging air traffic control (ATC) system to enhance the safety, capacity, and efficiency of the national airspace system. Over the ensuing years, individual FAA modernization projects suffered cost increases, schedule delays, and performance shortfalls of large proportions. In 1995, GAO designated the modernization program as a high-risk information technology initiative because of its size, complexity, cost, and problem-plagued past. Recognizing that FAA's ATC modernization remained a high-risk information technology initiative in 2003,[Footnote 1] the Chairman of the House Committee on Government Reform and the Chairman of that Committee's Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census asked GAO to assess FAA's progress in several information technology management areas. This report is one in a series of reports responding to that request. The objectives of this review are (1) to evaluate FAA's capabilities for developing and acquiring software and systems on its ATC modernization program and (2) to assess the actions FAA has under way to improve these capabilities.

Background:

To accommodate forecasted growth in air traffic and to replace aging equipment, in 1981, FAA embarked on an ambitious ATC modernization program.
This modernization effort includes the acquisition of new radar, navigation, communication, and information processing systems, as well as new ATC facilities, and is expected to cost $51 billion through 2007. FAA has spent over $35 billion on this effort to date.

Quality software and systems are essential to FAA's ATC modernization program, and the quality of this software and these systems is governed by the value and maturity of the processes involved in acquiring, developing, managing, and maintaining them. Carnegie Mellon University's Software Engineering Institute, recognized for its expertise in software and system processes, has developed the Capability Maturity Model® Integration (CMMI[SM])[Footnote 2] and a CMMI appraisal methodology[Footnote 3] to evaluate, improve, and manage system and software development processes. The CMMI model and appraisal methodology provide a logical framework for measuring and improving key processes needed for achieving quality software and systems.

In brief, the CMMI appraisal methodology calls for assessing up to 25 different process areas--clusters of related activities such as project planning, requirements management, and quality assurance--by determining whether key practices are implemented and whether overarching goals are satisfied. Successful implementation of these practices and satisfaction of these goals result in the achievement of successive capability levels. CMMI capability levels range from 0 to 5, with level 0 meaning that the process is either not performed or partially performed; level 1 meaning that the basic process is performed; level 2 meaning that the process is managed; level 3 meaning that the process is defined throughout the organization; level 4 meaning that the process is quantitatively managed; and level 5 meaning that the process is optimized.
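The rollup described above--practice findings support goals, and satisfied goals determine a process area's capability level--can be sketched in a few lines of code. This is an illustrative model only: the function names, rating labels, and rollup rules below are assumptions made for exposition, not GAO's or SEI's actual appraisal tooling.

```python
# Illustrative sketch of a CMMI-style appraisal rollup for one process
# area. Ratings, thresholds, and names are assumptions for illustration.

CAPABILITY_LEVELS = {
    0: "not performed or partially performed",
    1: "performed",
    2: "managed",
    3: "defined",
    4: "quantitatively managed",
    5: "optimized",
}

def goal_satisfied(practice_ratings):
    """Treat a goal as satisfied only if every practice supporting it
    is largely or fully implemented."""
    return all(r in ("fully", "largely") for r in practice_ratings)

def capability_level(goals_by_level):
    """goals_by_level maps a level to a list of goals; each goal is a
    list of practice ratings. A level is achieved only if all of its
    goals, and all goals at every lower level, are satisfied."""
    achieved = 0
    for level in sorted(goals_by_level):
        if all(goal_satisfied(goal) for goal in goals_by_level[level]):
            achieved = level
        else:
            break
    return achieved

# Example: level 1 goals are satisfied, but one level 2 practice is
# only partially implemented, so the area is rated capability level 1.
example_ratings = {
    1: [["fully", "largely"], ["fully"]],
    2: [["fully", "partially"]],
}
level = capability_level(example_ratings)
print(level, "-", CAPABILITY_LEVELS[level])
```

This mirrors why a project can perform 83 percent of individual practices yet still rate level 0 or 1 in some areas: a single weak practice blocks the goal, and a single unsatisfied goal blocks the level.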
Using the CMMI model and appraisal methodology, GAO specialists, trained by the Software Engineering Institute, evaluated four FAA projects' capabilities in 12 to 14 process areas.[Footnote 4] (See ch. 1 of this report for more detailed information on GAO's scope and methodology.)

Results in Brief:

FAA has made progress in improving its capabilities for acquiring software-intensive systems, but there are still areas that need improvement. In a 1997 report, GAO found that FAA's processes for acquiring software were ad hoc and sometimes chaotic.[Footnote 5] Currently, however, the FAA projects that GAO reviewed are performing most of the desired system and software management practices. GAO appraised four different air traffic projects in 12 to 14 different process areas through capability level 2 and found that capability levels varied from level 0 to level 2. Despite these varied capability levels, many practices are being performed. Of the 900 individual practices evaluated, 83 percent were largely or fully implemented. The projects generally performed key practices in project planning, project monitoring and control, risk management, configuration management, requirements development, requirements management, validation, identifying technical solutions, deployment, and managing supplier agreements.[Footnote 6] Nevertheless, there were several recurring weaknesses in the areas of measurement and analysis, quality assurance, and verification. These weaknesses prevent FAA from consistently and effectively managing its mission critical systems and increase the risk of cost overruns, schedule delays, and performance shortfalls.

FAA's process improvement initiative has matured in rigor and scope, but more can be done to ensure continued improvement in the ATC organization. In response to GAO's prior recommendations, in 1999, FAA established a centralized process improvement office that reports to the Chief Information Officer.
This office developed an FAA integrated Capability Maturity Model (iCMM), a model that is similar to the CMMI but crafted to include international standards as well as multiple capability maturity models. Over the past several years, the process improvement office has revised and improved the iCMM model, trained over 7,000 FAA personnel in using the model, guided FAA organizations and projects in using the model, and evaluated the process maturity of projects that adopted the model. Further, since its inception, the number of projects that have adopted the iCMM process improvement model has grown, and the projects that adopted the model have demonstrated growth in the maturity of their processes.

However, the agency does not require the use of this process improvement method, nor has it mandated which core process areas to work on. The recurring weaknesses that GAO identified in its project-specific evaluations are due in part to the choices these projects were given in deciding whether and how to adopt process improvement initiatives. Further, as a result of reorganizing its ATC organizations into a performance-based organization,[Footnote 7] FAA is reconsidering prior policies, and it is not yet clear that process improvement will continue to be a priority. Without a strong senior-level commitment to process improvement, and a consistent, institutionalized approach to implementing and evaluating process improvement, FAA cannot ensure that key projects will continue to make progress in improving systems acquisition and development capabilities, and the agency cannot proceed to more advanced capability levels, which focus on organizationwide management of processes. As a result, FAA continues to risk the project management problems--including cost overruns, schedule delays, and performance shortfalls--that have plagued past acquisitions.
We are recommending to the Secretary of Transportation that the FAA address specific weaknesses and institutionalize its process improvement initiatives by establishing a policy and plans for implementing and overseeing process improvement initiatives. In commenting on a draft of this report, agency officials generally agreed with our recommendations. Officials also provided technical corrections, which we have incorporated into this report as appropriate. Principal Findings: FAA Projects Are Performing Many System Management and Acquisition Practices, but They Are Not Yet Fully Managing Most of the Processes: Since GAO's 1997 report, FAA has made considerable progress in improving its processes for acquiring and developing software and systems. In a prior report, GAO found that FAA's processes for acquiring software were ad hoc and sometimes chaotic. Currently, FAA projects are performing many of the desired system and software management practices. However, there are still areas for improvement. GAO appraised four mission critical air traffic projects in 12 to 14 different process areas and found that capability levels varied among projects and process areas. Each of the four projects achieved a rating of capability level 1, or "performed," in eight or more process areas but also achieved a capability level 0 in one or more process areas. Additionally, two projects achieved a rating of capability level 2, or "managed," in one or more process areas. Despite these varied capability scores, many practices are being performed. Of the 900 individual practices evaluated, 83 percent were largely or fully implemented. The other 17 percent of the practices were either partially or not implemented. 
Although there was not complete consistency among projects, the projects GAO reviewed were strongest in project planning, project monitoring and control, risk management, configuration management, requirements development, requirements management, validation, identifying technical solutions, deployment, and managing supplier agreements. However, there were several recurring weaknesses in the areas of measurement and analysis, quality assurance, and verification. These weaknesses inhibit FAA's ability to consistently achieve a managed level of performance. Figure 1 shows the number of practices that GAO found to be fully, largely, partially, and not implemented on all four projects. Figure 1: Percentage of Fully, Largely, Partially and Not Implemented Practices for All Four Projects: [See PDF for image] [End of figure] FAA's Process Improvement Initiative Has Matured, but It Is Not Yet Institutionalized: Over the past several years, FAA has made considerable progress in improving its processes for acquiring and developing software and systems, but more can be done to ensure continued improvement in the new Air Traffic Organization. Acting on GAO's prior recommendations, FAA established in 1999 a centralized process improvement office that reports directly to the Chief Information Officer. This office led the government in an effort to integrate various standards and models into a single maturity model, the iCMM. The Chief Information Officer's process improvement office also developed and sponsored iCMM-related training, and by late 2003, it had trained over 7,000 participants. This training ranges from overviews on how to use the model to more focused courses in such specific process areas as quality assurance, configuration management, and project management. The office also guides FAA organizations in using the model and leads appraisal teams in evaluating the process maturity of the projects and organizations that adopted the model. 
In addition to the Chief Information Officer-sponsored process improvement efforts, several of FAA's business areas, including the business areas with responsibility for ATC system acquisitions and operations, endorsed and set goals for process improvement activities using the iCMM. As a result, there has been a continuing growth over the years in the number of individual projects and umbrella organizations that adopted process improvement and the iCMM model. Additionally, these projects and organizations have demonstrated improvements in process maturity. In internal surveys, the programs and organizations pursuing process improvement have consistently reported enhanced productivity, higher quality, increased ability to predict schedules and resources, higher morale, and better communication and teamwork. However, more can be done to ensure continued improvement in the ATC organization. In order to achieve advanced system management capabilities and to gain the benefits of more mature processes, leading organizations institutionalize process improvement initiatives. Specifically, these organizations provide senior-level endorsement and enforcement of process improvement initiatives and strive for consistency in the adoption of process improvement efforts. In recent years, FAA's air traffic control-related organizations have encouraged process improvement through the iCMM model, but individual projects' use of the model has been voluntary. That is, individual project teams could determine whether or not they would implement the model and which process areas to work on. In addition, project teams could decide when, if ever, to seek an appraisal of their progress in implementing the model. As a result, individual projects are making uneven progress. To date, less than half of the projects listed in FAA's system architecture have sought appraisals in at least one process area. 
Further, the recurring weaknesses GAO identified in its project-specific evaluations are due in part to the leeway these projects were given in adopting process improvement initiatives. Weaknesses in key processes could lead to systems that do not meet the users' needs, exceed estimated costs, or take longer than expected to complete. Currently, FAA's commitment to its process improvement initiative is not certain because the agency is undergoing a reorganization of its ATC business areas. FAA recently moved its air traffic-related organizations into a single, performance-based organization under the direction of a Chief Operating Officer. The Chief Operating Officer is currently reevaluating all policies and processes, and he plans to issue new acquisition guidance in coming months. Unless the Chief Operating Officer demonstrates a strong commitment to process improvement and establishes a consistent, institutionalized approach to implementing and evaluating this process improvement, FAA risks taking a major step backwards in its capabilities for acquiring ATC systems and software. That is, FAA is unlikely to ensure that critical projects will continue to make progress in improving systems acquisition and development capabilities, and the agency will be unable to proceed to more advanced capability levels, which focus on organizationwide management of processes. Without such advancement in its capabilities, FAA will continue to be vulnerable to acquisition problems, including cost overruns, schedule delays, and performance shortfalls. Recommendations for Executive Action: Given the importance of software-intensive systems to FAA's ATC modernization program, GAO recommends that the Secretary of Transportation direct the FAA Administrator to ensure that the following five actions take place: * The four projects that GAO appraised should take action to fully implement the practices that GAO identified as not implemented or partially implemented. 
* The new Air Traffic Organization should establish:
* a policy requiring organizations and project teams to implement iCMM or equivalent process improvement initiatives and
* a plan for implementing iCMM or equivalent process improvement initiatives throughout the organization. This plan should specify a core set of process areas for all projects, clear criteria for when appraisals are warranted, and measurable goals and time frames.
* The Chief Information Officer's process improvement office, in consultation with the Air Traffic Organization, should develop a strategy for overseeing all air traffic projects' progress to successive levels of maturity; this strategy should specify measurable goals and time frames.
* To enforce process improvement initiatives, FAA investment decision makers should take a project's capability level in core process areas into consideration before approving new investments in the project.
Agency Comments: In oral comments on a draft of this report, Department of Transportation and FAA officials generally concurred with GAO's recommendations, and they indicated that FAA is pleased with the significant progress that it has achieved in improving the processes used to acquire software and systems. Further, these officials noted that FAA has already started implementing changes to address issues identified in the report. They said that progress is evident both in the improved scores, compared with the prior GAO study, and also in the way FAA functions on a day-to-day basis. For example, these officials explained that FAA is now working better as a team because the agency is using cross-organizational teams that effectively share knowledge and best practices for systems acquisition and management. Agency officials also noted that the constructive exchange of information with GAO was very helpful to them in achieving progress, and they emphasized their desire to maintain a dialog with GAO to facilitate continued progress. 
FAA officials also provided technical corrections, which we have incorporated into this report as appropriate. [End of section] Chapter 1: Introduction: The primary mission of the Federal Aviation Administration (FAA) is to provide a safe, secure, and efficient global aerospace system that contributes to national security and the promotion of U.S. aerospace safety. FAA's ability to fulfill this mission depends on the adequacy and reliability of the nation's air traffic control (ATC) systems--a vast network of computer hardware, software, and communications equipment. To accommodate forecasted growth in air traffic and to relieve the problems of aging ATC systems, FAA embarked on an ambitious ATC modernization program in 1981. FAA now estimates that it will spend about $51 billion to replace and modernize ATC systems through 2007. Our work over the years has chronicled many FAA problems in meeting ATC projects' cost, schedule, and performance goals.[Footnote 8] As a result of these issues as well as the tremendous cost, complexity, and mission criticality of the modernization program, we designated the program as a high-risk information technology initiative in 1995, and it has remained on our high-risk list since that time. Overview of ATC: Automated information processing and display, communication, navigation, surveillance, and weather resources permit air traffic controllers to view key information--such as aircraft location, aircraft flight plans, and prevailing weather conditions--and to communicate with pilots. These resources reside at, or are associated with, several ATC facilities--ATC towers, terminal radar approach control facilities, air route traffic control centers (en route centers), flight service stations, and the ATC System Command Center. Figure 2 shows a visual summary of ATC over the continental United States and oceans. 
Figure 2: Summary of the ATC over the Continental United States and Oceans: [See PDF for image] Note: TRACON is a Terminal Radar Approach Control Facility. ARINC is a private organization funded by the airlines and FAA to operate radio stations. [End of figure] FAA's ATC Modernization Is a High-Risk Initiative: Faced with growing air traffic and aging equipment, in 1981, FAA initiated an ambitious effort to modernize its ATC system. This effort involves the acquisition of new surveillance, data processing, navigation, and communications equipment, in addition to new facilities and support equipment. Initially, FAA estimated that its ATC modernization effort would cost $12 billion and could be completed over 10 years. Now, 2 decades and $35 billion later, FAA expects to need another $16 billion through 2007 to complete key projects, for a total cost of $51 billion. Over the past 2 decades, many of the projects that make up the modernization program have experienced substantial cost overruns, schedule delays, and significant performance shortfalls. Our work over the years has documented many of these shortfalls.[Footnote 9] As a result of these problems, as well as the tremendous cost, complexity, and mission criticality of the modernization program, we designated the program as a high-risk information technology initiative in 1995, and it has remained on our high-risk list since that time.[Footnote 10] Our work since the mid-1990s has pinpointed root causes of the modernization program's problems, including (1) immature software acquisition capabilities, (2) lack of a complete and enforced system architecture, (3) inadequate cost estimating and cost accounting practices, (4) an ineffective investment management process, and (5) an organizational culture that impaired the acquisition process. We have made over 30 recommendations to address these issues, and FAA has made substantial progress in addressing them. 
Nonetheless, in our most recent high-risk report, we noted that more remains to be done--and with FAA still expecting to spend billions on new ATC systems, these actions are as critical as ever. Prior Report Noted Weaknesses in FAA's Software Acquisition Capabilities: In March 1997, we reported that FAA's processes for acquiring software, the most costly and complex component of its ATC systems, were ad hoc, sometimes chaotic, and not repeatable across projects.[Footnote 11] We also reported that the agency lacked an effective management structure for ensuring software process improvement. As a result, the agency was at great risk of not delivering promised software capabilities on time and within budget. We recommended that FAA establish a Chief Information Officer organizational structure, as prescribed in the Clinger-Cohen Act, and assign responsibility for software acquisition process improvement to this organization. We also recommended several actions intended to help FAA improve its software acquisition capabilities by institutionalizing mature processes. These included developing a comprehensive plan for process improvement, allocating adequate resources to ensure that improvement efforts were implemented, and requiring that projects achieve a minimum level of maturity before being approved. FAA has implemented most of our recommendations. The agency established a Chief Information Officer position that reports directly to the administrator and gave this position responsibility for process improvement. The Chief Information Officer's process improvement office developed a strategy and led the way in developing an integrated framework for improving maturity in system acquisition, development, and engineering processes. Some of the business organizations within FAA, including the organizations responsible for ATC acquisitions and operations, adopted the framework and provided resources to process improvement efforts. 
FAA did not, however, implement our recommendation to require that projects achieve a minimum level of maturity before being approved. Officials reported that rather than establish arbitrary thresholds for maturity, FAA intended to evaluate process areas that were most critical or at greatest risk for each project during acquisition management reviews. FAA Is Changing Its Approach to ATC Management: Recent legislation and an executive order have led to major changes in the way that FAA manages its ATC mission. In April 2000, the Wendell H. Ford Aviation Investment and Reform Act for the 21st Century (Air-21) established the position of Chief Operating Officer for the ATC system. In December 2000, Executive Order 13180 instructed FAA to establish a performance-based organization known as the Air Traffic Organization and to have the Chief Operating Officer lead this organization under the authority of the FAA administrator. This order, amended in June 2002, called for the Air Traffic Organization to enhance the FAA's primary mission of ensuring the safety, security, and efficiency of the National Airspace System and further improve the delivery of air traffic services to the American public by reorganizing air traffic services and related offices into a performance-based, results-oriented organization.[Footnote 12] The order noted that as a performance-based organization, the Air Traffic Organization would be able to take better advantage of the unique procurement and personnel authorities currently used by FAA, as well as of the additional management reforms enacted by Congress under Air-21. In addition, the Air Traffic Organization is responsible for developing methods to accelerate ATC modernization, improving aviation safety related to ATC, and establishing strong incentives to agency managers for achieving results. 
In leading the new Air Traffic Organization, the Chief Operating Officer's responsibilities include establishing and maintaining organizational and individual goals, a 5-year strategic plan including ATC system mission and objectives, and a framework agreement with the Administrator to establish the new organization's relationships with other FAA organizations. In August 2003, the first Chief Operating Officer joined the agency and initiated a reorganization combining the separate ATC-related organizations and offices into the Air Traffic Organization. The Capability Maturity Model Integration Provides a Means of Assessing an Organization's Ability to Manage Software and System Acquisition and Development: An essential aspect of FAA's ATC modernization program is the quality of the software and systems involved, which is heavily influenced by the quality and maturity of the processes used to acquire, develop, manage, and maintain them. Carnegie Mellon University's Software Engineering Institute (SEI), recognized for its expertise in software and system processes, has developed the Capability Maturity Model Integration (CMMI) and a CMMI appraisal methodology[Footnote 13] to evaluate, improve, and manage system and software development and engineering processes. The CMMI model and appraisal methodology provide a logical framework for measuring and improving key processes needed for achieving high-quality software and systems. The model can help an organization set process improvement objectives and priorities and improve processes; the model can also provide guidance for ensuring stable, capable, and mature processes. 
According to SEI, organizations that implement such process improvements can achieve better project cost and schedule performance and higher quality products.[Footnote 14] In brief, the CMMI model identifies 25 process areas--clusters of related practices that, when performed collectively, satisfy a set of goals that are considered important for making significant improvements in that area. Table 1 describes these process areas. Table 1: CMMI Process Areas: CMMI process area: Project planning; Purpose: Establish and maintain plans that define project activities. CMMI process area: Project monitoring and control; Purpose: Provide an understanding of the project's progress so that appropriate corrective actions can be taken when the project's performance deviates significantly from the plan. CMMI process area: Supplier agreement management; Purpose: Manage the acquisition of products from suppliers for which there exists a formal agreement. CMMI process area: Integrated project management for integrated product and process development; Purpose: Establish and manage the project and the involvement of the relevant stakeholders, according to an integrated and defined process that is tailored from the organization's set of standard processes. For integrated product and process development, integrated project management also covers the establishment of a shared vision for the project and a team structure for integrated teams that will carry out the objectives of the project. CMMI process area: Risk management; Purpose: Identify potential problems before they occur, so that risk-handling activities may be planned and invoked as needed across the life of the product or project to mitigate adverse impacts on achieving objectives. CMMI process area: Integrated teaming; Purpose: Form and sustain an integrated team for the development of work products. 
CMMI process area: Integrated supplier management; Purpose: Proactively identify sources of products that may be used to satisfy the project's requirements and to manage selected suppliers while maintaining a cooperative project-supplier relationship. CMMI process area: Quantitative project management; Purpose: Quantitatively manage the project's defined process to achieve the project's established quality and process-performance objectives. CMMI process area: Requirements management; Purpose: Manage the requirements of the project's products and product components and to identify inconsistencies between those requirements and the project's plans and work products. CMMI process area: Requirements development; Purpose: Produce and analyze customer, product, and product-component requirements. CMMI process area: Technical solution; Purpose: Design, develop, and implement solutions to requirements. Solutions, designs, and implementations encompass products, product components, and product-related life-cycle processes either singly or in combinations as appropriate. CMMI process area: Product integration; Purpose: Assemble the product from the product components; ensure that the product, as integrated, functions properly; and deliver the product. CMMI process area: Verification; Purpose: Ensure that selected work products meet their specified requirements. CMMI process area: Validation; Purpose: Demonstrate that a product or product component fulfills its intended use when placed in its intended environment. CMMI process area: Configuration management; Purpose: Establish and maintain the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits. CMMI process area: Process and product quality assurance; Purpose: Provide staff and management with objective insight into processes and associated work products. 
CMMI process area: Measurement and analysis; Purpose: Develop and sustain a measurement capability that is used to support management information needs. CMMI process area: Decision analysis and resolution; Purpose: Analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established criteria. CMMI process area: Organizational environment for integration; Purpose: Provide an integrated product and process development infrastructure and manage people for integration. CMMI process area: Causal analysis and resolution; Purpose: Identify causes of defects and other problems and take action to prevent them from occurring in the future. CMMI process area: Organizational process focus; Purpose: Plan and implement organizational process improvement based on a thorough understanding of the current strengths and weaknesses of the organization's processes and process assets. CMMI process area: Organizational process definition; Purpose: Establish and maintain a usable set of organizational process assets. CMMI process area: Organizational training; Purpose: Develop the skills and knowledge of people so they can perform their roles effectively and efficiently. CMMI process area: Organizational process performance; Purpose: Establish and maintain a quantitative understanding of the performance of the organization's set of standard processes in support of quality and process-performance objectives, and provide the process performance data, baselines, and models to quantitatively manage the organization's projects. CMMI process area: Organizational innovation and deployment; Purpose: Select and deploy incremental and innovative improvements that measurably improve the organization's processes and technologies. The improvements support the organization's quality and process-performance objectives as derived from the organization's business objectives. Source: SEI. 
[End of table] The CMMI model provides two alternative ways to view these process areas. One way, called continuous representation, focuses on improving capabilities in individual process areas. The second way, called staged representation, groups process areas together and focuses on achieving increased maturity levels by improving the group of process areas. The CMMI appraisal methodology calls for assessing process areas by determining whether the key practices are implemented and whether the overarching goals are satisfied. Under continuous representation, successful implementation of these practices and satisfaction of these goals result in the achievement of successive capability levels in a selected process area. CMMI capability levels range from 0 to 5, with level 0 meaning that the process is either not performed or partially performed; level 1 meaning that the basic process is performed; level 2 meaning that the process is managed; level 3 meaning that the process is defined throughout the organization; level 4 meaning that the process is quantitatively managed; and level 5 meaning that the process is optimized. Figure 3 provides details on CMMI capability levels. Figure 3: CMMI Capability Levels: [See PDF for image] [End of figure] Objectives, Scope, and Methodology: The Chairman, House Committee on Government Reform, and the Chairman of that Committee's Subcommittee on Technology, Information Policy, Intergovernmental Relations and the Census requested that we evaluate FAA's software and system development processes used to manage its ATC modernization. Our objectives were (1) to evaluate FAA's capabilities for developing and acquiring software and systems on its ATC modernization program and (2) to assess the actions FAA has under way to improve these capabilities. 
To evaluate FAA's capabilities for developing and acquiring software and systems, we applied the CMMI model (continuous representation) and its related appraisal methodology to four FAA projects. Our appraisers were all SEI-trained software and information systems specialists. In addition, we employed SEI-trained consultants as advisors on our first evaluation to ensure proper application of the model and appraisal methodology. In consultation with FAA officials, we selected four FAA projects with high impact, visibility, and cost, which represented different air traffic domains and reflected different stages of life cycle development. The projects included the Voice Switching and Control System (VSCS), the Integrated Terminal Weather System (ITWS), the En Route Automation Modernization (ERAM) project, and the Airport Surface Detection Equipment-Model X (ASDE-X). The four projects are described in table 2. Table 2: FAA Projects That GAO Assessed Using CMMI: Systems: Voice Switching and Control System (VSCS); Description: VSCS is a voice communication system that is in operation at 21 Air Route Traffic Control Centers in the continental United States and Alaska. VSCS provides air traffic controllers at these locations with air-to-ground and ground-to-ground voice communication capability. The project's acquisition cost was about $1.3 billion, and it has been fully deployed and operational for over 9 years. FAA is currently working to refresh the system's technologies and to extend the life of the system. Systems: En Route Automation Modernization (ERAM); Description: ERAM is a critical effort to replace existing software and hardware in the En Route air traffic control automation (Host) computer system, its backup system, and other associated interfaces, communications, and support infrastructure. The contract was awarded in 2002 to Lockheed Martin. ERAM is one of the most expensive and software-intensive ATC projects. 
The project is estimated to cost $2.1 billion, with deployment at 27 facilities scheduled to begin in 2009 and to end by 2010. Systems: Integrated Terminal Weather System (ITWS); Description: ITWS is an automated weather information system intended to help traffic managers make efficient use of airport runways in all kinds of weather. It is expected to acquire and integrate weather data from multiple sensors and provide traffic managers with a graphic display of weather information. The project's cost is estimated at $286.1 million, with deployment scheduled to begin in 2003 and end by 2008. Systems: Airport Surface Detection Equipment-Model X (ASDE-X); Description: ASDE-X is a radar system that provides surveillance to prevent runway incursions at a large number of airports. It is an important safety initiative, initially designed to provide a low-cost alternative to FAA's existing ASDE-3 radar systems. The project's cost is estimated at $505.2 million, with deployment scheduled to take place between 2003 and 2007. Source: FAA. [End of table] In conjunction with FAA's process improvement organization, we identified relevant CMMI process areas for each appraisal. In addition, because system deployment is an important aspect of FAA systems management that is not included in CMMI, we used the deployment, transition, and disposal process area from FAA's integrated Capability Maturity Model, version 2. For consistency, we merged FAA's criteria with SEI's framework and added the standard goals and practices needed to achieve capability level 2. In selected cases, we did not review a certain process area because it was not relevant to the current stage of a project's life cycle. For example, we did not evaluate supplier agreement management or deployment on VSCS because the system is currently in operation, and these process areas are no longer applicable to this system. Table 3 displays the CMMI process areas that we reviewed for each project. 
Table 3: Process Areas Assessed on Four ATC Projects: Process areas: Project planning; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Project monitoring and control; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Risk management; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Requirements development; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Requirements management; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Technical solution; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Product integration; Projects: VSCS: Yes; Projects: ERAM: No; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Verification; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Validation; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Configuration management; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Process and product quality assurance; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Measurement and analysis; Projects: VSCS: Yes; Projects: ERAM: Yes; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Process areas: Supplier agreement management; Projects: VSCS: No; Projects: ERAM: Yes; Projects: ITWS: No; Projects: ASDE-X: Yes. Process areas: Deployment, transition, and disposal; Projects: VSCS: No; Projects: ERAM: No; Projects: ITWS: Yes; Projects: ASDE-X: Yes. Source: GAO. 
[End of table] For each process area reviewed, we evaluated project-specific documentation and interviewed project officials to determine whether key practices were implemented and goals were achieved. In accordance with CMMI guidance, we characterized practices as fully implemented, largely implemented, partially implemented, or not implemented, and characterized goals as satisfied or unsatisfied. On the basis of these practice and goal ratings, we then determined whether successive capability levels were achieved. According to the CMMI appraisal method, practices must be largely or fully implemented in order for a goal to be satisfied. Further, all goals must be satisfied in order to achieve a capability level. In order to achieve advanced capability levels, all preceding capability levels must be achieved. For example, a prerequisite for level 2 is the achievement of level 1. As agreed with FAA process improvement officials, we evaluated the projects through capability level 2.[Footnote 15] Consistent with the CMMI appraisal methodology, we validated our findings by sharing preliminary observations with each project team so that it could provide additional documentation or information as warranted. To assess the actions FAA has under way to improve its system and software acquisition and development processes, we evaluated process improvement strategies and plans. We also evaluated the progress the agency has made in expanding its process improvement initiative, both in the maturity of the model and in its acceptance by project teams. In addition, we interviewed officials from the offices of the Chief Information Officer and the Chief Operating Officer to determine the effect current changes in the ATC organization could have on the process improvement initiatives. The Department of Transportation and FAA provided oral comments on a draft of this report. These comments are presented in chapter 17. 
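The appraisal roll-up rules described above (a goal is satisfied only if every practice is largely or fully implemented; a capability level is achieved only if all of its goals are satisfied and every preceding level is achieved) can be sketched in code. This is an illustrative sketch only: the rating scale and roll-up logic come from the CMMI appraisal method as summarized in this report, but the function names and sample data are hypothetical, not FAA's or SEI's actual appraisal tooling.

```python
# Illustrative sketch of the CMMI appraisal roll-up described above.
# Rating scale and roll-up rules are from the report; names and sample
# data are hypothetical.

FULLY, LARGELY, PARTIALLY, NOT_IMPL = "fully", "largely", "partially", "not"

def goal_satisfied(practice_ratings):
    """A goal is satisfied only if every practice is largely or fully implemented."""
    return all(r in (FULLY, LARGELY) for r in practice_ratings)

def capability_level(goals_by_level):
    """Roll goal results up to a capability level.

    goals_by_level maps a level (1, 2, ...) to its goals, each goal being
    a list of practice ratings. A level is achieved only when all of its
    goals are satisfied and every preceding level has been achieved.
    """
    achieved = 0
    for level in sorted(goals_by_level):
        if all(goal_satisfied(goal) for goal in goals_by_level[level]):
            achieved = level
        else:
            break  # a failed level blocks all higher levels
    return achieved

# VSCS-like case: one level 1 practice is only partially implemented,
# so the project does not achieve even capability level 1.
assert capability_level({1: [[FULLY, PARTIALLY], [FULLY, LARGELY]]}) == 0

# ITWS-like case: level 1 is satisfied, but a partially implemented
# level 2 practice leaves the project at capability level 1.
assert capability_level({1: [[FULLY, LARGELY]], 2: [[FULLY, PARTIALLY]]}) == 1
```

The sketch makes the reason for the chapter's findings concrete: a single partially implemented practice is enough to leave a goal unsatisfied, and a single unsatisfied goal caps the capability level, which is why projects with mostly strong practices still did not achieve level 2.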
We performed our work from September 2003 through July 2004 in accordance with generally accepted government auditing standards. [End of section] Chapter 2: FAA Is Performing Most Project Planning Practices, but It Is Not Yet Fully Managing the Process: The purpose of project planning is to establish and maintain plans that define the project activities. This process area involves developing and maintaining a plan, interacting with stakeholders, and obtaining commitment to the plan. As figure 4 shows, three of the four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. The fourth project would have achieved level 1 if it had performed one more practice (see the overview in table 4 for details). None of the four projects satisfied all criteria for the "managing" capability level (level 2). While all four projects had differing weaknesses that contributed to this result, common weaknesses across most of the projects occurred in the areas of monitoring and controlling the project planning process and in ensuring quality assurance of the process. As a result of these weaknesses, FAA is exposed to increased risks that projects will not meet cost, schedule, or performance goals and that projects will not meet mission needs. Looked at another way, of the 96 practices we evaluated in this process area, FAA projects had 88 practices that were fully or largely implemented and 8 practices that were partially or not implemented. Figure 4: Four Projects' Capability Levels in Project Planning: [See PDF for image] [End of figure] Table 4: Four Projects' Appraisal Results in Project Planning: Needed to achieve capability level 1: Goal: Estimates of project planning parameters are established and maintained; VSCS: Goal not satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. 
Desired practice: Establish a top-level work breakdown structure to estimate the scope of the project; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain estimates of the attributes of the work products and tasks; VSCS: Practice partially implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Define the project life-cycle phases upon which to scope the planning effort; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Desired practice: Estimate the project effort and cost for the work products and tasks based on estimation rationale; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: A project plan is established and maintained as the basis for managing the project; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Establish and maintain the project's budget and schedule; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Identify and analyze project risks; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Plan for the management of project data; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Plan for necessary resources to perform the project; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. 
Desired practice: Plan for knowledge and skills needed to perform the project; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Plan the involvement of identified stakeholders; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the overall project plan content; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: Commitments to the project plan are established and maintained; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Review all plans that affect the project to understand project commitments; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Reconcile the project plan to reflect available and estimated resources; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Obtain commitment from relevant stakeholders responsible for performing and supporting plan execution; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: No; ERAM: Yes; ITWS: Yes; ASDE-X: Yes. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. 
Desired practice: Establish and maintain an organizational policy for planning and performing the project planning process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the project planning process; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the project planning process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project planning process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the project planning process as needed; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the project planning process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the project planning process as planned; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Monitor and control the project planning process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice not implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Objectively evaluate adherence of the project planning process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the project planning process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. Additional details on each project's appraisal results at successive capability levels are provided in tables 5 through 12. Specifically, tables 5 and 6 provide results for VSCS; tables 7 and 8 provide results for ERAM; tables 9 and 10 provide results for ITWS; and tables 11 and 12 provide results for ASDE-X. [End of table] Table 5: VSCS Project Planning: Detailed Findings on Level 1 Goals and Practices: Goal: Estimates of project planning parameters are established and maintained; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish a top-level work breakdown structure to estimate the scope of the project; Rating: Fully implemented. 
Desired practice: Establish and maintain estimates of the attributes of the work products and tasks; Rating: Partially implemented; Comment: The project team has established estimates of the work products; however, it does not maintain information on attributes, such as size, function points, requirements, inputs, and outputs. Desired practice: Define the project life-cycle phases upon which to scope the planning effort; Rating: Fully implemented. Desired practice: Estimate the project effort and cost for the work products and tasks based on estimation rationale; Rating: Largely implemented; Comment: The project team estimates the project effort and cost for the work products and tasks based on subject matter expert opinion, but it does not use objective rationale or historical data. Goal: A project plan is established and maintained as the basis for managing the project; Rating: Satisfied. Desired practice: Establish and maintain the project's budget and schedule; Rating: Fully implemented. Desired practice: Identify and analyze project risks; Rating: Fully implemented. Desired practice: Plan for the management of project data; Rating: Fully implemented. Desired practice: Plan for necessary resources to perform the project; Rating: Fully implemented. Desired practice: Plan for knowledge and skills needed to perform the project; Rating: Largely implemented; Comment: Project personnel stated that knowledge and skills to perform the project are planned for; however, this information is not documented. Desired practice: Plan the involvement of identified stakeholders; Rating: Fully implemented. Desired practice: Establish and maintain the overall project plan content; Rating: Fully implemented. Goal: Commitments to the project plan are established and maintained; Rating: Satisfied. Desired practice: Review all plans that affect the project to understand project commitments; Rating: Fully implemented. 
Desired practice: Reconcile the project plan to reflect available and estimated resources; Rating: Fully implemented. Desired practice: Obtain commitment from relevant stakeholders responsible for performing and supporting plan execution; Rating: Fully implemented. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 6: VSCS Project Planning: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project planning process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project planning process; Rating: Largely implemented; Comment: The project team conducted the activities required for the project planning process; however, it did not have a documented plan for performing the project planning process. For future use, the project team has created a handbook that addresses the elements of a plan. Desired practice: Provide adequate resources for performing the project planning process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project planning process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the project planning process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the project planning process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the project planning process as planned; Rating: Fully implemented. 
Desired practice: Monitor and control the project planning process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the project planning process against the plan. Desired practice: Objectively evaluate adherence of the project planning process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the project planning process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 7: ERAM Project Planning: Detailed Findings on Level 1 Goals and Practices: Goal: Estimates of project planning parameters are established and maintained; Rating: Satisfied. Desired practice: Establish a top-level work breakdown structure to estimate the scope of the project; Rating: Fully implemented. Desired practice: Establish and maintain estimates of the attributes of the work products and tasks; Rating: Fully implemented. Desired practice: Define the project life-cycle phases upon which to scope the planning effort; Rating: Fully implemented. Desired practice: Estimate the project effort and cost for the work products and tasks based on estimation rationale; Rating: Fully implemented. Goal: A project plan is established and maintained as the basis for managing the project; Rating: Satisfied. Desired practice: Establish and maintain the project's budget and schedule; Rating: Fully implemented. Desired practice: Identify and analyze project risks; Rating: Fully implemented. Desired practice: Plan for the management of project data; Rating: Fully implemented. Desired practice: Plan for necessary resources to perform the project; Rating: Fully implemented. Desired practice: Plan for knowledge and skills needed to perform the project; Rating: Fully implemented. 
Desired practice: Plan the involvement of identified stakeholders; Rating: Fully implemented. Desired practice: Establish and maintain the overall project plan content; Rating: Fully implemented. Goal: Commitments to the project plan are established and maintained; Rating: Satisfied. Desired practice: Review all plans that affect the project to understand project commitments; Rating: Fully implemented. Desired practice: Reconcile the project plan to reflect available and estimated resources; Rating: Largely implemented; Comment: The project team has processes in place to reconcile the project plan to reflect available and estimated resources; however, the plan has not been reconciled to date. Desired practice: Obtain commitment from relevant stakeholders responsible for performing and supporting plan execution; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 8: ERAM Project Planning: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices is not implemented and another is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project planning process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project planning process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the project planning process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project planning process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the project planning process as needed; Rating: Fully implemented. 
Desired practice: Place designated work products of the project planning process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the project planning process as planned; Rating: Fully implemented. Desired practice: Monitor and control the project planning process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the project planning process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the project planning process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining the quality assurance process that is expected to objectively evaluate adherence of the project planning process to the process description and standards and to address noncompliance. Desired practice: Review the activities, status, and results of the project planning process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 9: ITWS Project Planning: Detailed Findings on Level 1 Goals and Practices: Goal: Estimates of project planning parameters are established and maintained; Rating: Satisfied. Desired practice: Establish a top-level work breakdown structure to estimate the scope of the project; Rating: Fully implemented. Desired practice: Establish and maintain estimates of the attributes of the work products and tasks; Rating: Fully implemented. 
Desired practice: Define the project life-cycle phases upon which to scope the planning effort; Rating: Largely implemented; Comment: The project team defined the project life-cycle phases upon which to scope the planning effort; however, the project team has not documented a specific project life cycle for this project. Desired practice: Estimate the project effort and cost for the work products and tasks based on estimation rationale; Rating: Fully implemented. Goal: A project plan is established and maintained as the basis for managing the project; Rating: Satisfied. Desired practice: Establish and maintain the project's budget and schedule; Rating: Fully implemented. Desired practice: Identify and analyze project risks; Rating: Fully implemented. Desired practice: Plan for the management of project data; Rating: Fully implemented. Desired practice: Plan for necessary resources to perform the project; Rating: Fully implemented. Desired practice: Plan for knowledge and skills needed to perform the project; Rating: Fully implemented. Desired practice: Plan the involvement of identified stakeholders; Rating: Fully implemented. Desired practice: Establish and maintain the overall project plan content; Rating: Fully implemented. Goal: Commitments to the project plan are established and maintained; Rating: Satisfied. Desired practice: Review all plans that affect the project to understand project commitments; Rating: Fully implemented. Desired practice: Reconcile the project plan to reflect available and estimated resources; Rating: Fully implemented. Desired practice: Obtain commitment from relevant stakeholders responsible for performing and supporting plan execution; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. 
[End of table] Table 10: ITWS Project Planning: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices are partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project planning process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project planning process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the project planning process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project planning process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the project planning process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the project planning process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the project planning process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the project planning process as planned; Rating: Fully implemented. 
Desired practice: Monitor and control the project planning process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: The project team collects contractor cost and schedule data, but it does not monitor and control the project planning process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the project planning process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the project planning process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the project planning process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 11: ASDE-X Project Planning: Detailed Findings on Level 1 Goals and Practices: Goal: Estimates of project planning parameters are established and maintained; Rating: Satisfied. Desired practice: Establish a top-level work breakdown structure to estimate the scope of the project; Rating: Fully implemented. Desired practice: Establish and maintain estimates of the attributes of the work products and tasks; Rating: Fully implemented. Desired practice: Define the project life-cycle phases upon which to scope the planning effort; Rating: Fully implemented. Desired practice: Estimate the project effort and cost for the work products and tasks based on estimation rationale; Rating: Fully implemented. Goal: A project plan is established and maintained as the basis for managing the project; Rating: Satisfied. Desired practice: Establish and maintain the project's budget and schedule; Rating: Fully implemented. 
Desired practice: Identify and analyze project risks; Rating: Fully implemented. Desired practice: Plan for the management of project data; Rating: Largely implemented; Comment: The project team plans for the management of data, but internal documents, such as the project plan, have yet to be placed under formal configuration management control. Desired practice: Plan for necessary resources to perform the project; Rating: Largely implemented; Comment: Although the project team provides resources to perform the project, such as staff and tools, there is no evidence that these are managed and controlled. Desired practice: Plan for knowledge and skills needed to perform the project; Rating: Fully implemented. Desired practice: Plan the involvement of identified stakeholders; Rating: Fully implemented. Desired practice: Establish and maintain the overall project plan content; Rating: Fully implemented. Goal: Commitments to the project plan are established and maintained; Rating: Satisfied. Desired practice: Review all plans that affect the project to understand project commitments; Rating: Fully implemented. Desired practice: Reconcile the project plan to reflect available and estimated resources; Rating: Fully implemented. Desired practice: Obtain commitment from relevant stakeholders responsible for performing and supporting plan execution; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 12: ASDE-X Project Planning: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project planning process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project planning process; Rating: Fully implemented. 
Desired practice: Provide adequate resources for performing the project planning process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project planning process; Rating: Largely implemented; Comment: The project team has assigned responsibility for the project planning process to various groups, but it does not assign duties to individuals. Desired practice: Train the people performing or supporting the project planning process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the project planning process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated contractor-related work products of the project planning process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the project planning process as planned; Rating: Fully implemented. Desired practice: Monitor and control the project planning process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors and controls the project planning process against the plan for performing the process; however, the metrics are collected by the contractor, and the project team does not maintain, record, or track the metrics on a monthly basis. 
Desired practice: Objectively evaluate adherence of the project planning process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate the adherence of the project planning process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the project planning process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 3: FAA Is Performing Most Project Monitoring and Control Practices, but It Is Not Yet Fully Managing the Process: The purpose of project monitoring and control is to provide an understanding of the project's progress so that appropriate corrective actions can be taken when the project's performance deviates significantly from the plan. Key activities include monitoring activities, communicating status, taking corrective action, and determining progress. As shown in figure 5, three of the four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. The fourth project would have achieved level 1 if it had performed one more practice (see the overview in table 13 for details). None of the four projects satisfied all criteria for the "managing" capability level (level 2). While the projects had differing weaknesses that contributed to this result, a common weakness across most of the projects occurred in the area of ensuring quality assurance of the process. As a result of this weakness, FAA is exposed to increased risks that projects will not meet cost, schedule, or performance goals and that projects will not meet mission needs. 
Looked at another way, of the 80 practices we evaluated in this process area, FAA projects had 74 practices that were fully or largely implemented and 6 practices that were partially or not implemented. Figure 5: Four Projects' Capability Levels in Project Monitoring and Control: [See PDF for image] [End of figure] Table 13: Four Projects' Appraisal Results in Project Monitoring and Control: Goal: Actual performance and progress of the project are monitored against the project plan; VSCS: Goal not satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Monitor the actual values of the project planning parameters against the project plan; VSCS: Practice partially implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor commitments against those identified in the project plan; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor risks against those identified in the project plan; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor the management of project data against the project plan; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Desired practice: Monitor stakeholder involvement against the project plan; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Periodically review the project's progress, performance, and issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Review the accomplishments and results of the project at selected project milestones; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: Corrective actions are managed to closure when the project's performance or results deviate significantly from the plan; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Collect and analyze the issues and determine the corrective actions necessary to address the issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Take corrective action on identified issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Manage corrective actions to closure; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: No; ERAM: Yes; ITWS: Yes; ASDE-X: Yes. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the project monitoring and control process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the project monitoring and control process; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Provide adequate resources for performing the project monitoring and control process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project monitoring and control process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the project monitoring and control process as needed; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the project monitoring and control process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the project monitoring and control process as planned; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the project monitoring and control process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. 
Desired practice: Objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the project monitoring and control process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 14 through 21. Specifically, tables 14 and 15 provide results for VSCS; tables 16 and 17 provide results for ERAM; tables 18 and 19 provide results for ITWS; and tables 20 and 21 provide results for ASDE-X. Table 14: VSCS Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices: Goal: Actual performance and progress of the project are monitored against the project plan; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Monitor the actual values of the project planning parameters against the project plan; Rating: Partially implemented; Comment: The project team discusses project parameters at various meetings. However, it does not monitor the actual values of the parameters against the project plan. Desired practice: Monitor commitments against those identified in the project plan; Rating: Fully implemented. Desired practice: Monitor risks against those identified in the project plan; Rating: Fully implemented. Desired practice: Monitor the management of project data against the project plan; Rating: Fully implemented. 
Desired practice: Monitor stakeholder involvement against the project plan; Rating: Fully implemented. Desired practice: Periodically review the project's progress, performance, and issues; Rating: Fully implemented. Desired practice: Review the accomplishments and results of the project at selected project milestones; Rating: Fully implemented. Goal: Corrective actions are managed to closure when the project's performance or results deviate significantly from the plan; Rating: Satisfied. Desired practice: Collect and analyze the issues and determine the corrective actions necessary to address the issues; Rating: Fully implemented. Desired practice: Take corrective action on identified issues; Rating: Fully implemented. Desired practice: Manage corrective actions to closure; Rating: Fully implemented. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 15: VSCS Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project monitoring and control process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project monitoring and control process; Rating: Largely implemented; Comment: The project team conducts project monitoring and control practices; however, there is no specific plan for performing the project monitoring and control process. The project team has developed a handbook that provides guidelines on what should be done, but this handbook was not available for the current projects. Desired practice: Provide adequate resources for performing the project monitoring and control process, developing the work products, and providing the services of the process; Rating: Fully implemented. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project monitoring and control process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the project monitoring and control process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the project monitoring and control process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the project monitoring and control process as planned; Rating: Fully implemented. Desired practice: Monitor and control the project monitoring and control process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the project monitoring and control process against the plan. Desired practice: Objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the project monitoring and control process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 16: ERAM Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices: Goal: Actual performance and progress of the project are monitored against the project plan; Rating: Satisfied. Desired practice: Monitor the actual values of the project planning parameters against the project plan; Rating: Fully implemented. Desired practice: Monitor commitments against those identified in the project plan; Rating: Fully implemented. Desired practice: Monitor risks against those identified in the project plan; Rating: Fully implemented. 
Desired practice: Monitor the management of project data against the project plan; Rating: Fully implemented. Desired practice: Monitor stakeholder involvement against the project plan; Rating: Largely implemented; Comment: The project team stated that it monitors stakeholder involvement against the plan by recording stakeholder participation in program meetings, user team meetings, and in the review of documents. However, the team did not have a checklist at each meeting that stated which stakeholders should be present. Desired practice: Periodically review the project's progress, performance, and issues; Rating: Fully implemented. Desired practice: Review the accomplishments and results of the project at selected project milestones; Rating: Fully implemented. Goal: Corrective actions are managed to closure when the project's performance or results deviate significantly from the plan; Rating: Satisfied. Desired practice: Collect and analyze the issues and determine the corrective actions necessary to address the issues; Rating: Fully implemented. Desired practice: Take corrective action on identified issues; Rating: Fully implemented. Desired practice: Manage corrective actions to closure; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 17: ERAM Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project monitoring and control process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project monitoring and control process; Rating: Fully implemented. 
Desired practice: Provide adequate resources for performing the project monitoring and control process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project monitoring and control process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the project monitoring and control process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the project monitoring and control process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the project monitoring and control process as planned; Rating: Fully implemented. Desired practice: Monitor and control the project monitoring and control process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining the quality assurance process that is expected to objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the project monitoring and control process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. 
[End of table] Table 18: ITWS Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices: Goal: Actual performance and progress of the project are monitored against the project plan; Rating: Satisfied. Desired practice: Monitor the actual values of the project planning parameters against the project plan; Rating: Fully implemented. Desired practice: Monitor commitments against those identified in the project plan; Rating: Fully implemented. Desired practice: Monitor risks against those identified in the project plan; Rating: Fully implemented. Desired practice: Monitor the management of project data against the project plan; Rating: Largely implemented; Comment: The project team monitors the management of project data against the project plan; however, it does not identify and document significant issues and impacts. Desired practice: Monitor stakeholder involvement against the project plan; Rating: Fully implemented. Desired practice: Periodically review the project's progress, performance, and issues; Rating: Fully implemented. Desired practice: Review the accomplishments and results of the project at selected project milestones; Rating: Fully implemented. Goal: Corrective actions are managed to closure when the project's performance or results deviate significantly from the plan; Rating: Satisfied. Desired practice: Collect and analyze the issues and determine the corrective actions necessary to address the issues; Rating: Fully implemented. Desired practice: Take corrective action on identified issues; Rating: Fully implemented. Desired practice: Manage corrective actions to closure; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. 
[End of table] Table 19: ITWS Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because two of the practices below are partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project monitoring and control process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project monitoring and control process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the project monitoring and control process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project monitoring and control process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the project monitoring and control process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the project monitoring and control process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the project monitoring and control process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the project monitoring and control process as planned; Rating: Fully implemented. 
Desired practice: Monitor and control the project monitoring and control process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors the process, but it does not use the data to control the process and take appropriate corrective action. Desired practice: Objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the project monitoring and control process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 20: ASDE-X Project Monitoring and Control: Detailed Findings on Level 1 Goals and Practices: Goal: Actual performance and progress of the project are monitored against the project plan; Rating: Satisfied. Desired practice: Monitor the actual values of the project planning parameters against the project plan; Rating: Fully implemented. Desired practice: Monitor commitments against those identified in the project plan; Rating: Fully implemented. Desired practice: Monitor risks against those identified in the project plan; Rating: Fully implemented. Desired practice: Monitor the management of project data against the project plan; Rating: Largely implemented; Comment: The project team monitors the contractor documents, but internal documents, such as the project plan, have yet to be placed under formal configuration management control. 
Desired practice: Monitor stakeholder involvement against the project plan; Rating: Fully implemented. Desired practice: Periodically review the project's progress, performance, and issues; Rating: Fully implemented. Desired practice: Review the accomplishments and results of the project at selected project milestones; Rating: Fully implemented. Goal: Corrective actions are managed to closure when the project's performance or results deviate significantly from the plan; Rating: Satisfied. Desired practice: Collect and analyze the issues and determine the corrective actions necessary to address the issues; Rating: Fully implemented. Desired practice: Take corrective action on identified issues; Rating: Fully implemented. Desired practice: Manage corrective actions to closure; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 21: ASDE-X Project Monitoring and Control: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the project monitoring and control process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the project monitoring and control process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the project monitoring and control process, developing the work products, and providing the services of the process; Rating: Fully implemented. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the project monitoring and control process; Rating: Largely implemented; Comment: The project team has assigned responsibility for the project monitoring and control process to various groups; however, it does not assign duties to individuals. Desired practice: Train the people performing or supporting the project monitoring and control process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the project monitoring and control process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated contractor-related work products of the project monitoring and control process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the project monitoring and control process as planned; Rating: Fully implemented. Desired practice: Monitor and control the project monitoring and control process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors and controls the project monitoring and control process against the plan for performing the process; however, the metrics are collected by the contractor, and the project team does not maintain, record, or track the metrics on a monthly basis. 
Desired practice: Objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the project monitoring and control process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the project monitoring and control process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 4: FAA Is Performing Most Risk Management Practices, but It Is Not Yet Fully Managing the Process: The purpose of risk management is to identify potential problems before they occur, so that risk-handling activities may be planned and invoked as needed across the life of the product or project to mitigate adverse impacts on achieving objectives. Effective risk management includes early and aggressive identification of risks through the involvement of relevant stakeholders. Early and aggressive detection of risk is important, because it is typically easier, less costly, and less disruptive to make changes and correct work efforts during the earlier phases of the project. As shown in figure 6, three of the four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. The fourth project would have achieved level 1 if it had performed one more practice (see the overview in table 22 for details). Two of the four FAA projects also satisfied all criteria for the "managed" capability level (level 2) in this process area. 
While the other projects had differing weaknesses that contributed to this result, common weaknesses across some of the projects occurred in the area of monitoring and controlling the risk management process and in ensuring quality assurance of the process. As a result of these weaknesses, FAA faces increased likelihood that project risks will not be identified and addressed in a timely manner--thereby increasing the likelihood that projects will not meet cost, schedule, or performance goals. Looked at another way, of the 68 practices we evaluated in this key process area, FAA projects had 59 practices that were fully or largely implemented and 9 practices that were partially or not implemented. Figure 6: Four Projects' Capability Levels in Risk Management: [See PDF for image] [End of figure] Table 22: Four Projects' Appraisal Results in Risk Management: Goal: Preparation for risk management is conducted; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Determine risk sources and categories; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Define the parameters used to analyze and categorize risks and the parameters used to control the risk management effort; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the strategy to be used for risk management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: Risks are identified and analyzed to determine their relative importance; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. 
Desired practice: Identify and document the risks; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Evaluate and categorize each identified risk using the defined risk categories and parameters, and determine its relative priority; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: Risks are handled and mitigated, where appropriate, to reduce adverse impacts on achieving objectives; VSCS: Goal not satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Develop a risk mitigation plan for the most important risks to the project, as defined by the risk management strategy; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate; VSCS: Practice partially implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: No; ERAM: Yes; ITWS: Yes; ASDE-X: Yes. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal satisfied; ITWS: Goal not satisfied; ASDE-X: Goal satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the risk management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the risk management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Provide adequate resources for performing the risk management process, developing the work products, and providing the services of the process; VSCS: Practice partially implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the risk management process; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Train the people performing or supporting the risk management process as needed; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the risk management process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the risk management process as planned; VSCS: Practice partially implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the risk management process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Objectively evaluate adherence of the risk management process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice partially implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. 
Desired practice: Review the activities, status, and results of the risk management process with higher level management, and resolve issues; VSCS: Practice partially implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: Yes; ITWS: No; ASDE-X: Yes. Sources: GAO, SEI. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 23 through 30. Specifically, tables 23 and 24 provide results for VSCS; tables 25 and 26 provide results for ERAM; tables 27 and 28 provide results for ITWS; and tables 29 and 30 provide results for ASDE-X. Table 23: VSCS Risk Management: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for risk management is conducted; Rating: Satisfied. Desired practice: Determine risk sources and categories; Rating: Fully implemented. Desired practice: Define the parameters used to analyze and categorize risks and the parameters used to control the risk management effort; Rating: Fully implemented. Desired practice: Establish and maintain the strategy to be used for risk management; Rating: Fully implemented. Goal: Risks are identified and analyzed to determine their relative importance; Rating: Satisfied. Desired practice: Identify and document the risks; Rating: Fully implemented. Desired practice: Evaluate and categorize each identified risk using the defined risk categories and parameters, and determine its relative priority; Rating: Fully implemented. Goal: Risks are handled and mitigated, where appropriate, to reduce adverse impacts on achieving objectives; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. 
Desired practice: Develop a risk mitigation plan for the most important risks to the project, as defined by the risk management strategy; Rating: Largely implemented; Comment: The project team has a high level risk mitigation plan that was developed as part of the project plan. However, this plan is not sufficiently detailed. Desired practice: Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate; Rating: Partially implemented; Comment: While project officials stated that risks are discussed at meetings, there is little evidence that the project team monitors the status of each risk periodically or implements the risk mitigation plan. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 24: VSCS Risk Management: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because four of the practices below are partially implemented and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the risk management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the risk management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the risk management process, developing the work products, and providing the services of the process; Rating: Partially implemented; Comment: Resources for performing the risk management process are not adequate. Some practices for this process area are not being performed because of a lack of resources. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the risk management process; Rating: Largely implemented; Comment: The project team has assigned responsibility and authority for performing the process, developing the work products, and providing the services of the risk management process. However, all of these responsibilities have been assigned to a single individual, who also has other duties, thereby increasing the risk that these activities will not be robust.
Desired practice: Train the people performing or supporting the risk management process as needed; Rating: Fully implemented.
Desired practice: Place designated work products of the risk management process under appropriate levels of configuration management; Rating: Fully implemented.
Desired practice: Identify and involve the relevant stakeholders of the risk management process as planned; Rating: Partially implemented; Comment: The project team stated that risks are identified and discussed periodically during meetings with relevant stakeholders; however, meeting minutes did not always document that risks were discussed with relevant stakeholders.
Desired practice: Monitor and control the risk management process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the risk management process against the plan for performing the process or take appropriate corrective action.
Desired practice: Objectively evaluate adherence of the risk management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team subjectively evaluates adherence to the risk management process. The project team does not objectively evaluate adherence of the risk management process to its process description, standards, and procedures, and it does not address noncompliance.
Desired practice: Review the activities, status, and results of the risk management process with higher level management, and resolve issues; Rating: Partially implemented; Comment: Project personnel stated that risk management activities are discussed with higher level management. However, the meeting minutes generally did not show that the project team reviews the activities, status, and results of the risk management process with higher level management and resolves issues.

Capability level 2 not achieved.

Sources: GAO, SEI.

[End of table]

Table 25: ERAM Risk Management: Detailed Findings on Level 1 Goals and Practices:

Goal: Preparation for risk management is conducted; Rating: Satisfied.
Desired practice: Determine risk sources and categories; Rating: Fully implemented.
Desired practice: Define the parameters used to analyze and categorize risks and the parameters used to control the risk management effort; Rating: Fully implemented.
Desired practice: Establish and maintain the strategy to be used for risk management; Rating: Fully implemented.

Goal: Risks are identified and analyzed to determine their relative importance; Rating: Satisfied.
Desired practice: Identify and document the risks; Rating: Fully implemented.
Desired practice: Evaluate and categorize each identified risk using the defined risk categories and parameters, and determine its relative priority; Rating: Fully implemented.

Goal: Risks are handled and mitigated, where appropriate, to reduce adverse impacts on achieving objectives; Rating: Satisfied.
Desired practice: Develop a risk mitigation plan for the most important risks to the project, as defined by the risk management strategy; Rating: Fully implemented.
Desired practice: Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate; Rating: Fully implemented.

Capability level 1 achieved.

Sources: GAO, SEI.

[End of table]

Table 26: ERAM Risk Management: Detailed Findings on Level 2 Goals and Practices:

Goal: The process is institutionalized as a managed process; Rating: Satisfied.
Desired practice: Establish and maintain an organizational policy for planning and performing the risk management process; Rating: Fully implemented.
Desired practice: Establish and maintain the plan for performing the risk management process; Rating: Fully implemented.
Desired practice: Provide adequate resources for performing the risk management process, developing the work products, and providing the services of the process; Rating: Fully implemented.
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the risk management process; Rating: Fully implemented.
Desired practice: Train the people performing or supporting the risk management process as needed; Rating: Fully implemented.
Desired practice: Place designated work products of the risk management process under appropriate levels of configuration management; Rating: Fully implemented.
Desired practice: Identify and involve the relevant stakeholders of the risk management process as planned; Rating: Fully implemented.
Desired practice: Monitor and control the risk management process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors the risk management process against the plan for performing the process and takes appropriate corrective action. However, the team is not using the available data on a monthly basis to control the process.
Desired practice: Objectively evaluate adherence of the risk management process to its process description, standards, and procedures, and address noncompliance; Rating: Largely implemented; Comment: The project team recently performed an audit on the risk management process that objectively evaluated adherence of the process to its process description, standards, and procedures, and addressed noncompliance; however, the team has just started defining its quality assurance process.
Desired practice: Review the activities, status, and results of the risk management process with higher level management, and resolve issues; Rating: Fully implemented.

Capability level 2 achieved.

Sources: GAO, SEI.

[End of table]

Table 27: ITWS Risk Management: Detailed Findings on Level 1 Goals and Practices:

Goal: Preparation for risk management is conducted; Rating: Satisfied.
Desired practice: Determine risk sources and categories; Rating: Fully implemented.
Desired practice: Define the parameters used to analyze and categorize risks and the parameters used to control the risk management effort; Rating: Fully implemented.
Desired practice: Establish and maintain the strategy to be used for risk management; Rating: Fully implemented.

Goal: Risks are identified and analyzed to determine their relative importance; Rating: Satisfied.
Desired practice: Identify and document the risks; Rating: Fully implemented.
Desired practice: Evaluate and categorize each identified risk using the defined risk categories and parameters, and determine its relative priority; Rating: Fully implemented.

Goal: Risks are handled and mitigated, where appropriate, to reduce adverse impacts on achieving objectives; Rating: Satisfied.
Desired practice: Develop a risk mitigation plan for the most important risks to the project, as defined by the risk management strategy; Rating: Fully implemented.
Desired practice: Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate; Rating: Fully implemented.

Capability level 1 achieved.

Sources: GAO, SEI.

[End of table]

Table 28: ITWS Risk Management: Detailed Findings on Level 2 Goals and Practices:

Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented.
Desired practice: Establish and maintain an organizational policy for planning and performing the risk management process; Rating: Fully implemented.
Desired practice: Establish and maintain the plan for performing the risk management process; Rating: Fully implemented.
Desired practice: Provide adequate resources for performing the risk management process, developing the work products, and providing the services of the process; Rating: Fully implemented.
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the risk management process; Rating: Fully implemented.
Desired practice: Train the people performing or supporting the risk management process as needed; Rating: Largely implemented; Comment: Key members of the project team are trained on the risk management process; however, other project team members have yet to be trained.
Desired practice: Place designated work products of the risk management process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the risk management process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system.
Desired practice: Identify and involve the relevant stakeholders of the risk management process as planned; Rating: Largely implemented; Comment: The project team has involved certain individuals in the monitoring of risks; however, stakeholders are not identified in the risk management plan, and the project lead has not approved the list of stakeholders.
Desired practice: Monitor and control the risk management process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: The project team is using a risk management database; however, there is no evidence that the project team is monitoring and controlling the risk management process against the plan for performing the risk management process and taking appropriate corrective action.
Desired practice: Objectively evaluate adherence of the risk management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the risk management process to its process description, standards, and procedures and to address noncompliance.
Desired practice: Review the activities, status, and results of the risk management process with higher level management, and resolve issues; Rating: Fully implemented.

Capability level 2 not achieved.

Sources: GAO, SEI.

[End of table]

Table 29: ASDE-X Risk Management: Detailed Findings on Level 1 Goals and Practices:

Goal: Preparation for risk management is conducted; Rating: Satisfied.
Desired practice: Determine risk sources and categories; Rating: Fully implemented.
Desired practice: Define the parameters used to analyze and categorize risks and the parameters used to control the risk management effort; Rating: Fully implemented.
Desired practice: Establish and maintain the strategy to be used for risk management; Rating: Fully implemented.

Goal: Risks are identified and analyzed to determine their relative importance; Rating: Satisfied.
Desired practice: Identify and document the risks; Rating: Fully implemented.
Desired practice: Evaluate and categorize each identified risk using the defined risk categories and parameters, and determine its relative priority; Rating: Fully implemented.

Goal: Risks are handled and mitigated, where appropriate, to reduce adverse impacts on achieving objectives; Rating: Satisfied.
Desired practice: Develop a risk mitigation plan for the most important risks to the project, as defined by the risk management strategy; Rating: Fully implemented.
Desired practice: Monitor the status of each risk periodically and implement the risk mitigation plan as appropriate; Rating: Fully implemented.

Capability level 1 achieved.

Sources: GAO, SEI.

[End of table]

Table 30: ASDE-X Risk Management: Detailed Findings on Level 2 Goals and Practices:

Goal: The process is institutionalized as a managed process; Rating: Satisfied.
Desired practice: Establish and maintain an organizational policy for planning and performing the risk management process; Rating: Fully implemented.
Desired practice: Establish and maintain the plan for performing the risk management process; Rating: Fully implemented.
Desired practice: Provide adequate resources for performing the risk management process, developing the work products, and providing the services of the process; Rating: Fully implemented.
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the risk management process; Rating: Fully implemented.
Desired practice: Train the people performing or supporting the risk management process as needed; Rating: Fully implemented.
Desired practice: Place designated work products of the risk management process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated contractor-related work products of the risk management process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control.
Desired practice: Identify and involve the relevant stakeholders of the risk management process as planned; Rating: Fully implemented.
Desired practice: Monitor and control the risk management process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors and controls the risk management process against the plan for performing the process; however, the metrics are collected by the contractor, and the project team does not maintain, record, or track the metrics on a monthly basis, thereby making it difficult to take appropriate corrective action.
Desired practice: Objectively evaluate adherence of the risk management process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented.
Desired practice: Review the activities, status, and results of the risk management process with higher level management, and resolve issues; Rating: Fully implemented.

Capability level 2 achieved.

Sources: GAO, SEI.

[End of table]

[End of section]

Chapter 5: FAA Is Performing Requirements Development Practices, but It Is Not Yet Fully Managing the Process:

The purpose of requirements development is to produce and analyze customer, product, and product-component needs. This process area addresses the needs of relevant stakeholders, including those pertinent to various product life-cycle phases. It also addresses constraints caused by the selection of design solutions.
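The rating scheme applied in the appraisal tables of this chapter and the preceding one is uniform: each desired practice is rated not, partially, largely, or fully implemented; a goal is satisfied only when no practice falls below largely implemented; and a capability level is achieved only when every goal at that level and at lower levels is satisfied. As an illustrative sketch only (this is not GAO's or SEI's appraisal tooling, and the function and variable names are ours), the aggregation logic can be expressed as:

```python
# Sketch of the appraisal aggregation rules described in this report.
# Assumptions: ratings, goal_satisfied, and level_achieved are hypothetical
# names introduced here for illustration, not part of any SEI tool.

RATINGS = ["not implemented", "partially implemented",
           "largely implemented", "fully implemented"]

def goal_satisfied(practice_ratings):
    """A goal is satisfied when every practice is largely or fully implemented."""
    threshold = RATINGS.index("largely implemented")
    return all(RATINGS.index(r) >= threshold for r in practice_ratings)

def level_achieved(goals_by_level, level):
    """A capability level is achieved when all goals at that level and at
    every lower level are satisfied (continuous representation)."""
    return all(goal_satisfied(practices)
               for lvl in range(1, level + 1)
               for practices in goals_by_level.get(lvl, []))

# Abridged example mirroring tables 23 and 24 (VSCS risk management):
vscs = {
    1: [["fully implemented"] * 3,                          # preparation goal
        ["fully implemented"] * 2,                          # identify/analyze goal
        ["largely implemented", "partially implemented"]],  # handle/mitigate goal
    2: [["fully implemented", "partially implemented"]],    # institutionalization
}
print(level_achieved(vscs, 1))  # False: one mitigation practice is only partial
```

This mirrors, for example, why VSCS did not achieve capability level 1 in risk management: a single partially implemented mitigation practice leaves its goal unsatisfied, which in turn blocks the level.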
The development of requirements includes elicitation, analysis, validation, and communication of customer and stakeholder needs and expectations. As shown in figure 7, all four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. None of the four projects satisfied all criteria for the "managing" capability level (level 2). While all four projects had differing weaknesses that contributed to this result, common weaknesses across multiple projects occurred in the areas of training people and in ensuring quality assurance of the requirements development process, as shown in the overview in table 31. As a result of these weaknesses, FAA is exposed to increased risks that projects will not fulfill mission and user needs. Looked at another way, of the 84 practices we evaluated in this key process area, FAA projects had 77 practices that were fully or largely implemented and 7 practices that were partially or not implemented.

Figure 7: Four Projects' Capability Levels in Requirements Development: [See PDF for image] [End of figure]

Table 31: Four Projects' Appraisal Results in Requirements Development:

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied.
Desired practice: Identify and collect stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.

Goal: Customer requirements are refined and elaborated to develop product and product-component requirements; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied.
Desired practice: Establish and maintain product and product-component requirements that are based on the customer requirements; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Allocate the requirements for each product component; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Identify interface requirements; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied.
Desired practice: Establish and maintain operational concepts and associated scenarios; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Establish and maintain a definition of required functionality; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Analyze requirements to ensure that they are necessary and sufficient; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Validate requirements to ensure that the resulting product will perform appropriately in its intended-use environment; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.

Capability level 1 achieved? VSCS: Yes; ERAM: Yes; ITWS: Yes; ASDE-X: Yes.

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements.[A]; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied.
Desired practice: Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed.[A]; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied.
Desired practice: Validate requirements to ensure that the resulting product will perform as intended in the user's environment using multiple techniques as appropriate; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.

Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied.
Desired practice: Establish and maintain an organizational policy for planning and performing the requirements development process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Establish and maintain the plan for performing the requirements development process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Provide adequate resources for performing the requirements development process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented.
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements development process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented.
Desired practice: Train the people performing or supporting the requirements development process as needed; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented.
Desired practice: Place designated work products of the requirements development process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented.
Desired practice: Identify and involve the relevant stakeholders of the requirements development process as planned; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented.
Desired practice: Monitor and control the requirements development process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice partially implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented.
Desired practice: Objectively evaluate adherence of the requirements development process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented.
Desired practice: Review the activities, status, and results of the requirements development process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented.

Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No.

Sources: GAO, SEI.

[A] This goal is repeated at both capability levels 1 and 2, but it comprises more advanced practices at level 2.

[End of table]

Additional details on each project's appraisal results at successive capability levels are provided in tables 32 through 39. Specifically, tables 32 and 33 provide results for VSCS; tables 34 and 35 provide results for ERAM; tables 36 and 37 provide results for ITWS; and tables 38 and 39 provide results for ASDE-X.

Table 32: VSCS Requirements Development: Detailed Findings on Level 1 Goals and Practices:

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied.
Desired practice: Identify and collect stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Fully implemented.
Desired practice: Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements; Rating: Largely implemented; Comment: The project team usually--but not always--transforms stakeholder needs, expectations, constraints, and interfaces into customer requirements.

Goal: Customer requirements are refined and elaborated to develop product and product-component requirements; Rating: Satisfied.
Desired practice: Establish and maintain product and product-component requirements that are based on the customer requirements; Rating: Largely implemented; Comment: The project team has established and usually--but not always--maintains product and product-component requirements that are based on the customer requirements.
Desired practice: Allocate the requirements for each product component; Rating: Largely implemented; Comment: The project team usually--but not always--allocates the requirements for each product component.
Desired practice: Identify interface requirements; Rating: Fully implemented.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied.
Desired practice: Establish and maintain operational concepts and associated scenarios; Rating: Fully implemented.
Desired practice: Establish and maintain a definition of required functionality; Rating: Largely implemented; Comment: The project team has established and usually--but not always--maintains a definition of required functionality.
Desired practice: Analyze requirements to ensure that they are necessary and sufficient; Rating: Largely implemented; Comment: The project team usually--but not always--analyzes requirements to ensure that they are necessary and sufficient.
Desired practice: Validate requirements to ensure that the resulting product will perform appropriately in its intended-use environment; Rating: Fully implemented.

Capability level 1 achieved.

Sources: GAO, SEI.

[End of table]

Table 33: VSCS Requirements Development: Detailed Findings on Level 2 Goals and Practices:

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied.
Desired practice: Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Fully implemented.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied.
Desired practice: Validate requirements to ensure that the resulting product will perform as intended in the user's environment using multiple techniques as appropriate; Rating: Largely implemented; Comment: The project team usually--but not always--validates requirements to ensure that the resulting product will perform as intended in the user's environment using multiple techniques as appropriate.

Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented.
Desired practice: Establish and maintain an organizational policy for planning and performing the requirements development process; Rating: Fully implemented.
Desired practice: Establish and maintain the plan for performing the requirements development process; Rating: Fully implemented.
Desired practice: Provide adequate resources for performing the requirements development process, developing the work products, and providing the services of the process; Rating: Fully implemented.
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements development process; Rating: Fully implemented.
Desired practice: Train the people performing or supporting the requirements development process as needed; Rating: Fully implemented.
Desired practice: Place designated work products of the requirements development process under appropriate levels of configuration management; Rating: Fully implemented.
Desired practice: Identify and involve the relevant stakeholders of the requirements development process as planned; Rating: Fully implemented.
Desired practice: Monitor and control the requirements development process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: The project team does not consistently monitor and control the requirements development process against the plan for performing the process and take appropriate corrective action, although it does some of these activities.
Desired practice: Objectively evaluate adherence of the requirements development process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented.
Desired practice: Review the activities, status, and results of the requirements development process with higher level management, and resolve issues; Rating: Fully implemented.

Capability level 2 not achieved.

Sources: GAO, SEI.

[End of table]

Table 34: ERAM Requirements Development: Detailed Findings on Level 1 Goals and Practices:

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied.
Desired practice: Identify and collect stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Fully implemented.
Desired practice: Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements; Rating: Fully implemented.

Goal: Customer requirements are refined and elaborated to develop product and product-component requirements; Rating: Satisfied.
Desired practice: Establish and maintain product and product-component requirements that are based on the customer requirements; Rating: Fully implemented.
Desired practice: Allocate the requirements for each product component; Rating: Fully implemented.
Desired practice: Identify interface requirements; Rating: Fully implemented.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied.
Desired practice: Establish and maintain operational concepts and associated scenarios; Rating: Fully implemented.
Desired practice: Establish and maintain a definition of required functionality; Rating: Fully implemented.
Desired practice: Analyze requirements to ensure that they are necessary and sufficient; Rating: Fully implemented.
Desired practice: Validate requirements to ensure that the resulting product will perform appropriately in its intended-use environment; Rating: Fully implemented.

Capability level 1 achieved.

Sources: GAO, SEI.

[End of table]

Table 35: ERAM Requirements Development: Detailed Findings on Level 2 Goals and Practices:

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied.
Desired practice: Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Fully implemented.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied.
Desired practice: Validate requirements to ensure that the resulting product will perform as intended in the user's environment using multiple techniques as appropriate; Rating: Fully implemented.

Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented.
Desired practice: Establish and maintain an organizational policy for planning and performing the requirements development process; Rating: Fully implemented.
Desired practice: Establish and maintain the plan for performing the requirements development process; Rating: Fully implemented.
Desired practice: Provide adequate resources for performing the requirements development process, developing the work products, and providing the services of the process; Rating: Fully implemented.
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements development process; Rating: Fully implemented.
Desired practice: Train the people performing or supporting the requirements development process as needed; Rating: Fully implemented.
Desired practice: Place designated work products of the requirements development process under appropriate levels of configuration management; Rating: Fully implemented.
Desired practice: Identify and involve the relevant stakeholders of the requirements development process as planned; Rating: Fully implemented.
Desired practice: Monitor and control the requirements development process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors the requirements development process against the plan by reviewing periodic data; however, the project team does not always use these data to control the requirements development process.
Desired practice: Objectively evaluate adherence of the requirements development process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining its quality assurance process, which is expected to objectively evaluate adherence of the requirements development process to its description, standards, and procedures and to address noncompliance.
Desired practice: Review the activities, status, and results of the requirements development process with higher level management, and resolve issues; Rating: Fully implemented.

Capability level 2 not achieved.

Sources: GAO, SEI.

[End of table]

Table 36: ITWS Requirements Development: Detailed Findings on Level 1 Goals and Practices:

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied.
Desired practice: Identify and collect stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Fully implemented.
Desired practice: Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements; Rating: Fully implemented.

Goal: Customer requirements are refined and elaborated to develop product and product-component requirements; Rating: Satisfied.
Desired practice: Establish and maintain product and product-component requirements that are based on the customer requirements; Rating: Fully implemented.
Desired practice: Allocate the requirements for each product component; Rating: Fully implemented.
Desired practice: Identify interface requirements; Rating: Fully implemented.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied.
Desired practice: Establish and maintain operational concepts and associated scenarios; Rating: Fully implemented.
Desired practice: Establish and maintain a definition of required functionality; Rating: Fully implemented.
Desired practice: Analyze requirements to ensure that they are necessary and sufficient; Rating: Fully implemented.
Desired practice: Validate requirements to ensure that the resulting product will perform appropriately in its intended-use environment; Rating: Fully implemented.

Capability level 1 achieved.

Sources: GAO, SEI.

[End of table]

Table 37: ITWS Requirements Development: Detailed Findings on Level 2 Goals and Practices:

Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied.
Desired practice: Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Largely implemented; Comment: The project team has outlined a process for eliciting stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle through the system engineering plan; however, the team did not provide evidence that this practice is performed.

Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied.
Desired practice: Validate requirements to ensure that the resulting product will perform as intended in the user's environment using multiple techniques as appropriate; Rating: Fully implemented.

Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented.
Desired practice: Establish and maintain an organizational policy for planning and performing the requirements development process; Rating: Fully implemented.
Desired practice: Establish and maintain the plan for performing the requirements development process; Rating: Fully implemented.
Desired practice: Provide adequate resources for performing the requirements development process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team has adequate personnel resources; however, the project team does not have adequate software resources for performing and providing the services of the requirements development process.
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements development process; Rating: Fully implemented.
Desired practice: Train the people performing or supporting the requirements development process as needed; Rating: Partially implemented; Comment: The project team did not provide adequate supporting evidence that FAA personnel are trained on performing or supporting the requirements development process as needed. Desired practice: Place designated work products of the requirements development process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the requirements development process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the requirements development process as planned; Rating: Fully implemented. Desired practice: Monitor and control the requirements development process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team is using metrics for monitoring the process, and project officials reported that the team was controlling the process; however, the team did not provide sufficient evidence that it is controlling the requirements development process against the plan and taking appropriate corrective action. Desired practice: Objectively evaluate adherence of the requirements development process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the requirements development process to its process description, standards, and procedures and to address noncompliance. 
Desired practice: Review the activities, status, and results of the requirements development process with higher level management, and resolve issues; Rating: Largely implemented; Comment: At the beginning of the project, the project team reviewed the status and the results of the requirements development process during the product team meetings. However, the project team did not provide adequate evidence that the status of the requirements development process was currently being reviewed. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 38: ASDE-X Requirements Development: Detailed Findings on Level 1 Goals and Practices: Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied. Desired practice: Identify and collect stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Fully implemented. Desired practice: Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements; Rating: Fully implemented. Goal: Customer requirements are refined and elaborated to develop product and product-component requirements; Rating: Satisfied. Desired practice: Establish and maintain product and product-component requirements that are based on the customer requirements; Rating: Fully implemented. Desired practice: Allocate the requirements for each product component; Rating: Fully implemented. Desired practice: Identify interface requirements; Rating: Fully implemented. Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied. Desired practice: Establish and maintain operational concepts and associated scenarios; Rating: Fully implemented. Desired practice: Establish and maintain a definition of required functionality; Rating: Fully implemented. 
Desired practice: Analyze requirements to ensure that they are necessary and sufficient; Rating: Fully implemented. Desired practice: Validate requirements to ensure that the resulting product will perform appropriately in its intended-use environment; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 39: ASDE-X Requirements Development: Detailed Findings on Level 2 Goals and Practices: Goal: Stakeholder needs, expectations, constraints, and interfaces are collected and translated into customer requirements; Rating: Satisfied. Desired practice: Elicit stakeholder needs, expectations, constraints, and interfaces for all phases of the product life cycle; Rating: Fully implemented. Goal: The requirements are analyzed and validated, and a definition of required functionality is developed; Rating: Satisfied. Desired practice: Validate requirements to ensure that the resulting product will perform as intended in the user's environment using multiple techniques as appropriate; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because two of the practices below are partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the requirements development process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the requirements development process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the requirements development process, developing the work products, and providing the services of the process; Rating: Fully implemented. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements development process; Rating: Largely implemented; Comment: The project team assigns responsibility and authority to various groups to perform the requirements development process. However, there is no single document that identifies specific requirements development duties by project team individual. Desired practice: Train the people performing or supporting the requirements development process as needed; Rating: Partially implemented; Comment: The project team is in the process of designing a training class, but it has not yet trained the people performing or supporting the requirements development process. Desired practice: Place designated work products of the requirements development process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated work products of the requirements development process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the requirements development process as planned; Rating: Fully implemented. Desired practice: Monitor and control the requirements development process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the requirements development process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the requirements development process to its process description, standards, and procedures and to address noncompliance. 
Desired practice: Review the activities, status, and results of the requirements development process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 6: FAA Is Performing Requirements Management Practices, but It Is Not Yet Fully Managing the Process: The purpose of requirements management is to manage the requirements of the project's products and product components and to identify inconsistencies between those requirements and the project's plans and work products. This process area includes managing all technical and nontechnical requirements and any changes to these requirements as they evolve. As shown in figure 8, all four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area, but none satisfied all criteria for achieving a "managed" capability level (level 2). While the projects had differing weaknesses that contributed to this result, a common weakness across most of the projects occurred in the area of ensuring quality assurance of the requirements management process, as shown in the overview in table 40. As a result of these weaknesses, FAA is exposed to increased risks that projects will not fulfill mission and user needs. Looked at another way, of the 60 practices we evaluated in this key process area, FAA projects had 54 practices that were fully or largely implemented and 6 practices that were partially or not implemented. Figure 8: Four Projects' Capability Levels in Requirements Management: [See PDF for image] [End of figure] Table 40: Four Projects' Appraisal Results in Requirements Management: Goal: Requirements are managed, and inconsistencies with project plans and work products are identified; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. 
Desired practice: Develop an understanding with the requirements providers on the meaning of the requirements; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Manage changes to the requirements as they evolve during the project; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Identify inconsistencies between the project plans and work products and the requirements; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: Yes; ERAM: Yes; ITWS: Yes; ASDE-X: Yes. Goal: Requirements are managed, and inconsistencies with project plans and work products are identified.[A]; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Obtain commitment to the requirements from the project participants; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Maintain bidirectional traceability between the requirements and the project plans and work products; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the requirements management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Establish and maintain the plan for performing the requirements management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the requirements management process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the requirements management process as needed; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Place designated work products of the requirements management process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the requirements management process as planned; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the requirements management process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice partially implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. 
Desired practice: Objectively evaluate adherence of the requirements management process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the requirements management process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. [A] This goal is repeated at both capability levels 1 and 2, but it comprises more advanced practices at level 2. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 41 through 48. Specifically, tables 41 and 42 provide results for VSCS; tables 43 and 44 provide results for ERAM; tables 45 and 46 provide results for ITWS; and tables 47 and 48 provide results for ASDE-X. Table 41: VSCS Requirements Management: Detailed Findings on Level 1 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Develop an understanding with the requirements providers on the meaning of the requirements; Rating: Fully implemented. Desired practice: Manage changes to the requirements as they evolve during the project; Rating: Fully implemented. Desired practice: Identify inconsistencies between the project plans and work products and the requirements; Rating: Largely implemented; Comment: The project team usually--but not always--identifies inconsistencies between the project plans and work products and the requirements. Capability level 1 achieved. Sources: GAO, SEI. 
[End of table] Table 42: VSCS Requirements Management: Detailed Findings on Level 2 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Obtain commitment to the requirements from the project participants; Rating: Fully implemented. Desired practice: Maintain bidirectional traceability between the requirements and the project plans and work products; Rating: Largely implemented; Comment: The project team usually--but not always--maintains bidirectional traceability between the requirements and the project plans and work products. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the requirements management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the requirements management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the requirements management process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the requirements management process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the requirements management process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the requirements management process as planned; Rating: Fully implemented. 
Desired practice: Monitor and control the requirements management process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: The project team does not consistently monitor or control the requirements management process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the requirements management process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the requirements management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 43: ERAM Requirements Management: Detailed Findings on Level 1 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Develop an understanding with the requirements providers on the meaning of the requirements; Rating: Fully implemented. Desired practice: Manage changes to the requirements as they evolve during the project; Rating: Fully implemented. Desired practice: Identify inconsistencies between the project plans and work products and the requirements; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 44: ERAM Requirements Management: Detailed Findings on Level 2 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Obtain commitment to the requirements from the project participants; Rating: Fully implemented. Desired practice: Maintain bidirectional traceability between the requirements and the project plans and work products; Rating: Fully implemented. 
Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the requirements management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the requirements management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the requirements management process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the requirements management process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the requirements management process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the requirements management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the requirements management process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team usually--but not always--monitors and controls the requirements management process against the plan for performing the process and takes appropriate corrective action. 
Desired practice: Objectively evaluate adherence of the requirements management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining its quality assurance process, which is expected to objectively evaluate adherence of the requirements management process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the requirements management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 45: ITWS Requirements Management: Detailed Findings on Level 1 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Develop an understanding with the requirements providers on the meaning of the requirements; Rating: Fully implemented. Desired practice: Manage changes to the requirements as they evolve during the project; Rating: Fully implemented. Desired practice: Identify inconsistencies between the project plans and work products and the requirements; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 46: ITWS Requirements Management: Detailed Findings on Level 2 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Obtain commitment to the requirements from the project participants; Rating: Fully implemented. Desired practice: Maintain bidirectional traceability between the requirements and the project plans and work products; Rating: Fully implemented. 
Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the requirements management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the requirements management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the requirements management process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team has adequate personnel resources to review requirements and to assign requirements management responsibilities; however, the project team does not have adequate software resources for performing and providing the services of the requirements management process. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the requirements management process as needed; Rating: Partially implemented; Comment: The project team stated that training was conducted; however, the team did not provide sufficient evidence that FAA personnel are trained as needed on performing or supporting the requirements management process. Desired practice: Place designated work products of the requirements management process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the requirements management process under some levels of configuration management; however, the controls on these products are not adequate because all of the support contractors have full access to the system. 
Desired practice: Identify and involve the relevant stakeholders of the requirements management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the requirements management process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team is using metrics for monitoring the process but did not provide sufficient evidence that it is controlling the requirements management process against the plan and taking appropriate corrective action. Desired practice: Objectively evaluate adherence of the requirements management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the requirements management process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the requirements management process with higher level management, and resolve issues; Rating: Largely implemented; Comment: At the beginning of the project, the project team reviewed the status and the results of the requirements management process during meetings. However, the project team did not provide adequate evidence that the status of the requirements management process was currently being reviewed. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 47: ASDE-X Requirements Management: Detailed Findings on Level 1 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Develop an understanding with the requirements providers on the meaning of the requirements; Rating: Fully implemented. 
Desired practice: Manage changes to the requirements as they evolve during the project; Rating: Fully implemented. Desired practice: Identify inconsistencies between the project plans and work products and the requirements; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 48: ASDE-X Requirements Management: Detailed Findings on Level 2 Goals and Practices: Goal: Requirements are managed and inconsistencies with project plans and work products are identified; Rating: Satisfied. Desired practice: Obtain commitment to the requirements from the project participants; Rating: Fully implemented. Desired practice: Maintain bidirectional traceability between the requirements and the project plans and work products; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the requirements management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the requirements management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the requirements management process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the requirements management process; Rating: Largely implemented; Comment: The project team assigns responsibility and authority to various groups to perform the requirements management process. However, it does not assign duties to project team individuals. 
Desired practice: Train the people performing or supporting the requirements management process as needed; Rating: Largely implemented; Comment: The project team is composed of individuals with previous requirements management process experience and is in the process of designing a requirements management training class; however, no project team members have received project-specific requirements management training. Desired practice: Place designated work products of the requirements management process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated contractor-related work products of the requirements management process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the requirements management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the requirements management process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the requirements management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the requirements management process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the requirements management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. 
[End of table] [End of section] Chapter 7: FAA Is Performing Most Technical Solution Practices, but It Is Not Yet Fully Managing the Process: The purpose of the technical solution process area is to design, develop, and implement products, product components, and product-related life-cycle processes to meet requirements. This process involves evaluating and selecting solutions that potentially satisfy an appropriate set of allocated requirements, developing detailed designs, and implementing the design. As shown in figure 9, three FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. The fourth project would have achieved level 1 if it had performed two more practices (see the overview in table 49 for details). None of the four projects satisfied all criteria for the "managing" capability level (level 2). While all four projects had differing weaknesses that contributed to this result, common weaknesses across most of the projects occurred in the area of ensuring quality assurance of the technical solution process. As a result of this weakness, FAA is exposed to increased risks that projects will not meet mission needs. Looked at another way, of the 72 practices we evaluated in this key process area, FAA projects had 62 practices that were fully or largely implemented and 10 practices that were partially or not implemented. Figure 9: Four Projects' Capability Levels in Technical Solution: [See PDF for image] [End of figure] Table 49: Four Projects' Appraisal Results in Technical Solution: Goal: Product or product-component solutions are selected from alternative solutions; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal not satisfied; ASDE-X: Goal satisfied. Desired practice: Develop alternative solutions and selection criteria; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. 
Desired practice: Select the product-component solutions that best satisfy the criteria established; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Goal: Product or product-component designs are developed; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Develop a design for the product or product component; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the solution for product-component interfaces; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: Product components, and associated support documentation, are implemented from their designs; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Implement the designs of the product components; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Develop and maintain the end-use documentation; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: Yes; ERAM: Yes; ITWS: No; ASDE-X: Yes. Additional practices needed to achieve capability level 2: Goal: Product or product-component solutions are selected from alternative solutions.[A]; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal not satisfied; ASDE-X: Goal satisfied. Desired practice: Develop detailed alternative solutions and selection criteria; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. 
Desired practice: Evolve the operational concept, scenarios, and environments to describe the conditions, operating modes, and operating states specific to each product component; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the technical solution process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the technical solution process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the technical solution process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the technical solution process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Train the people performing or supporting the technical solution process as needed; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. 
Desired practice: Place designated work products of the technical solution process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the technical solution process as planned; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the technical solution process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice partially implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Objectively evaluate adherence of the technical solution process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the technical solution process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. [A] This goal is repeated at both capability levels 1 and 2, but it comprises more advanced practices at level 2. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 50 through 57. Specifically, tables 50 and 51 provide results for VSCS; tables 52 and 53 provide results for ERAM; tables 54 and 55 provide results for ITWS; and tables 56 and 57 provide results for ASDE-X. 
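The roll-up pattern applied throughout these appraisal tables is consistent: each practice is rated fully, largely, partially, or not implemented; a goal is satisfied only when every associated practice is fully or largely implemented; and a capability level is achieved only when every goal appraised at that level is satisfied. A minimal sketch of that logic follows (an illustration of the pattern as described in this report, not GAO's or SEI's actual appraisal tooling; the sample ratings are hypothetical):

```python
# Illustrative roll-up of appraisal ratings to goals and capability
# levels. This is a sketch only, not the SEI appraisal method itself;
# the sample data below is hypothetical.

FULLY, LARGELY, PARTIALLY, NOT_IMPLEMENTED = (
    "fully implemented", "largely implemented",
    "partially implemented", "not implemented",
)

def goal_satisfied(practice_ratings):
    # A goal is satisfied only when every associated practice is
    # fully or largely implemented; a single partially or not
    # implemented practice leaves the goal unsatisfied.
    return all(r in (FULLY, LARGELY) for r in practice_ratings)

def level_achieved(goals):
    # A capability level is achieved only when every goal appraised
    # at that level is satisfied.
    return all(goal_satisfied(ratings) for ratings in goals.values())

# Hypothetical level-2 appraisal: one partially implemented practice
# under the institutionalization goal blocks the capability level.
level_2 = {
    "solutions selected from alternatives": [FULLY, LARGELY],
    "process institutionalized as managed": [FULLY, FULLY, PARTIALLY],
}
print(level_achieved(level_2))  # False
```

This mirrors why, in the tables above, a project with mostly fully implemented practices still fails to achieve capability level 2 when even one practice (commonly quality assurance) is only partially implemented.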
Table 50: VSCS Technical Solution: Detailed Findings on Level 1 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Satisfied. Desired practice: Develop alternative solutions and selection criteria; Rating: Fully implemented. Desired practice: Select the product-component solutions that best satisfy the criteria established; Rating: Largely implemented; Comment: The project team selects the product-component solutions that best satisfy the criteria established, but it does not provide sufficient detail regarding how it selected them. Goal: Product or product-component designs are developed; Rating: Satisfied. Desired practice: Develop a design for the product or product component; Rating: Largely implemented; Comment: The project team usually--but not always--develops designs for the product and product components. Desired practice: Establish and maintain the solution for product-component interfaces; Rating: Fully implemented. Goal: Product components, and associated support documentation, are implemented from their designs; Rating: Satisfied. Desired practice: Implement the designs of the product components; Rating: Fully implemented. Desired practice: Develop and maintain the end-use documentation; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 51: VSCS Technical Solution: Detailed Findings on Level 2 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Satisfied. Desired practice: Develop detailed alternative solutions and selection criteria; Rating: Fully implemented. 
Desired practice: Evolve the operational concept, scenarios, and environments to describe the conditions, operating modes, and operating states specific to each product component; Rating: Largely implemented; Comment: The project team usually--but not always--evolves the operational concept, scenarios, and environments to describe the conditions, operating modes, and operating states specific to each product component. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the technical solution process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the technical solution process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the technical solution process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the technical solution process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the technical solution process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the technical solution process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the technical solution process as planned; Rating: Fully implemented. 
Desired practice: Monitor and control the technical solution process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: The project team does not consistently monitor and control the technical solution process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the technical solution process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the technical solution process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 52: ERAM Technical Solution: Detailed Findings on Level 1 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Satisfied. Desired practice: Develop alternative solutions and selection criteria; Rating: Fully implemented. Desired practice: Select the product-component solutions that best satisfy the criteria established; Rating: Fully implemented. Goal: Product or product-component designs are developed; Rating: Satisfied. Desired practice: Develop a design for the product or product component; Rating: Fully implemented. Desired practice: Establish and maintain the solution for product-component interfaces; Rating: Fully implemented. Goal: Product components, and associated support documentation, are implemented from their designs; Rating: Satisfied. Desired practice: Implement the designs of the product components; Rating: Largely implemented; Comment: The project team is working on the product-component designs and has contracted the implementation of the designs in the statement of work. However, the project has not yet fully reached the design implementation stage. 
Desired practice: Develop and maintain the end-use documentation; Rating: Largely implemented; Comment: The project team has defined and contracted the development and maintenance of the end-use documentation in the statement of work. However, the project has not reached the end-use documentation phase. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 53: ERAM Technical Solution: Detailed Findings on Level 2 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Satisfied. Desired practice: Develop detailed alternative solutions and selection criteria; Rating: Fully implemented. Desired practice: Evolve the operational concept, scenarios, and environments to describe the conditions, operating modes, and operating states specific to each product component; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the technical solution process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the technical solution process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the technical solution process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the technical solution process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the technical solution process as needed; Rating: Fully implemented. 
Desired practice: Place designated work products of the technical solution process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the technical solution process as planned; Rating: Fully implemented. Desired practice: Monitor and control the technical solution process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the technical solution process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining its quality assurance process, which will objectively evaluate adherence of the technical solution process to its process description, standards, and procedures and address noncompliance. Desired practice: Review the activities, status, and results of the technical solution process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 54: ITWS Technical Solution: Detailed Findings on Level 1 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Unsatisfied; Comment: The goal is unsatisfied because two of the practices below are partially implemented. Desired practice: Develop alternative solutions and selection criteria; Rating: Partially implemented; Comment: The project team developed alternative solutions; however, it did not develop alternative selection criteria. Desired practice: Select the product-component solutions that best satisfy the criteria established; Rating: Partially implemented; Comment: Although the project team selected product components by approving critical design reviews, no criteria were established to choose between product-component alternatives. 
Goal: Product or product-component designs are developed; Rating: Satisfied. Desired practice: Develop a design for the product or product component; Rating: Fully implemented. Desired practice: Establish and maintain the solution for product-component interfaces; Rating: Fully implemented. Goal: Product components, and associated support documentation, are implemented from their designs; Rating: Satisfied. Desired practice: Implement the designs of the product components; Rating: Fully implemented. Desired practice: Develop and maintain the end-use documentation; Rating: Fully implemented. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 55: ITWS Technical Solution: Detailed Findings on Level 2 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Develop detailed alternative solutions and selection criteria; Rating: Partially implemented; Comment: The project team developed detailed alternative solutions; however, it did not develop detailed alternative selection criteria. Desired practice: Evolve the operational concept, scenarios, and environments to describe the conditions, operating modes, and operating states specific to each product component; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because four of the practices below are partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the technical solution process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the technical solution process; Rating: Fully implemented. 
Desired practice: Provide adequate resources for performing the technical solution process, developing the work products, and providing the services of the process; Rating: Partially implemented; Comment: The project team provided adequate resources and documentation for performing the technical solution process in 1998; however, the documentation has not been updated since it was initially created. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the technical solution process; Rating: Partially implemented; Comment: The project team assigned responsibility and authority for performing the technical solution process and developing the work products in 1998; however, the documentation has not been updated since it was initially created. Desired practice: Train the people performing or supporting the technical solution process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the technical solution process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the technical solution process under some levels of configuration management; however, the controls on these products are not adequate because all the support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the technical solution process as planned; Rating: Fully implemented. Desired practice: Monitor and control the technical solution process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. 
Desired practice: Objectively evaluate adherence of the technical solution process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the technical solution process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the technical solution process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 56: ASDE-X Technical Solution: Detailed Findings on Level 1 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Satisfied. Desired practice: Develop alternative solutions and selection criteria; Rating: Fully implemented. Desired practice: Select the product-component solutions that best satisfy the criteria established; Rating: Fully implemented. Goal: Product or product-component designs are developed; Rating: Satisfied. Desired practice: Develop a design for the product or product component; Rating: Fully implemented. Desired practice: Establish and maintain the solution for product-component interfaces; Rating: Fully implemented. Goal: Product components, and associated support documentation, are implemented from their designs; Rating: Satisfied. Desired practice: Implement the designs of the product components; Rating: Fully implemented. Desired practice: Develop and maintain the end-use documentation; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 57: ASDE-X Technical Solution: Detailed Findings on Level 2 Goals and Practices: Goal: Product or product-component solutions are selected from alternative solutions; Rating: Satisfied. 
Desired practice: Develop detailed alternative solutions and selection criteria; Rating: Fully implemented. Desired practice: Evolve the operational concept, scenarios, and environments to describe the conditions, operating modes, and operating states specific to each product component; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the technical solution process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the technical solution process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the technical solution process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the technical solution process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the technical solution process as needed; Rating: Largely implemented; Comment: The prime contractor's personnel are trained in design, development, and testing activities, with FAA personnel overseeing these activities; however, the FAA personnel are not trained in the technical solution process. Desired practice: Place designated work products of the technical solution process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated work products of the technical solution process under appropriate levels of configuration management; however, internal documents are not yet placed under configuration management control. 
Desired practice: Identify and involve the relevant stakeholders of the technical solution process as planned; Rating: Fully implemented. Desired practice: Monitor and control the technical solution process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the technical solution process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the technical solution process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the technical solution process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 8: FAA Is Performing Product Integration Practices, but It Is Not Yet Fully Managing the Process: The purpose of the product integration process is to assemble the product components, ensure that the integrated product functions properly, and deliver the product. A critical aspect of this process is managing the internal and external interfaces of the products and product components, in one stage or in incremental stages. For this process area, we did not perform an appraisal for the ERAM project, because it was at a stage in which product integration was not applicable. As shown in figure 10, the three remaining projects satisfied all criteria for the "performing" capability level (level 1) in this process area. None of the projects satisfied all criteria for the "managing" capability level (level 2). 
While the projects had differing weaknesses that contributed to this result, common weaknesses across most of the projects occurred in the areas of monitoring and controlling the product integration process and ensuring quality assurance of the process, as shown in the overview in table 58. As a result of these weaknesses, FAA is exposed to increased risk that product components will not be compatible, resulting in projects that will not meet cost, schedule, or performance goals. Looked at another way, of the 54 practices we evaluated in this process area, FAA projects had 49 practices that were fully or largely implemented and 5 practices that were partially or not implemented. Figure 10: Three Projects' Capability Levels in Product Integration: [See PDF for image] [End of figure] Table 58: Three Projects' Appraisal Results in Product Integration: Goal: Preparation for product integration is conducted; VSCS: Goal satisfied; ERAM: N/A; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Determine the product-component integration sequence; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: The product-component interfaces, both internal and external, are compatible; VSCS: Goal satisfied; ERAM: N/A; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Review interface descriptions for coverage and completeness; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Manage internal and external interface definitions, designs, and changes for products and product components; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: Verified product components are assembled, and the integrated, verified, and validated product is delivered; VSCS: Goal satisfied; ERAM: N/A; ITWS: Goal satisfied; ASDE-X: Goal satisfied. 
Desired practice: Confirm, before assembly, that each product component required to assemble the product has been properly identified and functions according to its description, and that the product-component interfaces comply with the interface descriptions; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Assemble product components according to the product integration sequence and available procedures; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Evaluate assembled product components for interface compatibility; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Package the assembled product or product component and deliver it to the appropriate customer; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: Yes; ERAM: N/A; ITWS: Yes; ASDE-X: Yes. Goal: Preparation for product integration is conducted.[A]; VSCS: Goal satisfied; ERAM: N/A; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Establish and maintain the environment needed to support the integration of the product components; VSCS: Practice largely implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: N/A; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the product integration process; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Establish and maintain the plan for performing the product integration process; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the product integration process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the product integration process; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Train the people performing or supporting the product integration process as needed; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the product integration process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the product integration process as planned; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the product integration process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: N/A; ITWS: Practice not implemented; ASDE-X: Practice largely implemented. 
Desired practice: Objectively evaluate adherence of the product integration process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the product integration process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: N/A; ITWS: No; ASDE-X: No. Sources: GAO, SEI. Note: N/A represents not applicable; project not appraised in this process area. [A] This goal is repeated at both capability levels 1 and 2, but it comprises more advanced practices at level 2. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 59 through 64. Specifically, tables 59 and 60 provide results for VSCS; tables 61 and 62 provide results for ITWS; and tables 63 and 64 provide results for ASDE-X. Table 59: VSCS Product Integration: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for product integration is conducted; Rating: Satisfied. Desired practice: Determine the product-component integration sequence; Rating: Fully implemented. Goal: The product-component interfaces, both internal and external, are compatible; Rating: Satisfied. Desired practice: Review interface descriptions for coverage and completeness; Rating: Fully implemented. Desired practice: Manage internal and external interface definitions, designs, and changes for products and product components; Rating: Fully implemented. Goal: Verified product components are assembled, and the integrated, verified, and validated product is delivered; Rating: Satisfied. 
Desired practice: Confirm, before assembly, that each product component required to assemble the product has been properly identified and functions according to its description and that the product-component interfaces comply with the interface descriptions; Rating: Fully implemented. Desired practice: Assemble product components according to the product integration sequence and available procedures; Rating: Fully implemented. Desired practice: Evaluate assembled product components for interface compatibility; Rating: Fully implemented. Desired practice: Package the assembled product or product component and deliver it to the appropriate customer; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 60: VSCS Product Integration: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for product integration is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support the integration of the product components; Rating: Largely implemented; Comment: The project team has established the environment needed to support the integration of the product components. However, the project team does not maintain this environment. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the product integration process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the product integration process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the product integration process, developing the work products, and providing the services of the process; Rating: Fully implemented. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the product integration process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the product integration process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the product integration process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the product integration process as planned; Rating: Fully implemented. Desired practice: Monitor and control the product integration process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the product integration process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the product integration process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the product integration process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 61: ITWS Product Integration: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for product integration is conducted; Rating: Satisfied. Desired practice: Determine the product-component integration sequence; Rating: Fully implemented. Goal: The product-component interfaces, both internal and external, are compatible; Rating: Satisfied. Desired practice: Review interface descriptions for coverage and completeness; Rating: Fully implemented. 
Desired practice: Manage internal and external interface definitions, designs, and changes for products and product components; Rating: Fully implemented. Goal: Verified product components are assembled, and the integrated, verified, and validated product is delivered; Rating: Satisfied. Desired practice: Confirm, before assembly, that each product component required to assemble the product has been properly identified and functions according to its description and that the product-component interfaces comply with the interface descriptions; Rating: Fully implemented. Desired practice: Assemble product components according to the product integration sequence and available procedures; Rating: Fully implemented. Desired practice: Evaluate assembled product components for interface compatibility; Rating: Fully implemented. Desired practice: Package the assembled product or product component and deliver it to the appropriate customer; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 62: ITWS Product Integration: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for product integration is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support the integration of the product components; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Finding: The goal is unsatisfied because two of the practices below are partially implemented and another is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the product integration process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the product integration process; Rating: Fully implemented. 
Desired practice: Provide adequate resources for performing the product integration process, developing the work products, and providing the services of the process; Rating: Largely implemented; Finding: The project team has conducted product integration activities, but the resources needed to support these activities were not defined. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the product integration process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the product integration process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the product integration process under appropriate levels of configuration management; Rating: Partially implemented; Finding: The project team places designated work products of the product integration process under some levels of configuration management; however, the controls on these products are not adequate because all the support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the product integration process as planned; Rating: Fully implemented. Desired practice: Monitor and control the product integration process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Finding: The project team does not monitor and control the product integration process against the plan for performing the process or take appropriate corrective action. 
Desired practice: Objectively evaluate adherence of the product integration process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Finding: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the product integration process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the product integration process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 63: ASDE-X Product Integration: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for product integration is conducted; Rating: Satisfied. Desired practice: Determine the product-component integration sequence; Rating: Fully implemented. Goal: The product-component interfaces, both internal and external, are compatible; Rating: Satisfied. Desired practice: Review interface descriptions for coverage and completeness; Rating: Fully implemented. Desired practice: Manage internal and external interface definitions, designs, and changes for products and product components; Rating: Fully implemented. Goal: Verified product components are assembled, and the integrated, verified, and validated product is delivered; Rating: Satisfied. Desired practice: Confirm, before assembly, that each product component required to assemble the product has been properly identified and functions according to its description and that the product-component interfaces comply with the interface descriptions; Rating: Fully implemented. Desired practice: Assemble product components according to the product integration sequence and available procedures; Rating: Fully implemented. Desired practice: Evaluate assembled product components for interface compatibility; Rating: Fully implemented. 
Desired practice: Package the assembled product or product component and deliver it to the appropriate customer; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 64: ASDE-X Product Integration: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for product integration is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support the integration of the product components; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the product integration process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the product integration process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the product integration process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the product integration process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the product integration process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the product integration process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated contractor-related work products of the product integration process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. 
Desired practice: Identify and involve the relevant stakeholders of the product integration process as planned; Rating: Fully implemented. Desired practice: Monitor and control the product integration process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors and controls the product integration process against the plan for performing the process; however, the metrics are collected by the contractor, and the project team does not maintain, record, or track the metrics on a monthly basis. Desired practice: Objectively evaluate adherence of the product integration process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the product integration process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the product integration process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 9: FAA Is Not Performing Key Verification Practices or Fully Managing the Process: The purpose of verification is to ensure that selected work products meet their specified requirements. This process area involves preparing for and performing tests and identifying corrective actions. Verification of work products substantially increases the likelihood that the product will meet the customer, product, and product-component requirements. As shown in figure 11, only one of four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. 
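The rating logic that runs through these appraisal tables can be summarized as a simple aggregation rule: a practice that is fully or largely implemented counts toward its goal; a goal is satisfied only when every one of its practices meets that bar; and a capability level is achieved only when every goal at that level is satisfied. The sketch below illustrates this inferred rule; it is not FAA's or SEI's appraisal tooling, and the ratings shown are drawn from the VSCS verification results in table 67.

```python
# Sketch of the appraisal aggregation rule implied by the tables.
# Ratings that count toward goal satisfaction:
IMPLEMENTED = {"fully", "largely"}

def goal_satisfied(practice_ratings):
    """A goal is satisfied when all of its practices are fully or largely implemented."""
    return all(r in IMPLEMENTED for r in practice_ratings)

def level_achieved(goals):
    """A capability level is achieved when every goal at that level is satisfied.

    `goals` maps each goal to the list of ratings for its practices.
    """
    return all(goal_satisfied(ratings) for ratings in goals.values())

# VSCS verification, level 2 (table 67): the "monitor and control" practice is
# not implemented, so the institutionalization goal fails and level 2 is missed.
vscs_verification_level2 = {
    "Preparation for verification is conducted": ["fully"],
    "Peer reviews are performed on selected work products": ["partially"],
    "Selected work products are verified": ["fully"],
    "The process is institutionalized as a managed process":
        ["fully", "fully", "fully", "fully", "largely",
         "fully", "fully", "not", "fully", "fully"],
}
print(level_achieved(vscs_verification_level2))  # False
```

Note that under this rule a single partially or not implemented practice is enough to sink a goal, which is why projects with 80-plus percent of practices in place still fail to achieve level 2.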
As shown in the overview in table 65, key weaknesses in preparing and conducting peer reviews prevented the other three projects from achieving level 1. None of the four projects satisfied all criteria for the "managing" capability level (level 2). While all four projects had differing weaknesses that contributed to this result, common weaknesses across most of the projects occurred in the areas of monitoring and controlling the verification process and in ensuring quality assurance of the process. As a result of these weaknesses, FAA is exposed to increased risk that the product will not meet the user and mission requirements, increasing the likelihood that projects will not meet cost, schedule, or performance goals. Looked at another way, of the 68 practices we evaluated in this process area, FAA projects had 51 practices that were fully or largely implemented and 17 practices that were partially or not implemented. Figure 11: Four Projects' Capability Levels in Verification: [See PDF for image] [End of figure] Table 65: Four Projects' Appraisal Results in Verification: Goal: Preparation for verification is conducted; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Select the work products to be verified and the verification methods that will be used for each; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Goal: Peer reviews are performed on selected work products; VSCS: Goal satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Prepare for peer reviews of selected work products; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice not implemented. 
Desired practice: Conduct peer reviews on selected work products and identify issues resulting from the peer review; VSCS: Practice largely implemented; ERAM: Practice partially implemented; ITWS: Practice not implemented; ASDE-X: Practice not implemented. Goal: Selected work products are verified against their specified requirements; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Perform verification on the selected work products; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Capability level 1 achieved? VSCS: Yes; ERAM: No; ITWS: No; ASDE-X: No. Goal: Preparation for verification is conducted.[A]; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Establish and maintain the environment needed to support verification; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Goal: Peer reviews are performed on selected work products.[A]; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Analyze data about preparation, conduct, and results of the peer reviews; VSCS: Practice partially implemented; ERAM: Practice not implemented; ITWS: Practice not implemented; ASDE-X: Practice not implemented. Goal: Selected work products are verified against their specified requirements.[A]; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Analyze the results of all verification activities and identify corrective action; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. 
Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the verification process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the verification process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Desired practice: Provide adequate resources for performing the verification process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the verification process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the verification process as needed; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the verification process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. 
Desired practice: Identify and involve the relevant stakeholders of the verification process as planned; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the verification process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice not implemented; ASDE-X: Practice not implemented. Desired practice: Objectively evaluate adherence of the verification process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the verification process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. [A] This goal is repeated at both capability levels 1 and 2, but it comprises more advanced practices at level 2. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 66 through 73. Specifically, tables 66 and 67 provide results for VSCS; tables 68 and 69 provide results for ERAM; tables 70 and 71 provide results for ITWS; and tables 72 and 73 provide results for ASDE-X. Table 66: VSCS Verification: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. Desired practice: Select the work products to be verified and the verification methods that will be used for each; Rating: Fully implemented. 
Goal: Peer reviews are performed on selected work products; Rating: Satisfied. Desired practice: Prepare for peer reviews of selected work products; Rating: Fully implemented. Desired practice: Conduct peer reviews on selected work products and identify issues resulting from the peer review; Rating: Largely implemented; Comment: The project team usually--but not always--conducts peer reviews on selected work products and identifies issues resulting from the peer review. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Perform verification on the selected work products; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 67: VSCS Verification: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support verification; Rating: Fully implemented. Goal: Peer reviews are performed on selected work products; Rating: Unsatisfied; Comment: The goal is unsatisfied because the practice below is partially implemented. Desired practice: Analyze data about preparation, conduct, and results of the peer reviews; Rating: Partially implemented; Comment: The project team analyzes data about the results of peer reviews, but it does not collect or record data about the preparation or conduct of the peer reviews. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Analyze the results of all verification activities and identify corrective action; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the verification process; Rating: Fully implemented. 
Desired practice: Establish and maintain the plan for performing the verification process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the verification process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the verification process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the verification process as needed; Rating: Largely implemented; Comment: The project team trains the people performing and supporting the process, but does not consistently document individuals' training to ensure that standards are met. Desired practice: Place designated work products of the verification process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the verification process as planned; Rating: Fully implemented. Desired practice: Monitor and control the verification process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the verification process against the plan in order to take appropriate corrective action. Desired practice: Objectively evaluate adherence of the verification process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the verification process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 68: ERAM Verification: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. 
Desired practice: Select the work products to be verified and the verification methods that will be used for each; Rating: Largely implemented; Comment: Although the project team does not verify internally generated documents, it selects other work products to be verified and the verification methods that will be used. Goal: Peer reviews are performed on selected work products; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Prepare for peer reviews of selected work products; Rating: Largely implemented; Comment: The project team is preparing for peer reviews of software; however, other peer reviews for selected work products have not yet been addressed. Desired practice: Conduct peer reviews on selected work products and identify issues resulting from the peer review; Rating: Partially implemented; Comment: The project has awarded a contract to conduct peer reviews, but has not conducted any peer reviews to date. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Perform verification on the selected work products; Rating: Largely implemented; Comment: The project team has performed verification on selected documents delivered under contract deliverable requirements, but not on internal documents. The project has not reached the verification testing phase. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 69: ERAM Verification: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support verification; Rating: Largely implemented; Comment: The project team has established and maintains a generic environment needed to support verification of work products, but detailed environments for verification will be developed and delivered later in the program. 
Goal: Peer reviews are performed on selected work products; Rating: Unsatisfied; Comment: The goal is unsatisfied because the practice below is not implemented. Desired practice: Analyze data about preparation, conduct, and results of the peer reviews; Rating: Not implemented; Comment: The project team has not yet received any software peer review data to analyze. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Analyze the results of all verification activities and identify corrective action; Rating: Largely implemented; Comment: The project team analyzes the results of the verification activities and identifies corrective actions for selected documents delivered under contract deliverable requirements; however, the team does not analyze or identify corrective actions on internal documents. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the verification process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the verification process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the verification process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team has provided most of the resources for performing the verification process, developing the work products, and providing the services of the process; however, the project team does not explicitly describe the resources required for the testing environment. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the verification process; Rating: Fully implemented. 
Desired practice: Train the people performing or supporting the verification process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the verification process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the verification process as planned; Rating: Fully implemented. Desired practice: Monitor and control the verification process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The project team monitors the process by looking at periodic data on change request activity, but it does not always use these data to control the verification process. Desired practice: Objectively evaluate adherence of the verification process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining its quality assurance process, which is expected to objectively evaluate adherence of the verification process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the verification process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 70: ITWS Verification: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. Desired practice: Select the work products to be verified and the verification methods that will be used for each; Rating: Largely implemented; Comment: The project team selects some work products, including contractor documents and software products, to be verified and the verification methods that will be used for each. The project team does not select or verify internally generated documents. 
Goal: Peer reviews are performed on selected work products; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented and the other is not implemented. Desired practice: Prepare for peer reviews of selected work products; Rating: Partially implemented; Comment: The project team has just established, but not yet implemented, a peer review plan that is expected to prepare for the peer review of selected products. Desired practice: Conduct peer reviews on selected work products and identify issues resulting from the peer review; Rating: Not implemented; Comment: The project team has not yet conducted peer reviews on selected work products or identified issues resulting from the peer review. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Perform verification on the selected work products; Rating: Largely implemented; Comment: The project team performs verification on the software and contractor delivered documents; however, it does not verify internally generated work products. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 71: ITWS Verification: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support verification; Rating: Largely implemented; Comment: The project team has established and maintains the environment needed to support verification for the software and contractor products. However, the environment to support the verification of internally generated documents has not been established. Goal: Peer reviews are performed on selected work products; Rating: Unsatisfied; Comment: The goal is unsatisfied because the practice below is not implemented. 
Desired practice: Analyze data about preparation, conduct, and results of the peer reviews; Rating: Not implemented; Comment: The project team has not yet identified how it will analyze data about the preparation, conduct, and results of the peer reviews. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Analyze the results of all verification activities and identify corrective action; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the verification process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the verification process; Rating: Largely implemented; Comment: The project team has established and maintains the plan for performing the verification process for software and contractor delivered work products. However, the plan for internally generated work products has not been established or maintained. Desired practice: Provide adequate resources for performing the verification process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides adequate personnel resources for performing the verification process, developing the work products, and providing the services of the process for software and contractor delivered work products. However, there is no evidence of software resources (tools) for tracking the software verification efforts. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the verification process; Rating: Fully implemented. 
Desired practice: Train the people performing or supporting the verification process as needed; Rating: Partially implemented; Comment: The project team reported that it hires trained contractors. However, the project team does not train the people performing or supporting the verification process as needed. Desired practice: Place designated work products of the verification process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the verification process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the verification process as planned; Rating: Fully implemented. Desired practice: Monitor and control the verification process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: Although the project team monitors the verification progress through the program management review briefings and other mechanisms for documentation, there are no measures used by the project team to monitor or control the process against the plan for performing the process. Desired practice: Objectively evaluate adherence of the verification process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the verification process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the verification process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. 
[End of table] Table 72: ASDE-X Verification: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. Desired practice: Select the work products to be verified and the verification methods that will be used for each; Rating: Largely implemented; Comment: The project team performs verification on the software and contractor delivered documents; however, it does not verify internally generated work products. Goal: Peer reviews are performed on selected work products; Rating: Unsatisfied; Comment: The goal is unsatisfied because the two practices below are not implemented. Desired practice: Prepare for peer reviews of selected work products; Rating: Not implemented; Comment: The project team has not prepared for peer reviews of selected work products. Desired practice: Conduct peer reviews on selected work products and identify issues resulting from the peer review; Rating: Not implemented; Comment: The project team has not conducted peer reviews on selected work products or identified issues resulting from the peer review. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Perform verification on the selected work products; Rating: Largely implemented; Comment: The project team performs verification on the software and contractor-delivered documents; however, it does not verify internally generated work products. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 73: ASDE-X Verification: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for verification is conducted; Rating: Satisfied. 
Desired practice: Establish and maintain the environment needed to support verification; Rating: Largely implemented; Comment: The project team has established and maintains the environment needed to support verification of the software at a high level; however, detailed environments for verification have not been established. In addition, the project team has not yet established an environment to support the verification of internally generated documents. Goal: Peer reviews are performed on selected work products; Rating: Unsatisfied; Comment: The goal is unsatisfied because the practice below is not implemented. Desired practice: Analyze data about preparation, conduct, and results of the peer reviews; Rating: Not implemented; Comment: The project team has not analyzed data about preparation, conduct, and results of the peer reviews. Goal: Selected work products are verified against their specified requirements; Rating: Satisfied. Desired practice: Analyze the results of all verification activities and identify corrective action; Rating: Largely implemented; Comment: The project team analyzes the results of the software and contractor-delivered documentation verification activities and identifies corrective actions needed. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented and another is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the verification process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the verification process; Rating: Largely implemented; Comment: The project team has established and maintains the plan for performing the verification process for software and contractor delivered work products. However, the plan for internally generated work products has not been established or maintained. 
Desired practice: Provide adequate resources for performing the verification process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides adequate personnel resources for performing the verification process, developing the work products, and providing the services of the process for software and other work products. However, the project team did not provide evidence of adequate software resources for performing the verification process. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the verification process; Rating: Largely implemented; Comment: The project team has assigned responsibility for the verification process to various groups, but it does not assign duties to individuals. Desired practice: Train the people performing or supporting the verification process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the verification process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated work products of the process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the verification process as planned; Rating: Fully implemented. Desired practice: Monitor and control the verification process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the verification process against the plan for performing the process or take appropriate corrective action. 
Desired practice: Objectively evaluate adherence of the verification process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the verification process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the verification process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 10: FAA Is Performing Validation Practices, but It Is Not Yet Fully Managing the Process: The purpose of validation is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment. Validation activities are vital to ensuring that the products are suitable for use in their intended operating environment. As shown in figure 12, all four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. None of the four projects satisfied all criteria for the "managing" capability level (level 2). While all four projects had differing weaknesses that contributed to this result, common weaknesses across most of the projects occurred in the areas of monitoring and controlling the validation process and in ensuring quality assurance of the process, as shown in the overview in table 74. As a result of these weaknesses, FAA is exposed to increased risk that the projects' products will not fulfill their intended use, thereby increasing the likelihood that the projects will not meet cost, schedule, or performance goals. Looked at another way, of the 56 practices we evaluated in this process area, FAA projects had 47 practices that were fully or largely implemented and 9 practices that were partially or not implemented. 
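The 56/47/9 tally above can be checked directly against the per-project ratings in table 74. The following sketch is illustrative only, not GAO's appraisal tooling; the practice names are abbreviated, and each four-character string transcribes the VSCS, ERAM, ITWS, and ASDE-X ratings from the table:

```python
# Tally of the 56 validation practices (14 practices x 4 projects) from
# table 74. Codes: F = fully, L = largely, P = partially, N = not implemented.
# Rating order in each string: VSCS, ERAM, ITWS, ASDE-X.
ratings = {
    "Select products/components":  "FFFF",
    "Perform validation":          "FLFF",
    "Analyze validation results":  "FLFF",
    "Establish environment":       "FLFL",
    "Organizational policy":       "FFFF",
    "Plan for the process":        "FFFF",
    "Adequate resources":          "FLLL",
    "Assign responsibility":       "FFFL",
    "Train people":                "LFPF",
    "Configuration management":    "FFPL",
    "Involve stakeholders":        "LLFF",
    "Monitor and control":         "NNNN",
    "Quality assurance":           "FPPP",
    "Review with management":      "FLFF",
}

flat = "".join(ratings.values())            # one character per project-practice pair
implemented = sum(flat.count(c) for c in "FL")
weak = sum(flat.count(c) for c in "PN")
print(len(flat), implemented, weak)         # prints: 56 47 9
```

The output matches the chapter's figures: 56 practices evaluated, 47 fully or largely implemented, 9 partially or not implemented.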
Figure 12: Four Projects' Capability Levels in Validation: [See PDF for image] [End of figure] Table 74: Four Projects' Appraisal Results in Validation: Goal: Preparation for validation is conducted; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Select products and product components to be validated and the validation methods that will be used for each; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: The product or product components are validated to ensure that they are suitable for use in their intended operating environment; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Perform validation on the selected products and product components; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Analyze the results of the validation activities and identify issues; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: Yes; ERAM: Yes; ITWS: Yes; ASDE-X: Yes. Goal: Preparation for validation is conducted.[A]; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Establish and maintain the environment needed to support validation; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. 
Desired practice: Establish and maintain an organizational policy for planning and performing the validation process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the validation process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the validation process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the validation process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the validation process as needed; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the validation process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the validation process as planned; VSCS: Practice largely implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Monitor and control the validation process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice not implemented; ITWS: Practice not implemented; ASDE-X: Practice not implemented. Desired practice: Objectively evaluate adherence of the validation process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the validation process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. [A] This goal is repeated at both levels 1 and 2, but it requires a more advanced practice at level 2. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 75 through 82. Specifically, tables 75 and 76 provide results for VSCS; tables 77 and 78 provide results for ERAM; tables 79 and 80 provide results for ITWS; and tables 81 and 82 provide results for ASDE-X. Table 75: VSCS Validation: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. Desired practice: Select products and product components to be validated and the validation methods that will be used for each; Rating: Fully implemented. Goal: The product or product components are validated to ensure that they are suitable for use in their intended operating environment; Rating: Satisfied. Desired practice: Perform validation on the selected products and product components; Rating: Fully implemented. 
Desired practice: Analyze the results of the validation activities and identify issues; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 76: VSCS Validation: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support validation; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the validation process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the validation process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the validation process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the validation process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the validation process as needed; Rating: Largely implemented; Comment: The project team generally trains the people performing and supporting the process but does not consistently document individuals' training to ensure standards are met. Desired practice: Place designated work products of the validation process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the validation process as planned; Rating: Fully implemented; Comment: The project team identifies and involves the relevant stakeholders of the validation process as planned. 
Desired practice: Monitor and control the validation process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the validation process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the validation process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the validation process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 77: ERAM Validation: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. Desired practice: Select products and product components to be validated and the validation methods that will be used for each; Rating: Fully implemented. Goal: The product or product components are validated to ensure that they are suitable for use in their intended operating environment; Rating: Satisfied. Desired practice: Perform validation on the selected products and product components; Rating: Largely implemented; Comment: The project team has started the validation on the selected product and product components through extensive validation plans. However, the project is not yet in the validation phase. Desired practice: Analyze the results of the validation activities and identify issues; Rating: Largely implemented; Comment: The project team has started to analyze the results of the validation activities and identify issues. However, the project is not yet in the validation phase. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 78: ERAM Validation: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. 
Desired practice: Establish and maintain the environment needed to support validation; Rating: Largely implemented; Comment: The project team establishes a general environment needed to support validation of work products. However, detailed environments have not yet been developed. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the validation process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the validation process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the validation process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project has provided most of the resources for performing the validation process, developing the work products, and providing the services of the process; however, the project does not explicitly describe the resources required for the testing environment. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the validation process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the validation process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the validation process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the validation process as planned; Rating: Largely implemented; Comment: The project team has identified and plans to involve the relevant stakeholders of the validation process, but it has not yet done so. 
Desired practice: Monitor and control the validation process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control its validation planning process. Desired practice: Objectively evaluate adherence of the validation process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining its quality assurance process, which is expected to objectively evaluate adherence of the validation process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the validation process with higher level management, and resolve issues; Rating: Largely implemented; Comment: The project team reviews test planning activities for the validation process with higher level management; however, the validation process has not yet reached the stage where validation issues need to be addressed and resolved. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 79: ITWS Validation: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. Desired practice: Select products and product components to be validated and the validation methods that will be used for each; Rating: Fully implemented. Goal: The product or product components are validated to ensure that they are suitable for use in their intended operating environment; Rating: Satisfied. Desired practice: Perform validation on the selected products and product components; Rating: Fully implemented. Desired practice: Analyze the results of the validation activities and identify issues; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. 
[End of table] Table 80: ITWS Validation: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support validation; Rating: Fully implemented. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented, and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the validation process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the validation process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the validation process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides adequate personnel resources for performing the validation process, developing the work products, and providing the services of the process. However, the team lacks the software resources (tools) to track the software validation efforts. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the validation process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the validation process as needed; Rating: Partially implemented; Comment: The project team reported that it hires trained contractors. However, the project team does not train the people performing or supporting the validation process as needed. 
Desired practice: Place designated work products of the validation process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the validation process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have access to the system. Desired practice: Identify and involve the relevant stakeholders of the validation process as planned; Rating: Fully implemented. Desired practice: Monitor and control the validation process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: Although the project team monitors the validation progress through the program management review briefings, there are no measures used by the project team to monitor or control the process against the plan for performing the process or to take appropriate corrective action. Desired practice: Objectively evaluate adherence of the validation process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the validation process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the validation process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 81: ASDE-X Validation: Detailed Findings on Level 1 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. Desired practice: Select products and product components to be validated and the validation methods that will be used for each; Rating: Fully implemented. 
Goal: The product or product components are validated to ensure that they are suitable for use in their intended operating environment; Rating: Satisfied. Desired practice: Perform validation on the selected products and product components; Rating: Fully implemented. Desired practice: Analyze the results of the validation activities and identify issues; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 82: ASDE-X Validation: Detailed Findings on Level 2 Goals and Practices: Goal: Preparation for validation is conducted; Rating: Satisfied. Desired practice: Establish and maintain the environment needed to support validation; Rating: Largely implemented; Comment: The project team has established and maintains the environment needed to support validation of the software at a high level; however, detailed environments for validation are not yet established. Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented and another is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the validation process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the validation process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the validation process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides adequate personnel resources for performing the validation process, developing the work products, and providing the services of the process. However, the project team did not provide evidence of software resources. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the validation process; Rating: Largely implemented; Comment: The project team has assigned responsibility and authority for performing the process, developing the work products, and providing the services of the validation process; however, it does not assign duties to individuals. Desired practice: Train the people performing or supporting the validation process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the validation process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated work products of the process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the validation process as planned; Rating: Fully implemented. Desired practice: Monitor and control the validation process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the validation process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the validation process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the validation process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the validation process with higher level management, and resolve issues; Rating: Fully implemented. 
Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 11: FAA Is Performing Most Configuration Management Practices, but It Is Not Yet Fully Managing the Process: The purpose of configuration management is to establish and maintain the integrity of work products. This process area includes both the functional processes used to establish and track work product changes and the technical systems used to manage these changes. Through configuration management, accurate status and data are provided to developers, end users, and customers. As shown in figure 13, three of the four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. The fourth project would have achieved level 1 if it had performed two more practices (see the overview in table 83 for details). Only one of the four projects satisfied all criteria for the "managing" capability level (level 2). While the other three projects had differing weaknesses that contributed to this result, common weaknesses across some of these projects occurred in the areas of monitoring and controlling the process and of objectively evaluating adherence of the configuration management process to its process description, standards, and procedures, as shown in the overview in table 83. As a result of these weaknesses, FAA is exposed to increased risk that the project teams will not effectively manage their work products, resulting in projects that do not meet cost, schedule, or performance goals. Looked at another way, of the 68 practices we evaluated in this process area, FAA projects had 60 practices that were fully or largely implemented and 8 practices that were partially or not implemented. 
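The core configuration management controls the appraisal looks for, tracking change requests, permitting changes to configuration items only through approved requests, and keeping released baselines as immutable snapshots, can be sketched in a few lines of code. The sketch below is purely illustrative; it is not FAA's or any contractor's actual system, and all class, method, and item names are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ChangeRequest:
    """A tracked request to modify a configuration item."""
    cr_id: int
    item: str
    description: str
    status: str = "open"  # open -> approved -> closed


class ConfigurationManager:
    """Minimal change-management system: items, change requests, baselines."""

    def __init__(self):
        self.items = {}       # item name -> current version number
        self.requests = {}    # cr_id -> ChangeRequest
        self.baselines = {}   # baseline name -> frozen snapshot of versions

    def add_item(self, name):
        """Identify a configuration item and place it under management."""
        self.items[name] = 1

    def submit_request(self, cr_id, item, description):
        """Track change requests for the configuration items."""
        self.requests[cr_id] = ChangeRequest(cr_id, item, description)

    def approve(self, cr_id):
        self.requests[cr_id].status = "approved"

    def apply_change(self, cr_id):
        """Control changes: only approved requests may alter an item."""
        cr = self.requests[cr_id]
        if cr.status != "approved":
            raise PermissionError(f"change request {cr.cr_id} is not approved")
        self.items[cr.item] += 1
        cr.status = "closed"

    def create_baseline(self, name):
        """Release a baseline as an immutable snapshot of item versions."""
        self.baselines[name] = dict(self.items)
```

In this sketch, the ITWS weakness noted above (all support contractors having full access) would correspond to callers bypassing `apply_change` and editing `items` directly; the control exists only if access to the underlying store is restricted.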
Figure 13: Four Projects' Capability Levels in Configuration Management: [See PDF for image] [End of figure] Table 83: Four Projects' Appraisal Results in Configuration Management: Goal: Baselines of identified work products are established; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal not satisfied; ASDE-X: Goal satisfied. Desired practice: Identify the configuration items, components, and related work products that will be placed under configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain a configuration management and change management system for controlling work products; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Create or release baselines for internal use and for delivery to the customer; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Goal: Changes to the work products under configuration management are tracked and controlled; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal not satisfied; ASDE-X: Goal satisfied. Desired practice: Track change requests for the configuration items; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Control changes to the configuration items; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice not implemented; ASDE-X: Practice fully implemented. Goal: Integrity of baselines is established and maintained; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal satisfied; ASDE-X: Goal satisfied. 
Desired practice: Establish and maintain records describing configuration items; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Perform configuration audits to maintain integrity of the configuration baselines; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: Yes; ERAM: Yes; ITWS: No; ASDE-X: Yes. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the configuration management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the configuration management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the configuration management process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the configuration management process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Train the people performing or supporting the configuration management process as needed; VSCS: Practice largely implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the configuration management process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the configuration management process as planned; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the configuration management process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice fully implemented; ITWS: Practice not implemented; ASDE-X: Practice fully implemented. Desired practice: Objectively evaluate adherence of the configuration management process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Review the activities, status, and results of the configuration management process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: Yes. Sources: GAO, SEI. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 84 through 91. 
Specifically, tables 84 and 85 provide results for VSCS; tables 86 and 87 provide results for ERAM; tables 88 and 89 provide results for ITWS; and tables 90 and 91 provide results for ASDE-X. Table 84: VSCS Configuration Management: Detailed Findings on Level 1 Goals and Practices: Goal: Baselines of identified work products are established; Rating: Satisfied. Desired practice: Identify the configuration items, components, and related work products that will be placed under configuration management; Rating: Fully implemented. Desired practice: Establish and maintain a configuration management and change management system for controlling work products; Rating: Fully implemented. Desired practice: Create or release baselines for internal use and for delivery to the customer; Rating: Fully implemented. Goal: Changes to the work products under configuration management are tracked and controlled; Rating: Satisfied. Desired practice: Track change requests for the configuration items; Rating: Fully implemented. Desired practice: Control changes to the configuration items; Rating: Fully implemented. Goal: Integrity of baselines is established and maintained; Rating: Satisfied. Desired practice: Establish and maintain records describing configuration items; Rating: Fully implemented. Desired practice: Perform configuration audits to maintain integrity of the configuration baselines; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 85: VSCS Configuration Management: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the configuration management process; Rating: Fully implemented. 
Desired practice: Establish and maintain the plan for performing the configuration management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the configuration management process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the configuration management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the configuration management process as needed; Rating: Largely implemented; Comment: The project team generally trains the people performing and supporting the configuration management process, but does not document individuals' training to ensure standards are met. Desired practice: Place designated work products of the configuration management process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the configuration management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the configuration management process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the configuration management process against the plan for performing the process. Desired practice: Objectively evaluate adherence of the configuration management process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the configuration management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. 
[End of table] Table 86: ERAM Configuration Management: Detailed Findings on Level 1 Goals and Practices: Goal: Baselines of identified work products are established; Rating: Satisfied. Desired practice: Identify the configuration items, components, and related work products that will be placed under configuration management; Rating: Fully implemented. Desired practice: Establish and maintain a configuration management and change management system for controlling work products; Rating: Fully implemented. Desired practice: Create or release baselines for internal use and for delivery to the customer; Rating: Largely implemented; Comment: The project team generally creates or releases baselines for internal use and for delivery to the customer; however, the team does not yet have baselines for the current project. Goal: Changes to the work products under configuration management are tracked and controlled; Rating: Satisfied. Desired practice: Track change requests for the configuration items; Rating: Fully implemented. Desired practice: Control changes to the configuration items; Rating: Fully implemented. Goal: Integrity of baselines is established and maintained; Rating: Satisfied. Desired practice: Establish and maintain records describing configuration items; Rating: Fully implemented. Desired practice: Perform configuration audits to maintain integrity of the configuration baselines; Rating: Largely implemented; Comment: The project team performs configuration audits to maintain integrity of the configuration baselines; however, the current project has not yet been subject to audits. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 87: ERAM Configuration Management: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. 
Desired practice: Establish and maintain an organizational policy for planning and performing the configuration management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the configuration management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the configuration management process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the configuration management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the configuration management process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the configuration management process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the configuration management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the configuration management process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the configuration management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining its quality assurance process, which is expected to objectively evaluate adherence of the configuration management process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the configuration management process with higher level management, and resolve issues; Rating: Fully implemented. 
Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 88: ITWS Configuration Management: Detailed Findings on Level 1 Goals and Practices: Goal: Baselines of identified work products are established; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Identify the configuration items, components, and related work products that will be placed under configuration management; Rating: Fully implemented. Desired practice: Establish and maintain a configuration management and change management system for controlling work products; Rating: Partially implemented; Comment: The project team has established and maintains a configuration management system for controlling work products; however, the system does not have adequate controls in place because all support contractors have full access to the system. Desired practice: Create or release baselines for internal use and for delivery to the customer; Rating: Fully implemented. Goal: Changes to the work products under configuration management are tracked and controlled; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is not implemented. Desired practice: Track change requests for the configuration items; Rating: Fully implemented. Desired practice: Control changes to the configuration items; Rating: Not implemented; Comment: The project team does not control all changes to the configuration items. Goal: Integrity of baselines is established and maintained; Rating: Satisfied. Desired practice: Establish and maintain records describing configuration items; Rating: Fully implemented. Desired practice: Perform configuration audits to maintain integrity of the configuration baselines; Rating: Fully implemented. Capability level 1 not achieved. Sources: GAO, SEI. 
[End of table] Table 89: ITWS Configuration Management: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the configuration management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the configuration management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the configuration management process, developing the work products, and providing the services of the process; Rating: Partially implemented; Comment: The project team provides some resources, such as staffing, for performing the configuration management process, developing the work products, and providing the services of the process. However, the team does not provide sufficient software tools to support the configuration management process. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the configuration management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the configuration management process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the configuration management process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the configuration management process under configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. 
Desired practice: Identify and involve the relevant stakeholders of the configuration management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the configuration management process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the configuration management process against the plan for performing the process or take appropriate corrective action. The only metrics that the team records are start and stop dates of the change proposal process. Desired practice: Objectively evaluate adherence of the configuration management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the configuration management process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the configuration management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 90: ASDE-X Configuration Management: Detailed Findings on Level 1 Goals and Practices: Goal: Baselines of identified work products are established; Rating: Satisfied. Desired practice: Identify the configuration items, components, and related work products that will be placed under configuration management; Rating: Fully implemented. Desired practice: Establish and maintain a configuration management and change management system for controlling work products; Rating: Fully implemented. Desired practice: Create or release baselines for internal use and for delivery to the customer; Rating: Fully implemented. 
Goal: Changes to the work products under configuration management are tracked and controlled; Rating: Satisfied. Desired practice: Track change requests for the configuration items; Rating: Fully implemented. Desired practice: Control changes to the configuration items; Rating: Fully implemented. Goal: Integrity of baselines is established and maintained; Rating: Satisfied. Desired practice: Establish and maintain records describing configuration items; Rating: Fully implemented. Desired practice: Perform configuration audits to maintain integrity of the configuration baselines; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 91: ASDE-X Configuration Management: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the configuration management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the configuration management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the configuration management process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the configuration management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the configuration management process as needed; Rating: Fully implemented. 
Desired practice: Place designated work products of the configuration management process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated contractor-related work products of the configuration management process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the configuration management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the configuration management process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the configuration management process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the configuration management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 12: FAA Is Not Performing Key Process and Product Quality Assurance Practices or Managing the Process: The purpose of process and product quality assurance is to provide staff and management with objective insights into processes and associated work products. This process area includes the objective evaluation of project processes and products against approved descriptions and standards. Through process and product quality assurance, the project is able to identify and document noncompliance issues and provide appropriate feedback to project members. As shown in figure 14, only one of the four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. 
Weaknesses in the objective evaluation of designated performed processes, work products, and services against the applicable process descriptions, standards, and procedures prevented the other three projects from achieving level 1. None of the four projects satisfied all criteria for the "managing" capability level (level 2). Table 92 provides an overview of our appraisal results. As shown in the table, while the four projects had differing weaknesses that contributed to this result, common weaknesses across multiple projects occurred in the areas of establishing a plan, providing resources, training people, placing work products under configuration management, identifying stakeholders, monitoring and controlling the process, objectively evaluating adherence of the process, and reviewing the status of the quality assurance process with higher level management. As a result of these weaknesses, FAA is exposed to increased risk that the projects will not effectively implement key management processes, resulting in projects that do not meet cost, schedule, or performance goals or mission needs. Looked at another way, of the 56 practices we evaluated in this process area, FAA projects had 33 practices that were fully or largely implemented and 23 practices that were partially or not implemented. Figure 14: Four Projects' Capability Levels in Process and Product Quality Assurance: [See PDF for image] [End of figure] Table 92: Four Projects' Appraisal Results in Process and Product Quality Assurance: Goal: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated; VSCS: Goal satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. 
Desired practice: Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Goal: Noncompliance issues are objectively tracked and communicated, and resolution is ensured; VSCS: Goal satisfied; ERAM: Goal satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers; VSCS: Practice largely implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Establish and maintain records of the quality assurance activities; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Capability level 1 achieved? VSCS: Yes; ERAM: No; ITWS: No; ASDE-X: No. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the process and product quality assurance process; VSCS: Practice fully implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. 
Desired practice: Establish and maintain the plan for performing the process and product quality assurance process; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Provide adequate resources for performing the process and product quality assurance process, developing the work products, and providing the services of the process; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice partially implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the process and product quality assurance process as needed; VSCS: Practice largely implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the process and product quality assurance process under appropriate levels of configuration management; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the process and product quality assurance process as planned; VSCS: Practice largely implemented; ERAM: Practice largely implemented; ITWS: Practice largely implemented; ASDE-X: Practice partially implemented. 
Desired practice: Monitor and control the process and product quality assurance process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice not implemented; ITWS: Practice not implemented; ASDE-X: Practice not implemented. Desired practice: Objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice fully implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the process and product quality assurance process with higher level management, and resolve issues; VSCS: Practice fully implemented; ERAM: Practice largely implemented; ITWS: Practice fully implemented; ASDE-X: Practice partially implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 93 through 100. Specifically, tables 93 and 94 provide results for VSCS; tables 95 and 96 provide results for ERAM; tables 97 and 98 provide results for ITWS; and tables 99 and 100 provide results for ASDE-X. Table 93: VSCS Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices: Goal: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated; Rating: Satisfied. Desired practice: Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures; Rating: Fully implemented. 
Desired practice: Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures; Rating: Fully implemented. Goal: Noncompliance issues are objectively tracked and communicated, and resolution is ensured; Rating: Satisfied. Desired practice: Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers; Rating: Largely implemented; Comment: The project team communicates quality issues and usually--but not always--ensures the resolution of noncompliance issues. Desired practice: Establish and maintain records of the quality assurance activities; Rating: Fully implemented. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 94: VSCS Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the process and product quality assurance process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the process and product quality assurance process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the process and product quality assurance process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process; Rating: Fully implemented. 
Desired practice: Train the people performing or supporting the process and product quality assurance process as needed; Rating: Largely implemented; Comment: The project team generally trains the people performing and supporting the process and product quality assurance process, but it does not document individuals' training to ensure standards are met. Desired practice: Place designated work products of the process and product quality assurance process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the process and product quality assurance process as planned; Rating: Largely implemented; Comment: The project team has identified and involves the relevant stakeholders of the process and product quality assurance process, but it does not track noncompliance issues to closure. Desired practice: Monitor and control the process and product quality assurance process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the process and product quality assurance process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the process and product quality assurance process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. 
[End of table] Table 95: ERAM Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices: Goal: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated; Rating: Unsatisfied; Comment: The goal is unsatisfied because the two practices below are partially implemented. Desired practice: Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures; Rating: Partially implemented; Comment: The project team has just started defining a quality assurance process, which is expected to objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures. Desired practice: Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures; Rating: Partially implemented; Comment: The project team has just started defining a quality assurance process, which is expected to objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures. Goal: Noncompliance issues are objectively tracked and communicated, and resolution is ensured; Rating: Satisfied. Desired practice: Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers; Rating: Largely implemented; Comment: The project team communicates quality issues and ensures resolution of noncompliance issues with the staff and managers; however, the team has just started defining a quality assurance process. Desired practice: Establish and maintain records of the quality assurance activities; Rating: Largely implemented; Comment: The project team has established and maintains records of the quality assurance activities; however, the team has just started defining a quality assurance process. 
Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 96: ERAM Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because two of the practices below are partially implemented, and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the process and product quality assurance process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the process and product quality assurance process; Rating: Largely implemented; Comment: The project team has just started defining the quality assurance process and has established and maintains the plan for performing the process. Desired practice: Provide adequate resources for performing the process and product quality assurance process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides adequate resources for performing the process, developing the work products, and providing the services of the process; however, the team has just started defining the quality assurance process. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process. This function should be performed by an independent party; Rating: Largely implemented; Comment: The project team has assigned responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process; however, the responsible party is not independent of the project team.
Desired practice: Train the people performing or supporting the process and product quality assurance process as needed; Rating: Largely implemented; Comment: The project team trains the people performing or supporting the process as needed; however, the team has just started defining the quality assurance process. Desired practice: Place designated work products of the process and product quality assurance process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team has just started defining a process that is expected to identify the work products of the quality assurance process that are to be placed under configuration control. Desired practice: Identify and involve the relevant stakeholders of the process and product quality assurance process as planned; Rating: Largely implemented; Comment: The project team has identified and involves the relevant stakeholders as planned; however, the team has just started defining a quality assurance process. Desired practice: Monitor and control the process and product quality assurance process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: Because the project team has just started defining a quality assurance process, there are no provisions in the current plan to monitor and control the process. Desired practice: Objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining a quality assurance process that is expected to objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures and to address noncompliance. 
Desired practice: Review the activities, status, and results of the process and product quality assurance process with higher level management, and resolve issues; Rating: Largely implemented; Comment: The project team reviews the activities, status, and results of the process with higher level management and resolves issues; however, the team has just started defining a quality assurance process. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 97: ITWS Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices: Goal: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated; Rating: Unsatisfied; Comment: The goal is unsatisfied because the two practices below are partially implemented. Desired practice: Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures; Rating: Partially implemented; Comment: The project team is planning, but has not yet implemented, a quality assurance process that is expected to objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures. Desired practice: Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures; Rating: Partially implemented; Comment: The project team is planning, but has not yet implemented, a quality assurance process that is expected to objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures. Goal: Noncompliance issues are objectively tracked and communicated, and resolution is ensured; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. 
Desired practice: Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers; Rating: Partially implemented; Comment: The project team communicates quality issues and ensures resolution of noncompliance issues with staff and managers. However, the actions are not centrally managed in the project action database. Desired practice: Establish and maintain records of the quality assurance activities; Rating: Largely implemented; Comment: The project team has established and maintains records of the quality assurance activities; however, quality assurance activities are limited. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 98: ITWS Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because four of the practices below are partially implemented, and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the process and product quality assurance process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the process and product quality assurance process; Rating: Partially implemented; Comment: The project team has just started updating its quality assurance plan for performing the process and product quality assurance process. However, the plan remains in draft. Desired practice: Provide adequate resources for performing the process and product quality assurance process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team has provided adequate personnel resources for performing the process and product quality assurance process, developing the work products, and providing the services of the process.
However, the team currently has no automated tools to assist in performing the process. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process; Rating: Largely implemented; Comment: The project team has assigned responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process; however, the quality assurance point of contact is located within the project team and is not sufficiently objective. Desired practice: Train the people performing or supporting the process and product quality assurance process as needed; Rating: Partially implemented; Comment: The project team trains the personnel supporting the process and product quality assurance process in process improvement, but not in product quality assurance. Desired practice: Place designated work products of the process and product quality assurance process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the process and product quality assurance process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the process and product quality assurance process as planned; Rating: Largely implemented; Comment: While the project team involves stakeholders in the process and product quality assurance process, the team did not identify a complete list of stakeholders. 
Desired practice: Monitor and control the process and product quality assurance process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor or control the process and product quality assurance process against the plan for performing the process or take appropriate corrective action. Desired practice: Objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the process and product quality assurance process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 99: ASDE-X Process and Product Quality Assurance: Detailed Findings on Level 1 Goals and Practices: Goal: Adherence of the performed process and associated work products and services to applicable process descriptions, standards, and procedures is objectively evaluated; Rating: Unsatisfied; Comment: The goal is unsatisfied because the two practices below are partially implemented. Desired practice: Objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures; Rating: Partially implemented; Comment: The project team has just started updating a quality assurance process that is expected to objectively evaluate the designated performed processes against the applicable process descriptions, standards, and procedures. 
Desired practice: Objectively evaluate the designated work products and services against the applicable process descriptions, standards, and procedures; Rating: Partially implemented; Comment: The project team has just started updating a quality assurance process that is expected to objectively evaluate the designated work products against the applicable process descriptions, standards, and procedures. Goal: Noncompliance issues are objectively tracked and communicated, and resolution is ensured; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Communicate quality issues and ensure resolution of noncompliance issues with the staff and managers; Rating: Partially implemented; Comment: The project team has just started updating a quality assurance process that is expected to communicate quality issues and ensure resolution of noncompliance issues with the staff and managers. Desired practice: Establish and maintain records of the quality assurance activities; Rating: Largely implemented; Comment: The project team has established and maintains records of the quality assurance activities; however, quality assurance activities are limited. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 100: ASDE-X Process and Product Quality Assurance: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because five of the practices below are partially implemented, and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the process and product quality assurance process; Rating: Fully implemented. 
Desired practice: Establish and maintain the plan for performing the process and product quality assurance process; Rating: Partially implemented; Comment: The project team has just started defining a quality assurance process and has begun to establish and maintain the plan for performing the process. Desired practice: Provide adequate resources for performing the process and product quality assurance process, developing the work products, and providing the services of the process; Rating: Partially implemented; Comment: The project team has provided adequate personnel resources for performing the process, developing the work products, and providing the services of the process. However, the project team has just started defining a quality assurance process and currently has no tools to assist in performing the process. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the process and product quality assurance process; Rating: Largely implemented; Comment: The project team has assigned responsibility for the process and product quality assurance process to various groups, but it does not assign duties to individuals. Desired practice: Train the people performing or supporting the process and product quality assurance process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the process and product quality assurance process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated work products of the process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. 
Desired practice: Identify and involve the relevant stakeholders of the process and product quality assurance process as planned; Rating: Partially implemented; Comment: The project team has just started defining a quality assurance process. The project team has identified stakeholders, but it has not yet established formal communication processes that involve the relevant stakeholders. Desired practice: Monitor and control the process and product quality assurance process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team has just started defining a quality assurance process, and there are no provisions in the current plan to monitor and control the process. Desired practice: Objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the process and product quality assurance process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the process and product quality assurance process with higher level management, and resolve issues; Rating: Partially implemented; Comment: The project team has just started defining a quality assurance process and has not established formal communication processes with higher level management. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 13: FAA Is Not Performing Most Measurement and Analysis Practices or Managing the Process: The purpose of measurement and analysis is to develop and sustain a measurement capability that is used to support management information needs. 
This process area includes the specification of measures, data collection and storage, analysis techniques, and the reporting of results. This process allows users to objectively plan and estimate project activities and to identify and resolve potential issues. As shown in figure 15, none of the four FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. Weaknesses in managing and storing measurement data, measurement specifications, and analysis results kept the projects from achieving level 1. Further, none of the four projects satisfied all criteria for the "managing" capability level (level 2). As shown in the overview in table 101, while the four projects had differing weaknesses that contributed to this result, common weaknesses across multiple projects occurred in the areas of establishing an organizational policy, establishing a plan, providing resources, assigning responsibility, training people, managing configurations, identifying stakeholders, monitoring and controlling the process, ensuring quality assurance, and reviewing status with higher level management for the measurement and analysis process. As a result of these weaknesses, FAA is exposed to increased risk that the projects will not have adequate estimates of work metrics or a sufficient view into actual performance. This increases the likelihood that projects will not meet cost, schedule, or performance goals, and that projects will not meet mission needs. Looked at another way, of the 72 practices we evaluated in this process area, FAA projects had 30 practices that were fully or largely implemented and 42 practices that were partially or not implemented.
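The measurement-and-analysis cycle described above (specify measures, obtain and store data, analyze, report) can be sketched in a few lines of Python. This is an illustrative sketch only; the metric name, threshold, and values below are hypothetical examples, not FAA measures or systems.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Measure:
    """A measurement specification plus its stored observations."""
    name: str         # e.g., "defects_per_kloc" (hypothetical metric)
    objective: str    # the information need the measure addresses
    threshold: float  # value above which corrective action is flagged
    observations: list = field(default_factory=list)

    def record(self, value: float) -> None:
        """Obtain and store measurement data under the specification."""
        self.observations.append(value)

    def analyze(self) -> dict:
        """Analyze stored data against the specified threshold."""
        avg = mean(self.observations)
        return {"measure": self.name, "mean": avg,
                "exceeds_threshold": avg > self.threshold}

def report(measures: list) -> list:
    """Report analysis results for every measure with data."""
    return [m.analyze() for m in measures if m.observations]

# Specify a measure, obtain data, then analyze and report.
defects = Measure("defects_per_kloc", "monitor code quality", threshold=5.0)
for value in (3.2, 4.1, 6.8):
    defects.record(value)
findings = report([defects])
```

A project that performs only some of these steps, for example collecting subjective status data without specified measures, as the VSCS findings below describe, has no `Measure` specification to analyze against, which is why the appraisal treats specification, storage, analysis, and reporting as distinct practices.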
Figure 15: Four Projects' Capability Levels in Measurement and Analysis: [See PDF for image] [End of figure] Table 101: Four Projects' Appraisal Results in Measurement and Analysis: Goal: Measurement objectives and activities are aligned with identified information needs and objectives; VSCS: Goal not satisfied; ERAM: Goal satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain measurement objectives that are derived from identified information needs and objectives; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice not implemented; ASDE-X: Practice fully implemented. Desired practice: Specify measures to address the measurement objectives; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Specify how measurement data will be obtained and stored; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Specify how measurement data will be analyzed and reported; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Goal: Measurement results that address identified information needs and objectives are provided; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Obtain specified measurement data; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Analyze and interpret measurement data; VSCS: Practice not implemented; ERAM: Practice partially implemented; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. 
Desired practice: Manage and store measurement data, measurement specifications, and analysis results; VSCS: Practice not implemented; ERAM: Practice partially implemented; ITWS: Practice not implemented; ASDE-X: Practice partially implemented. Desired practice: Report results of measurement and analysis activities to all relevant stakeholders; VSCS: Practice not implemented; ERAM: Practice partially implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Capability level 1 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Goal: The process is institutionalized as a managed process; VSCS: Goal not satisfied; ERAM: Goal not satisfied; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the measurement and analysis process; VSCS: Practice not implemented; ERAM: Practice fully implemented; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the measurement and analysis process; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice not implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the measurement and analysis process; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice not implemented; ASDE-X: Practice largely implemented. 
Desired practice: Train the people performing or supporting the measurement and analysis process as needed; VSCS: Practice not implemented; ERAM: Practice fully implemented; ITWS: Practice not implemented; ASDE-X: Practice largely implemented. Desired practice: Place designated work products of the measurement and analysis process under appropriate levels of configuration management; VSCS: Practice not implemented; ERAM: Practice fully implemented; ITWS: Practice partially implemented; ASDE-X: Practice largely implemented. Desired practice: Identify and involve the relevant stakeholders of the measurement and analysis process as planned; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice not implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the measurement and analysis process against the plan for performing the process, and take appropriate corrective action; VSCS: Practice not implemented; ERAM: Practice not implemented; ITWS: Practice not implemented; ASDE-X: Practice partially implemented. Desired practice: Objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures, and address noncompliance; VSCS: Practice not implemented; ERAM: Practice largely implemented; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Review the activities, status, and results of the measurement and analysis process with higher level management, and resolve issues; VSCS: Practice not implemented; ERAM: Practice partially implemented; ITWS: Practice largely implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: No; ERAM: No; ITWS: No; ASDE-X: No. Sources: GAO, SEI. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 102 through 109. 
Specifically, tables 102 and 103 provide results for VSCS; tables 104 and 105 provide results for ERAM; tables 106 and 107 provide results for ITWS; and tables 108 and 109 provide results for ASDE-X. Table 102: VSCS Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices: Goal: Measurement objectives and activities are aligned with identified information needs and objectives; Rating: Unsatisfied; Comment: The goal is unsatisfied because none of the four practices below is implemented. Desired practice: Establish and maintain measurement objectives that are derived from identified information needs and objectives; Rating: Not implemented; Comment: The project team has not established and does not maintain measurement objectives that are derived from identified information needs and objectives for the project or organization. Desired practice: Specify measures to address the measurement objectives; Rating: Not implemented; Comment: The project team does not specify measures to address the measurement objectives. Desired practice: Specify how measurement data will be obtained and stored; Rating: Not implemented; Comment: The project team does not specify how measurement data will be obtained and stored. The project team periodically collects subjective data on the status of the project. Desired practice: Specify how measurement data will be analyzed and reported; Rating: Not implemented; Comment: The project team does not specify how measurement data will be analyzed and reported. Goal: Measurement results that address identified information needs and objectives are provided; Rating: Unsatisfied; Comment: The goal is unsatisfied because none of the four practices below is implemented. Desired practice: Obtain specified measurement data; Rating: Not implemented; Comment: The project team does not obtain measurement data.
Desired practice: Analyze and interpret measurement data; Rating: Not implemented; Comment: The project team does not have any measurement data to analyze and interpret. Desired practice: Manage and store measurement data, measurement specifications, and analysis results; Rating: Not implemented; Comment: The project team has no measurement data to manage, store, or analyze. Desired practice: Report results of measurement and analysis activities to all relevant stakeholders; Rating: Not implemented; Comment: The project team does not report results of measurement and analysis activities to all relevant stakeholders. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 103: VSCS Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because none of the 10 practices below are implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the measurement and analysis process; Rating: Not implemented; Comment: The organization did not establish and maintain a policy for planning and performing the measurement and analysis process. Desired practice: Establish and maintain the plan for performing the measurement and analysis process; Rating: Not implemented; Comment: The project team does not have an established plan for performing the measurement and analysis process. Desired practice: Provide adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process; Rating: Not implemented; Comment: The project team does not have adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the measurement and analysis process; Rating: Not implemented; Comment: The project team has not assigned responsibility for performing the measurement and analysis process. Desired practice: Train the people performing or supporting the measurement and analysis process as needed; Rating: Not implemented; Comment: The project team has not been trained to perform the measurement and analysis process. Desired practice: Place designated work products of the measurement and analysis process under appropriate levels of configuration management; Rating: Not implemented; Comment: The project team does not conduct measurement and analysis practices. There are no work products generated. Desired practice: Identify and involve the relevant stakeholders of the measurement and analysis process as planned; Rating: Not implemented; Comment: The project team has not identified any stakeholders for the measurement and analysis process. Desired practice: Monitor and control the measurement and analysis process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the measurement and analysis process. There is no tracking to the plan. Desired practice: Objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures, and address noncompliance; Rating: Not implemented; Comment: The project team does not objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures, and it does not address noncompliance. 
Desired practice: Review the activities, status, and results of the measurement and analysis process with higher level management, and resolve issues; Rating: Not implemented; Comment: The project team does not review the activities, status, and results of the measurement and analysis process with higher level management. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 104: ERAM Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices: Goal: Measurement objectives and activities are aligned with identified information needs and objectives; Rating: Satisfied. Desired practice: Establish and maintain measurement objectives that are derived from identified information needs and objectives; Rating: Largely implemented; Comment: The project team establishes and usually--but not always--maintains measurement objectives that are derived from identified information needs and objectives. Desired practice: Specify measures to address the measurement objectives; Rating: Largely implemented; Comment: The project team usually--but not always--specifies measures to address the measurement objectives. Desired practice: Specify how measurement data will be obtained and stored; Rating: Largely implemented; Comment: The project team specifies how measurement data will be obtained and stored for each project area. However, these specifications were only recently identified, and the team is just beginning to implement them. Desired practice: Specify how measurement data will be analyzed and reported; Rating: Largely implemented; Comment: The project team specifies how measurement data will be analyzed and reported for each project area. However, these specifications were only recently identified, and the team is just beginning to implement them. 
Goal: Measurement results that address identified information needs and objectives are provided; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented. Desired practice: Obtain specified measurement data; Rating: Largely implemented; Comment: The project team obtains specified measurement data for each project area. However, these data were only recently identified, and the team is just beginning to obtain them. Desired practice: Analyze and interpret measurement data; Rating: Partially implemented; Comment: The project team does not consistently analyze and interpret measurement data for each project area. Desired practice: Manage and store measurement data, measurement specifications, and analysis results; Rating: Partially implemented; Comment: The project team does not consistently manage and store measurement data, measurement specifications, and analysis results. Desired practice: Report results of measurement and analysis activities to all relevant stakeholders; Rating: Partially implemented; Comment: The project team does not consistently report results of measurement and analysis activities to all relevant stakeholders. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 105: ERAM Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented, and one is not implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the measurement and analysis process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the measurement and analysis process; Rating: Largely implemented; Comment: The project team establishes and usually--but not always--maintains the plan for performing the process. 
Desired practice: Provide adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process. However, these resources were only recently identified. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the measurement and analysis process; Rating: Largely implemented; Comment: The project team assigns responsibility and authority for performing the process, developing the work products, and providing the services of the measurement and analysis process. However, these responsibilities were only recently identified, and the team is just beginning to carry them out. Desired practice: Train the people performing or supporting the measurement and analysis process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the measurement and analysis process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the measurement and analysis process as planned; Rating: Largely implemented; Comment: The project team consistently identifies the relevant stakeholders of the measurement and analysis process; however, the team does not consistently involve the stakeholders in the process. Desired practice: Monitor and control the measurement and analysis process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor and control the measurement and analysis process against the plan for performing the process or take appropriate action. 
Desired practice: Objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures, and address noncompliance; Rating: Largely implemented; Comment: The project team recently evaluated the adherence of the measurement and analysis process to its process description and standards, but it has just started defining its quality assurance process. Desired practice: Review the activities, status, and results of the measurement and analysis process with higher level management, and resolve issues; Rating: Partially implemented; Comment: The project team does not consistently review the activities, status, and results of the measurement and analysis process with higher level management or resolve issues. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 106: ITWS Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices: Goal: Measurement objectives and activities are aligned with identified information needs and objectives; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented, and one is not implemented. Desired practice: Establish and maintain measurement objectives that are derived from identified information needs and objectives; Rating: Not implemented; Comment: The project team has not established or maintained measurement objectives that are derived from identified information needs and objectives. Desired practice: Specify measures to address the measurement objectives; Rating: Partially implemented; Comment: The project team previously specified measures to address measurement objectives; however, the guidance for these measures is no longer used. Desired practice: Specify how measurement data will be obtained and stored; Rating: Partially implemented; Comment: Although the project team is collecting data, it does not have procedures for data collection and storage. 
Desired practice: Specify how measurement data will be analyzed and reported; Rating: Partially implemented; Comment: Although the project team collects and reports data, it has no procedures for analysis of these data. Goal: Measurement results that address identified information needs and objectives are provided; Rating: Unsatisfied; Comment: The goal is unsatisfied because two of the practices below are partially implemented, and one is not implemented. Desired practice: Obtain specified measurement data; Rating: Partially implemented; Comment: Although the project team collects data, the data are not specified or linked to any measurement objectives. Desired practice: Analyze and interpret measurement data; Rating: Partially implemented; Comment: The project team receives project data from the contractor and performs limited analysis on the data. However, the project team does not draw conclusions, conduct additional measurement and analysis as necessary, or refine criteria for future analysis. Desired practice: Manage and store measurement data, measurement specifications, and analysis results; Rating: Not implemented; Comment: The project team does not manage or store measurement data to be used for future analysis. Desired practice: Report results of measurement and analysis activities to all relevant stakeholders; Rating: Largely implemented; Comment: The project team communicates the collected data to all relevant stakeholders. However, the measurement and analysis data collected are limited to overall status, cost, and budget metrics. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 107: ITWS Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented, and five are not implemented. 
Desired practice: Establish and maintain an organizational policy for planning and performing the measurement and analysis process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the measurement and analysis process; Rating: Not implemented; Comment: The project team has not established and does not maintain a plan for performing the measurement and analysis process. Desired practice: Provide adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process; Rating: Partially implemented; Comment: The project team has provided some personnel resources for performing the measurement and analysis process, developing the work products, and providing the services of the process. However, the software resources used for measurement and analysis are limited. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the measurement and analysis process; Rating: Not implemented; Comment: The project team has not assigned responsibility and authority for performing the process, developing the work products, or providing the services of the measurement and analysis process. Desired practice: Train the people performing or supporting the measurement and analysis process as needed; Rating: Not implemented; Comment: The project team has not trained people performing or supporting the measurement and analysis process. Desired practice: Place designated work products of the measurement and analysis process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the measurement and analysis process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. 
Desired practice: Identify and involve the relevant stakeholders of the measurement and analysis process as planned; Rating: Not implemented; Comment: The project team has not identified or involved the relevant stakeholders of the measurement and analysis process as planned. Desired practice: Monitor and control the measurement and analysis process against the plan for performing the process, and take appropriate corrective action; Rating: Not implemented; Comment: The project team does not monitor or control the measurement and analysis process. Desired practice: Objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the measurement and analysis process with higher level management, and resolve issues; Rating: Largely implemented; Comment: The project team reviews the activities, status, and results of the measurement and analysis process with higher level management and resolves issues, but it does not collect or use a comprehensive set of metrics. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 108: ASDE-X Measurement and Analysis: Detailed Findings on Level 1 Goals and Practices: Goal: Measurement objectives and activities are aligned with identified information needs and objectives; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain measurement objectives that are derived from identified information needs and objectives; Rating: Fully implemented. 
Desired practice: Specify measures to address the measurement objectives; Rating: Fully implemented. Desired practice: Specify how measurement data will be obtained and stored; Rating: Partially implemented; Comment: Although the project team specifies how measurement data will be obtained, storage procedures for each project area do not provide ready access to the data for analysis and future use. Desired practice: Specify how measurement data will be analyzed and reported; Rating: Largely implemented; Comment: The project team specifies how measurement data for each project area will be analyzed; however, documented reporting procedures are limited. Goal: Measurement results that address identified information needs and objectives are provided; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Obtain specified measurement data; Rating: Fully implemented. Desired practice: Analyze and interpret measurement data; Rating: Fully implemented. Desired practice: Manage and store measurement data, measurement specifications, and analysis results; Rating: Partially implemented; Comment: Although the project team stores measurement data, it does not effectively manage this data. Specifically, the team cannot readily access the data for future use. Desired practice: Report results of measurement and analysis activities to all relevant stakeholders; Rating: Fully implemented. Capability level 1 not achieved. Sources: GAO, SEI. [End of table] Table 109: ASDE-X Measurement and Analysis: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because two of the practices below are partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the measurement and analysis process; Rating: Fully implemented. 
Desired practice: Establish and maintain the plan for performing the measurement and analysis process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the measurement and analysis process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides resources for measurement and analysis activities, including data collection, analysis, and reporting, but it provides only limited resources for data storage and retrieval. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the measurement and analysis process; Rating: Largely implemented; Comment: The project team has responsibility and authority for performing the measurement and analysis process, but it has not assigned these duties to specific individuals. Desired practice: Train the people performing or supporting the measurement and analysis process as needed; Rating: Largely implemented; Comment: The project team performs some training to support the measurement and analysis process, and more training is being implemented. Desired practice: Place designated work products of the measurement and analysis process under appropriate levels of configuration management; Rating: Largely implemented; Comment: The project team places designated work products of the process under appropriate levels of configuration management; however, internal documents have yet to be placed under configuration management control. Desired practice: Identify and involve the relevant stakeholders of the measurement and analysis process as planned; Rating: Fully implemented. 
Desired practice: Monitor and control the measurement and analysis process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: Although the project team stated that it monitors and controls the process against the plan and takes action, the project has not yet resolved deficiencies in data storage, retrieval, and analysis. Desired practice: Objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the measurement and analysis process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the measurement and analysis process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 14: FAA Is Performing Supplier Agreement Management Practices, but It Is Not Yet Fully Managing the Process: The purpose of supplier agreement management is to manage the acquisition of products. This process area involves determining the type of acquisition that will be used for the products acquired; selecting suppliers; establishing, maintaining, and executing agreements; accepting delivery of acquired products; and transitioning acquired products to the project, among other items. For this process area, we did not perform an appraisal for the VSCS or ITWS projects, because these projects were at stages in which supplier agreement management was not applicable. As shown in figure 16, both of the remaining FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. 
One of the two projects satisfied all criteria for the "managing" capability level (level 2). In not consistently managing this process, FAA is exposed to increased risk that projects will not be performed in accordance with contractual requirements, resulting in projects that will not meet cost, schedule, or performance goals, and systems that will not meet mission needs. Looked at another way, of the 34 practices we evaluated in this process area, FAA projects had 33 practices that were fully or largely implemented and 1 practice that was partially implemented. Table 110 provides an overview of the appraisal results. Figure 16: Two Projects' Capability Levels in Supplier Agreement Management: [See PDF for image] [End of figure] Table 110: Two Projects' Appraisal Results in Supplier Agreement Management: Goal: Agreements with the suppliers are established and maintained; VSCS: N/A; ERAM: Goal satisfied; ITWS: N/A; ASDE-X: Goal satisfied. Desired practice: Determine the type of acquisition for each product or product component to be acquired; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Select suppliers based on an evaluation of their ability to meet the specified requirements and established criteria; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain formal agreements with the supplier; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Goal: Agreements with the suppliers are satisfied by both the project and the supplier; VSCS: N/A; ERAM: Goal satisfied; ITWS: N/A; ASDE-X: Goal satisfied. Desired practice: Review candidate commercial off-the-shelf (COTS) products to ensure they satisfy the specified requirements that are covered under a supplier agreement; VSCS: N/A; ERAM: Practice largely implemented; ITWS: N/A; ASDE-X: Practice fully implemented. 
Desired practice: Perform activities with the supplier as specified in the supplier agreement; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Ensure that the supplier agreement is satisfied before accepting the acquired product; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Transition the acquired products from the supplier to the project; VSCS: N/A; ERAM: Practice largely implemented; ITWS: N/A; ASDE-X: Practice largely implemented. Capability level 1 achieved? VSCS: N/A; ERAM: Yes; ITWS: N/A; ASDE-X: Yes. Goal: The process is institutionalized as a managed process; VSCS: N/A; ERAM: Goal not satisfied; ITWS: N/A; ASDE-X: Goal satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the supplier agreement management process; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the supplier agreement management process; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the supplier agreement management process, developing the work products, and providing the services of the process; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the supplier agreement management process; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the supplier agreement management process as needed; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice largely implemented. 
Desired practice: Place designated work products of the supplier agreement management process under appropriate levels of configuration management; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Identify and involve the relevant stakeholders of the supplier agreement management process as planned; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the supplier agreement management process against the plan for performing the process, and take appropriate corrective action; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice largely implemented. Desired practice: Objectively evaluate adherence of the supplier agreement management process to its process description, standards, and procedures, and address noncompliance; VSCS: N/A; ERAM: Practice partially implemented; ITWS: N/A; ASDE-X: Practice largely implemented. Desired practice: Review the activities, status, and results of the supplier agreement management process with higher level management, and resolve issues; VSCS: N/A; ERAM: Practice fully implemented; ITWS: N/A; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: N/A; ERAM: No; ITWS: N/A; ASDE-X: Yes. Sources: GAO, SEI. Note: N/A represents not applicable; project not evaluated in this process. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 111 through 114. Specifically, tables 111 and 112 provide results for ERAM, and tables 113 and 114 provide results for ASDE-X. Table 111: ERAM Supplier Agreement Management: Detailed Findings on Level 1 Goals and Practices: Goal: Agreements with the suppliers are established and maintained; Rating: Satisfied. Desired practice: Determine the type of acquisition for each product or product component to be acquired; Rating: Fully implemented. 
Desired practice: Select suppliers based on an evaluation of their ability to meet the specified requirements and established criteria; Rating: Fully implemented. Desired practice: Establish and maintain formal agreements with the supplier; Rating: Fully implemented. Goal: Agreements with the suppliers are satisfied by both the project and the supplier; Rating: Satisfied. Desired practice: Review candidate COTS products to ensure they satisfy the specified requirements that are covered under a supplier agreement; Rating: Largely implemented; Comment: The project team has developed guidance for the review of COTS products to ensure they satisfy the specified requirements that are covered under a supplier agreement. However, the project team has deferred the selection of the products to later in the acquisition when it is more appropriate to do so. Desired practice: Perform activities with the supplier as specified in the supplier agreement; Rating: Fully implemented. Desired practice: Ensure that the supplier agreement is satisfied before accepting the acquired product; Rating: Fully implemented. Desired practice: Transition the acquired products from the supplier to the project; Rating: Largely implemented; Comment: The project team has plans to transition acquired products from the supplier to the project. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 112: ERAM Supplier Agreement Management: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the supplier agreement management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the supplier agreement management process; Rating: Fully implemented. 
Desired practice: Provide adequate resources for performing the supplier agreement management process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the supplier agreement management process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the supplier agreement management process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the supplier agreement management process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the supplier agreement management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the supplier agreement management process against the plan for performing the process, and take appropriate corrective action; Rating: Fully implemented. Desired practice: Objectively evaluate adherence of the supplier agreement management process to its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team has just started defining its quality assurance process, which is expected to objectively evaluate adherence of the supplier agreement management process to its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the supplier agreement management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 113: ASDE-X Supplier Agreement Management: Detailed Findings on Level 1 Goals and Practices: Goal: Agreements with the suppliers are established and maintained; Rating: Satisfied. 
Desired practice: Determine the type of acquisition for each product or product component to be acquired; Rating: Fully implemented. Desired practice: Select suppliers based on an evaluation of their ability to meet the specified requirements and established criteria; Rating: Fully implemented. Desired practice: Establish and maintain formal agreements with the supplier; Rating: Fully implemented. Goal: Agreements with the suppliers are satisfied by both the project and the supplier; Rating: Satisfied. Desired practice: Review candidate COTS products to ensure they satisfy the specified requirements that are covered under a supplier agreement; Rating: Fully implemented. Desired practice: Perform activities with the supplier as specified in the supplier agreement; Rating: Fully implemented. Desired practice: Ensure that the supplier agreement is satisfied before accepting the acquired product; Rating: Fully implemented. Desired practice: Transition the acquired products from the supplier to the project; Rating: Largely implemented; Comment: The project team is planning to transition the acquired products from the supplier to the project team; however, the transition phase has not yet started. Capability level 1 achieved. Sources: GAO, SEI. [End of table] Table 114: ASDE-X Supplier Agreement Management: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the supplier agreement management process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the supplier agreement management process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the supplier agreement management process, developing the work products, and providing the services of the process; Rating: Fully implemented. 
Desired practice: Assign responsibility and authority for performing the process, developing the work products, and providing the services of the supplier agreement management process; Rating: Largely implemented; Comment: The project team has assigned responsibility for the supplier agreement management process to various groups, but it does not assign duties to individuals. Desired practice: Train the people performing or supporting the supplier agreement management process as needed; Rating: Largely implemented; Comment: The project team has provided training as needed to some but not all of the people performing or supporting the process. Desired practice: Place designated work products of the supplier agreement management process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the supplier agreement management process as planned; Rating: Fully implemented. Desired practice: Monitor and control the supplier agreement management process against the plan for performing the process, and take appropriate corrective action; Rating: Largely implemented; Comment: The team collects measurements and monitors the process against the plan for performing the process; however, the project is not archiving and using the data to control the process beyond the monthly snapshots. Desired practice: Objectively evaluate adherence of the supplier agreement management process to its process description, standards, and procedures, and address noncompliance; Rating: Largely implemented; Comment: The project team objectively evaluates adherence of the products to their requirements and addresses noncompliance through the quality reliability officer. However, the project team has just begun its quality assurance process, which is expected to objectively evaluate adherence of the supplier agreement management process to its process description, standards, and procedures and to address noncompliance. 
Desired practice: Review the activities, status, and results of the supplier agreement management process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 15: FAA Is Performing Deployment, Transition, and Disposal Practices, but It Is Not Yet Fully Managing the Process: The purpose of the deployment, transition, and disposal process area is to place a product or service into an operational environment, transfer it to the customer and to the support organization, and deactivate and dispose of the replaced product or dispense with the service. This process area includes the design and coordination of plans and procedures for placement of a product or service into an operational or support environment and bringing it into operational use. It ensures that an effective support capability is in place to manage, maintain, and modify the supplied product or service. It further ensures the successful transfer of the product or service to the customer/stakeholder and the deactivation and disposition of the replaced capability. For this process area, we did not perform an appraisal for the VSCS or ERAM projects, because these projects were at stages in which deployment was not applicable. As shown in figure 17, both of the remaining FAA projects satisfied all criteria for the "performing" capability level (level 1) in this process area. Neither satisfied all criteria for the "managing" capability level (level 2). As shown in the overview in table 115, while the projects had differing weaknesses that contributed to this result, a common weakness across projects occurred in the area of monitoring and controlling the deployment process. As a result of this weakness, FAA is exposed to increased risk that these projects will not meet their cost, schedule, or performance goals. 
Looked at another way, of the 32 practices we evaluated in this process area, FAA projects had 28 practices that were fully or largely implemented and 4 practices that were partially implemented. Figure 17: Two Projects' Capability Levels in Deployment, Transition, and Disposal: [See PDF for image] [End of figure] Table 115: Two Projects' Appraisal Results in Deployment, Transition, and Disposal: Goal: Customer/stakeholder operation and support facilities and personnel are prepared to accept the delivery, placement, and transition of the product or service into use; customer/stakeholder operation and support organizations demonstrate their capacity to support the product or service upon assumption of responsibility; and the continuity of operational performance is maintained.[A]; VSCS: N/A; ERAM: N/A; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Develop a strategy for deployment, transition, and disposal, and perform activities in accordance with the strategy; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Establish the facility and infrastructure environment to receive and operate the product or service; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Verify that fielded configuration items reflect the product or service baseline, and manage change control; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Demonstrate the ability of the customer/stakeholder support organization to maintain, modify, and support the product or service; VSCS: N/A; ERAM: N/A; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. Desired practice: Transfer the product or service to the customer/stakeholder operation and support organizations; VSCS: N/A; ERAM: N/A; ITWS: Practice largely implemented; ASDE-X: Practice largely implemented. 
Goal: The replaced product or service is deactivated and disposed of and/or dispensed with, as appropriate; VSCS: N/A; ERAM: N/A; ITWS: Goal satisfied; ASDE-X: Goal satisfied. Desired practice: Deactivate and dispose of replaced product and/or dispense with service; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Capability level 1 achieved? VSCS: N/A; ERAM: N/A; ITWS: Yes; ASDE-X: Yes. Goal: The process is institutionalized as a managed process; VSCS: N/A; ERAM: N/A; ITWS: Goal not satisfied; ASDE-X: Goal not satisfied. Desired practice: Establish and maintain an organizational policy for planning and performing the deployment process; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Establish and maintain the plan for performing the deployment process; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Provide adequate resources for performing the deployment process, developing the work products, and providing the services of the process; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Assign responsibility and authority for performing the deployment process, developing the work products, and providing the services of the process; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice largely implemented. Desired practice: Train the people performing or supporting the deployment process as needed; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Place designated work products of the deployment process under appropriate levels of configuration management; VSCS: N/A; ERAM: N/A; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. 
Desired practice: Identify and involve the relevant stakeholders of the deployment process as planned; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Desired practice: Monitor and control the deployment process against the plan for performing the process, and take appropriate corrective action; VSCS: N/A; ERAM: N/A; ITWS: Practice partially implemented; ASDE-X: Practice partially implemented. Desired practice: Objectively evaluate adherence of the deployment process against its process description, standards, and procedures, and address noncompliance; VSCS: N/A; ERAM: N/A; ITWS: Practice partially implemented; ASDE-X: Practice fully implemented. Desired practice: Review the activities, status, and results of the deployment process with higher level management, and resolve issues; VSCS: N/A; ERAM: N/A; ITWS: Practice fully implemented; ASDE-X: Practice fully implemented. Capability level 2 achieved? VSCS: N/A; ERAM: N/A; ITWS: No; ASDE-X: No. Sources: GAO, FAA, and SEI. Note: N/A represents not applicable; project not appraised in this process area. [A] FAA's iCMM model identifies this goal as four separate goals, each with underlying practices. We combined them into one goal for our reporting purposes, because some of the practices supporting individual goals were repeated in other goals and would have skewed our summary results. [End of table] Additional details on each project's appraisal results at successive capability levels are provided in tables 116 through 119. Specifically, tables 116 and 117 provide results for ITWS, and tables 118 and 119 provide results for ASDE-X. 
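The rating roll-up used throughout these tables follows a simple rule: practices are characterized as fully, largely, partially, or not implemented; a goal is satisfied only when every supporting practice is at least largely implemented; and a capability level is achieved only when every goal at that level and all lower levels is satisfied. The following sketch is an illustrative simplification under those assumptions, not the appraisal teams' actual tooling; the ITWS ratings it encodes reflect the deployment findings in tables 116 and 117.

```python
# Illustrative sketch of the appraisal roll-up (an assumption inferred
# from the rating patterns in tables 113-119, not SEI's or FAA's tool).
# Ratings that count toward goal satisfaction:
SATISFYING = {"fully implemented", "largely implemented"}

def goal_satisfied(practice_ratings):
    """A goal is satisfied if every practice is fully or largely implemented."""
    return all(r in SATISFYING for r in practice_ratings)

def capability_level(goals_by_level):
    """Highest level N such that all goals at levels 1..N are satisfied."""
    level = 0
    for n in sorted(goals_by_level):
        if all(goal_satisfied(p) for p in goals_by_level[n]):
            level = n
        else:
            break
    return level

# ITWS deployment, transition, and disposal (tables 116 and 117):
# both level 1 goals are satisfied, but the level 2 goal is not,
# because three of its ten practices are only partially implemented.
itws = {
    1: [["fully implemented"] * 3 + ["largely implemented"] * 2,
        ["fully implemented"]],
    2: [["fully implemented"] * 7 + ["partially implemented"] * 3],
}
print(capability_level(itws))  # 1: level 1 achieved, level 2 not achieved
```

Run against the ASDE-X ratings in tables 118 and 119, the same rule reproduces that project's result as well: its level 2 goal fails on the single partially implemented monitoring practice.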
Table 116: ITWS Deployment, Transition, and Disposal: Detailed Findings on Level 1 Goals and Practices: Goal: Customer/stakeholder operation and support facilities and personnel are prepared to accept the delivery, placement, and transition of the product or service into use; customer/stakeholder operation and support organizations demonstrate their capacity to support the product or service upon assumption of responsibility; and the continuity of operational performance is maintained; Rating: Satisfied. Desired practice: Develop a strategy for deployment, transition, and disposal, and perform activities in accordance with the strategy; Rating: Fully implemented. Desired practice: Establish the facility and infrastructure environment to receive and operate the product or service; Rating: Fully implemented. Desired practice: Verify that fielded configuration items reflect the product or service baseline, and manage change control; Rating: Fully implemented. Desired practice: Demonstrate the ability of the customer/stakeholder support organization to maintain, modify, and support the product or service; Rating: Largely implemented; Comment: The contractor support organization has demonstrated the ability to maintain, modify, and support the product or service and has trained FAA personnel to support the product or service; however, the product or service has not yet been transitioned from the contractor to FAA personnel. Desired practice: Transfer the product or service to the customer/stakeholder operation and support organizations; Rating: Largely implemented; Comment: The project team is planning to transfer the product to the FAA operation and support organization but has not yet completed this activity. Goal: The replaced product or service is deactivated and disposed of and/or dispensed with, as appropriate; Rating: Satisfied. Desired practice: Deactivate and dispose of the replaced product and/or dispense with the replaced service; Rating: Fully implemented. 
Capability level 1 achieved. Sources: GAO, FAA, and SEI. [End of table] Table 117: ITWS Deployment, Transition, and Disposal: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because three of the practices below are partially implemented. Desired practice: Establish and maintain an organizational policy for planning and performing the deployment process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the deployment process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the deployment process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Assign responsibility and authority for performing the deployment process, developing the work products, and providing the services of the process; Rating: Fully implemented. Desired practice: Train the people performing or supporting the deployment process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the deployment process under appropriate levels of configuration management; Rating: Partially implemented; Comment: The project team places designated work products of the deployment process under some levels of configuration management; however, the controls on these products are not adequate because all support contractors have full access to the system. Desired practice: Identify and involve the relevant stakeholders of the deployment process as planned; Rating: Fully implemented. 
Desired practice: Monitor and control the deployment process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: The project team reported, but provided no evidence, that it is monitoring and controlling the deployment process against the plan for performing the deployment process and taking corrective action. Desired practice: Objectively evaluate adherence of the deployment process against its process description, standards, and procedures, and address noncompliance; Rating: Partially implemented; Comment: The project team is planning, but has not yet implemented, a quality assurance process that is expected to objectively evaluate adherence of the deployment process against its process description, standards, and procedures and to address noncompliance. Desired practice: Review the activities, status, and results of the deployment process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] Table 118: ASDE-X Deployment, Transition, and Disposal: Detailed Findings on Level 1 Goals and Practices: Goal: Customer/stakeholder operation and support facilities and personnel are prepared to accept the delivery, placement, and transition of the product or service into use; customer/stakeholder operation and support organizations demonstrate their capacity to support the product or service upon assumption of responsibility; and the continuity of operational performance is maintained; Rating: Satisfied. Desired practice: Develop a strategy for deployment, transition, and disposal, and perform activities in accordance with the strategy; Rating: Largely implemented; Comment: The project team has developed and is implementing strategies for deployment, but the transition and disposal plans are still in the early stages of development. 
Desired practice: Establish the facility and infrastructure environment to receive and operate the product or service; Rating: Fully implemented. Desired practice: Verify that fielded configuration items reflect the product or service baseline, and manage change control; Rating: Fully implemented. Desired practice: Demonstrate the ability of the customer/stakeholder support organization to maintain, modify, and support the product or service; Rating: Largely implemented; Comment: The project team has planned to demonstrate support capability for software projects in the in-service support stages. However, the project has not reached this stage of the life cycle and has not begun to implement the support required for this practice. Desired practice: Transfer the product or service to the customer/stakeholder operation and support organizations; Rating: Largely implemented; Comment: The project team has plans to transition the product to the customer/stakeholder operation and support organization. However, the transition from the prime contractor to the government organization has not yet begun. Goal: The replaced product or service is deactivated and disposed of and/or dispensed with, as appropriate; Rating: Satisfied. Desired practice: Deactivate and dispose of the replaced product and/or dispense with the replaced service; Rating: Largely implemented; Comment: The project team has plans to deactivate and dispose of replaced products and dispense with services. However, the project has not reached this phase of the life cycle. Capability level 1 achieved. Sources: GAO, FAA, and SEI. [End of table] Table 119: ASDE-X Deployment, Transition, and Disposal: Detailed Findings on Level 2 Goals and Practices: Goal: The process is institutionalized as a managed process; Rating: Unsatisfied; Comment: The goal is unsatisfied because one of the practices below is partially implemented. 
Desired practice: Establish and maintain an organizational policy for planning and performing the deployment process; Rating: Fully implemented. Desired practice: Establish and maintain the plan for performing the deployment process; Rating: Fully implemented. Desired practice: Provide adequate resources for performing the deployment process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team provides adequate personnel resources and is planning to provide adequate tools and facilities for performing the process, developing the work products, and providing the services of the process. However, the project team has not yet begun the deployment phase, and the tools have not yet been put into use. Desired practice: Assign responsibility and authority for performing the deployment process, developing the work products, and providing the services of the process; Rating: Largely implemented; Comment: The project team has assigned responsibility for the deployment process to various groups, but it has not yet assigned duties to individuals. Desired practice: Train the people performing or supporting the deployment process as needed; Rating: Fully implemented. Desired practice: Place designated work products of the deployment process under appropriate levels of configuration management; Rating: Fully implemented. Desired practice: Identify and involve the relevant stakeholders of the deployment process as planned; Rating: Fully implemented. Desired practice: Monitor and control the deployment process against the plan for performing the process, and take appropriate corrective action; Rating: Partially implemented; Comment: The project team is monitoring the deployment process against the plan for performing the process. However, the team is not currently controlling the deployment process or taking corrective action. 
Desired practice: Objectively evaluate adherence of the deployment process against its process description, standards, and procedures, and address noncompliance; Rating: Fully implemented. Desired practice: Review the activities, status, and results of the deployment process with higher level management, and resolve issues; Rating: Fully implemented. Capability level 2 not achieved. Sources: GAO, SEI. [End of table] [End of section] Chapter 16: FAA's Process Improvement Initiative Has Matured, but It Is Not Yet Institutionalized: Since our 1997 report, the Federal Aviation Administration's (FAA) process improvement initiative has grown tremendously in rigor and scope. In our earlier appraisal, we found that FAA's performance of key processes was ad hoc and sometimes chaotic, whereas current results show that FAA projects are performing most key practices. However, these process improvement activities are not required throughout the air traffic organizations, and the recurring weaknesses we identified in our project-specific evaluations are due in part to the discretion these projects were given in deciding whether and how to adopt process improvement initiatives. Further, because of a recent reorganization, the new Air Traffic Organization's commitment to this process improvement initiative is not certain. As a result, FAA's adoption and management of process improvement efforts are inconsistent, leaving individual projects' costs, schedules, and performance at risk. Without agencywide adoption of process improvement initiatives, the agency cannot increase the maturity of its organizational capabilities. FAA's Process Improvement Initiative Has Matured: Over the past several years, FAA has made considerable progress in improving its processes for acquiring and developing software and systems. Acting on our prior recommendations, in 1999, FAA established a centralized process improvement office that reports directly to the Chief Information Officer. 
This office led the government in an effort to integrate various standards and models into a single maturity model, called the integrated Capability Maturity Model (iCMM).[Footnote 16] In fact, FAA's iCMM served as a demonstration for the Software Engineering Institute's effort to integrate various models into its own Capability Maturity Model Integration (CMMI). The Chief Information Officer's process improvement office also developed and sponsored iCMM-related training, and by late 2003, it had trained over 7,000 participants. The training offered ranges from overviews on how to use the model to more focused courses in such specific process areas as quality assurance, configuration management, and project management. The office also guides FAA organizations in using the model and leads appraisal teams in evaluating the process maturity of the projects and organizations that have adopted the model. In addition to the Chief Information Officer-sponsored process improvement efforts, several of FAA's business areas, including the business areas with responsibility for air traffic control (ATC) system acquisitions and operations, endorsed and set goals for process improvement activities using the iCMM. As a result, the number of individual projects and umbrella organizations adopting process improvement and the iCMM model has grown steadily over the years. Specifically, the number of projects and organizations (which account for multiple projects) undergoing iCMM appraisals grew from 1 project in 1997, to 28 projects and 3 organizations by 2000, to 39 projects and 11 organizations by 2003. These projects and organizations have demonstrated improvements in process maturity. 
Under the iCMM model, in addition to achieving capability levels in individual process areas, entities can achieve successive maturity levels by demonstrating capabilities in a core set of process areas.[Footnote 17] FAA process improvement officials reported that by 2000, 10 projects and one organization had achieved iCMM maturity level 2. To date, 14 projects and three organizations have achieved iCMM maturity level 2, and one project and two organizations have achieved iCMM maturity level 3.[Footnote 18] Additionally, 13 projects and four organizations achieved capability levels 2 or 3 in one or more process areas. Moreover, in internal surveys, the programs and organizations pursuing process improvement have consistently reported enhanced productivity, higher quality, increased ability to predict schedules and resources, higher morale, and better communication and teamwork. These findings are reiterated by the Software Engineering Institute in its recent study of the benefits of using the CMMI model for process improvement.[Footnote 19] According to that study, organizations that implement such process improvements can achieve better project cost and schedule performance and higher quality products. Specifically, of the 12 cases that the Software Engineering Institute assessed, there were: * nine examples of cost-related benefits, including reductions in the cost to find and fix a defect and overall cost savings; * eight cases of schedule-related benefits, including decreased time needed to complete tasks and increased predictability in meeting schedules; * five cases of measurable improvements in quality, mostly related to reducing defects over time; * three cases of improvements in customer satisfaction; and * three cases showing positive return on investment from their CMMI-based process improvements. 
FAA Has Not Yet Institutionalized Process Improvement in Its ATC Organizations: Leading organizations have found that in order to achieve advanced system management capabilities and to gain the benefits of more mature processes, an organization needs to institutionalize process improvement. Specifically, to be effective, an organization needs senior-level endorsement of its process improvement initiatives and consistency in the adoption and management of process improvement efforts. In recent years, FAA's ATC-related organizations have encouraged process improvement through the iCMM model. Specifically, FAA's acquisition policy calls for continuous process improvement and endorses the use of the iCMM model. Also, the former air traffic organizations set annual goals for improving maturity using the iCMM model in selected projects and process areas. For example, in 1997, the former ATC acquisition organization set a goal of having 11 selected projects achieve iCMM maturity level 2 by 1999 and maturity level 3 by 2001. While the projects did not meet the 1999 goal, several projects achieved level 2 in 2000, and most made improvements in selected process areas. However, FAA did not institutionalize the use of the iCMM model throughout the organization and, as a result, individual projects' use and application of the model have been voluntary. Individual project teams could determine whether or not they would implement the model and which process areas to work on. In addition, project teams could decide when, if ever, to seek an appraisal of their progress in implementing the model. Because of this voluntary approach, to date, fewer than half of the projects listed in FAA's system architecture have sought appraisals in at least one process area. Specifically, of the 48 systems listed in FAA's system architecture, only 18 have sought appraisals. 
Some of the mission critical systems that have not sought appraisals include an advanced radar system and an air traffic information processing system. Another result of this voluntary approach is that individual projects are making uneven progress in core areas. For example, the four projects that we appraised ranged from capability levels 0 to 2 in the risk management process area: in other words, projects varied from performing only part of the basic process, to performing the basic process, to actively managing the process. As another example, all four of the projects we appraised captured some metrics on their performance. However, these metrics varied greatly from project to project in depth, scope, and usefulness. Individual weaknesses in key processes could lead to systems that do not meet the users' needs, exceed estimated costs, or take longer than expected to complete. While FAA encouraged process improvement in the past, the agency's current commitment to process improvement in its new Air Traffic Organization is not certain. FAA recently moved its air traffic-related organizations into a single, performance-based organization, the Air Traffic Organization, under the direction of a Chief Operating Officer. The Chief Operating Officer is currently reevaluating all policies and processes, and plans to issue new acquisition guidance in the coming months. As a result, the Air Traffic Organization does not currently have a policy that requires organizations and project teams to implement process improvement initiatives such as the iCMM. It also does not have a detailed plan--including goals, metrics, and milestones--for implementing these initiatives throughout the organization, nor does it have a mechanism for enforcing compliance with any requirements--such as taking a project's capability levels into consideration before approving new investments. 
Further, because the Air Traffic Organization's commitment to the iCMM is not yet certain, FAA's centralized process improvement organization is unable to define a strategy for improving and overseeing process improvement efforts in the Air Traffic Organization. Unless the Chief Operating Officer demonstrates a strong commitment to process improvement and establishes a consistent, institutionalized approach to implementing, enforcing, and evaluating this process improvement, FAA risks taking a major step backward in its capabilities for acquiring ATC systems and software. That is, FAA may not be able to ensure that critical projects will continue to make progress in improving systems acquisition and development capabilities, and the agency is not likely to proceed to the more advanced capability levels, which focus on organizationwide management of processes. Further, FAA may miss out on the benefits that process improvement models offer, such as better managed projects and improved product quality. Should this occur, FAA will continue to be vulnerable to project management problems including cost overruns, schedule delays, and performance shortfalls. [End of section] Chapter 17: Conclusions and Recommendations: Conclusions: The Federal Aviation Administration (FAA) has made considerable progress in implementing processes for managing software acquisitions. Key projects are performing most of the practices needed to reach a basic level of capability in process areas including risk management, project planning, project monitoring and control, and configuration management. However, recurring weaknesses in the areas of verification, quality assurance, and measurement and analysis prevented the projects from achieving a basic level of performance in these areas and from effectively managing these and other process areas. These weaknesses could lead to systems that do not meet the users' needs, exceed estimated costs, or take longer than expected to complete. 
Further, because of the recurring weaknesses in measurement and analysis, senior executives may not receive the project status information they need to make sound decisions on major project investments. FAA's process improvement initiative has matured in recent years, but more can be done to institutionalize improvement efforts. The Chief Information Officer's centralized process improvement organization has developed an integrated Capability Maturity Model (iCMM) and demonstrated improvements in those using the model, but to date the agency has not ensured that projects and organizational units consistently adopt such process improvements. Specifically, the agency lacks a detailed plan--including goals, metrics, and milestones--for implementing these initiatives throughout the new Air Traffic Organization, and a mechanism for enforcing compliance with any requirements--such as taking a project's capability level into consideration before approving new investments. With the recent move of FAA's air traffic control-related organizations into a performance-based organization, the agency has an opportunity to reiterate the value of process improvement and to achieve the benefits of more mature processes. In the coming months, it will be critical for this new organization to demonstrate its commitment to process improvement through its policies, plans, goals, oversight, and enforcement mechanisms. Without such endorsement, the progress that FAA has made in recent years could dissipate. Recommendations for Executive Action: Given the importance of software-intensive systems to FAA's air traffic control modernization program, we recommend that the Secretary of Transportation direct the FAA Administrator to ensure that the following five actions take place: * The four projects that we appraised should take action to fully implement the practices that we identified as not implemented or partially implemented. 
* The new Air Traffic Organization should establish:

* a policy requiring organizations and project teams to implement iCMM or equivalent process improvement initiatives and:

* a plan for implementing iCMM or equivalent process improvement initiatives throughout the organization. This plan should specify a core set of process areas for all projects, clear criteria for when appraisals are warranted, and measurable goals and time frames.

* The Chief Information Officer's process improvement office, in consultation with the Air Traffic Organization, should develop a strategy for overseeing all air traffic projects' progress to successive levels of maturity; this strategy should specify measurable goals and time frames.

* To enforce process improvement initiatives, FAA investment decision makers should take a project's capability level in core process areas into consideration before approving new investments in the project.

Agency Comments:

In its oral comments on a draft of this report, Department of Transportation and FAA officials generally concurred with our recommendations and indicated that FAA is pleased with the significant progress it has achieved in improving the processes used to acquire software and systems. Further, these officials noted that FAA has already started implementing changes to address issues identified in the report. They said that progress is evident both in the improved scores, compared with our prior study, and in the way FAA functions on a day-to-day basis. For example, these officials explained that FAA is now working better as a team because the organization is using cross-organizational teams that effectively share knowledge and best practices for systems acquisition and management. FAA officials also noted that the constructive exchange of information with us was very helpful in achieving progress, and they emphasized their desire to maintain a dialog with us to facilitate continued progress.
Agency officials also provided technical corrections, which we have incorporated into this report as appropriate.

[End of section]

Appendixes:

Appendix I: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Colleen Phillips (202) 512-6326
David Powner (202) 512-9286
Keith Rhodes (202) 512-6412

Staff Acknowledgments:

In addition to those named above, Jim Belford, Dan Bennett, Barbara Collier, John Dale, Season Dietrich, Thayne Hill, Deborah Lott, Tammi Nguyen, Madhav Panwar, Andrea Smith, and Carl Urie made key contributions to this report.

(310452)

FOOTNOTES

[1] U.S. General Accounting Office, High-Risk Series: An Update, GAO-03-119 (Washington, D.C.: January 2003).

[2] CMM®, Capability Maturity Model, and Capability Maturity Modeling are registered in the U.S. Patent and Trademark Office. CMMI℠ is a service mark of Carnegie Mellon University.

[3] Standard CMMI Appraisal Methodology for Process Improvement.

[4] Not all process areas are relevant to all stages of development. The number of process areas evaluated varied, depending on each project's stage of development.

[5] U.S. General Accounting Office, Air Traffic Control: Immature Software Acquisition Processes Increase FAA Acquisition Risks, GAO/AIMD-97-47 (Washington, D.C.: Mar. 21, 1997).

[6] These and other system management process areas are described in chapter 1.

[7] A performance-based organization is an organization that commits to clear management objectives, measurable goals, customer service standards, and specific targets for improved performance.

[8] U.S. General Accounting Office, Air Traffic Control: FAA's Modernization Efforts--Past, Present, and Future, GAO-04-227T (Washington, D.C.: Oct. 30, 2003).

[9] Examples include U.S. General Accounting Office, Air Traffic Control: Status of FAA's Modernization Program, GAO/RCED-95-175FS (Washington, D.C.: May 26, 1995); Air Traffic Control: Status of FAA's Modernization Program, GAO/RCED-99-25 (Washington, D.C.: Dec.
3, 1998); Air Traffic Control: FAA's Modernization Efforts--Past, Present, and Future, GAO-04-227T (Washington, D.C.: Oct. 30, 2003).

[10] U.S. General Accounting Office, High-Risk Series: An Overview, GAO/HR-95-1 (Washington, D.C.: February 1995); High-Risk Series: Information Management and Technology, GAO/HR-97-9 (Washington, D.C.: February 1997); High-Risk Series: An Update, GAO/HR-99-1 (Washington, D.C.: January 1999); High-Risk Series: An Update, GAO-01-263 (Washington, D.C.: January 2001); High-Risk Series: An Update, GAO-03-119 (Washington, D.C.: January 2003).

[11] U.S. General Accounting Office, Air Traffic Control: Immature Software Acquisition Processes Increase FAA Acquisition Risks, GAO/AIMD-97-47 (Washington, D.C.: Mar. 21, 1997).

[12] A performance-based organization is an organization that commits to clear management objectives, measurable goals, customer service standards, and specific targets for improved performance.

[13] Standard CMMI® Appraisal Methodology for Process Improvement.

[14] Software Engineering Institute, Demonstrating the Impact and Benefits of CMMI: An Update and Preliminary Results, CMU/SEI-2003-SR-009 (October 2003).

[15] We also appraised one system at level 3 because it had achieved maturity level 3 on FAA's iCMM. However, because the system did not meet the CMMI requirements for level 2, it could not achieve level 3.

[16] FAA's iCMM version 1.0 combined the features of three Software Engineering Institute capability maturity models: the Software Acquisition Capability Maturity Model, the Capability Maturity Model for Software, and the Systems Engineering Capability Maturity Model.
FAA's iCMM version 2.0 integrates the following standards and models: International Organization for Standardization standards 9001:2000, TR 15504, 12207, and CD 15288; the National Institute of Standards and Technology's Malcolm Baldrige National Quality Award criteria; the Office of Personnel Management's President's Quality Award criteria; the Software Engineering Institute's Capability Maturity Model Integration for Integrated Product and Process Development and its Capability Maturity Model Integration A-Specification; and Electronic Industries Association Interim Standard 731.

[17] Under iCMM version 1, an entity must achieve capability level 2 in 9 core process areas to attain maturity level 2, and it must achieve capability level 3 in 18 core process areas to attain maturity level 3. Version 2 is more stringent, requiring achievement of capability level 3 in 20 core process areas to attain maturity level 3.

[18] All the organizations except one were assessed on iCMM v1.0.

[19] Software Engineering Institute, Demonstrating the Impact and Benefits of CMMI: An Update and Preliminary Results, CMU/SEI-2003-SR-009 (October 2003).

GAO's Mission:

The Government Accountability Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet.
GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics.

Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to e-mail alerts" under the "Order GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and MasterCard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to:

U.S. Government Accountability Office
441 G Street NW, Room LM
Washington, D.C. 20548

To Order by Phone:

Voice: (202) 512-6000
TDD: (202) 512-2537
Fax: (202) 512-6061

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Web site: www.gao.gov/fraudnet/fraudnet.htm
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Public Affairs:

Jeff Nelligan, managing director, NelliganJ@gao.gov (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149
Washington, D.C. 20548
