Office of Personnel Management

Improvements Needed to Ensure Successful Retirement Systems Modernization. GAO ID: GAO-08-345. January 31, 2008

Through its Retirement Systems Modernization (RSM) program, the Office of Personnel Management (OPM) is modernizing the paper intensive processes and antiquated information systems it uses to support the retirement of civilian federal employees. RSM is intended to deploy new or modified systems beginning in February 2008 to improve the efficiency and effectiveness of the agency's retirement program. GAO was asked to (1) determine whether OPM is effectively managing the RSM program to ensure that system components perform as intended and (2) evaluate the risks, cost, and progress of the RSM program. To meet these objectives, GAO analyzed program documentation against relevant plans, policies, and practices.

In executing the RSM program, OPM has improved its management processes for selecting contractors, defining system and security requirements, managing risks, planning organizational change, and providing program executive oversight. The agency also recently established performance targets for the improvements to retirement processing accuracy and timeliness that it expects the program to achieve and established an independent verification and validation capability. However, the agency's management of RSM in areas that are important to successful deployment of new systems has not ensured that system components will perform as intended. Specifically, initial test results do not provide assurance that a major system component will perform as intended. In addition, OPM's system testing schedule has been compressed, and upcoming tests are to be conducted concurrently, increasing the risk that the agency will not have sufficient resources or time for testing. Further, trends in identifying and resolving system defects indicate a growing backlog of problems to be resolved prior to deployment. Although OPM has established a risk management process that has identified program risks, the agency has not reliably estimated RSM's cost or reported progress. In particular, the reliability of the program's revised life-cycle cost estimate of $421.6 million is questionable because the agency could not support the estimate with a description of the system to be developed and a description of the methodology used to produce the estimate. Also, the agency's reporting of RSM progress--based on the satisfaction of established program goals and the calculation of variances from the planned cost and schedule--has not reflected the state of the program. With respect to goals, the agency reported that it had met its fiscal year 2007 goals, including completing the imaging of paper-based retirement records and beginning training. While OPM's reporting that it satisfied program goals provided a favorable view of progress, this view did not include program areas for which the agency had not established goals. With respect to program cost and schedule variances, OPM reported in October 2007 that the program was progressing almost exactly as planned. However, the agency's reported favorable view of program progress was not consistent with the state of the program.


GAO-08-345, Office of Personnel Management: Improvements Needed to Ensure Successful Retirement Systems Modernization

This is the accessible text file for GAO report number GAO-08-345 entitled 'Office of Personnel Management: Improvements Needed to Ensure Successful Retirement Systems Modernization' which was released on January 31, 2008. This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

United States Government Accountability Office: GAO:
Report to the Subcommittee on Financial Services and General Government, Committee on Appropriations, U.S. Senate:
January 2008:
Office of Personnel Management: Improvements Needed to Ensure Successful Retirement Systems Modernization:
GAO-08-345:

GAO Highlights:
Highlights of GAO-08-345, a report to the Subcommittee on Financial Services and General Government, Committee on Appropriations, U.S. Senate.

Why GAO Did This Study:

Through its Retirement Systems Modernization (RSM) program, the Office of Personnel Management (OPM) is modernizing the paper intensive processes and antiquated information systems it uses to support the retirement of civilian federal employees. RSM is intended to deploy new or modified systems beginning in February 2008 to improve the efficiency and effectiveness of the agency's retirement program. GAO was asked to (1) determine whether OPM is effectively managing the RSM program to ensure that system components perform as intended and (2) evaluate the risks, cost, and progress of the RSM program. To meet these objectives, GAO analyzed program documentation against relevant plans, policies, and practices.

What GAO Found:

In executing the RSM program, OPM has improved its management processes for selecting contractors, defining system and security requirements, managing risks, planning organizational change, and providing program executive oversight. The agency also recently established performance targets for the improvements to retirement processing accuracy and timeliness that it expects the program to achieve and established an independent verification and validation capability. However, the agency's management of RSM in areas that are important to successful deployment of new systems has not ensured that system components will perform as intended. Specifically, initial test results do not provide assurance that a major system component will perform as intended.
In addition, OPM's system testing schedule has been compressed, and upcoming tests are to be conducted concurrently, increasing the risk that the agency will not have sufficient resources or time for testing (see figure below). Further, trends in identifying and resolving system defects indicate a growing backlog of problems to be resolved prior to deployment. Although OPM has established a risk management process that has identified program risks, the agency has not reliably estimated RSM's cost or reported progress. In particular, the reliability of the program's revised life-cycle cost estimate of $421.6 million is questionable because the agency could not support the estimate with a description of the system to be developed and a description of the methodology used to produce the estimate. Also, the agency's reporting of RSM progress--based on the satisfaction of established program goals and the calculation of variances from the planned cost and schedule--has not reflected the state of the program. With respect to goals, the agency reported that it had met its fiscal year 2007 goals, including completing the imaging of paper-based retirement records and beginning training. While OPM's reporting that it satisfied program goals provided a favorable view of progress, this view did not include program areas for which the agency had not established goals. With respect to program cost and schedule variances, OPM reported in October 2007 that the program was progressing almost exactly as planned. However, the agency's reported favorable view of program progress was not consistent with the state of the program.

RSM Original versus Revised Test Schedule:
Original dates: Integrated product test: August, September, 2007; Performance test: October, 2007; Parallel/business capability release test: November, December, 2007; System deployment: early February, 2008.
Revised dates: Parallel test: late December, 2007; Parallel test: mid-January to mid-February, 2008; Integrated product test: late January to mid-February, 2008; Business capability release test: late January to late February, 2008; System deployment: late February, 2008.
Source: GAO based on OPM data.

What GAO Recommends:

To improve the management of RSM and reduce the risks to successful system deployment, GAO is making recommendations to the Director of the Office of Personnel Management, including conducting effective system tests, resolving system defects, and improving cost estimating and earned value reporting. In written comments on a draft of this report, the Director of OPM expressed her appreciation of GAO's insightful recommendations and stated that the agency is taking steps to address them.

To view the full product, including the scope and methodology, click on [hyperlink, http://www.GAO-08-345]. For more information, contact Valerie C. Melvin at (202) 512-6304 or melvinv@gao.gov.

[End of section]

Contents: Letter: Conclusions: Recommendations for Executive Action: Agency Comments and Our Evaluation: Appendix I: Briefing to the Staff of the Subcommittee on Financial Services and General Government, Committee on Appropriations, U.S.
Senate:
Appendix II: Comments from the Office of Personnel Management:
Appendix III: GAO Contact and Staff Acknowledgments:

Abbreviations:
CIS: Center for Information Services:
CSRS: Civil Service Retirement System:
DBTS: defined benefits technology solution:
EVM: earned value management:
FERS: Federal Employees Retirement System:
IV&V: independent verification and validation:
OPM: Office of Personnel Management:
PMB: performance measurement baseline:
RSM: Retirement Systems Modernization:
UAT: user acceptance test:

[End of section]

United States Government Accountability Office:
Washington, DC 20548:

January 31, 2008:

The Honorable Richard Durbin:
Chairman:
The Honorable Sam Brownback:
Ranking Member:
Subcommittee on Financial Services and General Government:
Committee on Appropriations:
United States Senate:

The Office of Personnel Management (OPM) is modernizing the paper intensive processes and antiquated systems it uses to support the retirement of civilian federal employees. According to OPM, existing processes and systems make providing timely and accurate benefit payments to retirees and their families increasingly difficult and must be improved before an expected increase in the number of retirements occurs. The Retirement Systems Modernization (RSM) program is intended to remedy this situation with the deployment of new or modified systems beginning in February 2008, and full deployment planned by the end of 2009. The agency expects RSM to improve the efficiency and effectiveness of its retirement program, which serves civilian federal employees who are eligible to receive benefits in the future, employees who are already retired, and their survivors and beneficiaries. OPM has estimated the RSM life-cycle cost to be $421.6 million.

At your request, we reviewed OPM's management of the RSM program. Specifically, our objectives were to:
* determine whether OPM is effectively managing the RSM program to ensure that system components perform as intended, and:
* evaluate the risks, cost, and progress of the program.

On November 30, 2007, we provided your offices with briefing slides that outlined the results of our study. On December 3, 2007, we met with your staff to discuss our findings, conclusions, and recommendations. The purpose of this report is to provide the published briefing slides to you and to officially transmit our recommendations to the Director of the Office of Personnel Management. The slides, which discuss our scope and methodology, are included as appendix I. We performed our work from March 2007 to January 2008 in accordance with generally accepted government auditing standards.

In summary, our study highlighted two key issues: First, OPM has improved its management processes for selecting contractors, defining system and security requirements, managing risks, planning organizational change, and providing program executive oversight. Additionally, the agency recently established performance targets for the improvements to retirement processing accuracy and timeliness that it expects RSM to achieve. However, the agency's management of the program in other areas that are important to the successful deployment of new systems has not ensured that system components will perform as intended. A recently established independent verification and validation capability should help the agency identify and make program management improvements.
Nevertheless, management improvements are needed in key areas:

* Initial test results do not provide assurance that a major system component, the defined benefits technology solution, will perform as intended. OPM officials acknowledged that test results had not met established quality goals and stated that they expected future test results to indicate improved quality. Nevertheless, until actual test results indicate improved system quality, the agency faces increased risk that it will deploy technology that does not work as expected in February 2008.

* The system testing schedule has been compressed and upcoming tests are to be conducted concurrently in about half the time originally planned. The agency identified a shortage of testing resources and the need for further system development as contributing to the delay and concurrency of planned tests. This high degree of concurrent testing increases the risk that the agency will not have sufficient resources or time to verify that the technology it plans to deploy in February 2008 will work as expected.

* Trends in identifying and resolving system defects indicate a growing backlog of problems to be resolved prior to deployment. Until defect trends indicate resolution of the backlog of urgent and high priority defects, OPM faces increased risk that it will not have sufficient time to resolve significant problems before its planned February 2008 deployment.

Second, OPM has established a risk management process that has been effective in identifying program risks, but it has not reliably estimated RSM's cost or reported progress. Several examples follow:

* The agency has established and used a risk management process that has resulted in the identification of risks to the successful completion of the RSM program. For example, the agency has identified risks associated with the need to plan activities, conduct training, design and build interfaces, modify legacy systems, and execute tests prior to system deployment in February 2008. As a result of identifying these and other risks, the program should be positioned to reduce the probability of their occurrence and to reduce the impact if they occur.

* In 2007, OPM revised the program life-cycle cost estimate from $371.2 million to $421.6 million. However, the reliability of this estimate is questionable because the agency could not support the estimate with a description of the system to be developed and a description of the methodology used to produce the estimate. Without a reliable cost estimate, the agency does not have a sound basis for formulating future RSM program budgets or for monitoring and predicting program performance.

* OPM's reporting of RSM progress has not reflected the state of the program. Specifically, the agency reported two views of program progress: by describing the satisfaction of established program goals and by using earned value management (EVM).[Footnote 1] With respect to goals, the agency reported that it had met its fiscal year 2007 goals, including completing the imaging of paper-based retirement records and beginning training. While OPM's reporting that it satisfied program goals provided a favorable view of progress, this view did not include program areas for which the agency had not established goals. Using EVM, which is intended to provide a view of progress on the program as a whole, the agency reported in October 2007 that the program was progressing almost exactly as planned.
However, the agency's reported favorable view of program progress was not consistent with the state of the program. As a result of this approach, whereby OPM frequently revised its performance measurement baseline in lieu of establishing and controlling a valid baseline, the agency's EVM reporting did not reliably reflect program progress.

Conclusions:

To its credit, OPM has undertaken the RSM program to expedite retirement processing for civilian federal employees and the agency reported that it has met key program goals. Further, the agency has improved its management processes for selecting contractors, defining system and security requirements, managing risks, planning organizational change, and providing program executive oversight. Nevertheless, much remains to be accomplished before the program is effectively positioned to deploy its first planned increment of new technology in February 2008. Although OPM has developed performance targets necessary to gauge the success of the new system and has entered into a contract to obtain an independent review of its management of the RSM program, the agency has not ensured that system components will perform as intended. In particular, initial test results indicate that the defined benefits technology solution that is a major component of the new system has not performed as intended, the backlog of system defects to be addressed before deployment is growing, and future system tests are to be conducted concurrently in about half the time originally planned.

Further, OPM recognized the importance of risk management and has established a risk management process and identified program risks. However, the agency has not yet developed the capability to reliably analyze and report program progress. Such progress reporting should be grounded in a reliable cost estimate that is in part the basis for reliable earned value management. Without a reliable cost estimate, the agency does not have a firm foundation for the RSM program budget or for reliable earned value management reporting. Until OPM makes improvements to the RSM program in the areas discussed above, the agency risks not achieving successful program outcomes, including the planned deployment of new technology beginning in February 2008.

Recommendations for Executive Action:

To address the risks to OPM's deployment of new retirement system technology and improve the agency's ability to reliably report progress of the RSM program, we are recommending that the Director of the Office of Personnel Management direct the RSM Executive Director to take the following actions:

* Ensure that sufficient resources are provided to fully test functionality, actions for mitigating the risks inherent in concurrent testing are identified, test results verify that all system components perform as expected, and test activities and results are subjected to independent verification and validation.

* Monitor and review defined benefits technology solution defects to ensure all urgent and high priority defects are resolved prior to system deployment and that the resolution of urgent and high priority defects is subjected to independent verification and validation.

* Develop a revised RSM cost estimate that addresses the weaknesses identified in this briefing and task an independent verification and validation contractor with reviewing the process used to develop the estimate and assessing the reliability of the resulting estimate.
* Establish a basis for effective use of earned value management by validating the RSM performance measurement baseline through a program level integrated baseline review and task an independent verification and validation contractor with reviewing the process used to develop the baseline and assessing the reliability of the performance measurement baseline.

Agency Comments and Our Evaluation:

In written comments on a draft of this report, the Director of OPM expressed her appreciation of GAO's insightful recommendations and stated that the agency is taking steps to address them. The Director reiterated the agency's intention to begin deploying the new retirement system in February 2008, and stated that when fully deployed, the new system will provide a high level of customer service, enhanced retirement planning tools, and prompt, complete annuity payments. To this end, the Director stated that the agency has, among other things, dedicated additional resources to monitor, evaluate, and troubleshoot system development and testing. OPM's actions, if effectively implemented and monitored by agency leadership, should facilitate deployment of the new retirement system. The comments are reprinted in appendix II.

We are sending copies of this report to the Director of the Office of Personnel Management and other appropriate congressional committees. We will make copies available to other interested parties upon request. Copies of this report will also be made available at no charge on GAO's Web site at [hyperlink, http://www.gao.gov]. If you or your staff have any questions about this report, please contact me at (202) 512-6304 or melvinv@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix III.

Signed by:
Valerie C. Melvin:
Director, Human Capital and Management: Information Systems Issues:

[End of section]

Appendix I: Briefing to the Staff of the Subcommittee on Financial Services and General Government, Committee on Appropriations, U.S. Senate:

Office of Personnel Management: Improvements Needed to Ensure Successful Retirement Systems Modernization:
Briefing for the Subcommittee on Financial Services and General Government:
Committee on Appropriations:
United States Senate:
November 30, 2007:

Table of Contents:
Introduction:
Objectives:
Scope and Methodology:
Results in Brief:
Background:
Results:
Conclusions:
Recommendations:
Agency Comments and Our Evaluation:

[End of section]

Introduction:

The Office of Personnel Management (OPM) is modernizing the paper intensive processes and antiquated systems it uses to support the retirement of civilian federal employees. According to OPM, these existing processes and systems make providing timely and accurate benefit payments to retirees and their families increasingly difficult and must be improved before an expected increase in the number of retirements occurs. The Retirement Systems Modernization (RSM) program is intended to remedy this situation with the deployment of new or modified systems beginning in February 2008, with full deployment planned by the end of 2009. OPM has estimated the RSM life-cycle cost to be $421.6 million. RSM is intended to improve the efficiency and effectiveness of OPM's retirement program, which serves civilian federal employees who are eligible to receive benefits in the future, employees who are already retired and their survivors and beneficiaries.
Objectives:

The Chairman and the Ranking Member of the Senate Subcommittee on Financial Services and General Government requested that we:
* determine whether OPM is effectively managing the RSM program to ensure that system components perform as intended, and;
* evaluate the risks, cost, and progress of the RSM program.

Scope and Methodology:

To determine whether OPM is effectively managing the RSM program to ensure that system components perform as intended, we:
* analyzed RSM program objectives and current retirement performance measures to determine what target goals OPM established for the modernized system, and evaluated metrics developed for the new system and analyzed the extent to which those metrics included target goals for performance relative to program objectives;
* evaluated planned tests and dates and compared them to actual tests completed to determine to what extent tests were delayed or compressed, and analyzed system test results and compared them to expected quality goals to determine whether test results provided assurance that the system would perform as intended;
* analyzed defect data of a major system component to determine the number of defects identified and resolved each week since the beginning of development, and analyzed trends in defects opened and identified the number of defects to be resolved prior to deployment;
* evaluated independent verification and validation (IV&V) plans and analyzed what program activities were subject to IV&V to determine the scope of these activities; and;
* interviewed OPM officials, RSM program management, and contractors about performance measures, testing activities, defect management, and IV&V plans.

To evaluate RSM risks, cost, and progress, we:
* reviewed the RSM risk management program to determine to what extent the program had been implemented, and analyzed risk logs and attended monthly program management review meetings at which program risks were identified, tracked, mitigated, and reported;
* analyzed the current RSM life-cycle cost and compared the process OPM used to develop the cost estimate with best practices identified in GAO's Cost Assessment Guide, and reviewed cost estimation documentation to determine the sufficiency of support for the estimate as outlined in the guide;
* evaluated RSM progress as reported through program goals and earned value management; reviewed fiscal year 2007 program goals to determine progress toward achievement of these goals; evaluated RSM progress as reported in EVM performance reports to determine variances in cost and schedule, and analyzed inputs to performance reports to evaluate to what extent the EVM approach aligned with best practices identified in the GAO Cost Assessment Guide; and;
* interviewed RSM program officials and contractors at OPM headquarters about risk management, cost estimating, and program progress.

Our work was performed at OPM headquarters in Washington, D.C., from March 2007 to November 2007 in accordance with generally accepted government auditing standards.

Results in Brief:

To its credit, OPM has undertaken the RSM program to speed retirement processing for civilian federal employees and has reported meeting key agency goals. Further, the agency has improved its management processes for selecting contractors, defining system and security requirements, managing risks, planning organizational change, and providing program executive oversight.
Additionally, OPM recently established performance targets for the improvements to retirement processing accuracy and timeliness that it expects RSM to achieve. However, the agency has not ensured that system components will perform as intended when they are planned to be deployed in February 2008. A recently established independent verification and validation function should help the agency identify and make program management improvements. Nevertheless, management improvements are needed in key areas:

* Initial test results do not provide assurance that a major system component will perform as intended. OPM officials acknowledged that test results had not met established quality goals and stated that they expected future test results to indicate improved quality. Nevertheless, until actual test results indicate improved system quality, OPM faces increased risk that it will deploy technology that does not work as expected (e.g., does not accurately calculate retirement benefits) in February 2008.

* OPM's system testing schedule has been compressed and upcoming tests are to be conducted concurrently. OPM identified a shortage of testing resources and the need for further system development to occur as contributing to the delay and increased concurrency of planned tests. This high degree of concurrent testing that OPM now plans increases the risk that OPM will not have sufficient resources or time to verify that the technology it plans to deploy in February 2008 will work as expected.

* Trends in identifying and resolving system defects indicate a growing backlog of problems to be resolved prior to deployment. Until defect trends indicate resolution of the backlog of urgent and high priority defects, OPM faces increased risk that it will not have sufficient time to resolve significant problems before its planned February 2008 deployment.

OPM has established a risk management process that has been effective in identifying program risks, but has not reliably estimated RSM's cost or reported progress:

* OPM has established and used a risk management process, which has resulted in the identification of risks to successful completion of the RSM program. For example, the agency has identified risks associated with the need to plan activities, conduct training, design and build interfaces, modify legacy systems, and execute tests prior to system deployment in February 2008. As a result of identifying these and other risks, the RSM program should be positioned to reduce the probability of their occurrence and to reduce the impact if the risks do occur.

* In 2007, OPM revised its RSM life-cycle cost estimate from $371.2 million to $421.6 million. However, the reliability of this estimate is questionable because the agency could not support the estimate with a description of the system to be developed and a description of the methodology used to produce the estimate. Without a reliable cost estimate, OPM does not have a sound basis for formulating future RSM program budgets or for monitoring and predicting program performance.

* OPM's reporting of RSM progress has not reflected the state of the program. Specifically, the agency reported two views of RSM progress: by describing satisfaction of established program goals and by using earned value management (EVM). [Footnote 2] With respect to goals, the agency reported that it had met its fiscal year 2007 goals, including completing the imaging of paper-based retirement records and beginning RSM training.
While OPM's reporting that it satisfied program goals provided a favorable view of progress, this view did not include program areas for which the agency had not established goals. Using EVM, which is intended to provide a view of progress on the program as a whole, OPM reported in October 2007 that the program was progressing almost exactly as planned. However, the agency's reported favorable view of program progress was not consistent with the state of the program.

To improve OPM's management of RSM, we are making recommendations aimed at reducing the risks to successful system deployment by conducting effective system tests, resolving system defects, and improving cost estimating and earned value reporting (see pages 53-54). In oral comments on a draft of this briefing, OPM officials including the Director and the RSM Executive Director generally agreed with our recommendations and provided additional information and written technical comments related to program activities, which we incorporated in the briefing as appropriate.

Background: OPM Mission and Retirement Plans:

OPM's mission is to ensure that the federal government has an effective civilian workforce. [Footnote 3] In this regard, one of its major human resources tasks is to manage and administer the federal retirement program for federal employees. OPM's Center for Retirement and Insurance Services administers two defined-benefit retirement plans that provide retirement, disability, and survivor benefits for federal employees: [Footnote 4]
* Civil Service Retirement System (CSRS)--a pension system that covers most employees hired before 1984.
* Federal Employees Retirement System (FERS)--a plan that also includes Social Security and a defined contribution system; it covers most employees hired in 1984 and subsequent years. [Footnote 5]

Background: Retirement Services:

According to OPM, there are approximately 2.5 million active federal employees and 2.4 million retired federal employees. The agency estimates that between 2007 and 2017, 60 percent of active federal employees will be eligible to retire and 40 percent will most likely retire. OPM reports that the current retirement process takes approximately 60 to 90 days from the submission of a retirement application until initial benefit payments are made to a federal retiree. OPM has identified factors that could limit its ability to provide high-quality retirement services, including:
* an increasing number of FERS retirement claims, which are more difficult to process than CSRS claims because of the complexity of FERS calculations;
* high costs, limited capabilities, and other problems with existing OPM information systems and processes; and;
* the inability to attract personnel to work with antiquated retirement processes.

Background: RSM History:

OPM began planning RSM in 1997 and originally intended to structure the program as an acquisition of commercially available hardware and software that the agency would modify to meet its needs. From 1997 to 2001, OPM developed plans and analyses and began developing business and security requirements for RSM. In 2001, OPM decided to change the direction of RSM. Specifically, the agency examined the possibility of increasing the role of private sector vendors by contracting for key system components. In 2002, the agency issued a request for information to vendors regarding contracting for key retirement program functions.
The agency's analysis of responses showed that contracting was a viable alternative that would be cost efficient, less risky, and more likely to be completed on time and on budget. As a result, the agency decided to contract for use of a commercially available pension benefits solution. In 2003, OPM conducted an assessment of alternative contracting options before issuing a request for proposals in 2004. During 2005 and early 2006, OPM evaluated proposals and selected vendors.

Background: RSM Objectives:

OPM's January 2007 RSM Program Management Plan defined two program objectives to address current retirement processing performance issues:
* Timely, accurate retirement benefit payments--streamlined electronic processing of retirement applications for the establishment of accurate automatic benefit payments for retirees.
* More efficient and flexible processes--electronic processes to record and retrieve retirement information to reduce delays associated with the storage and management of paper records, and reduce processing times with fewer errors.
With RSM, OPM expects the retirement process will take less time and result in fewer errors.

Background: RSM Program:

The RSM program is expected to improve retirement services to active and retired federal employees by implementing new technology and business processes. Modernizing the current paper-based manual processing of retirement applications and claims involves:
* modifying a commercial pension benefits solution, called the Defined Benefits Technology Solution (DBTS), to meet federal requirements and policies;
* capturing and converting federal employee paper records to electronic files;
* redesigning retirement processes to align with supporting technology;
* developing interfaces to receive data from and send data to external and remaining legacy systems;
* decommissioning or modifying OPM's 86 legacy systems that currently support retirement processing; and;
* training OPM employees to use the modernized technology and processes.

The cornerstone technology of the RSM program is DBTS. In addition to calculating retirement benefit amounts, the technology is intended to provide active and already retired federal employees self-service Internet-based tools for accessing accounts, updating retirement records, submitting transactions, monitoring the status of claims, and forecasting retirement income. This technology is also expected to provide electronic tools for accessing retirement information to OPM staff and agency customer service representatives. In addition to DBTS, RSM includes modifying OPM legacy systems and developing interfaces to external and internal systems. Figure 1 shows the planned process and technology for the program.

Figure 1: Simplified View of Planned RSM Process and Technology:
* Payroll Processing Centers (e.g. GSA);
* OPM Interface;
* Defined Benefits Technology Solution (e.g. Benefit estimation, calculation, and payment, claims processing, customer service);
- Active employees, benefit officers, retirees, and OPM administrators;
* OPM Interface;
* OPM Legacy Systems (e.g. financial management, actuarial analysis);
* External Systems (e.g. SSA, Treasury).
Source: GAO analysis of OPM data.
[End of figure]

Background: RSM Contracts:

OPM has overall responsibility for RSM, including retirement process redesign, legacy system modification, interface development, integrating system components, system testing, training employees, and program management. Additionally, the RSM program is supported by contractors in key areas, as shown in the following table.

Table 1: RSM Program Contracts:
Contract Name: DBTS; Award Date: May 2006; Contract Value: $290 million; Awarded to: Hewitt Associates; Deliverables: Modification of commercial solution to meet federal requirements and policies.
Contract Name: Paper Data Capture and Conversion; Award Date: September 2006; Contract Value: $30.7 million; Awarded to: Integic; Deliverables: Capture and conversion of federal paper records to electronic files.
Contract Name: Business Transformation and Information Technology; Award Date: May 2006; Contract Value: $40 million; Awarded to: Accenture Ltd.; Deliverables: Assist in redesigning the retirement processes and developing supporting technology.
Contract Name: Program Management; Award Date: September 2004; Contract Value: $10.7 million; Awarded to: Booz Allen Hamilton; Deliverables: Program management support.
Contract Name: IV&V; Award Date: August 2007; Contract Value: $0.2 million; Awarded to: Bearing Point; Deliverables: Requirements assessment, test validation and results evaluation.
[End of table]

Background: RSM Implementation Approach:

OPM plans to implement its DBTS-based retirement system through a series of five increments. Each increment consists of deploying the new retirement system to support specific populations of federal agency employees according to the following schedule:
* Increment 1 – active employees serviced by the General Services Administration payroll processing center starting in February 2008.
* Increment 2 – active United States Postal Service employees starting in May 2008.
* Increment 3 – already retired federal employees, active employees serviced by the National Business Center payroll processing center, and other independent agencies starting in August 2008.
* Increment 4 – active employees serviced by the National Finance Center payroll processing center starting in November 2008.
* Increment 5 – active employees serviced by the Department of Defense payroll processing center in February 2009.

Background: RSM Governance:

OPM has established governance committees and senior management positions to lead RSM, as table 2 shows.

Table 2: RSM Governance:
Title: Executive Steering Committee; Description: Chaired by the OPM Director, the Executive Steering Committee makes decisions related to the RSM program that have cross-OPM impact and resolves cross-OPM issues.
Title: Associate Director of Human Resources Products and Services; Description: As the current business owner, the associate director is responsible for the transition to operations and assumes ownership of the RSM program when complete.
Title: Executive Director; Description: As the program manager, the executive director is responsible for daily operations and progress of the program.
Title: Chief Information Officer; Description: As the deputy associate director for the Center for Information Services (CIS), the chief information officer is responsible for providing support and oversight for acquisition, systems, contract, and security management. CIS supports the transition and modification of legacy systems, which are owned by this unit.
[End of table]

Background: GAO Prior Review:

In February 2005, we reported that OPM lacked processes for RSM acquisition activities such as determining requirements, developing acquisition strategies, and implementing a risk program. Furthermore, the agency also had not yet established effective security management, change management, and program executive oversight. [Footnote 6] Accordingly, we recommended that the director of OPM ensure that the RSM program office expeditiously establish processes for effective oversight of RSM. We made nine recommendations in the areas of system acquisition management, information security, organizational change management and IT investment management.

Between 2005 and 2007, OPM made progress towards establishing management processes for RSM and demonstrated the completion of activities with respect to each of the nine recommendations. Specifically, the agency developed key system acquisition processes for determining requirements, developed acquisition strategies, and implemented a risk management program. Additionally, OPM developed information security plans and requirements, updated organizational change management plans, and instituted processes for guiding the executive investment oversight committee. For example, OPM:
* developed both system specific and programwide security plans along with a set of security requirements for RSM;
* updated its change management plans to include tasks and milestones that represented the current acquisition approach; and;
* developed and established processes to guide the executive steering committee's activities through a charter and program management plans.
As a result of these actions, OPM improved its ability to select contractors, define system and security requirements, manage risks, plan organizational change, and provide RSM program executive oversight.

Objective 1: Management Effectiveness:

OPM's Management of the RSM Program Has Not Ensured that System Components Will Perform as Intended, but IV&V Capability Could Result in Improvements.

OPM recently established performance targets for the improvements to retirement processing accuracy and timeliness that it expects RSM to achieve. However, the agency's management of RSM in other areas that are important to successful deployment of new systems has not ensured that system components will perform as intended. Key areas of program management weaknesses are system testing and defect management. The agency recently established an IV&V capability, which should identify and recommend areas for improvement.

Objective 1: Performance Measures:

Performance targets for expected improvements to retirement processing were recently developed. The Government Performance and Results Act of 1993 provides, among other things, that federal agencies establish program performance measures, including the assessment of relevant outputs and outcomes of program activities. By analyzing the gap between target measurements and actual levels of performance, management can focus on those processes that are most in need of improvement, set improvement goals, and identify appropriate process improvements or other actions. [Footnote 7] Recognizing the importance of performance measures, OPM has established goals for measuring the accuracy and timeliness of current retirement claims processing. OPM also has defined targets that relate to measuring progress towards these goals and reports actual performance.
Specifically, in 2006 the agency reported targets for current retirement claims processing, including:
* 93 percent accuracy (actual accuracy was 89 percent) and;
* 30-day claims processing (actual performance was 41 days).
According to the RSM Program Management Plan, OPM expects RSM to achieve more accurate and faster retirement claims processing. In November 2007, the agency completed its development of new performance targets for timeliness and accuracy. Specifically, the agency developed new targets for future retirement claims processing, including:
* 99 percent accuracy and;
* 30-day processing for 99 percent of claims.
According to the OPM Director, the agency plans to use these targets beginning in February 2008. By establishing these targets, OPM should be positioned to determine whether the technology that it plans to deploy in February 2008 and beyond will result in the timeliness and accuracy improvements in retirement processing that it has asserted RSM will achieve.

Objective 1: Test Results:

Test results do not provide assurance that a major system component will perform as intended. Effective testing is an essential component of any program that includes system development. Generally, the purpose of testing is to identify defects or problems in meeting defined system requirements or satisfying system user needs. [Footnote 8] Recognizing the importance of testing, OPM plans to perform user acceptance tests (UATs) that are intended to verify that DBTS meets requirements such as accurately calculating retirement benefits. The agency plans six UATs, to be conducted incrementally as the system is developed, and established goals, expressed as the percentage of test scenarios passed, for each UAT. Although the results of UAT 1 slightly exceeded the goal, the results of UATs 2, 3, and 4 fell far short of meeting the established goals and indicate that DBTS is not performing as intended.

Table 3: Test Results in Terms of Scenarios:
Test: UAT1; Scenarios Tested: 44; Scenarios Passed: 34; Scenarios Failed: 10; Passed Goal (percent): 75; Passed Actual (percent): 77.
Test: UAT2; Scenarios Tested: 179; Scenarios Passed: 8; Scenarios Failed: 171; Passed Goal (percent): 80; Passed Actual (percent): 4.
Test: UAT3; Scenarios Tested: 193; Scenarios Passed: 42; Scenarios Failed: 151; Passed Goal (percent): 85; Passed Actual (percent): 22.
Test: UAT4; Scenarios Tested: 302; Scenarios Passed: 117; Scenarios Failed: 185; Passed Goal (percent): 90; Passed Actual (percent): 39.
[End of table]

OPM officials acknowledged that these test results showed that DBTS had not performed as intended and stated that they expected the UAT 5 test results to indicate continued quality improvement. Until actual test results indicate improvement in the quality of DBTS, OPM faces increased risk that it will deploy technology that does not work as expected (e.g., does not accurately calculate retirement benefits) in February 2008.

Objective 1: Test Schedule:

Testing schedule is compressed and concurrency of tests is increased. In addition to UATs, OPM planned tests that we and others recognize as important to ensuring that system components and the new system as a whole perform as intended. [Footnote 9] These tests are intended to verify that DBTS and other RSM components work together as intended when they are combined and that the complete system resulting from the RSM program satisfies all requirements (e.g., functional and performance) and is acceptable to end users.
Specifically, OPM plans the following tests:
* integrated product test to confirm that DBTS, modified legacy systems, and associated interfaces meet functional requirements (e.g., accurately calculate benefits).
* performance test to confirm that the new system meets performance requirements (e.g., processing volume and execution time).
* parallel test to verify that the new system produces the same results as existing systems.
* business capability test to confirm the operational readiness of the new system for end users.

OPM planned to perform these tests in sequence over about 5 months from late July 2007 through December 2007. In November 2007 OPM revised its test schedule. According to the revised schedule, the integrated product test was delayed from late July 2007 until late January 2008. In order to maintain its schedule to deploy technology in late February 2008, the agency now plans to reduce the time necessary to conduct the integrated product, performance, parallel, and business capability tests by performing them concurrently within about a 2-1/2 month time period beginning in mid-December 2007 and concluding in late February 2008. OPM's original test schedule is compared to the actual or revised schedule in figure 2.

Figure 2: RSM Original Versus Actual or Revised Test Schedule:
[See PDF for image]
Original dates: UAT1: Mid-March, 2007; UAT2: Late May, 2007; UAT3: Early August, 2007; UAT4: Late September, 2007; Integrated product test: August and September, 2007; Performance test: October, 2007; Parallel/business capability release test: November and December, 2007; UAT5: Late December, 2007; Increment 1 deployment: Early February, 2008.
Revised or Actual dates: UAT1: Mid-March, 2007; UAT2 part 1: Late May, 2007; UAT2 part 2: Early June, 2007; UAT3 part 1: Early August, 2007; UAT3 part 2: Mid-September, 2007; UAT4: Mid-October, 2007; UAT5: Mid-December, 2007; Parallel test: Mid-to-late December, 2007; UAT6: Mid-January, 2008; Parallel test: Mid-January to Mid-February, 2008; Performance test: Early to Mid-February, 2008; Integrated product test: Early to Mid-February, 2008; Business capability release test: February, 2008; Increment 1 deployment: Late February, 2008.
Source: GAO based on OPM data.
[End of figure]

OPM identified the lack of testing resources, including the availability of subject matter experts, and the need for further system development as contributing to the delay of planned tests and the need for concurrent testing. As a result of test concurrency, OPM is faced with performing a significant volume of concurrent test activities at the same time that critical resources, particularly key staff, are also engaged in activities such as completing the building of interfaces and modification of legacy systems. As we have previously reported, concurrent tests can require additional time if critical defects are found that necessitate stopping all affected tests, fixing the baseline configuration, and then restarting the tests. [Footnote 10] The high degree of concurrent testing that OPM now plans increases the risk that OPM will not have sufficient resources or time to verify that the technology it plans to deploy in February 2008 works as expected.

Objective 1: Defect Management:

Trends in defects indicate a growing backlog of problems to be resolved before deployment. In addition to test results, a measure of system maturity and quality is trends in defects.
Defects are system problems that require a resolution and can be due to a failure to meet the system specifications. Defects are often identified prior to and during system tests. As we have previously reported, having current and accurate defect information is necessary to adequately understand system maturity and to make informed decisions about how to best allocate limited resources to meet competing priorities. [Footnote 11]

The contractor that is providing DBTS identified defects during its work to modify the system to meet OPM's requirements. Each defect is documented and assigned a unique identifier, status, and priority. According to the contractor, defects are categorized according to the following priority categories:
* Urgent priority defects prevent progress of the solution in the current system phase.
* High priority defects are items that need to be addressed in the current system phase.
* Medium priority defects are items to be addressed in an upcoming system phase.
* Low priority defects are items that can be addressed during transition to deployment.

Our analysis of the contractor-identified defects showed a pattern of defect identification that is consistent with the axiom that defects are often identified prior to and during system tests. Because DBTS functionality remains to be developed, integrated, and tested, the pattern of defect identification is likely to continue. Further, the RSM executive director stated that OPM expects the number of defects identified to reach a peak in December 2007. Figure 3 shows the total number of defects that the DBTS contractor identified on a weekly basis.

Figure 3: New Defects Identified per Week:
[See PDF for image]
This figure is a multiple line graph depicting the number of new defects identified per week. The vertical axis of the graph represents number of defects from 0 to 200. The horizontal axis of the graph represents weekly dates from January 6, 2007 to October 19, 2007. Lines represent defects in three categories: high, urgent, and total. Also depicted are the following start dates, with defect totals approximated from the graph:
UAT1 Start: Date: 3/13/2007; High: less than 5; Urgent: less than 5; Total: less than 10.
UAT2 Start: Date: 5/21/2007; High: approximately 30; Urgent: approximately 20; Total: approximately 70.
UAT3 Start: Date: 8/6/2007; High: approximately 25; Urgent: approximately 10; Total: approximately 45.
UAT4 Start: Date: 10/15/2007; High: approximately 20; Urgent: approximately 10; Total: approximately 30.
Source: GAO analysis of OPM data.
[End of figure]

At the end of October 2007, a total of 367 defects remained open. Nine of these defects were assigned urgent priority and 129 were high priority. Among these defects, one identified as urgent related to a system weakness that could result in the generation of different identifications for the same Social Security number. A high priority defect was identified as a result of DBTS indicating that a user already had an active session when logging back in to the system hours after having logged out. Figure 4 shows the cumulative number of defects that remained open at the end of each week.

Figure 4: Cumulative Total, Urgent, and High Open Defects:
[See PDF for image]
This figure is a multiple line graph depicting the cumulative total, urgent, and high open defects. The vertical axis of the graph represents number of defects from 0 to 400. The horizontal axis of the graph represents weekly dates from January 6, 2007 to October 19, 2007.
Lines represent defects in three categories: high, urgent, and total. The cumulative totals, as approximated from the graph at the end of the tracking on 10/19/2007 appear to be as follows: High: approximately 20; Urgent: approximately 100; Total: approximately 350.
Source: GAO analysis of OPM data.
[End of figure]

The increasing numbers of defects that remain open (shown in figure 4) indicate a growing backlog of defects. Because two additional user acceptance tests as well as the integrated product test, parallel test, and performance/business capability test remain to be conducted, additional defects could be identified. Until defect trends indicate resolution of the backlog of urgent and high priority defects, which by definition are to be resolved prior to deployment, OPM faces increased risk that it will not have sufficient time to resolve significant problems before its planned February 2008 deployment.

Objective 1: Independent Verification and Validation:

OPM recently engaged an IV&V contractor, which could result in improvements. The purpose of independent verification and validation is to provide an independent review of system processes and products to ensure that quality standards are being met. As we have previously reported, the use of IV&V is a recognized best practice for large and complex system development and acquisition projects such as RSM and involves an independent organization conducting unbiased reviews of processes, products, and results to verify and validate that they meet stated requirements and standards. [Footnote 12] OPM's February 2007 RSM IV&V Approach emphasized the importance of IV&V for identifying and correcting problems as well as for providing visibility into the system in order to deliver it on schedule and within budget.

In August 2007, OPM awarded a contract for IV&V of the RSM program. According to the contract statement of work, the scope of the contractor's work includes reviewing project approaches, plans, analyses, methods, processes, and deliverables. Further, according to the November 2007 IV&V plan, the contractor is expected to recommend approaches for resolving issues regarding design, development, testing, and any potential problem area. As a result of instituting IV&V, OPM should be better positioned to identify and correct deficiencies in the RSM program and deploy technology that performs as expected.

Objective 2: Risks, Cost, and Progress:

OPM Has Identified Program Risks but Has Not Reliably Estimated RSM's Cost or Reported Progress.

OPM has established and used a risk management process to identify RSM program risks. However, the agency's current RSM life-cycle cost estimate of $421.6 million is not supported by necessary documentation and is thus of questionable reliability. OPM's two methods of reporting program progress, by reporting achievement of goals and using EVM, provide a favorable view of progress. However, the agency's EVM reporting was unreliable and neither of the two progress reporting methods reflected OPM's decision to delay deployment of a portion of the technology originally planned for February 2008.

Objective 2: Risks:

OPM has established a risk management process and identified risks. Risk management is vital to the success of a program such as RSM.
Relevant best practice guidance advocates proactively identifying facts and circumstances that can increase the probability of a program failing to meet cost, schedule, and performance commitments and then taking steps to reduce the probability of their occurrence and impact. [Footnote 13]

OPM has established a risk management program that includes, among other things, written policies and procedures, roles and responsibilities, and guidance for identifying, prioritizing, and mitigating risks. Additionally, the agency has implemented the program and uses a database to help track risks. Risks are summarized and reported at monthly program management review meetings. At the October 2007 monthly program management review, OPM reported the following risks:

* A clear action plan for work related to deployment of increment 1 must be defined, executed, and tracked.

* Training of employees for increment 1 must be developed and executed to ensure the transition.

* A significant volume of testing must be executed in a short time with constrained resources.

* Significant planning and coordination for preparing records for increment 1 is required.

* Interfaces and legacy system modifications required for increment 1 must be designed, built, and tested in a limited time frame.

As a result of identifying these and other risks, OPM should be positioned to reduce the probability of their occurrence and to reduce the impact if the risks do occur.

Objective 2: Cost:

OPM has estimated RSM cost, but reliability of the estimate is questionable. A cost estimate is the summation of individual program cost elements, using established methods and valid data to estimate future costs. The establishment of a reliable cost estimate is important for developing a program budget and having a sound basis for measuring performance, including comparing the actual and planned costs of program activities. Credible cost estimates are produced by following rigorous steps and are accompanied by detailed documentation, including descriptions of the system under development, estimation methodology, ground rules and assumptions, and sensitivity and uncertainty analyses. [Footnote 14]

In August 2007, OPM revised its RSM life-cycle cost estimate from $371.2 million to $421.6 million based in part on the estimated costs of major program activities such as contracting for the capture and conversion of paper files, development of the defined benefits technology solution, and program support. However, OPM's revised estimate was not supported by the documentation that is fundamental to a reliable cost estimate. Specifically, OPM did not document:

* a technical baseline description;

* a cost estimating methodology;

* ground rules and assumptions; and

* sensitivity and uncertainty analyses.

OPM officials asserted that the revised cost estimate is adequately supported by the firm-fixed-price contracts upon which the estimate is largely based. However, while the RSM contracts partially support the cost estimate, they do not provide sufficient documentation of the technical baseline description, estimation methodology, ground rules and assumptions, and sensitivity and uncertainty analyses. Without such supporting documentation, the reliability of the RSM cost estimate is questionable. Without a reliable cost estimate, OPM does not have a sound basis for formulating future RSM program budgets or for developing the program-level baseline that is necessary for measuring and predicting program performance.
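To make the summation and sensitivity concepts concrete, the following Python sketch rolls up a set of hypothetical cost elements and varies one element to produce a simple sensitivity range. The element names, dollar figures, and multipliers are invented for illustration; they are not the RSM estimate and do not describe OPM's estimation methodology.

# Hypothetical life-cycle cost elements, in millions of dollars (illustrative only).
cost_elements = {
    "records capture and conversion": 90.0,
    "defined benefits technology solution": 180.0,
    "program support": 60.0,
}

def point_estimate(elements):
    # A cost estimate is the summation of individual cost elements.
    return sum(elements.values())

def one_factor_sensitivity(elements, name, low=0.9, high=1.2):
    # Recompute the total with a single element varied between low and high
    # multipliers -- a minimal stand-in for a sensitivity analysis.
    base = point_estimate(elements)
    return (base - elements[name] + elements[name] * low,
            base - elements[name] + elements[name] * high)

total = point_estimate(cost_elements)                                    # 330.0
low_total, high_total = one_factor_sensitivity(
    cost_elements, "defined benefits technology solution")               # (312.0, 366.0)

A documented estimate would also record the basis for each element, the ground rules and assumptions behind any such multipliers, and the uncertainty analysis that accompanies them.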
Objective 2: Progress:

OPM reported that it met its fiscal year 2007 RSM program progress goals, but progress reporting using EVM was unreliable. OPM reported two views of RSM progress. Specifically, it described satisfaction of established program goals and used EVM as a progress measurement and reporting tool. With respect to goals, OPM established five fiscal year 2007 goals to support development and testing of the RSM program components necessary to deploy initial functionality in February 2008. As of October 2007, OPM reported that it met these goals.

Table 4: RSM FY 2007 Program Goals:

Operational Goal: Make data element dictionary available for Government Shared Service Centers; Date Completed: January 2007; Description: Published updated list of electronic human resources information needed for federal agencies to send records to OPM.

Operational Goal: Complete development of licensed technology – employee/client application; Date Completed: April 2007; Description: Delivered licensed technology solution to OPM for testing and development of interfaces.

Operational Goal: Begin RSM training; Date Completed: May 2007; Description: Finalized training approach and started courses.

Operational Goal: Develop licensed technology for GSA active employees; Date Completed: July 2007; Description: Delivered increment 1 data for the licensed technology solution.

Operational Goal: Complete Active Employee Folder imaging in the Retirement Operations Center; Date Completed: September 2007; Description: Captured paper records stored at the Retirement Operations Center.

[End of table]

OPM also uses EVM to measure and report program progress. EVM is a tool for measuring program progress by comparing the value of work accomplished with the amount of work expected to be accomplished. Such a comparison permits actual performance to be evaluated, based on variances from the planned cost and schedule, and future performance to be forecasted. Identification of significant variances, which OPM defines as plus or minus 10 percent from planned cost and schedule, and analysis of their causes help program managers determine the need for corrective actions. Fundamental to reliable EVM is the development of a performance measurement baseline (PMB), which represents the cumulative value of planned work and is the baseline against which variances are calculated. To establish a meaningful PMB, programs must fully:

* define the work in a work breakdown structure;

* develop a complete integrated master schedule; and

* formulate budgets for all planned work. [Footnote 15]

After these inputs are integrated to develop the PMB, it should be validated through an integrated baseline review during which stakeholders reach agreement on the baseline. Once validated, the PMB is closely controlled and generally not subject to change unless events beyond the program manager's control occur (e.g., changes to program scope).

OPM used EVM to measure and report monthly performance. The agency's October 2007 monthly report identified a cumulative actual cost of $110.1 million for the work performed on RSM through September 2007. OPM's comparison of this actual cost with the budgeted cost resulted in a positive cost variance of about $0.5 million or less than 1 percent (i.e., the program created less than 1 percent more value than planned for the money spent) and a negative schedule variance of $1.3 million or about 1 percent (i.e., the program created about 1 percent less value than planned for the time spent).
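The arithmetic behind such variance reporting can be sketched briefly. In the Python below, the formulas are the standard earned value relationships; the function, the percentage conventions, and the sample inputs are illustrative rather than a description of OPM's reporting system. The inputs are approximations implied by the figures just described (an actual cost of about $110.1 million, a cost variance of about +$0.5 million, and a schedule variance of about -$1.3 million).

def evm_variances(earned_value, actual_cost, planned_value):
    # Standard relationships: cost variance CV = EV - AC; schedule variance SV = EV - PV.
    cv = earned_value - actual_cost          # positive means under planned cost
    sv = earned_value - planned_value        # negative means behind schedule
    cv_pct = cv / earned_value * 100         # one common percentage convention
    sv_pct = sv / planned_value * 100
    return cv, sv, cv_pct, sv_pct

# Approximations implied by the October 2007 figures, in millions of dollars:
# an actual cost of ~110.1 and variances of ~+0.5 and ~-1.3 imply an earned
# value of ~110.6 and a planned value of ~111.9.
cv, sv, cv_pct, sv_pct = evm_variances(110.6, 110.1, 111.9)
# cv ~ +0.5 (about +0.45 percent); sv ~ -1.3 (about -1.16 percent)

THRESHOLD = 10.0   # OPM's significant-variance threshold: plus or minus 10 percent
significant = abs(cv_pct) > THRESHOLD or abs(sv_pct) > THRESHOLD   # False for these values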
These reported results provided a favorable view of program performance over time because the variances indicated the program was progressing almost exactly as planned. Further, OPM's earned value reporting indicated that such favorable performance had been sustained, as shown by the agency's reporting of cumulative variances of less than plus or minus 2 percent since September 2006. However, this view of program performance is not reliable because it is not consistent with established EVM practices:

* The baseline upon which the results were based was derived from a work breakdown structure and an integrated master schedule that did not reflect the full scope of the program. For example, the full scope of system integration activities such as legacy system modification and the development of interfaces critical to deploying initial functionality in February 2008 was not included.

* An integrated baseline review had not been conducted to validate the baseline.

* The baseline was unstable. For example, the baseline used to measure performance was $137 million in September 2006 and changed ten times between then and the $175.1 million baseline reported in September 2007.

Further, OPM's recently reported EVM-based view of favorable program progress was not consistent with the state of the program. Specifically, in September 2007, the agency decided to delay deployment of new retirement system technology for already retired federal employees, originally planned for February 2008, to August 2008 because the agency did not have sufficient time and resources to modify its legacy systems and develop interfaces necessary to fully support the originally planned deployment.

OPM officials acknowledged that their approach to instituting EVM reflected a lack of sufficient confidence in their definition of the program scope to establish a PMB that could be validated in an integrated baseline review. In September 2007, the agency reported that it had established a PMB that it plans to validate in an integrated baseline review in December 2007. At the time we concluded our review in November 2007, the agency had not provided a PMB for our analysis. This EVM approach, whereby OPM frequently revised its baseline in lieu of establishing and controlling a valid PMB, in effect ensured that material program-level variances from planned performance over time (i.e., above the 10 percent threshold) would not be identified and that the state of the program would not be reliably reported.

Conclusions:

To its credit, OPM has undertaken the RSM program to expedite retirement processing for civilian federal employees, and the agency reported that it has met key program goals. Further, OPM has improved its management processes for selecting contractors, defining system and security requirements, managing risks, planning organizational change, and providing program executive oversight. Nevertheless, much remains to be accomplished before the program is effectively positioned to deploy its first planned increment of new technology in February 2008.

OPM developed performance targets necessary to gauge the success of the new system with respect to the major objectives of increasing the timeliness and accuracy of retirement claims processing. The agency also recognized the value that an independent review of the RSM program could provide and entered into a contract to obtain such a review, which should help the agency identify and address weaknesses in its management of the RSM program.
However, the agency's program management has not ensured that system components will perform as intended. In particular, initial test results indicate that the defined benefits technology solution that is a major component of the new system has not performed as intended, and future system tests are to be conducted concurrently in about half the time originally planned. Compounding this already risky scenario, the contractor that is providing the defined benefits technology solution continues to identify system defects faster than they can be resolved, thus building a backlog of defects that will need to be resolved and tested within the same time period that OPM is concurrently conducting important system tests.

OPM recognized the importance of risk management and has established a risk management process and identified program risks. However, the agency has not yet developed the capability to reliably analyze and report RSM progress. Such progress reporting should be grounded in a reliable cost estimate that is in part the basis for reliable earned value management. Without a reliable cost estimate, OPM does not have a firm foundation for the RSM program budget or for reliable earned value management reporting. Until OPM makes improvements to the RSM program in the areas discussed above, the agency risks not achieving successful program outcomes, including the planned deployment of new technology beginning in February 2008.

GAO Recommendations:

To address the risks to OPM's deployment of new retirement system technology and improve the agency's ability to reliably report progress of the RSM program, we are recommending that the Director of the Office of Personnel Management direct the RSM Executive Director to take the following actions:

* Ensure that sufficient resources are provided to fully test functionality, actions for mitigating the risks inherent in concurrent testing are identified, test results verify that all system components perform as expected, and test activities and results are subjected to independent verification and validation.

* Monitor and review DBTS defects to ensure all urgent and high priority defects are resolved prior to system deployment and that the resolution of urgent and high priority defects is subjected to independent verification and validation.

* Develop a revised RSM cost estimate that addresses the weaknesses identified in this briefing and task an independent verification and validation contractor with reviewing the process used to develop the estimate and assessing the reliability of the resulting estimate.

* Establish a basis for effective use of earned value management by validating the RSM performance measurement baseline through a program-level integrated baseline review and task an independent verification and validation contractor with reviewing the process used to develop the baseline and assessing the reliability of the performance measurement baseline.

Agency Comments and Our Evaluation:

In oral comments on a draft of this briefing, OPM officials, including the Director and the RSM Executive Director, generally agreed with our recommendations and provided additional information and written technical comments related to program activities, which we incorporated in the briefing as appropriate.
In the comments, the Director stated that the agency:

* had contracted in October 2007 for additional resources and expertise to help modify and test DBTS;

* continues to actively track the identification and resolution of system defects and is in the process of determining which open defects must be resolved prior to deployment of new retirement system technology in February 2008; and

* plans to review guidance on preparing reliable cost estimates.

[End of section]

Appendix II: Comments from the Office of Personnel Management:

The Director:
United States Office Of Personnel Management:
Washington, DC 20415:
[hyperlink, http://www.opm.gov]:
[hyperlink, http://www.usajobs.gov]:

Our mission is to ensure the Federal Government has an effective civilian workforce.

January 15, 2008:

The Honorable David M. Walker:
Comptroller General:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:

Dear Mr. Walker:

Thank you for the opportunity to provide comments in response to the Government Accountability Office's (GAO) draft report entitled Office of Personnel Management: Improvements Needed to Ensure Successful Retirement Systems Modernization (GAO-08-345SU). The Office of Personnel Management (OPM) appreciates GAO's recognition of the process improvements made for managing and executing the agency's Retirement Systems Modernization (RSM) program. We also acknowledge the importance of the four recommendations outlined in the draft report and are already taking steps to address them.

Developing and implementing a retirement system that provides Federal employees and annuitants with a modern and reliable information technology platform is one of my top priorities. This historic modernization is one of the largest projects of its type and is long overdue. Unlike past initiatives to address this major deficiency in a core OPM function, this project will have a successful outcome. When fully live, retirement administration processes resulting from the Retirement Systems Modernization effort will provide high quality customer service, enhanced modeling and planning tools, and prompt, complete payment on annuity commencement.

To accomplish our objectives, OPM has invested a tremendous amount of time, energy, and resources to enable deployment of the new system for the first wave of employees in late February 2008. Project management includes the highest level of agency executives. Such accepted practices as Earned Value Management and monthly performance management review meetings where status, risks, and other issues of the modernization effort are addressed have been in place since project inception. We also dedicated additional resources with subject matter expertise in the Defined Benefit Technology Solution to monitor, evaluate, and troubleshoot building and testing of the solution. OPM has reviewed and verified the components of the existing cost estimate and conducted sensitivity analysis on the resulting program costs. We believe our comprehensive strategy, which includes these and other actions, will address GAO's concerns outlined in the draft report and thereby enable the system components to perform as intended.

I am also pleased to report that the results of our latest User Acceptance Test (UAT 5) yielded positive trends for identifying and resolving system defects. Specifically, the results of UAT 5 indicate that the backlog of urgent and high priority defects has been reduced from UAT 4 levels.
This supports the anticipated pattern we communicated to GAO, namely that testing scores will improve while the defect rate declines as we approach the February "Go Live" milestone.

We appreciate GAO's insightful recommendations outlined in the draft report. OPM is taking the necessary steps to ensure that the Federal Government has a state-of-the-art retirement system for annuitants and employees. We look forward to giving GAO an update after "Go Live" for the first wave of employees.

Sincerely,

Signed by:

Linda M. Springer:
Director:

[End of section]

Appendix III: GAO Contact and Staff Acknowledgments:

GAO Contact:

Valerie C. Melvin (202) 512-6304 or melvinv@gao.gov:

Staff Acknowledgments:

In addition to the contact named above, key contributions to this report were made by Mark T. Bird, Assistant Director; Neil J. Doherty; David A. Hong; Jacqueline K. Mai; Teresa M. Neven; B. Scott Pettis; Margaret E. Poston; and Amos A. Tevelow.

Footnotes:

[1] EVM is a management tool used for measuring program performance by comparing the value of work accomplished with the amount of work expected to be accomplished. Such a comparison permits performance to be evaluated based on variances from the planned cost and schedule.

[2] EVM is a management tool used for measuring program performance by comparing the value of work accomplished with the amount of work expected to be accomplished. Such a comparison permits performance to be evaluated based on variances from the planned cost and schedule.

[3] Office of Personnel Management, OPM Strategic and Operational Plan 2006-2010 (Washington, D.C.: March 2006).

[4] Defined benefit plans calculate benefit amounts in advance of retirement based on factors such as salary level and years of service, and defined contribution plans calculate benefit amounts based on how the amount is invested by the employee and employer.

[5] The Social Security Administration is responsible for administering Social Security, and the Federal Retirement Thrift Investment Board administers the defined contribution system known as the Thrift Savings Plan.

[6] GAO, Office of Personnel Management: Retirement Systems Modernization Faces Numerous Challenges, GAO-05-237 (Washington, D.C.: February 28, 2005).

[7] GAO, Performance Measurement and Evaluation, GAO-05-739SP (Washington, D.C.: May 2005).

[8] GAO, Year 2000 Computing Crisis: A Testing Guide, GAO/AIMD-10.1.21 (Washington, D.C.: November 1998); Information Technology: Customs Automated Commercial Environment Progressing, but Need for Management Improvements Continues, GAO-05-267 (Washington, D.C.: March 14, 2005); and Homeland Security: Visitor and Immigrant Status Program Operating, but Management Improvements Are Still Needed, GAO-06-318T (Washington, D.C.: January 25, 2006).

[9] GAO/AIMD-10.1.21.

[10] GAO, 2000 Census: New Data Capture System Progress and Risks, GAO/AIMD-00-61 (Washington, D.C.: February 4, 2000).

[11] GAO, Customs Service Modernization: Automated Commercial Environment Progressing, but Further Acquisition Management Improvements Needed, GAO-03-406 (Washington, D.C.: February 28, 2003); Homeland Security: Visitor and Immigrant Status Program Operating, but Management Improvements Are Still Needed, GAO-06-318T (Washington, D.C.: January 25, 2006); and GAO/AIMD-00-61.

[12] GAO, U.S.
Customs Service: Observations on Selected Operations and Program Issues, GAO-01-968T (Washington, D.C.: July 17, 2001) and Homeland Security: First Phase of Visitor and Immigration Status Program Operating, but Improvements Needed, GAO-04-586 (Washington, D.C.: May 11, 2004). [13] Software Engineering Institute, Software Acquisition Capability Maturity Model® version 1.03, CMU/SEI-2002-TR-010 (Pittsburgh, PA: March 2002). [14] GAO, Cost Assessment Guide: Best Practices for Estimating and Managing Program Costs, Exposure Draft, GAO-07-1134SP (Washington, D.C.: July 2007). [15] GAO-07-1134SP. [End of section] GAO's Mission: The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. Obtaining Copies of GAO Reports and Testimony: The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "Subscribe to Updates." Order by Mail or Phone: The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to: U.S. Government Accountability Office: 441 G Street NW, Room LM: Washington, D.C. 20548: To order by Phone: Voice: (202) 512-6000: TDD: (202) 512-2537: Fax: (202) 512-6061: To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]: E-mail: fraudnet@gao.gov: Automated answering system: (800) 424-5454 or (202) 512-7470: Congressional Relations: Gloria Jarmon, Managing Director, JarmonG@gao.gov: (202) 512-4400: U.S. Government Accountability Office: 441 G Street NW, Room 7125: Washington, D.C. 20548: Public Affairs: Chuck Young, Managing Director, youngc1@gao.gov: (202) 512-4800: U.S. Government Accountability Office: 441 G Street NW, Room 7149: Washington, D.C. 20548:
