Defense Logistics

Actions Needed to Improve Implementation of the Army Logistics Modernization Program GAO ID: GAO-10-461, April 30, 2010

The Logistics Modernization Program (LMP) is an Army business system that is intended to replace the aging Army systems that manage inventory and depot repair operations. Through 2009, the Army obligated more than $1 billion for LMP. LMP was originally scheduled to be completed by 2005, but after the first deployment in July 2003, the Army delayed fielding because of significant problems. The Army has since decided to field the system in two additional deployments: the second deployment occurred in May 2009, and the third deployment is scheduled to occur in October 2010. GAO was asked to evaluate the effectiveness of the Army's management processes in enabling the second deployment sites to realize the full benefits of LMP.

The Army's management processes that were established prior to the second deployment of LMP were not effective in enabling the second deployment sites to realize the full benefits of LMP. When LMP becomes fully operational at the second deployment locations, the Army expects that it will significantly enhance depot operations. However, the Army was unable to ensure that the data used by LMP were of sufficient quality to enable the depots to perform their day-to-day missions after LMP became operational. As a result of these data quality issues, depot personnel had to develop and use manual work-around processes until they could correct the data in LMP, which prevented the Army from achieving the expected benefits from LMP. Data quality issues occurred despite improvements made by the Army to address data issues experienced during the first deployment of LMP because the Army's testing strategy did not provide reasonable assurance that the data being used by LMP were accurate and reliable. Instead, the Army's testing efforts focused on whether the software was functioning, but did not assess whether the data used by the depots to perform their repair missions were of sufficient quality to work in LMP. According to depot officials, the data problems are being corrected as they are identified. Additionally, the Army's training strategy did not effectively provide LMP users the skills necessary to perform their new tasks. Users at the depots stated that the training they received did not provide a realistic environment that showed them how to perform their expected duties, and did not always match their new responsibilities. However, users at the depots also stated that they had received additional training that resolved the issue. The Army also lacked a comprehensive set of metrics with which to measure the success of LMP implementation. GAO's previous work has shown that successful performance measures should be aligned throughout the organization and cover the activities that an entity is expected to perform. However, the Army did not have common metrics with which to measure success during the second deployment, and the Army's scorecard for measuring LMP implementation focused on the software, but did not assess whether the depots were able to perform their work using LMP as envisioned. Despite these challenges, LMP has provided the Army some benefits, and officials at the second deployment sites provided examples of how LMP had improved their day-to-day operations, for example, through the increased visibility of assets. The third deployment of LMP is scheduled to occur in October 2010, and will involve more commands, occur at locations across the globe, and affect more users than the previous deployments. LMP program management officials stated that they are taking steps to address the issues discussed in this report for the third deployment and are adjusting plans related to data testing and training. However, because these plans are being developed, GAO was unable to verify that the problems have been resolved. Without correcting these issues prior to the third deployment, the Army is likely to face similar, or potentially greater, problems that prevent it from realizing the full benefits of LMP.

Recommendations

Our recommendations from this work are listed below with a contact for more information. Status will change from "In process" to "Open," "Closed - implemented," or "Closed - not implemented" based on our follow-up work.

Director: William M. Solis
Team: Government Accountability Office, Defense Capabilities and Management
Phone: (202) 512-8365


Report to the Chairman, Subcommittee on Readiness, Committee on Armed Services, House of Representatives:

United States Government Accountability Office:

GAO:

April 2010:

Defense Logistics: Actions Needed to Improve Implementation of the Army Logistics Modernization Program:

GAO-10-461:

GAO Highlights:

Highlights of GAO-10-461, a report to the Chairman, Subcommittee on Readiness, Committee on Armed Services, House of Representatives.

Why GAO Did This Study:

The Logistics Modernization Program (LMP) is an Army business system that is intended to replace the aging Army systems that manage inventory and depot repair operations. Through 2009, the Army obligated more than $1 billion for LMP. LMP was originally scheduled to be completed by 2005, but after the first deployment in July 2003, the Army delayed fielding because of significant problems. The Army has since decided to field the system in two additional deployments: the second deployment occurred in May 2009, and the third deployment is scheduled to occur in October 2010. GAO was asked to evaluate the effectiveness of the Army's management processes in enabling the second deployment sites to realize the full benefits of LMP.

What GAO Found:

The Army's management processes that were established prior to the second deployment of LMP were not effective in enabling the second deployment sites to realize the full benefits of LMP. When LMP becomes fully operational at the second deployment locations, the Army expects that it will significantly enhance depot operations. However, the Army was unable to ensure that the data used by LMP were of sufficient quality to enable the depots to perform their day-to-day missions after LMP became operational. As a result of these data quality issues, depot personnel had to develop and use manual work-around processes until they could correct the data in LMP, which prevented the Army from achieving the expected benefits from LMP.
Data quality issues occurred despite improvements made by the Army to address data issues experienced during the first deployment of LMP because the Army's testing strategy did not provide reasonable assurance that the data being used by LMP were accurate and reliable. Instead, the Army's testing efforts focused on whether the software was functioning, but did not assess whether the data used by the depots to perform their repair missions were of sufficient quality to work in LMP. According to depot officials, the data problems are being corrected as they are identified. Additionally, the Army's training strategy did not effectively provide LMP users the skills necessary to perform their new tasks. Users at the depots stated that the training they received did not provide a realistic environment that showed them how to perform their expected duties, and did not always match their new responsibilities. However, users at the depots also stated that they had received additional training that resolved the issue. The Army also lacked a comprehensive set of metrics with which to measure the success of LMP implementation. GAO's previous work has shown that successful performance measures should be aligned throughout the organization and cover the activities that an entity is expected to perform. However, the Army did not have common metrics with which to measure success during the second deployment, and the Army's scorecard for measuring LMP implementation focused on the software, but did not assess whether the depots were able to perform their work using LMP as envisioned. Despite these challenges, LMP has provided the Army some benefits, and officials at the second deployment sites provided examples of how LMP had improved their day-to-day operations, for example, through the increased visibility of assets. The third deployment of LMP is scheduled to occur in October 2010, and will involve more commands, occur at locations across the globe, and affect more users than the previous deployments. LMP program management officials stated that they are taking steps to address the issues discussed in this report for the third deployment and are adjusting plans related to data testing and training. However, because these plans are being developed, GAO was unable to verify that the problems have been resolved. Without correcting these issues prior to the third deployment, the Army is likely to face similar, or potentially greater, problems that prevent it from realizing the full benefits of LMP.

What GAO Recommends:

In order to improve the third deployment of LMP, GAO is recommending that the Secretary of the Army direct the Commanding General, Army Materiel Command, to (1) improve testing activities to obtain reasonable assurance that the data used by LMP can support the LMP processes, (2) improve training for LMP users, and (3) establish performance metrics to enable the Army to assess whether the deployment sites are able to use LMP as intended. The Army concurred with our recommendations.

View [hyperlink, http://www.gao.gov/products/GAO-10-461] or key components. For more information, contact William M. Solis at (202) 512-8365 or solisw@gao.gov or Asif A. Khan at (202) 512-9869 or khana@gao.gov or Nabajyoti Barkakati at (202) 512-4499 or barkakatin@gao.gov.
[End of section]

Contents:

Letter:
Background:
Army's Management Processes Not Effective in Enabling Second Deployment Sites to Achieve LMP Benefits:
Conclusion:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Scope and Methodology:
Appendix II: Comments from the Department of Defense:

Tables:
Table 1: LMP Deployment Summary:
Table 2: Post Go-Live Scorecard:

[End of section]

United States Government Accountability Office:
Washington, DC 20548:

April 30, 2010:

The Honorable Solomon P. Ortiz:
Chairman:
Subcommittee on Readiness:
Committee on Armed Services:
House of Representatives:

Dear Mr. Chairman:

For decades, the Department of Defense (DOD) has been challenged in modernizing its business systems.[Footnote 1] In 1995, we designated DOD's business system modernization program as high risk, and continue to do so because these systems are fundamental to addressing long-standing weaknesses related to the management of contracts, finance, and the supply chain.[Footnote 2] Moreover, we continue to report on business system investments that fail to effectively deliver promised benefits and capabilities on time and within budget.[Footnote 3] Organizations that implement enterprise resource planning systems have faced substantial challenges because of the complexity of the endeavor, and the Army is no exception. In 1998, the Army initiated the Logistics Modernization Program (LMP) in order to replace two aging Army systems used to manage its inventory and its repair operations at the depots. LMP was originally scheduled to be completed by 2005, but the Army delayed fielding LMP because of the significant problems faced by the first deployment sites--the Communications-Electronics Command and Tobyhanna Army Depot--in July 2003, which we detailed in several reports.[Footnote 4] The Army has since determined that fielding of LMP would occur in two additional phases of deployment: the Aviation and Missile Command in 2009, which includes Corpus Christi Army Depot and Letterkenny Army Depot, and the Tank-automotive and Armaments Command, the Joint Munitions and Lethality Command, and the Army Sustainment Command in October 2010. Through 2009, the Army has obligated more than $1 billion to implement LMP, and estimates a total life cycle cost in excess of $2.6 billion to procure and operate the system. The Army expects that LMP will reduce redundant and stovepiped information technology investments and assist in driving business transformation across the Army, which will enable the Army to supply and service the warfighter more quickly and cost effectively.

We previously assessed the Army's preparation for the second deployment of LMP and reported that the Army had begun to implement our prior recommendations related to past issues on data conversion, billing and collection, requirements management and testing, and independent verification and validation.[Footnote 5] We also reported that the Army had implemented critical project management processes and controls that enabled the Army to identify and understand the risks associated with making a deployment decision. We further noted that these critical management processes--which addressed data conversion from legacy systems to LMP, training LMP users, and the ability to rapidly evaluate and respond to potential issues--were absent during the first deployment of LMP at Tobyhanna Army Depot.
However, the effectiveness of the Army's management processes could not be evaluated until the second deployment had occurred on May 14, 2009. You asked us to continue monitoring the Army's efforts to deploy LMP and evaluate the Army's progress in addressing the issues that are critical to successful implementation. You also asked us to monitor the actions taken by the Army after the system has been deployed to ensure that its stated processes are adequate and have been effectively implemented. Accordingly, the objective of this review was to evaluate the effectiveness of the Army's management processes in enabling the second deployment sites to realize the full benefits of LMP. To address this objective, we reviewed and analyzed the Army plans and policies that governed LMP implementation. We obtained briefings from the LMP program management office on the intended purpose of LMP, as well as information related to the execution of the second deployment. We also monitored the second deployment of LMP as it occurred and had personnel observing operations at Corpus Christi Army Depot, Texas; Letterkenny Army Depot, Pennsylvania; and the LMP program management office's command center in Marlton, New Jersey. We met with officials at each location to discuss how they were using LMP to perform their missions and attended and observed daily meetings held by each location and among the locations where they discussed issues that had arisen and the actions they were taking to resolve the issues. After the initial deployment, we conducted follow-up visits to Corpus Christi Army Depot and Letterkenny Army Depot to monitor the progress of implementation. We also met with officials at the Army Materiel Command, the Aviation and Missile Command, and the LMP program management office to discuss how the Army was managing the implementation and using the system. Lastly, we met with LMP program management officials to discuss plans for the third deployment of LMP scheduled to occur in October 2010. We conducted this performance audit from May 2009 through March 2010 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Background: In February 1998, the Army Materiel Command began an effort to replace its existing materiel management systems--the Commodity Command Standard System and the Standard Depot System--with LMP. The Army has used these existing systems for over 30 years to manage inventory and depot maintenance operations. LMP is intended to transform the Army's logistics operations in six core processes: order fulfillment, demand and supply planning, procurement, asset management, materiel maintenance, and financial management. If effectively implemented, LMP is expected to provide the Army benefits associated with commercial best practices, such as inventory reduction, improved repair cycle time, and increased response time. Additionally, LMP is intended to improve supply and demand forecast planning and maintenance workload planning and to provide a single source of data for decision making. 
LMP became operational at the Army Communications-Electronics Command and Tobyhanna Army Depot in July 2003, and was originally expected to be fully deployed by fiscal year 2005. However, because of problems experienced at Tobyhanna Army Depot, the Army delayed implementation until the operational problems were resolved. In May 2009, LMP became operational at the Army Aviation and Missile Command and Corpus Christi and Letterkenny Army Depots, and the third and final deployment of LMP is expected to occur in October 2010 at the Army Sustainment Command, the Joint Munitions and Lethality Command, the Tank-automotive and Armaments Command, and Anniston and Red River Army Depots. When LMP is fully implemented, it is expected to include approximately 21,000 users at 104 locations across the globe (see table 1), and it will be used to manage more than $40 billion worth of goods and services, such as inventory managed at the national level and repairs at depot facilities.

Table 1: LMP Deployment Summary:

Communications-Electronics Command: Number of sites: 14; Approximate number of users: 4,000.
Aviation and Missile Command: Number of sites: 7; Approximate number of users: 6,000.
Army Sustainment Command, Joint Munitions and Lethality Command, and Tank-automotive and Armaments Command: Number of sites: 83; Approximate number of users: 11,000.
Total at full operational capability: Number of sites: 104; Approximate number of users: 21,000.

Source: U.S. Army.

[End of table]

According to LMP program management officials, preparations for the third deployment of LMP began in December 2008. In September 2009, leaders from the third deployment sites formally established the Executive Steering Committee to ensure a successful implementation of LMP during the third deployment. The committee's responsibilities include reviewing plans and schedules, exchanging lessons learned, performing ongoing assessments of readiness to implement LMP, and making decisions on important issues. As of January 2010, LMP program management officials told us that several education courses had been completed and that the third deployment sites were in the process of completing their initial data conversion activities.

Army's Management Processes Not Effective in Enabling Second Deployment Sites to Achieve LMP Benefits:

The Army's management processes that were established prior to the second deployment of LMP were not effective in enabling the second deployment sites to realize the full benefits of LMP. Based on our observations, we found that data quality issues prevented personnel at Corpus Christi and Letterkenny Army Depots from using LMP as envisioned. The Army acknowledged that data quality was one of the most important and challenging success factors in deriving the optimal business benefits from LMP. At both depots, LMP users were unable to use LMP to conduct depot operations and inventory management functions immediately following the second deployment. Although the depots were able to continue to repair items and support the warfighter, LMP users had to rely on manual work-around processes, which are not part of how LMP is intended to function and hinder the Army's ability to realize the benefits expected from LMP.
The depots experienced data quality issues, despite improvements the Army made to address data quality issues experienced during the first deployment of LMP at Tobyhanna Army Depot, because the Army's testing strategy did not provide reasonable assurance that the data being used by LMP were accurate and reliable. Specifically, the Army's testing strategy focused on determining whether the software worked as designed, but did not assess whether LMP was capable of functioning in a depot environment using the actual data from the depots. Additionally, the Army's training strategy did not effectively provide users the skills necessary to perform all of their tasks in LMP. Users at the depots stated that the training they received before LMP became operational was not conducted in a realistic environment that showed them how to perform their expected duties. However, users at both depots also stated that the LMP program management office had provided additional training after LMP became operational to address their concerns. Additionally, when monitoring LMP implementation at the second deployment sites, the Army lacked a comprehensive set of metrics with which to accurately assess whether LMP was delivering the intended functionality. Instead, the metrics used by the LMP program management office focused on whether the software was working but did not measure whether the deployment sites were performing their day-to-day operations using LMP as envisioned. Despite these challenges, the second deployment sites reported achieving some benefits through the use of LMP, such as increased visibility over assets. Data Quality Issues Prevented Depots from Realizing the Full Benefits of LMP: The Army was unable to realize the full benefits of LMP at the second deployment sites because of data quality issues. The benefits that LMP was expected to provide the depots included reducing the amount of time to repair items, automatically calculating the material requirements for repair projects, and improving the management of maintenance capacity. However, the ability of the second deployment sites to realize these benefits depended on data quality, which according to the Army, is one of the most important and challenging success factors in deriving optimal business benefits from an enterprise resource planning system like LMP. In preparation for the second deployment, the Army focused on addressing systemic data quality problems associated with the first deployment of LMP at Tobyhanna Army Depot. As we reported in June 2005,[Footnote 6] LMP did not always contain the correct unit price or unit of issue[Footnote 7] for certain materials, resulting in excess material being ordered and incorrect prices being charged to jobs. To avoid similar problems during the second deployment of LMP, the Army developed processes to ensure that the unit of issue and unit of measure values shown for a given item were appropriate for that item. According to LMP program management office officials, out of a target population of more than 12 million data records, 99.9 percent were loaded into LMP (12.40 million out of 12.41 million data records) before the system became operational at the second deployment sites. Although the Army's processes addressed some of the data issues from the first deployment of LMP, according to LMP program management office officials, these processes did not assess whether the overall quality of the data that the system would use was sufficient to support the LMP processes. 
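To illustrate the distinction between loading records and verifying their quality, the following is a minimal sketch of a record-level business-rule check. The record fields, reference values, and sample data are hypothetical and are not drawn from LMP or the Army's legacy systems; the point is only that a near-perfect load percentage can coexist with records that downstream depot processes cannot use.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical materiel master record; field names are illustrative only,
# not the actual LMP or legacy-system schema.
@dataclass
class MaterielRecord:
    item_id: str
    unit_of_issue: Optional[str]   # e.g., "EA" (each), "BX" (box)
    unit_price: Optional[float]    # price per unit of issue
    labor_rate: Optional[float]    # shop labor rate, if applicable

VALID_UNITS_OF_ISSUE = {"EA", "BX", "DZ", "GL"}  # assumed reference list

def record_errors(rec: MaterielRecord) -> list:
    """Apply simple business rules to one record; return the rules it fails."""
    errors = []
    if rec.unit_of_issue not in VALID_UNITS_OF_ISSUE:
        errors.append("invalid or missing unit of issue")
    if rec.unit_price is None or rec.unit_price <= 0:
        errors.append("missing or non-positive unit price")
    if rec.labor_rate is None:
        errors.append("missing labor rate")
    return errors

def quality_summary(records: list) -> dict:
    """Contrast 'records loaded' with 'records usable by downstream processes'."""
    failing = [r for r in records if record_errors(r)]
    return {
        "loaded": len(records),                 # what a load metric reports
        "usable": len(records) - len(failing),  # what depot processes need
        "failing": len(failing),
    }

if __name__ == "__main__":
    sample = [
        MaterielRecord("A100", "EA", 42.50, 95.0),
        MaterielRecord("A101", None, 17.00, 95.0),  # loads fine, fails a quality check
        MaterielRecord("A102", "EA", 0.0, None),    # loads fine, fails two checks
    ]
    print(quality_summary(sample))  # all 3 "loaded", only 1 "usable"
```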
Both Corpus Christi and Letterkenny Army Depots experienced data quality issues that prevented the Army from realizing the full benefits of LMP. For example, during our visit to Corpus Christi Army Depot shortly after LMP became operational in May 2009, we observed that personnel at the depot could not use LMP to induct a helicopter for repair because of data quality issues. As a result, LMP could not automatically identify the materials needed to support the repair and ensure that parts would be available in time to support the repairs. Furthermore, labor rates were also missing for some stages of repair, thereby precluding LMP from computing the expected labor costs for a repair project. Users at Corpus Christi Army Depot addressed the data quality problems as they arose. However, because of these data quality problems, some of the enterprise processes that LMP can perform had to be conducted using manual work-around processes to ensure that the depot accomplished its repair mission and thereby continued to meet the needs of the warfighter. Furthermore, these manual work-around processes are labor intensive and prevent the Army from achieving the benefits that LMP is expected to provide, such as increasing the efficiency in ordering parts, determining whether sufficient funds are available to perform the expected work, and determining whether the production schedule could be achieved with existing resources. Letterkenny Army Depot also experienced issues with data quality. For example, when we visited Letterkenny Army Depot shortly after LMP became operational, depot officials told us that they had identified data quality errors when they attempted to induct a ground vehicle for the Patriot system for repair. Letterkenny Army Depot officials also stated that prior to LMP becoming operational, they spent about 18 months rebuilding the data that were needed to perform repair and supply functions in LMP. However, after LMP became operational, they realized that the data were not of sufficient quality to enable the depot to use LMP as envisioned, so Letterkenny Army Depot officials took steps to correct the data. During our subsequent visit to Letterkenny Army Depot in August 2009, depot officials stated that they were continuing to refine the data, and that they had nearly completed doing so for their most common repair item--the High Mobility Multi-Purpose Wheeled Vehicle. This refinement would then serve as a template for the remaining repair items. Officials at Letterkenny Army Depot noted, however, that they had pre-positioned materials prior to the transition to LMP to ensure that the depot could perform its repair mission if difficulties arose, which enabled the depot to perform its day-to-day operations despite the data shortcomings. Nonetheless, officials at Letterkenny Army Depot acknowledged that they were not using the preferred automated LMP processes as intended, and needed to use manual work-around processes to complete their tasks. Army's Testing Strategy Did Not Evaluate Data Quality: The Army's testing strategy, as illustrated by the problems experienced at the second deployment locations, was not comprehensive enough to provide reasonable assurance that the data being used by LMP were accurate and reliable. As we have previously reported,[Footnote 8] testing is a critical process utilized by organizations with the intent of finding errors before a system is implemented. 
According to the Army's Master Test Plan for LMP implementation, the objective of system testing was to ensure that LMP operated as intended. However, the Army's testing efforts focused on determining whether the software worked as designed. Army officials stated that Army testing efforts for the second deployment of LMP had incorporated lessons learned from the first deployment of LMP. With respect to testing data conversion, the Army noted two improvements in its testing process: the use of data-specific scenario testing and engaging users in critical business process tests. According to the Army's test plan, data-specific scenario tests were designed to assess the quality, validity, and integrity of the data to be migrated into LMP, as well as to validate that data migration processes function as designed. Additionally, critical business process tests, which were performed by users, involved the execution of business process-oriented scenarios using the data loaded into LMP with the intent of assessing the functional readiness of the software. In order to assess the functional readiness of the software, the Army used simulated test data to test the system. For example, when assessing the functional readiness of the software to perform an induction of an item for repair, Army officials told us that they did not attempt to induct an item for repair using the data loaded into LMP. Instead, the Army tested whether LMP could perform an induction, and performed these tests using simulated data so that developers would know whether LMP could provide the intended capability. While this approach is useful and desirable to determine whether the software can operate as expected, it does not assess whether the data are of sufficient quality to work in LMP. Thus, the first attempt to perform a process in LMP using actual data during the second deployment occurred when the depots attempted to use the system. Consequently, the Army's testing strategy did not detect problems with the quality of the data at the deployment sites. During lessons learned sessions held in June 2009, Army officials at the LMP program management office and at the second deployment sites acknowledged these weaknesses in their testing strategy. Army officials we interviewed also stated that testing needed to address whether the deployment sites could perform their work using the system as intended. For example, officials at Corpus Christi Army Depot agreed that a better strategy to test whether LMP could perform as intended would be to induct a representative number of items at each depot using actual data from each deployment site. By attempting to induct several items into the depot repair processes that are representative of the majority of their workloads prior to transitioning to LMP, the Army would have likely detected problems related to data quality, which could have provided reasonable assurance that LMP could operate as intended if the tests were successful. Furthermore, by incorporating the use of actual data into its testing strategy, the Army could have obtained reasonable assurance that the data loaded into LMP were of sufficient quality to be used and increased user confidence in the system. For example, officials at Letterkenny Army Depot told us that testing the system in a simulated environment with actual data would not only assess whether the system would work but also would gauge the effectiveness of training. 
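The approach depot officials described, exercising a representative set of inductions against the actual converted data before go-live, might look something like the sketch below. The item names, required fields, and readiness checks are hypothetical illustrations of that idea, not the Army's test plan or the LMP data model.

```python
# Hypothetical pre-go-live check: for each representative repair item, verify that
# the converted data can support every step of an induction. Field names and the
# sample data are illustrative, not the actual LMP schema or Army test plan.

REPRESENTATIVE_ITEMS = ["UH-60 engine", "Patriot ground vehicle", "HMMWV"]  # assumed sample

def induction_readiness(item: str, converted_data: dict) -> list:
    """Return the data gaps that would force a manual work-around for this item."""
    record = converted_data.get(item, {})
    gaps = []
    if not record.get("bill_of_materials"):
        gaps.append("no bill of materials: parts cannot be ordered automatically")
    if any(stage.get("labor_rate") is None for stage in record.get("repair_stages", [{}])):
        gaps.append("missing labor rate for one or more repair stages")
    if record.get("routing") is None:
        gaps.append("no shop routing: repair steps cannot be scheduled")
    return gaps

def pre_go_live_report(converted_data: dict) -> dict:
    """Run the representative inductions and report which would fail with real data."""
    return {item: induction_readiness(item, converted_data) for item in REPRESENTATIVE_ITEMS}

if __name__ == "__main__":
    # Simulated extract of converted records for the representative items.
    data = {
        "UH-60 engine": {
            "bill_of_materials": ["turbine", "seal kit"],
            "repair_stages": [{"labor_rate": 95.0}, {"labor_rate": None}],
            "routing": "R-101",
        },
        # The other items are intentionally absent to mimic conversion gaps.
    }
    for item, gaps in pre_go_live_report(data).items():
        print(f"{item}: {'ready' if not gaps else '; '.join(gaps)}")
```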
Had the Army's testing strategy used the actual data at the deployment locations, the Army could have identified issues related to data quality prior to the transition to LMP. Army's Training Strategy Did Not Fully Meet Needs of LMP Users: Although the Army's training strategy was designed to provide LMP users the skills and knowledge to successfully perform their new roles, LMP users we interviewed at Corpus Christi and Letterkenny Army Depots stated that the training they received prior to LMP becoming operational did not fully meet their needs. LMP users we interviewed at Corpus Christi and Letterkenny Army Depots stated that the training focused on what LMP was supposed to do rather than on how they were to use the system to perform their day-to-day missions. Additionally, because the duties of some LMP users at the depots were changing, the training users received was not always commensurate with the responsibilities they were assigned. Consequently, some LMP users told us that they did not always understand the actions they needed to perform in order to accomplish their assigned tasks. Furthermore, since the timeline for implementing the various LMP processes differed at each location, the training sometimes occurred weeks before it could be actually used, which resulted in users needing additional training after the system became operational. Despite these issues, LMP users at both depots stated that the additional training they received after LMP became operational was effective and addressed their needs. According to the Army's LMP End-User Training and Development Delivery Plan dated January 30, 2009, most implementation failures are caused by poor end user training, which is the transfer of knowledge to the end users who will run the enterprise with the new solution. To prevent poor end user training, the Army's plan stated that end user training must provide users not only with the transactions and tasks performed in LMP, but also with the ability to recognize the underlying flow of information through LMP. To meet this goal, the Army used a blended learning solution designed to provide LMP users with the skills, process knowledge, and performance support to successfully perform their new roles after the transition to LMP. The Army's plan also stated that training should occur as close as possible to the date of implementation because the user's ability to retain information diminishes each day the user is not able to put into practice the training he or she has received. The Army experienced challenges with the quality and timing of its education and training efforts. According to the Army, education focused on LMP concepts, while training demonstrated how the system should be utilized by the user. However, according to Army lessons learned documents, users received education on LMP concepts too far in advance of training, which limited the ability of LMP users to understand how the changes in the business processes that were to occur as a result of LMP implementation affected their job responsibilities. Additionally, LMP users we met at the depots questioned the overall quality of the training they received. For example, these users stated that the instruction focused too much on concepts, rather than providing them the skills necessary to perform their day-to-day operations. Users also noted that they had limited opportunities to enhance their knowledge of the system by actually using the system in a training environment. 
The Army was also unable to deliver the correct training to some users because of challenges in properly assigning roles. The Army's plan for implementing LMP required not only the adoption of a new technological solution, but also required changes to the duty descriptions for some LMP users. For example, prior to LMP implementation, production controllers at Letterkenny Army Depot were responsible for tracking the flow of a repair item; however, after LMP implementation, production controllers became asset managers, who were responsible not only for tracking the flow of the item through the repair process but also for ordering parts. Additionally, supervisors were generally responsible for assigning roles to their personnel, and based on these roles, users would receive training. However, as noted in Army lessons learned documents, the process for determining roles sometimes occurred before supervisors received any notable LMP education. As a result, some users did not receive the correct training. The Army also faced challenges in its ability to deliver training as close as possible to the date of implementation because of the decentralized execution strategy at each deployment site. In order to prepare users, the Army used a master training calendar and standard training curriculum. Although this training was standardized across the second deployment sites, execution of LMP implementation was decentralized. For example, after LMP implementation occurred on May 14, 2009, Letterkenny Army Depot consolidated all of its production controllers into one area and began performing LMP functions in a phased approach. In contrast, Corpus Christi Army Depot executed LMP implementation by attempting to perform all tasks shortly after the transition. Although each approach has merit, the approach used by Letterkenny Army Depot did not match the Army's approach to training. That is, LMP users at Letterkenny Army Depot only performed some of the LMP processes immediately after LMP became operational. Accordingly, some of the users at Letterkenny Army Depot had to receive refresher training before they could perform their assigned duties. Despite the concerns that LMP users expressed about training, LMP users also stated that the Army was able to successfully provide refresher training and ad hoc coursework to address their issues. For example, LMP users we met with at both Corpus Christi and Letterkenny Army Depots stated that these courses were effective because the content was focused and specific. Officials at both depots also stated that the contractor support personnel that the LMP program management office provided to assist each depot were also effective in providing information. Army Lacked Comprehensive Metrics to Assess LMP Implementation: The Army was unable to determine whether the second deployment sites had achieved the envisioned functionality of LMP because the Army lacked a comprehensive set of metrics to measure the success of LMP implementation. Our previous work has shown that successful performance measures should be aligned throughout the organization and cover the activities that an entity is expected to perform to support the intent of the program.[Footnote 9] However, based on our review of the second deployment of LMP, we determined that the Army did not develop a comprehensive set of metrics. 
As a result, the scorecards that the LMP program management office used to measure the progress of LMP implementation did not measure whether the deployment sites were performing their day-to-day operations using LMP as envisioned. The LMP program management office assessed the progress of LMP implementation using a scorecard that was agreed to by the deployment sites and the LMP program management office. This scorecard focused on four elements: (1) validating user access, (2) critical business processes validated by sites,[Footnote 10] (3) production support and infrastructure readiness, and (4) training readiness. Table 2 provides an explanation of each area.

Table 2: Post Go-Live Scorecard:

Element: Validating user access;
Definition: End user access will be determined by end user ability to (1) log on to the LMP solution and (2) perform assigned duties with assigned security roles;
Metric: Green: 90 percent access; Yellow: 80 percent access; Red: 70 percent access.

Element: Critical business process validation (CBPV);
Definition: Assess the expected outcomes as defined in the CBPV template for each critical business process after implementation;
Metric: Green: End users executed processes, documented issues, and assessed and recommended actions with assistance of CBPV team with no issues; Yellow: Issues identified, but work-around(s) in place; Red: No work-around(s) in place.

Element: Production support and infrastructure;
Definition: Real-time monitoring of key performance parameters, including system response time, system availability, and system capacity;
Metric: Green: All key performance parameters within tolerance; Yellow: Some key performance parameters not met, but work-around in place; Red: Some key performance parameters not met, but no work-around in place.

Element: Validate expert user and end user education and training effectiveness;
Definition: Delivery of end user training, refresher workshops, and ad hoc training;
Metric: Green: Delivery of all scheduled end user training, refresher workshops, and ad hoc courses to identified end users and end user completion of exercise assessments; Yellow: Schedules missed, but work-around in place; Red: Schedules missed, but no work-around in place.

Source: U.S. Army.

[End of table]

As noted above, in addition to the criteria for each element, the LMP program management office's measurements included a legend that provided detail on the color coding used to assess progress. According to this scorecard, a "green" rating was assigned if the element was "on track," a "yellow" rating if the element had "issues being worked," a "red" rating if the element had "significant problems/issues," and a "white" rating if the element had not been started. Of the elements on the scorecard, the only category that assessed whether LMP could be used as intended was critical business process validation. However, as stated in the critical business process validation reports submitted by the Aviation and Missile Command, Corpus Christi Army Depot, and Letterkenny Army Depot, this validation allowed the use of manual work-around processes that were not part of the envisioned LMP processes. Accordingly, the Army did not have an assessment in place to determine whether LMP was delivering the envisioned capability to the second deployment sites.
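Read as rules, the criteria in table 2 translate into scoring logic along the lines of the following sketch. Treating the stated access percentages as lower bounds is an assumption, and the functions are illustrative only, not the LMP program management office's actual scoring method; the critical business process element shows how a process completed only through a manual work-around can still avoid a "red" rating.

```python
def user_access_rating(pct_with_access: float) -> str:
    """User-access element of the scorecard: the stated bands are interpreted here
    as lower bounds (>= 90 green, >= 80 yellow, otherwise red), which is an assumption."""
    if pct_with_access >= 90:
        return "green"
    if pct_with_access >= 80:
        return "yellow"
    return "red"

def cbpv_rating(issues_found: bool, workaround_in_place: bool) -> str:
    """Critical business process validation element as described in table 2:
    a process with issues still rates yellow as long as a work-around exists."""
    if not issues_found:
        return "green"
    return "yellow" if workaround_in_place else "red"

if __name__ == "__main__":
    print(user_access_rating(92))  # green
    print(user_access_rating(83))  # yellow
    # A process the depot can only complete manually still avoids a red rating,
    # which is the gap described in this report: the scorecard tolerates work-arounds.
    print(cbpv_rating(issues_found=True, workaround_in_place=True))  # yellow
```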
Based on our observations at Corpus Christi and Letterkenny Army Depots, we found that the elements on the LMP program management office's scorecard did not accurately reflect the activities that the depots were expected to perform. One of the primary purposes of implementing LMP was to gain the capability and efficiencies through automated processes associated with the software. Accordingly, while the scorecard measured the functionality of the software after LMP became operational, the scorecard did not assess whether the depots were able to perform their work using LMP as envisioned. In addition to its scorecard, the LMP program management office provided periodic briefings regarding LMP implementation that identified problems as "What's Important Now" items. The briefings described each problem, the impact of the problem, steps being taken to mitigate the problem, and an estimated date for when the problem would be resolved. However, despite the presence of these items, the LMP program management office's scorecard reflected the status of LMP implementation as "green." For example, a May 28, 2009, briefing that was provided to senior Army management contained 17 "What's Important Now" items that identified problems related to missing data, the ability of the depots to fill customer requisitions, the ability to correct data in LMP, and challenges related to issuing materials to the shop floor to support repairs. Additionally, in the same briefing, the LMP program management office reported that Corpus Christi Army Depot had inducted an aircraft for repair using LMP. In actuality, as discussed earlier, the aircraft was inducted by depot personnel using manual work-around processes that were similar to legacy processes rather than the envisioned LMP processes. Although the depot was unable to induct a helicopter using LMP as intended, the LMP program management office's scorecard reflected a "green" rating for all of the elements. We also found that the LMP program management office's scorecard did not accurately reflect the internal assessment of LMP implementation at the depots. For example, Letterkenny Army Depot developed and used a scorecard to measure progress of LMP implementation, which included more than 50 processes that end users had to perform in LMP covering areas such as supply, maintenance, and finance. According to officials at Letterkenny Army Depot, a process was identified as "green" once the user had successfully performed the task in LMP using the envisioned processes. However, the progress as tracked by the depot did not match the progress as reported on the LMP program management office's scorecard. For example, on May 26, 2009, Letterkenny Army Depot had identified 48 of its processes as "red" because the depot either had not yet performed the function in LMP or was unable to perform it successfully in LMP using the envisioned processes. However, on the same day, the LMP program management office reported that LMP was "green" in all elements measured by the LMP program management office's scorecard. These differences reflect the lack of a comprehensive set of metrics for measuring the success of LMP implementation, because while the LMP program management office was measuring whether the software was working, Letterkenny Army Depot had identified that it was unable to conduct its daily operations using LMP as envisioned. 
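The disagreement between the two scorecards ultimately reflects two different definitions of success. The sketch below illustrates that difference with invented process names and statuses; neither the data nor the rollup rules represent how the depot or the program management office actually computed their ratings.

```python
from collections import Counter

# Hypothetical end-user process statuses at a depot; names and values are invented.
# "envisioned"  = performed in LMP as designed
# "workaround"  = completed, but only via a manual work-around
# "not_working" = not yet performed successfully
process_status = {
    "induct repair item": "workaround",
    "order repair parts": "workaround",
    "post labor to job": "workaround",
    "receive inventory": "envisioned",
}

def software_view(statuses: dict) -> str:
    """Program-office style rollup: anything completed, even via work-around, counts."""
    ok = all(s in ("envisioned", "workaround") for s in statuses.values())
    return "green" if ok else "red"

def depot_view(statuses: dict) -> Counter:
    """Depot style rollup: only processes performed in LMP as envisioned count as green."""
    return Counter("green" if s == "envisioned" else "red" for s in statuses.values())

if __name__ == "__main__":
    print("software-focused rollup:", software_view(process_status))  # green
    print("depot rollup:", depot_view(process_status))                # mostly red
```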
LMP Has Provided the Army Some Benefits: Although data quality and training issues prevented the second deployment sites from using the full capabilities of LMP as envisioned, the use of LMP at the second deployment sites has provided the Army some benefits that were not available in legacy systems, such as increased visibility. For example, officials at Corpus Christi Army Depot stated that LMP has provided them the ability to track and trace individual transactions to specific end users. With this tool, officials at Corpus Christi Army Depot stated that they are able to research individual actions, as well as ensure that individuals are following the procedures at the depot. Corpus Christi Army Depot officials also stated that LMP provided them increased visibility over contractor-managed inventories, which was not available in the legacy systems. The Army has also achieved benefits through the common picture provided by LMP. For example, when explaining how the life cycle management commands were using LMP, an item manager from the Aviation and Missile Command showed us how the common view provided by LMP improved communication with the depots. When attempting to find the location of an item for repair, the item manager stated that both the depot and the item manager saw that the item to be repaired had already arrived at the depot, so the depot could then begin the repair process. According to the item manager, this visibility was not available in the legacy systems, and the lack of a common picture sometimes delayed induction of items for repair. The item manager stated that the delays occurred because the item manager's system showed that an item was located at a depot, but the depot's system did not show the item as received, so personnel at the depot had to locate the item before it could be inducted for repair. Furthermore, the item manager demonstrated how the common view provided by LMP helped locate a critical part for the Patriot missile system of a deployed unit. In this case, the item manager was able to identify where the part was stored in order to support the deployed unit. The item manager stated that this capability was not available in legacy systems. Preparations for Third Deployment of LMP Are Under Way: The Army has taken steps to address some of the concerns we identified during the second deployment of LMP. LMP program management officials told us that preparations for the third deployment of LMP had begun in December 2008, but that the Army had adjusted its plans--specifically in the areas of testing and training--based on lessons learned from the second deployment. For example, LMP program management officials told us that the third deployment sites had already begun data preparation activities in early 2009, and that these activities would continue through September 2010. Additionally, LMP program management officials stated that they had developed changes to their testing strategy and that tests are scheduled to begin in May 2010. LMP program management officials also stated that they intended to conduct selective testing using actual depot-specific data and that this testing is scheduled to begin in August 2010. However, according to LMP program management officials, the exact processes to be tested and the materials associated with supporting the tests have yet to be developed. 
With respect to training, LMP program management officials told us that the training for the third deployment will be conducted by a cadre of personnel from each of the major deployment sites, and that members of this cadre would both serve as instructors to users at the locations as well as assist sites with implementing LMP. LMP program management officials stated that they were in the process of completing instruction for the cadre of trainers, and that LMP users had already begun receiving instruction on the LMP process. During a meeting with DOD and Army officials, a representative from one of the sites preparing for the third deployment stated that while the instructors had changed from contractors to the cadre of trainers, LMP users were still being taught with the same training materials from the second deployment. Additionally, according to the Army's schedule for the third deployment of LMP, the Army plans to update training for users in June 2010 and begin delivery of training to users in July 2010. Accordingly, because these events have yet to occur, we were unable to determine whether they adequately addressed the issues we identified during the second deployment related to data testing and training. Conclusion: Implementation of enterprise resource planning systems, like LMP, is a complex endeavor, and based on the second LMP deployment, the Army has improved its ability to manage implementation. Because the Army improved its management processes based on lessons learned during the first deployment of LMP at Tobyhanna Army Depot, the Army was successful in mitigating some of the previous issues. Furthermore, the second deployment of LMP also demonstrated the potential to provide benefits for the Army, such as providing better visibility of inventory and asset management. However, despite the improvements and the benefits that were achieved, the deployment sites still faced challenges related to data quality and training, which limited their ability to use LMP as intended. Additionally, the Army's lack of a comprehensive set of performance metrics prevented it from measuring whether the intended LMP functionality had been achieved at the depots. Although the Army was effective in addressing these challenges after they had arisen at the second deployment sites, applying these lessons learned to the Army's plans as it prepares to support the third phase of deployment in October 2010 is critical. Unless the Army addresses these challenges, the third deployment locations are likely to face the same, or even greater, problems, since the third deployment of LMP will occur at more locations and affect more users than the previous deployments. Recommendations for Executive Action: To improve the third deployment of LMP, we are recommending that the Secretary of the Army direct the Commanding General, Army Materiel Command, to take the following three actions: * Improve testing activities to obtain reasonable assurance that the data used by LMP can support the LMP processes. * Improve training for LMP users by: - providing training on actual job processes in a manner that allows the users to understand how the new processes support their job responsibilities and the types of work they are expected to perform and: - providing training at the individual deployment sites to match deployment timelines. * Establish performance metrics that will enable the Army to assess whether the deployment sites are able to use LMP as intended. 
Agency Comments and Our Evaluation: In written comments on a draft of this report, DOD stated that the Army concurred with our recommendations and highlighted the corrective actions it is taking to (1) improve testing activities to obtain reasonable assurance that the data used by LMP can support the LMP processes, (2) improve training for LMP users, and (3) establish performance metrics that will enable the Army to assess whether the deployment sites are able to use LMP as intended. Regarding our first recommendation, the Army commented that the third deployment of LMP involves two new test activities--the Process and Data Integration Test and the Business Operational Test--that are designed to address lessons learned from the second deployment of LMP. The Army commented that the Process and Data Integration Test will evaluate business processes using migrated business data from critical weapon systems, and that the Business Operational Test will require expert and select end users to perform transactions in LMP using local data. According to the Army, these tests will bring together data, business processes, standard operating procedures, and end user training materials to ensure success. While we have not evaluated the effectiveness or sufficiency of these two new tests in correcting the data testing issues we discuss in this report, we believe that the steps the Army is taking appropriately address the intent of our recommendation. With respect to our second recommendation, the Army commented that it began role mapping and development of the training calendar earlier in the deployment process, and that training will be delivered to end users in a just-in-time method to ensure that the training is timely and focused to meet the needs of the users. The Army also commented that it will continue to provide refresher training after deployment. We have not reviewed these new training initiatives, but we believe that they are a step in the right direction toward addressing the intent of our recommendation and improving LMP user training. Finally, regarding our third recommendation, the Army commented that it is working to improve standard performance measures, and that the metrics will reflect lessons learned from previous LMP deployments. The Army commented that the expected date of completion for development of these measures is July 1, 2010. The Army's written comments are reprinted in appendix II. We are sending copies of this report to interested congressional committees; the Secretary of Defense; the Secretary of the Army; and the Director, Office of Management and Budget. The report also is available at no charge on the GAO Web site at [hyperlink, http://www.gao.gov]. Please contact William M. Solis at (202) 512-8365 or solisw@gao.gov, Asif A. Khan at (202) 512-9869 or khana@gao.gov, or Nabajyoti Barkakati at (202) 512-4499 or barkakatin@gao.gov if you or your staff have questions on matters discussed in this report. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report were J. Chris Martin, Senior-Level Technologist; David Schmitt, Assistant Director; Evelyn Logue; Darby Smith; and Jim Melton. Sincerely yours, Signed by: William M. Solis: Director, Defense Capabilities and Management: Signed by: Asif A. 
Khan: Director, Financial Management and Assurance: Signed by: Nabajyoti Barkakati: Chief Technologist: Applied Research and Methods Center for Technology and Engineering: [End of section] Appendix I: Scope and Methodology: In order to evaluate the effectiveness of the Army's management processes in enabling the second deployment sites to realize the full benefits of the Logistics Modernization Program (LMP), we reviewed and analyzed Army plans and policies that governed LMP implementation. Specifically, we reviewed plans created by the LMP program management office related to data conversion and migration, system testing, the training curriculum, and how the Army intended to monitor the implementation of LMP. We also held several meetings with LMP program management office officials and received briefings related to LMP implementation. During these briefings, we also received information on how LMP is intended to function, as well as the benefits that the Army expects to receive by using the system. We also monitored the second deployment of LMP as it occurred and had personnel observing operations at Corpus Christi Army Depot, Texas; Letterkenny Army Depot, Pennsylvania; and the LMP program management office's command center at Marlton, New Jersey. During our initial visits to Corpus Christi Army Depot and Letterkenny Army Depot, we met with depot officials to discuss how they were using LMP and any problems or successes that had arisen from their usage. We also observed how personnel at the depots were performing certain processes using LMP and received documents related to those actions. Additionally, we reviewed how each of the depots was assessing the progress of LMP implementation, and attended several of the daily internal meetings held at Corpus Christi Army Depot and Letterkenny Army Depot. To assess how the Army was managing the overall progress of implementation, we also attended daily meetings held by the LMP program management office, the Aviation and Missile Command, Corpus Christi Army Depot, and Letterkenny Army Depot. After the initial deployment, we attended a lessons learned discussion hosted by the LMP program management office. We conducted follow-up visits to Letterkenny Army Depot in June 2009 and August 2009 and to Corpus Christi Army Depot in September 2009 to monitor the progress of implementation. We also met with officials at the Army Materiel Command, the Aviation and Missile Command, and the LMP program management office to discuss how the Army was managing the implementation, and received documents that were used to inform senior leaders at the Army Materiel Command on the status of LMP implementation. We also met with the Army Deputy Chief of Staff for Logistics in January 2010 to discuss our initial observations. During meetings we held with officials in the Army Materiel Command and the LMP program management office, we discussed the steps that the Army was taking to support the third phase of deployment for LMP. We received copies of Army briefings assessing the progress of implementation, as well as revisions to Army plans and drafts of new plans based on the lessons learned from the second deployment of LMP. Because of the timing of this review, we did not assess the Army's plans for the third deployment locations, nor were we able to observe actions taken by the third deployment locations to prepare for LMP implementation. 
We conducted this performance audit from May 2009 through March 2010 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

[End of section]

Appendix II: Comments from the Department of Defense:

Office of the Deputy Chief Management Officer
9010 Defense Pentagon
Washington, DC 20301-9010

April 21, 2010

Mr. William M. Solis
Director, Defense Capabilities and Management
United States Government Accountability Office
441 G Street, NW
Washington, DC 20548

Dear Mr. Solis:

The Department of Defense response to the Government Accountability Office's (GAO) draft report 10-461, "Defense Logistics: Actions Needed to Improve Implementation of the Army Logistics Modernization Program," dated March 22, 2010 (GAO Code 351366), is attached and provides detailed comments on the draft report.

Sincerely,

Signed by:
Elizabeth A. McGrath
Assistant Deputy Chief Management Officer

Attachment: As stated

[End of letter]

Government Accountability Office (GAO) Draft Report Dated March 22, 2010:
GAO-10-461 (GAO Code 351366):
"Defense Logistics: Actions Needed to Improve Implementation of the Army Logistics Modernization Program"

Department of Defense (DoD) Comments to GAO Recommendations:

Recommendation 1: The GAO recommends that the Secretary of the Army direct the Commanding General, Army Materiel Command [AMC], to improve testing activities to obtain reasonable assurance that the data used by the [Logistics Modernization Program] LMP can support the LMP processes. (Page 23/GAO Draft Report.)

DoD Response: The Army concurs with the recommendation. The third deployment of LMP to the Tank-automotive and Armaments Life Cycle Management Command, the Joint Munitions & Lethality Life Cycle Management Command, and the Army Sustainment Command involves two new test activities designed to address lessons learned from the second deployment. During the Process and Data Integration Test activity, end-to-end business processes will be tested using migrated, validated business data from critical weapon systems. During the Business Operational Test, expert and select end users will perform transactions in the LMP system using local data from their home station, bringing data, business processes, standard operating procedures, and end user training materials together to ensure success. AMC also improved data quality processes to supplement LMP data trial loads, with monthly data quality tests against local site data and established business rules. Under the AMC Data Audit Strategy, records failing validation tests will be corrected and reviewed for accuracy, and AMC will also take a random sample of records passing the tests to review the accuracy of those records.
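As an illustration only, the following minimal Python sketch shows the general validate-and-sample pattern the Army's data audit approach describes: checking migrated records against business rules, routing failures for correction, and drawing a random sample of passing records for a second accuracy review. The rule names, record fields, and sampling rate are hypothetical assumptions for the example and are not drawn from the report or from the actual AMC Data Audit Strategy.

import random

# Hypothetical business rules; each rule returns True when a record passes.
# The real AMC rules are not described in this report.
BUSINESS_RULES = {
    "known_unit_of_issue": lambda rec: rec.get("unit_of_issue") in {"each", "dozen", "gallon", "pair", "pound"},
    "nonnegative_quantity": lambda rec: isinstance(rec.get("on_hand_qty"), int) and rec["on_hand_qty"] >= 0,
}

def run_data_quality_test(records, sample_rate=0.05, seed=None):
    """Return (failed, accuracy_sample): records violating any rule are set
    aside for correction and re-review; a random sample of passing records
    is drawn for an additional accuracy spot check."""
    rng = random.Random(seed)
    failed, passed = [], []
    for rec in records:
        violations = [name for name, rule in BUSINESS_RULES.items() if not rule(rec)]
        if violations:
            failed.append((rec, violations))   # correct these records, then review again
        else:
            passed.append(rec)
    # Sample a fraction of passing records to verify their accuracy as well.
    sample_size = min(len(passed), max(1, round(len(passed) * sample_rate))) if passed else 0
    accuracy_sample = rng.sample(passed, sample_size)
    return failed, accuracy_sample

if __name__ == "__main__":
    demo = [
        {"item": "bolt", "unit_of_issue": "each", "on_hand_qty": 40},
        {"item": "oil", "unit_of_issue": "case", "on_hand_qty": -3},
    ]
    failed, sample = run_data_quality_test(demo, seed=1)
    print(f"{len(failed)} record(s) failed validation; {len(sample)} passing record(s) sampled for accuracy review")

In practice, the rule set, the sampling fraction, and the correction workflow would be governed by AMC's established business rules and audit procedures rather than the illustrative values above.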
Recommendation 2: The GAO recommends that the Secretary of the Army direct the Commanding General, Army Materiel Command, to improve training for LMP users by:

* Providing training on actual job processes in a manner that allows the user to understand how the new processes support their job responsibilities and the types of work they are expected to perform.

* Providing training at the individual deployment sites to match deployment timelines. (Page 23/GAO Draft Report.)

DoD Response: The Army concurs with the recommendation. The third deployment of LMP utilizes an organic cadre of instructors and subject matter experts drawn from different levels and backgrounds at AMC. Together, this cadre has expertise in different areas of AMC operations and is responsible for the business transformation activities at depots, arsenals, and commands; participating in the LMP Process and Data Integration Test and Business Operational Test activities; tailoring training materials for end users; developing local Standard Operating Procedures and desktop guides; training expert users; and delivering end user training. Training materials leverage experience from deployed sites and will be demonstrated during expert user training and the Business Operational Test, with any required changes being made prior to end user training. Cadre and expert users work collaboratively with current LMP sites to gain experience and insight into business processes and organizational change in order to develop the expertise necessary for training end users.

AMC and the LMP Project Management Office began role mapping and training calendar development earlier in the deployment process to determine the number of course offerings necessary at each site. To the extent possible, training will be delivered at end user locations in a "just in time" method to ensure the training is timely and focused on the needs of users. Training activities will continue following deployment to provide refresher training and to provide training on monthly and quarterly business processes before those processes are performed. Desk-side ad hoc training will be delivered by expert users to answer questions from end users.

Recommendation 3: The GAO recommends that the Secretary of the Army direct the Commanding General, Army Materiel Command, to establish performance metrics that will enable the Army to assess whether the deployment sites are able to use LMP as intended. (See pages 23-24/GAO Draft Report.)

DoD Response: The Army concurs with the recommendation. AMC and the LMP Program Management Office are working across the third deployment and deployed sites and commands to improve standard performance measures for business process effectiveness and the ability to meet mission requirements using the LMP system. The expected date of completion for development of Post Go-Live metrics is July 1, 2010. LMP's Pre Go-Live Scorecard monitors the readiness of third deployment sites to go live across 13 elements, which relate to data, organization, process, technology, and application. Development of LMP's Post Go-Live metrics is monitored as part of a Pre Go-Live activity under the process element, "Prepare for Post Go-Live Support." Post Go-Live metrics will reflect lessons learned from previous LMP deployments and will monitor LMP deployed sites' status and readiness to transition to "steady state" sustainment, based on the completion of dependent activities and events captured and aligned against several major areas, including critical business process validation, data, training, security role mapping, and production support.
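The report does not enumerate the 13 scorecard elements, so the short Python sketch below is only a hypothetical illustration of how a go-live readiness scorecard of this general kind could be organized: elements grouped under the data, organization, process, technology, and application areas, each rated and rolled up into a go/no-go view. The element names, status ratings, and decision rule are assumptions, not details of the actual LMP Pre Go-Live Scorecard.

from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    GREEN = "on track"
    YELLOW = "at risk"
    RED = "not ready"

@dataclass
class Element:
    name: str
    area: str      # data, organization, process, technology, or application
    status: Status

def ready_to_go_live(elements):
    """Illustrative roll-up: a site is considered ready only when no element
    is rated RED; YELLOW elements are flagged for leadership attention."""
    red = [e for e in elements if e.status is Status.RED]
    yellow = [e for e in elements if e.status is Status.YELLOW]
    return len(red) == 0, red, yellow

# Illustrative entries only; the report does not list the actual 13 elements.
scorecard = [
    Element("Critical business process validation", "process", Status.GREEN),
    Element("Data trial load quality", "data", Status.YELLOW),
    Element("Security role mapping", "organization", Status.GREEN),
    Element("Prepare for Post Go-Live Support", "process", Status.YELLOW),
]

ok, red, yellow = ready_to_go_live(scorecard)
print("Go-live recommended:" if ok else "Go-live not recommended:",
      f"{len(yellow)} element(s) at risk, {len(red)} not ready")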
[End of section]

Footnotes:

[1] Business systems are information systems that support DOD business operations, such as civilian personnel, finance, health, logistics, military personnel, procurement, and transportation.

[2] GAO, High-Risk Series: An Update, [hyperlink, http://www.gao.gov/products/GAO-09-271] (Washington, D.C.: January 2009).

[3] See, for example, GAO, DOD Business Systems Modernization: Navy Implementing a Number of Key Management Controls on Enterprise Resource Planning System, but Improvements Still Needed, [hyperlink, http://www.gao.gov/products/GAO-09-841] (Washington, D.C.: Sept. 15, 2009), and DOD Business Systems Modernization: Key Marine Corps System Acquisition Needs to Be Better Justified, Defined, and Managed, [hyperlink, http://www.gao.gov/products/GAO-08-822] (Washington, D.C.: July 28, 2008).

[4] GAO, DOD Business Systems Modernization: Billions Continue to Be Invested with Inadequate Management Oversight and Accountability, [hyperlink, http://www.gao.gov/products/GAO-04-615] (Washington, D.C.: May 27, 2004); Army Depot Maintenance: Ineffective Oversight of Depot Maintenance Operations and System Implementation Efforts, [hyperlink, http://www.gao.gov/products/GAO-05-441] (Washington, D.C.: June 30, 2005); and DOD Business Transformation: Lack of an Integrated Strategy Puts the Army's Asset Visibility System Investments at Risk, [hyperlink, http://www.gao.gov/products/GAO-07-860] (Washington, D.C.: July 27, 2007).

[5] GAO, Defense Logistics: Observations on Army's Implementation of the Logistics Modernization Program, [hyperlink, http://www.gao.gov/products/GAO-09-852R] (Washington, D.C.: July 8, 2009).

[6] [hyperlink, http://www.gao.gov/products/GAO-05-441].

[7] DOD defines unit of issue as the quantity of an item, such as each, number, dozen, gallon, pair, pound, ream, set, or yard.

[8] [hyperlink, http://www.gao.gov/products/GAO-04-615].

[9] GAO, Tax Administration: IRS Needs to Further Refine Its Tax Filing Season Performance Measures, [hyperlink, http://www.gao.gov/products/GAO-03-143] (Washington, D.C.: Nov. 22, 2002).

[10] According to the Aviation and Missile Command, critical business process validation is an element used to verify that LMP has immediately restored the ability to meet critical business process needs and satisfy mandatory regulations and requirements.

[End of section]

GAO's Mission:

The Government Accountability Office, the audit, evaluation and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each weekday, GAO posts newly released reports, testimony, and correspondence on its Web site. To have GAO e-mail you a list of newly posted products every afternoon, go to [hyperlink, http://www.gao.gov] and select "E-mail Updates."

Order by Phone:

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's Web site, [hyperlink, http://www.gao.gov/ordering.htm]. Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]
E-mail: fraudnet@gao.gov
Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations:

Ralph Dawn, Managing Director, dawnr@gao.gov, (202) 512-4400
U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, D.C. 20548

Public Affairs:

Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800
U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, D.C. 20548
