Weather Forecasting

National Weather Service Is Planning to Improve Service and Gain Efficiency, but Impacts of Potential Changes Are Not Yet Known
GAO ID: GAO-06-792, July 14, 2006

To provide accurate and timely weather forecasts, the National Weather Service (NWS) uses systems, technologies, and manual processes to collect, process, and disseminate weather data to its nationwide network of field offices and centers. After completing a major modernization program in the 1990s, NWS is seeking to upgrade its systems with the goal of improving its forecasting abilities, and it is considering changing how its nationwide office structure operates in order to enhance efficiency. GAO was asked to (1) evaluate NWS's efforts to achieve improvements in the delivery of its services through system and technology upgrades, (2) assess agency plans to achieve service improvements through training its employees, and (3) evaluate agency plans to revise its nationwide office configuration and the implications of these plans on local forecasting services, staffing, and budgets.

NWS is positioning itself to provide better service through over $315 million in planned upgrades to its systems and technologies. In annual plans, the agency links expected improvements in its service performance measures with the technologies and systems expected to improve them. For example, NWS expects to reduce the average error in its forecasts of hurricane paths by approximately 20 nautical miles between 2005 and 2011 through a combination of upgrades to observation systems, better hurricane forecast models, enhancements to the computer infrastructure, and research that will be transferred to forecast operations. Also, NWS expects to increase tornado warning lead times from 13 to 15 minutes by the end of fiscal year 2008 after the agency completes an upgrade to its radar system and realizes benefits from software improvements to its forecaster workstations.

NWS also provides training courses for its employees to help improve its forecasting services, but the agency's process for selecting training lacks sufficient oversight. Program officials propose and justify training needs on the basis of up to eight different criteria--including whether a course is expected to improve NWS forecasting performance measures, support customer outreach, or increase scientific awareness. Many of these course justifications appropriately demonstrate support for improved forecasting performance. For example, training on how to more effectively use forecaster workstations is expected to help improve tornado and hurricane warnings. However, in justifying training courses, program officials routinely link courses to NWS forecasting performance measures. For example, in 2006, almost all training needs were linked to expectations for improved performance--including training on cardiopulmonary resuscitation, spill prevention, and systems security. The training selection process did not validate or question how these courses could help improve weather forecasts. Overuse of this justification undermines the distinctions among different training courses and the credibility of the course selection process. Additionally, because the training selection process does not clearly distinguish among courses, it is difficult to determine whether sufficient funds are dedicated to the courses that are expected to improve performance.

To improve its efficiency, NWS plans to develop a prototype of a new concept of operations, an effort that could affect its national office configuration, including the location and functions of its offices nationwide. However, many details about the impact of any proposed changes on NWS forecast services, staffing, and budget have yet to be determined. Further, the agency has not yet determined key activities, timelines, or measures for evaluating the prototype of the new office operational structure. As a result, it is not evident that NWS will collect the information it needs on the impact and benefits of any office restructuring in order to make sound and cost-effective decisions.

GAO-06-792, Weather Forecasting: National Weather Service Is Planning to Improve Service and Gain Efficiency, but Impacts of Potential Changes Are Not Yet Known

This is the accessible text file for GAO report number GAO-06-792 entitled 'Weather Forecasting: National Weather Service Is Planning to Improve Service and Gain Efficiency, but Impacts of Potential Changes Are Not Yet Known' which was released on July 14, 2006.

This text file was formatted by the U.S. Government Accountability Office (GAO) to be accessible to users with visual impairments, as part of a longer term project to improve GAO products' accessibility. Every attempt has been made to maintain the structural and data integrity of the original printed product. Accessibility features, such as text descriptions of tables, consecutively numbered footnotes placed at the end of the file, and the text of agency comment letters, are provided but may not exactly duplicate the presentation or format of the printed version. The portable document format (PDF) file is an exact electronic replica of the printed version. We welcome your feedback. Please E-mail your comments regarding the contents or accessibility features of this document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. Because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

Report to the Subcommittee on Environment, Technology, and Standards, Committee on Science, House of Representatives:

United States Government Accountability Office:

GAO:

July 2006:

Weather Forecasting:

National Weather Service Is Planning to Improve Service and Gain Efficiency, but Impacts of Potential Changes Are Not Yet Known:

GAO-06-792:

GAO Highlights:

Highlights of GAO-06-792, a report to the Subcommittee on Environment, Technology, and Standards, Committee on Science, House of Representatives.

Why GAO Did This Study:

To provide accurate and timely weather forecasts, the National Weather Service (NWS) uses systems, technologies, and manual processes to collect, process, and disseminate weather data to its nationwide network of field offices and centers. After completing a major modernization program in the 1990s, NWS is seeking to upgrade its systems with the goal of improving its forecasting abilities, and it is considering changing how its nationwide office structure operates in order to enhance efficiency.

GAO was asked to (1) evaluate NWS's efforts to achieve improvements in the delivery of its services through system and technology upgrades, (2) assess agency plans to achieve service improvements through training its employees, and (3) evaluate agency plans to revise its nationwide office configuration and the implications of these plans on local forecasting services, staffing, and budgets.

What GAO Found:

NWS is positioning itself to provide better service through over $315 million in planned upgrades to its systems and technologies. In annual plans, the agency links expected improvements in its service performance measures with the technologies and systems expected to improve them. For example, NWS expects to reduce the average error in its forecasts of hurricane paths by approximately 20 nautical miles between 2005 and 2011 through a combination of upgrades to observation systems, better hurricane forecast models, enhancements to the computer infrastructure, and research that will be transferred to forecast operations. Also, NWS expects to increase tornado warning lead times from 13 to 15 minutes by the end of fiscal year 2008 after the agency completes an upgrade to its radar system and realizes benefits from software improvements to its forecaster workstations.

NWS also provides training courses for its employees to help improve its forecasting services, but the agency's process for selecting training lacks sufficient oversight. Program officials propose and justify training needs on the basis of up to eight different criteria--including whether a course is expected to improve NWS forecasting performance measures, support customer outreach, or increase scientific awareness. Many of these course justifications appropriately demonstrate support for improved forecasting performance. For example, training on how to more effectively use forecaster workstations is expected to help improve tornado and hurricane warnings. However, in justifying training courses, program officials routinely link courses to NWS forecasting performance measures. For example, in 2006, almost all training needs were linked to expectations for improved performance--including training on cardiopulmonary resuscitation, spill prevention, and systems security. The training selection process did not validate or question how these courses could help improve weather forecasts. Overuse of this justification undermines the distinctions among different training courses and the credibility of the course selection process. Additionally, because the training selection process does not clearly distinguish among courses, it is difficult to determine whether sufficient funds are dedicated to the courses that are expected to improve performance.

To improve its efficiency, NWS plans to develop a prototype of a new concept of operations, an effort that could affect its national office configuration, including the location and functions of its offices nationwide. However, many details about the impact of any proposed changes on NWS forecast services, staffing, and budget have yet to be determined. Further, the agency has not yet determined key activities, timelines, or measures for evaluating the prototype of the new office operational structure. As a result, it is not evident that NWS will collect the information it needs on the impact and benefits of any office restructuring in order to make sound and cost-effective decisions.

What GAO Recommends:

GAO is making recommendations to the Secretary of Commerce to direct NWS to strengthen its training selection process, and to establish key activities, timelines, and measures for evaluating the prototype of a new concept of operations before beginning the prototype. In written comments, the Department of Commerce agreed with the recommendations and identified plans for implementing them.

[Hyperlink, http://www.gao.gov/cgi-bin/getrpt?GAO-06-792]

To view the full product, including the scope and methodology, click on the link above. For more information, contact David Powner at (202) 512-9286 or pownerd@gao.gov.
[End of section]

Contents:

Letter:

Results in Brief:
Background:
NWS Is Positioning Itself to Provide Better Service through Upgrades to Its Systems and Technologies:
NWS's Training Is Expected to Result in Forecast Service Improvements, but the Training Selection Process Lacks Sufficient Oversight:
Changing Concept of Operations Could Affect Nationwide Office Configuration, but Impact on Forecast Services, Staffing, and Budget Is Not Yet Known:
Conclusions:
Recommendations for Executive Action:
Agency Comments:

Appendix I: Objectives, Scope, and Methodology:
Appendix II: NWS Performance Goals for Fiscal Years 2005 to 2011:
Appendix III: NWS Previously Used a Stringent Process to Ensure Service Was Not Degraded:
Appendix IV: Comments from the Department of Commerce:
Appendix V: GAO Contact and Staff Acknowledgments:

Tables:

Table 1: NWS's Performance Measures, Goals, and Actual Performance for Fiscal Years (FY) 2005, 2006, and 2011:
Table 2: Ongoing and Planned NEXRAD Improvements (as of May 31, 2006):
Table 3: Ongoing and Planned ASOS Improvements (as of May 31, 2006):
Table 4: Ongoing and Planned AWIPS Improvements:
Table 5: System Upgrades Are Linked to Expected Performance Improvements:

Figures:

Figure 1: NWS's 122 Weather Forecast Offices:
Figure 2: Overview of Key Systems and Technologies Supporting NWS Forecasts:
Figure 3: NEXRAD Radar Tower:
Figure 4: An ASOS System:
Figure 5: ASOS Sensors:
Figure 6: Approximate GOES Geographic Coverage:
Figure 7: Configuration of Operational Polar Satellites:
Figure 8: An AWIPS Workstation:
Figure 9: Weather Model Output Shown on an AWIPS Workstation:

Abbreviations:

ASOS: Automated Surface Observing System:
AWIPS: Advanced Weather Interactive Processing System:
DMSP: Defense Meteorological Satellite Program:
DOD: Department of Defense:
FAA: Federal Aviation Administration:
GOES: Geostationary Operational Environmental Satellites:
NEXRAD: Next Generation Weather Radar:
NPOESS: National Polar-orbiting Operational Environmental Satellite System:
NWS: National Weather Service:
NOAA: National Oceanic and Atmospheric Administration:
OMB: Office of Management and Budget:
POES: Polar-orbiting Operational Environmental Satellites:

[End of section]

United States Government Accountability Office:
Washington, DC 20548:

July 14, 2006:

The Honorable Vernon J. Ehlers:
Chairman:
The Honorable David Wu:
Ranking Minority Member:
Subcommittee on Environment, Technology, and Standards:
Committee on Science:
House of Representatives:

The National Weather Service's (NWS) ability to forecast the weather affects the life and property of every American. The agency's basic mission is to provide storm and flood warnings and weather forecasts for the United States, its territories, and adjacent oceans and waters--in order to protect life and property and to enhance the national economy. NWS operations also support other agencies by providing aviation and marine-related weather forecasts and warnings. To carry out its mission, NWS uses a variety of systems, technologies, and manual processes to collect, process, and disseminate weather data to and among its network of field offices and regional and national centers.

In the 1980s and 1990s, NWS undertook a nationwide modernization program to upgrade its systems and technologies and to consolidate its field office structure. Today, with the modernization completed, NWS continues to seek ways to upgrade its systems with a goal of further improving its forecasting abilities.
The agency is also considering changing how its nationwide office structure works in order to enhance its efficiency.

Because of your interest in these plans for continued improvements, you asked us to (1) evaluate NWS's efforts to achieve improvements in the delivery of its services through the upgrades to its systems, models, and computational abilities; (2) assess agency plans to achieve improvements in the delivery of its services through the training and professional development of its employees; and (3) evaluate agency plans to revise its nationwide office configuration and the implications of these plans on local forecasting services, staffing, and budgets.

To address these objectives, we reviewed NWS plans for system enhancements, technology improvements, and professional training, and assessed the extent to which these plans were tied to the agency's service improvement goals. We also interviewed officials from NWS and the National Oceanic and Atmospheric Administration to obtain clarification on agency plans and goals. To determine the status and potential impact of any plans to revise the national office configuration, we assessed NWS reports on ways to enhance its operations and interviewed key officials involved in these reports. We conducted our work at NWS headquarters in the Washington, D.C., metropolitan area and at NWS offices in Denver, Tampa, and Miami. We performed our work from October 2005 to June 2006, in accordance with generally accepted government auditing standards. Additional details on our objectives, scope, and methodology are provided in appendix I.

Results in Brief:

NWS is positioning itself to provide better service through over $315 million in planned upgrades to its systems and technologies through 2011. In annual plans, the agency links expected improvements in its service performance measures with the technologies and systems expected to improve them. For example, NWS expects to reduce the average error in its forecasts of hurricane paths by approximately 20 nautical miles between 2005 and 2011 through a combination of upgrades to observation systems, better hurricane forecast models, enhancements to the computer infrastructure, and research that will be transferred to forecast operations. Also, NWS expects to increase its lead time on tornado warnings from 13 to 15 minutes by the end of fiscal year 2008 after the agency completes an upgrade to its radar system and realizes benefits from software improvements to its forecaster workstations.

NWS also provides training courses for its employees to help improve its forecast services, but the agency's process for selecting training lacks sufficient oversight. Program officials propose and justify training needs on the basis of up to eight different criteria--including whether a course is expected to improve NWS forecasting performance measures, support customer outreach, or increase scientific awareness. Many of these course justifications appropriately demonstrate support for improved forecasting performance. For example, training on how to more effectively use forecaster workstations is expected to help improve tornado, flash flood, and hurricane warnings. However, in justifying training courses, program officials routinely link courses to NWS forecasting performance measures. For example, in 2006, almost all training needs were linked to expectations for improved forecast performance--including training on cardiopulmonary resuscitation, spill prevention, and systems security. The training selection process did not validate or question how these courses could help improve weather forecasts. The overuse of this justification undermines the distinctions among different training courses and the credibility of the course selection process. Until it establishes a training selection process that uses reliable justifications, NWS risks selecting courses that do not most effectively support its training goals.

To improve its efficiency, NWS plans to develop a prototype of a new concept of operations--an effort that could affect its national office configuration, including the location and functions of its offices nationwide. However, many details about the impact of any proposed changes on forecast services, staffing, and budget have yet to be determined. Further, NWS has not yet determined key activities, timelines, or measures for evaluating the prototype of the new office operational structure. As a result, it is not evident that NWS will collect the information it needs on the impact and benefits of any office restructuring in order to make sound and cost-effective decisions.

We are making recommendations to the Secretary of Commerce to direct NWS to strengthen its training selection process; to establish key activities, timelines, and measures for evaluating the prototype of the new concept of operations; and to ensure that the plans for evaluating the new concept of operations address the impact of any changes on budget, staffing, and services.

The Department of Commerce provided written comments on a draft of this report in which it agreed with our recommendations and identified planned steps for implementing them (see app. IV). The department also provided technical corrections, which we have incorporated in this report as appropriate.

Background:

The mission of NWS--an agency within the Department of Commerce's National Oceanic and Atmospheric Administration (NOAA)--is to provide weather, water, and climate forecasts and warnings for the United States, its territories, and its adjacent waters and oceans, in order to protect life and property and to enhance the national economy. NWS is the official source of aviation- and marine-related weather forecasts and warnings, as well as warnings about life-threatening weather situations.

In the 1980s and 1990s, NWS undertook a nationwide modernization program to develop new systems and technologies and to consolidate its field office structure. The goals of the modernization program were to achieve more uniform weather services across the nation, improve forecasts, provide more reliable detection and prediction of severe weather and flooding, permit more cost-effective operations, and achieve higher productivity. The weather observing systems (including radars, satellites, and ground-based sensors) and data processing systems that currently support NWS operations were developed and deployed under the modernization program. During this period, NWS consolidated over 250 large and small weather service offices into the office structure currently in use.

NWS Office Structure: An Overview:

The coordinated activities of weather facilities throughout the United States allow NWS to deliver a broad spectrum of climate, weather, water, and space weather services. These facilities include weather forecast offices, river forecast centers, national centers, and aviation center weather service units. The functions of these facilities are described below.
* 122 weather forecast offices are responsible for providing a wide variety of weather, water, and climate services for their local county warning areas, including advisories, warnings, and forecasts (see fig. 1 for the current location of weather forecast offices).

* 13 river forecast centers provide river, stream, and reservoir information to a wide variety of government and commercial users as well as to local weather forecast offices for use in flood forecasts and warnings.

* 9 national centers constitute the National Centers for Environmental Prediction, which provide nationwide computer model output and manual forecast information to all NWS field offices and to a wide variety of government and commercial users. These centers include the Environmental Modeling Center, Storm Prediction Center, Tropical Prediction Center, Climate Prediction Center, Aviation Weather Center, and Space Environment Center, among others.

* 21 aviation center weather service units, which are co-located with key Federal Aviation Administration (FAA) air traffic control centers across the nation, provide meteorological support to air traffic controllers.

Figure 1: NWS's 122 Weather Forecast Offices:
[See PDF for image]
Sources: NWS and MapArt.
[End of figure]

NWS Relies on Key Systems and Technologies to Fulfill Its Mission:

To fulfill its mission, NWS relies on a national infrastructure of systems and technologies to gather and process data from the land, sea, and air. NWS collects data from many sources, including ground-based Automated Surface Observing Systems (ASOS), Next Generation Weather Radars (NEXRAD), and operational environmental satellites. These data are integrated by advanced data processing workstations--called Advanced Weather Interactive Processing Systems (AWIPS)--used by meteorologists to issue local forecasts and warnings. The data are also fed into sophisticated computer models running on high-speed supercomputers, which are then used to help develop forecasts and warnings. Figure 2 depicts the integration of the various systems and technologies and is followed by a description of each.

Figure 2: Overview of Key Systems and Technologies Supporting NWS Forecasts:
[See PDF for image]
Source: GAO.
[End of figure]

Next Generation Weather Radar (NEXRAD):

NEXRAD is a Doppler radar system[Footnote 1] that detects, tracks, and determines the intensity of storms and other areas of precipitation, determines wind velocities in and around detected storm events, and generates data and imagery to help forecasters distinguish hazards such as severe thunderstorms and tornadoes. It also provides information about heavy precipitation that leads to warnings about flash floods and heavy snow. The NEXRAD network provides data to other government and commercial users and to the general public via the Internet. The NEXRAD network is made up of 158 operational radars and 8 nonoperational radars that are used for training and testing. Of these, NWS operates 120 radars, the Air Force operates 26 radars, and the FAA operates 12 radars. These radars are located throughout the continental United States and in 17 locations outside the continental United States. Figure 3 shows a NEXRAD radar tower.

Figure 3: NEXRAD Radar Tower:
[See PDF for image]
Source: NOAA.
[End of figure]

Automated Surface Observing System (ASOS):

ASOS is a system of sensors, computers, display units, and communications equipment that automates the ground-based observation and dissemination of weather information nationwide.
This system collects data on temperature and dew point, visibility, wind direction and speed, pressure, cloud height and amount, and types and amounts of precipitation. ASOS supports weather forecast activities and aviation operations, as well as the needs of research communities that study weather, water, and climate. Figure 4 is a picture of the system, while figure 5 depicts a configuration of ASOS sensors and describes their functions. There are currently 1,002 ASOS units deployed across the United States, with NWS, FAA, and the Department of Defense (DOD) operating 313, 571, and 118 units, respectively.

Figure 4: An ASOS System:
[See PDF for image]
Source: NOAA.
[End of figure]

Figure 5: ASOS Sensors:
[See PDF for image]
Source: NWS.
[End of figure]

Operational Environmental Satellites:

Although NWS does not own or operate satellites, geostationary and polar-orbiting environmental satellite programs are key sources of data for its operations. NOAA manages the Geostationary Operational Environmental Satellite (GOES) system and the Polar-orbiting Operational Environmental Satellite (POES) system. In addition, DOD operates a different polar satellite program called the Defense Meteorological Satellite Program (DMSP). These satellite systems continuously collect environmental data about the Earth's atmosphere, surface, cloud cover, and electromagnetic environment. These data are used by meteorologists to develop weather forecasts and other services, and are critical to the early and reliable prediction of severe storms, such as tornadoes and hurricanes.

Geostationary satellites orbit above the Earth's surface at the same speed as the Earth rotates, so that each satellite remains over the same location on Earth. NOAA operates GOES as a two-satellite system that is primarily focused on the United States (see fig. 6). To provide continuous satellite coverage, NOAA acquires several satellites at a time as part of a series and launches new satellites every few years.[Footnote 2] Three satellites, GOES-10, GOES-11, and GOES-12, are currently in orbit. Both GOES-10 and GOES-12 are operational satellites, while GOES-11 is in an on-orbit storage mode. It is a backup for the other two satellites should they experience any degradation in service. The first in the next series of satellites, GOES-13, was launched in May 2006, and the others in the series, GOES-O and GOES-P, are planned for launch over the next few years.[Footnote 3] In addition, NOAA is planning a future generation of satellites, known as the GOES-R series, which are planned for launch beginning in 2014.

Figure 6: Approximate GOES Geographic Coverage:
[See PDF for image]
Sources: NWS and MapArt.
[End of figure]

Unlike the GOES satellites, which maintain a fixed position above the Earth, polar satellites constantly circle the Earth in an almost north-south orbit, providing global coverage of conditions that affect the weather and climate. Each satellite makes about 14 orbits a day. As the Earth rotates beneath it, each satellite views the entire Earth's surface twice a day. Currently, there are four operational polar-orbiting satellites--two are POES satellites and two are DMSP satellites. These satellites are positioned so that they can observe the Earth in early morning, morning, and afternoon polar orbits. Together, they ensure that for any region of the Earth, the data are generally no more than 6 hours old.
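The coverage figures cited above can be checked with some rough arithmetic. The short Python sketch below works through it under simplified assumptions (it ignores orbital geometry and sensor swath width, and treats the passes as evenly spread); the orbit and satellite counts come from the text, while the calculation itself is only an illustration.

```python
# Rough coverage arithmetic for the polar constellation described above.
# Inputs (14 orbits/day, 2 full-Earth views per satellite per day, 4
# operational satellites) come from the report; everything else is a
# simplification that ignores orbital geometry and swath overlap.

ORBITS_PER_DAY = 14
MINUTES_PER_DAY = 24 * 60

orbital_period_min = MINUTES_PER_DAY / ORBITS_PER_DAY    # about 103 minutes per orbit

satellites = 4            # two POES plus two DMSP spacecraft
views_per_satellite = 2   # each satellite sees the whole Earth's surface twice a day

views_per_day = satellites * views_per_satellite          # 8 full-Earth views per day
average_refresh_hr = 24 / views_per_day                    # about 3 hours between views

print(f"Orbital period: about {orbital_period_min:.0f} minutes")
print(f"Average time between views of a given region: about {average_refresh_hr:.0f} hours")
# With passes spread across early morning, morning, and afternoon orbits,
# the worst-case data age stays on the order of the 6 hours cited above.
```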
Figure 7 illustrates the current configuration of operational polar satellites.

Figure 7: Configuration of Operational Polar Satellites:
[See PDF for image]
Sources: GAO and MapArt.
[End of figure]

NOAA and DOD plan to continue to launch remaining satellites in the POES and DMSP programs, with final launches scheduled for 2007 and 2011, respectively. In addition, NOAA, DOD, and the National Aeronautics and Space Administration are planning to replace the POES and DMSP systems with a state-of-the-art environment monitoring satellite system called the National Polar-orbiting Operational Environmental Satellite System (NPOESS). In recent years, we reported on a variety of issues affecting this major system acquisition.[Footnote 4]

Advanced Weather Interactive Processing System (AWIPS):

AWIPS is a computer system that integrates and displays all hydrometeorological data at NWS field offices. This system integrates data from NEXRAD, ASOS, GOES, and other sources to produce rich graphical displays to aid forecaster analysis and decision making. AWIPS is used to disseminate weather information to the national centers, weather offices, the media, and other federal, state, and local government agencies. NWS deployed hardware and software for this system to weather forecast offices, river forecast centers, and national centers throughout the United States between 1996 and 1999. As a software-intensive system, AWIPS regularly receives software upgrades called "builds." The most recent build, called Operational Build 6, is currently being deployed. NWS officials estimated that the nationwide deployment of this build should be completed by July 2006. Figure 8 shows a standard AWIPS workstation.

Figure 8: An AWIPS Workstation:
[See PDF for image]
Source: NOAA.
[End of figure]

Numerical Models:

Numerical models are advanced software programs that assimilate data from satellites and ground-based observing systems and provide short- and long-term weather pattern predictions. Meteorologists typically use a combination of models and their own experience to develop local forecasts and warnings. Numerical weather models are also a critical source for forecasting weather up to 7 days in advance and forecasting long-term climate changes. One of NWS's National Centers for Environmental Prediction, the Environmental Modeling Center, is the primary developer of these models within NWS and is responsible for making new and improved models available to regional forecasters via the AWIPS system. Figure 9 depicts model output as shown on an AWIPS workstation.

Figure 9: Weather Model Output Shown on an AWIPS Workstation:
[See PDF for image]
Source: NOAA.
[End of figure]

Supercomputers:

NWS leases high-performance supercomputers to execute numerical calculations supporting weather prediction and climate modeling. In 2002, NWS awarded a $227 million contract to lease high-performance supercomputers to run its environmental models from 2002 through September 2011. Included in this contract are an operational supercomputer used to run numerical weather models, an identical backup supercomputer located at a different site, and a research and development supercomputer on which researchers can test out new analyses and models. The supercomputer lease contract allows NWS to exercise options to upgrade the processing capabilities of the operational supercomputer.
Previous Reports Focused on NWS Modernization Systems Risks:

During the 1990s, we issued a series of reports on NWS modernization systems and made recommendations to improve them.[Footnote 5] For example, early in the AWIPS acquisition, we reported that the respective roles and responsibilities of the contractor and government were not clear and that a structured system development environment had not been established. We made recommendations to correct these shortfalls before the system design was approved. We also reported that the ASOS system was not meeting specifications or user needs, and recommended that NWS define and prioritize system corrections and enhancements. On NEXRAD, we reported that selected units were falling short of availability requirements and recommended that NWS analyze and monitor system availability on a site-specific basis and correct any shortfalls. Because of such concerns, we identified NWS modernization as a high-risk information technology initiative in 1995, 1997, and 1999.[Footnote 6]

NWS took a number of actions to address our recommendations and to resolve system risks. For example, NWS enhanced its AWIPS system development processes, prioritized its ASOS enhancements, and improved the availability of its NEXRAD systems. In 2001, because of NWS's progress in addressing key concerns and in deploying and using the AWIPS system--the final component of its modernization program--we removed the modernization from our high-risk list.

NWS Established Performance Goals and Tracks Progress against These Goals:

In accordance with federal legislation requiring federal managers to focus more directly on program results, NWS established short- and long-term performance goals and regularly tracks its actual performance in meeting these goals.[Footnote 7] Specifically, NWS established 14 different performance measures--such as lead time for flash floods and false-alarm rates for tornado warnings. It also established 5-year goals for improving its performance in each of the 14 performance measures through 2011. For example, the agency plans to increase its lead time on tornado warnings from 13 minutes in 2005 to 15 minutes in 2011. Table 1 identifies NWS's 14 performance measures, selected goals, and performance against those goals, when available. Appendix II provides additional information on NWS's performance goals.

Table 1: NWS's Performance Measures, Goals, and Actual Performance for Fiscal Years (FY) 2005, 2006, and 2011:

Performance measure: Tornado warning lead time (minutes); Description: The difference between the time a warning is issued and the time of the first report of a tornado in a given county; FY05 goal: 13; FY05 actual (final): 13; goal met? Yes; FY06 goal: 13; FY06 actual to date: 13[A]; on target? Yes; FY11 goal: 15.

Performance measure: Tornado warning accuracy (percent); Description: The percentage of time a tornado actually occurred in an area covered by a tornado warning; FY05 goal: 73; FY05 actual (final): 75; goal met? Yes; FY06 goal: 76; FY06 actual to date: 82[A]; on target? Yes; FY11 goal: 76.

Performance measure: Tornado warning false-alarm rate (percent); Description: The percentage of time a tornado warning was issued but no tornado event was reported; FY05 goal: 73; FY05 actual (final): 77; goal met? No; FY06 goal: 75; FY06 actual to date: 76[A]; on target? No; FY11 goal: 74.

Performance measure: Flash flood warning lead time (minutes); Description: The difference between the time a warning is issued and the time of the first report of a flash flood in a given county; FY05 goal: 48; FY05 actual (final): 54; goal met? Yes; FY06 goal: 48; FY06 actual to date: 63[A]; on target? Yes; FY11 goal: 49.

Performance measure: Flash flood warning accuracy (percent); Description: The percentage of time a flash flood actually occurred in an area covered by a flash flood warning; FY05 goal: 89; FY05 actual (final): 88; goal met? No; FY06 goal: 89; FY06 actual to date: 93[A]; on target? Yes; FY11 goal: 90.

Performance measure: Marine wind speed forecast accuracy (percent); Description: A measure of the accuracy of wind speed forecasts; FY05 goal: 57; FY05 actual (final): 57; goal met? Yes; FY06 goal: 58; FY06 actual to date: 56[B]; on target? No; FY11 goal: 59.

Performance measure: Marine wave height forecast accuracy (percent); Description: A measure of the accuracy of wave forecasts; FY05 goal: 67; FY05 actual (final): 67; goal met? Yes; FY06 goal: 68; FY06 actual to date: 71[B]; on target? Yes; FY11 goal: 69.

Performance measure: Aviation forecast Instrument Flight Rule ceiling/visibility accuracy (percent); Description: The percentage of time Instrument Flight Rule conditions[E] are predicted and occur; FY05 goal: 46; FY05 actual (final): 46; goal met? Yes; FY06 goal: 47; FY06 actual to date: 45[B]; on target? No; FY11 goal: 59.

Performance measure: Aviation forecast Instrument Flight Rule ceiling/visibility false-alarm rate (percent); Description: The percentage of time Instrument Flight Rule conditions[E] are predicted but do not occur; FY05 goal: 68; FY05 actual (final): 63; goal met? Yes; FY06 goal: 65; FY06 actual to date: 61[B]; on target? Yes; FY11 goal: 50.

Performance measure: Winter storm warning lead time (hours); Description: The average time from the issuance of a warning to the time of the first report of a winter storm in a given county; FY05 goal: 15; FY05 actual (final): 17; goal met? Yes; FY06 goal: 15; FY06 actual to date: 16[C]; on target? Yes; FY11 goal: 17.

Performance measure: Winter storm warning accuracy (percent); Description: The percentage of verified winter storm events that were covered by winter storm warnings; FY05 goal: 90; FY05 actual (final): 91; goal met? Yes; FY06 goal: 90; FY06 actual to date: 91[C]; on target? Yes; FY11 goal: 92.

Performance measure: Precipitation forecast day 1 threat (score); Description: A score based on the agency's accuracy in forecasting precipitation; FY05 goal: 27; FY05 actual (final): 29; goal met? Yes; FY06 goal: 28; FY06 actual to date: 39[D]; on target? Yes; FY11 goal: 30.

Performance measure: U.S. seasonal temperature forecast skill (score); Description: A score based on the agency's accuracy in forecasting temperature; FY05 goal: 18; FY05 actual (final): 19; goal met? Yes; FY06 goal: 18; FY06 actual to date: 24[D]; on target? Yes; FY11 goal: 20.

Performance measure: Hurricane track forecast error at 48 hours (nautical miles); Description: A measure of the difference between the projected locations of the center of storms and the actual location in nautical miles for the Atlantic Basin; FY05 goal: 128; FY05 actual (final): 101; goal met? Yes; FY06 goal: 111; FY06 actual to date: N/A[F]; on target? N/A[F]; FY11 goal: 106.

Source: GAO analysis of NOAA and NWS reports.

[A] Metric measured between October 2005 and January 2006.
[B] Metric measured between October 2005 and February 2006.
[C] Metric measured between October 2005 and December 2005.
[D] Metric measured between October 2005 and March 2006.
[E] Instrument Flight Rule conditions exist when ceilings and visibilities are less than 1,000 feet and/or 3 miles, respectively, and ceilings and visibilities are greater than, or equal to, 500 feet and/or 1 mile, respectively.
[F] Data for this metric are not available until the beginning of the next calendar year because of the timing of the hurricane season.

[End of table]

NWS periodically adjusts its performance goals as its assumptions change. After reviewing actual results from previous fiscal years and its assumptions about the future, in January 2006, NWS adjusted eight of its 5-year performance goals to make more realistic predictions for performance for the next several years. Specifically, NWS made six performance goals less stringent and two goals more stringent. The six goals that were made less stringent--and the reasons for the changes--are the following:

* Tornado warning lead time: NWS changed its 2011 goal from 17 to 15 minutes of warning because of delays in deploying new technologies on NEXRAD radars and a lack of access to FAA radar data.

* Tornado warning false-alarm rate: NWS changed its 2011 goal from a 70 to 74 percent false-alarm rate for the same reasons listed above.

* Flash flood warning accuracy: NWS changed its 2011 goal from 91 to 90 percent accuracy after delays on two different systems in 2004, 2005, and 2006.

* Marine wind speed accuracy: NWS changed its 2011 goal from 67 to 59 percent accuracy after experiencing the delay of marine models and datasets, a deficiency of shallow water wave guidance, and a reduction in funds for training.

* Marine wave height accuracy: NWS changed its 2011 goal from 77 to 69 percent accuracy for the same reasons identified above for marine wind speed accuracy.

* Aviation instrument flight rule ceiling/visibility: NWS changed its goal from 48 to 47 percent accuracy in 2006 because of a system delay and a reduction in funds for training. Goals for 2007 through 2011 remained the same.

Additionally, the following two goals were made more stringent:

* Aviation instrument flight rule ceiling/visibility false-alarm rate: NWS reduced its expected false-alarm rate from 68 percent to 65 percent for 2006 because of better than anticipated results from the AWIPS aviation forecast preparation system and an aviation learning training course. Goals for the remaining years in the 5-year plan, 2007 to 2011, remained the same.

* Hurricane track forecasts: NWS changed its 2011 hurricane track forecast goal from 123 to 106 nautical miles after trends in observed data from 1987 to 2004 showed that this measure was improving more quickly than expected.
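To make the warning verification measures defined in table 1 concrete, the minimal Python sketch below computes a lead time, an accuracy figure, and a false-alarm rate from a handful of hypothetical warning and event records. The records and thresholds are invented for illustration; NWS's actual verification procedures (county-level matching of warnings to event reports) are more involved and are not shown.

```python
# Illustrative calculation of three warning verification measures from
# table 1: lead time, accuracy (events covered by a warning), and
# false-alarm rate (warnings with no event). Hypothetical records only.
from datetime import datetime

# (warning issued, time of first event report, or None if no event occurred)
warnings = [
    (datetime(2006, 5, 1, 14, 0), datetime(2006, 5, 1, 14, 12)),   # 12-minute lead
    (datetime(2006, 5, 1, 16, 30), datetime(2006, 5, 1, 16, 44)),  # 14-minute lead
    (datetime(2006, 5, 2, 9, 15), None),                            # false alarm
]
unwarned_events = 1  # events that occurred with no warning in effect

verified = [(w, e) for w, e in warnings if e is not None]
lead_times = [(e - w).total_seconds() / 60 for w, e in verified]

avg_lead_time = sum(lead_times) / len(lead_times)                    # 13 minutes
total_events = len(verified) + unwarned_events
accuracy = len(verified) / total_events                              # 67 percent
false_alarm_rate = (len(warnings) - len(verified)) / len(warnings)   # 33 percent

print(f"Average lead time: {avg_lead_time:.0f} minutes")
print(f"Warning accuracy: {accuracy:.0%}")
print(f"False-alarm rate: {false_alarm_rate:.0%}")
```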
NWS Is Positioning Itself to Provide Better Service through Upgrades to Its Systems and Technologies:

NWS is positioning itself to provide better service through system and technology upgrades. Over the next few years, the agency plans to upgrade and improve its systems, predictive weather models, and computational abilities, and it appropriately links these upgrades to its performance goals. For example, planned improvements in NEXRAD technology are expected to help improve the lead times for tornado warnings, while AWIPS software enhancements are expected to help improve the accuracy of marine weather forecasts. The agency anticipates continued steady improvement in its forecast accuracy as it obtains better observation data, as computational resources are increased, and as scientists are better able to implement advanced modeling and data assimilation techniques.

NWS Has Plans for Upgrading Its Systems, Models, and Computational Abilities:

Over the next few years, NWS has plans to spend over $315 million to upgrade its systems, models, and computational abilities.[Footnote 8] Some planned upgrades are to maintain the weather system infrastructure (either to replace obsolete and difficult-to-maintain parts or to refresh aging hardware and workstations), while others are to take advantage of new technologies. Often, the infrastructure upgrades allow NWS to take advantage of newer technologies. For example, the replacement of an aging and proprietary NEXRAD subsystem is expected to allow the agency to implement enhancements in image resolution. Key planned upgrades for each of NWS's major systems and technologies are listed below.

NEXRAD:

NWS has initiated two major NEXRAD improvements. It is currently replacing an outdated subsystem--the radar data acquisition subsystem--with current hardware that is compliant with open system standards. This new hardware is expected to enable important software upgrades. In addition, NWS plans to add a new technology called dual polarization to this subsystem, which will provide more accurate rainfall estimates and differentiate various forms of precipitation. Table 2 shows the details of these two projects.

Table 2: Ongoing and Planned NEXRAD Improvements (as of May 31, 2006):

Improvement: Radar data acquisition subsystem replacement; Description: A subsystem that transmits and receives radar signals, controls the radar antenna, processes the received signal, and sends the processed data to the radar product generator; replacement of this subsystem will enable software upgrades, including an enhancement that will allow operators to view more detailed weather features; Current status: In process; 107 of 158 sites have been installed; Estimated acquisition cost: $43.8 million (NWS portion is $22.6 million); Estimated completion date: Late 2006.

Improvement: Dual polarization technology upgrade; Description: A technology upgrade to allow enhanced target identification; Current status: Acquisition process is under way; Estimated acquisition cost: $38 million (NWS portion is $25 million); Estimated completion date: Contract award is expected at the end of 2006; deployment is expected to begin in fiscal year 2009 and end in fiscal year 2011.

Source: NEXRAD Program Office.

[End of table]

ASOS:

NWS has seven ongoing and planned improvements for its ASOS system (see table 3). Many of these improvements are to replace aging parts and are expected to make the system more reliable and maintainable. Key subsystem replacements--including the all-weather precipitation accumulation gauge--are also expected to result in more accurate measurements.

Table 3: Ongoing and Planned ASOS Improvements (as of May 31, 2006):

Improvement: Processor upgrade; Description: Provides a more robust processor with increased capacity, speed, and memory; Current status: 962 installations completed out of 1,002 total planned sites (312 installations completed out of 313 NWS sites); Estimated or actual acquisition cost: $6.61 million (NWS portion is $2.89 million); Estimated or actual completion date: June 30, 2006.

Improvement: All-weather precipitation accumulation gauge; Description: Replaces the existing heated tipping bucket rain gauge with a gauge that measures precipitation by weight, resulting in more accurate measurements; Current status: 323 of 331 installed (303 of 311 NWS); Estimated or actual acquisition cost: $7.10 million; Estimated or actual completion date: June 30, 2006.

Improvement: Dewpoint sensor; Description: Replaces the existing sensor's chilled mirror technology with a humidity-sensitive capacitor; Current status: 958 of 1,002 installed (303 of 311 NWS); Estimated or actual acquisition cost: $9.20 million (NWS portion is $3.14 million); Estimated or actual completion date: June 30, 2006.

Improvement: Ice-free wind sensor; Description: Replaces the existing cup and vane anemometer with a new ultrasonic sensor; Current status: 231 of 1,000 installed (60 of 311 NWS); Estimated or actual acquisition cost: $7.53 million (NWS portion is $2.90 million); Estimated or actual completion date: November 30, 2006.

Improvement: Enhanced precipitation identifier; Description: Replaces a sensor that only reports rain and snow with one that is to report rain, snow, drizzle, hail, and ice pellets; Current status: Field demonstration testing to begin July 2006; Estimated or actual acquisition cost: $10.14 million (NWS portion is $3.55 million); Estimated or actual completion date: March 31, 2009.

Improvement: Ceilometer (cloud height); Description: Replaces a sensor that measures cloud heights up to 12,000 feet with one that is expected to measure cloud heights up to 40,000 feet; Current status: Evaluation of commercial sensors almost complete; solicitation for system development expected to begin by end of May 2006; Estimated or actual acquisition cost: $33 million (NWS portion is $12 million); Estimated or actual completion date: September 30, 2011.

Improvement: Sunshine duration sensor; Description: Adds a new sensor to measure solar radiation; Current status: Program on hold pending ceilometer production; will be developed after the ceilometer; planned to restart by 2010; Estimated or actual acquisition cost: $1.77 million (this upgrade affects only NWS systems); Estimated or actual completion date: September 30, 2011.

Source: ASOS program office.

[End of table]

AWIPS:

Selected AWIPS system components have become obsolete, and NWS is replacing these components. In 2001, NWS began to migrate the existing Unix-based systems to a Linux system to reduce its dependence on any particular hardware platform. NWS expects this project, combined with upgraded information technology, to delay the need for a major information technology replacement. Table 4 shows planned improvements for the AWIPS system.
Table 4: Ongoing and Planned AWIPS Improvements:

Improvement: Linux migration; Description: An effort to replace legacy hardware and to port approximately 4 million source lines of code of AWIPS software from the original proprietary Hewlett-Packard Unix operating system to the open source Linux operating system; Current status: In progress; Estimated cost: $17.92 million; Timeline/estimated completion date: 2002 to 2007.

Improvement: Architecture analysis; Description: An effort to refine the AWIPS hardware and communications architecture in support of the Linux migration and to build an advanced Linux prototype system; Current status: In progress; Estimated cost: $900,000; Timeline/estimated completion date: 2004 to 2006.

Improvement: Information technology security; Description: An initiative to replace obsolete routers and firewalls throughout the system; Current status: In progress; Estimated cost: $3.22 million; Timeline/estimated completion date: 2004 to 2006.

Improvement: Hardware refresh; Description: An initiative to keep the AWIPS hardware baseline fresh and maintainable through a continuous technology refresh; NWS plans to refresh hardware components every 4 to 5 years after the Linux migration is completed; Current status: In progress; Estimated cost: $53.21 million; Timeline/estimated completion date: 2006 to 2015.

Improvement: Software re-architecture; Description: An initiative to reengineer the AWIPS software suite to a standard service-oriented architecture; Current status: In progress; Estimated cost: $23 million; Timeline/estimated completion date: 2006 to 2010.

Improvement: Software upgrades; Description: Includes efforts to enhance advanced precipitation algorithms for estimating rainfall; continue enhancement of advanced decision assistance tools; implement a distributed hydraulic model; and enhance forecasting and evaluation of seas and lakes to provide a prediction capability tool for marine forecasters; Current status: In progress; Estimated cost: About $10 million per year; Timeline/estimated completion date: Continuous.

Source: NWS.

[End of table]

Numerical Models:

NWS plans to continue to improve its modeling capabilities by (1) better assimilating data from improved observation systems such as ASOS, NEXRAD, and environmental satellites; (2) developing and implementing an advanced global forecasting model (called the Weather Research and Forecast model) to allow forecasters to look at a larger domain area; (3) implementing a hurricane weather research forecast model; and (4) improving ensemble modeling, which involves running a single model multiple times with slight variations on a variable to get a probability that a given forecast is likely to occur. NWS expects to spend approximately $12.7 million in fiscal year 2006 to improve its weather and real-time ocean models.

Supercomputers:

NWS is planning to exercise an option within its existing supercomputer lease to upgrade its computing capabilities to allow more advanced numerical weather and climate prediction modeling.
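The ensemble approach described under Numerical Models above can be illustrated with a toy sketch: run the same "model" many times with slightly perturbed input and report the fraction of runs that exceed a threshold as a probability. The one-line model, perturbation size, and threshold below are invented for illustration; operational NWS ensembles perturb full numerical weather models, not a simple function.

```python
# Toy illustration of ensemble modeling: perturb the initial condition,
# rerun the same forecast "model," and convert the spread of outcomes
# into a probability. All values here are hypothetical.
import random

def toy_model(initial_temp_c: float) -> float:
    """Stand-in for a forecast model: projects tomorrow's temperature."""
    return initial_temp_c + 1.5  # pretend warming trend

random.seed(0)
observed_temp = 20.0   # today's analysis value (hypothetical)
members = 50           # number of ensemble members
threshold = 22.0       # event of interest: tomorrow warmer than 22 C

forecasts = [
    toy_model(observed_temp + random.gauss(0, 0.8))  # perturb the initial condition
    for _ in range(members)
]

probability = sum(f > threshold for f in forecasts) / members
print(f"Probability tomorrow exceeds {threshold} C: {probability:.0%}")
```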
NWS Appropriately Links Its System and Technical Upgrades to Expected Service Improvements:

In accordance with federal legislation and policy, NWS's planned upgrades to its systems and technologies are expected to result in improved service. The Government Performance and Results Act calls for federal managers to develop strategic performance goals and to focus program activities on obtaining results.[Footnote 9] Also, the Office of Management and Budget (OMB) requires agencies to justify major investments by showing how they support performance goals.[Footnote 10] NOAA and NWS implement the act and OMB guidance by requiring project officials to describe how planned system and technology upgrades are linked to the agency's programmatic priorities and performance measures. Further, in its annual performance plans, NOAA reports on expected NWS service improvements and identifies the technologies and systems that are expected to help improve them.

NWS service improvements are often expected through a combination of system and technology improvements. For example, NWS expects to reduce its average error in forecasting a hurricane's path by approximately 20 nautical miles between 2005 and 2011 through a combination of upgrades to observation systems, better hurricane forecast models, enhancements to the computer infrastructure, and research that will be transferred to NWS forecast operations. Also, NWS expects tornado warning lead times to increase from 13 to 15 minutes by the end of fiscal year 2008 after NWS completes retrofits to the NEXRAD systems, realizes the benefits of AWIPS software enhancements, and implements new training techniques. Table 5 provides a summary of how system upgrades are expected to result in service improvements.

Table 5: System Upgrades Are Linked to Expected Performance Improvements:

System: NEXRAD; Expected results of ongoing and planned system upgrades: Replacement of the data acquisition subsystem is expected to allow future software and hardware enhancements. These enhancements are expected to improve forecasting performance; Primary performance measures affected: Tornado warning lead time; tornado warning accuracy; tornado warning false-alarm rate; flash flood warning lead time; flash flood warning accuracy; winter storm warning lead time; winter storm warning accuracy.

System: ASOS; Expected results of ongoing and planned system upgrades: Processor and sensor replacements are expected to allow more reliable and maintainable systems. Selected system improvements--including the deployment of an all-weather precipitation gauge, an enhanced precipitation identifier, and a new ceilometer--are expected to directly improve forecasting performance; Primary performance measures affected: Flash flood warning lead time; flash flood warning accuracy; aviation forecast ceiling/visibility accuracy; aviation forecast ceiling/visibility false-alarm rate.

System: AWIPS; Expected results of ongoing and planned system upgrades: Infrastructure upgrades (including a software migration and hardware refreshment) are expected to allow major software enhancements that will result in more accurate and timely forecasts; Primary performance measures affected: Tornado warning lead time; tornado warning accuracy; tornado warning false-alarm rate; flash flood warning lead time; flash flood warning accuracy; marine wind speed forecast accuracy; marine wave height forecast accuracy; aviation forecast ceiling/visibility accuracy; aviation forecast ceiling/visibility false-alarm rate; winter storm warning lead time.

System: Supercomputers; Expected results of ongoing and planned system upgrades: Increased computational capabilities are expected to allow advanced modeling and data assimilation--and to result in improved forecast accuracy; Primary performance measures affected: Winter storm warning lead time; winter storm warning accuracy; precipitation forecast day 1 threat score; U.S. seasonal temperature forecast skill; hurricane track forecast error at 48 hours.

System: Models; Expected results of ongoing and planned system upgrades: Modeling improvements, enabled by increased supercomputer capacity, are expected to result in more accurate and timely forecasts; Primary performance measures affected: Flash flood warning lead time; flash flood warning accuracy; marine wind speed forecast accuracy; marine wave height forecast accuracy; aviation forecast ceiling/visibility accuracy; aviation forecast ceiling/visibility false-alarm rate; winter storm warning lead time; winter storm warning accuracy; precipitation forecast day 1 threat score; U.S. seasonal temperature forecast skill; hurricane track forecast error at 48 hours.

Source: GAO analysis of NWS data.

[End of table]

NWS's Training Is Expected to Result in Forecast Service Improvements, but the Training Selection Process Lacks Sufficient Oversight:

NWS provides employee training courses that are expected to help improve forecast service performance, but the agency's process for selecting this training lacks sufficient oversight.

Each year, NWS identifies its training needs and develops this training in order to enhance its services. NWS develops an annual training and education plan identifying planned training, how this training supports key criteria, and associated costs for the upcoming year. To develop the annual plan, program area teams, with representatives from NWS headquarters and field offices, prioritize and submit training recommendations. Each submission identifies how the training will support up to eight different criteria--including the course's expected effect on NWS forecasting performance measures, support for NOAA strategic goals, operational continuity, and customer outreach. These submissions are screened by a training and education team and, depending on available resources, selected for development (if not pre-existing) and implementation. The planned training courses are then delivered through a variety of means, including courses at the NWS training center, online training, and training at local forecast offices.

In its 2006 training process, 25 program area teams identified 134 training needs, such as training on how to more effectively use AWIPS, training on an advanced weather simulator, and training on maintaining ASOS systems. Given an expected funding level of $6.1 million, the training and education team then selected 68 of these training needs for implementation. NWS later identified another 5 training needs and allocated an additional $1.25 million to its training budget. In total, NWS funded 73 of 139 training courses.

The majority of planned training courses demonstrate a clear link to expected forecasting service improvements. For example, NWS developed a weather event simulator to help forecasters improve their tornado warning lead times. In addition, AWIPS-related training courses are expected to help improve each of the agency's 14 forecasting performance measures by teaching forecasters advanced techniques in using the integrated data processing workstations.
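Purely as an illustration of the budget-constrained screening step in the 2006 cycle described above, the sketch below funds prioritized submissions until the budget is exhausted. The course names, costs, priority scores, and budget figure are hypothetical; they do not represent NWS's actual criteria, rankings, or costs.

```python
# Hypothetical sketch of a budget-constrained training selection step:
# program teams submit prioritized needs, and a screening team funds as
# many as the budget allows. All names and numbers are invented.
from typing import NamedTuple

class TrainingNeed(NamedTuple):
    name: str
    cost: float      # estimated cost in dollars
    priority: int    # 1 = highest priority as ranked by the program area team

submissions = [
    TrainingNeed("AWIPS warning techniques", 900_000, 1),
    TrainingNeed("Weather event simulator refresher", 700_000, 2),
    TrainingNeed("ASOS maintenance", 400_000, 3),
    TrainingNeed("CPR certification", 150_000, 4),
]

def select(needs, budget):
    """Fund needs in priority order until the budget is exhausted."""
    funded, remaining = [], budget
    for need in sorted(needs, key=lambda n: n.priority):
        if need.cost <= remaining:
            funded.append(need)
            remaining -= need.cost
    return funded, remaining

funded, left_over = select(submissions, budget=2_000_000)
print([n.name for n in funded], f"${left_over:,.0f} unallocated")
```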
However, NWS's process for selecting which training courses to implement lacks sufficient oversight. In justifying training courses, program officials routinely link proposed courses to NWS forecast performance measures. Specifically, in 2006, 131 of the 134 original training needs were linked to expectations for improved forecasting performance--including training on cardiopulmonary resuscitation, spill prevention, leadership, systems security, and equal employment opportunity/diversity. The training selection process did not validate or question whether these courses would improve tornado warning lead times or hurricane warning accuracy. Although these courses are important and likely justifiable on other bases, the overuse of this justification undermines the distinctions among training courses and the credibility of the course selection process. Additionally, because the training selection process does not clearly distinguish among courses, it is difficult to determine whether sufficient funds are dedicated to the courses that are expected to improve performance.

NWS training officials acknowledged that some of the course justifications seem questionable and that more needs to be done to strengthen the training selection process and ensure adequate oversight of justifications and prioritization. They noted that the training division plans to improve the training selection process over the next few years by adding a more systematic worker-focused assessment of training needs, better prioritizing strategic and organizational needs, and initiating post-implementation reviews. However, until NWS establishes a training selection process that uses reliable justifications and results in understandable decisions, NWS risks selecting courses that do not most effectively support its training goals.

Changing Concept of Operations Could Affect Nationwide Office Configuration, but Impact on Forecast Services, Staffing, and Budget Is Not Yet Known:

NWS plans to develop a prototype of a new concept of operations--an effort that could affect its national office configuration,[Footnote 11] including the location and functions of its offices nationwide. However, NWS has yet to determine many details about the impact of any proposed changes on NWS forecast services, staffing, and budget. Further, NWS has not yet identified key activities, timelines, or measures for evaluating the concept of operations prototype. As a result, it is not evident that NWS will collect the information it needs on the impact and benefits of any office restructuring in order to make sound and cost-effective decisions.

NWS Is Evaluating Changes to Its Current Operations:

According to agency officials, over the last several years, NWS's corporate board[Footnote 12] noted that the constrained budget, high labor costs, difficulty in training and developing its employees, and a lack of flexibility in how the agency was operating were making it more difficult for the agency to continue to perform its mission. In August 2005, the board chartered a working group to evaluate the roles, responsibilities, and functions of weather offices nationwide and to make a proposal for a new concept of operations. The group was given a set of guiding principles, including that the proposed concept should (1) be cost effective, (2) ensure that there would be no degradation of service, (3) ensure that weather services nationwide were equitable, and (4) not reduce the number of forecast offices nationwide.
In addition, the working group was instructed not to address grade structure, staffing levels, office sizes, or overall organizational chart structure. The group gathered input from various agency stakeholders and other partners within NOAA and considered multiple alternatives. It dismissed all but one of the alternative concepts because the others were not consistent with the guiding principles. In its December 2005 proposal, the working group proposed a "clustered peer" office plan designed to redistribute some functions among various offices, particularly when there is a high-intensity weather event.

An agency official explained that each weather forecast office currently has a fixed geographic area for which it provides forecasts. If a severe weather event occurs, forecast offices ask their staff to work overtime so that there are enough personnel available to do both the normal forecasting work and the watches and warnings required by the severe event. If a local office becomes unable to provide forecast and warning functions, an adjacent office will temporarily assume those duties by calling in extra personnel to handle the workload of both offices.

Alternatively, under a clustered peer office structure, several offices with the same type of weather and warning responsibilities, climate, and customers would be grouped in a cluster. Offices within a cluster would share the workload associated with routine services, such as 7-day forecasts. During a high-impact weather event--such as a severe storm, flood, or wildfire--the offices would redistribute the workload to allow the impacted office to focus solely on the event, while the other offices in the cluster would pick up the impacted office's routine services. In this way, peer offices could help supplement staffing needs, and the workload across multiple offices could be balanced more efficiently.

After receiving this proposal, the NWS corporate board chartered another team to develop a prototype of the clustered peer idea to evaluate the benefits of this approach. The team plans to recommend the scope of the prototype and select several weather offices for the prototype demonstration by the end of September 2006. It also plans to conduct the prototype demonstration in fiscal years 2007 and 2008. Initial prototype results are due in fiscal year 2009.

Impacts of New Concept of Operations Have Yet to Be Determined:

Many details about the impact of the changes on NWS forecast services, staffing, and budget have yet to be determined. Sound decision making on moving forward with a new concept of operations will require data on the relative costs, benefits, and impacts of such a change, but at this time the implications of NWS's revised concept of operations for staffing, budget, and forecasting services are unknown. The charter for the team developing the prototype for the new concept of operations calls for it to identify metrics for evaluating the prototype and to define mechanisms for obtaining customer feedback. However, the team has not yet established a plan or timeline for developing these metrics or mechanisms. Further, it is not yet evident that these metrics will include the relative costs, benefits, or impacts of this change or which customers will be offered the opportunity to provide feedback. This is not consistent with the last time NWS undertook a major change to its concept of operations--during its modernization in the mid-1990s.
During that effort, the agency developed a detailed process for identifying impacts and ensuring that there would be no degradation of service (see app. III for a summary of this prior process). Until it establishes plans, timelines, and metrics for evaluating its prototype of a revised concept of operations, NWS is not able to ensure that it is on track to gather the information it needs to fully evaluate the merits of the revised concept of operations and to make sound and informed decisions on a new office configuration.

Conclusions:

NWS is appropriately positioning itself to improve its forecasting services by upgrading its systems and technologies and by developing training to enhance the performance of its professional staff. Over the next few years, NWS expects to improve all of its 14 performance measures--ranging from seasonal temperature forecasts, to severe weather warnings, to specialized aviation and marine weather warnings. However, it is not clear that NWS is consistently choosing the best training courses to improve its performance because the training selection process does not rigorously review the training justifications.

Recognizing that high labor costs, difficulty in training and developing its employees, and a constrained budget environment make it difficult to fulfill its mission, NWS is evaluating changes to its office structure and operations in order to achieve greater productivity and efficiency. It plans to develop a prototype of a new concept of operations that entails sharing responsibilities among a cluster of offices. Because it is early in the prototype process, the implications of these plans on staffing, budget, and forecasting services are unknown at this time. However, NWS does not yet have detailed plans, timelines, or measures for assessing the prototype. As a result, NWS risks not gathering the information it needs to make an informed decision in moving forward with a new office operational structure.

Recommendations for Executive Action:

To improve NWS's ability to achieve planned service improvements, we recommend that the Secretary of Commerce direct the Assistant Administrator for Weather Services to take the following three actions:

* require training officials to validate the accuracy of training justifications;

* establish key activities, timelines, and measures for evaluating the "clustered peer" office structure prototype before beginning the prototype; and

* ensure that plans for evaluating the prototype address the impact of any changes on budget, staffing, and services.

Agency Comments:

We received written comments on a draft of this report from the Department of Commerce (see app. IV). In the department's response, the Deputy Secretary of Commerce agreed with our recommendations and identified plans for implementing them. Specifically, the department noted that it plans to revise its training process to ensure limited training resources continue to target improvements in NWS performance. The department also noted that the concept of operations working team is developing a plan for the prototype and stated that this plan will include the items we recommended. The department also provided technical corrections, which we have incorporated as appropriate.

We are sending copies of this report to the Secretary of Commerce, the Director of the Office of Management and Budget, and other interested congressional committees. Copies will be made available to others on request.
In addition, this report will be available at no charge on our Web site at [Hyperlink, http://www.gao.gov]. If you have any questions about this report, please contact me at (202) 512-9286 or by e-mail at pownerd@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix V. Signed by: David A. Powner: Director, Information Technology Management Issues: [End of section] Appendix I: Objectives, Scope, and Methodology: Our objectives were (1) to evaluate the National Weather Service's (NWS) efforts to achieve improvements in the delivery of its services through upgrades to its systems, models, and computational abilities; (2) to assess the agency's plans to achieve improvements in the delivery of its services through the training and professional development of its employees; and (3) to evaluate the agency's plans for revising its nationwide office configuration and the implications of these plans on local forecasting services, staffing, and budgets. To evaluate NWS's efforts to achieve service improvements through system and technology upgrades, we reviewed the agency's system development plans and discussed system-specific plans with NWS program officials. We assessed system-specific documentation justifying system upgrades to evaluate whether these upgrades were linked to anticipated improvements in performance goals. We also evaluated NWS performance goals and identified the extent to which anticipated service improvements were tied to system and technology upgrades. We interviewed National Oceanic and Atmospheric Administration (NOAA) and NWS officials to obtain clarification on agency plans and goals. To assess NWS's plans for achieving service improvements through the training and professional development of its employees, we reviewed NWS policies and plans for training and professional development. We reviewed the agency's service performance goals and assessed the link between those goals and planned and expected training and professional development activities. We also interviewed NWS officials responsible for training and professional development activities. To evaluate the status and potential impact of any plans to revise the national office configuration, we assessed studies of options for changing the NWS concept of operations. We also reviewed the charter for the prototype and interviewed key NWS officials to determine the possible effect of these plans on local forecasting services, staffing, and budgets and to identify plans for determining the implications of changing to a new concept of operations. We performed our work at NWS headquarters in the Washington, D.C., metropolitan area, and at geographically diverse NOAA and NWS weather forecast offices in Denver and in Tampa, and at the NWS National Hurricane Center in Miami. We performed our work from October 2005 to June 2006 in accordance with generally accepted government auditing standards. [End of section] Appendix II: NWS Performance Goals for Fiscal Years 2005 to 2011: Performance measure: Tornado warning lead time (minutes); Description: The difference between the time a warning is issued and the time of the first report of a tornado in a given county; FY05: Goal: 13; FY05: Final actual: 13; FY06: Goal: 13; FY06: Actual to date: 13[A]; FY07: Goal: 14; FY08: Goal: 15; FY09: Goal: 15; FY10: Goal: 15; FY11: Goal: 15. 
Performance measure: Tornado warning accuracy (percent); Description: The percentage of time a tornado actually occurred in an area covered by a tornado warning; FY05: Goal: 73; FY05: Final actual: 75; FY06: Goal: 76; FY06: Actual to date: 82[A]; FY07: Goal: 76; FY08: Goal: 76; FY09: Goal: 76; FY10: Goal: 76; FY11: Goal: 76. Performance measure: Tornado warning false-alarm rate (percent); Description: The percentage of time a tornado warning was issued but no tornado event was reported; FY05: Goal: 73; FY05: Final actual: 77; FY06: Goal: 75; FY06: Actual to date: 76[A]; FY07: Goal: 74; FY08: Goal: 74; FY09: Goal: 74; FY10: Goal: 74; FY11: Goal: 74. Performance measure: Flash flood warning lead time (minutes); Description: The difference between the time a warning is issued and the time of the first report of a flash flood in a given county; FY05: Goal: 48; FY05: Final actual: 54; FY06: Goal: 48; FY06: Actual to date: 63[A]; FY07: Goal: 49; FY08: Goal: 49; FY09: Goal: 49; FY10: Goal: 49; FY11: Goal: 49. Performance measure: Flash flood warning accuracy (percent); Description: The percentage of time a flash flood actually occurred in an area covered by a flash flood warning; FY05: Goal: 89; FY05: Final actual: 88; FY06: Goal: 89; FY06: Actual to date: 93[A]; FY07: Goal: 90; FY08: Goal: 90; FY09: Goal: 90; FY10: Goal: 90; FY11: Goal: 90. Performance measure: Marine wind speed forecast accuracy (percent); Description: A measure of the accuracy of wind speed forecasts; FY05: Goal: 57; FY05: Final actual: 57; FY06: Goal: 58; FY06: Actual to date: 56[B]; FY07: Goal: 58; FY08: Goal: 58; FY09: Goal: 59; FY10: Goal: 59; FY11: Goal: 59. Performance measure: Marine wave height forecasts accuracy (percent); Description: A measure of the accuracy of wave forecasts; FY05: Goal: 67; FY05: Final actual: 67; FY06: Goal: 68; FY06: Actual to date: 71[B]; FY07: Goal: 68; FY08: Goal: 68; FY09: Goal: 69; FY10: Goal: 69; FY11: Goal: 69. Performance measure: Aviation forecast Instrument Flight Rule ceiling/visibility accuracy (percent); Description: The percentage of time Instrument Flight Rule conditions[E] are predicted and occur; FY05: Goal: 46; FY05: Final actual: 46; FY06: Goal: 47; FY06: Actual to date: 45[B]; FY07: Goal: 48; FY08: Goal: 51; FY09: Goal: 52; FY10: Goal: 53; FY11: Goal: 59. Performance measure: Aviation forecast Instrument Flight Rule ceiling/visibility false-alarm rate (percent); Description: The percentage of time Instrument Flight Rule conditions[E] are predicted but do not occur; FY05: Goal: 68; FY05: Final actual: 63; FY06: Goal: 65; FY06: Actual to date: 61[B]; FY07: Goal: 64; FY08: Goal: 58; FY09: Goal: 57; FY10: Goal: 56; FY11: Goal: 50. Performance measure: Winter storm warning lead time (hours); Description: The average time from the issuance of a warning to the time of the first report of a winter storm in a given county; FY05: Goal: 15; FY05: Final actual: 17; FY06: Goal: 15; FY06: Actual to date: 16[C]; FY07: Goal: 15; FY08: Goal: 15; FY09: Goal: 16; FY10: Goal: 17; FY11: Goal: 17. Performance measure: Winter storm warning accuracy (percent); Description: The percentage of verified winter storm events that were covered by winter storm warnings; FY05: Goal: 90; FY05: Final actual: 91; FY06: Goal: 90; FY06: Actual to date: 91[C]; FY07: Goal: 90; FY08: Goal: 90; FY09: Goal: 91; FY10: Goal: 92; FY11: Goal: 92.
Performance measure: Precipitation forecast day 1 threat (score); Description: A score based on the agency's accuracy in forecasting precipitation; FY05: Goal: 27; FY05: Final actual: 29; FY06: Goal: 28; FY06: Actual to date: 39[D]; FY07: Goal: 29; FY08: Goal: 29; FY09: Goal: 29; FY10: Goal: 30; FY11: Goal: 30. Performance measure: U.S. seasonal temperature forecast skill (score); Description: A score based on the agency's accuracy in forecasting temperature; FY05: Goal: 18; FY05: Final actual: 19; FY06: Goal: 18; FY06: Actual to date: 24[D]; FY07: Goal: 19; FY08: Goal: 19; FY09: Goal: 19; FY10: Goal: 20; FY11: Goal: 20. Performance measure: Hurricane track forecasts at 48 hours (nautical miles); Description: A measure of the difference between the projected locations of the center of storms and the actual locations in nautical miles for the Atlantic Basin; FY05: Goal: 128; FY05: Final actual: 101; FY06: Goal: 111; FY06: Actual to date: N/A[F]; FY07: Goal: 110; FY08: Goal: 109; FY09: Goal: 108; FY10: Goal: 107; FY11: Goal: 106. Source: GAO analysis of NOAA and NWS reports. [A] Metric measured between October 2005 and January 2006. [B] Metric measured between October 2005 and February 2006. [C] Metric measured between October 2005 and December 2005. [D] Metric measured between October 2005 and March 2006. [E] Instrument Flight Rules take effect when ceilings and visibilities are less than 1,000 feet and/or 3 miles, respectively, and ceilings and visibilities are greater than, or equal to, 500 feet and/or 1 mile, respectively. [F] Data for this metric are not available until the beginning of the next calendar year because of the timing of the hurricane season. [End of Table] [End of section] Appendix III: NWS Previously Used A Stringent Process to Ensure Service Was Not Degraded: In the 1980s, NWS began a nationwide modernization program to upgrade weather observing systems such as satellites and radars, to design and develop advanced computer workstations for forecasters, and to reorganize its field office structure. The goals of the modernization were to achieve more uniform weather services across the nation, improve forecasting, provide more reliable detection and prediction of severe weather and flooding, achieve higher productivity, and permit more cost-effective operations through staff and office reductions. NWS's plans for revising its office structure were governed by the Weather Service Modernization Act,[Footnote 13] which required that, prior to closing a field office, the Secretary of Commerce certify that there was no degradation of service. NWS developed a plan for complying with the law. To identify community concerns regarding modernization changes and to study the potential for degradation of service, the Department of Commerce published a notice in the Federal Register requesting comments on service areas where it was believed that services could be degraded by planned modernization changes. The department also contracted for an independent assessment by the National Research Council on whether weather services would be degraded by the proposed changes. As part of this assessment, the contractor developed criteria to identify whether service would be degraded in certain areas of concern. The department then applied these criteria to areas of concern to determine whether services would be degraded or not. Before closing any office, the Secretary of Commerce certified that services would not be degraded. 
[End of section]

Appendix IV: Comments from the Department of Commerce:

The Deputy Secretary of Commerce: Washington, D.C. 20230:

June 29, 2006:

Mr. David A. Powner: Director, Information Technology Management Issues: U.S. Government Accountability Office: 441 G Street, NW: Washington, D.C. 20548:

Dear Mr. Powner:

Thank you for the opportunity to review and comment on the Government Accountability Office's draft report entitled Weather Forecasting: National Weather Service Is Planning to Improve Service and Gain Efficiency, but Impacts of Potential Changes Are Not Yet Known (GAO-06-792). I enclose the Department of Commerce's comments to the draft report.

Sincerely,

Signed by: David A. Sampson:

Enclosure:

Department of Commerce's Comments on the Draft GAO Report Entitled "Weather Forecasting: National Weather Service Is Planning to Improve Service and Gain Efficiency, but Impacts of Potential Changes Are Not Yet Known" (GAO-06-792/July 2006):

General Comments:

The Department of Commerce appreciates the opportunity to review this report. We commend the Government Accountability Office (GAO) staff for conducting a thorough examination. The report captures the major elements regarding the National Weather Service's (NWS) delivery of services and plans for future improvements. NWS is striving to meet America's rapidly growing weather, water, and climate needs.

NOAA Response to GAO Recommendations:

The draft GAO report states, "To improve NWS's ability to achieve planned service improvements, we recommend that the Secretary of Commerce direct the Assistant Administrator for Weather Services to take the following three actions:

Recommendation 1: "...require training officials to validate the accuracy of training justifications;"

NOAA Response: NWS agrees with this recommendation and is now working to revise the present training process to ensure limited training resources continue to be targeted to those competencies most directly supporting NWS performance requirements.

Recommendations 2 and 3: "...establish key activities, timelines, and measures for evaluating the "clustered peer" office structure prototype, and ensure that plans for evaluating the prototype address the impact of any changes on budget, staffing, and services."

NOAA Response: NWS agrees with recommendations 2 and 3. Prototyping the "clustered peer" office structure, as described on page 32 of the GAO report, is one activity under an NWS initiative to explore a new concept of agency operations. During the past year, NWS has undertaken this work along with initiatives in aviation and information technology to ensure its services meet America's rapidly growing weather, water, and climate needs in the most efficient manner. NWS has formed a Concept of Operations Team, an Aviation Team, and an Information Technology Team to focus on the three initiatives. Also, NWS has established ground rules for the new initiatives that include no degradation of service, no reduction in the number of offices, and equitable services across the Nation. NWS has formed a Coordination Team, which created a high-level plan to oversee the prototyping efforts of the three teams. Within this effort, the Concept of Operations Team is working on the plan for the prototype. This plan will include key activities and timelines for conducting and evaluating the prototype.
Planning for a new concept of operations and other new initiatives is still in its early stages, with plans for establishing criteria and metrics for evaluating the prototype, and defining mechanisms for obtaining customer feedback, still under development. We intend to leverage the lessons learned from the 1990s NWS modernization initiative in finalizing our plans for developing and prototyping the new concept of operations, including applying similar rigorous processes for assessing impacts and ensuring no degradation of service, as described on pages 33 and 40 of the draft GAO report. Given our commitment to consistent levels of customer service across the Nation, we will ensure a rigorous and thorough assessment process is followed in evaluating the revised concept of operations prototype to ensure NOAA has adequate information to make an informed implementation decision.

[End of section]

Appendix V: GAO Contact and Staff Acknowledgments:

GAO Contact: David A. Powner, (202) 512-9286 or pownerd@gao.gov.

Staff Acknowledgments: In addition to the contact named above, William Carrigg, Barbara Collier, Neil Doherty, Kathleen S. Lovett, Colleen Phillips, Karen Talley, and Jessica Waselkow made key contributions to this report.

FOOTNOTES

[1] Doppler radar is used to determine the speed and direction of rain or snow particles, cloud droplets, or dust moving toward or away from the radar. The radar accomplishes this by sending out a pulse using a stable frequency and then measuring the changing frequencies as the distance between the radar and the object changes.

[2] GOES has historically been a joint program between NOAA and the National Aeronautics and Space Administration (NASA), with NOAA funding and managing the program and NASA providing engineering and launch capabilities.

[3] Satellites in a series are identified by letters of the alphabet when they are on the ground and by numbers once they are in orbit.

[4] GAO, Polar-orbiting Operational Environmental Satellites: Cost Increases Trigger Review and Place Program's Direction on Hold, GAO-06-573T (Washington, D.C.: Mar. 30, 2006); Polar-orbiting Operational Environmental Satellites: Technical Problems, Cost Increases, and Schedule Delays Trigger Need for Difficult Tradeoff Decisions, GAO-06-249T (Washington, D.C.: Nov. 16, 2005); Polar-orbiting Environmental Satellites: Information on Program Cost and Schedule Changes, GAO-04-1054 (Washington, D.C.: Sept. 30, 2004); Polar-orbiting Environmental Satellites: Project Risks Could Affect Weather Data Needed by Civilian and Military Users, GAO-03-987T (Washington, D.C.: July 15, 2003); and Polar-orbiting Environmental Satellites: Status, Plans, and Future Data Management Challenges, GAO-02-684T (Washington, D.C.: July 24, 2002).

[5] See, for example, GAO, Weather Forecasting: Improvements Needed in Laboratory Software Development Process, GAO/AIMD-95-24 (Washington, D.C.: Dec. 14, 1994); Weather Forecasting: Unmet Needs and Unknown Costs Warrant Reassessment of Observing System Plans, GAO/AIMD-95-81 (Washington, D.C.: Apr. 21, 1995); Weather Forecasting: Radar Availability Requirement Not Being Met, GAO/AIMD-95-132 (Washington, D.C.: May 31, 1995); Weather Forecasting: Radars Far Superior to Predecessors, but Location and Availability Questions Remain, GAO/T-AIMD-96-2 (Washington, D.C.: Oct. 17, 1995); Weather Forecasting: New Processing System Faces Uncertainties and Risks, GAO/T-AIMD-96-47 (Washington, D.C.: Feb. 29, 1996); Weather Forecasting: Recommendations to Address New Weather Processing System Development Risks, GAO/AIMD-96-74 (Washington, D.C.: May 13, 1996); and Weather Satellites: Planning for the Geostationary Satellite Program Needs More Attention, GAO/AIMD-97-37 (Washington, D.C.: Mar. 13, 1997).

[6] GAO, High-Risk Series: An Overview, GAO/HR-95-1 (Washington, D.C.: February 1995); High-Risk Series: Information Management and Technology, GAO/HR-97-9 (Washington, D.C.: February 1997); High-Risk Series: An Update, GAO/HR-99-1 (Washington, D.C.: January 1999); and High-Risk Series: An Update, GAO-01-263 (Washington, D.C.: January 2001).

[7] The Government Performance and Results Act of 1993 (Pub. L. 103-62) was intended to improve federal program effectiveness, accountability, and service delivery by requiring federal agencies to develop strategic plans with long-term, outcome-oriented goals and objectives; annual performance goals linked to the long-term goals; and annual reports on actual results.

[8] This cost estimate includes the expected cost of key system upgrades, as well as estimated annual costs for improvements to AWIPS software and numerical models through the year 2011. It does not include the expected costs of supercomputer upgrades because NWS does not estimate what portion of its $26 million annual supercomputer budget is attributable to upgrades.

[9] Pub. L. 103-62, 107 Stat. 285 (1993).

[10] OMB requires agencies to annually submit documentation, called an exhibit 300, justifying major information technology initiatives or improvements.

[11] Because there is no precise definition of the term "office configuration," we have defined it as NWS's current number of offices, the location of the offices, hours worked at each of the offices, and the services and functions provided at each of the offices.

[12] NWS's Corporate Board is chaired by the Director of the National Weather Service and made up of senior officials responsible for different aspects of the agency's mission, including the Chief Information Officer and the Directors of the Office of Climate, Water, and Weather Services; the National Centers for Environmental Prediction; and the NWS Regions. It meets at least twice annually to discuss the NWS budget and other strategic issues. It also holds special meetings, as needed, to focus on NWS issues, for example, postevent assessments of major weather services such as the assessment of weather services during Hurricane Charley in 2004.

[13] Pub. L. 102-567 § 706(b), 106 Stat. 4303, 4306 (1992).

GAO's Mission:

The Government Accountability Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases.
You can print these documents in their entirety, including charts and other graphics. Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to e-mail alerts" under the "Order GAO Products" heading. Order by Mail or Phone: The first copy of each printed report is free. Additional copies are $2 each. A check or money order should be made out to the Superintendent of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or more copies mailed to a single address are discounted 25 percent. Orders should be sent to: U.S. Government Accountability Office 441 G Street NW, Room LM Washington, D.C. 20548: To order by Phone: Voice: (202) 512-6000: TDD: (202) 512-2537: Fax: (202) 512-6061: To Report Fraud, Waste, and Abuse in Federal Programs: Contact: Web site: www.gao.gov/fraudnet/fraudnet.htm E-mail: fraudnet@gao.gov Automated answering system: (800) 424-5454 or (202) 512-7470: Public Affairs: Jeff Nelligan, managing director, NelliganJ@gao.gov (202) 512-4800 U.S. Government Accountability Office, 441 G Street NW, Room 7149 Washington, D.C. 20548:
