GAO-10-701, Information Technology: OMB's Dashboard Has Increased Transparency and Oversight, but Improvements Needed
This is the accessible text file for GAO report number GAO-10-701
entitled 'Information Technology: OMB's Dashboard Has Increased
Transparency and Oversight, but Improvements Needed' which was
released on July 20, 2010.
This text file was formatted by the U.S. Government Accountability
Office (GAO) to be accessible to users with visual impairments, as
part of a longer term project to improve GAO products' accessibility.
Every attempt has been made to maintain the structural and data
integrity of the original printed product. Accessibility features,
such as text descriptions of tables, consecutively numbered footnotes
placed at the end of the file, and the text of agency comment letters,
are provided but may not exactly duplicate the presentation or format
of the printed version. The portable document format (PDF) file is an
exact electronic replica of the printed version. We welcome your
feedback. Please E-mail your comments regarding the contents or
accessibility features of this document to Webmaster@gao.gov.
This is a work of the U.S. government and is not subject to copyright
protection in the United States. It may be reproduced and distributed
in its entirety without further permission from GAO. Because this work
may contain copyrighted images or other material, permission from the
copyright holder may be necessary if you wish to reproduce this
material separately.
Report to Congressional Requesters:
United States Government Accountability Office:
GAO:
July 2010:
Information Technology:
OMB's Dashboard Has Increased Transparency and Oversight, but
Improvements Needed:
GAO-10-701:
GAO Highlights:
Highlights of GAO-10-701, a report to congressional requesters.
Why GAO Did This Study:
Federal IT spending has risen to an estimated $79 billion for fiscal
year 2011. To improve transparency and oversight of this spending, in
June 2009 the Office of Management and Budget (OMB) deployed a public
website, known as the IT Dashboard, which provides information on
federal agencies' major IT investments, including assessments of
actual performance against cost and schedule targets (referred to as
ratings). According to OMB, these data are intended to provide both a
near real-time and historical perspective of the performance of these
investments.
GAO was asked to (1) examine the accuracy of the cost and schedule
performance ratings on the Dashboard for selected investments and (2)
determine whether the data on the Dashboard are used as a management
tool to make improvements to IT investments. To do so, GAO selected 8
major investments from 5 agencies with large IT budgets, compared its
analyses of the selected investments' performance to the ratings on
the Dashboard, and interviewed agency officials about their use of the
Dashboard to manage investments.
What GAO Found:
The cost and schedule ratings on OMB's Dashboard were not always
accurate for the selected investments. GAO found that 4 of the 8
selected investments had notable discrepancies on either their cost or
schedule ratings. For example, the Dashboard indicated one investment
had a less than 5 percent variance on cost every month from July 2009
through January 2010. GAO's analysis shows the investment's cost
performance in December 2009 through January 2010 had a variance of 10
percent to less than 15 percent. Additionally, another investment on
the Dashboard reported that it had been less than 30 days behind
schedule since July 2009. However, investment data GAO examined showed
that from September to December 2009 it was behind schedule greater
than or equal to 30 days and less than 90 days.
A primary reason for the data inaccuracies was that while the
Dashboard was intended to represent near real-time performance
information, the cost and schedule ratings did not take into
consideration current performance. As a result, the ratings were based
on outdated information. For example, cost ratings for each of the
investments were based on data between 2 months and almost 2 years
old. As of July 1, 2010, OMB planned to release later that month an
updated version of the Dashboard that includes ratings that factor in
the performance of ongoing milestones. Another issue with the ratings was
the wide variation in the number of milestones agencies reported,
which was partly because OMB's guidance to agencies was too general.
Having too many milestones can mask recent performance problems
because the performance of every milestone (dated and recent) is
equally averaged into the ratings. Specifically, investments that
perform well during many previously completed milestones and then
start performing poorly on a few recently completed milestones can
maintain ratings that still reflect good performance. Conversely,
having too few milestones limits the amount of information available
to rate performance and allows agencies to potentially skew the
ratings. OMB officials stated that they have recently chartered a
working group with the intention of developing guidance for
standardizing milestone reporting. However, until such guidance is
available, the ratings may continue to have accuracy issues.
Officials at three of the five agencies stated they were not using the
Dashboard to manage their investments because they maintained that they
already had existing means to do so; officials at the other two
agencies indicated that they were using the Dashboard to supplement
their existing management processes. OMB officials indicated that they
relied on the Dashboard as a management tool, including using the
Dashboard's investment trend data to identify and address issues with
investments' performance. According to OMB officials, the Dashboard
was one of the key sources of information that they used to determine
if an investment requires additional oversight. In addition, the
Federal Chief Information Officer (CIO) stated that the Dashboard has
greatly improved oversight capabilities compared to previously used
mechanisms. He also stated that the Dashboard has increased the
accountability of agencies' CIOs and established much-needed
visibility.
What GAO Recommends:
GAO recommends that OMB report on its planned changes to the Dashboard
to improve the accuracy of performance information and provide
guidance to agencies that standardizes milestone reporting. OMB agreed
with these recommendations but disagreed with other aspects of the
draft report, which GAO addressed as appropriate.
View [hyperlink, http://www.gao.gov/products/GAO-10-701] or key
components. For more information, contact David A. Powner at (202) 512-
9286 or pownerd@gao.gov.
[End of section]
Contents:
Letter:
Background:
Performance Ratings on the Dashboard Were Not Always Accurate:
Use of the Dashboard as a Management Tool Varies:
Conclusions:
Recommendations for Executive Action:
Agency Comments and Our Evaluation:
Appendix I: Objectives, Scope, and Methodology:
Appendix II: Selected Investment Descriptions:
Appendix III: Comments from the Office of Management and Budget:
Appendix IV: Comments from the Department of Energy:
Appendix V: GAO Contact and Staff Acknowledgments:
Tables:
Table 1: Management Watch List Projects and Their Associated Budgets
for Fiscal Years 2004-2009:
Table 2: Dashboard Cost Rating Scale:
Table 3: Dashboard Schedule Rating Scale:
Table 4: Selected Investments Whose Program Officials Reported a Lower
Number of Rebaselines Than What Was Reported on the Dashboard:
Table 5: Number of Days the Cost Rating is Outdated for Each Selected
Investment:
Table 6: Number of Milestones per Selected Investment:
Table 7: Investment Management Details:
Figures:
Figure 1: Dashboard Overall Ratings:
Figure 2: Comparison of Selected Investments' Dashboard Cost Ratings
to GAO's Ratings:
Figure 3: Comparison of Selected Investments' Dashboard Schedule
Ratings to GAO's Ratings:
Abbreviations:
CIO: chief information officer:
DOD: Department of Defense:
DOE: Department of Energy:
DOJ: Department of Justice:
EVM: earned value management:
HHS: Department of Health and Human Services:
IT: information technology:
OMB: Office of Management and Budget:
USDA: Department of Agriculture:
[End of section]
United States Government Accountability Office:
Washington, DC 20548:
July 16, 2010:
The Honorable Susan M. Collins:
Ranking Member:
Committee on Homeland Security and Governmental Affairs:
United States Senate:
The Honorable Thomas R. Carper:
Chairman:
Subcommittee on Federal Financial Management, Government Information,
Federal Services, and International Security:
Committee on Homeland Security and Governmental Affairs:
United States Senate:
Billions of taxpayer dollars are spent on information technology (IT)
investments each year; federal IT spending has now risen to an
estimated $79 billion for fiscal year 2011. During the past several
years, we have issued several reports and testimonies and made
numerous recommendations to the Office of Management and Budget (OMB)
to improve the transparency, oversight, and management of the federal
government's IT investments.[Footnote 1] In June 2009, OMB deployed a
public website, known as the IT Dashboard, which provides detailed
information on federal agencies' major IT investments, including
assessments of actual performance against cost and schedule targets
(referred to as ratings) for approximately 800 major federal IT
investments. The Dashboard aims to improve the transparency and
oversight of these investments.
This report responds to your request that we (1) examine the accuracy
of the cost and schedule performance ratings on the Dashboard for
selected investments and (2) determine whether the data on the
Dashboard are used as a management tool to make improvements to IT
investments.
To address our first objective, we selected five agencies--the
Departments of Agriculture (USDA), Defense (DOD), Energy (DOE), Health
and Human Services (HHS), and Justice (DOJ)--and ten
investments[Footnote 2] to review. To select these agencies and
investments, we first identified ten agencies with large IT budgets,
and then identified the five largest investments at each of the ten
agencies. In narrowing the list to five agencies and ten total
investments, we considered several factors to ensure there were two
viable investments at each agency, such as selecting investments that
were not part of our ongoing audit work and providing a balance of
investment sizes. We then collected and analyzed monthly investment
performance reports from the ten investments. We compared our analyses
of each investment's performance to the ratings on the Dashboard to
determine if the information was consistent. We also reviewed and
analyzed OMB's and the selected agencies' processes for populating and
updating the Dashboard. Additionally, we interviewed officials from
OMB and the agencies to obtain further information on their efforts to
ensure the accuracy of the data on the Dashboard. We did not test the
adequacy of the agency or contractor cost-accounting systems. Our
evaluation of these cost data was based on the documentation the
agencies provided.
To address our second objective, we interviewed officials and analyzed
documentation at the selected agencies to determine the extent to
which they use the data on the Dashboard to make management decisions.
We also attended one of OMB's TechStat sessions, which are reviews of
selected IT investments between OMB and agencies.
We conducted this performance audit from January to July 2010, in
accordance with generally accepted government auditing standards.
Those standards require that we plan and perform the audit to obtain
sufficient, appropriate evidence to provide a reasonable basis for our
findings and conclusions based on our audit objectives. We believe
that the evidence obtained provides a reasonable basis for our
findings and conclusions based on our audit objectives. Further
details of our objectives, scope, and methodology are provided in
appendix I.
Background:
Each year, OMB and federal agencies work together to determine how
much the government plans to spend on IT projects and how these funds
are to be allocated. Planned federal IT spending has now risen to an
estimated $79.4 billion for fiscal year 2011, a 1.2 percent increase
from the 2010 level of $78.4 billion. OMB plays a key role in helping
federal agencies manage their investments by working with them to
better plan, justify, and determine how much they need to spend on
projects and how to manage approved projects.
To assist agencies in managing their investments, Congress enacted the
Clinger-Cohen Act of 1996, which requires OMB to establish processes
to analyze, track, and evaluate the risks and results of major capital
investments in information systems made by federal agencies and report
to Congress on the net program performance benefits achieved as a
result of these investments.[Footnote 3] Further, the act places
responsibility for managing investments with the heads of agencies and
establishes chief information officers (CIO) to advise and assist
agency heads in carrying out this responsibility. Another key law is
the E-Government Act of 2002, which requires OMB to report annually to
Congress on the status of e-government.[Footnote 4] In these reports,
referred to as the Implementation of the E-Government Act reports, OMB
is to describe the Administration's use of e-government principles to
improve government performance and the delivery of information and
services to the public.
To help carry out its oversight role and assist the agencies in
carrying out their responsibilities as assigned by the Clinger-Cohen
Act, OMB developed a Management Watch List in 2003 and a High Risk
List in 2005 to focus executive attention and to ensure better
planning and tracking of major IT investments. Consistent with the
Clinger-Cohen Act, OMB reported on the status of investments on the
Management Watch List and High Risk List in its annual budget
documents.
Over the past several years, we have reported and testified on OMB's
initiatives to highlight troubled projects, justify investments, and
use project management tools.[Footnote 5] We have made multiple
recommendations to OMB and federal agencies to improve these
initiatives to further enhance the oversight and transparency of
federal projects. Among other things, we recommended that OMB develop
a central list of projects and their deficiencies and analyze that
list to develop governmentwide and agency assessments of the progress
and risks of the investments, identifying opportunities for continued
improvement.[Footnote 6] In addition, in 2006 we also recommended that
OMB develop a single aggregate list of high-risk projects and their
deficiencies and use that list to report to Congress on progress made
in correcting high-risk problems.[Footnote 7] As a result, OMB started
publicly releasing aggregate data on its Management Watch List and
disclosing the projects' deficiencies. Furthermore, OMB issued
governmentwide and agency assessments of the projects on the
Management Watch List and identified risks and opportunities for
improvement, including risk management and security.
Table 1 provides a historical perspective of the number of projects on
the Management Watch List and their associated budgets for the period
of time during which OMB updated the Management Watch List. The table
shows that while the number of projects and their associated budgets
on the list generally decreased, the number of projects on the
Management Watch List increased by 239 projects and $13 billion for
fiscal year 2009, and represented a significant percentage of the
total budget.
Table 1: Management Watch List Projects and Their Associated Budgets
for Fiscal Years 2004-2009:
Fiscal year: 2004;
Number of major federal IT projects: 1,400;
Associated budget: $59.0 billion;
Number of Management Watch List projects: 771;
Associated budget: $20.9 billion;
Percentage of federal IT projects on Management Watch List: 55;
Percentage of budget: 35.
Fiscal year: 2005;
Number of major federal IT projects: 1,200;
Associated budget: $60.0 billion;
Number of Management Watch List projects: 621;
Associated budget: $22.0 billion;
Percentage of federal IT projects on Management Watch List: 52;
Percentage of budget: 37.
Fiscal year: 2006;
Number of major federal IT projects: 1,087;
Associated budget: $65.0 billion;
Number of Management Watch List projects: 342;
Associated budget: $15.0 billion;
Percentage of federal IT projects on Management Watch List: 31;
Percentage of budget: 23.
Fiscal year: 2007;
Number of major federal IT projects: 857;
Associated budget: $64.0 billion;
Number of Management Watch List projects: 263;
Associated budget: $9.9 billion;
Percentage of federal IT projects on Management Watch List: 31;
Percentage of budget: 15.
Fiscal year: 2008;
Number of major federal IT projects: 840;
Associated budget: $65.0 billion;
Number of Management Watch List projects: 346;
Associated budget: $14.0 billion;
Percentage of federal IT projects on Management Watch List: 41;
Percentage of budget: 22.
Fiscal year: 2009;
Number of major federal IT projects: 810;
Associated budget: $70.7 billion;
Number of Management Watch List projects: 585;
Associated budget: $27.0 billion;
Percentage of federal IT projects on Management Watch List: 72;
Percentage of budget: 38.
Source: GAO analysis of OMB data.
[End of table]
OMB's Dashboard Publicizes Investment Details and Performance Status:
More recently, to further improve the transparency into and oversight
of agencies' IT investments, and to address data quality issues, in
June 2009, OMB publicly deployed a Web site, known as the IT
Dashboard, which replaced the Management Watch List and High Risk
List. It displays information on federal agencies' cost, schedule, and
performance information for the approximately 800 major federal IT
investments at 28 federal agencies. According to OMB, these data are
intended to provide a near real-time perspective of the performance of
these investments, as well as a historical perspective. Further, the
public display of these data is intended to allow OMB, other
oversight bodies, and the general public to hold government agencies
accountable for results and progress.
The Dashboard was initially deployed in June 2009 based on each
agency's Exhibit 53 and Exhibit 300 submissions.[Footnote 8] After the
initial population of data, agency CIOs have been responsible for
updating cost, schedule, and performance fields on a monthly basis,
which is a major improvement from the quarterly reporting cycle OMB
previously used for the Management Watch List and High Risk List.
For each major investment, the Dashboard provides performance ratings
on cost and schedule, a CIO evaluation, and an overall rating which is
based on the cost, schedule, and CIO ratings. The cost rating is
determined by a formula that calculates the amount by which an
investment's aggregated actual costs deviate from the aggregated
planned costs. Table 2 displays the rating scale and associated
category for cost variations.
Table 2: Dashboard Cost Rating Scale:
Variance from planned costs: Less than 5%;
Rating: 10;
Category: Green.
Variance from planned costs: 5% or more and less than 10%;
Rating: 9;
Category: Green.
Variance from planned costs: 10% or more and less than 15%;
Rating: 8;
Category: Yellow.
Variance from planned costs: 15% or more and less than 20%;
Rating: 7;
Category: Yellow.
Variance from planned costs: 20% or more and less than 25%;
Rating: 6;
Category: Yellow.
Variance from planned costs: 25% or more and less than 30%;
Rating: 5;
Category: Yellow.
Variance from planned costs: 30% or more and less than 35%;
Rating: 4;
Category: Red.
Variance from planned costs: 35% or more and less than 40%;
Rating: 3;
Category: Red.
Variance from planned costs: 40% or more and less than 45%;
Rating: 2;
Category: Red.
Variance from planned costs: 45% or more and less than 50%;
Rating: 1;
Category: Red.
Variance from planned costs: 50% or more;
Rating: 0;
Category: Red.
Source: OMB's Dashboard.
Note: Green = Normal; Yellow = Needs attention; Red = Significant
concerns.
[End of table]
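The scale above is simple banded arithmetic: each 5 percent band of
variance below 50 percent lowers the rating by one point. The following
Python fragment is an illustrative sketch of the published scale, not
OMB's actual implementation:

def cost_rating(variance_percent: float) -> int:
    # Map an absolute cost variance percentage to the 0-10 scale in
    # table 2: less than 5% maps to 10, 5% or more and less than 10%
    # maps to 9, and so on down to 0 for 50% or more.
    if variance_percent >= 50:
        return 0
    return 10 - int(variance_percent // 5)

assert cost_rating(3) == 10   # less than 5 percent: green
assert cost_rating(12) == 8   # 10 to less than 15 percent: yellow
assert cost_rating(55) == 0   # 50 percent or more: red

[End of code example]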
An investment's schedule rating is calculated by determining the
average days late or early. Table 3 displays the rating scale and
associated category for schedule deviations.
Table 3: Dashboard Schedule Rating Scale:
Average days late: Less than 30;
Rating: 10;
Category: Green.
Average days late: 30 or more and less than 90;
Rating: 5;
Category: Yellow.
Average days late: 90 or more;
Rating: 0;
Category: Red.
Source: OMB's Dashboard.
Note: Green = Normal; Yellow = Needs attention; Red = Significant
concerns.
[End of table]
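The schedule scale can be sketched the same way; again, this
illustrates the published bands rather than OMB's actual
implementation:

def schedule_rating(average_days_late: float) -> int:
    # Map average days late to the schedule scale in table 3.
    if average_days_late < 30:
        return 10  # green: normal
    if average_days_late < 90:
        return 5   # yellow: needs attention
    return 0       # red: significant concerns

assert schedule_rating(10) == 10
assert schedule_rating(45) == 5
assert schedule_rating(120) == 0

[End of code example]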
Each major investment on the Dashboard also includes a rating
determined by the agency CIO, which is based on his or her evaluation
of the performance of each investment. The rating is expected to take
into consideration the following criteria: risk management,
requirements management, contractor oversight, historical performance,
and human capital. This rating is to be updated when new information
becomes available that would impact the assessment of a given
investment.
Lastly, the Dashboard calculates an overall rating for each major
investment. Figure 1 identifies the Dashboard's overall ratings scale.
This overall rating is an average of the cost, schedule, and CIO
ratings, with each representing one-third of the overall rating.
However, when the CIO's rating is lower than both the cost and
schedule ratings, the CIO's rating will be the overall rating. Of the
792 major investments on the Dashboard as of May 2010, 540 (68
percent) were green, 204 (26 percent) were yellow, and 48 (6 percent)
were red.
Figure 1: Dashboard Overall Ratings:
[Refer to PDF for image: illustration]
Source: GAO based on OMB's Dashboard.
[End of figure]
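Assuming the rule exactly as described above, with the CIO's rating
governing only when it is strictly lower than both other ratings, the
overall rating can be sketched as follows:

def overall_rating(cost: float, schedule: float, cio: float) -> float:
    # Average the three ratings unless the CIO's rating is the lowest.
    if cio < cost and cio < schedule:
        return cio
    return (cost + schedule + cio) / 3

assert overall_rating(10, 10, 4) == 4    # CIO rating governs
assert overall_rating(10, 5, 9) == 8.0   # simple average otherwise

[End of code example]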
Earned Value Management Provides Additional Insight on Program Cost
and Schedule:
Earned value management is a technique that integrates the technical,
cost, and schedule parameters of a development contract and measures
progress against them. During the planning phase, a performance
measurement baseline is developed by assigning and scheduling budget
resources for defined work. As work is performed and measured against
the baseline, the corresponding budget value is "earned." Using this
earned value metric, cost and schedule variances, as well as cost and
time to complete estimates, can be determined and analyzed.
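To make the arithmetic concrete, the following sketch computes the
standard earned value variances and performance indices; the dollar
figures are hypothetical and are not drawn from any investment in this
report:

def earned_value_metrics(planned_value, earned_value, actual_cost):
    # Standard earned value calculations: a negative cost variance
    # means the work cost more than budgeted; a negative schedule
    # variance means less work was completed than planned.
    cost_variance = earned_value - actual_cost
    schedule_variance = earned_value - planned_value
    cost_performance_index = earned_value / actual_cost
    schedule_performance_index = earned_value / planned_value
    return (cost_variance, schedule_variance,
            cost_performance_index, schedule_performance_index)

# $900,000 of work planned to date, $800,000 worth actually performed,
# and $1,000,000 actually spent:
cv, sv, cpi, spi = earned_value_metrics(900_000, 800_000, 1_000_000)
print(cv, sv, round(cpi, 2), round(spi, 2))
# -200000 -100000 0.8 0.89 (over cost and behind schedule)

[End of code example]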
Without knowing the planned cost of completed work and work in
progress (i.e., the earned value), it is difficult to determine a
program's true status. Earned value allows for this key information,
which provides an objective view of program status and is necessary
for understanding the health of a program. As a result, earned value
management can alert program managers to potential problems sooner
than using expenditures alone, thereby reducing the chance and
magnitude of cost overruns and schedule slippages. Moreover, earned
value management directly supports the institutionalization of key
processes for acquiring and developing systems and the ability to
effectively manage investments--areas that are often found to be
inadequate on the basis of our assessments of major IT investments. In
August 2005, OMB issued guidance requiring, among other things, that
agencies develop comprehensive policies to ensure that their major and
high-risk development projects use earned value management to manage
their investments.
Performance Ratings on the Dashboard Were Not Always Accurate:
Cost and schedule performance ratings were not always accurate for the
selected investments we reviewed. A key reason for the inaccuracies is
that the Dashboard's cost and schedule ratings do not reflect current
performance. Another issue with the ratings is that large
inconsistencies exist in the number of milestones that agencies report
on the Dashboard.
Cost and Schedule Performance Ratings Were Not Always Accurate:
The cost and schedule performance ratings of selected investments were
not always accurate. There were several instances of inaccurate cost
ratings; however, two investments experienced notable discrepancies
while the other discrepancies were not as dramatic. Specifically, 5 of
the 8 selected investments[Footnote 9] on the Dashboard had inaccurate
cost ratings: BioSense, Financial Management Modernization Initiative,
Joint Precision Approach and Landing System, Law Enforcement Wireless
Communication, and Unified Financial Management System. For example,
the Dashboard rated the Law Enforcement Wireless Communication
investment a 10 for cost (less than 5 percent variance) every month
from July 2009 through January 2010. However, our analysis shows the
investment's cost rating during December 2009 and January 2010 is
equivalent to an 8 (a variance of 10 percent to less than 15 percent).
Accordingly, this investment's cost performance should have been rated
a "yellow" instead of a "green," meaning it needed attention. Further,
the Dashboard's cost rating for the Financial Management Modernization
Initiative reported that this investment was "yellow," while it should
have been "green" for 7 months. Maneuver Control System, Sequoia
Platform, and Risk Management Agency-13 are the three investments that
had accurate cost ratings. Figure 2 shows the comparison of selected
investments' Dashboard cost ratings to GAO's ratings for the months of
July 2009-January 2010.
Figure 2: Comparison of Selected Investments' Dashboard Cost Ratings
to GAO's Ratings:
[Refer to PDF for image: table]
Agency: DOD;
Investment: Joint Precision Approach and Landing System;
July 2009:
Dashboard: 10;
GAO: 9;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 10;
GAO: 10;
December 2009:
Dashboard: 10;
GAO: 10;
January 2010:
Dashboard: 10;
GAO: 10.
Agency: DOD;
Investment: Maneuver Control System;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 10;
GAO: 10;
December 2009:
Dashboard: 10;
GAO: 10;
January 2010:
Dashboard: 10;
GAO: 10.
Agency: DOE;
Investment: Sequoia Platform;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 10;
GAO: 10;
December 2009:
Dashboard: 10;
GAO: 10;
January 2010:
Dashboard: 10;
GAO: 10.
Agency: DOJ;
Investment: Law Enforcement Wireless Communication;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 9;
September 2009:
Dashboard: 10;
GAO: 9;
October 2009:
Dashboard: 10;
GAO: 9;
November 2009:
Dashboard: 10;
GAO: 9;
December 2009:
Dashboard: 10;
GAO: 8 (needs attention);
January 2010:
Dashboard: 10;
GAO: 8 (needs attention).
Agency: DOJ;
Investment: Unified Financial Management System;
July 2009:
Dashboard: 10;
GAO: 9;
August 2009:
Dashboard: 10;
GAO: 9;
September 2009:
Dashboard: 10;
GAO: 9;
October 2009:
Dashboard: 10;
GAO: 9;
November 2009:
Dashboard: 10;
GAO: 9;
December 2009:
Dashboard: 10;
GAO: 9;
January 2010:
Dashboard: 10;
GAO: 9.
Agency: HHS;
Investment: BioSense[A];
July 2009:
Dashboard: 9;
GAO: 10;
August 2009:
Dashboard: [Empty];
GAO: [Empty];
September 2009:
Dashboard: [Empty];
GAO: [Empty];
October 2009:
Dashboard: [Empty];
GAO: [Empty];
November 2009:
Dashboard: [Empty];
GAO: [Empty];
December 2009:
Dashboard: [Empty];
GAO: [Empty];
January 2010:
Dashboard: [Empty];
GAO: [Empty];
Agency: USDA;
Investment: Financial Management Modernization Initiative;
July 2009:
Dashboard: 8 (needs attention);
GAO: 10;
August 2009:
Dashboard: 8 (needs attention);
GAO: 10;
September 2009:
Dashboard: 8 (needs attention);
GAO: 10;
October 2009:
Dashboard: 8 (needs attention);
GAO: 10;
November 2009:
Dashboard: 8 (needs attention);
GAO: 10;
December 2009:
Dashboard: 8 (needs attention);
GAO: 10;
January 2010:
Dashboard: 8 (needs attention);
GAO: 10.
Agency: USDA;
Investment: Risk Management Agency-13;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 10;
GAO: 10;
December 2009:
Dashboard: 10;
GAO: 10;
January 2010:
Dashboard: 10;
GAO: 10.
Note: Ratings of 9 and 10 fall in the normal (green) category.
Source: OMB's Dashboard and GAO analysis.
Note: See appendix II for descriptions of each investment.
[A] BioSense's development work was completed in July 2009. This was
the only month we could assess its performance.
[End of figure]
There were fewer instances of discrepancies with the schedule ratings;
however, these discrepancies were also notable. Specifically, of the 8
selected investments, the Dashboard's schedule ratings were inaccurate
for 2 investments: Risk Management Agency-13 and the Unified Financial
Management System. The Unified Financial Management System's last
completed milestone was in May 2009 and the Dashboard rating for the
investment's schedule has been a 10 since July 2009. However,
investment data we examined showed the schedule rating should have
been a 5 (greater than or equal to 30 days and less than 90 days
behind schedule) from September 2009 through December 2009. As a
result, this investment's schedule performance should have been rated
a "yellow" instead of a "green" for those months. Additionally, the
Dashboard's schedule rating for Risk Management Agency-13 reported
that this investment was "red" for two months, while it should have
been "green," and "yellow" for four months, when it should have been
"green." BioSense, Financial Management Modernization Initiative,
Joint Precision Approach and Landing System, Law Enforcement Wireless
Communication, Maneuver Control System, and Sequoia Platform are the 6
investments that had accurate schedule ratings. Figure 3 shows the
comparison of selected investments' Dashboard schedule ratings to
GAO's ratings for the months of July 2009-January 2010.
Figure 3: Comparison of Selected Investments' Dashboard Schedule
Ratings to GAO's Ratings:
[Refer to PDF for image: table]
Agency: DOD;
Investment: Joint Precision Approach and Landing System;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 10;
GAO: 10;
December 2009:
Dashboard: 10;
GAO: 10;
January 2010:
Dashboard: 10;
GAO: 10.
Agency: DOD;
Investment: Maneuver Control System;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 10;
GAO: 10;
December 2009:
Dashboard: 10;
GAO: 10;
January 2010:
Dashboard: 10;
GAO: 10.
Agency: DOE;
Investment: Sequoia Platform;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 10;
GAO: 10;
December 2009:
Dashboard: 10;
GAO: 10;
January 2010:
Dashboard: 10;
GAO: 10.
Agency: DOJ;
Investment: Law Enforcement Wireless Communication;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 9;
September 2009:
Dashboard: 10;
GAO: 9;
October 2009:
Dashboard: 10;
GAO: 9;
November 2009:
Dashboard: 10;
GAO: 9;
December 2009:
Dashboard: 10;
GAO: 8;
January 2010:
Dashboard: 10;
GAO: 8.
Agency: DOJ;
Investment: Unified Financial Management System;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 5 (needs attention);
October 2009:
Dashboard: 10;
GAO: 5 (needs attention);
November 2009:
Dashboard: 10;
GAO: 5 (needs attention);
December 2009:
Dashboard: 10;
GAO: 5 (needs attention);
January 2010:
Dashboard: 10;
GAO: 10.
Agency: HHS;
Investment: BioSense[A];
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: [Empty];
GAO: [Empty];
September 2009:
Dashboard: [Empty];
GAO: [Empty];
October 2009:
Dashboard: [Empty];
GAO: [Empty];
November 2009:
Dashboard: [Empty];
GAO: [Empty];
December 2009:
Dashboard: [Empty];
GAO: [Empty];
January 2010:
Dashboard: [Empty];
GAO: [Empty];
Agency: USDA;
Investment: Financial Management Modernization Initiative;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 10;
GAO: 10;
September 2009:
Dashboard: 10;
GAO: 10;
October 2009:
Dashboard: 10;
GAO: 10;
November 2009:
Dashboard: 5 (needs attention);
GAO: 5 (needs attention);
December 2009:
Dashboard: 5 (needs attention);
GAO: 5 (needs attention);
January 2010:
Dashboard: 5 (needs attention);
GAO: 5 (needs attention).
Agency: USDA;
Investment: Risk Management Agency-13;
July 2009:
Dashboard: 10;
GAO: 10;
August 2009:
Dashboard: 5;
GAO: 10;
September 2009:
Dashboard: 5;
GAO: 10;
October 2009:
Dashboard: 5;
GAO: 10;
November 2009:
Dashboard: 5;
GAO: 10;
December 2009:
Dashboard: 0 (Significant concerns);
GAO: 10;
January 2010:
Dashboard: 0 (Significant concerns);
GAO: 10.
Note: A rating of 10 falls in the normal (green) category.
Source: OMB's Dashboard and GAO analysis.
[A] BioSense's development work was completed in July 2009. This was
the only month we could assess its performance.
[End of figure]
In addition to determining that cost and schedule ratings are not
always accurate, we found other data inaccuracies. Specifically,
rebaseline information[Footnote 10] on the Dashboard was not always
accurate. Best practices and GAO's Cost Estimating Guide state that a
rebaseline should occur when the current cost and schedule baseline
does not adequately represent the amount of work to be completed,
causing difficulty in monitoring progress of the program.[Footnote 11]
However, OMB reports all major and minor corrections to planned
information on the Dashboard, including typographical fixes, as a
rebaseline. More specifically, while the Dashboard allows agencies to
provide reasons for baseline changes, the current version of the
Dashboard, at a high level, identifies all changes to planned
information as rebaselines. For example, according to the Dashboard,
DOJ's Law Enforcement Wireless Communication investment has been
rebaselined four times. However, program officials stated that the
program has only been rebaselined once. Similarly, the Dashboard shows
that the Sequoia Platform and Integrated Management Navigation System
investments at DOE have both been rebaselined four times. However,
program officials stated that neither of these programs had actually
been rebaselined. Rather, they stated that this number represents
instances in which they made minor corrections to the data on the
Dashboard. Table 4 shows the selected investments whose program
officials reported a lower number of rebaselines than what was
reported on the Dashboard.
Table 4: Selected Investments Whose Program Officials Reported a Lower
Number of Rebaselines Than What Was Reported on the Dashboard:
Agency: DOE;
Investment: Integrated Management Navigation System;
Rebaselines reported on Dashboard: 4;
Rebaselines reported by program officials: 0.
Agency: DOE;
Investment: Sequoia Platform;
Rebaselines reported on Dashboard: 4;
Rebaselines reported by program officials: 0.
Agency: DOJ;
Investment: Law Enforcement Wireless Communication;
Rebaselines reported on Dashboard: 4;
Rebaselines reported by program officials: 1.
Agency: HHS;
Investment: Electronic Research Administration;
Rebaselines reported on Dashboard: 1;
Rebaselines reported by program officials: 0.
Source: OMB's Dashboard and data provided by program officials.
[End of table]
Mixing corrections with the number of true rebaselines overstates the
number of times an investment has been rebaselined, which can divert
management attention from areas truly needing oversight. OMB officials
stated that they intentionally designed the Dashboard this way because
they wanted to discourage agencies from modifying data on the
Dashboard in order to hold them accountable for the information they
report. Further, OMB officials noted that any agency needing to make a
data correction may do so through a manual process. However, the
officials agreed that counting corrections among rebaselines is
problematic and said that they will consider tracking corrections and
rebaselines separately.
Subsequent to the completion of our audit work, OMB provided us with
its new guidance on managing IT baselines, which was issued on June
28, 2010. The guidance, among other things, describes when agencies
should report baseline changes on the Dashboard. Additionally, OMB
provided documentation of the specific modifications that will be made
in an upcoming release of the Dashboard to improve the way baseline
changes are displayed.
Cost and Schedule Ratings Do Not Reflect Current Performance and Wide
Variation in Milestone Reporting Exists:
A primary reason why the cost and schedule ratings were not always
accurate is that they do not take current performance into
consideration for many investments, even though the Dashboard is
intended to represent near real-time performance information on all
major IT investments. Specifically, as of April
2010, the formula to calculate the cost ratings on the Dashboard
intentionally only factored in completed portions of the investments
(referred to as milestones) to determine cost ratings. As such,
milestones that are currently under way are not taken into account.
[Footnote 12] Table 5 identifies each selected investment's last
completed milestone and the number of days that the Dashboard's cost
rating is out of date for each selected investment.
Table 5: Number of Days the Cost Rating is Outdated for Each Selected
Investment:
Agency: DOD;
Investment: Joint Precision Approach and Landing System;
Last completed milestone date: 07/15/2008;
Number of days cost rating is outdated, as of April 21, 2010: 645.
Agency: DOD;
Investment: Maneuver Control System;
Last completed milestone date: No completed milestone;
Number of days cost rating is outdated, as of April 21, 2010: [Empty].
Agency: DOE;
Investment: Integrated Management Navigation System;
Last completed milestone date: 01/04/2010;
Number of days cost rating is outdated, as of April 21, 2010: 107.
Agency: DOE;
Investment: Sequoia Platform;
Last completed milestone date: 02/12/2010;
Number of days cost rating is outdated, as of April 21, 2010: 68.
Agency: DOJ;
Investment: Law Enforcement Wireless Communication;
Last completed milestone date: No completed milestone;
Number of days cost rating is outdated, as of April 21, 2010: [Empty].
Agency: DOJ;
Investment: Unified Financial Management System;
Last completed milestone date: 05/15/2009;
Number of days cost rating is outdated, as of April 21, 2010: 341.
Agency: HHS;
Investment: BioSense[A];
Last completed milestone date: 11/30/2009;
Number of days cost rating is outdated, as of April 21, 2010: 142.
Agency: HHS;
Investment: Electronic Research Administration;
Last completed milestone date: 09/30/2009;
Number of days cost rating is outdated, as of April 21, 2010: 203.
Agency: USDA;
Investment: Financial Management Modernization Initiative;
Last completed milestone date: 11/27/2009;
Number of days cost rating is outdated, as of April 21, 2010: 145.
Agency: USDA;
Investment: Risk Management Agency-13;
Last completed milestone date: 09/30/2009;
Number of days cost rating is outdated, as of April 21, 2010: 203.
Source: OMB's Dashboard and GAO analysis.
[A] BioSense's development work was completed in July 2009.
[End of table]
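The "number of days outdated" figures in table 5 are simple date
arithmetic, measured from each investment's last completed milestone
to April 21, 2010, as this sketch shows:

from datetime import date

def days_outdated(last_completed, as_of=date(2010, 4, 21)):
    # Days elapsed between the last completed milestone and the as-of
    # date used in table 5.
    return (as_of - last_completed).days

assert days_outdated(date(2008, 7, 15)) == 645  # Joint Precision Approach and Landing System
assert days_outdated(date(2009, 5, 15)) == 341  # Unified Financial Management System

[End of code example]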
OMB officials agreed that the ratings not factoring in current
performance is an area needing improvement and said that they are
planning on upgrading the Dashboard application in July 2010 to
include updated cost and schedule formulas that factor in the
performance of ongoing milestones; however, they have not yet made
this change. One step OMB has taken toward collecting the information
needed for the new formulas is that it now requires agencies to
provide information on their investment milestones' planned and actual
start dates. In addition, OMB officials stated that they plan to use a
previously unused data field--percent complete. These are key data
points necessary to calculate the performance of ongoing milestones.
Another issue with the ratings is that there were wide variations in
the number of milestones agencies reported. For example, DOE's
Integrated Management Navigation System investment lists 314
milestones, whereas DOD's Joint Precision Approach and Landing System
investment lists 6. Having too many milestones may mask recent
performance problems because the performance of every milestone (i.e.,
historical and recently completed) is equally averaged into the
ratings. Specifically, investments that perform well during many
previously completed milestones and then start performing poorly on a
few recently completed milestones can maintain ratings that still
reflect good performance. A more appropriate approach could be to give
additional weight to recently completed and ongoing milestones when
calculating the ratings. Too many detailed milestones also defeat the
purpose of an executive-level reporting tool. Conversely, having too
few milestones can limit the amount of information available to track
work and rate performance and allows agencies to potentially skew the
performance ratings.
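The masking effect is easy to demonstrate with hypothetical ratings.
The sketch below contrasts equal averaging with one possible recency
weighting of the kind suggested above; the data and weights are
illustrative assumptions, not an OMB formula:

def equal_average(ratings):
    # Every milestone, dated or recent, counts the same.
    return sum(ratings) / len(ratings)

def recency_weighted_average(ratings, recent_count=4, recent_weight=3.0):
    # Weight the most recent milestones more heavily (illustrative only).
    weights = ([1.0] * (len(ratings) - recent_count)
               + [recent_weight] * recent_count)
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

# Twenty strong historical milestones followed by four poor recent ones:
ratings = [10] * 20 + [2] * 4
print(round(equal_average(ratings), 1))             # 8.7 -- still looks healthy
print(round(recency_weighted_average(ratings), 1))  # 7.0 -- surfaces the decline

[End of code example]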
In commenting on a draft of this report, the Federal CIO stated that
OMB has a new version of the Dashboard that implements updated cost
and schedule calculations. He stated that the new calculations greatly
increase the weight of current activities. As of July 1, 2010, this
updated Dashboard had not been released. An OMB analyst subsequently
told us that the agency plans to release the new version in July 2010.
Additionally, OMB officials have provided us with documentation of the
new calculations and demonstrated the new version of the Dashboard
that will be released soon. The Federal CIO also added that OMB will
consider additional changes to the ratings in the future.
Table 6 demonstrates the large inconsistencies in the number of
milestones reported for each selected investment.
Table 6: Number of Milestones per Selected Investment:
Agency: DOD;
Investment: Joint Precision Approach and Landing System;
Milestones per investment: 6.
Agency: DOD;
Investment: Maneuver Control System;
Milestones per investment: 5.
Agency: DOE;
Investment: Integrated Management Navigation System;
Milestones per investment: 314.
Agency: DOE;
Investment: Sequoia Platform;
Milestones per investment: 26.
Agency: DOJ;
Investment: Law Enforcement Wireless Communication;
Milestones per investment: 19.
Agency: DOJ;
Investment: Unified Financial Management System;
Milestones per investment: 32.
Agency: HHS;
Investment: BioSense;
Milestones per investment: 50.
Agency: HHS;
Investment: Electronic Research Administration;
Milestones per investment: 54.
Agency: USDA;
Investment: Financial Management Modernization Initiative;
Milestones per investment: 10.
Agency: USDA;
Investment: Risk Management Agency-13;
Milestones per investment: 8.
Source: OMB's Dashboard.
[End of table]
In June 2009, OMB issued guidance stating that agencies are
responsible for providing quality data and, at a minimum, should
provide milestones that consist of major segments of the investment,
referred to as work breakdown structure level 2; OMB prefers, however,
that agencies provide lower-level milestones within each segment (work
breakdown structure level 3). A work breakdown structure is the
cornerstone of every program
because it defines in detail the work necessary to accomplish a
program's objectives. Standardizing a work breakdown structure is
considered a best practice because it enables an organization to
collect and share data among programs. Further, standardizing work
breakdown structures allows data to be shared across organizations.
However, certain agencies are not following OMB's guidance and list
milestones that they consider to be at work breakdown structure level
1, which are high-level milestones. Specifically, of the 5 agencies we
reviewed, officials at DOD, USDA, and DOE stated that they were
reporting work breakdown structure level 1 milestones to the Dashboard
for each of their selected investments. OMB officials acknowledged
that not all agencies are following its guidance, but stated that OMB
analysts are working with agencies to try to improve compliance.
Furthermore, the guidance that OMB has provided is not clear on the
level of detail that it wants agencies to report in their milestones
and has left agencies to individually interpret this general
guidance. Specifically, while OMB states that agencies should
report milestones that are, at a minimum, work breakdown structure
level 2, there is no commonly accepted definition among federal
agencies on the level of detail that should comprise each of these
levels. OMB officials acknowledged that they have not provided clear
guidance, but recently stated that they have begun exploring ways to
ensure more uniformity across agencies' reporting. Specifically, in
commenting on a draft of this report, the Federal CIO stated that OMB
has recently chartered a working group composed of representatives
from several federal agencies, with the intention of developing clear
guidance for standardizing and improving investment activity reporting.
OMB and agencies acknowledge that additional improvements can be made
beyond the cost and schedule ratings and have taken certain steps to
try to improve the accuracy of the data. For example, OMB implemented
an automated monthly data upload process and created a series of data
validation rules that detect common data entry errors, such as
investment milestone start dates that occur after completion dates. In
addition, four of the five agencies we reviewed indicated that they
have processes in place aimed at improving the accuracy of the data.
For instance, HHS has established a process wherein an official has
been assigned responsibility for ensuring the Dashboard is accurately
updated. Further, DOJ has developed an automated process to find
missing data elements in the information to be uploaded on the
Dashboard.
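A validation rule of the kind described above, which flags milestones
whose start date falls after their completion date, might look like
the following sketch; the field names are hypothetical rather than the
Dashboard's actual schema:

from datetime import date

def validate_milestone(start_date, completion_date):
    # Return a list of data entry errors found in a milestone record.
    errors = []
    if start_date > completion_date:
        errors.append("start date %s occurs after completion date %s"
                      % (start_date, completion_date))
    return errors

print(validate_milestone(date(2009, 12, 1), date(2009, 9, 30)))
# ['start date 2009-12-01 occurs after completion date 2009-09-30']

[End of code example]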
Despite these efforts, until OMB upgrades the Dashboard application to
improve the accuracy of the cost and schedule ratings to include
ongoing milestones, explains the outcome of these improvements in its
next annual report to Congress on the Implementation of the E-
Government Act (which is a key mechanism for reporting on the
implementation of the Dashboard), provides clear and consistent
guidance to agencies that standardizes milestone reporting, and
ensures agencies comply with the new guidance, the Dashboard's cost
and schedule ratings will likely continue to experience data accuracy
issues.
Use of the Dashboard as a Management Tool Varies:
Officials at three of the five agencies we reviewed--DOD, DOJ, and
HHS--stated that they are not using the Dashboard to manage their
investments, and the other two agencies, DOE and USDA, indicated that
they are using the Dashboard to manage their investments.
Specifically, officials from the three agencies are not using the
Dashboard to manage their investments because they have other existing
means to do so:
* DOD officials indicated that they use the department's Capital
Planning and Investment Control process to track IT investment data--
including cost and schedule.
* DOJ uses an internal dashboard that the office of the CIO developed
that provides for more detailed management of investments than OMB's
Dashboard.
* HHS officials said they use a portfolio investment management tool,
which they indicated provides greater insight into their investments.
Officials from the other two agencies--DOE and USDA--noted that they
are using the Dashboard as a management tool to supplement their
existing internal processes to manage their IT investments.
* DOE officials stated that since their current process is based on a
quarterly review cycle, the monthly reporting nature of the Dashboard
has allowed officials to gain more frequent insight into investment
performance. As a result, DOE officials say that they are able to
identify potential issues before these issues present problems for
investments.
* USDA officials stated that they use the ratings on the Dashboard to
identify investments that appear to be problematic and hold meetings
with the investments' program managers to discuss corrective actions.
Additionally, in OMB's fiscal year 2009 Report to Congress on the
Implementation of the E-Government Act of 2002, 11 agencies reported
on how the Dashboard has increased their visibility and awareness of
IT investments. For example, the Department of Veterans Affairs
terminated 12 IT projects, partly because of the increased visibility
that the CIO obtained from the Dashboard.
OMB indicated that it is using the Dashboard to manage IT investments.
Specifically, OMB analysts are using the Dashboard's investment trend
data to track changes and identify issues with investments'
performance in a timely manner and are also using the Dashboard to
identify and drive investment data quality issues. The Federal CIO
stated that the Dashboard has greatly improved oversight capabilities
compared to previously used mechanisms. He also stated that the
Dashboard has increased the accountability of agencies' CIOs and
established much-needed visibility. According to OMB officials, the
Dashboard is one of the key sources of information that OMB analysts
use to identify investments that are experiencing performance problems
and select them for a TechStat session--a review of selected IT
investments between OMB and agency leadership that is led by the
Federal CIO. OMB has identified factors that may result in a TechStat
session, such as policy interests, Dashboard data inconsistencies,
recurring patterns of problems, and/or an OMB analyst's concerns with
an investment. As of June 2010, OMB officials indicated that 27
TechStat sessions have been held with federal agencies. According to
OMB, this program enables the government to improve or terminate IT
investments that are experiencing performance problems.
Conclusions:
OMB has taken significant steps to enhance the oversight,
transparency, and accountability of federal IT investments by creating
its IT Dashboard. However, the cost and schedule ratings on the
Dashboard were not always accurate. Further, the rebaseline data were
not always accurate. The cost and schedule inaccuracies were due, in
part, to calculations of ratings that did not factor in current
performance. Additionally, there were large inconsistencies in the
number of milestones that agencies report on the Dashboard because OMB
has not fully defined the level of detail that federal agencies should
use to populate the Dashboard and several selected agencies decided
not to follow OMB's general guidance. Moreover, the performance of
historical and recently completed milestones is equally averaged in
the cost and schedule ratings, which is counter to OMB's goal to
report near real-time performance on the Dashboard. While the use of
the Dashboard as a management tool varies, OMB has efforts under way
to include the performance of ongoing milestones and its officials
acknowledge that additional improvements are needed. Nevertheless,
until OMB explains in its next annual Implementation of the E-
Government Act report how the upgrade to the Dashboard application has
improved the accuracy of the cost and schedule ratings, and provides
clear and consistent guidance that enables agencies to report
standardized information on their milestones, the accuracy of the data
on the Dashboard may continue to be in question.
Recommendations for Executive Action:
To better ensure that the IT Dashboard provides meaningful ratings and
accurate investment data, we are recommending that the Director of OMB
take the following two actions:
* include in its next annual Implementation of the E-Government Act
report the effect of planned formula changes on the accuracy of data;
and:
* develop and issue clear guidance that standardizes milestone
reporting on the Dashboard.
In addition, we are recommending that the Secretaries of the
Departments of Agriculture, Defense, and Energy direct their Chief
Information Officers to ensure that they comply with OMB's guidance on
standardized milestone reporting, once it is available.
Agency Comments and Our Evaluation:
We received written comments on a draft of this report from the
Federal CIO and DOE's Associate CIO for IT Planning, Architecture, and
E-Government. Letters from these agencies are reprinted in appendixes
III and IV. In addition, we received technical comments via e-mail
from a Coordinator at HHS, which we incorporated where appropriate.
Further, the Deputy CIO from USDA, the Principal Director to the
Deputy Assistant Secretary of Defense for Resources from DOD, and an
Audit Liaison Specialist from DOJ indicated via e-mail that they had
reviewed the draft report and did not have any comments.
In OMB's comments on our draft report, which contained four
recommendations to the OMB Director, the Federal CIO stated that he
agreed with two recommendations and disagreed with the other two
because of actions OMB had recently taken. After reviewing these
actions, we agreed that they addressed our concerns and removed these
two recommendations from the final report.
OMB agreed with our recommendation that it include in its next annual
Implementation of the E-Government Act report how the planned formula
changes have improved the accuracy of data.
OMB agreed with our recommendation that it develop and issue clear
guidance that standardizes milestone reporting on the Dashboard.
Additionally, the Federal CIO asked that we update the report to
reflect that they have recently chartered a working group comprised of
representatives from several federal agencies, with the intention of
developing clear guidance for standardizing and improving investment
activity reporting. We have incorporated this additional information
into the report.
In response to our draft recommendation that OMB revise the IT
Dashboard and its guidance so that only major changes to investments
are considered to be rebaselines, OMB provided us with its new
guidance on managing IT baselines, which was issued on June 28, 2010.
The guidance, among other things, describes when agencies should
report baseline changes on the Dashboard. OMB also provided
documentation of the specific modifications that will be made in an
upcoming release of the Dashboard to improve the way baseline changes
are displayed. We agree that these recent changes address our
recommendation. As such, we updated the report to acknowledge and
include this additional information, where appropriate.
Regarding our recommendation that OMB consider weighing recently
completed and ongoing milestones more heavily than historical
milestones in the cost and schedule ratings, the Federal CIO stated
that OMB has a new version of the Dashboard that implements updated
cost and schedule calculations. He stated that the new calculations
greatly increase the weight of current activities. As previously
stated, as of July 1, 2010, this updated Dashboard had not been
released. An OMB analyst subsequently told us that the agency plans to
release the new version in July 2010. Additionally, OMB officials have
provided us with documentation of the new calculations and
demonstrated the new version of the Dashboard that will be released
soon. The Federal CIO added that OMB will consider additional
changes to the ratings in the future. We agree that these recent
changes address our recommendation. As such, we updated the report to
acknowledge and include this additional information, where
appropriate. Additionally, OMB will report on the effect of the
upcoming changes to the calculations in its next annual Implementation
of the E-Government Act report.
OMB also provided additional comments, which we address in appendix
III.
In DOE's comments on our draft report, the Associate CIO for IT
Planning, Architecture, and E-Government indicated that she agreed
with our assessment of the implementation of the IT Dashboard across
federal agencies and with the recommendations presented to OMB.
Additionally, in response to our recommendation that the CIO of DOE
comply with OMB guidance on milestone reporting once it is available,
the Associate CIO stated that once OMB releases the additional
guidance, DOE officials will work to ensure the appropriate level of
detail is reported on the Dashboard. DOE also provided an additional
comment, which we address in appendix IV.
As agreed with your offices, unless you publicly announce the contents
of this report earlier, we plan no further distribution until 30 days
from the report date. At that time, we will send copies of this report
to interested congressional committees; the Director of the Office of
Management and Budget; the Secretaries of the Departments of
Agriculture, Defense, Energy, Health and Human Services, and Justice;
and other interested parties. In addition, the report will be
available at no charge on our Web site at [hyperlink,
http://www.gao.gov].
If you or your staffs have any questions on the matters discussed in
this report, please contact me at (202) 512-9286 or pownerd@gao.gov.
Contact points for our Offices of Congressional Relations and Public
Affairs may be found on the last page of this report. GAO staff who
made major contributions to this report are listed in appendix V.
Signed by:
David A. Powner:
Director, Information Technology Management Issues:
[End of section]
Appendix I: Objectives, Scope, and Methodology:
Our objectives were to (1) examine the accuracy of the cost and
schedule performance ratings on the Dashboard for selected investments
and (2) determine whether the data on the Dashboard are used as a
management tool to make improvements to IT investments.
To address both objectives, we selected five agencies and ten
investments to review. To select these agencies and investments, we
first identified ten agencies with large IT budgets as reported in the
Office of Management and Budget's (OMB) fiscal year 2010 Exhibit 53.
We then identified the five largest investments at each of the ten
agencies, according to the fiscal year 2010 budget, that were spending
more than half of their budget on IT development, modernization, and
enhancement work, and were primarily carried out by contractors. In
narrowing the list to five agencies and ten total investments, we
considered several factors to ensure there were two viable investments
at each agency:
* The investment is not part of our ongoing audit work related to
cost, schedule, and technical performance.
* The investment is not part of a recent governmentwide earned value
management review.[Footnote 13]
* The investment has not been highlighted as an investment needing
significant attention.
* The collective list of investments creates a balance of investment
sizes to include both larger and smaller investments.
The five agencies are: the Departments of Agriculture (USDA), Defense
(DOD), Energy (DOE), Health and Human Services (HHS), and Justice
(DOJ).
The ten investments are: USDA's Financial Management Modernization
Initiative and Risk Management Agency-13 Program; DOD's Joint
Precision Approach and Landing System and Maneuver Control System;
DOE's Integrated Management Navigation System and Sequoia Platform;
HHS's BioSense Program and Electronic Research Administration System;
and DOJ's Law Enforcement Wireless Communication and Unified Financial
Management System (see appendix II for descriptions of each
investment).
To address the first objective, we evaluated earned value data of the
selected investments to determine their cost and schedule performance
and compared it to the ratings on the Dashboard. The investment earned
value data was contained in contractor earned value management
performance reports obtained from the programs. To perform this
analysis, we compared the investment's cumulative cost variance for
each month from July 2009 through January 2010 to the cost variance
reported on the Dashboard for those months. Similarly, we calculated
how far ahead or behind schedule each investment was over the same
period and compared the results to the schedule ratings on the
Dashboard. We also assessed 13 months of investment data to analyze
trends in cost and schedule performance.
To further assess the accuracy of the cost data, we compared it with
other available supporting program documents, including monthly and
quarterly investment program management reports; electronically tested
the data to identify obvious problems with completeness or accuracy;
and interviewed agency and program officials about the data and earned
value management systems. For the purposes of this report, we
determined that the cost data at eight of the investments were
sufficiently reliable to use for our assessment. For the two remaining
investments, we determined that, given their methods of earned value
management, the data would not allow us to sufficiently assess and
rate monthly investment performance. We did not test the adequacy of
the agency or contractor cost-accounting systems. Our evaluation of
these cost data was based on the documentation the agencies provided.
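The monthly comparison described above can be expressed concretely.
The sketch below (in Python) derives a cumulative cost variance
percentage from standard earned value figures and buckets it into
color bands for comparison against a Dashboard-reported rating. The
sample figures, band boundaries, and reported ratings are
hypothetical stand-ins for illustration, not the actual investment
data or OMB's exact thresholds.

# Hypothetical illustration: the earned value figures, band
# boundaries, and Dashboard ratings below are sample stand-ins, not
# actual report data or OMB's exact thresholds.

monthly_evm = {
    # month: (cumulative BCWP, cumulative ACWP), in millions of dollars.
    # BCWP = budgeted cost of work performed; ACWP = actual cost of
    # work performed (standard earned value management terms).
    "2009-07": (10.0, 10.2),
    "2009-08": (12.0, 12.5),
    "2009-09": (14.0, 15.6),
}

dashboard_rating = {"2009-07": "green", "2009-08": "green",
                    "2009-09": "green"}

def cost_variance_pct(bcwp, acwp):
    """Cumulative cost variance as a percentage of work performed,
    using the standard earned value formula (BCWP - ACWP) / BCWP."""
    return (bcwp - acwp) / bcwp * 100

def rating(variance_pct):
    """Map a variance percentage to an assumed color band."""
    v = abs(variance_pct)
    if v < 5:
        return "green"
    if v < 10:
        return "yellow"
    return "red"

for month, (bcwp, acwp) in monthly_evm.items():
    cv = cost_variance_pct(bcwp, acwp)
    ours, theirs = rating(cv), dashboard_rating[month]
    note = "matches" if ours == theirs else "discrepancy"
    print(f"{month}: variance {cv:+.1f} percent -> {ours}; "
          f"Dashboard: {theirs} ({note})")

[End of illustration]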
We also reviewed and analyzed OMB's and the selected agencies'
processes for populating and updating the Dashboard. Additionally, we
interviewed officials from OMB and the selected agencies and reviewed
OMB guidance to obtain additional information on OMB's and agencies'
efforts to ensure the accuracy of the investment performance data and
cost and schedule performance ratings on the Dashboard. We used the
information provided by OMB and agency officials to identify the
factors contributing to inaccurate cost and schedule performance
ratings on the Dashboard. Moreover, to examine the accuracy of the
rebaseline information on the Dashboard, we interviewed agency and
program officials about the number of rebaselines each investment had
undergone and compared these data with the rebaseline information
listed on the Dashboard.
To address our second objective, we analyzed related agency
documentation to assess what policies or procedures the agencies had
implemented for using the data on the Dashboard to make management
decisions. We also interviewed agency and program officials regarding
the extent to which they use the data on the Dashboard as a management
tool. Additionally, we attended one of OMB's TechStat sessions, which
are reviews of selected IT investments between OMB and agencies.
We conducted this performance audit from January to July 2010 at the
selected agencies' offices in the Washington, D.C., metropolitan area.
Our work was done in accordance with generally accepted government
auditing standards. Those standards require that we plan and perform
the audit to obtain sufficient, appropriate evidence to provide a
reasonable basis for our findings and conclusions based on our audit
objectives. We believe that the evidence obtained provides a
reasonable basis for our findings and conclusions based on our audit
objectives.
[End of section]
Appendix II: Selected Investment Descriptions:
Below are descriptions of each of the selected investments that are
included in this review.
Department of Agriculture (USDA):
Financial Management Modernization Initiative:
The Financial Management Modernization Initiative is USDA's financial
management system modernization program. It is intended to be the
central financial system for USDA and is to consolidate the current
financial management system environment from 19 legacy systems into
one Web-based system.
Risk Management Agency-13:
USDA's Risk Management Agency-13 program is intended to support the
reengineering of all business systems associated with the crop
insurance program and provide a central financial system that will
offer Web-based tools and applications for accessing Risk Management
Agency data.[Footnote 14]
Department of Defense (DOD):
Joint Precision Approach and Landing System:
DOD's Joint Precision Approach and Landing System investment is
intended to provide a precision approach and landing capability for
all DOD ground and airborne systems, enabling U.S. forces to safely
land aircraft on any suitable surface worldwide (land and sea), with
ceiling and/or visibility as the limiting factor.
Maneuver Control System:
DOD's Maneuver Control System investment is intended to provide, among
other things, the warfighter environment and collaborative and
situational awareness tools used to support executive decision making,
planning, rehearsal, and execution management. This system is to be
used throughout the Army to provide a common view of critical
information.
Department of Energy (DOE):
Integrated Management Navigation System:
DOE's Integrated Management Navigation System consists of five major
projects and is intended to standardize and integrate accounting, data
warehouse, human resource, procurement, and budget processes
throughout DOE. The Integrated Management Navigation System
incorporates enterprisewide projects from DOE's Office of the Chief
Financial Officer, Office of Human Capital Management, and Office of
Management.
Sequoia Platform:
DOE's Sequoia Platform is a supercomputer being developed for use by
three weapons laboratories--Los Alamos, Lawrence Livermore, and Sandia
National Laboratories--to contribute dramatically to the national
security enterprise. This supercomputer will also be used in
maintaining the nuclear deterrent and in the areas of
nonproliferation, nuclear counterterrorism, and support to the
intelligence community.
Department of Health and Human Services (HHS):
BioSense:
HHS's BioSense program is intended to improve the nation's
capabilities for disease detection, monitoring, and near real-time
health situational awareness by creating a system that uses data from
existing health-related databases to identify patterns of disease
symptoms prior to specific diagnoses.[Footnote 15]
Electronic Research Administration:
HHS's Electronic Research Administration program is the National
Institutes of Health's system for conducting interactive electronic
transactions for the receipt, review, monitoring, and administration
of grant awards to biomedical investigators worldwide. It is also
intended to provide the technology capabilities for the agency to
efficiently and effectively perform grants administration functions.
Department of Justice (DOJ):
Law Enforcement Wireless Communication:
DOJ's Law Enforcement Wireless Communication System, also known as the
Integrated Wireless Network, is to support the replacement and
modernization of failing radio systems and achieve communication
standards at DOJ's law enforcement agencies. This program is intended
to provide all four law enforcement components with a shared unified
radio network, which should eliminate redundant coverage and
duplicative radio sites, while providing efficient and comparable
coverage.
Unified Financial Management System:
DOJ's Unified Financial Management System is to improve the existing
and future financial management and procurement operations across DOJ.
Upon full implementation, the Unified Financial Management System will
replace five financial management systems and multiple procurement
systems with an integrated commercial off-the-shelf solution. This is
to streamline and standardize business processes and procedures across
the DOJ components.
Table 7 provides additional details for each of the selected
investments in our review.
Table 7: Investment Management Details:
Agency: DOD;
Bureau: Department of the Navy;
Program name: Joint Precision Approach and Landing System;
Program start date: 1995;
Program end date: 2024;
Prime contractor: Raytheon.
Agency: DOD;
Bureau: Department of the Army;
Program name: Maneuver Control System;
Program start date: 2005;
Program end date: 2018;
Prime contractor: General Dynamics.
Agency: DOE;
Bureau: Departmental Administration;
Program name: Integrated Management Navigation System;
Program start date: 2000;
Program end date: 2015;
Prime contractor: IBM.
Agency: DOE;
Bureau: National Nuclear Security Administration;
Program name: Sequoia Platform;
Program start date: 2009;
Program end date: 2012;
Prime contractor: LLNS.
Agency: DOJ;
Bureau: Justice Management Division;
Program name: Law Enforcement Wireless Communication;
Program start date: 2005;
Program end date: 2015;
Prime contractor: General Dynamics.
Agency: DOJ;
Bureau: Justice Management Division;
Program name: Unified Financial Management System;
Program start date: 2001;
Program end date: 2021;
Prime contractor: IBM.
Agency: HHS;
Bureau: Centers for Disease Control and Prevention;
Program name: BioSense;
Program start date: 2003;
Program end date: 2015;
Prime contractor: SAIC.
Agency: HHS;
Bureau: National Institutes of Health;
Program name: Electronic Research Administration;
Program start date: 1994;
Program end date: 2018;
Prime contractor: ICF/Z-Tech, LTS, PMC, RN Solutions, SAIC, and Wyle.
Agency: USDA;
Bureau: Departmental Administration;
Program name: Financial Management Modernization Initiative;
Program start date: 2008;
Program end date: 2011;
Prime contractor: Accenture LLP.
Agency: USDA;
Bureau: Risk Management Agency;
Program name: Risk Management Agency-13;
Program start date: 2006;
Program end date: 2011;
Prime contractor: SAIC.
Source: OMB's Dashboard and data from program officials.
[End of table]
[End of section]
Appendix III: Comments from the Office of Management and Budget:
Note: GAO comments supplementing those in the report text appear at
the end of this appendix.
Executive Office Of The President:
Office Of Management And Budget:
Washington, D.C. 20503:
June 25, 2010:
Mr. David A. Powner:
U.S. Government Accountability Office:
441 G Street, NW:
Washington, DC 20548:
Dear Mr. Powner:
OMB appreciates GAO's analysis of the IT Dashboard contained in the
draft report "OMB's Dashboard Has Increased Transparency, but Data
Accuracy Improvements Needed" (GAO-10-71). After review of this draft,
we have several concerns, which we organized into the following three
categories: 1) objections to some recommendations in the draft report,
2) concerns with the lack of findings related to improvements in
transparency and oversight, and 3) other concerns with specific
findings throughout the report.
OMB has several concerns related to GAO's four recommendations, found
on page 20 of the draft report. In summary, OMB accepts one
recommendation, requests removal of two, and requests rewording of one
to reflect progress made since initiation of this report.
OMB believes we have implemented the fourth recommendation to consider
weighing recently completed and ongoing milestones more heavily than
historical milestones, and therefore request removal of this
recommendation from the final report. In June 2010, OMB implemented
updated cost and schedule calculations as part of the new release of
the IT Dashboard. The updated calculations greatly increase the weight
of current and ongoing activities in relation to historical
milestones, as recommended, and we do not rule out the use of
additional adjustments to the formula as we assess these calculations
in the future.
Development of these calculations began prior to the beta release of
the IT Dashboard, in collaboration with the formation of a cost and
schedule working group comprised of subject matter experts from
several major agencies. As clearly signaled in the FAQs from launch,
the plan was always for the original calculations to be temporary
because of the inability to factor in ongoing activities due to known
limits of the available data. The group advised requiring the
collection of planned start date, actual start date, actual percent
complete and planned percent complete. The start date fields were
added as new fields in the A-11 Exhibit 300 schema for FY 2011, and
planned and actual percent complete were required for all activities
replacing the "percent complete" from the previous schema, which was
required only for milestones representing development, modernization
and enhancements.
Regarding the first recommendation, to revise guidance on re-
baselines, OMB believes that GAO's findings do not adequately address
changes already made to re-baseline guidance
and implementation since the launch of the IT Dashboard, and requests
removal of this recommendation. Prior to the launch of the IT
Dashboard, reporting of re-baselines was inconsistent and widely
unreported as required. Today, all changes to the baseline are
recorded by the system and clearly displayed on the investment
dashboard and in data feeds, along with agency-provided reasons for
the change. Contrary to GAO's findings, the reasons given do allow
agencies to differentiate between minor and major changes to the
baseline. OMB acknowledges that this differentiation of re-baselines
was not adequately presented in the previous release, and has made
changes in the new release to make this differentiation clearer.
On the third recommendation, OMB requests GAO adjust the report to
reflect the full progress achieved. OMB has chartered a working group
with the primary task of developing clear guidance for standardizing
and improving investment activity reporting. This group is comprised
of several subject matter experts from a cross-section of major
agencies, including HUD, EPA, Interior, DoD, VA, and DOT. In June
2010, additional guidance will be issued to amend previous baseline
management policies. This memorandum provides policy direction
regarding development of agency IT investment baseline management
policies and defines a common structure for IT investment baseline
management policy with the goal of improving transparency, performance
management, and effective investment oversight. Exhibit 300 for FY
2012 and the IT Dashboard schema are in the process of being updated
to reflect this change in policy, and results will be visible in the
IT Dashboard with the release of the new schema this fall.
On the second recommendation, OMB agrees with GAO's recommendation to
address how the recently updated cost and schedule calculations have
improved the accuracy of the data in the next annual Implementation of
the E-Government Act report.
OMB also has a general concern with the report's lack of information
about the first half of the thesis, namely that the IT "Dashboard has
increased transparency and oversight." In the "What GAO Found" section
of the Highlights page, virtually no mention is made as to how the IT
Dashboard has improved transparency, accountability, and oversight. To
address this omission, OMB suggests insertion of one or more of the
following improvements that have been made as a result of the IT
Dashboard: [See comment 1]
* In the FY 2009 E-Government Act report, numerous agencies reported
on how the IT Dashboard had increased the visibility of IT investment
management within the agency and increased scrutiny of investment data
by the most senior leadership of agencies.
* Last July, the Department of Veterans Affairs (VA) halted 45 IT
projects that were significantly behind schedule or over budget,
identified in part thanks to the IT Dashboard. In terminating 12 of
these projects, the VA avoided $54 million in wasteful spending during
fiscal year 2010.
* Building on the foundation of the IT Dashboard, OMB launched face-to-
face, evidence-based reviews of IT programs called TechStat
Accountability Sessions. These sessions enable Government officials to
collaborate with one another to turn around or halt IT investments
that do not produce dividends for the American people. Since January
2010, OMB has conducted 27 of these sessions with the responsible
agencies.
* Trend data collected over the past year allows OMB analysts and the
public to track changes in investments over time and identify issues
more timely and target investments for more in-depth reviews.
* OMB observed significant increases in data clean-up activities and
more rigorous internal agency review processes prior to monthly
reporting across the Federal agencies. Specifically, immediately upon
release of the IT Dashboard, we saw a marked improvement in the
quality of agency-reported contract numbers. Just prior to launch,
analysis of the legacy data showed only 455 matches with contract
Procurement Identifier in USASpending.gov. Today, the match rate has
more than quadrupled to 1,914.
* Agencies are paying greater attention to the impact of changing
planned data and informing OMB of changes to investment baselines.
Prior to the launch of the IT Dashboard, reporting of re-baselines was
inconsistent and widely unreported as required. Today, all changes to
the baseline are recorded by the system and clearly displayed on the
investment dashboard and in data feeds, along with agency-provided
reasons for the change.
* A lack of timely performance data and clear lines of accountability
have been historical impediments to the ability to identify potential
failures early on. By dramatically increasing the frequency of major
investment reporting from yearly to monthly, we greatly improved
oversight capabilities.
* By placing the CIO's picture on each investment page and requiring
them to rate each investment, a clear point of accountability was
established and long-needed visibility was given to CIOs government-
wide. By enabling CIO ratings to override cost and schedule ratings, a
mechanism was provided to overcome some of the concerns with the early
IT Dashboard calculations.
Finally, OMB is concerned about findings throughout the report related
to data quality, cost and schedule calculations, and management
baseline changes and requests that GAO update the report to address
the following:
* Data quality. [See comment 2] The IT Dashboard has brought into
focus data quality issues which have existed for years and should not
be directly attributed to the IT Dashboard itself. For example, only
after the IT Dashboard began using milestone data for meaningful
analysis did the need to collect activity start dates and separate
percentage complete fields become apparent. OMB immediately
established a policy to collect this information on the same day we
launched and included these new data fields as part of the BY 2010
Circular A-11 update released summer of 2009. Just one year later, the
IT Dashboard contains start dates for every one of the nearly 800
major investments, which has enabled more current cost and schedule
calculations today.
* Cost and schedule calculations. [See comment 3] OMB disagrees with
the characterization of cost and schedule ratings as "inaccurate"
throughout the report. This statement is misleading for four major
reasons. First, it could be interpreted to imply a binary standard for
accuracy (i.e. the calculation is either "inaccurate" or "accurate"),
and presumes that the agency systems or GAO's own calculations set the
standard for accuracy. GAO's report states that the primary data on
cost and schedule was obtained from agency earned value reports, which
generally cover only the contract portion of the investment. IT
Dashboard calculations cover a broader set of activities (development
and operational activities) performed by both contractor and
government resources. A direct comparison of EVM reports against IT
Dashboard calculations is therefore not appropriate.
Second, the methodology selected by GAO as the standard for accuracy
in this report relies on quality and consistency of agency EVM
reports. OMB recognizes that the availability and quality of EVM data
varies significantly by investment and by agency, calling into
question the reliability of any methodology which relies upon this
data. As GAO states in this report, for two of the ten investments it
chose for its analysis, "the earned value data was not sufficient for
our purposes of rating cost and schedule performance." It is in
recognition of this very fact that OMB worked with agencies to develop
a calculation that did not rely solely upon EVM outputs that OMB could
apply consistently across all agencies. [See comment 4]
Third, the word "inaccurate" as applied to the IT Dashboard
calculations could be easily construed by the public to mean that the
calculation was defective or not performing as designed. As GAO has
not attempted to demonstrate how the calculations used were not
accurately and consistently applied to the entire dataset and
transparent to the public, we assume this not to be the case and ask
that the report clarifies that the calculation performs as designed.
[See comment 5]
Finally, by GAO's own assessment, the severity of the discrepancies of
the cost ratings was not dramatic. In fact, given that these figures
were calculated from entirely separate sources using different
methodologies, OMB believes the discrepancies between the cost
calculation ratings, as displayed in table 4, to be remarkably small.
[See comment 6]
* OMB guidance. [See comment 7] OMB believes that the report
overstates the impact of OMB's "vague" guidance on data quality and
underemphasizes the responsibility of agencies as the providers of the
data. While OMB acknowledges the need to assess and improve guidance
on an ongoing basis, in some cases clear existing guidance is not
always being followed by some agencies. For example, DoD agrees that
the milestone data it supplies is not at WBS level 2 or 3, but cites
internal policy as to why it reports at a higher level. Another
example is the ever-improving but still inadequate match rate on
contracts data, yielding a "Contractor's name match not found" message
on the investment dashboard page. Adherence to existing guidance would
have a significant impact on the quality of the data in the IT
Dashboard. As such cases arise, OMB analysts notify the agency, direct
the agency to take corrective actions, and monitor their progress.
Improvements to data quality have on numerous occasions been the
subject of TechStat action items. OMB plays an important role in
defining frameworks and standards for data quality; however, the
ultimate responsibility for data quality lies in the agency. This
should be made clear throughout the report.
* Baseline changes. [See comment 8] On page 12, the report
mischaracterizes the manner in which the IT Dashboard collects and
presents baseline corrections or changes. According to IT Dashboard
guidance, available in the FAQs on the site, "Only baselines approved
in accordance with agency policy should be entered into the IT
Dashboard by an authorized user." While the IT Dashboard did
previously identify changes to the baselines at the top level as re-
baselines, it has also always allowed agencies to differentiate
between various types of re-baselines and provide reasons for those
baseline changes. This includes provisions for data corrections. The
full set of reasons provided is as follows:
* External - Direct Mandate - Re-baselined/Planning change made due to
investment-specific budget changes mandated by Congress or OMB. For
example, Congressional rescissions or OMB budget decisions.
* External - Agency Mandate - Re-baselined/Planning changes made due
to agency budget changes resulting from OMB, Congressional, or
legislative action. For example, Fund disaster recovery efforts,
legislation directing all agencies to take across-the-board reduction,
leaving each agency with discretion to make the cut.
* External - Phase Gate - Re-baselined/Planning changes due to a
change from one investment life-cycle phase to another (i.e.,
planning, acquisition, mixed life cycle, operations and maintenance)
where cost and schedule milestones are expected to change
significantly.
* External - Protest - Re-baselined/Planning changes due to sustained
long-term protest of contract.
* Internal - Bureau Mandate - Agency internal factors are re-baselined
due to Department-to-bureau budget changes (e.g., Department Passback)
that necessitate a change to the previously approved baseline.
* Internal - IRB Mandate - Agency internal factors are re-baselined
due to agency investment review board approved changes in
requirements, performance measures or scope (including enhancements
and feature changes).
* Internal - Technological Change - Agency internal factors are re-
baselined due to major technology changes or process innovations
requiring changes to investment's cost, schedule or performance
(usually to reduce cost/time to deployment).
* Internal - Poor Performance - Agency internal factors are re-
baselined due to significant poor project performance and leadership
change.
* Internal - Contractor Performance - Agency internal factors are re-
baselined due to poor contractor performance requiring contractor
change.
* Internal - Inaccurate Baseline - Agency internal factors are re-
baselined due to inaccuracies in the original baseline.
* Internal - Other.
In addition, agencies wishing to make changes contrary to this policy
(e.g. to make data corrections) have always been granted the ability
by OMB to make specific changes through the back end of the system,
avoiding unnecessary and inaccurate "re-baseline" designations on the
IT Dashboard. To do so, agencies are asked to send a written request
to the assigned OMB analyst for review, who approves the specific
edits. In most cases, these changes can be made within 24 hours of
request.
With the launch of the beta release of the IT Dashboard in June 2009,
we made the decision to fundamentally change the way OMB exercises its
IT investment oversight function. No longer would data be locked up in
PDFs on agency websites, in varying formats with inconsistent display
of information and updated only once a year. Under the new model, data
would be dynamic, readily available and updated monthly, and easily
consumable by the public, who should serve as a new partner in
overseeing the Federal government's $80 billion investment in
information technology. [See comment 9]
We launched the IT Dashboard as a beta product, well-aware of the
challenges posed by the quality and consistency of the data and the
mechanisms to collect it. We did so recognizing that these challenges
would take time and effort to overcome, but would not be as far along
as we are today had we not taken these early steps toward
transparency. In the interest of promoting the use of transparency to
drive positive change in the way our government operates, we hope that
GAO takes these facts into strong consideration in the final version
of this report.
Sincerely,
Signed by:
Vivek Kundra:
Federal Chief Information Officer:
The following is GAO's response to the Office of Management and
Budget's (OMB) additional comments.
GAO Comments:
1. We agree that the Dashboard has increased transparency,
accountability, and oversight; therefore, we updated the report to
discuss additional uses of the Dashboard, such as the use of trend
data, improved oversight capabilities, and enhancements to agencies'
investment management processes. We also updated the number of
TechStat sessions that have taken place.
2. While additional data quality issues need to be addressed in the
Dashboard, we agree that the Dashboard is an improvement when compared
to OMB's previous oversight tools such as the Management Watch List
and High Risk List. As such, we modified the report to highlight these
improvements. For example, we added to the report that the Dashboard's
monthly reporting cycle is a significant improvement in data currency
over the Management Watch List and High Risk List, which were updated
on a quarterly basis.
3. As stated in the report, we found that the ratings were not always
accurate. We based this characterization on the fact that there were
several instances in which the ratings were inconsistent with the
performance indicated in our analysis of the investments' earned value
management (EVM) reports and were notably different (e.g., ratings of
"green" versus "yellow"). We agree that EVM data generally only covers
the contracted development parts of the investments. As such, as part
of our methodology, we specifically selected investments where the
majority of each investment was focused on development efforts (versus
operational) and primarily carried out by contractors.[Footnote 16]
Therefore, we maintain that the comparison between the selected
investments' Dashboard ratings and the performance indicated in their
EVM reports is a fair assessment.
4. We acknowledge that the quality of EVM reports can vary. As such,
we took steps to ensure that the EVM reports we used were reliable
enough to evaluate the ratings on the Dashboard, and as OMB's comments
indicate, we discounted two of the ten selected investments after
determining that their data was insufficient for our needs. We do not
state that OMB should base its ratings solely on EVM data.
5. We agree that the original cost and schedule calculations are
performing as planned (i.e., are not defective) and we further
clarified this point in the report. We also note that planned changes
to the rating calculations will incorporate current performance.
However, these calculations, as originally planned and implemented, do
not factor in the performance of ongoing milestones, which we and OMB
agree is an area for improvement.
6. We agree that the severity of the discrepancies was not always
dramatic. However, 4 of the 8 investments had notable discrepancies on
either their cost or schedule ratings. Specifically, as demonstrated
in the report, there were multiple instances in which the ratings were
discrepant enough to change the color of the ratings. The difference
between a "green" rating (i.e., normal performance) and a "yellow"
rating (i.e., needs attention) determines whether an investment is
flagged as needing attention, which we believe is an important point
to highlight.
7. We agree that agencies have a responsibility to provide quality
milestone data; however, we maintain that OMB's existing guidance on
which milestones to report is too general for agencies to ensure they
are reporting consistently. OMB acknowledges that this is an area for
improvement and has established a working group to address this issue.
8. As previously discussed, on June 28, 2010, OMB issued its new
guidance on managing IT baselines. This guidance, among other things,
describes when agencies should report baseline changes to the
Dashboard. Officials also provided information on the upcoming version
of the Dashboard--planned for release in July 2010--that
will change the way baseline changes are displayed. We agree that
these recent changes address the issues we identified.
9. We acknowledge that the Dashboard has made significant improvements
to oversight and transparency, in comparison to OMB's previous methods
of overseeing IT investments, and we have added additional information
to the background of the report to highlight this point.
[End of section]
Appendix IV: Comments from the Department of Energy:
Note: GAO comments supplementing those in the report text appear at
the end of this appendix.
Department of Energy:
Washington, DC 20585:
June 3, 2010:
Mr. David A. Powner:
Director, Information Technology Management Issues:
General Accountability Office:
Dear Mr. Powner:
Thank you for the opportunity to respond to the U.S. General
Accountability Office's draft report entitled "OMB's Dashboard Has
Increased Transparency and Oversight, but Data Accuracy Improvements
Needed". In general, we agree with your assessment of the
implementation of the IT Dashboard across federal agencies and with
the recommendations that are presented to the Office of Management and
Budget (OMB).
We have the following comments to the report that we would like to
provide for your consideration:
1. There is a section of the report that indicates OMB released
guidance in June 2009, requesting agencies provide milestones at level
2 in their investments' work breakdown structures. It is worth noting
that until March 2010, investments were unable to transmit milestones
at a level lower than level 1. The system at OMB would not accept
lower-level milestones, and would, in fact, double-count any sub-
milestones reported below level 1. For that reason, investments
reported milestones at level 1 and are currently working to report
milestones at level 2 for future Dashboard submissions to OMB. [See
comment 1]
2. In addition, there is a recommendation for Chief Information
Officers to ensure compliance with OMB guidance on milestone reporting
that has not been issued. OMB has indicated that agencies should
report milestones at level 2; however, as stated in your report, there
is no commonly accepted definition among federal agencies on the level
of details that should comprise level 2. Once OMB releases this
additional guidance, we will work with our Program and Staff Offices
to ensure the appropriate level of detail is reported to the
Dashboard. We will also work with OMB on the timing of when the
updated milestones can be reflected on the Dashboard. Typically new
milestones are posted once the final budget is submitted to OMB, which
occurred in March 2010 for the BY 2011 budget. If this same timeline
is followed for the upcoming budget cycle, the Department will not be
able to report level 2 milestones on the Dashboard until the next
refresh of data which may not occur until March 2011, unless OMB
allows agencies to update milestones prior to that timeframe.
Again, thank you for the opportunity to review this report. If you
have any questions related to this letter, please feel free to contact
me at (202) 586-3705.
Sincerely,
Signed by:
TheAnne Gordon:
Associate Chief Information Officer for IT Planning, Architecture, and
E-Government:
The following is GAO's response to the Department of Energy's (DOE)
additional comment.
GAO Comment:
OMB's guidance required agencies to provide data at one consistent
work breakdown structure level, rather than a mix of multiple levels.
OMB and others confirmed that agencies were able to transmit
milestones at a single consistent level. During our review, we
observed agencies uploading milestones at levels 1 through 4 and thus
disagree that agencies were unable to transmit milestones lower than
level 1.
[End of section]
Appendix V: GAO Contact and Staff Acknowledgments:
GAO Contact:
David A. Powner at (202) 512-9286 or pownerd@gao.gov:
Staff Acknowledgments:
In addition to the contact name above, the following staff also made
key contributions to this report: Shannin O'Neill, Assistant Director;
Carol Cha; Eric Costello; Rebecca Eyler; Emily Longcore; Bradley
Roach; and Kevin Walsh.
[End of section]
Footnotes:
[1] GAO, Information Technology: Management and Oversight of Projects
Totaling Billions of Dollars Need Attention, [hyperlink,
http://www.gao.gov/products/GAO-09-624T] (Washington, D.C.: Apr. 28,
2009); GAO, Information Technology: OMB and Agencies Need to Improve
Planning, Management, and Oversight of Projects Totaling Billions of
Dollars, [hyperlink,
http://www.gao.gov/products/GAO-08-1051T] (Washington, D.C.: July 31,
2008); GAO, Information Technology: Further Improvements Needed to
Identify and Oversee Poorly Planned and Performing Projects,
[hyperlink, http://www.gao.gov/products/GAO-07-1211T] (Washington,
D.C.: Sept. 20, 2007); GAO, Information Technology: Improvements
Needed to More Accurately Identify and Better Oversee Risky Projects
Totaling Billions of Dollars, [hyperlink,
http://www.gao.gov/products/GAO-06-1099T] (Washington, D.C.: Sept. 7,
2006); GAO, Information Technology: Agencies and OMB Should Strengthen
Processes for Identifying and Overseeing High Risk Projects,
[hyperlink, http://www.gao.gov/products/GAO-06-647] (Washington, D.C.:
June 15, 2006).
[2] The ten investments are: DOD's Joint Precision Approach and
Landing System and Maneuver Control System, DOE's Integrated
Management Navigation System and Sequoia Platform, DOJ's Law
Enforcement Wireless Communication System and Unified Financial
Management System, HHS's BioSense Program and Electronic Research
Administration System, and USDA's Financial Management Modernization
Initiative and Risk Management Agency-13 Program. See appendix II for
descriptions of each investment.
[3] 40 U.S.C. § 11302(c).
[4] 44 U.S.C. § 3606.
[5] [hyperlink, http://www.gao.gov/products/GAO-09-624T]; GAO,
Information Technology: Treasury Needs to Better Define and Implement
Its Earned Value Management Policy, [hyperlink,
http://www.gao.gov/products/GAO-08-951] (Washington, D.C.: Sept. 22,
2008); [hyperlink, http://www.gao.gov/products/GAO-07-1211T];
[hyperlink, http://www.gao.gov/products/GAO-06-1099T]; [hyperlink,
http://www.gao.gov/products/GAO-06-647]; GAO, Information Technology:
OMB Can Make More Effective Use of Its Investment Reviews, [hyperlink,
http://www.gao.gov/products/GAO-05-276] (Washington, D.C.: Apr. 15,
2005); and GAO, Air Traffic Control: FAA Uses Earned Value Techniques
to Help Manage Information Technology Acquisitions, but Needs to
Clarify Policy and Strengthen Oversight, [hyperlink,
http://www.gao.gov/products/GAO-08-756] (Washington, D.C.: July 18,
2008).
[6] [hyperlink, http://www.gao.gov/products/GAO-05-276].
[7] [hyperlink, http://www.gao.gov/products/GAO-06-647].
[8] Exhibit 53s list all of the IT projects and their associated costs
within a federal organization. An exhibit 300 is also called the
Capital Asset Plan and Business Case. It is used to justify resource
requests for major IT investments and is intended to enable an agency
to demonstrate to its own management, as well as to OMB, that a major
project is well planned.
[9] We were unable to compare the performance of 2 of the investments--
Electronic Research Administration and Integrated Management
Navigation System--because the Electronic Research Administration
investment did not use standard earned value management practices and
the Integrated Management Navigation System used a method of earned
value management that did not adequately allow us to assess ongoing
milestones. As a result, we determined that the earned value data was
not sufficient for our purposes of rating cost and schedule
performance. In addition, for the 10 selected investments, we did not
assess whether each investment had met cost estimating best practices.
As a result, we were unable to determine whether each investment's
earned value management data is overly optimistic. Overly optimistic
performance data may result in a program costing more and taking
longer than planned. See appendix II for descriptions of each
investment.
[10] At times, a project's cost, schedule, and performance goals--
known as its baseline--are modified to reflect changed development
circumstances. These changes--called rebaselining--can be done for
valid reasons, but can also be used to mask cost overruns and schedule
delays.
[11] GAO, GAO Cost Estimating and Assessment Guide: Best Practices for
Developing and Managing Capital Program Costs, [hyperlink,
http://www.gao.gov/products/GAO-09-3SP] (Washington, D.C.: Mar. 2,
2009), p. 23.
[12] The schedule rating is calculated slightly differently. In addition
to factoring in completed milestones, it also includes overdue
milestones, but similar to the cost rating, it does not include the
performance of ongoing milestones.
[13] GAO, Information Technology: Agencies Need to Improve the
Implementation and Use of Earned Value Techniques to Help Manage Major
System Acquisitions, [hyperlink, http://www.gao.gov/products/GAO-10-2]
(Washington, D.C.: Oct. 8, 2009).
[14] Federal crop insurance provides protection for participating
farmers against the financial losses caused by droughts, floods, or
other natural disasters and against the risk of crop price
fluctuations.
[15] Situational awareness is the knowledge of the size, location, and
rate of spread of an outbreak.
[16] During the course of our review, we found that one of the
investments we selected (BioSense), had completed its development work
in July 2009. Therefore, as we note in the report, July 2009 was the
only month we assessed its performance.
[End of section]
GAO's Mission:
The Government Accountability Office, the audit, evaluation and
investigative arm of Congress, exists to support Congress in meeting
its constitutional responsibilities and to help improve the performance
and accountability of the federal government for the American people.
GAO examines the use of public funds; evaluates federal programs and
policies; and provides analyses, recommendations, and other assistance
to help Congress make informed oversight, policy, and funding
decisions. GAO's commitment to good government is reflected in its core
values of accountability, integrity, and reliability.
Obtaining Copies of GAO Reports and Testimony:
The fastest and easiest way to obtain copies of GAO documents at no
cost is through GAO's Web site [hyperlink, http://www.gao.gov]. Each
weekday, GAO posts newly released reports, testimony, and
correspondence on its Web site. To have GAO e-mail you a list of newly
posted products every afternoon, go to [hyperlink, http://www.gao.gov]
and select "E-mail Updates."
Order by Phone:
The price of each GAO publication reflects GAO's actual cost of
production and distribution and depends on the number of pages in the
publication and whether the publication is printed in color or black and
white. Pricing and ordering information is posted on GAO's Web site,
[hyperlink, http://www.gao.gov/ordering.htm].
Place orders by calling (202) 512-6000, toll free (866) 801-7077, or
TDD (202) 512-2537.
Orders may be paid for using American Express, Discover Card,
MasterCard, Visa, check, or money order. Call for additional
information.
To Report Fraud, Waste, and Abuse in Federal Programs:
Contact:
Web site: [hyperlink, http://www.gao.gov/fraudnet/fraudnet.htm]:
E-mail: fraudnet@gao.gov:
Automated answering system: (800) 424-5454 or (202) 512-7470:
Congressional Relations:
Ralph Dawn, Managing Director, dawnr@gao.gov:
(202) 512-4400:
U.S. Government Accountability Office:
441 G Street NW, Room 7125:
Washington, D.C. 20548:
Public Affairs:
Chuck Young, Managing Director, youngc1@gao.gov:
(202) 512-4800:
U.S. Government Accountability Office:
441 G Street NW, Room 7149:
Washington, D.C. 20548: